Splunk Search

Efficient search to retrieve information about user sessions with errors

juniormint
Communicator

index=devdata session=* "ERROR" | eval errorSession=session | join type=outer session [search index=devdata session=errorSession | dedup session sortby _time | eval startTime = _time ] | table _time, session, startTime, _raw

In what sessions do ERRORs occur? When did the error sessions start?

I'm hitting some problems and would like some advice on how to write an efficient search to answer these questions.
1) The session=errorSession filter in the subsearch above does not appear to work.
2) The join seems pretty slow.

Thanks!

1 Solution

somesoni2
Revered Legend

Try this

index=devdata [search index=devdata "ERROR" | dedup session | table session] 
| stats earliest(_time) as startTime by session 
| append [search index=devdata session=* "ERROR" | table _time, session, _raw]
| stats first(startTime) as startTime first(_time) as _time first(_raw) as _raw by session

This should give you the startTime of the session, along with the _time and _raw from the ERROR event.
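If you want startTime (and _time) rendered as readable timestamps rather than epoch seconds in the final table, one option is to tack a convert step onto the end of the same search (just a sketch; fieldformat with strftime would work equally well):

index=devdata [search index=devdata "ERROR" | dedup session | table session] 
| stats earliest(_time) as startTime by session 
| append [search index=devdata session=* "ERROR" | table _time, session, _raw]
| stats first(startTime) as startTime first(_time) as _time first(_raw) as _raw by session
| convert ctime(startTime) ctime(_time)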

aweitzman
Motivator

This search will give you all of the session IDs with errors:

index=devdata "ERROR" | dedup session | table session

Use this as a subsearch to get all events for each such session:

index=devdata [search index=devdata "ERROR" | dedup session | table session]
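For context, the subsearch just returns a list of session values, which Splunk expands into an OR of field=value terms for the outer search. The effective search ends up looking roughly like the following (the session values here are placeholders):

index=devdata (session="sess-001" OR session="sess-002" OR session="sess-003")

That expansion is also why this approach avoids the join entirely.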

Now tweak it to get just the first event of each such session, and make that your output:

index=devdata [search index=devdata "ERROR" | dedup session | table session] | dedup session sortby +_time | table _time, session, _raw

juniormint
Communicator

Pretty good answer... capturing the error message rather than the first message seen is more aligned with my use case.
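One way to keep the ERROR events themselves while still showing when each session started (a sketch, reusing the same index and session field as the searches above) is to attach the session start time with eventstats and then filter back down to the error events:

index=devdata [search index=devdata "ERROR" | dedup session | table session]
| eventstats min(_time) as startTime by session
| search "ERROR"
| convert ctime(startTime)
| table _time, session, startTime, _raw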
