Splunk Search

Searching for matches during specific times

jfarns
New Member

My search is something like:
index=foo "get /foo/bar"| eval a=_time+1s| eval b=_time+10m | table a,b,ip, field1, field2

How would I search these results for events between times a, b and where field1 and field2 match?

1 Solution

elliotproebstel
Champion

If I understand you correctly, this is your goal:

  1. Identify in index=foo events containing the string "get /foo/bar".
  2. Given the set of events from (1), search for other events that occurred between 1 second and 10 minutes later, where those events contain the fields field1 and field2 (or perhaps where they contain field1=arg1 and field2=arg2).

I think you'll need a map search that populates the search time window, like this:
index=foo "get /foo/bar"
| eval a=relative_time(_time, "+1s"), b=relative_time(_time, "+10m")
| map search="search earliest=$a$ latest=$b$ index=foo field1 field2 | eval a=$a$, b=$b$ | table a, b, ip, field1, field2" maxsearches=1000

If you need to search for events where field1=arg1 and field2=arg2, then it would look like this:
index=foo "get /foo/bar"
| eval a=relative_time(_time, "+1s"), b=relative_time(_time, "+10m")
| map search="search earliest=$a$ latest=$b$ index=foo field1=arg1 field2=arg2 | eval a=$a$, b=$b$ | table a, b, ip, field1, field2" maxsearches=1000
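Conceptually, map runs one scoped sub-search per seed event. A minimal Python sketch of that per-event windowed match, assuming events are plain dicts with epoch-second _time values (the field names and sample data here are illustrative, not actual Splunk APIs):

```python
# Sketch of the windowed-match logic that `map` performs: one
# sub-search per seed event. Events are plain dicts; `_time` is
# epoch seconds. All names and sample values are illustrative.

def windowed_matches(seeds, events, offset_lo=1, offset_hi=600):
    """For each seed event, return events whose _time falls in
    [seed._time + offset_lo, seed._time + offset_hi] and which
    carry both field1 and field2 (the `field1 field2` case)."""
    results = []
    for seed in seeds:
        a = seed["_time"] + offset_lo  # earliest, like relative_time(_time, "+1s")
        b = seed["_time"] + offset_hi  # latest, like relative_time(_time, "+10m")
        for ev in events:
            if a <= ev["_time"] <= b and "field1" in ev and "field2" in ev:
                results.append({"a": a, "b": b, "ip": ev.get("ip"),
                                "field1": ev["field1"], "field2": ev["field2"]})
    return results

seeds = [{"_time": 1000}]  # one event matching "get /foo/bar"
events = [
    {"_time": 1005, "ip": "10.0.0.1", "field1": "x", "field2": "y"},  # in window
    {"_time": 1000, "ip": "10.0.0.2", "field1": "x", "field2": "y"},  # too early
    {"_time": 1005, "ip": "10.0.0.3", "field1": "x"},                 # missing field2
]
print(windowed_matches(seeds, events))
```

For the field1=arg1 field2=arg2 variant, the membership checks would become equality checks against the argument values.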

As a note, using map is very slow, because it kicks off a new search for each result in the first result set. You can set the maxsearches argument to a smaller number if you want to constrain how many times the map command runs. If you don't specify a value for maxsearches, the default is 10, regardless of how many events were in the first result set.
http://docs.splunk.com/Documentation/SplunkCloud/6.6.3/SearchReference/Map
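The effect of maxsearches can be pictured as truncating the seed list before the per-event sub-searches launch. A hedged sketch of that behavior (the default of 10 comes from the map docs linked above; everything else here is illustrative):

```python
# Illustrative only: `map` launches at most `maxsearches`
# sub-searches, silently dropping any remaining seed events.
# The default cap of 10 is per the Splunk map documentation.

def run_map(seeds, maxsearches=10):
    launched = seeds[:maxsearches]  # seeds beyond the cap never run
    return len(launched)

print(run_map(list(range(25))))                    # default cap applies
print(run_map(list(range(25)), maxsearches=1000))  # raised cap covers all seeds
```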
