Splunk Search

inputlookup search timerange

jpenetra
Explorer

Hello, I have created a CSV similar to the one used in the musicdashboard tutorial:

"_time", "origin", "destiny" ..
"1384792901.868352", "example",
"example" ..

and then I created the following searchmanager

{% searchmanager id="search-newsource"
search='| inputlookup newsource.csv | search
origin=$originatorKey$ OR
destiny=$recipientKey$
cache="False"
earliest_time="$et$"|token_safe
latest_time="$lt$"|token_safe
%}

and the timerange is defined like this:

{% timerange id="timeKey"
earliest_time="$et$"|token_safe
latest_time="$lt$"|token_safe %}

Am I doing something wrong? This does not work.

0 Karma
1 Solution

lguinn2
Legend

Yes. The problem is that you are setting earliest_time and latest_time, but Splunk does not know how to relate those to the _time field that you have defined in your lookup table. Also, it doesn't look like you closed the search= attribute; it appears to be missing a closing '.

When Splunk indexes events, it creates an internal field called _time for each event, and it knows how to relate that timestamp to the time range picker, as well as the search criteria for start and end times.

None of that applies to lookup tables. In fact, you should not be creating field names in your lookup table that start with an underscore - user-created fields must start with an alphabetic character.

You could, however, do the following and it should work:

A. Change the field name in your .csv file --

"timestamp", "origin", "destiny"
etc

B. Change your search

search="| inputlookup newsource.csv | search origin=$originatorKey$ OR destiny=$recipientKey$ cache=False timestamp >= $et$  timestamp <= $lt$" | token_safe

Now, this assumes the data in $et$ and $lt$ is in the same numeric epoch time format as the csv file. But if it isn't, you will need to convert it before you do the comparison. Here's how:

search='| inputlookup newsource.csv | search (origin=$originatorKey$ OR destiny=$recipientKey$)
| eval et="$et$" | eval lt="$lt$"
| eval startTime = case(et=="now", now(), et=="", now(), 1==1, relative_time(now(), et))
| eval endTime = case(lt=="now", now(), lt=="", now(), 1==1, relative_time(now(), lt))
| where timestamp>=startTime AND timestamp<=endTime'|token_safe
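Note that the last line uses where rather than search, because the search command compares a field against a literal value, while where can compare the timestamp field against the computed startTime and endTime fields. For reference, here is a sketch of how the converted search might sit inside the searchmanager template from the question, reusing the same id, tokens, and cache option (this assumes your original template otherwise stays the same; adjust as needed):

{% searchmanager id="search-newsource"
    search='| inputlookup newsource.csv | search (origin=$originatorKey$ OR destiny=$recipientKey$)
        | eval et="$et$" | eval lt="$lt$"
        | eval startTime = case(et=="now", now(), et=="", now(), 1==1, relative_time(now(), et))
        | eval endTime = case(lt=="now", now(), lt=="", now(), 1==1, relative_time(now(), lt))
        | where timestamp>=startTime AND timestamp<=endTime'|token_safe
    cache="False"
%}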

lguinn2
Legend

Edited my answer to show how to convert time to epoch

jpenetra
Explorer

Considering that I want the results for the last 24 hours, where $et$ corresponds to -24h@h and $lt$ to "now", how can I convert those to epoch time? I can't seem to find that answer.
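For the Last 24 hours preset specifically, the conversion in the edited answer above reduces to something like this (assuming $et$ expands to the literal string -24h@h and $lt$ to now):

| eval startTime = relative_time(now(), "-24h@h")
| eval endTime = now()
| where timestamp>=startTime AND timestamp<=endTime

relative_time(now(), "-24h@h") returns the epoch value for 24 hours ago, snapped back to the start of the hour, and now() returns the current epoch time, so the where clause keeps only the lookup rows from the last 24 hours.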

0 Karma

yong_ly
Path Finder

To add onto this: you can use an eval with strptime() to convert your timestamps into epoch time for comparisons, and then strftime() to convert them back into readable strings.
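For example, if the lookup held human-readable timestamps rather than epoch values, something along these lines would work (the timestamp field name and the format string are assumptions - adjust them to match your data):

| inputlookup newsource.csv
| eval ts_epoch = strptime(timestamp, "%Y-%m-%d %H:%M:%S")
| where ts_epoch>=relative_time(now(), "-24h@h") AND ts_epoch<=now()
| eval ts_readable = strftime(ts_epoch, "%Y-%m-%d %H:%M:%S")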
