Hi folks,
We had a major issue with one of our downstream systems, and we have now been asked to provide the Splunk data we export to CSV daily for the last 120 days.
The logic of the daily job is to run a report covering 00:01 to 00:00 of the previous day and export it to a CSV file. Now I need to repeat that same search for each of the last 120 days and create 120 individual reports!
Is there a way (or an intelligent search) to export one CSV file per day for the last 120 days? (One option I considered is running the search via the CLI and passing the dates as parameters from a script, but that's a bit ugly.)
Would gentimes and map be an option for you?
...| gentimes start=-120 increment=1d | map search="search <insert your search here> starttimeu=$starttime$ endtimeu=$endtime$" maxsearches=120
I don't really have much experience with map/gentimes, and perhaps 120 mapped searches may be too much (I don't know how they queue up).
http://docs.splunk.com/Documentation/Splunk/5.0.5/SearchReference/Gentimes
http://docs.splunk.com/Documentation/Splunk/5.0.5/SearchReference/Map
UPDATE:
Managed to get this search working, and it spits out a separate .csv file per day in $SPLUNK_HOME/var/run/splunk:
| gentimes start=-4 | map search="search index=_internal sourcetype=splunkd component!=Metrics starttimeu=$starttime$ endtimeu=$endtime$ |head 10| table component, log_level | outputcsv dummy_$starttime$ " maxsearches=4
/K
Odd. In my case the filenames were like dummy_1380405600.csv, i.e. the epoch timestamp representing that particular day.
Unfortunately, I could not get it to translate that into a YY-mm-dd-style timestamp. Maybe it's easy, maybe it's impossible. I just don't know.
My output filename is "dummy_$starttime$.csv".
I expected it to be something like "dummy_201309290000.csv".
Sorry, but I don't know what you mean by 'physical value'. If you want a slightly more readable filename, you can try to use this naming convention instead;
outputcsv dummy_$_serial_id$
where $_serial_id$ will be replaced with an incremental number for each run, i.e. each day.
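For what it's worth, one idea for a date-style filename (an untested sketch, building on the working search above): since map substitutes any field present in the events it receives, you could pre-format the gentimes epoch with eval/strftime and use that field in the filename. The field name "day" is made up here:

```
| gentimes start=-4
| eval day=strftime(starttime, "%Y-%m-%d")
| map search="search index=_internal sourcetype=splunkd component!=Metrics starttimeu=$starttime$ endtimeu=$endtime$ | head 10 | table component, log_level | outputcsv dummy_$day$" maxsearches=4
```

If the substitution works as assumed, this should produce files like dummy_2013-09-29.csv instead of epoch-stamped ones.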
Unfortunately I have stretched the limit of my map/gentimes knowledge here.
/K
A workaround to get dynamic names would be to use a script that calls the CLI to run the searches and constructs the output filenames itself.
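A minimal sketch of that script idea, assuming a POSIX shell with GNU date and the splunk CLI on PATH (the index=_internal search, the OUTDIR path, and the export_ filenames are placeholders to adapt; credentials via -auth are omitted). It builds one command per day as a dry run, so you can inspect the manifest before executing anything:

```shell
#!/bin/sh
# Dry-run sketch: build one "splunk search" export command per day for the
# last 120 days, with the date embedded in the output filename.
DAYS=120
OUTDIR=/tmp/splunk_exports   # placeholder output directory
mkdir -p "$OUTDIR"
: > "$OUTDIR/commands.txt"   # start with an empty manifest

i=1
while [ "$i" -le "$DAYS" ]; do
    start_q=$(date -d "-$i days" +%m/%d/%Y)        # Splunk time-modifier format
    end_q=$(date -d "-$((i - 1)) days" +%m/%d/%Y)  # start of the following day
    day=$(date -d "-$i days" +%Y-%m-%d)            # readable date for the filename
    # Collect the commands instead of executing them; once verified, run each
    # line (or drop the manifest and call splunk directly in the loop).
    echo "splunk search 'index=_internal earliest=${start_q}:00:00:00 latest=${end_q}:00:00:00' -output csv -maxout 0 > $OUTDIR/export_${day}.csv" \
        >> "$OUTDIR/commands.txt"
    i=$((i + 1))
done
```

Since the date is computed in the shell, it lands directly in each filename, which sidesteps the $starttime$ substitution problem entirely.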
You are a star, mate. It worked. The only small issue is that the file is output as "dummy_$starttime$.csv"
(any way to replace $starttime$ with its actual value?)