Hi great knowledgeable splunkers!
I have a number of queries that I need to chain in a specific order so that my static lookup tables are updated properly.
For example, imagine I had 10 or so of these, each doing a different update:
| inputlookup ref_data_mac_address_daily.csv
| search NOT [| inputlookup ref_macaddresses_all_time.csv | fields src_mac]
| outputlookup ref_data_new_data.csv

| inputlookup ref_data_new_data.csv
| search <for something>
| outputlookup ref_cleandata.csv
Is there a simple way to call 10 or more queries in a row as one scheduled task? (without having to write an external script?)
cheers,
ag
The only solid thing I can think of is quite ugly, and that's to chain them together as subsearches. You could add a | fields nonexistentField clause to make sure that the subsearches don't leak any search terms out to the parent search, and for good measure you could be doubly safe by dumping the terms into a separate | search * clause after the outputlookup.
| inputlookup someOther_daily.csv
| search NOT [| inputlookup someOther_all_time.csv | fields keyField]
| outputlookup someOther_new_data.csv
| search * [
    | inputlookup ref_data_mac_address_daily.csv
    | search NOT [| inputlookup ref_macaddresses_all_time.csv | fields src_mac]
    | outputlookup ref_data_new_data.csv
    | fields nonexistentField
]
etc....
The problem you might hit is that subsearches get auto-finalized if they take too long, and if you chain 10 of these together, the outer searches might start to hit those limits.
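For reference, those subsearch limits live in limits.conf. The stanza below is a sketch; the default values shown are from memory and may differ by Splunk version, so check limits.conf.spec for your release before raising anything:

```
[subsearch]
# maximum number of results a subsearch can pass to the outer search
maxout = 10000
# maximum number of seconds a subsearch can run before it is finalized
maxtime = 60
```

Raising these is a workaround rather than a fix, and very large subsearch result sets can slow the outer search considerably.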
Nice thinking outside of the box, but I might have to fall back on an external script to run a list of Splunk queries in a row.
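If you do go the external-script route, a minimal sketch is to feed each query to the Splunk CLI's ad-hoc `splunk search` command in order. The binary path and the admin:changeme credentials below are placeholders for your install; SPLUNK_BIN defaults to echo here so the script is a harmless dry run until you point it at the real binary:

```shell
#!/bin/sh
# Run a fixed list of SPL queries in strict order via the Splunk CLI.
# Dry run by default: SPLUNK_BIN=echo just prints each command.
# Point SPLUNK_BIN at $SPLUNK_HOME/bin/splunk to actually execute them.
SPLUNK_BIN="${SPLUNK_BIN:-echo}"
AUTH="admin:changeme"   # placeholder credentials

# The ordered list of queries (positional parameters keep POSIX sh happy).
set -- \
    '| inputlookup ref_data_mac_address_daily.csv | search NOT [| inputlookup ref_macaddresses_all_time.csv | fields src_mac] | outputlookup ref_data_new_data.csv' \
    '| inputlookup ref_data_new_data.csv | search <for something> | outputlookup ref_cleandata.csv'

for q in "$@"; do
    # "splunk search" blocks until the search finishes, which is what
    # guarantees the ordering; bail out if any step fails.
    "$SPLUNK_BIN" search "$q" -auth "$AUTH" || exit 1
done
```

Because each CLI invocation runs to completion before the next starts, you avoid the subsearch auto-finalize limits entirely; the script can then be driven by cron or any scheduler.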