
Attempt to workaround 10k subsearch limit -- how to combine multiple lookup files?

the_wolverine
Champion

I'm breaking up my search and outputting the results into separate lookup files. How can I combine these files into a single file once I'm done? Using the Splunk UI, of course 😉

Example would be something like: | inputlookup lookupfile1.csv lookupfile2.csv lookupfile3.csv | dedup fieldname
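(For reference, each chunk would be written out with something like <chunk search> | outputlookup lookupfile1.csv; the file names here are just examples.)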

1 Solution

ziegfried
Influencer

The append flag of the inputlookup command to the rescue (inputcsv supports it as well)

| inputlookup lookupfile1.csv append=1 | inputlookup lookupfile2.csv append=1 | inputlookup lookupfile3.csv append=1 | dedup fieldname
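If you then want everything back in a single lookup file (which is what the question asks for), you could presumably finish with an outputlookup, along these lines (combined_lookup.csv is just a placeholder name):

| inputlookup lookupfile1.csv append=1 | inputlookup lookupfile2.csv append=1 | inputlookup lookupfile3.csv append=1 | dedup fieldname | outputlookup combined_lookup.csv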


Ayn
Legend

It would be interesting to hear more about why the 10000 limit is there in the first place. I've run into it loads of times by now (for instance when using inputlookup to load blacklists of IPs that should be matched), and it's of course frustrating to be limited like this. Is there a very good reason for the limit to exist at this specific value, for instance that beyond 10000 search terms additional terms no longer improve performance?
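For what it's worth, the limit itself appears to be configurable on the search head in limits.conf (a sketch, worth verifying against the documentation for your Splunk version; the 50000 value below is just an example):

[subsearch]
# maximum number of results a subsearch may return (default 10000)
maxout = 50000

Raising it presumably has performance implications, which would explain the conservative default.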
