I've added the ability to export search results from an Advanced XML report using the Export module.
Exporting is by default limited to 50k results.
Can this limit be set to 100k in the limits.conf file?
I tried different settings but nothing seems to do the trick.
Splunk's Export module just turns around and hits a REST API endpoint on splunkd. In particular it hits /servicesNS/
The Export module does not pass a count param to this endpoint, although the endpoint does accept one. Regardless, if you were to pass count=100000 to the REST endpoint, the behavior is still restricted by the following stanza in limits.conf:
[restapi]
maxresultrows = 50000
This limits the maximum number of returned rows, presumably across the entire search API.
(Note that this is not a limit on the number of rows piped between different commands in a report - just the number of rows you can get out via the REST endpoints.)
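As an illustration, here is a minimal sketch of how you might construct a request to such a results endpoint with count set explicitly. The host, port, owner/app namespace, and job SID below are placeholders, not values from this thread; the real path depends on your user, app, and search job.

```python
from urllib.parse import urlencode

# Sketch only: host, port, namespace (admin/search), and the job SID are
# placeholders -- substitute the values for your own splunkd instance and job.
base = "https://localhost:8089/servicesNS/admin/search/search/jobs/1234.56789/results"
params = {"output_mode": "csv", "count": 100000}
url = base + "?" + urlencode(params)
print(url)
```

You would then GET that URL with your Splunk credentials; even so, the count you ask for is still capped by the [restapi] maxresultrows setting.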
Anyway, changing that limits.conf setting to 100,000 seems to work fine: I'm able to hit the endpoint directly and download a CSV containing all rows of a 90,000-row search result.
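For reference, the override would go in a local limits.conf rather than the shipped defaults (assuming the usual $SPLUNK_HOME/etc/system/local/ location):

```
# $SPLUNK_HOME/etc/system/local/limits.conf
# Local override; do not edit the file under etc/system/default/.
[restapi]
maxresultrows = 100000
```

Splunk typically needs a restart of splunkd to pick up limits.conf changes.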
However, I urge strong caution, as limits are not placed in limits.conf without good reason. Quite possibly, or even probably, setting this value too high can seriously degrade performance or destabilize the server. If I had to guess, you might be able to send the system into swap, since 50,000 rows is roughly the safe limit on how many rows you can fit into memory at once on minimum-spec hardware.
You might also take a look at the Splunk For Excel Export app, which claims to be able to export millions of events.
http://splunk-base.splunk.com/apps/29336/splunk-for-excel-export
I believe the app actually does the export in a streaming fashion and does not use the export endpoint at all, so it might somehow not be subject to this limit.
Great, that worked perfectly!
I can confirm the Export module can now export more than 50k records.
I will also take a look at the Excel Export app, but a few months ago I ran into a problem with it not handling Chinese character sets well.
But that is something for another post.
What version are you running?