Hi,
We need to keep a copy of a large SQL table in a CSV file to speed up some lookups.
We retrieve the data with a saved search, scheduled to run every hour, and save the results to a CSV file.
The search is like this:
| dbxquery maxrows=0
query="query string" connection="db_connection"
| fields field1, field2, field3, field4, field5, field6, field7, field8, field9
Adding maxrows=0 allows us to retrieve all the data. If we run the search through Splunk Web, we see 507,000 results.
If we use the API to get the results as explained in this link:
Exporting Large Result Sets to CSV
We get the full CSV with 507,000 rows, and we can use it for lookups.
However, if we schedule the saved search and add a trigger to export the results to a lookup CSV file, we only get 50,000 lines...
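For context, a minimal sketch of what the scheduled search would look like if the export were done inline with outputlookup instead of a trigger action (big_table.csv is a hypothetical lookup file name, not our real one):

| dbxquery maxrows=0
query="query string" connection="db_connection"
| fields field1, field2, field3, field4, field5, field6, field7, field8, field9
| outputlookup big_table.csv

We have not confirmed whether the inline outputlookup is subject to the same 50,000-row cap as the trigger export.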
How can we save all 507,000 lines to a CSV using the scheduler?
Thanks in advance!