I am trying to do a DB Connect input, but the max rows it will take is 1,000,000. My query is going to return more than that. Is there a way to raise the max rows, or to just tell it to take whatever is returned by the query?
I just recently ran into the same scenario: I had to batch-load 30 million DB records into Splunk for auditing.
I updated the inputs.conf file to allow a max of 30 million results and restarted Splunk. Worked flawlessly.
If you have a rising column and need to tail your results, then you may want to consider a resource pool.
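For reference, the change lives in the input's stanza in inputs.conf. A minimal sketch of what that might look like — the stanza name, connection, and query here are hypothetical placeholders, not from my actual config:

```
# local/inputs.conf -- example dbmon-tail input (names are illustrative)
[dbmon-tail://my_connection/my_audit_input]
query = SELECT * FROM audit_table
# raise the row cap from the default so large result sets aren't truncated
max_rows = 30000000
```

After editing, restart Splunk so the new limit takes effect.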
I found where the input is stored ... local/inputs.conf. If I manually change the max_rows value, restart the server, and then don't edit it through the interface, would that work?