Is there a maximum file count a single forwarder can monitor? I have some Oracle applications that generate tens of thousands of new files daily, of various sizes. I have a dedicated host with these shared mounts mounted, just scanning for new data. Every now and again it crashes the box, and I can't make anything out on the console.
Am I running into some sort of file limit?
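Before blaming Splunk itself, it's worth ruling out the OS-level open file descriptor limit for the user running splunkd. A minimal check, assuming a Linux host (the `/proc` path and `pgrep` usage are assumptions about your environment):

```shell
# Per-process open file descriptor limit for the current shell/user.
ulimit -n

# If splunkd is running, count its currently open descriptors
# (assumption: Linux with /proc available):
# ls /proc/"$(pgrep -o splunkd)"/fd 2>/dev/null | wc -l
```

If the descriptor count is anywhere near the ulimit, raising the limit for the splunk user is a reasonable first step.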
Can you use a local forwarder on the source data, so we can determine whether the issue is Splunk-related or mount-related?
The Splunk on Splunk app will give you an idea of what's causing the queue blockage.
It's possible for a single Universal Forwarder to saturate an indexer if it has sufficient event throughput.
09-04-2012 00:10:36.644 -0500 INFO TailingProcessor - Could not send data to output queue (parsingQueue), retrying...
09-04-2012 00:10:37.836 -0500 INFO TailingProcessor - ...continuing.
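Those "Could not send data to output queue" messages usually mean a downstream queue is backed up. Splunk's own metrics.log flags saturated queues with `blocked=true`, so grepping for that is a quick way to confirm. A sketch (the sample line below is a stand-in for a real `$SPLUNK_HOME/var/log/splunk/metrics.log` entry, since paths vary by install):

```shell
# Create a sample metrics.log line of the kind Splunk emits when a queue
# is saturated (stand-in for the real file at
# $SPLUNK_HOME/var/log/splunk/metrics.log):
printf '09-04-2012 00:10:36.644 -0500 INFO Metrics - group=queue, name=parsingqueue, blocked=true, max_size=1000\n' > /tmp/metrics_sample.log

# Count blocked-queue events; on a real host, point grep at metrics.log*.
grep -c 'blocked=true' /tmp/metrics_sample.log
```

On a real forwarder or indexer, repeated `blocked=true` entries for the same queue name tell you where the pipeline is choking.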
One of my data sources works just fine on its own with about 100,000 files; it's when I added several other data sources that it broke down.
I do have these errors in splunkd.log that I've just noticed:
09-04-2012 03:06:06.730 -0500 INFO TailingProcessor - File descriptor cache is full (100), trimming...
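That "File descriptor cache is full (100)" message corresponds to the `max_fd` setting in limits.conf, which caps how many file descriptors the tailing processor keeps open at once (the 100 in the log matches the default). A sketch of raising it, assuming the new value stays within the OS ulimit for the splunk user:

```
# $SPLUNK_HOME/etc/system/local/limits.conf
# max_fd caps the tailing processor's file descriptor cache; raising it
# can reduce open/close churn when monitoring very large file counts.
# The value 256 here is illustrative, not a recommendation.
[inputproc]
max_fd = 256
```

A restart of the forwarder is needed for the change to take effect, and this trades memory/fd usage for less cache trimming, so it's worth testing incrementally.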