My Splunk environment is one search head/indexer box, one indexer peer box, and one forwarder.
I placed about 30 GB worth of .gz logs (15 files total) into the monitored directory on the forwarder. splunkd.log said it handled, read, and processed each log correctly; however, only about half of the files (sources) actually made their way to the indexers. Why is this? My workaround was to copy each file into the monitored directory manually, wait for it to finish processing, then copy in the next one.
Secondly, how does 'batch' differ from 'monitor', and would it solve this problem?
Here are some settings on the forwarder:
maxKBps = 0
max_mem_usage_mb = 200
parallelIngestionPipelines = 1
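For context, those three settings don't all live in the same file. A sketch of where they would typically be set (values taken from the post; stanza placement is from the .conf spec files, so double-check against your version's docs):

```ini
# limits.conf -- [thruput] stanza; 0 removes the forwarder's bandwidth cap
[thruput]
maxKBps = 0

# limits.conf -- [default] stanza; memory budget for in-memory data structures
[default]
max_mem_usage_mb = 200

# server.conf -- [general] stanza; number of ingestion pipeline sets
[general]
parallelIngestionPipelines = 1
```

Note that maxKBps = 0 means the forwarder sends at unlimited throughput, so it shouldn't have been the bottleneck here.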
The "failure" was due to not having the index setup on our second indexer during ingestion. We thought that the forwarder would be smart enough to know not to send data to a peer if the index didn't exist on it.
Our fault.
Hello there,
For the second question, from inputs.conf.spec:
NOTE: Batch should only be used for large archives of historic data. If you
want to continuously monitor a directory or index small archives, use 'monitor'
(see above). 'batch' reads in the file and indexes it, and then deletes the
file on disk.
[batch://<path>]
* A one-time, destructive input of files in <path>.
* For continuous, non-destructive inputs of files, use 'monitor' instead.
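If you do want a one-shot, destructive load of those archives, a batch stanza would look something like this (the path and index name here are placeholders, not from the post):

```ini
# inputs.conf on the forwarder -- 'batch' indexes each file once, then DELETES it
[batch:///opt/archive/logs]
move_policy = sinkhole   # required for batch inputs; files are removed after indexing
index = my_archive       # placeholder index name
disabled = false
```

Given that batch deletes the source files, 'monitor' remains the safer choice unless you have copies of the archives elsewhere.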
As for the first question: did you get a message like "file is too large, waiting ..." on the forwarder?
Hope it semi helps.