Getting Data In

Which input provides better performance: batch or monitor?

Marinus
Communicator

I'm looking to forward data collected via a lightweight forwarder. Which input provides better performance: batch or monitor? I'm trying to reduce the disk footprint, but I'd like to get the data to the indexer as quickly as possible.


Stephen_Sorkin
Splunk Employee

Batch and monitor are both delivered by the exact same subsystem within Splunk, so there shouldn't be any significant difference in performance.
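For reference, both input types are defined in inputs.conf. A minimal sketch (the paths and sourcetype are hypothetical; batch with move_policy = sinkhole deletes each file after indexing, which addresses the disk-footprint concern in the question):

```ini
# monitor: tails files continuously and remembers read position across restarts
[monitor:///var/log/myapp]
sourcetype = myapp_log

# batch: reads each file once, then deletes it (sinkhole)
# useful when you want the smallest possible disk footprint
[batch:///var/spool/myapp]
move_policy = sinkhole
sourcetype = myapp_log
```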

vbumgarner
Contributor

Is this 20MB value tunable? I'd like to have a forwarder read from many files and fan them out to many indexers as fast as it can. The single-threaded nature is killing me.


Stephen_Sorkin
Splunk Employee

Both batch and monitor single-thread the reading of any file that has more than 20MB remaining to read. This actually helps performance: reading is fundamentally very fast, while parsing, which is often single-threaded, is the main bottleneck and performs better with coherent streams of data.
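A sketch of the relevant knobs, with the caveat that the exact setting names should be verified against the limits.conf and server.conf specs for your Splunk version (the threshold name below is an assumption, not confirmed in this thread):

```ini
# limits.conf -- sketch; verify the setting name for your version
[inputproc]
# files with more than this many bytes left to read are handled by
# a single thread (20MB = 20971520 is the default described above)
min_batch_size_bytes = 20971520

# server.conf -- one way to get more ingestion parallelism on a
# heavily loaded forwarder (available in newer Splunk versions)
[general]
parallelIngestionPipelines = 2
```

Raising the pipeline count trades memory and CPU for parallelism, so it is usually worth testing on one forwarder before rolling it out.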

Marinus
Communicator

Stephen, does the batch input consume files one at a time? If it does, I'd expect the file monitor to perform better.
