
Are our search concurrency settings in limits.conf too high?

skirven
Communicator

Hi!
I'm wrestling with performance on our production Splunk installation and have been reading up on search concurrency and limits.conf. I'm trying to reconcile what I've read with what I'm seeing, to make sure I understand it correctly.

Here is my current limits.conf (which I contend is set too high):
[search]

base_max_searches=100
max_searches_per_cpu=10
dispatch_dir_warning_size = 10000
max_rawsize_perchunk = 0
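
For context on why I contend it's too high: if I'm reading the docs right, Splunk computes the concurrent historical search ceiling as (max_searches_per_cpu x number_of_cpus) + base_max_searches. On a 16-CPU search head, the values above would allow:

(10 x 16) + 100 = 260 concurrent searches per search head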

We have 15 SHs with 16 CPUs each. What I'm trying to wrap my head around: in the DMC, under the Search Head section, there is a "Search Concurrency" drop-down, and I'm seeing one or two servers with a high number while most show a low number or 0. My theory is that with the limits set this high, Splunk is allowed to pile processes onto one or two SHs, bogging them down while never really leveraging the power of the 15-SH cluster.
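
If it helps, here's roughly how I've been pulling those per-SH numbers outside the DMC panel (a sketch using the REST endpoint I believe the DMC itself queries; the field names are my assumption about what it returns):

| rest /services/server/status/limits/search-concurrency splunk_server=*
| fields splunk_server, active_hist_searches, max_hist_searches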

We experience crashes, or situations where a search head stops responding at the API level, etc.
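
When that happens, I check whether scheduled searches are being skipped because of the concurrency ceiling (a rough search; the exact reason strings may vary by version):

index=_internal sourcetype=scheduler status=skipped
| stats count by host, reason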

Would it be better to set limits.conf to something like:

[search]

base_max_searches=6
max_searches_per_cpu=1
dispatch_dir_warning_size = 10000
max_rawsize_perchunk = 0
max_searches_perc = 50

On a 16-core server, that should give us:

Max Total Searches: (1 x 16) + 6 = 22
Max Scheduled Searches: 50% of 22 = 11
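
One more thought: if I have the shipped defaults right, base_max_searches=6, max_searches_per_cpu=1, and max_searches_perc=50 are Splunk's out-of-the-box values, so this would effectively just remove our overrides. To confirm what each SH actually resolves after the push, something like this should work (assuming the configs endpoint exposes the merged limits.conf, which is my understanding):

| rest /services/configs/conf-limits/search splunk_server=*
| fields splunk_server, base_max_searches, max_searches_per_cpu, max_searches_perc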

I'm still learning and reading, so I'd appreciate some input to validate my thinking.
Thank you,
Stephen Kirven
