Splunk Search

Search Query consuming high memory utilization on indexers

harshsri21
New Member

Hi,

I am trying to find a list of search queries in a specific time frame that consumed high memory on the indexers.
We have an indexer cluster of 40 indexers and a search head cluster of 4 SHs. For a short span of time we suddenly experienced high memory utilization on 33 indexers, and consequently 2 SHs also went down.

Please help me generate such a query and understand the cause of this behavior.


DalJeanis
Legend

Try something like this...

index=_audit action="search" info="completed" NOT user="splunk-system-user"
| table user, is_realtime, total_run_time, exec_time, result_count
| eval exec_time=strftime(exec_time,"%m/%d/%Y %H:%M:%S:%3Q") 
| sort 0 - total_run_time

If something is chewing up a lot of resources, it's going to have a high total_run_time, so that query should float it up to the top. You can limit it to the time in question, plus a little before and after, and it should give you a few candidates to check for a resource hog.
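To limit it to the window in question, add earliest/latest to the base search. A minimal sketch (the timestamps below are placeholders — substitute your actual incident window):

index=_audit action="search" info="completed" NOT user="splunk-system-user" earliest="04/01/2024:10:00:00" latest="04/01/2024:11:00:00"
| table user, is_realtime, total_run_time, exec_time, result_count
| sort 0 - total_run_time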

You can also add is_realtime=1 to the initial search to look only at real-time searches. They tend to be massive CPU hogs, so check them out as well.
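For example, the same base search filtered down to real-time searches only would look like this:

index=_audit action="search" info="completed" is_realtime=1 NOT user="splunk-system-user"
| table user, total_run_time, exec_time, result_count
| sort 0 - total_run_time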


harshsri21
New Member

Thanks! Can we also get a Splunk query to identify which processes are consuming high memory on the indexers?
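One way to approach per-process memory is Splunk's own platform instrumentation, which writes resource usage data to the _introspection index (this assumes introspection is enabled on the indexers, which it is by default on recent versions). A rough sketch — the host filter idx* is a placeholder for your indexer naming pattern:

index=_introspection sourcetype=splunk_resource_usage component=PerProcess host=idx*
| eval mem_used_mb='data.mem_used'
| stats max(mem_used_mb) AS peak_mem_mb BY host, data.process, data.args
| sort 0 - peak_mem_mb

This should surface the processes with the highest peak memory on each indexer during your selected time range; splunkd search processes will typically show the search artifact in data.args, which you can then correlate with the _audit results above.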
