Splunk Search

Why is parsing taking so long even with basic searches and filters?

rdschmidt
Explorer

I am having an issue with our Splunk server. Every search you run, no matter what filters you apply or how basic you make it, says "Parsing" for almost 100 - 120 seconds before it actually searches. Sometimes if I search a 7-day block of records containing 300,000 events, it will parse for 120 seconds and the search itself will only take 80 seconds.

I have this running on a virtual machine with 8 CPUs and 16 GB of RAM dedicated to it. I can't find any indication that this is a CPU or memory issue.

Any help would be greatly appreciated.
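For reference, even something as basic as the search below (index and sourcetype are placeholders for ours) sits in "Parsing" for the full 100 - 120 seconds before returning anything:

    index=main sourcetype=access_combined earliest=-7d@d latest=now
    | stats count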

1 Solution

cheinlein
Engager

Found that we needed a bigger, more distributed environment. We are using Amazon Web Services and initially moved to a larger EC2 instance. We also went with a CPU-optimized configuration; we had been using a memory-optimized instance. That still wasn't the performance we wanted, so we broke our environment up into a forwarder, 2 indexers, and 1 large search head. We get awesome performance now.
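In case it helps anyone setting up a similar layout, the forwarder-to-indexer piece is roughly the outputs.conf below (hostnames are placeholders for our actual instances), with both indexers then added as search peers on the search head:

    # outputs.conf on the forwarder - events are auto load balanced
    # across both indexers in the group
    [tcpout]
    defaultGroup = primary_indexers

    [tcpout:primary_indexers]
    server = indexer1.example.com:9997, indexer2.example.com:9997

    # distsearch.conf on the search head - treat both indexers as search peers
    [distributedSearch]
    servers = https://indexer1.example.com:8089, https://indexer2.example.com:8089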

matthieu_araman
Communicator

I've got the same problem here.
There is a 10-minute delay during the parsing phase for any scheduled or ad hoc search (except ones that start directly with |), even during what should be low-usage hours.
The server is heavily loaded and does too many things, but that doesn't explain this behaviour or tell me what I should monitor.
The Splunk version is 6.2.2, Linux 64-bit, 8 cores.
This looks to me like there is a queue limit somewhere.
Any idea where to look? (Every search, including ones against Splunk's own logs, is delayed, so it's slow to debug...)
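If it is a queue limit, a search along these lines against metrics.log should show which queues are filling up (field names below are from the standard group=queue events, so adjust if yours differ):

    index=_internal source=*metrics.log* group=queue
    | eval fill_pct = round(current_size_kb / max_size_kb * 100, 1)
    | timechart span=5m perc95(fill_pct) by name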

aalanisr26
Path Finder

Did you find an answer for this? We are experiencing the same problem and seeing the same thing.

lguinn2
Legend

64-bit, I assume? How many IOPS can this machine sustain? Have you looked at the Search Job Inspector, which will give you better information about this particular job? Have you looked at the Splunk _internal index for warnings or errors that might indicate performance bottlenecks?
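For the _internal check, a quick starting point is something like this over the time range of one of the slow searches (splunkd.log events carry log_level and component fields):

    index=_internal source=*splunkd.log* (log_level=ERROR OR log_level=WARN)
    | stats count by component, log_level
    | sort - count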

rdschmidt
Explorer

Linux CentOS

lguinn2
Legend

I'd open a support ticket. What OS are you running?
