
CPU-bound search: seems related to KV processing

bcavagnolo
Explorer

Ahoy. We've been experiencing a search performance problem and I'm having trouble figuring out what to do about it. I've been following the advice and techniques outlined here:

http://wiki.splunk.com/Community:PerformanceTroubleshooting

The very simple search I am performing is this:


splunk> sourcetype="mysource"

...for the last 60 minutes. The search trudges along showing "0 matching events" for about 10 minutes before any results appear. During this period, the CPU running the search shoots to 100% usage, iostat reports low tps (around 10), and vmstat shows no swap activity, so I'm pretty sure this is CPU bound.

The search log shows ample time consumed by "SearchOperator:kv", and all of the default kv extractions are being performed; the "access-extractions" system default extraction alone seems to take about 5 minutes. The job inspector reports practically all of the time in "dispatch.evaluate.search". (I found it odd that the job inspector didn't identify command.search.kv, considering the contents of the search log.) Note that according to the high-level summary on the search page, "mysource" has about 276 million events; I'm not sure whether that counts as a lot. Any ideas?
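
One way to test whether search-time kv extraction is really the bottleneck is to temporarily turn it off for the sourcetype and compare timings. A minimal sketch, assuming you can edit props.conf on the search head (the stanza name "mysource" is a stand-in for the real sourcetype, and this disables automatic field discovery, so it's for testing only):

# $SPLUNK_HOME/etc/system/local/props.conf
[mysource]
# Disable automatic search-time key-value extraction for this sourcetype,
# then re-run the search and compare the duration in the job inspector.
KV_MODE = none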

UPDATE:

So, many of the time-consuming "SearchOperator:kv" lines in the search.log file seem to be coming from stanzas in the config files that are not related to my custom sourcetype. For example, the access-extractions transform is referenced by a bunch of the default sourcetype stanzas, but not by my custom sourcetype's stanza. The following search:


| metadata type=sourcetypes index="main"

...shows a single result that is my custom sourcetype.
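
To see every props stanza the instance has actually loaded from config (as opposed to the sourcetypes present in the data), btool can enumerate them. A sketch, assuming CLI access to the Splunk server:

# List every props.conf stanza name Splunk has merged across all apps.
$SPLUNK_HOME/bin/splunk btool props list | grep '^\['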

1 Solution

bcavagnolo
Explorer

For the record, I think we figured out the root cause of this issue: we had a crazy number of sourcetypes defined. We ended up rebuilding the index on a fresh Splunk install. All is well now.
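
If "rebuilding the index" is the route you take, the CLI can wipe the indexed events directly as a lighter alternative to a fresh install. A sketch, assuming the index is "main" and the original data is still available to re-ingest (this permanently deletes the indexed events, so be sure):

$SPLUNK_HOME/bin/splunk stop
$SPLUNK_HOME/bin/splunk clean eventdata -index main
$SPLUNK_HOME/bin/splunk start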


bcavagnolo
Explorer

Yes. Practically all of the events in Splunk are of the sourcetype that I specify. And the results do eventually appear; it just takes 10 minutes.
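
As a sanity check that the raw event scan is cheap and the time really goes to extraction, a count that bypasses field extraction entirely can help. A sketch, assuming a Splunk version that supports tstats and an index named "main":

| tstats count where index=main sourcetype=mysource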


martin_mueller
SplunkTrust

Do you have representative sample events?
