Splunk Search

Is there max number of events per time frame in splunk?

suhprano
Path Finder

Is there a performance hit when indexing a large number of events per minute? I have custom logs configured, and my searches are really slow to load. In a 24-hour window, I'm indexing well above 7 million events. Since these are custom logs with a custom sourcetype, I assume only the default fields are indexed: source, sourcetype, index, and _time.

Is there anything I can do to optimize my search time? Should I be indexing some extracted fields at index time?


jrodman
Splunk Employee

I'm not sure whether you're saying searches are slow or indexing is slow. If it's searches, you generally want to look at the specific search. If it's indexing, configuring timestamping and event boundary detection gives the biggest boosts.
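For reference, explicit timestamp and line-breaking settings in props.conf look something like the sketch below. The stanza name and timestamp format are illustrative assumptions, not taken from the original poster's setup:

```ini
# props.conf -- minimal sketch for a hypothetical custom sourcetype.
# The stanza name and TIME_FORMAT are assumptions; adjust to your data.
[my_custom_sourcetype]
# Tell Splunk exactly where and how to read the timestamp, so it
# doesn't fall back to slower timestamp auto-detection.
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%d %H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD = 19
# One event per line: disable line merging and break on newlines,
# avoiding the more expensive multi-line merge heuristics.
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
```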

It's possible you're hitting machine constraints. Standard system analysis (CPU, memory, disk I/O) is very helpful for defining the problem.

Note that Splunk indexes all events by keyword, so there's usually no need to add special indexed fields; for most searches and most data patterns, they only hurt.


kevintelford
Path Finder

How are you doing field extraction? If you have many individual regular expressions, for example, that will make searching much less performant. Also, is your data a good candidate for summary indexing? Those are two things I've found helpful in the past. You may also be writing inefficient queries, which could be causing long query times. If you just search for a single word, does the query go much faster?
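To sketch the summary-indexing idea (the index, sourcetype, and summary names here are hypothetical assumptions): a scheduled search precomputes statistics into a summary index using an si- command, e.g.

```
index=my_custom_index sourcetype=my_custom_sourcetype
| sistats count BY host
```

and the reporting search then runs the corresponding regular command over the much smaller summary instead of scanning the raw events:

```
index=summary source="my_hourly_summary_search"
| stats count BY host
```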

Also, I believe the max number of events per second is now set to 1 million.


netwrkr
Communicator

Kevin makes good points above. I will also add that having Splunk read in gzip'd files is pretty resource intensive. gzip decompression is not multi-threaded, and depending on the number of CPUs in your server and the number of log files it is simultaneously uncompressing, that alone could chew up a lot of your processors.


netwrkr
Communicator

A couple of quick thoughts: are you reading in compressed (.gz) files? Is your server possibly undersized for the amount of data you're ingesting?

Have you looked at the "Status" charts available within the search UI? They contain a lot of useful information that may help you track down what's making things slow.


suhprano
Path Finder

The raw index data is in .gz. The server has 430 GB of disk; I configured this index separately from the main index (200 GB) and capped it at 80 GB. Is that undersized? I also have other streams indexing into the main index. I've tested both in the main index and in the separate index, and it's still slow.

I've checked the status numbers, but there's nothing obvious narrowing the issue down for me. Thanks for your feedback so far.
