500MB diskUsage limit damage?

Yarsa
Path Finder

Hi,
I was wondering: how harmful is it to get near that limit in a single search query?
If some of my searches take more than a few minutes to return, should I be questioning the way I built them?

By the way, I am already using summary indexes and scheduled jobs elsewhere.


bmacias84
Champion

@yarsa, I'm not sure of your hardware configuration or how many concurrent searches/users you have. Reaching that limit isn't too harmful beyond slow searches and possibly slower indexing, if your hardware resources already have high utilization.
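
For reference, that 500MB figure usually comes from the per-role search disk quota. A minimal sketch of the relevant settings, assuming authorize.conf on the search head (the values shown are illustrative, not recommendations):

  # authorize.conf (search head)
  [role_admin]
  # Max disk space (MB) that all of a user's search jobs may consume;
  # jobs start to fail once the quota is exceeded.
  srchDiskQuota = 500
  # Max number of concurrent historical searches for users with this role.
  srchJobsQuota = 50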

Troubleshooting Search Quotas

  1. I would install SOS (Splunk on Splunk); it will help determine which searches/users are eating up your resources.
  2. Once you have identified your problem searches, use the "Search Job Inspector" to find where your highest execution costs are (one way to surface the most expensive jobs in bulk is sketched after this list).
  3. You may also want to limit the time range and number of searches each user can run (per-role settings like those in the authorize.conf sketch above).
  4. If you find that your dashboards account for most of your searches, you may want to invest time in combining them into post-process searches (a Simple XML sketch follows this list).
  5. Also look at how you have your indexes broken out.
  6. Evaluate your saved searches. Can some of them be run overnight or during off hours?
  7. Review your disk metrics (disk queue length, reads/writes per second, etc.).
  8. Consider using bloom filters.
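
For step 2, one way to surface the most expensive jobs in bulk (rather than opening each one in the Job Inspector) is a REST search against the search jobs endpoint. A minimal sketch; fields such as diskUsage and runDuration are those exposed by that endpoint:

  | rest /services/search/jobs
  | table title, author, diskUsage, runDuration, eventCount
  | sort - diskUsage
  | head 20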
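
For step 4, post-process searches let several panels reuse one base search instead of each panel scanning the index separately. A minimal Simple XML sketch; the index, sourcetype, and field names are invented for illustration:

  <dashboard>
    <label>Web Overview</label>
    <!-- The base search runs once; keep it a transforming search (stats) -->
    <search id="base">
      <query>index=web sourcetype=access_combined | stats count by status, host</query>
      <earliest>-24h</earliest>
      <latest>now</latest>
    </search>
    <row>
      <panel>
        <chart>
          <!-- Post-process: filters the base results, no second index scan -->
          <search base="base">
            <query>search status=5* | stats sum(count) by host</query>
          </search>
        </chart>
      </panel>
    </row>
  </dashboard>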

I also recommend obtaining a copy of Exploring Splunk.

Search Tips from "Exploring Splunk":

  1. Filter out unneeded fields as soon as possible (see the before/after sketch following this list)
  2. Filter out results before calculations
  3. Turn off Field Discovery.
  4. Use the Advanced Charting view over the Timeline view; the Timeline view has higher rendering costs.
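
A small before/after sketch of tips 1 and 2; the index, sourcetype, and field names are invented for illustration:

  Before (all fields carried along, filtering happens late):

    index=web sourcetype=access_combined
    | stats avg(response_time) by uri
    | search uri="/api/*"

  After (filter first, keep only the fields the calculation needs):

    index=web sourcetype=access_combined uri="/api/*"
    | fields response_time, uri
    | stats avg(response_time) by uri

Running in Fast Mode has a similar effect to tip 3, since it skips automatic field discovery.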

Other things to note: dense searches are faster than sparse searches, rare-term searches have high I/O costs, and low-cardinality searches are also faster.
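
For example (index and field names invented for illustration):

  index=web sourcetype=access_combined status=200   <- dense: matches a large share of events
  index=web session_id=9f3acb12                     <- rare term: bloom filters let buckets be skipped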

Additional Reading:

Bloom filters

Search Job Inspector

Optimize Search Speed

Exploring Splunk

Optimize Splunk for peak performance

Types of searches ("Super-sparse" and "Rare Term")

Post-Process Searches

Hope this helps you.

DaveSavage
Builder

Yarsa - I've never heard of issues with single-search limits; after all, that's what big data is all about. But I would certainly check out your 'expensive searches' and see which ones are machine intensive (Search > Status > Search ... etc.). You can also set the default time period a search covers, changing it from 'All Time' to something more reasonable (a sketch follows below). If you run a lot of ad hoc searches and see some results of interest, finalise the search early, unless you need the full result set... hope this helps!
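
A minimal sketch of changing that default time range, assuming ui-prefs.conf in an app's local directory (the exact stanza may vary by Splunk version):

  # etc/apps/search/local/ui-prefs.conf
  [search]
  dispatch.earliest_time = -24h
  dispatch.latest_time = now
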
Br
D
