All Apps and Add-ons

default database

chiraghchandani
New Member

We need to deploy Splunk into staging and production environments. We will have around 50 GB of static log data to monitor daily. The Splunk documentation states that indexing requires roughly three times the volume of the data being indexed. So my questions are:
Is it possible to reduce the disk space Splunk requires before indexing begins?
How can we limit the number of indexes it creates internally? Is there any documentation that can help?
Thanks


alacercogitatus
SplunkTrust
  1. Splunk is always on, always consuming data, so I'm not sure what you mean by "before execution".
  2. Splunk ships with only three or so internal indexes. The rest are created by you as needed.
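If you want to see which indexes actually exist on your instance and how much disk each one is using, one way (assuming your role can call the REST endpoint) is a search against `/services/data/indexes`:

```
| rest /services/data/indexes
| table title currentDBSizeMB maxTotalDataSizeMB
```

The `title` column is the index name; internal indexes are the ones whose names start with an underscore (e.g. `_internal`, `_audit`).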

I think you may have a misunderstanding of how Splunk consumes data. Please review this and see if it helps: http://docs.splunk.com/Documentation/Splunk/6.0/Indexer/Howindexingworks
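On the disk-space question: you can't shrink the index footprint before the data is indexed, but you can cap how much disk each index is allowed to consume via indexes.conf. A minimal sketch, where the index name `my_index` and the size/retention values are placeholders you should tune to your own needs:

```
# indexes.conf -- example disk caps (values are illustrative)
[my_index]
homePath   = $SPLUNK_DB/my_index/db
coldPath   = $SPLUNK_DB/my_index/colddb
thawedPath = $SPLUNK_DB/my_index/thaweddb
# Cap total disk usage for this index, in MB; once reached, the oldest
# buckets are rolled to frozen (deleted by default).
maxTotalDataSizeMB = 100000
# Roll data to frozen once it is older than 30 days (in seconds).
frozenTimePeriodInSecs = 2592000
```

Between `maxTotalDataSizeMB` (a size cap) and `frozenTimePeriodInSecs` (an age cap), whichever limit is hit first triggers the roll to frozen, so together they bound your disk consumption.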


alacercogitatus
SplunkTrust

Does this answer your question? Please accept if it does. Thanks!


chiraghchandani
New Member

Thanks for your answer. Indeed, Splunk is always on; "before execution" was the wrong phrase. It is always executing something!
