Getting Data In

Filter logs according to specific criteria before indexing

ricotries
Communicator

I am trying to limit the amount of data that is stored on the indexers; I only want to keep data that would be considered "useful" for our administrators. For example, we have a *nix add-on that runs a script calling the command 'top' every 2 minutes, but the results from this command inflate our daily indexing volume.

Is there a way to pre-filter logs according to specific criteria? For example, could I only store logs where the reported CPU usage is greater than 75%, and discard any logs where that value is lower? If there is a procedure that would help me tailor many other inputs in the same way, to create more compact but still useful events, that would be great.
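
For context, the kind of pre-index filtering being asked about is usually done with props.conf and transforms.conf on the parsing tier (an indexer or heavy forwarder), routing unwanted events to nullQueue. The sketch below is illustrative only: it assumes the scripted input reports under a sourcetype named "top" and that the CPU percentage appears as a plain numeric value in each event, so the stanza names and the regex would need to be adapted to the actual event format.

props.conf:

    [top]
    # Transforms run in order: first route everything to nullQueue,
    # then route events matching the "keep" pattern back to indexQueue.
    TRANSFORMS-filter_top = top_setnull, top_keep_high_cpu

transforms.conf:

    [top_setnull]
    # Match every event and drop it (send to nullQueue).
    REGEX = .
    DEST_KEY = queue
    FORMAT = nullQueue

    [top_keep_high_cpu]
    # Hypothetical pattern: keep events containing a value of 75.0-100.x.
    # Adjust the regex to anchor on where the CPU percentage actually
    # appears in the event, or it may match other numeric fields too.
    REGEX = \b(7[5-9]|[89][0-9]|100)\.\d+\b
    DEST_KEY = queue
    FORMAT = indexQueue

One caveat: events dropped this way never reach an index and cannot be recovered later, so an alternative is routing the low-value events to a separate, shorter-retention index instead of nullQueue.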
