Getting Data In

Filter logs according to specific criteria before indexing

ricotries
Communicator

I am trying to limit the amount of data stored on our indexers; I only want to keep data that would be considered "useful" for our administrators. For example, we have a *nix add-on that runs a script calling the 'top' command every 2 minutes, and the results from this command blow up our daily indexing volume.

Is there a way to pre-filter logs according to specific criteria before they are indexed? For example, could I store only events where the reported CPU usage is greater than 75%, and discard any events where the value is below that threshold? If there is a general procedure I could use to trim many other results into more compact but useful events, that would be great.
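
From what I have read so far, parsing-time routing with props.conf and transforms.conf looks like it might be the mechanism: send everything from the 'top' sourcetype to the nullQueue, then route only the events I care about back to the indexQueue. Below is a rough sketch of what I have in mind. The sourcetype name 'top' and the CPU regex are just my assumptions (the regex only illustrates matching values from 75 up to 100 and would need to be anchored to the actual CPU column), and I understand these settings would have to live on the indexers or a heavy forwarder rather than a universal forwarder.

props.conf:

[top]
# Assumption: the *nix add-on writes the output of top.sh with sourcetype 'top'
TRANSFORMS-filter_top = top_setnull, top_keep_high_cpu

transforms.conf:

# Drop every 'top' event by default
[top_setnull]
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue

# Route events whose CPU value looks like 75.x-100.x back to the index queue
# (illustrative regex only; it matches that number anywhere in the event)
[top_keep_high_cpu]
REGEX = \b(7[5-9]|[89][0-9]|100)\.\d\b
DEST_KEY = queue
FORMAT = indexQueue

My understanding is that the transforms listed in TRANSFORMS-filter_top are applied in order, so the nullQueue rule runs first and the second rule re-routes the matching events back for indexing. Am I on the right track, or is there a better way to do this?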
