Alerting

Splunk alert for failing to read log files.

travelcsa
Engager

Is there an alert in Splunk, or one we can set up, that will notify us if Splunk hasn't read log files for more than 5 minutes?

1 Solution

travelcsa
Engager

Sanjay,

Thank you for that. Here is my final query. Since we have to monitor 10 production servers, I saved this search and created an alert that expects results from all 10 hosts. If Splunk stops receiving logs from any of them, the alert notifies us. It works perfectly.

host=web10 OR host=web11 OR host=web12 OR host=web13 OR host=web14 OR host=soa20 OR host=soa21 OR host=soa22 OR host=soa23 OR host=soa24 earliest=-5m latest=now | stats count(host) by host

Much appreciated.
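
If it helps to see exactly which hosts have gone quiet, rather than only noticing that fewer than 10 rows came back, a variant along these lines should also work (just a sketch using the same host list: it builds the expected host list with makeresults, left-joins the events seen in the last 5 minutes, and keeps only the hosts with zero events, so the alert can trigger on "number of results is greater than 0"):

| makeresults
| eval host=split("web10,web11,web12,web13,web14,soa20,soa21,soa22,soa23,soa24", ",")
| mvexpand host
| fields host
| join type=left host
    [ search (host=web10 OR host=web11 OR host=web12 OR host=web13 OR host=web14 OR host=soa20 OR host=soa21 OR host=soa22 OR host=soa23 OR host=soa24) earliest=-5m latest=now
    | stats count by host ]
| fillnull value=0 count
| where count=0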


sanjay_shrestha
Contributor

Yes. You can create an alert for a search query like the one below:

host=yourhost earliest=-5m latest=now | stats count
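
One way to keep the alert trigger itself simple (a sketch, assuming the saved search is scheduled to run every 5 minutes over the same 5-minute window) is to push the zero-event check into the search, so the alert fires whenever any result is returned:

host=yourhost earliest=-5m latest=now | stats count | where count=0

With this form, a row only comes back when the host produced no events in the last 5 minutes, so the alert condition can simply be "number of results is greater than 0".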
