Alerting

How to handle Splunk alerts that run every X minutes and monitor the last X minutes of data when Splunk is down?

sanchitguptaiit
Explorer

We have set up AutoSys log forwarding into Splunk. I created an alert that runs every 30 minutes, looks for events from the last 30 minutes, and sends an email if it sees any “issues”. It works fine, but…

Say the Splunk server or the Splunk forwarder is down for some reason. Once it is back up, my understanding is that it will forward all data that is new since the last time it was able to forward data (please confirm this is correct)?

However, since that data will have older timestamps, the alert won’t be able to catch it. Is there any way to fix this? I tried searching but didn’t find much.

I essentially want to set up an alert that runs periodically, looks at all data that was not processed the last time the alert ran, and sends an email if that data matches the alert’s constraints…
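For reference, the alert is set up roughly like the sketch below (the index, sourcetype, and “issue” match terms are simplified placeholders):

    # savedsearches.conf (simplified) - scheduled every 30 minutes over the last 30 minutes of event time
    [AutoSys issue alert]
    search = index=autosys sourcetype=autosys:log ("FAILURE" OR "TERMINATED")
    cron_schedule = */30 * * * *
    dispatch.earliest_time = -30m@m
    dispatch.latest_time = @m
    enableSched = 1
    alert_type = number of events
    alert_comparator = greater than
    alert_threshold = 0
    action.email = 1
    action.email.to = ops-team@example.com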

Tags (1)
0 Karma
1 Solution

somesoni2
Revered Legend

I've used this method of alerting for data that can be anywhere from 1 day (yesterday) to 90 days old (similar to your case, where the data you're getting right now is for a historical period). Before I go into the details: yes, the Splunk forwarder will send all the logs since the last time it was able to forward, as long as they are still available. (The data may roll over to a new file, say your.log.1 or similar, and if Splunk is monitoring only your.log, the rolled-over file's data will not be sent to Splunk.)
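If rollover is a concern on your side, a monitor stanza with a wildcard (the path below is just an example) will also pick up the rolled-over copies; the forwarder's file tracking normally recognizes a renamed file it has already read and will not re-index it:

    # inputs.conf on the forwarder - example path only, point it at your AutoSys log location
    [monitor:///opt/autosys/logs/your.log*]
    index = autosys
    sourcetype = autosys:log
    disabled = 0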

For your case, you can use the index-time time range modifiers instead of the search-time time range modifiers. The regular search-time modifiers (earliest and latest) are based on the _time field, while the index-time modifiers (_index_earliest and _index_latest) are based on the _indextime field.
(See http://docs.splunk.com/Documentation/Splunk/6.3.3/SearchReference/SearchTimeModifiers for more details.)

So modify your search from

search:    your base search | other commands
           Start time = -Xm@m    Finish time = @m

to this one:

search:    your base search _index_earliest=-Xm@m _index_latest=@m | other commands
           Start time = 0        Finish time = now

This way, Splunk always selects the data ingested in the last X minutes, regardless of whether the events themselves are X minutes or Y days old, and checks your alert condition against it.
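Putting it together, a sketch of what the saved-search definition could look like (the index, sourcetype, and match terms are placeholders, adjust them for your data):

    # savedsearches.conf - sketch only; note the index-time modifiers in the search
    [AutoSys issue alert]
    search = index=autosys sourcetype=autosys:log _index_earliest=-30m@m _index_latest=@m ("FAILURE" OR "TERMINATED")
    cron_schedule = */30 * * * *
    # open up the event-time window so events with old timestamps are not filtered out
    dispatch.earliest_time = 0
    dispatch.latest_time = now
    enableSched = 1
    alert_type = number of events
    alert_comparator = greater than
    alert_threshold = 0
    action.email = 1
    action.email.to = ops-team@example.com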

0 Karma

sanchitguptaiit
Explorer

Thanks a lot! This is exactly what I was looking for!
