Alerting

Alert if number of events *by any one host* rises by 100%

malbery
New Member

Hi,

If I limit my search to a specific host, then I know how to trigger an alert if the number of events for that saved search rises by 100%. Naturally, I don't want to create a saved search for each host that I have. Is there a way I can alert on this with a single saved search?

Cheers,
Merlyn

1 Solution

_d_
Splunk Employee

Hi Malbery,

This is not possible with the current version of Splunk. The next major release (4.3) will have a feature that does exactly that - it will be called per-result alerting.

> please upvote and accept answer if you find it useful - thanks!


rtadams89
Contributor

If you are currently running your search every hour, looking at the event counts for the last hour and comparing that to the hour before, you could:

Run your search every hour, looking over the last TWO hours, then use a combination of bucket and stats to get the count of events by host for the first hour and for the second hour. You then have a table (you don't have to literally "| table" it) with one row per host: the events for that host between two hours ago and one hour ago, and the events for that host between one hour ago and now. At the end of your search string, use | where eventCount2 > 2 * eventCount1

This will then return you a list of all hosts that had twice as many events in the last hour, as the hour before. Set the alert to trigger if the number of results is > 0. You will then be alerted anytime one or more hosts has 100% more events this hour than last hour, and in the alert will be provided a list of which hosts met that criteria.
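A minimal sketch of that approach, assuming the search is scheduled hourly. It uses eval and chart rather than bucket and stats to build the per-host table (the result is the same kind of table), and the field names eventCount1/eventCount2 are just illustrative:

    * earliest=-2h latest=now
    | eval period=if(_time < relative_time(now(), "-1h"), "eventCount1", "eventCount2")
    | chart count over host by period
    | fillnull value=0
    | where eventCount2 > 2 * eventCount1

Any rows returned are hosts that more than doubled their event count hour over hour, so the alert condition "number of results > 0" works exactly as described above.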


kristian_kolb
Ultra Champion

Well, I don't really know what you are planning to do, but the following search will list all hosts that have sent 100% more messages in the last minute compared with a one-minute window half an hour ago.

    * earliest=-30m latest=-29m
    | stats count AS then_events by host
    | join type=outer host
        [ search earliest=-1m latest=now
        | stats count AS now_events by host ]
    | fillnull
    | eval change_in_percent=round((now_events/then_events - 1) * 100, 1)
    | where change_in_percent > 100

Unfortunately this uses the count of events as indexed, rather than information from a summary or internal index, which means it will have to go through a lot of data if you're looking at longer time spans. Also, it uses join, which is expensive.

You could give it a try to see if it works for you; it should not require any modifications for a test run.

Hope this helps,

Kristian


