
How can I be alerted when the average of a field split by another field rises or drops by a certain percent?

lsheridan
Splunk Employee

I've got some performance data and I want to be alerted when the avg(total_requests) split by uri rises or drops by 10%. For now, you can aggregate a few hours' worth of data in the averaging window.

sideview
SplunkTrust

I think this question may need more detail. However, if I make a couple of assumptions I can attempt an answer. (I find total_requests to be a somewhat confusing field name, so I'm going to call it 'request_count'.)

Assuming you are starting from a search that looks like:

<some search> | stats avg(request_count) by uri

which would give you output like:

uri             avg(request_count)
some/uri        12.3
some/other/uri  41.4
....
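
(Purely for illustration, with a made-up index and sourcetype, <some search> could be as simple as index=web sourcetype=access_combined, so the whole thing would look like:

index=web sourcetype=access_combined | stats avg(request_count) by uri

assuming request_count is already an extracted field on those events.)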

And you want your alert to fire if any of the rows rises by 10%...

And assuming you want the 'rises by 10%' to be a comparison between two time ranges, like today versus yesterday, then here's a way to do it.

<some search> | eval day = if(_time > now()-86400, "today", "yesterday") | chart avg(request_count) over uri by day

That conditional eval command puts a field called 'day' onto each event: 'today' for events from the last 24 hours, and 'yesterday' for anything older than that.
(NOTE: time-range-wise you probably want to run this search over yesterday plus today, using -1d@d on the earliest side and +1d@d on the latest side.)
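
If you'd rather split on calendar days than on a rolling 24-hour window, one variant of the same idea (just a sketch, using eval's relative_time function to snap to midnight) would be:

<some search> | eval day = if(_time >= relative_time(now(), "@d"), "today", "yesterday") | chart avg(request_count) over uri by day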

Anyway, then the chart command after the eval gives you a table that looks like this:

uri             today      yesterday
some/uri        12.3       10.4
some/other/uri  41.4       10.5
....

Finally, throw a where command on the end of that and you can filter the results down to only the URIs that actually had a 10% increase today over yesterday, like so:

<some search> | eval day = if(_time > now()-86400, "today", "yesterday") | chart avg(request_count) over uri by day | where today > (1.1 * yesterday)

And there you go. If that search returns any results, then that's bad, and you want to email those results to somebody.
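
And since the original question asks about drops as well as rises, a small variation on the same where clause will fire in either direction:

<some search> | eval day = if(_time > now()-86400, "today", "yesterday") | chart avg(request_count) over uri by day | where today > (1.1 * yesterday) OR today < (0.9 * yesterday)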

(If you'd prefer that the results in the email were the full list of URIs, today vs yesterday, you could take that whole where clause off the end and instead put it in the 'custom alerting condition' when you set up the alert.)
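
(For example, the saved search behind the alert could be just

<some search> | eval day = if(_time > now()-86400, "today", "yesterday") | chart avg(request_count) over uri by day

with where today > (1.1 * yesterday) entered as the custom alerting condition, so the email still shows every URI's today vs yesterday numbers.)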
