Alerting

Realtime Threshold Alerts

pepper_seattle
Path Finder

I have a realtime view that updates every 120s with the overall revenue picture of the last 2 hours, query looks like this:

index=### sourcetype=### | eval amt=round(amt/100,2) | timechart sum(amt) span=120s

What I need is an alert for when this sum drops below a certain value, let's say 500. The biggest problem I'm having is that I can get this information to show up, but I can't figure out how to keep Splunk from also trying to count the current 120s as below this value. Essentially what will happen is that I'll set this to column view and the most recent column will update live as the 120s value is being gathered. If I set this as an alert I'm afraid I'll just get spammed with emails given that it counts the now information before the 120s has passed. Anyone know how to get Splunk to essentially ignore the information that is being actively gathered over the 120s?
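To see why the live bucket causes false alerts, here's a rough Python sketch of the bucketing (event times and amounts are made up for illustration): the most recent 120s bucket is only partially filled until its window closes, so its sum sits below the threshold even when revenue is healthy.

```python
from collections import defaultdict

def bucket_sums(events, span=120, now=None):
    """Sum amounts per span-second bucket; events is [(epoch_secs, amt_cents)]."""
    sums = defaultdict(float)
    for ts, cents in events:
        sums[ts - ts % span] += round(cents / 100, 2)
    if now is None:
        now = max(ts for ts, _ in events)
    current = now - now % span  # the bucket still being gathered
    complete = {b: s for b, s in sums.items() if b != current}
    return complete, sums.get(current, 0.0)

# three complete buckets plus one partial "live" bucket
events = [(0, 60000), (130, 70000), (250, 80000), (365, 10000)]
complete, partial = bucket_sums(events, now=365)
# every complete bucket is >= 500, but the live bucket has only 100.0 so far
```

Dropping the current bucket before comparing against the threshold is exactly what the accepted approach below does on the Splunk side.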


musskopf
Builder

If I understood right, you'll need to restrict your real-time search to something like this:

index=xxxx earliest=rt-240s latest=rt-120s | eval amt=round(amt/100,2) | stats sum(amt)

and you can alert when that sum(amt) is below the expected value. In other words, you'll ignore the latest 2 minutes 🙂
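Putting it together, the full alert search might look like the following (index name and the 500 threshold are placeholders); you can then set the alert to trigger when the number of results is greater than zero:

```
index=xxxx earliest=rt-240s latest=rt-120s
| eval amt=round(amt/100,2)
| stats sum(amt) as total
| where total < 500
```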


musskopf
Builder

Btw, if you want to look at the last 2 minutes, do it like this:

index=xxxx earliest=rt-120s latest=rt | eval amt=round(amt/100,2) | stats sum(amt)
