I have a real-time view that updates every 120s with the overall revenue picture for the last 2 hours; the query looks like this:
index=### sourcetype=### | eval amt=round(amt/100,2) | timechart sum(amt) span=120s
What I need is an alert for when this sum drops below a certain value, say 500. The biggest problem I'm having is that I can get this information to show up, but I can't figure out how to keep Splunk from also treating the current 120s bucket, which is still filling in, as being below this value. Essentially, if I set this to column view, the most recent column updates live while the 120s of data is still being gathered. If I set this up as an alert, I'm afraid I'll just get spammed with emails, since it evaluates the in-progress bucket before the full 120s has passed. Does anyone know how to get Splunk to ignore the data that is still being actively gathered for the current 120s window?
If I understood right, you'll need to restrict your real-time search to something like this:
index=xxxx earliest=rt-240s latest=rt-120s | eval amt=round(amt/100,2) | stats sum(amt)
and you can alert if that sum(amt) is below the expected value. In other words, you ignore the latest 2 minutes 🙂
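If it helps, here's a rough sketch of what the full alert search could look like (assuming a threshold of 500 and placeholder index/sourcetype names — adjust to your environment). It only returns a row when the completed 2-minute window is under the threshold:

index=xxxx sourcetype=yyyy earliest=rt-240s latest=rt-120s | eval amt=round(amt/100,2) | stats sum(amt) as total | where total < 500

You can then set the alert's trigger condition to fire when the number of results is greater than zero, so you only get an email when the most recent completed window actually drops below 500.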
By the way, if you want to look at the last 2 minutes instead, do it like this:
index=xxxx earliest=rt-120s latest=rt | eval amt=round(amt/100,2) | stats sum(amt)
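And if you'd rather keep the 2-hour timechart from your original search, one variant of the same idea (again just a sketch with placeholder names) is to end the window at rt-120s so the partial bucket is excluded, then test only the latest completed bucket:

index=xxxx sourcetype=yyyy earliest=rt-7200s latest=rt-120s | eval amt=round(amt/100,2) | timechart span=120s sum(amt) as total | tail 1 | where total < 500

Same trigger condition as before: alert when the number of results is greater than zero.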