I would like to set up a real-time alert that triggers once per hour if no events occur for a search, but not on weekends.
What is the best way to implement this?
Thanks
Chris
--- Update ---
Sorry, I was not very clear on the actual requirement behind the question. I want to monitor an application that logs successful data transactions. Every 30 minutes a keepalive "transaction" is logged. It is acceptable for a single keepalive event to be lost (not logged, not executed, ...), so Splunk should receive at least one message per hour; if not, that is an SLA violation, provided it happens on a business weekday. I initially thought a real-time search was a good idea because of the sliding window it looks at. -> This is not necessary and the wrong approach. I can instead use a historical alert that runs at a regular interval, checks the time delta between the last received event and now(), and alerts if it is greater than 3600s.
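The delta check described above could be sketched as a scheduled search along these lines (the index and sourcetype names are placeholders; adjust to match the actual keepalive events):

```
| tstats latest(_time) as last_event where index=myapp sourcetype=keepalive
| eval delta = now() - last_event
| where delta > 3600
```

Configured as an alert that triggers when the number of results is greater than zero, this fires only when no keepalive has been seen in the past hour. Using tstats keeps the check cheap, since it reads indexed metadata rather than raw events.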
Don't use a realtime alert 🙂
Based on what you've said so far, a standard alert running at the top of the hour on weekday hours (should be easy enough to define in scheduler) is more than adequate.
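For the weekday-only schedule, a cron expression in the alert's schedule settings is one way to do it. A sketch, assuming the check should run at the top of every hour Monday through Friday:

```
0 * * * 1-5
```

If the SLA only applies during business hours, the hour field can be narrowed further, e.g. `0 8-18 * * 1-5`.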
Do you want the alert to trigger once per hour, or whenever there is an event in that hour? What is the actual requirement? Keeping a real-time job running just to fire a single alert would be a waste of resources!