We have an alert set up in Splunk that runs on a 5-minute interval and triggers an email alert action if the matching result count is greater than 0.
One day we had matching events in our app log, but the alert did not trigger.
When we checked the internal logs, we found that the alert query ran without any issue, but its result count was 0, so the alert did not fire.
However, when we ran the same query directly in Splunk for the same time window, we found the matching events.
I compared the event timestamps in the app log against the index times, and the maximum indexing delay was 2 seconds.
Can anyone give some insight into why the alert query did not pick up the matching events on that particular day?
I have earliest=-2d and latest=now with the cron schedule
*/5 * * * *
but still no email alert.
Has anyone had success with this cron schedule?
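For reference, here is a minimal savedsearches.conf sketch of the setup as described above. The stanza name and the search string itself are placeholders (your actual search will differ); the scheduling and trigger settings match what was posted:

```
[app_error_alert]
# Placeholder query -- substitute your real search
search = index=app_logs "ERROR"
cron_schedule = */5 * * * *
dispatch.earliest_time = -2d
dispatch.latest_time = now
# Trigger when the result count is greater than 0
counttype = number of events
relation = greater than
quantity = 0
action.email = 1
```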
My expectation is that the alert ran in between the time the event happened and the time it was indexed.
Try earliest=-6m latest=-1m, or earliest=-310s latest=-10s, or some similar combination. You could also have some overlap if you wanted, with earliest=-310s latest=now, depending on how time-sensitive the alert is.
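To make the race concrete, here is a small sketch (hypothetical timestamps) of why a search with latest=now can permanently miss an event that is indexed a couple of seconds after the search dispatches, and why the delayed window above catches it:

```python
# Simulate the indexing-delay race for a scheduled search.
# All times are in seconds; values below are hypothetical.

WINDOW = 300          # 5-minute search window (earliest=-5m latest=now)
event_time = 999      # _time of the event, 1 s before the run at t=1000
index_time = 1001     # event becomes searchable 2 s later (indexing delay)

def run_search(run_at, earliest_off, latest_off):
    """Return True if a search dispatched at run_at finds the event."""
    earliest = run_at + earliest_off
    latest = run_at + latest_off
    in_window = earliest <= event_time < latest   # _time must be in the window
    searchable = index_time <= run_at             # event must already be indexed
    return in_window and searchable

# Run at t=1000, latest=now: the event's _time is in the window,
# but it is not yet indexed -- missed.
assert run_search(1000, -WINDOW, 0) is False
# Next run at t=1300: the event is indexed, but its _time now falls
# before the new window's earliest -- missed again, forever.
assert run_search(1300, -WINDOW, 0) is False
# A delayed window (earliest=-310s latest=-10s) at t=1300 catches it.
assert run_search(1300, -310, -10) is True
print("delayed window catches the event")
```

The delayed latest gives the indexer time to finish before each window closes, which is the whole point of the earliest=-310s latest=-10s suggestion.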