Hello,
I've created an alert that is supposed to trigger when the event count is less than 25. Sometimes it triggers correctly, but other times it reports 0 events in the last hour (real-time).
When I click on View Results, it displays 0 events, but if I run the same query manually, there are plenty of events in the same time range.
Any idea how to fix these false-positive alerts?
Thanks
After scratching my head for a couple of hours, I decided to give up on real-time search.
Now I run a scheduled search every hour, and I added the time modifier earliest=-1h to my search query.
The alerts are working fine now.
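For anyone who hits the same issue, here is a minimal sketch of what the fixed alert could look like (the index, sourcetype, and threshold are placeholders; substitute your own values):
index=<your_index> sourcetype=<your_sourcetype> earliest=-1h latest=now | stats count
Schedule it hourly (cron 0 * * * *) and use a custom trigger condition of search count < 25, so the alert fires when the hourly event count drops below the threshold.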
Be sure to click Accept on your answer to close the question.
To somesoni2's point: when you say "(real time)", do you mean you are running this as a real-time search, or just that you run the search over the last hour's worth of events?
Assuming you aren't running this as a real-time search (which I wouldn't), there could be a delay in data ingestion. Try running something like the following:
index=<where the data is> sourcetype=<what the data is> | eval delta = _indextime - _time | timechart avg(delta) perc90(delta) max(delta)
Since Splunk places events in their correct chronological location, it might be that events are coming in after the search has run but before you manually go searching for them.
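If that search shows a consistent ingestion lag, one common workaround (my suggestion, not something confirmed in this thread) is to shift the scheduled search window back so late-arriving events are still counted, for example with a 10-minute cushion:
index=<where the data is> sourcetype=<what the data is> earliest=-70m@m latest=-10m@m | stats count
This still covers a full hour of events but gives the data 10 minutes to be indexed before the alert evaluates it.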
I'm running this as a real-time search that checks events over the past hour.
Consider converting it from a real-time search to a historical scheduled search (real-time searches are inefficient).