Splunk Enterprise Security

In Splunk Enterprise Security, why am I missing alerts due to time gaps?

CodyQ
Explorer

Question: is there a way to take an event's index time into account, alongside its event time, for alerting purposes?

My system failed to catch an alert: the reporting system went down, and when it started forwarding logs again, I missed several potential alerts because the delayed events fell outside the alert's time range, which was built on a "now minus 1 hour" basis for performance reasons. I realize the obvious fix is to widen the search window, but I was wondering whether anyone has other solutions.
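For context, the alert is essentially a scheduled search of this shape (a hypothetical sketch; the index, sourcetype, event code, and threshold are made-up placeholders):

index=wineventlog sourcetype="WinEventLog:Security" EventCode=4625 earliest=-1h@h latest=now
| stats count BY user
| where count > 5

Because earliest and latest filter on event time (_time), events that are indexed after the hour they belong to has already been searched are never seen by any run of the alert.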

Has anyone ever created a retrospective search that looks for alerts that should have fired, but didn't?

1 Solution

spayneort
Contributor

You can expand your search's time range, then limit the results by index time by adding something like _index_earliest=-5min@min to the search. Here is an article that covers this:

https://spl.ninja/2017/06/01/its-about-time-to-change-your-correlation-searches-timing-settings/
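For example, here is a minimal sketch of the alert above reworked this way (again with hypothetical names; it assumes the search runs on a 5-minute schedule):

index=wineventlog sourcetype="WinEventLog:Security" EventCode=4625
    earliest=-24h@h latest=now
    _index_earliest=-5min@min ``` wide event-time window, narrow index-time window ```
| stats count BY user
| where count > 5

Each 5-minute run picks up whatever was indexed since the previous run, no matter how old the events' timestamps are (up to the 24-hour event-time window), so a forwarder outage delays the alert instead of losing it.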

