Splunk Enterprise Security

In Splunk Enterprise Security, why am I missing alerts due to time gaps?

CodyQ
Explorer

Question: Is there a way to take the index time of an event into account, in addition to the event time, for alerting purposes?

My system failed to catch an alert because the reporting system went down, and when it started forwarding logs again, I missed several potential alerts because the alert search was scoped to a "now minus 1 hour" window for performance reasons. I realize the obvious fix is to widen the search window, but I was wondering whether anyone has other solutions.

Has anyone ever created a retrospective search that can look for events that should have fired, but haven't?

0 Karma
1 Solution

spayneort
Contributor

You can change your search to have a larger time range, then limit it based on the index time by adding something like _index_earliest=-5min@min to your search. Here is an article that covers this:

https://spl.ninja/2017/06/01/its-about-time-to-change-your-correlation-searches-timing-settings/
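As a minimal sketch of this approach (the index, sourcetype, and field names below are placeholders, not from the original post), a correlation search scheduled every 5 minutes can span a wide event-time window while only considering events that were actually indexed since the last run, so late-arriving data still triggers the alert:

```
index=security sourcetype=ids_alert earliest=-24h@h latest=now _index_earliest=-5min@min _index_latest=now
| stats count BY src_ip signature
| where count > 10
```

Because the `_index_earliest`/`_index_latest` filters bound the search by index time, the wide `earliest`/`latest` event-time range costs little: events that came in during an outage are picked up on the first run after forwarding resumes, instead of being silently skipped.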

