Alerting

How to capture a missing timeframe when reports run every hour

bipin12
New Member

We have a file that is created at the 5th minute of every hour of the day. For example, the file is created at 6:05am, 7:05am, and then at 9:05am. I want to capture that missing 8:05am timestamp so I can create an alert.

My query is something like this:
index=* sourcetype=* /temp/........*.zip

1 Solution

adonio
Ultra Champion

hello there,
How often do you want the alert to run? If every hour, you can configure it to notify you when the count of events for this sourcetype / source is 0.
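For that first option, a minimal sketch of the scheduled search (assuming it runs every hour a few minutes after the file is expected, with the alert set to trigger when count is 0; the index, sourcetype, and source values are placeholders for your own):

index=YOUR_INDEX sourcetype=YOUR_SOURCETYPE source=YOUR_SOURCE_PATH earliest=-60m
| stats count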
If you want it to run at a larger interval, you can capture the timestamp of each occurrence of that particular source and check whether there is another one exactly one hour earlier; if not, that is the missing hour. Maybe something like this:

index = YOUR_INDEX sourcetype = YOUR_SOURCETYPE
| bin span=60m _time
| stats count as event_count by _time
| eval epoch = _time
| delta epoch as previous_hour p=1
| eval maybe_missing_data = if(previous_hour>3600,epoch-3600,"good")
| eval maybe_missing_data_human = strftime(maybe_missing_data,"%Y-%m-%d %H:%M:%S")
| fillnull value=good maybe_missing_data_human

If each log file has a unique path containing the time, you can adapt the above search by rexing the time out of the path and checking whether the gap to the previous file's time is greater than 3600 seconds.
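For example, assuming the hour appears in the source path in a form like 2024-01-15_08 (the rex pattern and field names below are only placeholders for your actual naming convention), something along these lines could work:

index=YOUR_INDEX sourcetype=YOUR_SOURCETYPE
| dedup source
| rex field=source "(?<file_time>\d{4}-\d{2}-\d{2}_\d{2})"
| eval file_epoch=strptime(file_time,"%Y-%m-%d_%H")
| sort 0 file_epoch
| delta file_epoch as gap p=1
| eval maybe_missing_hour=if(gap>3600,strftime(file_epoch-3600,"%Y-%m-%d %H:%M:%S"),"good")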

hope it helps
