Alerting

Why is my alert not triggering at the exact time?

chandana204
Communicator

Hi,

For the past week I have been looking into my alert jobs, and I found that the alerts are triggering 4 minutes before the actual trigger time. Because of this time difference I missed a lot of alerts. May I know why Splunk Enterprise is not triggering at the actual scheduled time?

I am attaching screenshots for better understanding.
The image below shows my alert's configured trigger time:
[screenshot: configured trigger time]

The image below shows the time the alert actually triggered. Based on the trigger condition it should have triggered today at 12:00 AM, but it triggered yesterday at 11:56 PM.
[screenshot: actual triggered time]

If anyone knows the reason behind this issue, please explain.

Thanks in Advance,
Chandana


DalJeanis
SplunkTrust

I believe you are misinterpreting that time stamp. Just because the search was "created" shortly before the scheduled time does not mean that it ran early.

If you add this code to the end of the alert, you will see the actual time range that is covered.

| append [| makeresults | addinfo
   | eval Time = strftime(_time,"%Y-%m-%d %H:%M:%S")
   | eval Info_min_time=strftime(info_min_time,"%Y-%m-%d %H:%M:%S") 
   | eval Info_max_time=strftime(info_max_time,"%Y-%m-%d %H:%M:%S") 
   | eval Info_search_time=strftime(info_search_time,"%Y-%m-%d %H:%M:%S") 
   | eval Now=strftime(now(),"%Y-%m-%d %H:%M:%S")
   | table Time Now Info_min_time Info_max_time Info_search_time
   ]

info_min_time and info_max_time are the time bounds for events selected by the search. Info_search_time is the time the search was created. Now is the time the search started. Time is the time the makeresults command generated its output event, which is roughly a second after now().
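
If you also want to confirm when the scheduler actually kicked off the alert, the scheduler log records both the scheduled time and the dispatch time of each run. Here is a minimal sketch, assuming the alert's saved-search name is "My Weekly Alert" (substitute your own) and that you can read the _internal index; the field names are the ones typically found in scheduler.log:

index=_internal source=*scheduler.log* savedsearch_name="My Weekly Alert"
| eval scheduled=strftime(scheduled_time,"%Y-%m-%d %H:%M:%S")
| eval dispatched=strftime(dispatch_time,"%Y-%m-%d %H:%M:%S")
| table _time savedsearch_name scheduled dispatched run_time status
| sort - _time

Comparing the scheduled and dispatched columns should tell you whether the job really starts early, or whether the timestamp you are looking at is just being rendered in a different time zone.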


More likely, the problem is that events can take a few seconds (or more) to be indexed. That is why the normal practice is to schedule such a job to run a few minutes after the hour, rather than exactly on the hour.
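
For reference, here is a sketch of what that adjustment could look like in savedsearches.conf for a weekly Monday alert. The stanza name is a placeholder, and the time window assumes you want to cover the full week that ends at Monday midnight:

# Sketch only: stanza name and time window are assumptions, adjust to your alert
[My Weekly Alert]
enableSched = 1
# run at 12:05 AM every Monday instead of exactly at midnight,
# giving late-arriving events a few minutes to be indexed
cron_schedule = 5 0 * * 1
# still search the intended window: the week ending Monday 00:00
dispatch.earliest_time = -7d@w1
dispatch.latest_time = @w1
# trigger when the search returns at least one result
counttype = number of events
relation = greater than
quantity = 0

The offset only gives indexing time to catch up; the dispatch window keeps the search covering exactly the period you care about.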

chandana204
Communicator

Thanks for your response. My alerts are triggering before the hour, not after it. I'll use your query to verify the actual time range that was searched.

Thanks,
Chandana

CarsonZa
Contributor

We need more information:
What's the alert condition?
How are you saving that file?


pruthvikrishnap
Contributor

Hi Chandana,
Are there any other trigger conditions mentioned?


chandana204
Communicator

Nope, it triggers based only on the number of results.


chandana204
Communicator

It's a scheduled alert. It triggers every Monday at 12:00 AM, and the output file is saved as a PDF.

I have been using this alert for the past one and a half months. It used to trigger at the exact specified time, but since last week it has been triggering early, as described above.
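
Since the behaviour only changed last week, one way to double-check that the schedule itself has not been modified is to query the saved search's configuration over REST. A minimal sketch, assuming the alert is named "My Weekly Alert" (a placeholder) and the search is run on the search head:

| rest /servicesNS/-/-/saved/searches splunk_server=local
| search title="My Weekly Alert"
| table title cron_schedule dispatch.earliest_time dispatch.latest_time next_scheduled_time

If cron_schedule and next_scheduled_time still look right, the early timestamp is more likely a display or clock-skew question than a change to the schedule.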
