Splunk Search

Run a search for every possible 60-minute period in the past 24 hours?

msarro
Builder

Greetings everyone. We are using a search against CDR data to find the 60-minute period of the day with the highest number of calls. Currently, we are using this search:

index=AS sourcetype=AS_CDR earliest=-61m@m latest=-1m@m (host=wdv-as03-01.mydomain.net OR host=wdv-as03-02.mydomain.net)
| stats count AS "Number Of Calls"
| eval "Hour Ending"=strftime(now()-60, "%m/%d/%Y %H:%M")
| table "Hour Ending" "Number Of Calls"

It runs every minute and outputs the number of records from -61 minutes to -1 minutes, along with the hour and minute at which the search ran. For scalability, we would love to use timechart once at the end of the day instead of running the search every minute, but when we give timechart a span of 60m, it snaps the buckets to the top of the hour. The problem: if our busiest hour of the day runs from 1:32 to 2:32, that window gets split between the 1pm-2pm bucket and the 2pm-3pm bucket. We need to be able to see that the busiest period was 1:32 to 2:32.

Any advice would be very much appreciated.


Stephen_Sorkin
Splunk Employee

You can use streamstats to calculate this:

index=AS sourcetype=AS_CDR (host=wdv-as03-01.mydomain.net OR host=wdv-as03-02.mydomain.net) earliest=-1441m@m latest=-1m@m
| bin span=1m _time as minute
| stats count as minute_count by minute
| streamstats window=60 sum(minute_count) as hour_count
| eval hour_ending = strftime(minute + 60, "%m/%d/%Y %H:%M")
| table hour_ending hour_count
| sort - hour_count
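The sliding-window idea behind the streamstats step can be sketched outside of Splunk as well. This is a minimal Python illustration (not Splunk's implementation, and the sample data is made up): given a list of per-minute call counts, it keeps a running sum over the last `window` entries, exactly as `streamstats window=60 sum(minute_count)` does over per-minute rows.

```python
from collections import deque

def busiest_window(minute_counts, window=60):
    """Return (end_index, total) for the `window`-minute span with the most calls.

    minute_counts: per-minute call counts, one entry per minute in time order.
    Like streamstats, windows at the start that have fewer than `window`
    minutes of history are still summed over what exists so far.
    """
    best_end, best_total = None, -1
    running = 0
    buf = deque()
    for i, count in enumerate(minute_counts):
        buf.append(count)
        running += count
        # Drop the minute that just fell out of the 60-minute window.
        if len(buf) > window:
            running -= buf.popleft()
        if running > best_total:
            best_total, best_end = running, i
    return best_end, best_total
```

One caveat that applies to the SPL version too: `stats count by minute` emits no row for minutes with zero calls, so a 60-row streamstats window can span more than 60 minutes of wall-clock time if there are gaps. Filling gaps (e.g. with `makecontinuous minute span=1m` followed by `fillnull minute_count`) keeps the window aligned to real time.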