Splunk Search

Calculate average count by hour & day combined

jackreeves
Explorer

Hi,

I want to calculate the average count of "incidents" per hour/day combination (i.e. Mon-07:00, Mon-08:00) over a 12-month period. I am using a date field that is not my timestamp. This is the syntax I have so far; any help would be appreciated.

sourcetype=sourcetype1
| eval log_day=strftime(strptime(DATE_ENTERED,"%Y-%m-%d %H:%M"),"%a-%H:00")
| eval sort_field=case(log_day LIKE "%Mon%",1, log_day LIKE "%Tue%",2, log_day LIKE "%Wed%",3, log_day LIKE "%Thu%",4, log_day LIKE "%Fri%",5, log_day LIKE "%Sat%",6, log_day LIKE "%Sun%",7)
| stats count by log_day sort_field

So essentially, at the moment this search tells me how many incidents I have had in total for Mon-07:00 etc., but I need the average amount per day across the 12-month window. The sort field is only there to ensure the days of the week appear in order (Mon to Sun). (Note: strftime's %a produces three-letter abbreviations like "Tue", so the case() branches should match on those, not "Tues".)
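One way to get the average (a sketch, assuming DATE_ENTERED always parses with the format above; the field names parsed_time, log_date, daily_count and avg_count are illustrative): count incidents per calendar date within each day-hour bucket first, then average those per-date counts with a second stats:

sourcetype=sourcetype1
| eval parsed_time=strptime(DATE_ENTERED,"%Y-%m-%d %H:%M")
| eval log_day=strftime(parsed_time,"%a-%H:00"), log_date=strftime(parsed_time,"%Y-%m-%d")
| stats count as daily_count by log_day log_date
| stats avg(daily_count) as avg_count by log_day

One caveat: dates with no incidents in a given hour contribute no row to the first stats, so the average is taken only over dates that had at least one incident in that hour.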

Thanks


jplumsdaine22
Influencer

It would help if you posted a sample event, as I think you're being needlessly complex with the timestamp wrangling. That said, what I think you should do is create a field for the day-hour and a field for the day, then throw in a couple of eventstats to get the average per day. Here's a run-anywhere example:

| gentimes start=01/01/2017 end=12/30/2017 increment=1h
| eval dow_hour=strftime(starttime,"%a-%H"), dow=strftime(starttime,"%a")
| fields dow dow_hour
| eventstats count as total_events
| eventstats count(eval(match(dow_hour,"00"))) as dows by dow
| eval avg=total_events/dows
| stats count as events_by_dow_hour max(avg) as average_events_by_dow max(dows) as dows max(total_events) as total_events by dow_hour dow

These will more or less all be the same because I've just done one event per hour - but throw that at your data and you should see something relevant. Note that eventstats can get hairy on large event sets - you may be better off chaining multiple stats searches together. If the result set is not too large, it's perfectly safe.
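The chained-stats alternative mentioned above might look like this (a sketch reusing the same gentimes scaffolding; the field names date, hourly_count and avg_events are illustrative):

| gentimes start=01/01/2017 end=12/30/2017 increment=1h
| eval dow_hour=strftime(starttime,"%a-%H"), date=strftime(starttime,"%Y-%m-%d")
| stats count as hourly_count by dow_hour date
| stats avg(hourly_count) as avg_events by dow_hour

This avoids carrying every event through eventstats: the first stats collapses the data to one row per day-hour per date, and the second averages those counts across dates.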


Shan
Builder

@jackreeves

Give the query below a try.

| makeresults
| eval weekdata=mvappend("2018-07-22 13:09:04","2018-07-22 12:09:04","2018-07-22 01:09:04","2018-07-22 20:09:04","2018-07-22 21:09:04","2018-07-22 22:09:04","2018-07-22 23:09:04","2018-07-22 00:09:04","2018-07-23 13:09:04","2018-07-24 13:09:04","2018-07-25 13:09:04","2018-07-26 13:09:04","2018-07-27 13:09:04","2018-07-28 13:09:04","2018-07-29 13:09:04","2018-07-30 13:09:04")
| mvexpand weekdata
| eval week_hours = strftime(strptime(weekdata, "%Y-%m-%d %H"), "%H")
| eval week = strftime(strptime(weekdata, "%Y-%m-%d"), "%A")
| eval week_in_number = strftime(strptime(weekdata, "%Y-%m-%d"), "%w")
| eval data_forcount=case(week_hours=13 AND weekdata="2018-07-22 13:09:04",13, week_hours=12 AND weekdata="2018-07-22 12:09:04",12, week_hours=13,10)
| stats avg(data_forcount) as data_forcount, values(week_in_number) as week_in_number by week, week_hours
| sort week_in_number
| table week, week_hours, data_forcount

Thanks,
Shankarananth
