Splunk Search

Timechart Volume per hour same day over several weeks...Seeking alternative way.

nqjpm
Path Finder

This is a working search that charts Volume per hour for the same day (the current day) over multiple weeks. The time range in the timepicker is set to Today. I was experimenting with timewrap to solve this, but | timewrap 1week wasn't doing what I needed.
I am trying to avoid adding more appends, as my search is becoming long and expensive, and I am being asked for more panels covering additional weeks, a month, and YTD.

index=foo | fields incidentId _time | dedup incidentId | eval ReportKey="1. Current day"
    | append [search index=foo earliest=-7d@d latest=-6d@d | fields incidentId _time
        | eval _time=_time+86400*7 | dedup incidentId | eval ReportKey="2. Last week"]
    | append [search index=foo earliest=-14d@d latest=-13d@d | fields incidentId _time
        | eval _time=_time+86400*7*2 | dedup incidentId | eval ReportKey="3. Two weeks ago"]
    | append [search index=foo earliest=-21d@d latest=-20d@d | fields incidentId _time
        | eval _time=_time+86400*7*3 | dedup incidentId | eval ReportKey="4. Three weeks ago"]
    | timechart span=1h count(incidentId) by ReportKey

Thanks in advance

Here is the failed attempt; it removes the hourly component from the line-chart visualization:

index=foo |fields incidentId _time
| timechart span=1h dc(incidentId)
| timewrap 1week

I don't yet have enough karma to post an image, unfortunately. It takes me out of the hourly view and puts the timechart into a weekly view; changing timewrap to 1d shows every day. Not sure what I am missing here.
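For reference, one variant I have been sketching (untested) keeps the hourly timechart and then trims the wrapped output down to today's rows, so each earlier week's series lines up hour-by-hour against the same weekday. It assumes the timepicker covers the whole comparison range (e.g. Last 4 weeks):

index=foo
| dedup incidentId
| timechart span=1h dc(incidentId)
| timewrap 1week
| where _time >= relative_time(now(), "@d")

After timewrap, the rows span the most recent week with one series per earlier week, so the final where should restore the single-day hourly view.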


DalJeanis
SplunkTrust

You don't need all the appends. Compare the performance of this alternative; it might be better, or it might not. Use the entire time range that you want to compare, e.g. earliest=-22d@d for three weeks back.

index=foo
| fields incidentId _time 
| dedup incidentId 
| bin _time as day span=1d
| appendpipe [| stats max(day) as maxdate 
    | eval desiredDay=strftime(maxdate,"%w") 
    ]
| eventstats max(maxdate) as maxdate max(desiredDay) as desiredDay 
| eval dayofweek=strftime(_time,"%w") 
| where dayofweek = desiredDay

This part sets up the report fields...

| eval ReportKey=round((maxdate-day)/604800,0)
| eval _time=_time + 604800*ReportKey
| eval ReportKey=if(ReportKey=0,"0.  Current day",ReportKey.". ".ReportKey." Weeks Ago")

This part shows the report...

 | timechart span=1h count(incidentId) by ReportKey
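As a quick sanity check of the offset arithmetic (604800 seconds per week), here is a standalone makeresults sketch; the event day is a made-up example sitting exactly two weeks before maxdate:

| makeresults
| eval maxdate=relative_time(now(), "@d")
| eval day=maxdate-604800*2
| eval ReportKey=round((maxdate-day)/604800,0)
| eval shifted_time=day+604800*ReportKey
| eval label=if(ReportKey=0, "0.  Current day", ReportKey.". ".ReportKey." Weeks Ago")

ReportKey comes out as 2, shifted_time lands on maxdate, and label reads "2. 2 Weeks Ago", which is exactly the shift the report relies on.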

DalJeanis
SplunkTrust

Follow @woodcock's search, which is more efficient, then add these lines to produce the report.

| bin _time as day span=1d
| eventstats max(day) as maxdate
| eval ReportKey=round((maxdate-day)/604800,0)
| eval _time=_time + 604800*ReportKey
| eval ReportKey=if(ReportKey=0,"0.  Current day",ReportKey.". ".ReportKey." Weeks Ago")
| timechart span=1h count(incidentId) by ReportKey

woodcock
Esteemed Legend

Let's say you need 10 weeks' worth: use 10w in the relative_time line and set the timepicker to Last 10 weeks:

index=foo
    [| gentimes 
        [| makeresults |  eval start=strftime(relative_time(now(), "-10w@d"), "%m/%d/%Y")] 
        increment=1d 
    | rename COMMENT1of2 AS "We use '1d' + 'dropme' instead of '1w' because we need the start/end to span 1 day, not 1 week."
    | rename COMMENT2of2 AS "Also, on some versions, the 'increment=1w' option does not work at all and does '1d' instead."
    | streamstats count AS _serial
    | eval dropme = (_serial + 6)%7
    | search dropme = 0
    | table starttime endtime
    | rename starttime AS time>, endtime AS time<
    | format
    | rex field=search mode=sed "s/time/_time/g s/\"//g"
    ]
| dedup incidentId
| fields _time
| eval ReportKey=strftime(_time, "%m/%d/%y")
| timechart span=1h count BY ReportKey

I am not sure what you were doing after that, but you can take it from there.

DalJeanis
SplunkTrust

I took the extreme liberty of reformatting to indent the subsearches for the unwary. At first read, it seemed like more @woodcock magic.

The above code, inside the outer square brackets, creates a list of time ranges that ends up being formatted like

( ( ( _time>=epochtimedaystartweek1 ) AND ( _time<=epochtimedayendweek1 ) ) OR
  ( ( _time>=epochtimedaystartweek2 ) AND ( _time<=epochtimedayendweek2 ) ) OR
    ... )

where those values are the start and end epoch times created by the inside search. Those times are then used as parameters by the outside search to limit the date/times that are scanned by the search.
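If you want to see exactly what that subsearch hands back before it is spliced into the outer search, you can run the bracketed portion on its own and inspect the generated search field (a sketch of the inner search, shortened to 2 weeks for readability):

| gentimes
    [| makeresults | eval start=strftime(relative_time(now(), "-2w@d"), "%m/%d/%Y")]
    increment=1d
| streamstats count AS _serial
| eval dropme=(_serial + 6)%7
| search dropme=0
| table starttime endtime
| rename starttime AS time>, endtime AS time<
| format
| rex field=search mode=sed "s/time/_time/g s/\"//g"

The single result's search field then contains the OR'd ( _time>=... AND _time<=... ) clauses in the shape shown above.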


Add these lines to complete the report.

| bin _time as day span=1d
| eventstats max(day) as maxdate
| eval ReportKey=round((maxdate-day)/604800,0) 
| eval _time=_time + 604800*ReportKey
| eval ReportKey=if(ReportKey=0,"0.  Current day",ReportKey.". ".ReportKey." Weeks Ago")
| timechart span=1h count(incidentId) by ReportKey

woodcock
Esteemed Legend

It sounds like you approve, @daljeanis, you re-editing, non-upvoting interloper! Thanks for expanding on my answer and wrapping up the last bit. I use gentimes like this frequently; it is a nice spell to have in one's bag of magic tricks.


FrankVl
Ultra Champion

Can you perhaps share the search using timewrap that you tried but that didn't give the desired results?


nqjpm
Path Finder

Updated. I should have thought to add that.
