Splunk Search

How to timechart time with SLA

clarksinthehill
Explorer

I'm trying to replicate the following graph (not based on Splunk data) in Splunk.
[image: chart of daily batch end times against an "On Time Batch / Planned Time" SLA line]

In the chart, the "Planned Time" line for an on-time batch is equal to the SLA.

I'm using the following search:
sourcetype=foo job_status=completed | eval SLA="06:00 AM" | eval Date=strftime(_time, "%m-%d-%y") | eval EndTime=strftime(_time, "%H:%M %p") | table Date EndTime SLA

1 Solution

somesoni2
Revered Legend

How about this:

sourcetype=foo job_status=completed | eval SLA=6.0 | eval Date=strftime(_time, "%m-%d-%y") | eval EndTime=strftime(_time, "%H.%M") | table Date EndTime SLA
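The trick in this answer is encoding the end time as a plain number so it can share a numeric Y axis with SLA=6.0: strftime with "%H.%M" turns 05:45 into "05.45", which charts as 5.45. A minimal Python sketch of the same encoding (the helper name is illustrative, not from the thread):

```python
from datetime import datetime

# Sketch of the numeric encoding behind the accepted answer: "%H.%M"
# turns a clock time like 05:45 into the number 5.45, which can be
# plotted on the same numeric Y axis as SLA = 6.0.
def end_time_value(dt):
    return float(dt.strftime("%H.%M"))

print(end_time_value(datetime(2016, 5, 4, 5, 45)))  # 5.45 -> beat the 6.0 SLA
print(end_time_value(datetime(2016, 5, 4, 6, 30)))  # 6.3  -> missed it
```

Note that 5.45 here means 05:45, not 5.45 decimal hours, so the values are fine for charting against the SLA but should not be used for time arithmetic.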



clarksinthehill
Explorer

Wow - much easier. I added a sort on the date. I'll work with this some more.


robettinger
Explorer

Guys, this works great when EndTime and the SLA both fall in the middle of the day, but when the SLA is in the early hours (e.g. 1 AM), the graph shows missed cut-offs every day: an 11 PM EndTime is technically before the SLA, but it plots well after the 1.0 mark. Any ideas?


woodcock
Esteemed Legend

Try this:

sourcetype=foo job_status=completed
| eval date_hourmin = strftime(_time, "%H%M")
| eval date_hourmin_SLA="0600"
| where date_hourmin <= date_hourmin_SLA
| append [| noop | stats count AS info_min_time | addinfo 
   | eval info_min_time=strftime(info_min_time, "%m/%d/%Y") 
   | eval info_max_time=strftime(info_max_time, "%m/%d/%Y") 
   | map search="| gentimes start=$info_min_time$ end=$info_max_time$ increment=1d" 
   | fields starttime
   | rename starttime AS _time
   | eval host="SLA Planned Time" 
   | eval date_hourmin="0600" ] 
| timechart span=1d avg(date_hourmin) BY host
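Two things are going on in this search: the where clause keeps only runs that finished before the cut-off (zero-padded "%H%M" strings like "0545" compare correctly as text), and the append subsearch manufactures one synthetic event per day so timechart draws the SLA as its own flat series. A rough Python equivalent of that second part (names are illustrative):

```python
from datetime import date, timedelta

# Mimic the append/gentimes subsearch: emit one synthetic
# "SLA Planned Time" row per day in the search window, so the
# chart gets a flat SLA series alongside the real events.
def sla_rows(start, end, sla="0600"):
    d = start
    while d <= end:
        yield {"_time": d, "host": "SLA Planned Time", "date_hourmin": sla}
        d += timedelta(days=1)

rows = list(sla_rows(date(2016, 5, 4), date(2016, 5, 6)))
print(len(rows))  # 3 days -> 3 rows
```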

clarksinthehill
Explorer

Still getting the same error message; anything else I can try?


woodcock
Esteemed Legend

Wow, I really borked that one. I tested it this time; try the updated answer.


clarksinthehill
Explorer

Thanks for the reply, I am getting two errors with this. They are:

Error in 'timechart' command: You must specify data field(s) to chart.

[subsearch]: [map]: command="gentimes", invalid literal for int() with base 10: '"1462298400.000"'.
Traceback (most recent call last):
  File "/opt/isv/splunk/etc/apps/search/bin/gentimes.py", line 66, in generateTimestamps
    starttime = getTime(startagostr)
  File "/opt/isv/splunk/etc/apps/search/bin/gentimes.py", line 35, in getTime
    daysago = int(val)
ValueError: invalid literal for int() with base 10: '"1462298400.000"'

woodcock
Esteemed Legend

I updated my original answer; try it again.


clarksinthehill
Explorer

Thanks, getting closer. Working on formatting the Y axis to show 2, 4, 6, 8, and still trying to display the SLA line on the line graph.


woodcock
Esteemed Legend

The SLA line is on the graph; it is the series with host="SLA Planned Time".


clarksinthehill
Explorer

Hmmm, still not displaying. Maybe the subsearch error is causing it. Getting this:

[subsearch]: Unable to run query '| gentimes start = [|noop|stats count AS info_min_time | eval info_min_time="1462370400.000" | eval info_min_time=strftime(info_min_time, %m/%d/%Y)] end = [|noop|stats count AS info_max_time | eval info_max_time="1462975723.000" | eval info_max_time=strftime(info_max_time, %m/%d/%Y)] increment=1d | eval host="SLA Planned Time" | eval date_hourmin="0600"'.