Splunk Search

How to create a line graph that displays the latest time for each day?

alanxu
Communicator

Hello,

I extracted the time with the variable TIME. I am trying to create a line graph where it shows the latest time. My search right now is

host=... source=... | timechart max(TIME) by Date

However, the y-axis values are odd: they run from 7,000 to 10,000, while the x-axis correctly shows the dates.

woodcock
Esteemed Legend

Assuming that TIME is actually a number of seconds, you can use tostring to format it as a duration like this:

host=... source=... | eval DURATION=tostring(TIME,"duration") | timechart max(DURATION) by Date
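
For reference, Splunk's tostring(X, "duration") renders a numeric count of seconds as a readable duration string. A Python sketch of that formatting (the helper name to_duration is made up for illustration, not a Splunk function):

```python
def to_duration(seconds):
    # Rough sketch of Splunk's tostring(<number>, "duration"), which
    # renders a count of seconds as an HH:MM:SS-style string.
    h, rem = divmod(int(seconds), 3600)
    m, s = divmod(rem, 60)
    return f"{h:02d}:{m:02d}:{s:02d}"

print(to_duration(10000))  # 02:46:40
```

If TIME really is a count of seconds since midnight, that would also explain the odd 7,000 to 10,000 y-axis range: 10,000 seconds is about 02:46:40.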

alanxu
Communicator

When I run it in Verbose mode, events show up but no visualization appears.

alanxu
Communicator

I don't get any results with your line. When I remove timechart there are events, but once I add timechart max(duration) by date nothing comes up.

somesoni2
Revered Legend

In what format are you extracting the TIME value? Can you run something like this and share the result?

 host=... source=... | head 1 | table TIME

alanxu
Communicator

It's really odd that the table is perfect, but the line graph is always wrong.

somesoni2
Revered Legend

It could be because you're plotting string values on the y-axis. You want to plot the max time of the day in HH:MM:SS format for each day (last 7 days)?

alanxu
Communicator

Oh they are string values?

somesoni2
Revered Legend

I would need your full current query to confirm.

alanxu
Communicator

I can send you a picture of the table that comes up

somesoni2
Revered Legend

That will work too. somesh.soni@gmail.com

alanxu
Communicator

Look at the second email instead

somesoni2
Revered Legend

Yes, that's the workaround: instead of HH:MM:SS in string format, we can convert it to HH.MM (HH dot MM), i.e. a decimal value which can be plotted. I know it would not look good for tables, but it's a decent workaround for graphs. Do you intend to put this in a dashboard?

If this workaround is acceptable to you, I can tell you how to convert your existing TIME field to a decimal value.

alanxu
Communicator

Instead of _time, can't I just use TIME?

somesoni2
Revered Legend

Your TIME is again a string, right? So to convert it to a decimal, you can try something like this

 | eval TIME=tonumber(replace(TIME,"^(\d+):(\d+).*","\1.\2")) | timechart max(TIME) as TIME by Date

somesoni2
Revered Legend

Try a query like this and let me know if TIME and TIME_decimal are consistent (e.g. 02:47:04 will show as 2.47)

your current search giving you the _time, Date, and TIME fields | eval TIME_decimal=tonumber(replace(TIME,"(\d+):(\d+):(\d+)","\1.\2")) | table _time TIME TIME_decimal

somesoni2
Revered Legend

And if this looks correct, try this

your current search giving you the _time, Date, and TIME fields | eval TIME_decimal=tonumber(replace(TIME,"(\d+):(\d+):(\d+)","\1.\2")) | timechart max(TIME_decimal) as TIME by Date
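
The replace-based conversion above can be sanity-checked outside Splunk. A Python sketch of the same transformation (the helper name time_to_decimal is made up for illustration; Python's re.sub plays the role of SPL's replace):

```python
import re

def time_to_decimal(t):
    # Mirrors the SPL replace(TIME, "(\d+):(\d+):(\d+)", "\1.\2"):
    # keep hours and minutes, drop the seconds, then read HH.MM as a number.
    return float(re.sub(r"(\d+):(\d+):(\d+)", r"\1.\2", t))

print(time_to_decimal("02:47:04"))  # 2.47
```

Note these are not decimal hours (02:30 becomes 2.3, not 2.5), but within a day they still order the same way as the original clock times, which is all max() and the chart need.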

alanxu
Communicator

Yeah, into a dashboard. But I feel the values are off. If you look at the email I sent you, the values should be something like 2.3 or 1.5 if anything, but instead I am getting nines.

alanxu
Communicator

sent the picture!

somesoni2
Revered Legend

As suspected, these are string values (probably the output of a command like | eval TIME=strftime(_time,"%H:%M:%S") ). The workaround that you can try would be like this

 | eval TIME=tonumber(strftime(_time ,"%H.%M")) | timechart max(TIME) as TIME by Date 
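
The strftime trick above can be sketched in Python, whose strftime supports the same %H and %M directives (the helper name event_time_decimal is illustrative, not a Splunk function):

```python
from datetime import datetime

def event_time_decimal(ts):
    # Mirrors the SPL tonumber(strftime(_time, "%H.%M")):
    # format the hour and minute as "HH.MM", then read it as a number.
    return float(ts.strftime("%H.%M"))

print(event_time_decimal(datetime(2016, 5, 3, 2, 47, 4)))  # 2.47
```

So each event's timestamp collapses to an hour-dot-minute number, and timechart max() per day picks the latest one.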

alanxu
Communicator

I am trying to have:
the y-axis show a range of times
the x-axis show dates

Then it will plot the latest time for each date.
