How do I edit my search to display two time charts in one graph?

tinhuty
Engager

I am using appendcols to put two timecharts into one graph to show their correlation. However, the values fall into different time buckets, which makes the graph wrong.

Here is my search:

host=xxx* "url1" | timechart span=30m p50(execute_time) AS xxx50 |appendcols [search host=yyy* "url2" | timechart span=30m p50(execute_time) AS yyy50]

Here is the result as a table:

_time                 xxx50       yyy50
2016-06-01 08:00:00   5101.3962
2016-06-01 08:30:00   2402.4924   3863
2016-06-01 09:00:00   2402.4924   1196
2016-06-01 09:30:00   7472.6874   958
2016-06-01 10:00:00   2932.9128   5953
2016-06-01 10:30:00   2449.2942   1181
2016-06-01 11:00:00   3088.9188   1140
2016-06-01 11:30:00   3135.7206   895
                                  880

As you can see, the two value columns are misaligned: the value 3863 should line up with 2016-06-01 08:00:00 rather than 2016-06-01 08:30:00, and so on.

What did I do wrong? Or is there a better way to achieve my goal?

Thanks.


tinhuty
Engager

One clarification: host=xxx* and host=yyy* are regular expressions that match multiple host names, such as xxx001, xxx002, yyy111, yyy112, etc.


sundareshr
Legend

Try one of these options:

*Option 1*
(host=xxx* "url1") OR (host=yyy* "url2") | timechart span=30m p50(execute_time) AS exectime by host | rename * AS *50

*Option 2*
host=xxx* "url1" | timechart span=30m p50(execute_time) AS xxx50 | append [search host=yyy* "url2" | timechart span=30m p50(execute_time) AS yyy50] | timechart span=30m first(*) as *


gabriel_vasseur
Contributor

Was option 2 truncated somehow? It ends with "first() as"...


linu1988
Champion

Hi,
Why don't you keep it simple?

 (host=xxx* "url1") AND (host=yyy* "url2") |eval execute_time2=case(host=yyy*,execute_time)| timechart span=30m p50(execute_time) AS xxx50 |appendcols [search host=yyy* "url2" | timechart  span=30m p50(execute_time) AS xxx50, p50(execute_time2) AS yyy50]

Thanks,
L


tinhuty
Engager

Thanks for the idea. I corrected a few spots and the following works for me:

(host=xxx* "url1") OR (host=yyy* "url2") |eval
execute_time2=case(host==yyy11, execute_time)| timechart span=30m p50(execute_time)
AS xxx50, p50(execute_time2) AS yyy50


tinhuty
Engager

It looks like a regular expression match on host doesn't work in the case statement, so I need to spell out all the host names.


gabriel_vasseur
Contributor

To use a regex inside case() you need to use the match() function. Something like this:

...=case( match(host, "^yyy"), ...

You could also use the simpler like() function:

...=case( like(host, "yyy%"), ...

Note that "yyy*" in your initial search is a pattern but not a regular expression. "yyy*" as a regex will match something slightly different!


tinhuty
Engager

Works great, thanks.


gabriel_vasseur
Contributor

I am wondering if specifying explicit earliest=... and latest=... values, with the appropriate snapping, in both searches would help? Something like:

     host=xxx* "url1" earliest=-4h@h latest=@h| timechart span=30m p50(execute_time) AS xxx50 |appendcols [search host=yyy* "url2" earliest=-4h@h latest=@h | timechart span=30m p50(execute_time) AS yyy50]

tinhuty
Engager

This actually worked. Not sure why it didn't work when I first tried it.

However, the time period is hard-coded. It doesn't work with the time picker if I put this in a dashboard.

Thanks.


gabriel_vasseur
Contributor

True! I guess you could change your dashboard to a form and use tokens for earliest and latest, but you're probably better off with one of the other answers contributed on this page...
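
For what it's worth, if the dashboard were converted to a form with a time input (say its token is named time_tok, a made-up name here), the hard-coded time modifiers could be replaced by the token's earliest/latest values, roughly like this:

     host=xxx* "url1" earliest=$time_tok.earliest$ latest=$time_tok.latest$ | timechart span=30m p50(execute_time) AS xxx50 | appendcols [search host=yyy* "url2" earliest=$time_tok.earliest$ latest=$time_tok.latest$ | timechart span=30m p50(execute_time) AS yyy50]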
