Splunk Search

How to calculate and display the daily, weekly, and monthly differences of a field?

dbcase
Motivator

Hi,

I have the below search

index=mso_statistics sourcetype=ic_connectivity_5min-too_small stat_name=subscribers 
| eval mso = upper(substr(mso,1,1)).lower(substr(mso,2)) 
| chart sparkline(max(stat_val)) as Trend max(stat_val) as "Subscribers" by mso
| addcoltotals labelfield=Trend label=Total
| eval Subscribers=tostring(Subscribers,"commas")
| rename mso as "MSO - Click for Expanded View"

The dashboard panel that this search creates looks like this: [screenshot]

I'd like to show the daily/weekly/monthly difference of the Subscribers field.

1 Solution

somesoni2
SplunkTrust

Give this a try

For Today vs Yesterday

index=mso_statistics sourcetype=ic_connectivity_5min-too_small stat_name=subscribers earliest=-1d@d
| eval mso = upper(substr(mso,1,1)).lower(substr(mso,2)) 
| eval Period=if(_time>=relative_time(now(),"@d"),"Today","Yesterday") 
| chart max(stat_val) over mso by Period 
| addcoltotals labelfield=mso label=Total 
| eval Difference=Today-Yesterday | foreach Today Yesterday Difference [eval "<<FIELD>>"=tostring('<<FIELD>>',"commas")] 
|rename mso as "MSO - Click for Expanded View"

For This Week vs Last Week

index=mso_statistics sourcetype=ic_connectivity_5min-too_small stat_name=subscribers earliest=-1w@w
| eval mso = upper(substr(mso,1,1)).lower(substr(mso,2)) 
| eval Period=if(_time>=relative_time(now(),"@w"),"ThisWeek","LastWeek") 
| chart max(stat_val) over mso by Period 
| addcoltotals labelfield=mso label=Total 
| eval Difference=ThisWeek-LastWeek| foreach ThisWeek LastWeek Difference [eval "<<FIELD>>"=tostring('<<FIELD>>',"commas")] 
|rename mso as "MSO - Click for Expanded View"

Make similar changes (earliest, eval Period, and the column names after chart) for the monthly comparison.
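For reference, a monthly version following the same pattern might look like this (an untested sketch; note that relative_time uses "mon" for months, so "-1mon@mon" covers last month plus this month to date):

```
index=mso_statistics sourcetype=ic_connectivity_5min-too_small stat_name=subscribers earliest=-1mon@mon
| eval mso = upper(substr(mso,1,1)).lower(substr(mso,2)) 
| eval Period=if(_time>=relative_time(now(),"@mon"),"ThisMonth","LastMonth") 
| chart max(stat_val) over mso by Period 
| addcoltotals labelfield=mso label=Total 
| eval Difference=ThisMonth-LastMonth 
| foreach ThisMonth LastMonth Difference [eval "<<FIELD>>"=tostring('<<FIELD>>',"commas")] 
| rename mso as "MSO - Click for Expanded View"
```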


DalJeanis
SplunkTrust

As long as the user knows what it means, you're golden.

I've posted a potential method for getting your ongoing comparisons week over week over week etc.

As soon as the moderator reviews it, you can visit my madness... THAT code has never been run, but I do queries like that quite a bit.

Test it out step by step with "| head 5" at each point to make sure the data survives the typos.


DalJeanis
SplunkTrust

Okay, first we need to pare down the data to only the essential bits.

index=mso_statistics sourcetype=ic_connectivity_5min-too_small stat_name=subscribers
| eval myday0 = relative_time(_time,"-0d@d")
| eval mso = upper(substr(mso,1,1)).lower(substr(mso,2))
| rename stat_val as subscriberCount
| stats latest(subscriberCount) as subscriberCount by mso myday0

At this point, we have ONLY the mso and the closing subscriber count for each day. You could use max or average instead, if you prefer.

Now comes the interesting part. We build a record for each offset, all in a single multivalue field for each day, then use mvexpand to release the records. For this technique, I generally use 4-5 exclamation points as a standard delimiter since it never appears in my field values or raw data, whereas commas and semis often do.

| eval TheData = mvappend(
"Offset=0!!!! mydate=".myday0."!!!! SC=".subscriberCount,
"Offset=1!!!! mydate=".relative_time(myday0,"+1d@d")."!!!! SC=".subscriberCount,
"Offset=7!!!! mydate=".relative_time(myday0,"+7d@d")."!!!! SC=".subscriberCount,
"Offset=30!!!! mydate=".relative_time(myday0,"+1mon@d")."!!!! SC=".subscriberCount)
| fields mso TheData
| mvexpand TheData

Next we decode the values into separate fields, kill any future-dated records, and then pivot the data into buckets based on the offsets. In this case, instead of just killing future-dated records, you'd want to select only the record with the current date.

| rex field=TheData "^Offset=(?<Offset>[^!]+)!!!! mydate=(?<mydate>[^!]+)!!!! SC=(?<SC>[^!]+)$"
| where mydate < now()
| stats sum(eval(case(Offset=0,SC,true(),0))) as CurrentCount,
sum(eval(case(Offset=1,SC,true(),0))) as OneDayCount,
sum(eval(case(Offset=7,SC,true(),0))) as OneWeekCount,
sum(eval(case(Offset=30,SC,true(),0))) as OneMonthCount
by mso mydate

Then you just have to calculate your deltas and display them.
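For instance, the delta step might look something like this (untested; the field names assume the pivot above):

```
| eval DayDelta = CurrentCount - OneDayCount
| eval WeekDelta = CurrentCount - OneWeekCount
| eval MonthDelta = CurrentCount - OneMonthCount
| table mydate CurrentCount DayDelta WeekDelta MonthDelta
```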

Okay, you also have to account for missing dates and so on. All of this assumes you have continuous data over the time periods you care about. If not, then the method needs some adjustments... and it might be worth writing out a csv file and doing lookups rather than recalculating this data all the time. However, that code would look very different.

Avoid all that by selecting only the current date instead of < now():

| where mydate == relative_time(now(),"@d")



dbcase
Motivator

That. Is. PERFECT!!!! Many thanks Somesoni2!!!


dbcase
Motivator

Well wouldn't you know it..... Once they saw that number they said, "Hey, can you compare not only this week vs last week, but also two weeks ago vs last week?" I'm poking at it but no luck yet. Any ideas?


dbcase
Motivator

My stab at it (not working)

index=mso_statistics sourcetype=ic_connectivity_5min-too_small  stat_name=subscribers earliest=-1w@w|eval mso = upper(substr(mso,1,1)).lower(substr(mso,2)) | eval Period=if(_time>=relative_time(now(),"@w"),"ThisWeek","LastWeek") |eval Period1=if(_time>=relative_time(now(),"-1w@w"),"LastWeek","twow") |chart max(stat_val) over mso by Period |addcoltotals labelfield=mso label=Total 
 | eval Difference=ThisWeek-LastWeek|eval Difference1=LastWeek-twow| foreach ThisWeek LastWeek Difference twow Difference1 [eval "<<FIELD>>"=tostring('<<FIELD>>',"commas")] 
 |rename mso as "MSO - Click for Expanded View"

somesoni2
SplunkTrust

Give this a shot. I changed if to case so it scales.

index=mso_statistics sourcetype=ic_connectivity_5min-too_small  stat_name=subscribers earliest=-2w@w
|eval mso = upper(substr(mso,1,1)).lower(substr(mso,2)) 
| eval Period=case(_time>=relative_time(now(),"@w"),"ThisWeek",_time>=relative_time(now(),"-1w@w"),"LastWeek",1=1,"2WeekAgo") 
|chart max(stat_val) over mso by Period |addcoltotals labelfield=mso label=Total 
| eval Difference_ThisWeek_LastWeek=ThisWeek-LastWeek | eval Difference_LastWeek_2WeekAgo=LastWeek-'2WeekAgo'
| foreach ThisWeek LastWeek "2WeekAgo" Difference* [eval "<<FIELD>>"=tostring('<<FIELD>>',"commas")] 
 |rename mso as "MSO - Click for Expanded View"
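If they later ask for three weeks ago as well, the case approach extends naturally: add a branch and another delta (an untested sketch following the same pattern):

```
index=mso_statistics sourcetype=ic_connectivity_5min-too_small stat_name=subscribers earliest=-3w@w
| eval mso = upper(substr(mso,1,1)).lower(substr(mso,2)) 
| eval Period=case(_time>=relative_time(now(),"@w"),"ThisWeek",_time>=relative_time(now(),"-1w@w"),"LastWeek",_time>=relative_time(now(),"-2w@w"),"2WeekAgo",1=1,"3WeekAgo") 
| chart max(stat_val) over mso by Period 
| addcoltotals labelfield=mso label=Total 
| eval Difference_ThisWeek_LastWeek=ThisWeek-LastWeek 
| eval Difference_LastWeek_2WeekAgo=LastWeek-'2WeekAgo' 
| eval Difference_2WeekAgo_3WeekAgo='2WeekAgo'-'3WeekAgo' 
| foreach ThisWeek LastWeek "2WeekAgo" "3WeekAgo" Difference* [eval "<<FIELD>>"=tostring('<<FIELD>>',"commas")] 
| rename mso as "MSO - Click for Expanded View"
```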

dbcase
Motivator

Dang! Well I just figured out that I have a LOT to learn!

That is perfect!!! THANKS AGAIN!!!


DalJeanis
SplunkTrust

The sparkline would be showing the max stat_val for each segment of time. Max, avg, or latest sparklines will all look more or less the same on that. However, the user is going to think the far right column represents the current value, but it is really showing the max of all the data being shown: the highest point the subscriber count reached anywhere on that trend line.


dbcase
Motivator

That is correct. The user is aware that the value is the highest value, not necessarily the current value. Confusing, I know, but it works for the audience 🙂


somesoni2
SplunkTrust

Should the sparkline still be showing today/this-week/this-month data?


dbcase
Motivator

If I can get this data, the sparkline will be discarded.


somesoni2
SplunkTrust

Great. What would the columns be in the expected output? And what will you be comparing at a time (today vs yesterday, or today vs the same time last week or month)?


dbcase
Motivator

Ideally it would be the delta between today and yesterday, last week and this week, last month and this month.

Good question on the time... I hadn't thought about that. Let's just pick a time and I'll adjust it once I find out what a good time is.


DalJeanis
SplunkTrust

I'm dubious about using "max" to calculate the far right "Subscribers" column. As an example of the problem, for Bhn, it would seem that the 22,315 Subscribers number is for the beginning of the trend line, rather than the current point on it.

The reason this is important to your question is that any deltas you want to display must be based on accurate subscriber numbers as of each particular time frame. First you need that for now, then for a week ago, and so on.

So, to fix THAT problem, I would suggest you replace "max" with "latest" for Subscribers. You might also consider using "avg" for the sparkline, although "max" wouldn't be a terrible representation of what is going on there.


dbcase
Motivator

The data looks like this:

1/9/17
2:35:01.000 PM  
1483994101 twc res prod intelligenthome.timewarnercable.com connectivity subscribers 129041
date_hour = 20 date_mday =  9 date_minute = 35 date_month = january date_second =   1 date_wday =   monday date_year =  2017 date_zone =    0 eventtype =   external-referer    eventtype = visitor-type-referred host =    ic-mm-1465 index =  mso_statistics linecount =  1 punct =   ____..___ source =  /Users/dcostillow/Documents/icHealthCheck/graphite/ic_connectivity_5min.txt sourcetype =    ic_connectivity_5min-too_small splunk_server =  idx5.icontrol.splunkcloud.com timeendpos =  10 timestartpos =   0 unix_category =   all_hosts unix_group =  default
1/9/17
2:35:01.000 PM  
1483994101 rogerssmb smb prod smartbusiness.rogers.com connectivity subscribers 1647
date_hour = 20 date_mday =  9 date_minute = 35 date_month = january date_second =   1 date_wday =   monday date_year =  2017 date_zone =    0 eventtype =   external-referer    eventtype = visitor-type-referred host =    ic-mm-1465 index =  mso_statistics linecount =  1 punct =   ____..___ source =  /Users/dcostillow/Documents/icHealthCheck/graphite/ic_connectivity_5min.txt sourcetype =    ic_connectivity_5min-too_small splunk_server =  idx5.icontrol.splunkcloud.com timeendpos =  10 timestartpos =   0 unix_category =   all_hosts unix_group =  default
1/9/17
2:35:01.000 PM  
1483994101 rogers res prod smarthome.rogers.com connectivity subscribers 80160
date_hour = 20 date_mday =  9 date_minute = 35 date_month = january date_second =   1 date_wday =   monday date_year =  2017 date_zone =    0 eventtype =   external-referer    eventtype = visitor-type-referred host =    ic-mm-1465 index =  mso_statistics linecount =  1 punct =   ____..___ source =  /Users/dcostillow/Documents/icHealthCheck/graphite/ic_connectivity_5min.txt sourcetype =    ic_connectivity_5min-too_small splunk_server =  idx5.icontrol.splunkcloud.com timeendpos =  10 timestartpos =   0 unix_category =   all_hosts unix_group =  default
1/9/17
2:35:01.000 PM  
1483994101 iTSCOMTouchstone res prod portal.intelligent-home.jp connectivity subscribers 4070
date_hour = 20 date_mday =  9 date_minute = 35 date_month = january date_second =   1 date_wday =   monday date_year =  2017 date_zone =    0 eventtype =   external-referer    eventtype = visitor-type-referred host =    ic-mm-1465 index =  mso_statistics linecount =  1 punct =   ____.-.___ source = /Users/dcostillow/Documents/icHealthCheck/graphite/ic_connectivity_5min.txt sourcetype =    ic_connectivity_5min-too_small splunk_server =  idx5.icontrol.splunkcloud.com timeendpos =  10 timestartpos =   0 unix_category =   all_hosts unix_group =  default
1/9/17
2:35:01.000 PM  
1483994101 peqTouchstone res prod manage.mypeq.com connectivity subscribers 301
date_hour = 20 date_mday =  9 date_minute = 35 date_month = january date_second =   1 date_wday =   monday date_year =  2017 date_zone =    0 eventtype =   external-referer    eventtype = visitor-type-referred host =    ic-mm-1465 index =  mso_statistics linecount =  1 punct =   ____..___ source =  /Users/dcostillow/Documents/icHealthCheck/graphite/ic_connectivity_5min.txt sourcetype =    ic_connectivity_5min-too_small splunk_server =  idx5.icontrol.splunkcloud.com timeendpos =  10 timestartpos =   0 unix_category =   all_hosts unix_group =  default
1/9/17
2:35:01.000 PM  
1483994101 comcastTouchstoneB res prod xhpodbportal-po-09.sys.comcast.net connectivity subscribers 34
date_hour = 20 date_mday =  9 date_minute = 35 date_month = january date_second =   1 date_wday =   monday date_year =  2017 date_zone =    0 eventtype =   external-referer    eventtype = visitor-type-referred host =    ic-mm-1465 index =  mso_statistics linecount =  1 punct =   ____--...___ source =   /Users/dcostillow/Documents/icHealthCheck/graphite/ic_connectivity_5min.txt sourcetype =    ic_connectivity_5min-too_small splunk_server =  idx5.icontrol.splunkcloud.com timeendpos =  10 timestartpos =   0 unix_category =   all_hosts unix_group =  default
1/9/17
2:35:01.000 PM  
1483994101 comcastTouchstone res prod appadmin.xfinityhomesecurity.com connectivity subscribers 22700
date_hour = 20 date_mday =  9 date_minute = 35 date_month = january date_second =   1 date_wday =   monday date_year =  2017 date_zone =    0 eventtype =   external-referer    eventtype = visitor-type-referred host =    ic-mm-1465 index =  mso_statistics linecount =  1 punct =   ____..___ source =  /Users/dcostillow/Documents/icHealthCheck/graphite/ic_connectivity_5min.txt sourcetype =    ic_connectivity_5min-too_small splunk_server =  idx3.icontrol.splunkcloud.com timeendpos =  10 timestartpos =   0 unix_category =   all_hosts unix_group =  default
1/9/17
2:35:01.000 PM  
1483994101 comcastB res prod xhpodbportal-po-09.sys.comcast.net connectivity subscribers 238966
date_hour = 20 date_mday =  9 date_minute = 35 date_month = january date_second =   1 date_wday =   monday date_year =  2017 date_zone =    0 eventtype =   external-referer    eventtype = visitor-type-referred host =    ic-mm-1465 index =  mso_statistics linecount =  1 punct =   ____--...___ source =   /Users/dcostillow/Documents/icHealthCheck/graphite/ic_connectivity_5min.txt sourcetype =    ic_connectivity_5min-too_small splunk_server =  idx3.icontrol.splunkcloud.com timeendpos =  10 timestartpos =   0 unix_category =   all_hosts unix_group =  default
1/9/17
2:35:01.000 PM  
1483994101 comcast res prod appadmin.xfinityhomesecurity.com connectivity subscribers 662215
date_hour = 20 date_mday =  9 date_minute = 35 date_month = january date_second =   1 date_wday =   monday date_year =  2017 date_zone =    0 eventtype =   external-referer    eventtype = visitor-type-referred host =    ic-mm-1465 index =  mso_statistics linecount =  1 punct =   ____..___ source =  /Users/dcostillow/Documents/icHealthCheck/graphite/ic_connectivity_5min.txt sourcetype =    ic_connectivity_5min-too_small splunk_server =  idx3.icontrol.splunkcloud.com timeendpos =  10 timestartpos =   0 unix_category =   all_hosts unix_group =  default