Splunk Search

I need to compare two values which are generated dynamically every hour and get the difference.

sandyIscream
Communicator

Basically my search looks like this

index=something | rex "(?<Filename>\w+), " | rex "(?<Count>\d+)" | eval _time=strftime(_time, "%d %H") | chart values(Count) over Filename by _time limit=0

My query output looks like the below if I run it for two hours:

Filename   1st Hour(Dynamically Generated)   2nd Hour(Dynamically Generated)
ABC        144                               158
BDC        14                                20

I need to get the difference between current hour and the previous hour and display that difference for every hour.
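
To make the goal concrete, something along these lines is roughly the shape of result I'm after. This is only a sketch: the Filename/Count capture names, the sum() aggregation, and the 1-hour span are assumptions based on the search above.

index=something
| rex "(?<Filename>\w+), "
| rex "(?<Count>\d+)"
| bin _time span=1h
| stats sum(Count) as Count by _time Filename
| sort 0 _time
| streamstats current=f window=1 last(Count) as PrevCount by Filename
| eval Difference = Count - PrevCount

That would give one row per Filename per hour, with Difference holding the current hour minus the previous hour.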


Richfez
SplunkTrust

If you need a third field in each row, like

Filename   1st Hour(Dynamically Generated)   2nd Hour(Dynamically Generated)   Difference
ABC        144                               158                               14
BDC        14                                20                                6

Then to the end of your search add

... | eval Difference = '2nd Hour(Dynamically Generated)' - '1st Hour(Dynamically Generated)'

(Note the single quotes: field names containing spaces or parentheses have to be single-quoted on the right-hand side of an eval, otherwise they are treated as string literals.)

If instead you want the difference between the first and second lines you provided (e.g. 144-14 and 158-20), then there are a couple of ways to go about it. I'll provide my favorite.

Use streamstats with a window of 2 and an eval'd difference. Your field names are so long that I'm going to shorten them to make the logic easier to follow, and I'm breaking this into multiple lines for readability. So add your field names back into this (or do your extractions and SPL with the shortened versions and do a rename at the end; a rename sketch follows the example below).

... 
| streamstats window=2 first("1stHour") as Hour1First, last("1stHour") as Hour1Last, 
    first("2ndHour") as Hour2First, last("2ndHour") as Hour2Last
| eval Difference1stHour = Hour1First - Hour1Last, Difference2ndHour = Hour2First - Hour2Last

Obviously the above is wrong, but it gives you a pattern to follow. The math is probably backwards - streamstats might see the events in reverse order, so its view of first and last is reversed. Maybe. But if that's the case, just switch the subtraction around, no big deal.
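
And the rename at the end mentioned earlier could look roughly like this. This is only a sketch: the long column names are copied from your sample output, and whether they match the dynamically generated column names your chart actually produces is an assumption.

... 
| rename "1stHour" as "1st Hour(Dynamically Generated)", "2ndHour" as "2nd Hour(Dynamically Generated)"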


sandyIscream
Communicator

Thanks rich7177 for your time, but this isn't what I was looking for.

Thankfully, though, we are doing it via a shell script, and it looks roughly like the one below.

#!/bin/sh

# Hour (HH) of "1 hour ago", taken from the timestamp in the date output
p=$(date -d "1 hour ago" | awk -F":" '{print $1}' | rev | cut -c1-2 | rev)

if [ -d /somepath..../data ]; then
    echo "Access Start"
    # $access_list (list of filenames to check) is assumed to be set elsewhere
    for filename in $access_list; do
        count=$(ls -lrt /...../data /..../processed | awk '{print $8$9}' | grep "$p:" | grep "$filename" | wc -l)
        echo "$filename,$count,$p"
    done
    echo "Access End"
fi

Then we query the indexed output like this:

index= | rex "(?<Filename>\w+)," | rex "(?<Count>\d+)" | rex "Hour\s-\s(?<Hour>\d+)" | eval Time=strftime(_time,"%b %d %H") | mvexpand Count | eval Count = Count . " - (" . Time . ")" | mvcombine Count | chart values(Count) over Filename by Hour limit=0
