Splunk Search

Compiling of Data

albyva
Communicator

Using this set of data:

Time   Host    Type  Packets
12:00  mothra  A     5
12:05  mothra  A     6
12:10  mothra  A     7
12:00  mothra  B     100
12:05  mothra  B     200
12:10  mothra  B     300

I want to combine and calculate the data so that it comes out like this:

Time   Host    Packet_Loss
12:10  mothra  0.03

The problem I'm having is how to gather up all of the Type=A and Type=B events so that I can then run something like | eval packet_loss = typeA / typeB (with the sample above that would be (5+6+7) / (100+200+300) = 18/600 = 0.03). I understand enough of Splunk to gather up this data; my issue is getting the Type=A and Type=B totals combined so I can run the math on them.


Damien_Dallimor
Ultra Champion

There is probably a better search to use if I knew more about your data and use case: how many host types you are dealing with, what time buckets you want to calculate the ratios over, etc.

But anyhow, based purely on the data set above, this search worked. It should at least get you pointed in the right direction. I bucketed the events up into days, so the output is the packet loss per day between those two host types. The streamstats command lets me perform the ratio math on the current and previous event, i.e. host type A's packet sum and host type B's packet sum.
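
If streamstats is new to you, here is a minimal run-anywhere sketch of what window=1 current=f does; the field names row, value, and previous_value are made up purely for illustration:

| makeresults count=5
| streamstats count as row
| eval value = row * 10
| streamstats window=1 current=f first(value) as previous_value
| table row value previous_value

Each event picks up the previous event's value alongside its own, which is exactly how the search below lines up host type A's packet sum against host type B's packet sum.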

index=main sourcetype=foo
| bucket _time span=1d
| stats sum(packets) as total_packets by host, type, _time
| streamstats window=1 global=f current=f first(total_packets) as next_total_packets
| eval packet_loss = next_total_packets / total_packets
| table _time host packet_loss
| tail 1
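
If you would rather not depend on the order of the stats output, another option (just a sketch, reusing the index=main sourcetype=foo placeholders and the host/type/packets field names from the search above) is to pivot the two types into separate columns with conditional sums and then divide:

index=main sourcetype=foo
| bucket _time span=1d
| stats sum(eval(if(type=="A", packets, 0))) as packets_A sum(eval(if(type=="B", packets, 0))) as packets_B by _time, host
| eval packet_loss = round(packets_A / packets_B, 2)
| table _time host packet_loss

On the sample data that gives packets_A=18 and packets_B=600 for the day, so packet_loss comes out as 0.03. Swap span=1d for whatever bucket you actually want the ratio over (span=5m, span=1h, etc.).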

albyva
Communicator

Thanks. I'll give this one a try and see how things pan out. I've never used "bucket" or "streamstats", so I'll have some new toys to play with.

Thanks,
