Splunk Search

Branching in Splunk? (or another way to combine results into 1 table)

motobeats
Path Finder

I have a search that generates a table with various stats (min, max, percentiles), all by date_hour. Today I ran into an issue because one of my date_hours had only a single "sample" (i.e., we only had hits at 6 AM on 1 day out of 180, so all the stats for that hour were identical).

As a result, I would like to add a column with the result of "stats dc(date)" to indicate the number of samples, but I can't figure out how to pipe one into the other. I can build each table separately, which leads to my question: can I branch and effectively send the result of one pipe into two pipes?

These are the searches I am trying to combine into a single table, but I can't figure out how. Branching was just one thought; other solutions are welcome.

Search 1

(search) | stats dc(date) by date_hour

Search 2

(search)
| bucket _time span=1h
| stats count by _time date_hour
| stats min(count) as Min, p5(count) as "5th %-tile", p25(count) as "25th %-tile", p50(count) as "50th %-tile", p75(count) as "75th %-tile", p95(count) as "95th %-tile", max(count) as "Max" by date_hour
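
For reference, one way two aggregated tables like these could be stitched together on a shared key is the join command. The following is only a rough, untested sketch; index=your_index is a hypothetical stand-in for the same base search as (search), and the percentile fields are trimmed for brevity:

(search)
| bucket _time span=1h
| stats count by _time date_hour
| stats min(count) as Min, max(count) as "Max" by date_hour
| join type=left date_hour
    [ search index=your_index | stats dc(date) as "Number of samples" by date_hour ]

The accepted answer below avoids the subsearch entirely by carrying date through a single pipeline.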
1 Solution

motobeats
Path Finder

OK, really simple answer to this one: branching would be total overkill.

(search)
| bucket _time span=1h
| stats count by _time date_hour date
| stats dc(date) as "Number of samples for this time period", min(count) as Min, p5(count) as "5th %-tile", p25(count) as "25th %-tile", p50(count) as "50th %-tile", p75(count) as "75th %-tile", p95(count) as "95th %-tile", max(count) as "Max" by date_hour

I added date to the by clause of my first stats operation. That kept a date column in the intermediate results, which I could then distinct-count (dc) in the next statement. I realize now that I need to carry any metadata-like fields through the first operation if I want to use them later.

I needed to add this column because a few date_hours had only a handful of samples, which made the results misleading.
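
As a self-contained illustration of the pattern (carry the extra field through the first stats so the second stats can aggregate it), here is a rough sketch on synthetic data; makeresults and all field values below are made up purely to mimic the shape of the real search:

| makeresults count=200
| streamstats count as n
| eval _time = now() - (n % 96) * 3600
| eval date = strftime(_time, "%Y-%m-%d"), date_hour = strftime(_time, "%H")
| bucket _time span=1h
| stats count by _time date_hour date
| stats dc(count) as placeholder by date_hour

Because the 200 synthetic events are spread over 96 hourly slots (4 days' worth of hours), each date_hour here should be fed by 4 distinct dates, so dc(date) in the final stats would come out as 4 while min/max would be computed over the per-hour counts.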


asimagu
Builder

Not sure I understood exactly what you are after; do you want to add to that table the number of days on which events occurred?

Maybe this is what you want:

stats max(A) min(A) dc(date_mday) by date_hour
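
Applied to the hourly-count search from the question, that idea might look like this rough, untested sketch (keep in mind date_mday only carries the day of the month, so across a 180-day window it can distinguish at most 31 distinct days):

(search)
| bucket _time span=1h
| stats count by _time date_hour date_mday
| stats dc(date_mday) as days, min(count) as Min, max(count) as "Max" by date_hour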


motobeats
Path Finder

[{"value":"2013-04-04","count":3760},...


asimagu
Builder

One more question: what is the format of your date field? That is not a field I get in my logs. Do you see days, timestamps, something else? With that info I might be able to see what you are after when doing a distinct count by date_hour.


motobeats
Path Finder

OK, let me give the searches to communicate better. I have this search:

(search) | stats dc(date) by date_hour

and this search:

(search)
| bucket _time span=1h
| stats count by _time date_hour
| stats min(count) as Min, p5(count) as "5th %-tile", p25(count) as "25th %-tile", p50(count) as "50th %-tile", p75(count) as "75th %-tile", p95(count) as "95th %-tile", max(count) as "Max" by date_hour

but I can't figure out how to combine them into a single table.
