
Aggregate and make buckets based on a field and the span mentioned

tikoonikhil
Explorer

I have two fields in my Splunk data called "impact_time" and "incident_name". I want to aggregate the incident names based on "impact_time" and a span that I specify. For example,
if I give the span as 1d, it should group the incidents under each date:

2016-06-28 a,b,c,d
2016-06-29 g,r,w,d
2016-06-30 f,e,r,t

If I give the span as 1h, it should aggregate by hour from the impact_time:

2016-06-28 03:00:00 a,b,c,d
2016-06-29 04:00:00 g,r,w,d
2016-06-30 05:00:00 f,e,r,t

I tried using the bucket command, but I am not able to aggregate the incident names for each time.
Any help would be appreciated.

1 Solution

javiergn
Super Champion

In principle, what you are trying to do can be done with a combination of bucket and stats, but keep one thing in mind: is impact_time a valid time from a Splunk point of view, that is, is it an epoch time or just a string?

If the former, you should be able to do it simply with:

your base search
| bucket impact_time span=1d
| stats values(incident_name) as incident_name by impact_time

If the latter, you need to convert impact_time to a valid epoch time that Splunk recognises by using strptime (the datetime syntax I used is just an example):

your base search
| eval impact_time_epoch = strptime(impact_time, "%Y-%m-%d %H:%M:%S")
| bucket impact_time_epoch span=1d
| stats values(incident_name) as incident_name by impact_time_epoch

In both cases, if you want to display your impact_time in a nice format, you can use the fieldformat command after stats. Look at the examples in the documentation.
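For example (just a sketch building on the second search above; the strftime format string is only illustrative and should match however you want the time displayed):

your base search
| eval impact_time_epoch = strptime(impact_time, "%Y-%m-%d %H:%M:%S")
| bucket impact_time_epoch span=1d
| stats values(incident_name) as incident_name by impact_time_epoch
| fieldformat impact_time_epoch = strftime(impact_time_epoch, "%Y-%m-%d %H:%M:%S")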

Hope that makes sense.

Thanks,
J

tikoonikhil
Explorer

@javiergn
It seems to be getting the data in an aggregated way now. Thanks a lot for the input. But there is one problem: when I keep the aggregation span as "hour" and an hour has no data, it doesn't show anything for that "impact_time". That is, the corresponding impact_time is completely missing from the data table. Is there any way it can show an empty "incident_name" field for an "impact_time" that has no data? For example:

2016-06-28 03:00:00 a,b,c,d
2016-06-29 04:00:00 g,r,w,d
2016-06-30 05:00:00 f,e,r,t
2016-06-30 08:00:00 f,e,r,t

Here the times 06:00:00 and 07:00:00 have no data, so those rows are missing from the output completely instead of showing this:

2016-06-28 03:00:00 a,b,c,d
2016-06-29 04:00:00 g,r,w,d
2016-06-30 05:00:00 f,e,r,t
2016-06-30 06:00:00
2016-06-30 07:00:00
2016-06-30 08:00:00 f,e,r,t

Thanks in advance.
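One possible approach (a sketch, untested, and assuming impact_time parses with the same format string as above) is to move impact_time into _time and use timechart instead of bucket + stats, since timechart pads empty time buckets across the search range by default:

your base search
| eval _time = strptime(impact_time, "%Y-%m-%d %H:%M:%S")
| timechart span=1h values(incident_name) as incident_name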
