Knowledge Management

90-day average per field using summary indexing and outputlookup

sambit_kabi
Path Finder

I have a requirement to get the average of the IP counts over the last 90 days. I have thought of two approaches to spread the query overhead across the 90-day span.

  1. I will schedule a query to run every day at the end of the day and collect the results in a CSV file using outputlookup, appending the results to the existing file each day. Then I will use this file to get the 90-day average.

  2. I will use summary indexing: schedule a query to run every day at the end of the day and collect the results in a summary index, tagged with the search name, so the results keep accumulating in that index each day. Then I will use this index to get the 90-day average. (Rough sketches of both approaches follow below.)
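
Roughly, the daily collection searches I have in mind look like the sketches below. In both, the index, field, lookup file, and summary index names are placeholders, and the time range assumes the search runs just after midnight over the previous full day.

Approach 1, appending the per-IP daily count to a lookup CSV:

```
index=my_index earliest=-1d@d latest=@d
| stats count AS daily_count BY src_ip
| eval _time=relative_time(now(), "-1d@d")
| outputlookup append=true ip_daily_counts.csv
```

Approach 2, collecting the same daily count into a summary index instead:

```
index=my_index earliest=-1d@d latest=@d
| stats count AS daily_count BY src_ip
| collect index=my_summary source="daily_ip_count"
```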

Question:
Is there a way to restrict the lookup CSV file or the summary index to just the latest 90 days of records? Meaning, I want to purge the rows in the file or the index that are older than 90 days. How can I do it?

1 Solution

codebuilder
SplunkTrust

This seems like a perfect candidate for a data model and data model acceleration.

Create a data model that includes the index, fields, etc. that you require, then accelerate it for 90 days.
Splunk will automatically keep the acceleration up to date (always covering the most recent 90 days) and you'll get quite a performance increase on your searches.

Datamodels:
https://docs.splunk.com/Documentation/Splunk/8.0.2/Knowledge/Aboutdatamodels

Datamodel acceleration:
https://docs.splunk.com/Documentation/Splunk/8.0.1/Knowledge/Acceleratedatamodels
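
For example, once you've built a data model with a root dataset over your index and IP field (the names below are placeholders, not anything from your environment), acceleration is a couple of lines in datamodels.conf, or the equivalent checkbox under Settings > Data models > Edit > Acceleration:

```
# datamodels.conf -- hypothetical data model name
[IP_Activity]
acceleration = 1
acceleration.earliest_time = -90d
```

The 90-day average then becomes a single search against the accelerated summary, along these lines:

```
| tstats summariesonly=true count from datamodel=IP_Activity where earliest=-90d@d by _time span=1d, IP_Activity.src_ip
| stats avg(count) AS avg_daily_count by IP_Activity.src_ip
```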

----
An upvote would be appreciated and Accept Solution if it helps!


sambit_kabi
Path Finder

Cool, thanks for pointing it out. I did it the old school way, in a fairly raw fashion.
So here's what I did:
- Using a scheduled search, I queried the results for that day
- Appended the rows from the CSV (a lookup CSV file already populated with the previous x days of data) to the above query results
- Filtered out any rows whose timestamp is older than 90 days
- Wrote the filtered results back to the CSV file, so the CSV always holds the results for the past 90 days

Then, in the dashboard, I read this CSV and populate the report for the last 90 days.
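
Roughly, the combined scheduled search looks like this (index, field, and lookup names are placeholders for what I actually use):

```
index=my_index earliest=-1d@d latest=@d
| stats count AS daily_count BY src_ip
| eval _time=relative_time(now(), "-1d@d")
| inputlookup append=true ip_daily_counts.csv
| where tonumber(_time) >= relative_time(now(), "-90d@d")
| outputlookup ip_daily_counts.csv
```

The dashboard panel then just reads the file back and averages it:

```
| inputlookup ip_daily_counts.csv
| stats avg(daily_count) AS avg_daily_count_90d BY src_ip
```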

But I will definitely take a look at your approach. Thanks very much. Appreciate it.


codebuilder
SplunkTrust

Glad to help!

----
An upvote would be appreciated and Accept Solution if it helps!