Getting Data In

Disable index when it exceeds 2G in one day

seanlon11
Path Finder

We have created a bunch of different indexes for all of our different systems. At some point, these systems will freak out and produce 10-15 GB of data in a matter of minutes. We are still working to fix the freak-out issue.

Is disabling that index when it hits 2G in a day the best way to prevent a license violation?

The only other option I have thought of is to send that forwarder's data to the null-queue. Is that a better option?

This is desirable for a couple of reasons: 1) to avoid filling up the Splunk indexer's drive, and 2) to avoid license violations.

Thanks, Sean

1 Solution

maverick
Splunk Employee

If the behavior you describe above is, indeed, rare and you didn't have to worry about filling up the disk, then I would just let it violate the license limit, because you are allowed a few violations within any 30-day rolling period with only a warning message. And even if search is eventually disabled due to multiple violations, your indexing will never stop, and you can always request a reset license from Splunk support, which you can use to reset the violations indicator back to zero.

However, if filling up the disk is a concern (and you are not feasibly able to allocate or add more disk capacity to handle the overage), then one thing you can try is creating an alert that triggers a script to run when you start to get close to going over your licensed limit, say, at 1.8 GB or more. That script could copy in a nullQueue config and then restart the relevant Splunk instances, or it could stop the Splunk forwarders altogether until you are able to resolve the reason for the overage, or until the "freak out" period is over.
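For illustration only, here is a minimal sketch of the kind of nullQueue routing such a script could copy into place; the source path and the setnull stanza name are hypothetical, and note that nullQueue filtering is applied where the data is parsed (an indexer or heavy forwarder), not on a universal forwarder:

# props.conf -- send events from the noisy source through the setnull transform
[source::/var/log/noisy_app.log]
TRANSFORMS-null = setnull

# transforms.conf -- match every event and route it to the nullQueue (discarded, not indexed)
[setnull]
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue

As noted above, a restart is needed after the config is copied in before the filtering takes effect.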

BTW, you can search your _internal index for the metrics events for each day like this:


index=_internal metrics kb group="per_index_thruput" series="main" startdaysago=1 | eval totalGB = (kb / 1024) / 1024 | timechart span=1d sum(totalGB) as total | search total > 1.8

Run this search every five minutes or so to trigger your alert when the index exceeds 1.8 GB for the day, and have the alert run your script. Also, if you have more than one index, you could create a separate alert trigger for each one by specifying series=index_name_here in the search sample above.
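As a rough sketch of that per-index variant, with a placeholder index name and an explicit earliest=@d time modifier in place of the older startdaysago syntax:

index=_internal metrics kb group="per_index_thruput" series="your_index_name" earliest=@d | eval totalGB = (kb / 1024) / 1024 | timechart span=1d sum(totalGB) as total | search total > 1.8

And, again only as a sketch with hypothetical stanza and script names, the scheduled alert that fires the script could look roughly like this in savedsearches.conf (the older script-style alert action):

# savedsearches.conf -- run the check every five minutes, fire the script if any row comes back
[volume_guard_your_index_name]
search = index=_internal metrics kb group="per_index_thruput" series="your_index_name" earliest=@d | eval totalGB = (kb / 1024) / 1024 | timechart span=1d sum(totalGB) as total | search total > 1.8
enableSched = 1
cron_schedule = */5 * * * *
counttype = number of events
relation = greater than
quantity = 0
action.script = 1
action.script.filename = throttle_noisy_forwarders.sh

The script itself (throttle_noisy_forwarders.sh) would live in $SPLUNK_HOME/bin/scripts on the instance that runs the alert.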


deyeo
Path Finder

The search string doesn't work. I get this instead:

Specified field(s) missing from results: 'totalGB'

