All Apps and Add-ons

Send alert when indexing volume limit exceeded?

I-Man
Communicator

Is there a way to send an alert if I exceed my license limit? Does Splunk generate a log when this happens?

Thanks in advance!

1 Solution

joshd
Builder

You can use the following search to see the amount currently indexed by all non-internal indexes over a 1 day period:

index=_internal metrics kb group="per_index_thruput" series!=_* | eval totalGB = (kb / 1024) / 1024 | timechart span=1d sum(totalGB) as total

Then you can simply create a saved search that runs every X minutes or hours and alerts when a custom condition is met. That custom condition would be if total > 10, meaning it would alert if the total indexed is greater than 10 GB. Just adjust the value to meet your needs.

I've summarized some useful usage statistics here (along with links to Splunk's docs):

http://www.joshd.ca/content/splunk-usage-statistic-searches
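The saved-search setup described above could be captured in savedsearches.conf along these lines. This is a sketch only: the stanza name, schedule, threshold, and email address are placeholders, and the exact alert settings vary by Splunk version.

```
[License volume alert]
# Run the volume search once an hour (placeholder schedule)
enableSched = 1
cron_schedule = 0 * * * *
search = index=_internal metrics kb group="per_index_thruput" series!=_* | eval totalGB = (kb / 1024) / 1024 | timechart span=1d sum(totalGB) as total
# Custom condition: trigger when the secondary search returns results,
# i.e. when total exceeds 10 GB
alert_condition = search total > 10
# Placeholder email action
action.email = 1
action.email.to = admin@example.com
```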


glitchcowboy
Path Finder

How could I get this per index? I'd like total on each index (series)


DaveSavage
Builder

You may find everything you need for index analysis in the Splunk on Splunk (SoS) app... it's rather good. Download it, then run 'Metrics' from the menu and check out the second bar chart down, which breaks usage out by index.


joshd
Builder

Glad to hear it's figured out. Sorry, those weren't typos; I should have just used the code tag, since the wiki messed up the formatting of the search. It should have been index=_internal and series!=_*, which eliminates all the internal indexes; those are not charged against your license usage, so you do not want them counted. Then you shouldn't need the earliest parameter, because the timechart span is set.
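With the code tag applied, the search as originally intended reads:

```
index=_internal metrics kb group="per_index_thruput" series!=_* | eval totalGB = (kb / 1024) / 1024 | timechart span=1d sum(totalGB) as total
```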



jimcall
Engager

It may be because you are using a basic conditional alert rather than an advanced conditional alert.

See http://docs.splunk.com/Documentation/Splunk/latest/User/SchedulingSavedSearches#Define_alerts_that_a... for more details.
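In other words, the custom-condition field expects a secondary search run against the results, not an if expression. Something along these lines (syntax assumed from the scheduling docs linked above, not verified here):

```
search total > 3
```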


afields
New Member

I'm running the exact search mentioned above by I-Man. Running it manually works perfectly. Splunk just doesn't seem to like the alert setting.


joshd
Builder

What is the search you are running? What happens when you run it manually (what does it return)?


afields
New Member

My Splunk is not liking the custom alert condition "if total > 3". What am I doing wrong?


I-Man
Communicator

There were a couple of typos in your search, but it works like this:

index=_internal metrics kb group="per_index_thruput" series=* earliest=@d | eval totalGB = (kb/1024)/1024 | timechart span=1d sum(totalGB) as total

Thanks Man!
