Is there a way to send an alert if I exceed my license limit? Does Splunk generate a log when this happens?
Thanks in advance!
You can use the following search to see the amount currently indexed by all non-internal indexes over a one-day period:
index=_internal metrics kb group="per_index_thruput" series!=_* | eval totalGB = (kb / 1024) / 1024 | timechart span=1d sum(totalGB) as total
Then you can simply create a saved search that runs every X minutes or hours and alerts when a custom condition is met. That custom condition would be total > 10, meaning it would alert if the total indexed is greater than 10 GB. Just adjust the value to meet your needs.
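If you prefer to set this up in config rather than the UI, here is a minimal savedsearches.conf sketch of such a scheduled alert; the stanza name, cron schedule, threshold, and email recipient are all placeholders to adjust for your environment:

# savedsearches.conf -- minimal sketch; all values below are placeholders
[license_volume_alert]
enableSched = 1
# run every 30 minutes
cron_schedule = */30 * * * *
search = index=_internal metrics kb group="per_index_thruput" series!=_* | eval totalGB = (kb / 1024) / 1024 | timechart span=1d sum(totalGB) as total
# advanced condition: the alert fires only if this secondary search returns results
alert_condition = search total > 10
action.email = 1
action.email.to = admin@example.com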
I've summarized some useful usage statistics here (along with links to Splunk's docs):
How could I get this per index? I'd like the total for each index (series).
You may find everything you need for index analysis in the Splunk on Splunk (SoS) app; it's rather good. Download it, then run 'Metrics' from the menu and check out the second bar chart down, which is broken out by index.
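If you'd rather get per-index totals from a raw search instead of the app, splitting the earlier timechart by the series field should do it (a sketch based on the search above):

index=_internal metrics kb group="per_index_thruput" series!=_* | eval totalGB = (kb / 1024) / 1024 | timechart span=1d sum(totalGB) by series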
Glad to hear it's figured out. Sorry, those weren't typos; I should have used the code tag, since the wiki messed up the formatting of the search. It should have been index=_internal and series!=_*, which eliminates all internal indexes; those are not charged against your license usage, so you don't want them counted. You also shouldn't need the earliest parameter, because the timechart span is already set.
It may be because you are using a basic conditional alert rather than an advanced conditional alert.
See http://docs.splunk.com/Documentation/Splunk/latest/User/SchedulingSavedSearches#Define_alerts_that_a... for more details.
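With an advanced conditional alert, the condition is itself a secondary search evaluated against the results of the scheduled search, so instead of the literal text "if total > 3" you would enter something like (sketch):

search total > 3

The alert then fires whenever that secondary search returns at least one result.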
I'm running the exact search mentioned above by I-Man. Running it manually works perfectly; Splunk just doesn't seem to like the alert setting.
What is the search you are running? What happens when you run it manually (what does it return)?
My Splunk is not liking the custom alert condition "if total > 3". What am I doing wrong?
There were a couple of typos in your search, but it works like this:
index=_internal metrics kb group="per_index_thruput" series=* earliest=@d | eval totalGB = (kb/1024)/1024 | timechart span=1d sum(totalGB) as total
Thanks Man!