This morning I see the dreaded license exceeded message on one of my indexers. Curious as to which host caused it, I run
index=_internal source=*metrics* group=per_host_thruput earliest=-1d@d latest=-0d@d | rename series as host | eval MB=kb/1024 | stats sum(MB) as MB by host
The numbers don't appear to add up, so I'm suspicious. I run a grand total
index=_internal source=*metrics* group=per_host_thruput earliest=-1d@d latest=-0d@d | eval MB=kb/1024 | stats sum(MB) as MB
And yesterday's total is 131 MB below my license limit. The day before was more than 300 MB below. In fact:
index=_internal source=*metrics* group=per_host_thruput earliest=-10d@d latest=-0d@d | eval MB=kb/1024 | timechart span=1d sum(MB)
_time              sum(MB)
4/23/11 12:00 AM      179
4/24/11 12:00 AM      169
4/25/11 12:00 AM      334
4/26/11 12:00 AM      464
4/27/11 12:00 AM      389
4/28/11 12:00 AM      394
4/29/11 12:00 AM      333
4/30/11 12:00 AM      186
5/01/11 12:00 AM      176
5/02/11 12:00 AM      369
I have 2 license violations listed in this timeframe: yesterday and 4/26. This indexer's license is 500 MB. Am I looking at the wrong data? What is the license system basing its count on?
Thanks,
jon
per_host_thruput only lists the top 10 hosts each time a measurement is taken, so it will understate the actual size, especially if you have a lot of hosts that send similar amounts of data. You might want to use per_index_thruput instead, if you have fewer indexes. The actual license data is in license_audit.log or license_usage.log, depending on version.
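For example, a daily total straight from the license data might look something like the search below. This is a sketch: the field names (b for bytes, type=Usage) are what license_usage.log uses in recent 4.x versions and may differ in yours, so check a raw event first.

```
index=_internal source=*license_usage.log type=Usage earliest=-1d@d latest=-0d@d
| eval MB=b/1024/1024
| stats sum(MB) as MB
```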
Thanks, that helps. Using per_index_thruput, I got yesterday's total to match what I got via length(_raw). However, it still shows lower than my license allows for 4/26, a day the license manager says I was over. Anyway, thanks for the clarification.
"earliest=-1d@d latest=-0d@d | eval size=length(_raw) | stats sum(eval(size/(1024*1024))) as MB" yields 520 MB for yesterday. So once again I'm on the lookout for a reasonable search that will show me volume stats: by host, in total, by type, etc. This seems to be a more difficult task than it should be.
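Building on the answer above, one plausible way to get a per-host daily breakdown from the license data itself, rather than from the thruput metrics, is sketched below. The field names (h for host, b for bytes) are assumptions based on the license_usage.log format in recent versions; verify them against your own _internal events before relying on the numbers.

```
index=_internal source=*license_usage.log type=Usage earliest=-10d@d latest=-0d@d
| eval MB=b/1024/1024
| timechart span=1d sum(MB) by h
```

Swapping "by h" for "by idx" or "by st" should give the same breakdown by index or sourcetype.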