I have a lookup file that contains a list of around 500 hosts, as follows:
host
A
B
C
d
Now, how can I write a query to identify the top 10 hosts whose event count spiked compared to yesterday's count? Something like the table below, or any better way of presenting this, would be helpful.
host Yesterday Today
D 2.2 GB 8 GB
H 1.1 GB 3 GB
Y 0.5 GB 1.4 GB
You can do this for count, for sum(bytes), or for whatever metric you want; the code will work pretty much the same. (The flag field is just a sanity check: it is set when the time range only covers a single day.)

your search that gets all of yesterday's and today's events, with field host
| bin _time span=1d
| stats count by host _time
| eventstats min(_time) as yesterday max(_time) as today
| eval flag=case(yesterday=today,"you didn't pick two days")
| stats sum(eval(case(_time=yesterday,count))) as Yesterday, sum(eval(case(_time=today,count))) as Today, sum(eval(if(_time=yesterday,-count,count))) as Change by host
| sort 10 - Change
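Since your question mentions the lookup file and shows sizes in GB, here is a sketch of a variant that restricts the search to the hosts in the lookup and reports sizes instead of counts. The lookup name host_lookup.csv is an assumption (substitute your own), and event size is approximated with len(_raw), which may differ from licensed/indexed bytes:

index=your_index [| inputlookup host_lookup.csv | fields host]
| bin _time span=1d
| stats sum(eval(len(_raw))) as bytes by host _time
| eventstats min(_time) as yesterday max(_time) as today
| stats sum(eval(if(_time=yesterday,bytes,null()))) as YesterdayBytes, sum(eval(if(_time=today,bytes,null()))) as TodayBytes by host
| eval Change=TodayBytes-YesterdayBytes
| eval Yesterday=round(YesterdayBytes/1024/1024/1024,1)." GB", Today=round(TodayBytes/1024/1024/1024,1)." GB"
| sort 10 - Change
| table host Yesterday Today

You could also sort by percentage change instead, e.g. | eval PctChange=round(100*Change/YesterdayBytes,1) followed by | sort 10 - PctChange, which highlights small hosts that spiked sharply rather than large hosts that grew modestly.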
Can you add more context about your lookup file and/or the indexed data you are trying to create a search for?