I want to create a search that looks over the last 30 days of vulnerability events and retains only the events from the most recent scan of each host. I can think of a few ways to do that, but I'm wondering if there is a more efficient way that I'm missing:

1. Create a lookup table with max(_time) and dest, then use a lookup plus a where clause to retain only events where _time=max(_time).
2. Use a join command to join the output of a "stats max(_time) by dest" subsearch to the original events, and use where to filter them.
3. Use a multisearch with one search for the vulnerability events and another that does the "stats max(_time) by dest", bind them together with a transaction, and then filter with where.
4. I could probably also do it with the map command.

What is the best route to solve this problem?
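For what it's worth, option 2 might be sketched roughly like this (the sourcetype and field names are assumptions, and join subsearches are subject to Splunk's subsearch result-count and runtime limits, which is one reason to look for alternatives):

```
sourcetype=nessus:scan earliest=-30d
| join dest [
    search sourcetype=nessus:scan earliest=-30d
    | stats max(_time) AS lastScan BY dest ]
| where _time=lastScan
```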
Try something like this:

```
base search for last 30 days
| eval day=relative_time(_time, "@d")
| eventstats max(_time) AS latest BY host
| where day=relative_time(latest, "@d")
```

This keeps every event that falls on the same calendar day as the most recent event for that host.
I would do something like this:

```
My Big Broad Search Here
| append [
    My Scan Search Here
    | dedup _time host
    | rename _time AS lastScanTime
    | eval DropMe="YES" ]
| eventstats first(lastScanTime) AS lastScanTime BY host
| where _time >= lastScanTime AND isnull(DropMe)
```
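The reason first() works in the template above: dedup keeps the first occurrence in search order, which is descending _time, so the appended events stay in most-recent-first order and the first non-null lastScanTime that eventstats sees per host is the latest scan time. Filled in with hypothetical sourcetype names, it might look like this:

```
sourcetype=my:vuln:data earliest=-30d
| append [
    search sourcetype=my:scan:log earliest=-30d
    | dedup _time host
    | rename _time AS lastScanTime
    | eval DropMe="YES" ]
| eventstats first(lastScanTime) AS lastScanTime BY host
| where _time >= lastScanTime AND isnull(DropMe)
```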
Thanks, woodcock... I'm curious, though: why use the append command at all? This search seems to get me what I want:

```
sourcetype=nessus:scan
| eventstats max(_time) AS lastScan BY dest
| where _time=lastScan
```
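One caveat worth noting (my own hedged sketch, not from the thread): _time=lastScan keeps only events stamped at the exact maximum timestamp, so if events within a single scan vary by a few seconds per host, some of that scan's events get dropped. Bucketing by day, as in the earlier answer, is more forgiving:

```
sourcetype=nessus:scan earliest=-30d
| eval scanDay=relative_time(_time, "@d")
| eventstats max(scanDay) AS lastScanDay BY dest
| where scanDay=lastScanDay
```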
I was assuming that the qualifying data (My Scan Search) is in a different dataset than the search data (My Big Broad Search). If all the data is in the same place, then the append is not necessary. If you don't need to join (your word, not mine), then why did you mention it?