Showing events that *HAVE NOT* occurred

nickhills
Ultra Champion

Hello all,

I am trying to get my head round how to write a query to detect an absence of events.

I have a number of HTML pages which need to be reviewed on a daily basis, and I want to generate alerts if a page's address is 'missing' from the access logs on a given day.

I have been going round in circles trying to come up with an intelligent plan, but thus far I have created an eventtype for each page which needs to be reviewed.

eg: eventtype=audit_userlogins / audit_filesupdated / audit_AVdetections etc

I can search for these eventtypes (when events have occurred) and output them to a table for display, but how can I tabulate (and later alert on) an absence of these events?

[Note: I know I could create an alert for each eventtype and do (pseudo) "WHERE stats count < 1" for each one, but this will get ugly when I add all my audit conditions, and it still won't let me tabulate the data in a dashboard.]
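To spell that out, each of those per-eventtype alerts would just be something like this (simplified, run over whatever daily window the alert covers):

eventtype=audit_userlogins
| stats count
| where count < 1

...repeated as a separate saved search for every audit eventtype, which is exactly the sprawl I'm hoping to avoid.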

Any ideas or suggestions welcome 🙂

If my comment helps, please give it a thumbs up!

dwaddle
SplunkTrust

Did either of the suggestions below help you out?

dwaddle
SplunkTrust

You could use a lookup table that lists out all of your pages and use that to feed the dashboard a set of pre-populated values. Take a lookup file something like:

page,count
/foo/bar.html,0
/foo/baz.html,0
/evil/wicked/naughty/zoot.html,0
/icky-icky-icky-icky/kapang/zoop/boing.html,0

Then use it in a search something like this:

sourcetype=access_logs 
| stats count by page 
| inputlookup append=t pagelist.csv 
| stats max(count) by page

This way you always have at least one record per page, and its default value is zero. The second stats collapses the list so that, if there were any real results for a given page, that count is shown instead of the zero.
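If you only want the pages with no hits at all (say, to drive an alert), you could tack a filter onto the end. Untested, but something along these lines, renaming max(count) so it is easier to filter on:

sourcetype=access_logs 
| stats count by page 
| inputlookup append=t pagelist.csv 
| stats max(count) as count by page
| where count=0

Whatever comes back is a page that never appeared in the access logs over the search's time range.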

In fact, you could use the lookup twice -- both for helping with your counting and for limiting your original search results, like:

sourcetype=access_logs 
[ | inputlookup pagelist.csv | fields page ]
| stats count by page 
| inputlookup append=t pagelist.csv 
| stats max(count) by page

And now your base search is also filtered only on the pages you are interested in.
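
The same pattern ought to carry over to your eventtypes, too. I have not tested this, and the lookup name eventtypelist.csv below is just a placeholder, but assuming a lookup file shaped like pagelist.csv:

eventtype,count
audit_userlogins,0
audit_filesupdated,0
audit_AVdetections,0

then something like:

eventtype=audit_*
| stats count by eventtype
| inputlookup append=t eventtypelist.csv
| stats max(count) as count by eventtype
| where count=0

should list the audit eventtypes that produced no events, which you can table up in a dashboard or alert on.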

lguinn2
Legend

Create a list of all the pages in a file. Upload it to Splunk as a lookup table. For my example, I will use the following, where each line represents a page:

uri_path
dir1/dir2/page1.html
dir1/dir2/page2.html
dir3/dir2/page2.html

In the search below, I assume that a lookup called page_lookup was created. Now run this search:

| inputlookup page_lookup
| join type=outer uri_path 
    [ search sourcetype=access_combined | stats count by uri_path ]
| fillnull value=0 count
| where count < 1
| sort uri_path

This will give you a sorted list of the pages that did not appear. It should work as long as the number of pages does not exceed 10,000. BTW, I think that there is a hard limit to the number of results that can be returned from a subsearch.
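
And if you want a dashboard table that shows every page rather than just the missing ones, a small (untested) variation on the same search should do it:

| inputlookup page_lookup
| join type=outer uri_path
    [ search sourcetype=access_combined | stats count by uri_path ]
| fillnull value=0 count
| eval status=if(count=0, "MISSING", "OK")
| table uri_path count status
| sort uri_path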
