Alerting

Use an alert to trigger scripted input

mpatnode
Path Finder

I have an input script that I would like to run based on the results of another search. I also need to pass the results of the triggering search to that script. The script's output creates another event, which I want to correlate with the triggering event.

Perhaps alerts aren't the right mechanism. Essentially, I'm running a script to gather more information on an event, using data in the event as a parameter to the script. Is there a better way to do this?


Jason
Motivator

What we have found is that scripts for scripted alerts must live in $SPLUNK_HOME/bin/scripts, and they aren't passed the search results directly. Instead, the eighth argument ($8) is the path to a file containing the results returned by the search.

So you can write a launcher script that gets called by the alert, reads the file, and passes the appropriate values to the script you really want to run (such as the one in your app that creates additional input).

#!/usr/bin/env python

# Based on a script from http://answers.splunk.com/quesrions/3019/scripted-alert-question

import csv
import gzip
import sys
from subprocess import call

# Enter your script location here. It will be called once per event returned by the
# Splunk search, with field1=value1 field2=value2 appended. Fields starting with _
# are ignored. (End your scheduled search with | fields - _* | fields x y to ensure
# only the fields you want reach your script.)
scriptlocation = "/opt/splunk/etc/apps/demo/bin/demo.sh"


# The rest of this should not have to be configured
def openany(p):
    """Open the results file, transparently handling gzip compression."""
    if p.endswith(".gz"):
        return gzip.open(p, "rt")  # text mode so csv can parse it
    return open(p)

event_count = int(sys.argv[1])  # number of events returned
results_file = sys.argv[8]      # path to the file containing the search results

for row in csv.DictReader(openany(results_file)):
    # Build a command line to call, based on fields from the Splunk output
    my_command = [scriptlocation]
    for col in row:
        if not col.startswith("_"):
            my_command.append(col + "=" + row[col])
    call(my_command)
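To wire this up, the scheduled search's savedsearches.conf stanza would enable the legacy script alert action and point it at the launcher. A minimal sketch (the stanza name, search string, schedule, and filename are all placeholders for your own values):

```ini
# $SPLUNK_HOME/etc/system/local/savedsearches.conf -- names are illustrative
[my trigger search]
search = index=main error | fields - _* | fields host code
is_scheduled = 1
cron_schedule = */5 * * * *
action.script = 1
action.script.filename = launcher.py
```

With this in place, each run of the scheduled search that fires the alert invokes launcher.py with the eight standard arguments, including the results file path in $8.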

vbumgarn
Path Finder

Another solution would be to run searches from your scripted input. If you're using Python, you can use the Splunk modules that ship with Splunk. The trick is setting passAuth = admin in your inputs.conf; a session key is then handed to your script on stdin.

import sys

import splunk.search

# passAuth causes Splunk to write a session key to stdin; strip the newline
sessionKey = sys.stdin.readline().strip()

job = splunk.search.dispatch('search foo', sessionKey=sessionKey)
splunk.search.waitForJob(job, maxtime=240)

if job.count > 0:
    foo = job.events[0]['foo']
else:
    foo = None
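For reference, the inputs.conf stanza enabling this might look like the following sketch (the app name, script path, and interval are placeholders; only the passAuth line is the essential part):

```ini
# $SPLUNK_HOME/etc/apps/demo/local/inputs.conf -- paths are illustrative
[script://$SPLUNK_HOME/etc/apps/demo/bin/myscript.py]
passAuth = admin
interval = 300
disabled = false
```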

gkanapathy
Splunk Employee

You might be able to have your script generate a file and put it into the Splunk batch directory, or send the data to a network port that Splunk is listening on, instead.
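As a sketch of the batch-directory approach: the script writes its events to a temporary name and renames the file only once it is complete, so Splunk never reads a half-written file. The helper below is an illustration, not Splunk API code; the function name is mine.

```python
import os
import tempfile


def write_batch_file(batch_dir, lines):
    """Write events atomically into a directory watched by a Splunk
    [batch://...] input (with move_policy = sinkhole, Splunk indexes
    and then deletes each file)."""
    # Create the file under a temporary .tmp name in the target directory
    fd, tmp_path = tempfile.mkstemp(dir=batch_dir, suffix=".tmp")
    with os.fdopen(fd, "w") as f:
        for line in lines:
            f.write(line + "\n")
    # Rename to the final name only when the contents are complete
    final_path = tmp_path[:-len(".tmp")]
    os.rename(tmp_path, final_path)
    return final_path
```

On a real indexer the target would typically be $SPLUNK_HOME/var/spool/splunk (or your own directory paired with a [batch://...] stanza in inputs.conf); both are assumptions to adapt to your deployment.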
