Getting Data In

How to return job errors when searching via API?

nhaynie_tmo
Engager

When I call the Splunk API via the Python SDK, I get results fine. However, when I run the same query via the UI, errors sometimes show up in the job inspector, for example a warning that results may be partial because of an issue on an indexer. When querying the API, I would like to get the search results AND any errors related to that job, just as I do via the UI. Does anyone know whether this is possible, and have any suggestions on how to retrieve them? Without returning the errors, we may not be able to trust the results, because the job could have had problems we are not aware of.

Here is the code I am using to pull results:

import json
import sys
from time import sleep

import splunklib.client as client
import splunklib.results as results


def run_export(splunk_host, splunk_port, splunk_username, splunk_password, query, job_name, earliest_time, latest_time, directory, chunk):
    # Initialize service
    service = client.connect(host=splunk_host, port=splunk_port,
                             username=splunk_username, password=splunk_password)

    kwargs_params = {"exec_mode": "normal",
                     "earliest_time": earliest_time,
                     "latest_time": latest_time,
                     "count": 0}
    job = service.jobs.create(query, **kwargs_params)

    # A normal search returns the job's SID right away, so we need to poll for completion
    while True:
        while not job.is_ready():
            sleep(0.5)  # avoid busy-waiting while the job spins up
        stats = {"isDone": job["isDone"],
                 "doneProgress": float(job["doneProgress"]) * 100,
                 "scanCount": int(job["scanCount"]),
                 "eventCount": int(job["eventCount"]),
                 "resultCount": int(job["resultCount"])}

        status = ("\r%(doneProgress)03.1f%%   %(scanCount)d scanned   "
                  "%(eventCount)d matched   %(resultCount)d results") % stats

        sys.stdout.write(status + "\n")
        sys.stdout.flush()
        if stats["isDone"] == "1":
            sys.stdout.write("\n\nDone!\n\n")
            break
        sleep(2)

    # Get the results and write each one out as JSON
    i = 0
    for result in results.ResultsReader(job.results(count=0)):
        if not isinstance(result, dict):
            continue  # skip results.Message diagnostics; only serialize events
        i += 1
        print(json.dumps(result))
        path = directory + '/' + job_name + '/' + chunk + '_' + str(i) + '_' + job_name + '.json'
        with open(path, 'w') as f:  # close each file instead of leaking handles
            f.write(json.dumps(result))

    job.cancel()
    sys.stdout.write('\n')
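For what it's worth, the REST response for a job (GET search/jobs/{sid}) carries a "messages" field alongside the progress counters the loop above already reads, and the SDK's Job entity exposes that content as a dict-like object. Here is a minimal sketch of a helper that filters those messages down to the severities worth failing on; the exact shape of the "messages" value (severity mapped to a string or a list of strings) is an assumption you should verify against your SDK version:

```python
def collect_job_messages(content, severities=("error", "fatal", "warn")):
    """Return [(severity, text), ...] for job messages at the given severities.

    `content` is assumed to be the job's REST content dict (e.g. job.content
    in splunklib after job.refresh()), whose "messages" value maps a
    severity name to either a message string or a list of message strings.
    """
    messages = content.get("messages") or {}
    found = []
    for severity, text in messages.items():
        if severity.lower() in severities:
            if isinstance(text, list):
                found.extend((severity, t) for t in text)
            else:
                found.append((severity, text))
    return found
```

In the function above you could call `job.refresh()` once the search is done and run `collect_job_messages(job.content)`; if the list is non-empty, flag the exported results as suspect instead of trusting them silently.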

pmeyerson
Path Finder

I think you are asking for this:

http://docs.splunk.com/Documentation/Splunk/latest/RESTREF/RESTsearch#GET_search.2Fjobs.2F.7Bsearch_...

see: search/jobs/{search_id}/search.log
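Building on that endpoint, here is a hedged sketch of scanning the job's search.log for ERROR/WARN lines. The log-retrieval call is shown only as a commented usage example because it needs a live connection; it assumes splunklib's Job exposes a searchlog() method mirroring GET search/jobs/{sid}/search.log, and that splunkd log lines put the level in the third whitespace-separated field. Check both assumptions against your SDK and Splunk version:

```python
def scan_search_log(log_text, levels=("ERROR", "WARN")):
    """Return the search.log lines whose level field matches one of `levels`.

    Assumes splunkd-style lines: "MM-DD-YYYY HH:MM:SS.mmm LEVEL Component - msg".
    """
    flagged = []
    for line in log_text.splitlines():
        parts = line.split()
        if len(parts) >= 3 and parts[2] in levels:
            flagged.append(line)
    return flagged

# Hypothetical usage once the job is done (requires a live connection):
# log_text = job.searchlog().read().decode("utf-8", errors="replace")
# for line in scan_search_log(log_text):
#     print(line)
```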
