Timed out waiting for status to become available

David
Splunk Employee

I have a particular use case that requires very long subsearches, potentially running for 15 minutes. Of course, my subsearch normally auto-finalizes after 60 seconds. I modified my limits.conf as follows:

[subsearch]
# maximum number of results to return from a subsearch
maxout = 100000
# maximum number of seconds to run a subsearch before finalizing
maxtime = 1500
# time to cache a given subsearch's results
ttl = 75000
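
As a sanity check that splunkd actually picked up these values, something like the sketch below can query the REST properties endpoint; the host, port, and credentials are placeholders, and this is just an illustration rather than an official procedure.

import requests

BASE = "https://localhost:8089"   # splunkd management port (assumed default)
AUTH = ("admin", "changeme")      # placeholder credentials

# Ask splunkd for the effective [subsearch] values it is actually using.
for key in ("maxout", "maxtime", "ttl"):
    resp = requests.get(f"{BASE}/services/properties/limits/subsearch/{key}",
                        auth=AUTH, verify=False)
    resp.raise_for_status()
    print(f"{key} = {resp.text.strip()}")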

But now, whenever I run my search, it times out with the following error:

Timed out waiting for status to become available on job=1305920559.2

When I remove the lines from $SPLUNK_HOME/etc/system/local/limits.conf, it goes back to autofinalizing.

Any ideas? I'm running 4.2.0.



Further detail from the logs:

2011-05-20 15:42:33,256 INFO    [4dd6c429401c37450] utility:59 - name=javascript, class=Splunk.Session, language=en-US, product=Gecko, appVersion=5.0 (Windows NT 6.1; WOW64) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.696.65 Safari/534.24, platform=Win32, vendor=Google Inc., appCodeName=Mozilla, appName=Netscape, productSub=20030107, userAgent=Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.696.65 Safari/534.24, availTop=0, pixelDepth=32, availHeight=994, height=1024, width=1280, colorDepth=32, availWidth=1280, availLeft=0, documentURL=http://myServer:8000/en-US/app/MyApp/flashtimeline, documentReferrer=http://MyServer:8000/en-US/account/login?return_to=%2Fen US%2Fapp%2FMyApp%2Fflashtimeline, flash=10.2.154, Splunk.Session.START_EVENT fired @Fri May 20 2011
12:42:32 GMT-0700 (Pacific Daylight Time)
2011-05-20 15:44:10,003 ERROR   [4dd6c42fad20eaf90] search:167 - Timed out waiting for status to become available on job=1305920559.2
Traceback (most recent call last):
  File "/opt/splunk/lib/python2.6/site-packages/splunk/appserver/mrsparkle/controllers/search.py", line 160, in dispatchJob
    job = splunk.search.dispatch(q, sessionKey=cherrypy.session['sessionKey'], **options)
  File "/opt/splunk/lib/python2.6/site-packages/splunk/search/__init__.py", line 287, in dispatch
    result = SearchJob(sid, hostPath, sessionKey, namespace, owner, dispatchArgs=args)
  File "/opt/splunk/lib/python2.6/site-packages/splunk/search/__init__.py", line 486, in __init__
    self._getStatus(True)
  File "/opt/splunk/lib/python2.6/site-packages/splunk/search/__init__.py", line 829, in _getStatus
    raise splunk.SplunkdException, 'Timed out waiting for status to become available on job=%s' % self.id
SplunkdException: Timed out waiting for status to become available on job=1305920559.2
2011-05-20 15:44:10,011 ERROR   [4dd6c4693fc62bd0] utility:59 - name=javascript, class=Splunk.Error, lineNumber=13051, message=Uncaught Error: INVALID_STATE_ERR: DOM Exception 11, fileName=http://MyServer:8000/en-US/static/@96430.174/js/common.min.js

araitz
Splunk Employee

I think you might be doing things sub-optimally with long-running subsearches. Can you post your search use case so that we can triage it from the beginning?

hexx
Splunk Employee

I second @araitz; we really need to see your subsearch to comment. Actually, we would also need to know how long it takes for that subsearch to complete when it is run as a stand-alone search.

deeboh
Path Finder

I"m having the same problem with my subsearch. Here is my search. Its fairly simple stuff. Sorry if my search censorship is too confusing.

index="index" [search index=index | rex "(?i)path/of/URI(?P[^:]+)" | dedup value | rename value as returnvalue ]

sideview
SplunkTrust

There's a funny quirk in the search API around 204s. When you ask for a resource, you're supposed to get back a 200 and the data, but sometimes splunkd isn't quite ready yet, so what it gives back is a 204. The client then has to re-request periodically until the actual content becomes available.

This behavior is built into the Python SDK (i.e., /opt/splunk/lib/python2.6/site-packages/splunk/search/__init__.py), so I don't know how commonly known it is.

At any rate, it seems like the presence of the long subsearch is triggering a really, really long sequence of polling and 204 responses, and the SDK is giving up. Since this is the Python SDK, it probably knows nothing about limits.conf, and I suspect it wasn't designed to accommodate this use case.
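
As a rough sketch (not the SDK's actual code) of that polling pattern, a client might look something like this; the host, credentials, and timeout values are placeholders:

import time
import requests

def wait_for_job_status(sid, base="https://localhost:8089",
                        auth=("admin", "changeme"), timeout=90, interval=1):
    # Re-request the job's status endpoint until splunkd stops answering 204.
    deadline = time.time() + timeout
    while time.time() < deadline:
        resp = requests.get(f"{base}/services/search/jobs/{sid}",
                            auth=auth, verify=False)
        if resp.status_code == 200:       # status is available
            return resp.text              # Atom XML describing the job
        if resp.status_code == 204:       # splunkd isn't ready yet; keep polling
            time.sleep(interval)
            continue
        resp.raise_for_status()           # anything else is a genuine error
    raise TimeoutError(f"Timed out waiting for status on job={sid}")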

So what's with the crazy long subsearches? Is there really no other way? Is this posted as a separate question elsewhere? You know we love a challenge. 😃

sideview
SplunkTrust

Can you post the searches so we can take a look? Quite possibly there's another way. I agree it's a bug, however reworking the search will most likely yield other benefits too.

Dark_Ichigo
Builder

This looks like a Splunk bug to me; there has to be a fix for this. A subsearch timing out! We can do so much with subsearches; we can't just let Splunk ignore this issue. There have been so many times I would've liked to use subsearches in Splunk to create summary indexes, but they keep timing out!

sideview
SplunkTrust

I still think that if you post the full search, we can probably find a better way to get your end users what they expect.

johnnymc
Path Finder

I hit the same problem Nick explained. Do you think there's any trick to try here? I've already optimized my subsearch as much as I could.

Dark_Ichigo
Builder

So there isn't a solution to this problem other than simplifying the search?

David
Splunk Employee

Hmmm, interesting. This seems to be timing out after just a minute and a half or so, and I'm sure longer subsearches than that have been run before! Could it be that splunkd should be returning a status before the search completes, but is hanging on that step?

As for the search, there are many better ways, and we use most of them regularly. What I'm trying to do is give end users the ability to run exactly the search they normally run against summary indexes, but off the raw data, if they need to (i.e., "It'll take 15 minutes, but you know it will be authoritatively accurate.").
