I have events in JSON format as input, and the events themselves are recognized fine, but in smart mode the automatic field extraction builds very long recursive fields.
As an example, I get the correct field traceMap.RESPONSE.status only a few times, but there are also other, "tapeworm"-style extracted fields:
traceMap.SEARCHRESULT.results{}.price.traceMap.RESPONSE.status
traceMap.SEARCHRESULT.results{}.price.traceMap.SEARCHRESULT.results{}.price.traceMap.RESPONSE.status
traceMap.SEARCHRESULT.results{}.price.traceMap.SEARCHRESULT.results{}.price.traceMap.SEARCHRESULT.results{}.price.traceMap.RESPONSE.status
etc.
It looks like Splunk doesn't recognize the end of "traceMap.SEARCHRESULT.results{}.price", or there are limitations in field extraction. Some events are really long (30,000 characters), so I set
[kv]
maxchars = 100000
but this doesn't help.
Does anyone have an idea?
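For reference, besides raising maxchars, a minimal props.conf sketch that tells Splunk to use its structured JSON extraction instead of generic key-value extraction; the sourcetype name `my_json_events` is an assumption and must match your actual sourcetype:

```
[my_json_events]
KV_MODE = json
```

This only changes which extractor runs at search time; it does not by itself fix malformed events.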
Best regards, Marco
I solved it: it was a hidden error in the JSON format.
"price" : NaN, is not valid JSON; it has to be "price" : "NaN",
In my search I looked at Splunk's search results, which showed me the JSON objects with a quoted "NaN", so I thought it was interpreted correctly and everything was fine, but it wasn't.
So if you have a similar problem, make sure your JSON or XML is correct at the _raw level. 😉
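To catch this kind of hidden error before it reaches Splunk, a small validation sketch can help. Note that Python's json module accepts bare NaN/Infinity by default, so a strict check needs parse_constant to reject them; the function name validate_strict_json is my own, not anything from Splunk:

```python
import json

def validate_strict_json(raw):
    """Return True if raw is spec-compliant JSON, rejecting the
    non-standard NaN/Infinity constants that json.loads would
    otherwise silently accept."""
    def reject(const):
        raise ValueError(f"non-standard JSON constant: {const}")
    try:
        json.loads(raw, parse_constant=reject)
        return True
    except ValueError as e:
        print(f"invalid JSON: {e}")
        return False

# A bare NaN like the one in my events fails strict parsing:
validate_strict_json('{"price" : NaN}')    # False
validate_strict_json('{"price" : "NaN"}')  # True
```

Running something like this over a sample of _raw events would have exposed the unquoted NaN immediately.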
Best regards,
Marco
I have the same problem with XML when using spath.
On 4.3 it hangs the search and a CPU core sits at 100% until killed. On 5.x you get stupidly long paths.