I have a set of input scripts that work as expected. The problem I am facing is that I need to index the results, but the events are not broken correctly. This is an example of the result:
[
{
"a": "4620",
"b": "splunk",
"x": "0",
"d": "3.0",
"e": "50",
"f": "0",
"g": "41.0",
"_time": "2014-01-17T10:26:43.000-05:00",
"h": "abc",
"i": "4620",
"j": "0.00",
"k": "21.0",
"l": "6.00"
},
{
"a": "4620",
"b": "ABC",
"x": "0",
"d": "3.0",
"e": "50",
"f": "0",
"g": "41.0",
"_time": "2014-01-17T10:26:43.000-05:00",
"h": "abc",
"i": "4620",
"j": "0.00",
"k": "21.0",
"l": "6.00"
}
]
This is what I have in the inputs.conf:
[script:///opt/splunk/bin/scripts/splunk-sdk-python/examples/abc.py]
disabled = 0
index = main
interval = */5 * * * *
sourcetype = feed
Try sourcetype = json instead of feed.
It breaks at any position.
The workaround I used to make it work was to write the results to a file and then monitor that file with a monitor stanza in inputs.conf. I tried to apply the same config to the script stanza, but with no success. The config I used when monitoring the file is the following:
LINE_BREAKER = "(^)["
TRUNCATE = 0
SHOULD_LINEMERGE = false
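For reference, line-breaking settings like these live in props.conf, keyed by sourcetype, so they apply to a scripted input the same way they apply to a monitored file. A minimal sketch, assuming the sourcetype `feed` from the inputs.conf above and a LINE_BREAKER regex (my own, not from the original post) that breaks before each `{`:

```
[feed]
SHOULD_LINEMERGE = false
TRUNCATE = 0
LINE_BREAKER = ([\r\n]+)\{
```

The first capture group in LINE_BREAKER is discarded at the event boundary, so breaking on the newline(s) before each `{` keeps the opening brace with its event. This is a sketch to adapt, not a tested config for this exact feed.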
I should be able to index the results correctly without the workaround described above.
Any idea?
Thanks,
Lp
I believe the JSON log parser expects logs of the format:
{"key":"value"}
{"key":"othervalue"}
I.e., JSON log files aren't valid JSON documents; they are files with one JSON object per line.
So, remove the square brackets and the comma. If that doesn't work, try putting each event on a separate line (so drop the pretty-printing).
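The suggestion above can be sketched in the script itself: instead of pretty-printing one big JSON array, emit each event as a compact object on its own line, so Splunk can break events on newlines. A minimal sketch, with a hypothetical abbreviated version of the events from the question:

```python
import json

# Hypothetical sample, abbreviated from the two events in the question.
events = [
    {"a": "4620", "b": "splunk", "_time": "2014-01-17T10:26:43.000-05:00"},
    {"a": "4620", "b": "ABC", "_time": "2014-01-17T10:26:43.000-05:00"},
]

for event in events:
    # json.dumps without indent= keeps each event on a single line,
    # giving one-JSON-object-per-line output (no array, no commas).
    print(json.dumps(event))
```

Each printed line is still individually parseable as JSON, so downstream field extraction is unaffected; only the wrapping array and the pretty-printing are dropped.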
I have been able to index any kind of valid JSON structure without any problem using the file monitor stanza in inputs.conf.
Is that result one or two events?
Edit: Wait, what I meant is that Splunk, when parsing JSON, doesn't expect a valid JSON document but rather a single JSON object per line. That is at least what I've experienced.
The event I presented in the original question is valid JSON.
In what way is your event not broken correctly?