Sample Data: {{"device_id":"a1c842ef8c0545f48e8e61d3e03c68bb","ip":"192.168.193.162","topic":"DEVICE","event":"device.access", "timestamp":"2015-05-05T20:55:30.904+0000"}}
I want to break this into two separate events, using }} as a delimiter:
{{"device_id":"a1c842ef8c0545f48e8e61d3e03c68bb","ip":"192.168.193.162","topic":"DEVICE","event":"device.access", "timestamp":"2015-05-05T20:55:30.904+0000"}}
AND
{{"source":{"email":"johndoe@acme.com"}, "name":"John Doe"},"topic":"FILE","event":"file.create","timestamp":"2015-05-05T20:55:31.428+0000"}}
I created a props.conf file in $SPLUNK_HOME/etc/system/local, added the following lines, and restarted splunkd, but it didn't work.
SHOULD_LINEMERGE = false
LINE_BREAKER = (}})
Any help would be much appreciated!
Did you put the correct sourcetype in your stanza header? Try this (note the escaped curly braces):
SHOULD_LINEMERGE = false
LINE_BREAKER = \}\}([\r\n]+)
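For reference, a complete stanza would look like the sketch below; "my_json_sourcetype" is a placeholder for whatever sourcetype is actually assigned to this data.

```ini
# props.conf -- sketch; replace "my_json_sourcetype" with the real
# sourcetype name these events come in with
[my_json_sourcetype]
SHOULD_LINEMERGE = false
LINE_BREAKER = \}\}([\r\n]+)
```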
As noted in other answers, you should fix this in props.conf at index time, but if the data is already indexed you can break it up at search time as follows. Note that you need a literal newline character after delim=", which you can type with Shift-Enter.
|eval raw=_raw
|makemv delim="
" raw
| mvexpand raw
| eval _raw=raw
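Outside Splunk, the split-and-expand step above is equivalent to this sketch (pure Python; the event text is a made-up stand-in):

```python
# Simulate makemv delim="\n" followed by mvexpand on a merged _raw value
raw = 'line one\nline two\nline three'  # stand-in for a merged _raw

# makemv with a newline delimiter turns _raw into a multivalue field...
parts = raw.split('\n')

# ...and mvexpand emits one event per value
events = [{'_raw': p} for p in parts]
print(len(events))  # 3
```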
Any other fields the original event had will now appear on every split event; e.g. if line 3 had a user field, all three split events will carry that user field. So you may want to delete those fields and re-extract them per line with something like this:
| fields - user
| rex "user=\"(?<user>[^\"]+)\""
Hi,
I have a CSV file that the forwarder should break into multiple events, but it didn't. What should I write in props.conf? The CSV looks like this in one event:
G2nS32m2gEaFZUrh,UDP,2294,64021328,447952334,511973662,1652264015 xNeJ2gTvj9wAl5Hi,UDP,2294,15902274,180739240,196641514,1652263847
Maybe:
LINE_BREAKER = \\n([\r\n]+)
Is that right?
OK, so it looks like the line-break rule change in props.conf doesn't get applied to events that have already been indexed. New data coming in is getting split up correctly, so your instructions did work; but the older data that has already been indexed is not getting split. Is there a way to force Splunk to re-index everything?
Yes, first use this to remove the bad data (note that the delete command requires a role with the can_delete capability):
sourcetype=xxx | delete
Then follow the steps here:
http://answers.splunk.com/answers/72562/how-to-reindex-data-from-a-forwarder.html
That didn't work either. This is exactly what I have in my $SPLUNK_HOME/etc/system/local/props.conf file:
[_json]
SHOULD_LINEMERGE = false
LINE_BREAKER = \}\}([\r\n]+)
First of all, you should not be editing that file; you should create your own props.conf in your own app directory (e.g. $SPLUNK_HOME/etc/apps/<yourapp>/local/). In any case, you did not preserve the backslashes; you have to use exactly what I wrote in my answer.
Sorry, that was a typo on my part. I did include the backslashes in my props.conf file, and it didn't work; I'm still getting multiple events grouped into a single Splunk event.
I edited the props.conf file in $SPLUNK_HOME/etc/system/local because that's what the Splunk doc said to do: http://docs.splunk.com/Documentation/Splunk/6.2.3/Data/Indexmulti-lineevents
Should I be editing it in $SPLUNK_HOME/etc/users/admin/search/local instead?
Your sample data seems to be incomplete.