Getting Data In

Indexed CSV won't line break?

jdunlea_splunk
Splunk Employee

I'm indexing a CSV file and I have SHOULD_LINEMERGE set to "false" so it breaks events at each new line.

However, in a 24-hour period (about 600,000 events), I get ~50 events that are not line-broken correctly: half of the event ends up as a separate new event. How is this even happening if I have SHOULD_LINEMERGE=false? Isn't the default to break at a new line?

The only thing I can think of is that a small subset of the events in the CSV are broken over two lines (if that's even possible). Or is there a limit to the number of characters that Splunk will check for a line break before it just breaks the event at that limit? That would mean a few very long entries in the CSV file weren't checked all the way to the end because of some limit.
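For reference, a minimal props.conf sketch of the settings involved; the sourcetype name is just a placeholder and the TRUNCATE value is only an example. TRUNCATE is the per-line byte limit (10000 by default), and a CSV row longer than that gets cut at the limit, which could account for events that appear split or incomplete.

[my_csv_sourcetype]
SHOULD_LINEMERGE = false
# Default event breaker: break at runs of CR/LF
LINE_BREAKER = ([\r\n]+)
# Per-line byte limit before Splunk truncates; raise it if some rows are very long
TRUNCATE = 50000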

jt_splunk
Explorer

The only time I've run into this is when the application that generated the CSV file was producing corrupt data. Can you post a sample of your data?

If you're sure your dataset is clean, you may want to look at enabling SHOULD_LINEMERGE and then tweaking MUST_NOT_BREAK_BEFORE, as discussed here: http://docs.splunk.com/Documentation/Splunk/latest/Data/Indexmulti-lineevents.
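For illustration, a rough sketch of that alternative; the sourcetype name is a placeholder and the regex is purely hypothetical. MUST_NOT_BREAK_BEFORE should match the lines that need to stay attached to the previous event, so the pattern has to be built around what your real data looks like (here it assumes complete rows start with an ISO-style date).

[my_csv_sourcetype]
SHOULD_LINEMERGE = true
# Hypothetical pattern: keep any line that does NOT start with a date
# attached to the previous event. Adjust to match your actual rows.
MUST_NOT_BREAK_BEFORE = ^(?!\d{4}-\d{2}-\d{2})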
