I am having an issue with a particular log file where two entries get concatenated into one event. It is not the data, because if I take the same data and add it via file upload it is fine, and all lines are unique. Has anyone else had an issue like this? It seems to be a timing problem where the host is forwarding the file at the same time it is being written... Could this be the result of the log file not being locked while it is written to?
Any help is appreciated. Thanks!
You need to set the line-breaking property in your props.conf:
http://docs.splunk.com/Documentation/Splunk/6.2.5/Admin/propsconf
LINE_BREAKER = <regular expression>
* Specifies a regex that determines how the raw text stream is broken into initial events, before line merging takes place.
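For example, a props.conf stanza might look like the sketch below. The sourcetype name and the timestamp pattern are assumptions for illustration; adjust the regex to match how events in your log actually begin:

```ini
# Hypothetical sourcetype -- replace with the one assigned to this log file
[my_app_log]
# Disable line merging and rely on explicit event breaking
SHOULD_LINEMERGE = false
# Break on newlines only when the next line starts with a timestamp
# like "2015-08-14 12:34:56" (assumed format -- adjust to your log)
LINE_BREAKER = ([\r\n]+)(?=\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})
```

With `SHOULD_LINEMERGE = false`, a partial line that arrives early is not merged into the previous event, so breaking is driven entirely by the `LINE_BREAKER` regex.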
Does this still apply when the issue only happens sometimes? It is the same file; it just seems that the file gets forwarded before it is done being written.