Getting Data In

Multiple lines are getting stuck together at indexing

hharvey
Explorer

I am indexing a file of single-line log events, and some lines are getting chunked together into one event. I'm trying to figure out how to correct it.

Here's some example log data that is getting indexed together in each event:

a1-server1 App-> 2011-10-05 12:35:23 - a1-server1 - [99.99.99.99] user3(Active Directory Users)[gg-group] - Key Exchange number 23 occured for user with NCIP 192.168.212.63 
a1-server1 App-> 2011-10-05 12:35:23 - a1-server1 - [99.99.99.99]  user4(Active Directory Users)[gg-group] - Key Exchange number 5 occured for user with NCIP 192.168.212.221 
a1-server1 App-> 2011-10-05 12:35:23 - a1-server1 - [99.99.99.99]  user5(Active Directory Users)[gg-group] - Key Exchange number 2 occured for user with NCIP 192.168.212.193 

a1-server1 App-> 2011-10-05 12:33:46 - a1-server1 - [99.99.99.99]  auser2(Active Directory Users)[gg-group] - Key Exchange number 29 occured for user with NCIP 192.168.212.139 
a1-server1 App-> 2011-10-05 12:33:46 - a1-server1 - [99.99.99.99]  auser1(Active Directory Users)[gg-group] - Key Exchange number 16 occured for user with NCIP 192.168.212.108 

a1-server1 App-> 2011-10-05 12:28:43 - a1-server1 - [99.99.99.99]  auser(Active Directory Users)[gg-group]  - Logout from 99.99.99.99 (session:ff6f38ff)
a1-server1 App-> 2011-10-05 12:28:43 - a1-server1 - [99.99.99.99]  auser(Active Directory Users)[gg-group]  - Closed connection after 12513 seconds, with 4502 bytes read (in 19 chunks) and 19391 bytes written (in 43 chunks)
a1-server1 App-> 2011-10-05 12:28:43 - a1-server1 - [99.99.99.99]  auser(Active Directory Users)[gg-group]  - Network Connect: ACL count = 833.
a1-server1 App-> 2011-10-05 12:28:43 - a1-server1 - [99.99.99.99]  auser(Active Directory Users)[gg-group] - Network Connect: Session ended for user with IP 192.168.212.21

Initially, in props.conf I tried setting SHOULD_LINEMERGE to false:

[source::source-to-break]
SHOULD_LINEMERGE = false

When that didn't work, I tried specifying a line break before the server name at the beginning of each event:

[source::source-to-break]
SHOULD_LINEMERGE = true
BREAK_ONLY_BEFORE = a[0-9]-server[0-9]\sApp[\w'-]>\s

But that isn't working either. I'm beginning to think this has something to do with the timestamp, but I'm not sure.
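
As a quick sanity check outside Splunk (a hypothetical test script, not part of the original post), the BREAK_ONLY_BEFORE pattern can be run against one of the sample lines with Python's re module, which is close enough to Splunk's PCRE engine for this particular pattern:

import re

# The BREAK_ONLY_BEFORE pattern from the attempt above.
pattern = r"a[0-9]-server[0-9]\sApp[\w'-]>\s"

# One of the sample lines (truncated here for readability).
line = "a1-server1 App-> 2011-10-05 12:35:23 - a1-server1 - [99.99.99.99] user3(Active Directory Users)[gg-group]"

# The prefix itself matches, so the pattern is not the obvious culprit.
print(bool(re.match(pattern, line)))  # prints: True

The prefix pattern does match the sample lines, which is consistent with the suspicion above that the problem lies in how the timestamp is being handled rather than in the regex itself.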


_d_
Splunk Employee

Try the stanza below:
[source::source-to-break]
TIME_PREFIX = a\d+\-server\d+\s+App\-\>\s+
TIME_FORMAT = %Y-%m-%d %H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD = 20
LINE_BREAKER = ([\r\n]+)(?=a\d+\-server\d+\s+App\-\>\s+\d+\-\d+\-\d+\s+)
SHOULD_LINEMERGE = false

This will break each event right before the a1-server1 prefix, extract the timestamp, and parse it with the specified format. If this works, please don't forget to vote 🙂
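
As a rough offline sketch (Python's re module standing in for Splunk's PCRE engine, with the sample lines abbreviated), this is roughly how the LINE_BREAKER pattern splits the data into one event per line and how the TIME_FORMAT string parses the timestamp that follows the TIME_PREFIX:

import re
from datetime import datetime

line_breaker = r"([\r\n]+)(?=a\d+\-server\d+\s+App\-\>\s+\d+\-\d+\-\d+\s+)"
time_prefix = r"a\d+\-server\d+\s+App\-\>\s+"
time_format = "%Y-%m-%d %H:%M:%S"

# Abbreviated versions of the sample lines above.
sample = (
    "a1-server1 App-> 2011-10-05 12:35:23 - a1-server1 - [99.99.99.99] user3(...) - Key Exchange number 23\n"
    "a1-server1 App-> 2011-10-05 12:35:23 - a1-server1 - [99.99.99.99] user4(...) - Key Exchange number 5\n"
    "a1-server1 App-> 2011-10-05 12:33:46 - a1-server1 - [99.99.99.99] auser2(...) - Key Exchange number 29\n"
)

# re.split keeps the captured newlines as separate items; drop them.
events = [e for e in re.split(line_breaker, sample) if e.strip()]
print(len(events))  # prints: 3 -- one event per log line

for event in events:
    # Skip the "aN-serverN App-> " prefix, then parse the 19-character
    # timestamp (well within MAX_TIMESTAMP_LOOKAHEAD = 20).
    rest = re.sub("^" + time_prefix, "", event)
    print(datetime.strptime(rest[:19], time_format))

Keep in mind that these are index-time settings: they only apply to data indexed after the configuration is loaded (typically after a restart of the indexing instance), so events that were already indexed stay merged.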


hharvey
Explorer

It took me a while to get this implemented, but I just did and it worked. Thanks so much!
