Getting Data In

How do I configure data inputs for .csv files with dynamic field headers so that each line becomes a new event?

fox
Path Finder

Running 4.2.1, we are monitoring many .csv files that differ in their field lists. We have Splunk configured to dynamically read the header row for field names (props.conf: CHECK_FOR_HEADER = TRUE), and this works brilliantly! However, the events are not splitting correctly: Splunk is indexing 256 rows as a single event, even though this is a .csv file with a clear newline separating each event...

Has anyone else done this successfully?

Any ideas?


gkanapathy
Splunk Employee

Most likely, Splunk is not detecting a timestamp in your rows. By default, Splunk merges lines together (SHOULD_LINEMERGE = true) but splits them whenever it detects a date (BREAK_ONLY_BEFORE_DATE = true). The easiest and best way to break on newlines is simply to set SHOULD_LINEMERGE = false; if there are dates in your data that Splunk isn't finding, you should also set TIME_FORMAT and TIME_PREFIX, and possibly MAX_TIMESTAMP_LOOKAHEAD.
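A sketch of the props.conf stanza this suggests, assuming a hypothetical sourcetype name (my_csv) and a timestamp such as 2011-05-01 12:34:56 at the start of each row — adjust TIME_PREFIX and TIME_FORMAT to match your actual data:

```ini
[my_csv]
# Read field names from the header row (as the question already does)
CHECK_FOR_HEADER = TRUE
# Make every line its own event instead of merging lines into one
SHOULD_LINEMERGE = false
# Only needed if Splunk misreads the timestamps; the values below are examples
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%d %H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD = 19
```

MAX_TIMESTAMP_LOOKAHEAD limits how many characters past TIME_PREFIX Splunk scans for the timestamp; 19 covers the example format above.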


Ayn
Legend

By default Splunk will merge lines in incoming logs and then break them up according to certain rules. This behavior is controlled by the SHOULD_LINEMERGE directive in props.conf (default is true). Setting SHOULD_LINEMERGE to false will tell Splunk not to combine several lines into a single event, which will give you the behavior you want.
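A minimal props.conf sketch of that fix, with a placeholder sourcetype name (yourcsv):

```ini
[yourcsv]
SHOULD_LINEMERGE = false
```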
