Getting Data In

Field Extraction Issue - CSV File Batch Upload via a TA

98123722
Explorer

This is driving me nuts 🙂 I'm trying to index a CSV file that a server creates once an hour (in this case it's DHCP assignment data, but it could be anything else -- note that I'm not looking for specific info on bringing in DHCP data, but rather a general answer on CSV indexing).

I've created a deployment app for it, with the following conf files. The data is uploaded and indexed, but the fields aren't extracted in the search (see the screenshot below).

local\inputs.conf:

[batch://C:\users\batchUser\Survery\computer_information.csv]
sourcetype = ADSurvey
disabled = false
index = wineventlog
move_policy = sinkhole
interval = 5

local\props.conf:

[ADSurvey]
SHOULD_LINEMERGE = false
TRANSFORMS-ADSurvey = dhcp_csv
CHECK_METHOD = entire_md5

local\transforms.conf:

[dhcp_csv]
DELIMS = ","
FIELDS = "DNSHostName","IPv4Address","OperatingSystem","SamAccountName","whenCreated","whenChanged","Modified","objectSid","IPv6Address","OperatingSystemVersion"

My search doesn't seem to recognize the extracted fields:
(screenshot: search results without the expected extracted fields)

1 Solution

maciep
Champion

What does your environment look like? Is the server creating the file also a Splunk Enterprise server? Or is it just a universal forwarder?

You have settings for 3 phases of data here, so it's important they're all in the right place.

  1. The inputs.conf stanza and the CHECK_METHOD setting in your props need to be on the server where you're reading the CSV. And you could probably get away with using a monitor stanza instead of batch. Also, I'm not sure your CHECK_METHOD setting is even doing anything, based on the docs, because it's not in a source stanza.

  2. The should_linemerge setting is a parse-time configuration and therefore belongs on your indexer (or the first Splunk Enterprise server the data will land on). The breaking of events (typically) does not happen on the forwarder.

  3. The TRANSFORMS setting in your props.conf should be a REPORT setting instead, based on the stanza it's calling in transforms.conf. And it belongs on your search head, as does the stanza in transforms.conf. These settings are applied when you click the search button in Splunk, not when the data is ingested.

This is a good article describing the phases and which settings from which conf files belong to which phase.
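To make the placement concrete, here's a rough sketch of items 2 and 3, reusing the stanza names from the question (the inputs.conf stanza stays in the deployment app on the forwarder; this assumes a separate indexer and search head -- if they're the same box, all of it goes there):

On the indexer (or the first full Splunk Enterprise instance the data hits) -- props.conf:

[ADSurvey]
SHOULD_LINEMERGE = false

On the search head -- props.conf:

[ADSurvey]
REPORT-ADSurvey = dhcp_csv

On the search head -- transforms.conf:

[dhcp_csv]
DELIMS = ","
FIELDS = "DNSHostName","IPv4Address","OperatingSystem","SamAccountName","whenCreated","whenChanged","Modified","objectSid","IPv6Address","OperatingSystemVersion"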


98123722
Explorer

I've gone into a bit of a rabbit hole here. I'll update this answer until this task is complete.

For now: it IS possible to extract at the source (i.e. the server on which the UF is deployed). I've successfully used INDEXED_EXTRACTIONS for this, as explained here. The problem is that only the first field of the CSV is extracted.

I'll keep this post updated.
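For reference, a minimal sketch of what that props.conf stanza on the forwarder might look like for the INDEXED_EXTRACTIONS approach (assuming the CSV has its header row on line 1; sourcetype name carried over from the question):

[ADSurvey]
INDEXED_EXTRACTIONS = csv
FIELD_DELIMITER = ,
HEADER_FIELD_LINE_NUMBER = 1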



JaoelNameiol
Explorer

Have you tried using a [monitor://] stanza instead of [batch://] in your inputs.conf and seeing the results?
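For example, something along these lines (path and settings carried over from the original inputs.conf; move_policy = sinkhole is a batch-only setting, so it is left out here):

[monitor://C:\users\batchUser\Survery\computer_information.csv]
sourcetype = ADSurvey
index = wineventlog
disabled = false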
