Getting Data In

Lines break when indexing JSON data using props.conf attributes

anantdeshpande
Path Finder

Hi team,

I am not able to index the JSON data below in Splunk 6.2 with the props.conf attributes below. The data breaks at every line, and each line is treated as a separate event with no field extraction. When I add the same file from the search head using the Add Data option and select _json as the source type, the fields are extracted correctly, but it does not work when I use the same attributes with a customized sourcetype name in props.conf. Please suggest.

Data:
{
  "messageId" : "VIPJAPAN40001JCOMPLETE2017220818015450",
  "messageType" : "EVENT",
  "sendingAppId" : "P1",
  "sendTimeStamp" : "2017-22-08T17:09:27.526-05:00",
}

Props:
[APP1_INOUT_App2]
INDEXED_EXTRACTIONS = json
KV_MODE = none
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = false
category = Structured
disabled = false
DATETIME_CONFIG = CURRENT

rafiqrehman
New Member

This works perfectly for JSON data, at least for me:

[sourcetype]
INDEXED_EXTRACTIONS=json
JSON_TRIM_BRACES_IN_ARRAY_NAMES=true
CHARSET = AUTO
MAX_DIFF_SECS_AGO = 604800
MAX_EVENTS = 10000
NO_BINARY_CHECK = 1
TRUNCATE = 0


vasanthmss
Motivator

Try the config below:

[APP1_INOUT_App2]
DATETIME_CONFIG=CURRENT
SHOULD_LINEMERGE=true
NO_BINARY_CHECK=true
CHARSET=UTF-8
INDEXED_EXTRACTIONS=json
KV_MODE=none
pulldown_type=true

niketn
Legend

@anantdeshpande, can you try the following?

SHOULD_LINEMERGE = true
Also, if you know the event break pattern (for example, a start or end marker), you should use LINE_BREAKER, BREAK_ONLY_BEFORE, MUST_BREAK_AFTER, etc. to identify events correctly; a minimal sketch is below. Refer to the documentation: http://docs.splunk.com/Documentation/Splunk/latest/Data/Configureeventlinebreaking
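For illustration, a sketch combining that setting with a break pattern, assuming every event begins with an opening brace at the start of a line (the stanza name is copied from your question; the break regex is an assumption and should be adjusted to your real data):

[APP1_INOUT_App2]
SHOULD_LINEMERGE = true
# start a new event only on lines that begin with "{"
BREAK_ONLY_BEFORE = ^\{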

Where is the timestamp field in your data? Can you please share one complete event as a sample (or a few)?
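If you do want the event time taken from sendTimeStamp rather than index time (your current props use DATETIME_CONFIG = CURRENT), one possible addition is sketched below. The field name comes from your sample; the TIME_FORMAT is an assumption based on the single value 2017-22-08T17:09:27.526-05:00 (which looks like year-day-month), so verify it against your real data and Splunk version:

# with INDEXED_EXTRACTIONS, take the timestamp from a named structured field
TIMESTAMP_FIELDS = sendTimeStamp
TIME_FORMAT = %Y-%d-%mT%H:%M:%S.%3N%:z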


anantdeshpande
Path Finder

I tried both SHOULD_LINEMERGE = true and false, but got the same results.

I can write BREAK_ONLY_BEFORE = ^{ or (^){, but JSON is a format that Splunk recognizes, so I should not have to write extra breaking rules the way I would for normal logs.

In my data there are multiple fields with timestamps.
