Getting Data In

How to parse JSON to extract multiple-line events?

sfatnass
Contributor

I have a JSON file that contains many objects like this:

{
    "id": 1,
    "name": "toto",
    "price": 1.50,
    "tags": ["travel", "red"] }
{
    "id": 2,
    "name": "toto",
    "price": 12,
    "tags": ["home", "green"] }

I need to split that into two events, as in the example. How can I use LINE_BREAKER in props.conf?

Thanks!

1 Solution

jkat54
SplunkTrust

This method will index each field name in the json payload:

[<SOURCETYPE NAME>]
SHOULD_LINEMERGE=true
NO_BINARY_CHECK=true
CHARSET=AUTO
INDEXED_EXTRACTIONS=json
KV_MODE=none
disabled=false
pulldown_type=true

This one would not index the fields, and comes at a lower performance cost:

[<SOURCETYPE NAME>]
CHARSET=AUTO
SHOULD_LINEMERGE=false
disabled=false
LINE_BREAKER=(^){.*"id":
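For intuition, here is a rough Python simulation (not Splunk itself) of how a LINE_BREAKER-style capture group splits the sample file into two events: Splunk breaks an event wherever the first capture group matches and discards the captured text. The regex below is an illustrative stand-in, not the exact PCRE Splunk applies.

```python
import json
import re

# The two sample objects from the question, concatenated in one file.
raw = """{
    "id": 1,
    "name": "toto",
    "price": 1.50,
    "tags": ["travel", "red"] }
{
    "id": 2,
    "name": "toto",
    "price": 12,
    "tags": ["home", "green"] }"""

# Break on the newline(s) immediately before a line that opens a new
# JSON object -- the same idea as breaking before each {"id": ... block.
breaker = re.compile(r'([\r\n]+)(?=\{)')

# re.split keeps capture-group text in the result; drop it to mimic
# Splunk consuming the captured characters at each event boundary.
events = [part for part in breaker.split(raw) if part and not part.isspace()]

for event in events:
    print(json.loads(event)["id"])  # each event is valid JSON on its own
```

If the split produces exactly one event per object, the equivalent LINE_BREAKER is doing its job; if not, the real data likely differs from the sample.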


sfatnass
Contributor

This is my current configuration in props.conf:
[json]

[source::.../mysource...]
sourcetype = json
SHOULD_LINEMERGE = false
TRUNCATE=0
NO_BINARY_CHECK = 1
LINE_BREAKER = ([\r\n]+){

Then I need to have something like this:

[json]

[source::.../mysource...]
SHOULD_LINEMERGE=true
NO_BINARY_CHECK=true
CHARSET=AUTO
INDEXED_EXTRACTIONS=json
KV_MODE=json
disabled=false
pulldown_type=true

It's OK like that, it works very well and the performance is great. Thanks!
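For intuition, `INDEXED_EXTRACTIONS=json` has Splunk parse every field of each object at index time. A minimal Python sketch (an illustration only, not Splunk's parser) of reading a file of concatenated JSON objects and making every field available by name:

```python
import json

# Concatenated JSON objects, as in the sample file.
raw = '''{ "id": 1, "name": "toto", "price": 1.50, "tags": ["travel", "red"] }
{ "id": 2, "name": "toto", "price": 12, "tags": ["home", "green"] }'''

decoder = json.JSONDecoder()
events = []
pos = 0
while pos < len(raw):
    obj, end = decoder.raw_decode(raw, pos)  # parse one object, note where it ends
    events.append(obj)
    pos = end
    while pos < len(raw) and raw[pos].isspace():  # skip whitespace between objects
        pos += 1

# Every field of every event is now addressable by name.
for event in events:
    print(event["name"], event["price"])
```

Once the fields are extracted like this, searches can filter on any of them directly, which is the point of letting the indexer do the parsing.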


jkat54
SplunkTrust

Do you want the fields extracted at index time or search time?

Both examples I gave you worked with your example data, so either you didn't reindex the data, didn't put the props in the correct place, or maybe the example data you provided isn't exactly like the data you're ingesting.


jkat54
SplunkTrust

The settings you used would index the fields and would need to be placed on the universal forwarder and indexers. It wouldn't apply to data already ingested either.


sfatnass
Contributor

Just to extract the JSON into many events like your example, and to extract all the fields too, because I will run some searches and I need to know which field contains the correct value ^^
But it's OK, and thanks for your reply.


jkat54
SplunkTrust

Great. Just so you know, INDEXED_EXTRACTIONS will consume more disk space and requires more CPU on the indexers/forwarders.


sfatnass
Contributor

OK, but it's more performant, no? The objective of my project is to get more speed ^^


jkat54
SplunkTrust

It can be faster when you're searching for the fields involved yes.
