Getting Data In

How to parse JSON which makes up part of the event

rkeenan
Explorer

Hello,

We have some JSON being logged via log4j, so part of the event is JSON and part is not. The log4j portion has the timestamp. I can use field extractions to get just the JSON by itself, and the users could then use xmlkv to parse the JSON, but I'm looking for this to be done at index time so the users don't need to do this - any suggestions?

Example of logs (all lines are log4j entries logging JSON):
2017-01-04 00:00:00.981 [log_level] methodName- {"key1":"value1","key2":"value2","key3":"value3"}
2017-01-04 00:00:00.984 [log_level] methodName- {"key1":"value1"}
2017-01-04 00:00:00.984 [log_level] methodName - {"key1":"value1","key2":"value2"}
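
For illustration, the kind of inline step I'd like to spare the users (the field name jsonData is just an example) looks roughly like this:

<your search>
| rex "-\s*(?<jsonData>\{.*\})$"
| spath input=jsonData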

Thanks


aaraneta_splunk
Splunk Employee

@rkeenan - Did one of the answers below help provide a solution to your question? If yes, please click “Accept” below the best answer to resolve this post, and upvote anything that was helpful. If no, please leave a comment with more feedback. Thanks.


arkadyz1
Builder

You can use spath to extract subfields from JSON. Try something like:

<your search>
| rex "^\d{4}-\d\d-\d\d \d\d:\d\d:\d\d\.\d{3} \[(?P<log_level>\w+)\] (?P<method>\w+)\s*-\s*(?P<my_json>.*)$"
| spath input=my_json

This will create log_level, method and my_json, and spath will then expand the JSON object into individual fields (in your case you'll get key1, key2 and key3).
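
A quick, self-contained way to test this is to hard-code one of the sample events with makeresults (the [INFO] level here is just a stand-in for the real log level):

| makeresults
| eval _raw="2017-01-04 00:00:00.981 [INFO] methodName- {\"key1\":\"value1\",\"key2\":\"value2\",\"key3\":\"value3\"}"
| rex "^\d{4}-\d\d-\d\d \d\d:\d\d:\d\d\.\d{3} \[(?P<log_level>\w+)\] (?P<method>\w+)\s*-\s*(?P<my_json>.*)$"
| spath input=my_json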


niketn
Legend

| <Your Base Search>
| rex field=_raw "\[log_level\] methodName[-\s]+(?<jsonData>.*)"
| table _time jsonData _raw
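
If you also want the JSON expanded into fields, the extracted jsonData field should then be usable directly with spath (key1/key2/key3 below are taken from the sample events above):

| <Your Base Search>
| rex field=_raw "\[log_level\] methodName[-\s]+(?<jsonData>.*)"
| spath input=jsonData
| table _time key1 key2 key3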

somesoni2
SplunkTrust

You can set up automatic key-value pair extraction at search time (index-time extraction is costlier, slows the indexing process and requires additional space) so that users have the fields available to them without any inline extractions. Add this to your props.conf/transforms.conf on the search heads.

props.conf

[YourSourceType]
REPORT-kvextract = jsonextract

transforms.conf

[jsonextract]
REGEX = \"(?<_KEY_1>[A-z0-9]+)\":\"(?<_VAL_1>[^\"]+)\"

mreynov_splunk
Splunk Employee

Dynamic key names will slow things down even more. Why not INDEXED_EXTRACTIONS = json instead?
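
For reference, that would be a one-line props.conf setting applied where the data gets parsed (the sourcetype name below is a placeholder), though as the reply below points out it assumes each event is pure JSON:

[YourSourceType]
INDEXED_EXTRACTIONS = json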


somesoni2
SplunkTrust

It's not pure JSON, so I'm pretty sure INDEXED_EXTRACTIONS=json would not work.
