Splunk Search

Extract fields from JSON response

Naji
Explorer

I am new to Splunk and I have the following message which I would like to parse into a table of columns:

{dt.trace_id=837045e132ad49311fde0e1ac6a6c18b, dt.span_id=169aa205dab448fc, dt.trace_sampled=true}
{
	"correlationId": "3-f0d89f31-6c3c-11ee-8502-123c53e78683",
	"message": "API Request",
	"tracePoint": "START",
	"priority": "INFO",
	"category": "com.cfl.api.service",
	"elapsed": 0,
	"timestamp": "2023-10-16T15:59:09.051Z",
	"content": {
		"clientId": "",
		"attributes": {
			"headers": {
				"accept-encoding": "gzip,deflate",
				"content-type": "application/json",
				"content-length": "92",
				"host": "hr-fin.svr.com",
				"connection": "Keep-Alive",
				"user-agent": "Apache-HttpClient/4.5.5 (Java/16.0.2)"
			},
			"clientCertificate": null,
			"method": "POST",
			"scheme": "https",
			"queryParams": {},
			"requestUri": "/cfl-service-api/api/process",
			"queryString": "",
			"version": "HTTP/1.1",
			"maskedRequestPath": "/api/queue/send",
			"listenerPath": "/cfl-service-api/api/*",
			"localAddress": "/localhost:8082",
			"relativePath": "/cfl-service-api/api/process",
			"uriParams": {},
			"rawRequestUri": "/cfl-service-api/api/process",
			"rawRequestPath": "/cfl-service-api/api/process",
			"remoteAddress": "/123.123.123.123:123",
			"requestPath": "/cfl-service-api/api/process"
		}
	},
	"applicationName": "cfl-service-api",
	"applicationVersion": "6132",
	"environment": "dev",
	"threadName": "[cfl-service-api].proxy.BLOCKING @78f55ba"
}

Thank you so much for your help.


yuanliu
SplunkTrust

Have you tried "| table *"?  In other words, is that message the raw event?  Because if it is, Splunk would have already given you all the fields, like correlationId, message, content.clientId, content.attributes.reasonPhrase, and so on.

If the message is in a field named "data", you can use spath to extract it.

| spath input=data

Either way, your sample would give fields and values along these lines:

fieldname                        fieldvalue
applicationName                  cfl-service-integration-proxy
applicationVersion               61808
category                         com.cfl.api.service-integration
content.attributes.reasonPhrase  OK
content.attributes.statusCode    200
content.clientId                 1234567
correlationId                    3-f86043c0-6c3c-11ee-8502-123c53e78683
elapsed                          435
environment                      dev
message                          API Response
priority                         INFO
threadName                       [cfl-service-integration-proxy].proxy.BLOCKING @78f55ba
timestamp                        2023-9-16T15:59:22.083Z
tracePoint                       END
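
Once the fields are extracted, laying them out as a table of columns is just a matter of listing the ones you care about. A minimal sketch, assuming the JSON sits in a field named "data" (drop the spath line if the fields are already auto-extracted, and adjust the field list to taste):

| spath input=data
| table correlationId message tracePoint priority elapsed applicationName applicationVersion environment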

Hope this helps.

Naji
Explorer

You are correct, some of the fields are automatically extracted as part of the event heading, but none of the fields I am interested in are available, such as:

tracePoint
content.attributes[]  //not interested in the headers
applicationName
applicationVersion
environment

By the way, I tried what you suggested:

| spath input=data

but I see no change in my search results.

Thank you


yuanliu
SplunkTrust

Do you mean to say that a string like "{dt.trace_id=837045e132ad49311fde0e1ac6a6c18b, dt.span_id=169aa205dab448fc, dt.trace_sampled=true}" is at the beginning of the raw event?  If so, you will need to first extract the part that is compliant JSON.  (It is also a very bad log pattern from your developer.)

You can do so with

| eval json = replace(_raw, "^{.+}", "")

(The actual method will depend on how the raw logs are structured, how stable that structure is, etc.)  Then, apply spath.

| eval json = replace(_raw, "^{.+}", "")
| spath input=json

Alternatively, get rid of the spurious part from _raw then spath.

| rex mode=sed "s/^{.+}//"
| spath
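
Putting it together with the fields you listed, a sketch (index=your_index and sourcetype=your_sourcetype are placeholders, and content.attributes.method / content.attributes.requestPath are just examples of the attributes you might want):

index=your_index sourcetype=your_sourcetype
| rex mode=sed "s/^{.+}//"
| spath
| table tracePoint applicationName applicationVersion environment content.attributes.method content.attributes.requestPath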

Here is an emulation you can play with and compare with real data

| makeresults
| fields - _time
| eval _raw = "{dt.trace_id=837045e132ad49311fde0e1ac6a6c18b, dt.span_id=169aa205dab448fc, dt.trace_sampled=true}
{
	\"correlationId\": \"3-f0d89f31-6c3c-11ee-8502-123c53e78683\",
	\"message\": \"API Request\",
	\"tracePoint\": \"START\",
	\"priority\": \"INFO\",
	\"category\": \"com.cfl.api.service\",
	\"elapsed\": 0,
	\"timestamp\": \"2023-10-16T15:59:09.051Z\",
	\"content\": {
		\"clientId\": \"\",
		\"attributes\": {
			\"headers\": {
				\"accept-encoding\": \"gzip,deflate\",
				\"content-type\": \"application/json\",
				\"content-length\": \"92\",
				\"host\": \"hr-fin.svr.com\",
				\"connection\": \"Keep-Alive\",
				\"user-agent\": \"Apache-HttpClient/4.5.5 (Java/16.0.2)\"
			},
			\"clientCertificate\": null,
			\"method\": \"POST\",
			\"scheme\": \"https\",
			\"queryParams\": {},
			\"requestUri\": \"/cfl-service-api/api/process\",
			\"queryString\": \"\",
			\"version\": \"HTTP/1.1\",
			\"maskedRequestPath\": \"/api/queue/send\",
			\"listenerPath\": \"/cfl-service-api/api/*\",
			\"localAddress\": \"/localhost:8082\",
			\"relativePath\": \"/cfl-service-api/api/process\",
			\"uriParams\": {},
			\"rawRequestUri\": \"/cfl-service-api/api/process\",
			\"rawRequestPath\": \"/cfl-service-api/api/process\",
			\"remoteAddress\": \"/123.123.123.123:123\",
			\"requestPath\": \"/cfl-service-api/api/process\"
		}
	},
	\"applicationName\": \"cfl-service-api\",
	\"applicationVersion\": \"6132\",
	\"environment\": \"dev\",
	\"threadName\": \"[cfl-service-api].proxy.BLOCKING @78f55ba\"
}"
``` data emulation above ```
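
To verify, append the same commands to the emulation and compare the output with your real data, e.g.

| rex mode=sed "s/^{.+}//"
| spath
| table *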

Naji
Explorer

I tried what you suggested, but I was unable to get the results I expected. To resolve the issue, I had to disable the Java log enrichment feature in Dynatrace OneAgent to stop OneAgent from injecting

{dt.trace_id=837045e132ad49311fde0e1ac6a6c18b, dt.span_id=169aa205dab448fc, dt.trace_sampled=true}

 into my logs. Now things are back to normal.
