Splunk Enterprise

Multiple JSON lines in the log are merged into a single event

lstruman
New Member

Hi,

I have one node running the universal forwarder. It forwards a log file that contains JSON data like the sample below. However, on the other Splunk Enterprise node, the two lines are treated as a single event.

Time: 10/7/15 7:11:45.000 PM
Event:

{"relativeCreated": 3642544.4090366364, "process": 13962, "module": "middlewares", "funcName": "process_response", "levelchar": "I", "exc_text": "", "message": "Request finished", "extra": {}, "audit": {"body": "{\"block_http_enabled\": true, \"block_p2p_enabled\": false, \"block_multicast_enabled\": true, \"block_icmp_enabled\": false, \"firewall_level\": 4, \"block_ident_enabled\": true}", "content_length": 49, "user_agent": "python-requests/2.7.0 CPython/2.7.8 Linux/2.6.32-573.7.1.el6.x86_64", "start-time": "2015-10-07 05:11:44.969834", "audit_id": "d01a5ca26cdb11e59fd8080027ce083d", "status_code": 200, "csid": "db08e8b9-5d37-463f-871c-0136fa465c25", "duration": "20.20", "path": "/api/v1/gateway-configset/db08e8b9-5d37-463f-871c-0136fa465c25/configuration?group_id=group_firewall_configuration", "remote_ip": "192.168.56.101", "method": "POST", "host_name": "node2:8080"}, "name": "request", "thread": 139797986514688, "created": 1444212704.990133, "threadName": "Thread-196", "msecs": 990.1330471038818, "filename": "middlewares.py", "levelno": 20, "processName": "MainProcess", "pathname": "/vol/xpc/src/shared/audit_logging/middlewares.py", "lineno": 48, "exc_info": null, "_time": "2015-10-07 10:11:44.990464", "levelname": "INFO"} 
{"relativeCreated": 3642558.506965637, "process": 13962, "module": "middlewares", "funcName": "process_response", "levelchar": "I", "exc_text": "", "message": "Request finished", "extra": {}, "audit": {"body": "", "content_length": 223, "user_agent": "python-requests/2.7.0 CPython/2.7.8 Linux/2.6.32-573.7.1.el6.x86_64", "start-time": "2015-10-07 05:11:44.995214", "audit_id": "d01e42b86cdb11e5ba45080027ce083d", "status_code": 200, "csid": "db08e8b9-5d37-463f-871c-0136fa465c25", "duration": "8.97", "path": "/api/v1/gateway-configset/db08e8b9-5d37-463f-871c-0136fa465c25/configuration?group_id=group_firewall_configuration", "remote_ip": "192.168.56.101", "method": "GET", "host_name": "node2:8080"}, "name": "request", "thread": 139797986514688, "created": 1444212705.004231, "threadName": "Thread-197", "msecs": 4.230976104736328, "filename": "middlewares.py", "levelno": 20, "processName": "MainProcess", "pathname": "/vol/xpc/src/shared/audit_logging/middlewares.py", "lineno": 48, "exc_info": null, "_time": "2015-10-07 10:11:45.004495", "levelname": "INFO"}

I want each JSON line to be treated as a separate event. I searched and tried the settings below on the Splunk Enterprise node, but they did not work.

[root@node1 local]# cat /opt/splunk/etc/system/local/props.conf
[mysourcetype]
KV_MODE = json
SHOULD_LINEMERGE = false
TRUNCATE = 0
TIME_PREFIX = _time
LINE_BREAKER=([\r\n]+)
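
For reference, a line-breaking variant of the attempted settings is sketched below. It keeps SHOULD_LINEMERGE = false and breaks on newlines, since each JSON object sits on its own line. The "_time" TIME_PREFIX regex and the %6N subsecond format are assumptions read off the sample events, not a confirmed fix, and KV_MODE = json only takes effect at search time, so it belongs on the search head rather than the indexer.

# Indexer-side props.conf (sketch based on the sample events above)
[mysourcetype]
SHOULD_LINEMERGE = false
# each JSON object ends at a newline, so break events there
LINE_BREAKER = ([\r\n]+)
TRUNCATE = 0
# TIME_PREFIX is a regex; anchor it at the "_time" key (assumed from the sample)
TIME_PREFIX = \"_time\":\s*\"
TIME_FORMAT = %Y-%m-%d %H:%M:%S.%6N
MAX_TIMESTAMP_LOOKAHEAD = 26

# Search-head-side props.conf
[mysourcetype]
KV_MODE = json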

rphillips_splk
Splunk Employee

On your indexers, can you try this configuration:

$SPLUNK_HOME/etc/system/local/props.conf

[mysourcetype]
SHOULD_LINEMERGE=true
NO_BINARY_CHECK=true
TIME_FORMAT=%Y-%m-%d %H:%M:%S.%6N
TIME_PREFIX=_time
MAX_TIMESTAMP_LOOKAHEAD=26

Restart Splunk after applying the configuration.
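
Once the stanza is in place, a quick way to confirm which settings the indexer is actually applying is btool; the commands below assume the default /opt/splunk install path and the mysourcetype name from the question.

# show the effective props.conf settings for the sourcetype and the file each one comes from
/opt/splunk/bin/splunk btool props list mysourcetype --debug
# restart so the index-time settings (line breaking, timestamp extraction) take effect
/opt/splunk/bin/splunk restart

Keep in mind that index-time settings only apply to data indexed after the restart; events that were already merged stay as they are.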
