Splunk Enterprise

Multiple json lines treated as a single event

lstruman
New Member

Hi,

I have searched and found that other people have had a similar problem, but none of the suggestions worked for me. Since I am new to Splunk, I want to describe my environment so that anyone willing to help can tell me which part I got wrong. JSON lines like the ones below are appended to a log file on the node that runs the universal forwarder.

{"relativeCreated": 1571762.7699375153, "process": 13962, "module": "middlewares", "funcName": "process_response", "levelchar": "I", "exc_text": "", "message": "Request finished", "extra": {}, "audit": {"body": "", "content_length": 131, "user_agent": "python-requests/2.7.0 CPython/2.7.8 Linux/2.6.32-573.7.1.el6.x86_64", "start-time": "2015-10-07 04:37:14.202709", "audit_id": "fdd46aac6cd611e58d1d080027ce083d", "status_code": 400, "mac": "14CFE21473B2", "csid": "7c96a912-2ab8-4d7b-8bce-2b9b08d4340e", "duration": "5.74", "path": "/api/v1/gateway-configset/7c96a912-2ab8-4d7b-8bce-2b9b08d4340e/gateway-cpe/14CFE21473B2/configset?action=apply_and_associate", "remote_ip": "192.168.56.101", "method": "POST", "host_name": "node2:8080"}, "name": "request", "thread": 139797986514688, "created": 1444210634.208494, "threadName": "Thread-131", "msecs": 208.4939479827881, "filename": "middlewares.py", "levelno": 20, "processName": "MainProcess", "pathname": "/vol/xpc/src/shared/audit_logging/middlewares.py", "lineno": 48, "exc_info": null, "_time": "2015-10-07 09:37:14.208723", "levelname": "INFO"} 
{"relativeCreated": 1571770.4730033875, "process": 13962, "module": "middlewares", "funcName": "process_response", "levelchar": "I", "exc_text": "", "message": "Request finished", "extra": {}, "audit": {"body": "", "content_length": 107, "start-time": "2015-10-07 04:37:14.213509", "audit_id": "fdd605886cd611e582d8080027ce083d", "status_code": 200, "mac": "14CFE21473B2", "user_agent": "python-requests/2.7.0 CPython/2.7.8 Linux/2.6.32-573.7.1.el6.x86_64", "duration": "2.64", "path": "/api/v1/gateway-cpe/14CFE21473B2/association", "remote_ip": "192.168.56.101", "method": "GET", "host_name": "node2:8080"}, "name": "request", "thread": 139797986514688, "created": 1444210634.216197, "threadName": "Thread-132", "msecs": 216.19701385498047, "filename": "middlewares.py", "levelno": 20, "processName": "MainProcess", "pathname": "/vol/xpc/src/shared/audit_logging/middlewares.py", "lineno": 48, "exc_info": null, "_time": "2015-10-07 09:37:14.216385", "levelname": "INFO"}

However, on the other node, which runs Splunk Enterprise, I see these two lines grouped under a single timestamp, that is:

column Time: 10/7/15 6:37:14.000 PM
column Event: the two JSON lines above

I want each JSON line to be treated as a separate event. I saw people suggesting changes to props.conf, so I tried this:

[root@node1 local]# cat /opt/splunk/etc/system/local/props.conf
[mysourcetype]
KV_MODE = json
SHOULD_LINEMERGE = false
TRUNCATE = 0
TIME_PREFIX = _time
LINE_BREAKER=([\r\n]+)

but it did not work. Can anyone help? Thanks a lot.
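
For context, a file like this would typically be picked up on the forwarder with a monitor stanza along these lines (the path below is a placeholder; what matters is that the sourcetype it assigns matches the props.conf stanza name):

[monitor:///var/log/myapp/audit.log]
sourcetype = mysourcetype
disabled = false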


ehughes100
Explorer

I had the same problem when we were using Apache NiFi to push to a TCP data input in Splunk. The fix for us was to change the source type in the TCP data input configuration to json_no_timestamp, by setting "Set sourcetype" to "From List" and then picking it from the list.
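
For reference, the equivalent inputs.conf stanza for such a TCP data input would look roughly like this (the port number is a placeholder):

[tcp://5514]
sourcetype = json_no_timestamp

json_no_timestamp is one of Splunk's pretrained sourcetypes; since the incoming JSON is not expected to carry a timestamp Splunk can parse, events are stamped with the current index time.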

somesoni2
Revered Legend

Try this for your props.conf (on the indexer or heavy forwarder; restart after the change):

[mysourcetype]
KV_MODE = json
SHOULD_LINEMERGE = false
TRUNCATE = 0
LINE_BREAKER = ([\r\n]+)(?=\{\"relativeCreated\")
TIME_PREFIX = start-time\":\s*\"
TIME_FORMAT = %Y-%m-%d %H:%M:%S.%N
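
A note on why this works: the first capture group in LINE_BREAKER is the text Splunk discards as the event boundary, and the lookahead (?=\{\"relativeCreated\") ensures the break only happens on newlines immediately followed by the start of a new JSON object, so the opening brace stays with its event. TIME_PREFIX is a regex for the text immediately preceding the timestamp (here the nested start-time field), and %N in TIME_FORMAT matches the subsecond digits.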

lstruman
New Member

Thanks for the suggestion. However, it did not work. I don't know if it matters, but if I slow down the rate at which data is written to the log file, things get better. That is, if I introduce a delay after each new line is appended to the log file, the number of "merged" events decreases; if I make it very slow, the merging seems to disappear entirely. However, that is only feasible for testing. In production the log file grows very fast.
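
One plausible explanation for the speed dependence: the universal forwarder does not parse events, it just ships chunks of the file, so when lines are written slowly each line tends to reach the indexer in its own chunk and gets broken correctly by accident, while under load several lines arrive together and the default line-merging heuristics group them. That makes it essential that the stanza above is active on the indexer (or a heavy forwarder) and that the sourcetype actually matches the incoming data. If you would rather key the timestamp off the event's own _time field, a sketch along the same lines (all values are assumptions to adapt) would be:

[mysourcetype]
KV_MODE = json
SHOULD_LINEMERGE = false
TRUNCATE = 0
LINE_BREAKER = ([\r\n]+)(?=\{)
TIME_PREFIX = \"_time\":\s*\"
TIME_FORMAT = %Y-%m-%d %H:%M:%S.%6N
MAX_TIMESTAMP_LOOKAHEAD = 30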
