Getting Data In

How do I configure a sourcetype for JSON data so that each line is parsed as a distinct event?

rfitch
Path Finder

I have an application that logs single-line JSON statements. For some reason, when I load this into Splunk, most of the events are arbitrarily grouped together. I want each line to be a distinct event. Here is an example of two lines being grouped into one event.

{"correlationId":"19432348-67ec-4942-97b8-afbd5946e450", "logger":"stuff.stuff.controller.Controller", "timestamp":"2016-09-22T11:44:14,799 MDT", "level":"INFO ", "threadId":"", "thread":"stuff-http--76", "threadPriority":"", "message":"stuff API starting"}
{"correlationId":"19432348-67ec-4942-97b8-afbd5946e450", "logger":"stuff.stuff.handler.Handler", "timestamp":"2016-09-22T11:44:15,026 MDT", "level":"INFO ", "threadId":"", "thread":"stuff-http--76", "threadPriority":"", "message":"stuff part of process completed in 225 ms, resourceEndPoint :[/server/stuff/stuff/rest/v2/@@stuffClass@@/]"}

I've tried some different JSON source types and I keep getting this behavior. I've also tried not setting a source type and letting Splunk Cloud determine what it is. Still the same behavior.

How can I force each line into its own event?

nareshinsvu
Builder

I had a similar requirement. My logs contain both plain text and JSON messages.

You can use the settings below in your props.conf to keep the JSON messages as-is, one event per line, and then run rex on each event as needed:

LINE_BREAKER = ([\r\n]+)
SHOULD_LINEMERGE = false
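
For reference, a minimal sketch of where those settings live in props.conf, assuming a custom sourcetype (the name my_json_app is just a placeholder):

[my_json_app]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)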

sloshburch
Splunk Employee

Try KV_MODE=json as per https://docs.splunk.com/Documentation/Splunk/6.4.3/Admin/Propsconf . Additionally, the event breaks can be tuned in the props to occur at newlines followed by curly brackets. I would imagine the default LINE_BREAKER works, but I suppose you could tune it to LINE_BREAKER=([\r\n]+)\{. See: https://docs.splunk.com/Documentation/Splunk/6.4.3/Admin/Propsconf#Line_breaking
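
Putting those together, a props.conf stanza for this data might look something like the following sketch (the sourcetype name is a placeholder):

[your_json_sourcetype]
LINE_BREAKER = ([\r\n]+)\{
SHOULD_LINEMERGE = false
KV_MODE = json

Only the text of the first capturing group is discarded at the break, so the opening curly bracket stays with the event.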

Is your time stamp coming out correctly?

rfitch
Path Finder

Gave that a shot and it didn't change anything. The timestamp doesn't work for every event, even though they are structured the same.

Timestamp extracted from this event with 2 lines:

{"correlationId":"87165dae-6c7f-415f-8133-f30f955cbfb3", "logger":"stuff.XslUtil", "timestamp":"2016-09-22T13:36:58,861 MDT", "level":"INFO ", "threadId":"", "thread":stuff-http--80", "threadPriority":"", "message":"transform operation completed - run timing info: 6 ms"}
{"correlationId":"87165dae-6c7f-415f-8133-f30f955cbfb3", "logger":"stuffController", "timestamp":"2016-09-22T13:36:58,887 MDT", "level":"INFO ", "threadId":"", "thread":"tomcat-http--80", "threadPriority":"", "message":"stuffClasses operation completed - total time: 429 ms"}

Timestamp not extracted from this event with a single line:

{"correlationId":"88e9b32e-3666-4615-9eb6-54dc45ac436c", "logger":"kennis.kdp.handler.PassthroughResourceHandler", "timestamp":"2016-09-22T13:36:55,352 MDT", "level":"INFO ", "threadId":"", "thread":"tomcat-http--19", "threadPriority":"", "message":"Denodo part of process completed in 432 ms, resourceEndPoint :[/server/kennis__fund_v2_0/fund_classes/rest/v2/@@fundClass@@/]"}

I'm working with the Source types GUI since this is Splunk Cloud. I'd attach some screenshots, but I don't have enough karma points.

jacobpevans
Motivator

In a test environment, navigate to Settings > Add data > Upload. Upload a saved copy of your log file. Change the sourcetype to _json (or a clone of it), and play with it from there. This is much easier than guessing parameters in .conf files. As an example, once the JSON is parsing properly, you can simply pick timestamp as the field that _time derives from. Best of luck.
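
If you later translate those GUI settings back into props.conf, a sketch of the timestamp portion, assuming the timestamp field and format shown in the sample events above, would be something like:

TIME_PREFIX = "timestamp":\s*"
TIME_FORMAT = %Y-%m-%dT%H:%M:%S,%3N %Z
MAX_TIMESTAMP_LOOKAHEAD = 40

That tells Splunk to read values like 2016-09-22T13:36:58,861 MDT from the timestamp field rather than guessing.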

Cheers,
Jacob

If you feel this response answered your question, please do not forget to mark it as such. If it did not, but you do have the answer, feel free to answer your own post and accept that as the answer.

sloshburch
Splunk Employee

Could you share your config up to this point? When adding the data in the GUI, you should be able to see the settings in the menu at the bottom left.
