Getting Data In

The lines of some JSON logs are being joined into a single event

dbrancaglion
Explorer

Some of my log lines are being joined into a single event, and because of that I'm missing values that I add later, since Splunk only counts a field as valid when it is inside its own log event.

Example:

*{"id":"4122257","type":"TRANSACAO_CREDITO","amount":3.73,"queued_ms":"3","paySmart_ms":"0","elapsed_ms":"311","instance":"macarico.xxx.xxxx","brand":"XXX","product":1,"status":"approved","tags":["AUTHORIZATION_REQUEST","MAGNETIC","PIN_ENTRY_CAP","PIN","CVC2","NO_CHIP_DATA","TRACK1_PRESENT","ATTENDED_TERMINAL","MERCHANT_TERMINAL_ON_LOCAL","CARDHOLDER_PRESENT","CARD_PRESENT","TERMINAL_MANUAL_MAGNETIC_CHIP","NO_CEP","NO_CNPJ"]} {"id":"4122258","type":"TRANSACAO_CREDITO","amount":20.50,"queued_ms":"2","paySmart_ms":"0","elapsed_ms":"317","instance":"macarico.xxx.xxxx","brand":"XXX","product":1,"status":"denied","reason":"NEGADA_CODIGO_DE_SEGURANCA_2_INVALIDO","tags":["AUTHORIZATION_REQUEST","MAGNETIC","PIN_ENTRY_CAP","PIN","CVC2","NO_CHIP_DATA","TRACK2_PRESENT","ATTENDED_TERMINAL","MERCHANT_TERMINAL_ON_LOCAL","CARDHOLDER_PRESENT","CARD_PRESENT","TERMINAL_MAGNETIC_CHIP","CEP","NO_CNPJ"]}
*

These two JSON objects ended up with the same timestamp in the same event, when they should have been indexed as two different events.

And this only happens in seemingly random cases.

Has anyone had a similar experience?


cpetterborg
SplunkTrust

I think you are not parsing the data correctly, and your props.conf may be the problem. First, you don't have a timestamp in the JSON data (which I would suggest you add). Next, you may not be breaking each line into its own event, though that may not actually be the case with your data, so we can set that aside. And third, it seems most likely that you aren't telling Splunk to index the data as JSON. Here is what I would suggest in your props.conf:

[yoursourcetype]
KV_MODE = json
NO_BINARY_CHECK = 1
SHOULD_LINEMERGE = false

There could be other configurations, but the KV_MODE should be set to json.

If each event arrives on a single line, keep SHOULD_LINEMERGE set to false; if your events span multiple lines, set it to true (along with the appropriate line-breaking settings).
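Since your example shows two JSON objects arriving on the same line, you may also need a LINE_BREAKER so the indexer splits them into separate events. This is only a sketch, assuming every event starts with {"id": and that adjacent objects are separated by at least one whitespace character; adjust the pattern to your actual data:

[yoursourcetype]
SHOULD_LINEMERGE = false
# Sketch only: break before each {"id" so two objects on one line become two events.
# Assumes every event begins with {"id": and objects are separated by whitespace.
LINE_BREAKER = ([\r\n\s]+)(?=\{"id")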

If you have a timestamp in your JSON data, you can add the necessary timestamp parsing settings. If you are just going to use the time the event arrives or the file modification time, then I suggest you configure that explicitly so Splunk doesn't have to work it out every time. See the following reference for more information about timestamp handling:

http://docs.splunk.com/Documentation/Splunk/6.2.2/Data/Tunetimestampextractionforbetterindexingperfo...
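For example, if you were to add a field such as "timestamp":"2018-05-04T13:22:11.123-0300" to each JSON object (the field name and format here are only assumptions, not something in your current data), the timestamp settings could look roughly like this:

[yoursourcetype]
# Sketch only -- assumes an added JSON field like "timestamp":"2018-05-04T13:22:11.123-0300"
TIME_PREFIX = "timestamp":"
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%3N%z
MAX_TIMESTAMP_LOOKAHEAD = 40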



dbrancaglion
Explorer

Works fine!

Many thanks for your help!
