Our application runs in OpenShift 3.6 (Kubernetes/Docker) and, when running standalone, generates raw JSON like this to STDOUT. We're not logging to files right now because we could not get file-based logs into Splunk (we're not sure where apps should log in OpenShift, and that could be part of our issue):
{"@timestamp":"2017-11-29T01:04:40.938-05:00","@version":1,"message":"Preparing to transfer $1028.","logger_name":"ofi.paas.ocp.logging.splunk.loggers.Slf4TransferService","thread_name":"pool-1-thread-1","level":"INFO","level_value":20000,"transactionOwner":"Susan","transactionId":"9fe28a5b-7ed1-4adc-b852-b4exxxxxxxxx","appname":"my-java-app"}
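To illustrate what we expect, here is a minimal Python sketch (the variable names are mine, not from any Splunk tooling) showing that in the standalone event all of our custom fields are top-level keys, which is why a JSON-aware indexer should be able to search them directly:

```python
import json

# The standalone event from the app, exactly as printed to STDOUT
raw_event = (
    '{"@timestamp":"2017-11-29T01:04:40.938-05:00","@version":1,'
    '"message":"Preparing to transfer $1028.",'
    '"logger_name":"ofi.paas.ocp.logging.splunk.loggers.Slf4TransferService",'
    '"thread_name":"pool-1-thread-1","level":"INFO","level_value":20000,'
    '"transactionOwner":"Susan",'
    '"transactionId":"9fe28a5b-7ed1-4adc-b852-b4exxxxxxxxx",'
    '"appname":"my-java-app"}'
)

event = json.loads(raw_event)

# One parse is enough: the custom fields sit at the top level of the event
print(event["transactionOwner"])  # Susan
print(event["appname"])           # my-java-app
```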
We can see these log events in Splunk, but they're not indexed correctly: none of our custom fields are available for searching. Also, the log events in Splunk appear to wrap the raw event generated by our app, so they have a slightly different structure than when the app logs standalone (outside OpenShift):
{"log":"<RAW LOG DATA HERE>\n","stream":"stdout","time":"<TIMESTAMP>"}
Here's an example of the wrapping:
{"log":"{\"@timestamp\":\"2017-11-29T06:44:58.240+00:00\",\"@version\":1,\"message\":\"Has transfer of $681 completed successfully ? true.\",\"logger_name\":\"ofi.paas.ocp.logging.splunk.loggers.Slf4TransferService\",\"thread_name\":\"pool-1-thread-1\",\"level\":\"INFO\",\"level_value\":20000,\"transactionOwner\":\"John\",\"transactionId\":\"9c2025d7-399d-492c-84f1-3a5xxxxxxxxx\",\"appname\":\"my-java-app\"}\n","stream":"stdout","time":"2017-11-29T06:44:58.240849341Z"}
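The consequence of this wrapping, as I understand it, is that the app's JSON ends up as an escaped string inside the wrapper's "log" field, so a single JSON parse only yields "log", "stream", and "time". A minimal Python sketch (abbreviated inner event, names are mine) of what it takes to reach the custom fields:

```python
import json

# A Docker-style wrapped line as it appears in Splunk (inner event abbreviated)
wrapped = (
    '{"log":"{\\"@timestamp\\":\\"2017-11-29T06:44:58.240+00:00\\",'
    '\\"message\\":\\"Has transfer of $681 completed successfully ? true.\\",'
    '\\"transactionOwner\\":\\"John\\",\\"appname\\":\\"my-java-app\\"}\\n",'
    '"stream":"stdout","time":"2017-11-29T06:44:58.240849341Z"}'
)

outer = json.loads(wrapped)       # first parse: only the wrapper's fields appear
inner = json.loads(outer["log"])  # second parse: the app's JSON, stored as a string

print(outer["stream"])            # stdout
print(inner["transactionOwner"])  # John
```

So unless whatever parses the events in Splunk performs that second parse on the "log" field, our custom fields stay buried inside one opaque string value.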
I suspect some intermediary component between OpenShift and Splunk is doing this wrapping, which might be throwing off the parser. That's my take, but I'm new to Splunk, so I'm not entirely sure.