
[Time Extraction] Event time extraction for a log file whose events have different time fields

rbal_splunk
Splunk Employee

In one log file, the format of the event time differs by event type:

i) For this event type, we would like "time":1544476509062 to be extracted as the event time:

{"type":"transaction", "time":1544476509062, "path":"/healthcheckbalance", "protocol":"https", "protocolSrc":"8075", "duration":0, "status":"success", "serviceContexts":[], "customMsgAtts":{}, "correlationId":"5dd70e5ca4224df8175513e2", "legs":[]}

ii) For this event type, we need to extract the event time from the leg-0 entry, "leg":0, "timestamp":1544476508996:

{"type":"transaction", "time":1544476509047, "path":"/creditos/v1/lis", "protocol":"https", "protocolSrc":"8065", "duration":51, "status":"success", "serviceContexts":[{"service":"creditos", "monitor":true, "client":"28ebc792-a5fa-4ec7-b2e4-fe3a98c7d52c", "org":"Canais_Internos", "app":"Super App", "method":"GET /lis", "status":"success", "duration":29}], "customMsgAtts":{}, "correlationId":"5cd70e5c9c22d31f411c8a9d", "legs":[{"uri":"/creditos/v1/lis", "status":200, "statustext":"OK", "method":"GET", "vhost":null, "wafStatus":0, "bytesSent":755, "bytesReceived":1014, "remoteName":"10.28.67.235", "remoteAddr":"10.28.67.235", "localAddr":"10.28.72.157", "remotePort":"37672", "localPort":"8065", "sslsubject":null, "leg":0, "timestamp":1544476508996, "duration":51, "serviceName":"creditos", "subject":"28ebc792-a5fa-4ec7-b2e4-fe3a98c7d52c", "operation":"GET /lis", "type":"http", "finalStatus":"Pass"}, {"uri":"/creditos/v1/lis", "status":200, "statustext":"OK", "method":"GET", "vhost":null, "wafStatus":0, "bytesSent":1043, "bytesReceived":873, "remoteName":"172.16.88.66", "remoteAddr":"172.16.88.66", "localAddr":"10.28.72.157", "remotePort":"80", "localPort":"40090", "sslsubject":null, "leg":1, "timestamp":1544476509019, "duration":27, "serviceName":"creditos", "subject":"28ebc792-a5fa-4ec7-b2e4-fe3a98c7d52c", "operation":"GET /lis", "type":"http", "finalStatus":null}]}

iii) For this event type, the event time should come from "logCreationTime":"2018-12-10 19:15:02.283":

{"type":"header", "logCreationTime":"2018-12-10 19:15:02.283", "hostname":"fa163e4631ael03.ctmm1.prod.cloud.ihf", "domainId":"330f4a77-989c-4f70-8184-8f6a2ca44da9", "groupId":"group-2", "groupName":"GtwInterno", "serviceId":"instance-100", "serviceName":"GtwInstance_28_72_157", "version":"v7.5.3-Internal"}


rbal_splunk
Splunk Employee

1. Props.conf Configuration

[json_two_timeformat]
INDEXED_EXTRACTIONS = json
SHOULD_LINEMERGE = false
KV_MODE = none
TIME_PREFIX = (\"time\":|\"logCreationTime\")
MAX_TIMESTAMP_LOOKAHEAD = 1000
DATETIME_CONFIG = /etc/system/local/datetime_json_three_timeformats.xml
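
After deploying the stanza, btool can confirm Splunk sees it (standard CLI; the sourcetype name is from the stanza above):

$SPLUNK_HOME/bin/splunk btool props list json_two_timeformat --debug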

2. Create $SPLUNK_HOME/etc/system/local/datetime_json_three_timeformats.xml

<!-- SPLUNK_HOME/etc/system/local/datetime_json_three_timeformats.xml -->

<datetime>

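<!-- Case ii: skip the transaction's own 13-digit "time", then capture epoch seconds and milliseconds from the leg-0 "timestamp" -->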
<define name="_utcepoch_leg0" extract="utcepoch, subsecond">
    <text><![CDATA[\d{13},\s+.*? \"leg\":0, \"timestamp\":(\d{10})(\d{3})]]></text>
</define>

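<!-- Case i: capture epoch seconds and milliseconds from "time" when the event's legs array is empty -->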
<define name="_utcepoch_time" extract="utcepoch, subsecond">
   <text><![CDATA[(\d{10})(\d{3}), .* \"legs\":\[\]]]></text>
</define>

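<!-- Case iii: header events carry a human-readable "logCreationTime" (YYYY-MM-DD HH:MM:SS.mmm) -->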
<define name="logCreationTime" extract="year, month, day, hour, minute, second, subsecond">
   <text><![CDATA[(\d{4})-(\d{2})-(\d{2})\s+(\d{2}):(\d{2}):(\d{2})\.(\d{3})]]></text>
</define>


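<!-- Register all three patterns for both time-of-day and date resolution -->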
<timePatterns>
    <use name="_utcepoch_leg0"/>
    <use name="_utcepoch_time"/>
    <use name="logCreationTime"/>
</timePatterns>
<datePatterns>
    <use name="_utcepoch_leg0"/>
    <use name="_utcepoch_time"/>
    <use name="logCreationTime"/>
</datePatterns>


</datetime>
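
The three defines are mutually exclusive by construction: _utcepoch_leg0 only fires when a 13-digit "time" is followed by a "leg":0 timestamp (case ii), _utcepoch_time requires an empty "legs":[] array (case i), and logCreationTime only matches the header's human-readable format (case iii). A rough way to confirm each pattern targets only its intended event shape, run against the sample file from step 4 (assumes grep with PCRE support):

grep -cP '"legs":\[\]' json_two_timeformat.log                     # case i events
grep -cP '"leg":0, "timestamp":\d{13}' json_two_timeformat.log     # case ii events
grep -cP '"logCreationTime"' json_two_timeformat.log               # case iii header events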


3. Restart Splunk

4. Ingest the sample file
$SPLUNK_HOME/bin/splunk add oneshot json_two_timeformat.log -sourcetype json_two_timeformat
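
The verification search in step 5 assumes the events landed in index=test01; if that index is not the default, pass it explicitly (-index is a standard add oneshot parameter):

$SPLUNK_HOME/bin/splunk add oneshot json_two_timeformat.log -sourcetype json_two_timeformat -index test01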

5. Results:

$ ./bin/splunk search "index=test01 sourcetype=json_two_timeformat | spath path=legs{}.timestamp output=legs_timestamp | eval legs0_timestamp=mvindex(legs_timestamp, 0) | eval T=_time | table _time T legs0_timestamp time"
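
As a spot check that each transaction got the intended timestamp (a sketch, assuming the same index and sourcetype; expected is the leg-0 timestamp when legs exist, otherwise the event's own "time"):

$ ./bin/splunk search 'index=test01 sourcetype=json_two_timeformat type=transaction | spath path=legs{}.timestamp output=leg_ts | eval expected=if(isnotnull(mvindex(leg_ts,0)), mvindex(leg_ts,0)/1000, time/1000) | eval result=if(abs(_time-expected)<0.001, "match", "MISMATCH") | table _time expected result'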
