Getting Data In

How to get timestamp from incomplete time field

mblauw
Path Finder

Today I've been trying to index a log file in which the time field only contains the hour. I've tried importing it in several ways, but I can't seem to get it to work. Does anybody know how Splunk can index this log file correctly?

Here is one line from the log. The first element is the date field (here: 2016-12-01) and the second is the hour field (here: 0):

2016-12-01;0;BIM03_R_RWSTI4143;BIM03;ARS;RWS BI-meetnet;BI Meetnet Geo3 ARA;Speed;lus;lane1;60.00;2017-01-23 22:00:00;100.00;0.00;0.00;0.00;95.00;5.00;0.00;0.00;0.00;0.00;0.00;0.00;96.67;3.33;0.00;0.00;0.00;0.00;0.00;0.00;0.00;0.00

0 Karma
1 Solution

somesoni2
Revered Legend

Something like this worked for me for your sample data

[<SOURCETYPE NAME>]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)(?=\d{4}-\d{2}-\d{2};)
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%d;%H
MAX_TIMESTAMP_LOOKAHEAD = 13
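
As a quick sanity check (not from the original thread), the timestamp format can be tried against the start of the sample event with an equivalent strptime call in Python. This is only a sketch; it assumes the leading date is year-month-day, consistent with the other timestamp field in the same line (2017-01-23 22:00:00).

from datetime import datetime

# First characters of the sample event: date field, ";", hour field.
# MAX_TIMESTAMP_LOOKAHEAD = 13 keeps Splunk's timestamp search within this span.
prefix = "2016-12-01;0"

# Same directives as TIME_FORMAT above; strptime also accepts the single-digit hour.
print(datetime.strptime(prefix, "%Y-%m-%d;%H"))  # 2016-12-01 00:00:00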

mblauw
Path Finder

Thank you so much. I've been trying to figure out how this works for quite a while now. Do you have any documentation on how to construct such sourcetypes (especially how to build the LINE_BREAKER setting, and why TIME_PREFIX=^)?

0 Karma