Getting Data In

Recognition of FILETIME Timestamps

tom_frotscher
Builder

Hello,

I have the following problem:

I have to read logfiles into Splunk that contain an uncommon timestamp format. After a bit of research, I realized that the timestamps are in a modified version of the Windows FILETIME format. The definition is basically:

a 64-bit value representing the number of 100-nanosecond intervals since January 1, 1601 (UTC).

As far as I can see, Splunk is not able to parse this timestamp format correctly. Additionally, there is the small modification I mentioned before: in my files, the last four digits of the timestamp are cut off.

Here is a small example:

  1. 13043462391557 <- Timestamp from my files
  2. 130434623915570000 <- Timestamp as FILETIME definition implies

So if I am correct, this should be something like "millisecond intervals since January 1, 1601 (UTC)", since cutting four digits from a 100-nanosecond count leaves units of 10,000 × 100 ns = 1 ms.

I know I could just read it in and do some math to convert the timestamp to epoch, but I would prefer to get clean timestamp recognition working at index time.

Is there any way Splunk can correctly recognize this timestamp format, or at least the original FILETIME format?
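For reference, the "do some math" conversion mentioned above can be sketched in plain Python. This is an illustration outside of Splunk; the function names are made up for this example. A full FILETIME is divided by 10^7 to get seconds, the truncated variant by 10^3 (since its units are 1 ms), and in both cases the 1601-to-1970 offset of 11,644,473,600 seconds is subtracted:

```python
import datetime

# Seconds between 1601-01-01 and 1970-01-01 (the Unix epoch)
EPOCH_DIFF_S = 11_644_473_600

def filetime_to_epoch(ft):
    """Full FILETIME: 100-nanosecond intervals since 1601-01-01 UTC."""
    return ft / 10_000_000 - EPOCH_DIFF_S

def truncated_filetime_to_epoch(tft):
    """Variant from the logs: last four digits cut, so units are 1 ms."""
    return tft / 1_000 - EPOCH_DIFF_S

# Both sample values from the question land on the same instant:
full = filetime_to_epoch(130434623915570000)
cut = truncated_filetime_to_epoch(13043462391557)
print(datetime.datetime.fromtimestamp(int(full), tz=datetime.timezone.utc))
# -> 2014-05-01 23:59:51+00:00
```

The conversion tool cited later in the thread appears to render this same instant in a local time zone, which is why it shows "7:59:52pm".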

strive
Influencer

I used the manual conversion tool that you specified.
For 130434623915570000, if FILETIME is chosen as the input format, then, as you say, it displays Thursday, May 1, 2014, 7:59:52 PM.


martin_mueller
SplunkTrust

Note, this timestamp doesn't map to May 2011. In fact, it maps to Thursday, May 1, 2014, 7:59:52 PM.

Mapping to May 2011 is what Splunk does by default, because it incorrectly interprets this value as a Unix epoch timestamp with extra precision.
Manual conversion tool: http://www.silisoftware.com/tools/date.php
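For illustration, this May-2011 misreading can be reproduced outside Splunk. This is plain Python, not Splunk internals; treating the first ten digits as epoch seconds is an assumption about what "extra precision" means here:

```python
import datetime

ts = 13043462391557  # truncated FILETIME taken from the logs

# Misreading: treat the value as a Unix epoch timestamp whose trailing
# digits are sub-second precision, i.e. keep the first ten digits as seconds.
as_epoch_s = ts // 10_000  # 1304346239
print(datetime.datetime.fromtimestamp(as_epoch_s, tz=datetime.timezone.utc))
# -> 2011-05-02 14:23:59+00:00
```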

strive
Influencer

Oops, that was a mistake. Thanks for correcting it.


yannK
Splunk Employee

Are you sure that it works?

I thought that timestamp detection happens before the transforms regex replacement.
See http://docs.splunk.com/Documentation/Splunk/6.1.2/Admin/Configurationparametersandthedatapipeline

strive
Influencer

I haven't tried this kind of conversion, but if I have to put in my two cents...

I used http://www.epochconverter.com/ to check the sample timestamps that you have given. I feel the second one has two extra zeroes.

Both 1304346239 and 1304346239155700 resolve to the same date & time: Mon, 02 May 2011 14:23:59 GMT.

So, I feel you need to ignore the last four digits of the timestamps that are present in your files.

As Yann pointed out, use TIME_FORMAT in props.conf.



tom_frotscher
Builder

Hi, thanks for your answer. I already did the conversion of the timestamp in Splunk, but as mentioned, I would prefer to detect the timestamp correctly at index time.

Also, the "130434623915570000" timestamp is not a 1970 epoch timestamp with higher precision; it is still FILETIME. With your approach, I would get the same result as already mentioned by @strive.
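For readers on newer Splunk versions: since TIME_FORMAT cannot express the 1601-based FILETIME offset, one index-time workaround is INGEST_EVAL in transforms.conf (available from Splunk 7.2, so not in the 6.x versions current in this thread). This is a sketch, not a tested configuration; the stanza names are illustrative, and it assumes the event begins with the 14-digit truncated timestamp:

```ini
# props.conf
[my_filetime_sourcetype]
TRANSFORMS-filetime = filetime_to_epoch

# transforms.conf
[filetime_to_epoch]
# Truncated FILETIME is in milliseconds since 1601-01-01 UTC:
# divide by 1000 for seconds, then subtract the 1601-to-1970 offset.
INGEST_EVAL = _time = tonumber(substr(_raw, 1, 14)) / 1000 - 11644473600
```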
