Getting Data In

How to train Splunk to learn my timestamps?

gregbo
Communicator

I'm trying to use splunk train to learn my timestamps so I can put them in a datetime.xml file, and it won't even try. My timestamps look like: 20180529132292003-0700 and 20180529132292-0700

All it says is:

Interactively learning date formats.

Skipping unpromising line 1.
Skipping unpromising line 2.

xpac
SplunkTrust

My guess would be that, because of the lack of any separators, it doesn't see a way to actually extract anything meaningful out of it. For a human, the timestamp is rather easy to read, but not for software. I'd try to either change the timestamps to contain separators, or just do this manually, by either setting TIME_FORMAT or building a datetime.xml yourself.

However, I tried to build a timestamp extraction for this - and the format is pretty weird.

Using 20180529132292003-0700, I'd split it like 2018-05-29 13:22, but after that - what is 92003 supposed to mean? Is it 0.92003 minutes? Or is this just a bad example?

You see, if a human has difficulty getting a proper timestamp out of it... how should Splunk do it? 😉

Hope that helps - if it does, I'd be happy if you would upvote/accept this answer so others can benefit from it. 🙂

gregbo
Communicator

I can't change the format of the timestamp; it's an international standard for the data I'm trying to pull in.

xpac
SplunkTrust

And the same data stream sometimes has subseconds and sometimes doesn't? (like the 003 part in your example above)

gregbo
Communicator

They usually have subseconds of "000".

xpac
SplunkTrust

Okay, if subseconds are always present, you could simply use TIME_FORMAT = %Y%m%d%H%M%S%3N%z. Splunk actually recognizes your timestamp without any additional training, but for some reason ignores the timezone information...
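
In props.conf on the parsing layer (indexer or heavy forwarder) that would look roughly like the sketch below - the sourcetype name, the TIME_PREFIX and the lookahead value are just placeholders/assumptions on my part, since I don't know your sourcetype or where the timestamp sits in the event:

    [your_sourcetype]
    # assumes the timestamp is at the very start of the event
    TIME_PREFIX = ^
    TIME_FORMAT = %Y%m%d%H%M%S%3N%z
    # 20180529132292003-0700 is 22 characters, so a small lookahead is enough
    MAX_TIMESTAMP_LOOKAHEAD = 25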

gregbo
Communicator

Ah, I misspoke...subseconds are almost always "000" when they are there, but sometimes they aren't there at all.

xpac
SplunkTrust

Mhh, that's pretty shitty... any chance of getting that fixed at the log source? Inconsistent timestamps might actually force you to go back to datetime.xml...
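
If you do end up in datetime.xml land, a minimal custom file for this mashed format might look something like the sketch below. It's untested - the regex, the optional subsecond group and especially the "zone" extraction are assumptions on my part, so compare it against the stock $SPLUNK_HOME/etc/datetime.xml before using it:

    <datetime>
    <define name="_mashedtz" extract="year, month, day, hour, minute, second, subsecond, zone">
    <!-- subsecond group is optional, to cover both of your formats -->
    <text><![CDATA[(20\d\d)(0\d|1[0-2])([0-2]\d|3[01])([01]\d|2[0-3])([0-5]\d)([0-5]\d)(\d{3})?([+-]\d{4})]]></text>
    </define>
    <timePatterns>
    <use name="_mashedtz"/>
    </timePatterns>
    <datePatterns>
    <use name="_mashedtz"/>
    </datePatterns>
    </datetime>

You'd then point the sourcetype at it via DATETIME_CONFIG in props.conf (the path is relative to $SPLUNK_HOME), e.g. DATETIME_CONFIG = /etc/datetime_custom.xml.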

gregbo
Communicator

Next to no chance...I'll probably have to try datetime.xml, which hasn't worked at all the last couple times I tried it...oh well

FrankVl
Ultra Champion

Why try to train it rather than just specify the format explicitly with TIME_FORMAT?

As also discussed in your other question here: https://answers.splunk.com/answers/660301/why-arent-timestamps-being-recognized-consistently.html

PS: 20180529132292003-0700 is a strange timestamp. Based on your previous explanation, this has 92 as the seconds, which cannot be correct...

gregbo
Communicator

That was a typo. I changed it and it still didn't work. I'm trying to train because the data has timestamps in two different formats, and multiple posts made it sound like using datetime.xml would be "easy". I can't change the format of the timestamp in the data; it's an international standard for this type of data.

FrankVl
Ultra Champion

No options to collect those 2 different formats separately?
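
If you can split them, each feed could get its own sourcetype with an explicit TIME_FORMAT in props.conf - roughly like this (sourcetype names are just placeholders):

    [mydata_subsec]
    TIME_FORMAT = %Y%m%d%H%M%S%3N%z
    MAX_TIMESTAMP_LOOKAHEAD = 25

    [mydata_nosubsec]
    TIME_FORMAT = %Y%m%d%H%M%S%z
    MAX_TIMESTAMP_LOOKAHEAD = 25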
