Getting Data In

Why is Splunk's index evaluation of _time inconsistent with the timestamp in the log file?

squiggle
Explorer

Splunk's index evaluation of _time is not consistent with what is in the log. See the two entries below; both are from the same log. The first is stamped with a time of 4:29:17 AM, but as you can see, that is not the timestamp in the event. The second, almost exactly the same, is stamped correctly at 1:15:36. Where does Splunk get 4:29:17 AM from, and why? Most importantly, how do I prevent this from happening? It is a less-than-5% occurrence, but when it happens it throws off my dashboards completely and sends people into a panic.

8/4/14
4:29:17.402 AM

20140804_01:15:17.402-1014-EOBCloseThreadLaunch_Start

8/4/14
1:15:36.236 AM

20140804_01:15:36.236-3797-EOBCloseThreadLaunch_Start

Here is the relevant search string (max because I always want the later of the two):
index=daily_logs "EOBCloseThreadLaunch_Start" | stats max(_time) as launch_time by date_mday

1 Solution

martin_mueller
SplunkTrust

I'm guessing those values are flipped, i.e. that the start position is actually 0 and the end position is 25 or 26. That indicates that Splunk is including the "-1014" and "-3797" in the timestamp recognition; 21 would have been correct (8 date digits + 9 time digits + 4 punctuation characters = 21).

Which props.conf file is relevant depends on which sourcetype these events have. However, if none of them has ever been edited, there are no existing settings to look for.

Define a new sourcetype in props.conf, either in etc/apps/your_app/default if you're developing an app or in etc/system/local if you're not, like this:

[your_sourcetype]
TIME_FORMAT = %Y%m%d_%H:%M:%S.%3N
TIME_PREFIX = ^
MAX_TIMESTAMP_LOOKAHEAD = 22
NO_BINARY_CHECK = 1
pulldown_type = 1

That'll make sure Splunk uses only the timestamp itself and not the "-1234" that follows it, which might otherwise be mistaken for an odd time zone offset.
It'll probably be easiest for you to define that sourcetype through the data preview: http://docs.splunk.com/Documentation/Splunk/6.1.2/Data/Overviewofdatapreview
For more background about timestamp extraction read up on http://docs.splunk.com/Documentation/Splunk/6.1.2/Data/HowSplunkextractstimestamps
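
As a quick sanity check once the new sourcetype is in place, a search along these lines should surface any events where the indexed _time still disagrees with the timestamp at the front of the raw event. This is only a sketch: your_sourcetype is the placeholder name from the stanza above, the 21-character substr matches the 8 + 9 + 4 count mentioned earlier, and it assumes your search user's time zone matches the one the events were indexed with.

index=daily_logs sourcetype=your_sourcetype "EOBCloseThreadLaunch_Start"
| eval raw_ts = substr(_raw, 1, 21)
| eval parsed_ts = strftime(_time, "%Y%m%d_%H:%M:%S.%3N")
| where raw_ts != parsed_ts
| table _time raw_ts parsed_ts _raw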


jerrin
Explorer

Thanks for the above solution. As a newbie, it really helped me to explore more. But I am trying to figure out how to define props.conf when the source is ingested from AWS SQS.

Right now the events are timestamped with the index time. I have a field,

created_timestamp

and I would prefer the events to be timestamped from that field.

My sourcetype is sourcetype="aws:s3:accesslogs".

If I search like this: index=* sourcetype="aws:s3:accesslogs" I get four indexes, but I want this to affect just one index and set my timestamp there.

Please help me!

somesoni2
Revered Legend

There is no way to correct already-indexed data unless you can delete those events and re-index them with the new configuration.
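
For reference, the usual pattern is something along these lines. This is only a sketch: narrow the search to just the affected data first, the delete command requires the can_delete capability and only hides events from search rather than freeing disk space, and the original log files must still be available so they can be re-ingested under the corrected sourcetype.

index=daily_logs sourcetype=your_sourcetype "EOBCloseThreadLaunch_Start" | delete

followed by, for example, a one-shot re-index of the old files from the Splunk CLI (the file path is a placeholder):

splunk add oneshot /path/to/old/logfile.log -sourcetype your_sourcetype -index daily_logs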

squiggle
Explorer

Thank you. I have implemented this in inputs.conf on the forwarder. Time will tell whether it is working or not. I suppose the last part of my question above should be "and once the solution is implemented for future data, how do I correct the past errors?" All those bad timestamp conversions remain in the index.

somesoni2
Revered Legend

The props.conf settings go on the indexer, and the sourcetype association is done in the inputs.conf file on the forwarder. Just identify the monitor stanza for your log and add the attribute sourcetype = YourSourceType.
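
For example, the forwarder-side stanza might end up looking something like this (a sketch only; the monitor path is a placeholder for wherever the log actually lives on that machine, and the sourcetype name matches the stanza from the accepted answer):

# inputs.conf on the forwarder -- path and sourcetype name are placeholders
[monitor://D:\logs\eob\daily.log]
sourcetype = your_sourcetype
index = daily_logs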

squiggle
Explorer

Thanks for the answer and the documentation references. I have implemented this via Splunk Web and it is not working. Reviewing the documentation, it is clear that the sourcetype must be associated with the incoming data stream. However, this log comes from a forwarder, so I find no way within Splunk Web to make this association. Do I need to directly edit the forwarder conf file? If so, is it the same configuration you provided? Thanks for the help, and sorry for the slow reply; this work is third priority and doesn't get as much time as it should.

squiggle
Explorer

Just the values under TIME in the Splunk Web interface? Also, which folder for props.conf? I see 8 files under different paths in the Splunk Windows folder. I'm not sure which is being used... FYI, it has never been edited.

Incorrect (8/4):
timestartpos 26
timeendpos 0

Correct (8/4):
timestartpos 26
timeendpos 0

Incorrect (7/28):
timestartpos 26
timeendpos 0

Correct (7/28):
timestartpos 25
timeendpos 0

martin_mueller
SplunkTrust
SplunkTrust

Do post the values of the timestartpos and timeendpos fields for some correctly timestamped and some incorrectly timestamped events.

Also, do post the props.conf settings used for that sourcetype.
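
A search along these lines should list them next to the raw events (a sketch; timestartpos and timeendpos are default indexed fields, so no extra extraction should be needed, and the index and search term are taken from the question above):

index=daily_logs "EOBCloseThreadLaunch_Start" | table _time, timestartpos, timeendpos, _raw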
