Getting Data In

Does Splunk support timestamps that use Modified Julian Day (MJD)?

Lowell
Super Champion

Has anyone set up monitoring of ntpd stats? The problem I'm running into is that these log files use an unusual timestamp format, and I was wondering if anyone else has figured this out before.

I have two ntpd log files that I would like to monitor with Splunk. We recently had some issues with our clocks getting out of sync, so using Splunk to monitor the NTP services more proactively would be ideal. Here are some sample events:

/var/log/ntpstats/loopstats:

55365 184.755 0.005201000 -20.717 0.000455463 0.153665 6
55365 37381.756 0.000415000 -17.188 0.000230239 0.007782 7
55365 49826.825 -0.031047000 -16.996 0.011537315 0.059551 10
55365 52986.926 0.000128000 -16.451 0.001437442 0.067062 7

/var/log/ntpstats/peerstats:

55365 52995.979 207.46.197.32 9314 -0.003170778 0.074376426 0.082986177 0.033217419
55365 53045.904 127.127.1.0 9014 0.000000000 0.000000000 0.000000000 0.000000954
55365 53047.023 216.45.57.38 9414 -0.000126195 0.079608226 0.002956711 0.010910166
55365 53049.961 66.218.191.240 9614 0.001047601 0.021774658 0.004612503 0.006862981

Based on some docs I found online, this looks like the order of the fields in each file (there's a quick extraction sketch after the list):

 loopstats:  day, second, offset, drift compensation, estimated error, stability, polling interval
 peerstats:  day, second, address, status, offset, delay, dispersion, skew (variance)
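Since both files are just space-delimited columns, these field lists map directly onto a delimiter-based search-time extraction. The sketch below is only one way to wire it up; the stanza names, sourcetype names, and underscored field names are my own placeholders:

# transforms.conf -- delimiter-based field extractions (names are placeholders)
[ntp_loopstats_fields]
DELIMS = " "
FIELDS = day, second, offset, drift_compensation, estimated_error, stability, polling_interval

[ntp_peerstats_fields]
DELIMS = " "
FIELDS = day, second, address, status, offset, delay, dispersion, skew

# props.conf -- attach the extractions to the (hypothetical) sourcetypes
[ntp:loopstats]
REPORT-ntpfields = ntp_loopstats_fields

[ntp:peerstats]
REPORT-ntpfields = ntp_peerstats_fields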

Does Splunk's TIME_FORMAT support splitting the timestamp into separate day and seconds components like this?

I've been able to determine that the day field is a Modified Julian Day (MJD) and the seconds field is the number of seconds past midnight. I can get the correct timestamp with the following Python code (using the mx.DateTime module):

def convert_timestamp(day, seconds):
    from mx.DateTime import DateTimeFromMJD, DateTimeDelta
    day = int(day)            # Modified Julian Day number
    seconds = float(seconds)  # seconds past UTC midnight
    # MJD date plus the offset within that day
    timestamp = DateTimeFromMJD(day) + DateTimeDelta(0, 0, 0, seconds)
    # mx.DateTime's .second is a float, so pull the fractional part out as milliseconds
    return timestamp.strftime("%Y-%m-%d %H:%M:%S") + (".%03d" % (divmod(timestamp.second, 1)[1] * 1000))
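If mx.DateTime isn't available, the same conversion works with just the standard library; the only extra fact needed is that MJD 0 corresponds to 1858-11-17 00:00 UTC. A minimal sketch (the function name is mine):

from datetime import datetime, timedelta

MJD_EPOCH = datetime(1858, 11, 17)  # MJD 0 == 1858-11-17 00:00 UTC

def mjd_to_timestamp(day, seconds):
    """Turn an MJD day number plus seconds-past-midnight into a readable string."""
    ts = MJD_EPOCH + timedelta(days=int(day), seconds=float(seconds))
    # trim microseconds down to milliseconds to match the log resolution
    return ts.strftime("%Y-%m-%d %H:%M:%S.%f")[:-3]

For example, mjd_to_timestamp(55365, 184.755) should come out as 2010-06-18 00:03:04.755, i.e. the first loopstats sample above.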

As a workaround, I've written a script (using the Python function above) to reformat the NTP day/seconds values into a more traditional timestamp format; a rough sketch of that wrapper is below. Hopefully Splunk will someday support this type of custom time format more natively.
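The wrapper itself doesn't need to be anything fancy. Something along these lines works as a cron job or scripted input, assuming convert_timestamp() from above is defined in the same script and with file paths left up to you:

import sys

def reformat_ntpstats(path):
    """Print each ntpstats line with a conventional timestamp prepended,
    so Splunk's normal strptime-based timestamp recognition can handle it."""
    with open(path) as f:
        for line in f:
            parts = line.split()
            if len(parts) < 2:
                continue  # skip blank or malformed lines
            day, seconds = parts[0], parts[1]
            # convert_timestamp() is the MJD helper shown earlier
            print("%s %s" % (convert_timestamp(day, seconds), line.rstrip()))

if __name__ == "__main__":
    reformat_ntpstats(sys.argv[1])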


team_webangle
Engager

For what it's worth, here are the field extractions for loopstats:

^(?P<date_mjd>\d+) (?P<sec_past_midnight>[0-9\.]+) (?P<clock_offset_sec>[0-9\.\-]+) (?P<frequency_offset_ppm>[0-9\.\-]+) (?P<jitter_sec>[0-9\.\-]+) (?P<wander_ppm>[0-9\.\-]+) (?P<clock_discipline>[0-9\.\-]+) 
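If you'd rather keep that in configuration than in the UI, the same regex can live in props.conf as an inline extraction; the sourcetype name here is just a placeholder:

# props.conf -- search-time extraction (sourcetype name is hypothetical)
[ntp:loopstats]
EXTRACT-loopstats = ^(?P<date_mjd>\d+) (?P<sec_past_midnight>[0-9\.]+) (?P<clock_offset_sec>[0-9\.\-]+) (?P<frequency_offset_ppm>[0-9\.\-]+) (?P<jitter_sec>[0-9\.\-]+) (?P<wander_ppm>[0-9\.\-]+) (?P<clock_discipline>[0-9\.\-]+)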

jrodman
Splunk Employee

As far as I know, we're pretty closely tied to the strptime model.

http://www.splunk.com/base/Documentation/4.1.3/Admin/Configuretimestamprecognition

We do handle timezone offsets, but not second offsets.
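To put that in concrete terms: once the events have a conventional timestamp prepended (as in the workaround above), recognition is just an ordinary strptime-style props.conf setting. A sketch, with a placeholder sourcetype name:

# props.conf -- timestamp recognition for the reformatted events (sourcetype name is a placeholder)
[ntp:loopstats]
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%d %H:%M:%S.%3N
MAX_TIMESTAMP_LOOKAHEAD = 25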

Please do file an enhancement request with support if this is important to you, especially if you can clue us into which sources use this format.
