Getting Data In

DB connect - timestamp issue

kochera
Communicator

Hi,

We use DB Connect to monitor database jobs. Each job generates an entry. The field log_number is an increasing counter, and the field log_entrydate is the actual timestamp.

log_entrydate format: 2013-06-19 14:03:12.183506

I tried to extract the timestamp and get one event per log_number, but was not successful. Configuration:

output.timestamp = true
output.timestamp.column = LOG_ENTRYDATE
output.timestamp.parse.format = yyyy-MM-dd HH:mm:ss.SSS --> I don't know how to specify the fractional seconds here

Any help is appreciated.

Cheers,
Andy

1 Solution

ziegfried
Influencer

DB Connect uses a SimpleDateFormat to parse the timestamp. SimpleDateFormat only supports millisecond precision; the timestamp in your case is more precise than that (microseconds). There are two ways to work around this:

1) Parse the timestamp only up to the seconds (i.e. reduce the precision): yyyy-MM-dd HH:mm:ss, which should work.
2) Use a custom SQL statement and convert the LOG_ENTRYDATE column to TIMESTAMP (e.g. using the TO_TIMESTAMP function: http://docs.oracle.com/cd/B19306_01/server.102/b14200/functions193.htm)
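To see why the SSS pattern misbehaves here, here is a minimal plain-Java sketch (not DB Connect itself; the class name is illustrative). SimpleDateFormat treats SSS as an ordinary number field, so a six-digit fraction is read as a count of milliseconds and silently shifts the event time:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class PrecisionDemo {
    public static void main(String[] args) throws ParseException {
        String raw = "2013-06-19 14:03:12.183506"; // microsecond precision

        TimeZone utc = TimeZone.getTimeZone("UTC");

        // SSS is a plain number field: the lenient parser reads all six
        // digits as 183506 *milliseconds*, shifting the event by ~3 minutes.
        SimpleDateFormat wrong = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSS");
        wrong.setTimeZone(utc);
        Date shifted = wrong.parse(raw);

        // Parsing only up to the seconds stops before the fraction entirely.
        SimpleDateFormat safe = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        safe.setTimeZone(utc);
        Date ok = safe.parse(raw);

        SimpleDateFormat out = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        out.setTimeZone(utc);
        System.out.println(out.format(shifted)); // 2013-06-19 14:06:15
        System.out.println(out.format(ok));      // 2013-06-19 14:03:12
    }
}
```

The 183506 "milliseconds" normalize to 3 minutes 3.5 seconds, which is why the full pattern produces a wrong timestamp rather than an error.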


bwooden
Splunk Employee

Sigi & kochera, your science is pure and appreciated. I tweaked the solution slightly for MS-SQL. I'll paste those configs here for anyone else querying MS-SQL.

Query Snippet:

SELECT
CAST([EPOEvents].[ReceivedUTC] as varchar) as [timestamp],
...

Respective parameters in inputs.conf:

output.timestamp = 1
output.timestamp.column = timestamp
# format for writing event:  2012-08-03 08:17:00
output.timestamp.format = yyyy-MM-dd HH:mm:ss
# format for translating date from query results:  Aug  3 2012  8:17AM
output.timestamp.parse.format = MMM dd yyyy hh:mmaa

Relevant bit from props.conf:
TIME_FORMAT=%Y-%m-%d %H:%M:%S
TZ=UTC


kochera
Communicator

Hi,

thanks for the hint. Finally I got it working.

1) SELECT TO_TIMESTAMP(TO_CHAR(LOG_ENTRYDATE), 'YYYY-MM-DD HH24:MI:SS.FF') LOG_ENTRYDATE,

2) SELECT TO_CHAR(TO_TIMESTAMP(LOG_ENTRYDATE, 'YYYY-MM-DD HH24:MI:SS,FF6'),
               'YYYY-MM-DD HH24:MI:SS.FF3') LOG_ENTRYDATE,

Both should work, for me solution 1) was just fine.

Cheers,
Andy
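A side note on why converting the column to a real TIMESTAMP (option 2) sidesteps the precision problem: JDBC surfaces such a column as a java.sql.Timestamp, which carries the full fractional seconds itself, so no string pattern ever has to describe the microseconds. A minimal sketch (class name is illustrative):

```java
import java.sql.Timestamp;

public class TimestampColumnDemo {
    public static void main(String[] args) {
        // java.sql.Timestamp -- what JDBC hands back for a TIMESTAMP
        // column -- keeps the fraction down to nanoseconds.
        Timestamp ts = Timestamp.valueOf("2013-06-19 14:03:12.183506");
        System.out.println(ts.getNanos());       // 183506000 (full precision kept)
        System.out.println(ts.getTime() % 1000); // 183 (millisecond part)
    }
}
```

The microseconds survive inside the Timestamp object; only the final rounding to milliseconds happens when the event time is materialized.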

kochera
Communicator

Hi Sigi,

find below the definition of the log_entrydate field.

LOG_ENTRYDATE VARCHAR2 YES 29 0 10

cheers,
Andy


ziegfried
Influencer

What's the datatype of the log_entrydate column? Is it VARCHAR (or similar), or is it an actual TIMESTAMP?
