I am trying to tail an MS SQL database that contains trouble tickets. I pull everything in key/value pairs. The DateCreated field is a datetime in MS SQL, but when the data shows up in Splunk it appears as a decimal number like "1834239.3423". How can I get Splunk to properly recognize this as a datetime without writing my own polling script?
Thanks
Hello,
I used to have the same issue; try configuring your dbx inputs.conf like this:
[dbmon-tail://MSSQL/MSSQL_Table1]
index = mssqlindex
interval = 120
output.format = kv
output.timestamp = 1
output.timestamp.column = LastModifiedUTC
output.timestamp.parse.format = yyyy-MM-dd' 'HH:mm:ss' 'Z
output.timestamp.format = yyyy-MM-dd' 'HH:mm:ss' 'Z
query = SELECT convert(varchar,s.LastModified,120) + ' +0000' AS LastModifiedUTC, LastModified, any other column FROM table1 WHERE foo=bar {{AND $rising_column$ > ?}}
sourcetype = mssqltail
table = table1
tail.rising.column = LastModified
disabled = 0
You need to restart Splunk or "reload inputs". Unfortunately you can't do this through the GUI, as it doesn't offer an option to configure output.timestamp.parse.format.
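To illustrate what the query is doing: convert(varchar, ..., 120) renders the datetime as an ISO-like string, and appending ' +0000' supplies the explicit UTC offset that the Z in output.timestamp.parse.format expects (a sketch; the table and column names are placeholders from the stanza above):

```sql
-- Style 120 renders a datetime as 'yyyy-mm-dd hh:mi:ss' (24-hour clock).
-- Appending ' +0000' marks the value as UTC so dbx can parse the offset.
SELECT convert(varchar, s.LastModified, 120) + ' +0000' AS LastModifiedUTC,
       s.LastModified
FROM table1 s
-- A row with LastModified = 2013-05-01 14:30:00 yields
-- LastModifiedUTC = '2013-05-01 14:30:00 +0000'
```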
Hope it helps.
I would like it if someone could tell me which ".conf" file I could put an eval stanza in to do the same thing so I don't have to do this on every single search line for all the date fields I want to convert...
Is the props.conf file the right place to eval the date to something readable or is that only for changing it before it gets indexed?
Thanks
So I think I was tripping over my own feet. I have a stanza in props.conf that tries to eval the date field, but that never worked. At the same time I was trying to run an eval or a convert function on the command line FOR THE SAME FIELD, which was failing. When I tried the eval on the command line for a different date field, it worked like a charm. So I am stripping all of the stanzas from the props.conf files and will use this on the command line for searches:
| eval A_Time=strftime(DateOpened,"%Y-%m-%d %H:%M:%S")
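For the earlier question about doing this in a .conf file rather than on every search line: props.conf supports search-time EVAL- stanzas, which apply the same conversion automatically at search time without touching the indexed data (a sketch, assuming the sourcetype is mssqltail and the field name matches yours):

```ini
# props.conf -- search-time calculated field; does not change what is indexed
[mssqltail]
EVAL-A_Time = strftime(DateOpened, "%Y-%m-%d %H:%M:%S")
```

Because EVAL- runs at search time, this also answers the question above: props.conf is not only for changing data before it gets indexed.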
Question 2: A Splunk event can have only one "_time", which is the default field used when you search over a time range (the last 24 hours, for example).
If you have more date fields and want to use one of them, you can simply assign it to _time, like: "search foo bar | eval _time=secondary_datetime".
Or you might want to convert the stored epoch value into something more readable using "| convert timeformat="%F %T" ctime(secondary_datetime)". I would definitely store it in epoch format, though, as it is easier to manipulate before converting to a human-readable format.
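Since the same question applies to several date fields at once, note that convert accepts multiple ctime() functions in one invocation, and eval accepts comma-separated assignments (a sketch; DateOpened and DateClosed are placeholder field names):

```
| convert timeformat="%F %T" ctime(DateOpened) ctime(DateClosed)
```

The eval equivalent would be "| eval DateOpened=strftime(DateOpened, "%Y-%m-%d %H:%M:%S"), DateClosed=strftime(DateClosed, "%Y-%m-%d %H:%M:%S")".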
Question 1: The tail process needs an id, a date, or anything incremental. The "?" is to be left as-is. DBX stores the last value under "splunk/var/lib/splunk/persistentstorage/dbx/" and uses it the next time it queries the DB.
Last question: I have successfully brought the data into Splunk from the database, established the CreatedDate, and gotten it in the right format - but I have multiple date fields, and the other fields are still showing up in epoch format, not datetime. How do I change them?
Two other follow-on questions, then. I have multiple date fields; is it possible to list multiple columns in "output.timestamp.column", comma-delimited, so they pick up the .parse.format command? Also, if I am tailing a database, do I need the {{AND $rising_column$ > ?}}, and if so, what do I replace the "?" with? I thought the tail command would generate and maintain the value itself?
TIA