I've recently installed the Tenable Nessus app, which does most of its search-time field extractions via "KV_MODE = json". Several time fields in the logs are being extracted correctly, but their values are 10-digit epoch timestamps. I would like the reported values of those fields to be human readable instead. How would I go about doing that?
(Note: I'm not talking about the timestamp _time, just fields in the log that contain other time information.)
The current stanza in props.conf is:
[tenable:sc:vuln]
KV_MODE = json
TRUNCATE = 1000000
MAX_TIMESTAMP_LOOKAHEAD = 1
EVAL-bugtraq = split(bid, ",")
REPORT-tenable_sc_cert_from_xref = tenable_sc_cert_from_xref
EVAL-cve = split(cve, ",")
EVAL-cpe = split(cpe, "<br/>")
FIELDALIAS-tenable_sc_cvss = baseScore AS cvss
FIELDALIAS-tenable_sc_dest = ip AS dest
FIELDALIAS-tenable_sc_see_also = seeAlso{} AS see_also
REPORT-tenable_sc_msft_from_see_also = tenable_sc_msft_from_see_also
REPORT-tenable_sc_mskb_from_see_also = tenable_sc_mskb_from_see_also
FIELDALIAS-tenable_sc_severity_id = severity.id AS severity_id
LOOKUP-severity_for_tenable_sc = nessus_severity_lookup severity_id OUTPUT severity
FIELDALIAS-tenable_sc_vendor_severity = severity.name AS vendor_severity
FIELDALIAS-tenable_sc_signature = pluginID AS signature_id pluginName AS signature
LOOKUP-vendor_info_for_tenable_sc = tenable_sc_vendor_info_lookup sourcetype OUTPUT vendor,product
FIELDALIAS-tenable_sc_xref = xref{} AS xref
FIELDALIAS-tenable_sc_dvc = host AS dvc
Any advice is greatly appreciated. Thank you!
First, a side note: this setting is a problem, because it tells Splunk your timestamp is only 1 character long:
MAX_TIMESTAMP_LOOKAHEAD = 1
MAX_TIMESTAMP_LOOKAHEAD = <integer>
* Specifies how far (in characters) into an event Splunk should look for a
timestamp.
* This constraint to timestamp extraction is applied from the point of the
TIME_PREFIX-set location.
* For example, if TIME_PREFIX positions a location 11 characters into the
event, and MAX_TIMESTAMP_LOOKAHEAD is set to 10, timestamp extraction will
be constrained to characters 11 through 20.
* If set to 0, or -1, the length constraint for timestamp recognition is
effectively disabled. This can have negative performance implications
which scale with the length of input lines (or with event size when
LINE_BREAKER is redefined for event splitting).
* Defaults to 150 (characters).
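If your raw events really do begin with a 10-digit epoch timestamp, a saner version of that part of the stanza might look like the sketch below. (This is a sketch only: TIME_FORMAT and the assumption that the timestamp sits at the very start of the event are guesses about your data layout; add a TIME_PREFIX if the epoch value appears later in the event.)

```
[tenable:sc:vuln]
# Assumed: the event's timestamp is a 10-digit epoch value at the
# start of the event. %s tells Splunk to parse seconds-since-epoch.
TIME_FORMAT = %s
MAX_TIMESTAMP_LOOKAHEAD = 10
```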
To convert the time, just do it at search time:
... | convert ctime(epochTimeField) | table epochTimeField
Or, use _time instead of epochTimeField (whatever epochTimeField is)...
... | table _time
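Equivalently, eval with strftime lets you control the output format (the field name here is a placeholder, and the format string is just one example):

```
... | eval epochTimeField=strftime(epochTimeField, "%Y-%m-%d %H:%M:%S") | table epochTimeField
```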
A caution about _time: it is often a different value from other time data in the event. If Splunk cannot extract a timestamp from the event (or the extraction is misconfigured), _time falls back to other values, such as the time the event was received or indexed.
That may or may not match the client-side event time; unless it's explicitly mapped to the event data in props.conf, _time shouldn't be relied on as an event timestamp.
The _time for the timestamp is fine. All search-time and index-time field extractions are working 100% as intended. However, I'd like to modify something else.
Several of the fields extracted at search time carry data-specific time information. In the snippet of an event below, you'll see "firstSeen", "lastSeen", "pluginModDate", and "pluginPubDate", all of which are epoch times in the data. Is it possible to convert these times to human readable using props/transforms?
...
exploitFrameworks:
family: { [+] }
firstSeen: 1486540578
hasBeenMitigated: 0
ip: ###.###.###.###
lastSeen: 1486765040
macAddress: ##.##.##.##.##.##
netbiosName: **domain\hostname**
patchPubDate: 1485363600
pluginID: 96828
pluginInfo: 96828 (445/6) Google Chrome < 56.0.2924.76 Multiple Vulnerabilities
pluginModDate: 1486400400
pluginName: Google Chrome < 56.0.2924.76 Multiple Vulnerabilities
pluginPubDate: 1485536400
pluginText:
...
Thanks again.
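For reference, if overwriting the epoch values in place is acceptable, fields like these can be converted automatically with calculated fields (EVAL-*) in props.conf. A sketch, assuming the field names from the event snippet above and an arbitrary output format:

```
[tenable:sc:vuln]
# Calculated fields run after the KV_MODE=json search-time extraction,
# so the extracted epoch values are available as inputs here. Each
# EVAL-* below replaces the field's epoch value with a formatted string.
EVAL-firstSeen     = strftime(firstSeen, "%Y-%m-%d %H:%M:%S")
EVAL-lastSeen      = strftime(lastSeen, "%Y-%m-%d %H:%M:%S")
EVAL-patchPubDate  = strftime(patchPubDate, "%Y-%m-%d %H:%M:%S")
EVAL-pluginModDate = strftime(pluginModDate, "%Y-%m-%d %H:%M:%S")
EVAL-pluginPubDate = strftime(pluginPubDate, "%Y-%m-%d %H:%M:%S")
```

Note that calculated fields cannot reference other calculated fields, but referencing the underlying extracted field (as above) is fine; if you need to keep the raw epoch value as well, assign the formatted value to a new field name instead.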