Splunk Search

How to add additional fields to timechart?

zindain24
Path Finder

Hello, I am looking to add two additional fields, Account_Name and Workstation_Name, to the results of my search. I'm unsure how to add them, since the timechart command seems to limit which fields are available.

Here is the search I have now:

index=domaincontrollers sourcetype="WinEventLog:Security" accountname | eval time=_time | timechart span=1d min(time) as "Logon Time", max(time) as "Logoff Time" | convert timeformat="%m/%d/%y %H:%M:%S" ctime(*)

Results:

_time       Logon Time          Logoff Time
2014-08-26  08/26/14 06:20:33   08/26/14 15:04:22
2014-08-27  08/27/14 06:42:43   08/27/14 11:37:59

Thanks!

1 Solution

Ayn
Legend

You can include the values for these just fine, as long as you wrap them within a statistical function.

The thing with timechart is that it always operates on intervals (timespans) that it uses to divide events, so it needs to know how to handle the situation where multiple events fall within a single interval. This is where the statistical functions come into play. Let's say you define the timespan for timechart to be 1 minute, and that somewhere in the log you have 3 of these events occurring within 1 minute. Splunk then needs to know how to give you ONE value for your fields, even though there are 3 values of each. You can tell Splunk to just give you an average of the 3 events using the stats function avg. Or, if you only want the values from the first of the events within the time period, use first. Want the median? Use median. And so on.
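
For instance, here's a minimal sketch of that idea (the index, sourcetype and duration field below are just placeholders for illustration, not taken from the search in this thread):

index=web sourcetype=access_combined | timechart span=1m avg(duration) as avg_duration, first(duration) as first_duration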

In your case I guess you simply want the values, in which case you can use the values() function. In that case you'd do something like this:

index=domaincontrollers sourcetype="WinEventLog:Security" accountname | timechart span=1d values(Account_Name) as Account_Name, values(Workstation_Name) as Workstation_Name, min(_time) as "Logon Time", max(_time) as "Logoff Time" | convert timeformat="%m/%d/%y %H:%M:%S" ctime(*)

If you're getting multiple values for an interval you might want to consider splitting by Workstation_Name and Account_Name instead - but timechart and chart don't allow you to split by multiple fields, so you will have to use stats instead. Because stats itself doesn't split time into intervals, you need to use the separate bucket command for this. The end result is this:

index=domaincontrollers sourcetype="WinEventLog:Security" accountname | bucket _time span=1d | stats min(_time) as "Logon Time", max(_time) as "Logoff Time" by _time, Account_Name, Workstation_Name | convert timeformat="%m/%d/%y %H:%M:%S" ctime(*)
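
If splitting by just one of the fields turns out to be enough, timechart can handle that on its own. A hedged sketch of that variant, assuming Account_Name is the field you want to split by:

index=domaincontrollers sourcetype="WinEventLog:Security" accountname | timechart span=1d min(_time) by Account_Name

Each account then becomes its own column, with the earliest _time per day as the value.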


zindain24
Path Finder

Thank you Ayn for explaining your answer so well.

I was still running into a problem because of the ctime(*) expression used in the convert command.

ctime(*) made the time conversion run against ALL fields (including Workstation_Name and Account_Name), which caused them to be excluded from the results. Understanding this, I now call each field I want to convert directly:
|convert timeformat="%m/%d/%y %H:%M:%S" ctime("Logon Time") ctime("Logoff Time")

I also chose to use mvjoin to combine the multiple values into single comma-separated fields, Workstation_Names and Account_Names.

Here is the final working search:

index=domaincontrollers sourcetype="WinEventLog:Security" accountname | eval time=_time | timechart span=1d min(time) as "Logon Time", max(time) as "Logoff Time" values(Account_Name) as Account_Names values(Workstation_Name) as Workstation_Names | eval Account_Names=mvjoin(Account_Names, ", ") | eval Workstation_Names=mvjoin(Workstation_Names, ", ") | convert timeformat="%m/%d/%y %H:%M:%S" ctime("Logon Time") ctime("Logoff Time")
