
Derive duration from TimeStamp in event field

ctripod
Explorer

I have a bit of a tricky one here.

I have a search which leverages an automatic lookup. One of the output fields is CreatedDateTime. Depending on the TargetSystem, the CreatedDateTime can be identical across events.

I'm trying to derive "duration" based on the events with different CreatedDateTime.

Sample Flow:
SourceSystem=ServerA TargetSystem=System1 CreatedDateTime=6/2/17 1:57
SourceSystem=ServerA TargetSystem=System2 CreatedDateTime=6/2/17 1:57
SourceSystem=ServerA TargetSystem=System3 CreatedDateTime=6/2/17 1:57
SourceSystem=ServerA TargetSystem=System4 CreatedDateTime=6/2/17 1:57
SourceSystem=ServerA TargetSystem=System1 CreatedDateTime=6/2/17 2:00
SourceSystem=ServerA TargetSystem=System3 CreatedDateTime=6/2/17 3:15
SourceSystem=ServerA TargetSystem=System4 CreatedDateTime=6/3/17 5:36

I tried eval _time=strptime(CreatedDateTime, "%m/%d/%Y %H:%M %p"), but that is not working. Maybe because it's an output from a lookup? I have also tried streamstats with eventcount, and transaction with filters, with no luck.
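
Here's a minimal test I can run to check the format string on its own, with the sample value hard-coded so the lookup isn't a factor:

| makeresults
| eval CreatedDateTime="6/2/17 1:57"
| eval parsed=strptime(CreatedDateTime, "%m/%d/%Y %H:%M %p")

If parsed comes back null, the format string doesn't match the value.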

If anyone has any ideas that would be great. _time is not an option as the indexed data all has the same timestamp.


somesoni2
SplunkTrust

Give this a try (also, in the strptime function, the time format should match exactly).

your current search giving fields SourceSystem TargetSystem CreatedDateTime
| eval _time=strptime(CreatedDateTime,"%m/%d/%y %H:%M")
| stats min(_time) as start max(_time) as end by SourceSystem TargetSystem
| eval duration=tostring(end-start,"duration")
| convert ctime(start) ctime(end)
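
With the sample events from the question, the output should look roughly like this (start/end rendered by convert ctime, duration rendered by tostring as HH:MM:SS, or D+HH:MM:SS once it spans more than a day; a group with only one event, like System2, comes out as 00:00:00):

SourceSystem  TargetSystem  start                end                  duration
ServerA       System1       06/02/2017 01:57:00  06/02/2017 02:00:00  00:03:00
ServerA       System2       06/02/2017 01:57:00  06/02/2017 01:57:00  00:00:00
ServerA       System3       06/02/2017 01:57:00  06/02/2017 03:15:00  01:18:00
ServerA       System4       06/02/2017 01:57:00  06/03/2017 05:36:00  1+03:39:00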

cmerriman
Super Champion

Is there an AM/PM in your CreatedDateTime? If not, remove the %p from the strptime, as that is likely what is failing the eval.
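
A quick way to confirm is to run both variants against a hard-coded value (a throwaway sanity check, not part of the real search):

| makeresults
| eval with_p=strptime("6/2/17 1:57", "%m/%d/%y %H:%M %p"), without_p=strptime("6/2/17 1:57", "%m/%d/%y %H:%M")

If with_p comes back null and without_p returns an epoch value, the %p is the culprit.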

EDITED:

You should be able to use streamstats. Try something like this:

|eval CreatedDateTimeEpoch=strptime(CreatedDateTime, "%m/%d/%y %H:%M")
|sort 0 SourceSystem TargetSystem CreatedDateTimeEpoch
|streamstats current=f window=1 values(CreatedDateTimeEpoch) as prevCreatedDateTimeEpoch by SourceSystem TargetSystem
|eval duration=CreatedDateTimeEpoch-prevCreatedDateTimeEpoch

Using the data provided:

|makeresults
|eval data="SourceSystem=ServerA TargetSystem=System1 CreatedDateTime='6/2/17 1:57',SourceSystem=ServerA TargetSystem=System2 CreatedDateTime='6/2/17 1:57',SourceSystem=ServerA TargetSystem=System3 CreatedDateTime='6/2/17 1:57',SourceSystem=ServerA TargetSystem=System4 CreatedDateTime='6/2/17 1:57',SourceSystem=ServerA TargetSystem=System1 CreatedDateTime='6/2/17 2:00',SourceSystem=ServerA TargetSystem=System3 CreatedDateTime='6/2/17 3:15',SourceSystem=ServerA TargetSystem=System4 CreatedDateTime='6/3/17 5:36'"
|makemv data delim=","
|mvexpand data
|eval _raw=data
|kv
|rex mode=sed field=CreatedDateTime "s/'//g"
|eval CreatedDateTimeEpoch=strptime(CreatedDateTime, "%m/%d/%y %H:%M")
|sort 0 SourceSystem TargetSystem CreatedDateTimeEpoch
|streamstats current=f window=1 values(CreatedDateTimeEpoch) as prevCreatedDateTimeEpoch by SourceSystem TargetSystem
|eval duration=CreatedDateTimeEpoch-prevCreatedDateTimeEpoch
|fields - _raw - data
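
Note that the duration this produces is in raw seconds, and the first event in each SourceSystem/TargetSystem group has no previous row, so its duration comes out null. If you want it human-readable, you can tack on the same tostring trick as in the other answer:

|eval duration=tostring(duration,"duration")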

ctripod
Explorer

This is great, thanks! And works as advertised. I'm trying now to stuff it into one of the new "timeline" based timecharts. Once I get the fields and times into the visualizations I'll send out an update.

Here is an example from the visualization. Looks like I just need to get the stats / tabling right to drive the visualization.

| eval _time = strptime(start_date." ".start_time, "%m/%d/%Y %H:%M %p")
| eval end_time = strptime(end_date." ".end_time, "%m/%d/%Y %H:%M %p")
| eval duration = (end_time - _time) * 1000 * (random()%5)
| eval duration = IF(duration < 86400000, 0, duration)
| stats count by _time, duration, "NERC Region", "Event Description"
| table _time "NERC Region" "Event Description" duration

cmerriman
Super Champion

What visualization are you using? If you're having trouble getting it to work, let us know and we'll try to troubleshoot the problem. Otherwise I'm glad it's getting where it needs to be.


ctripod
Explorer

It's this one: https://splunkbase.splunk.com/app/3120/. Still can't get it working; I'd like the longer durations to span the timeline.


cmerriman
Super Champion

Alright, this is what I've come up with. In the documentation, it mentions that duration is measured in milliseconds:
http://docs.splunk.com/Documentation/Timeline/1.1.0/TimelineViz/TimelineSearchDataFormat

| makeresults 
|eval data="SourceSystem=ServerA TargetSystem=System1 CreatedDateTime='6/2/17 1:57',SourceSystem=ServerB TargetSystem=System2 CreatedDateTime='6/2/17 2:57',SourceSystem=ServerB TargetSystem=System2 CreatedDateTime='6/2/17 6:34',SourceSystem=ServerA TargetSystem=System3 CreatedDateTime='6/2/17 2:57',SourceSystem=ServerA TargetSystem=System4 CreatedDateTime='6/3/17 1:57',SourceSystem=ServerA TargetSystem=System1 CreatedDateTime='6/2/17 2:00',SourceSystem=ServerA TargetSystem=System3 CreatedDateTime='6/3/17 1:15',SourceSystem=ServerA TargetSystem=System4 CreatedDateTime='6/3/17 5:36'"
| makemv data delim="," 
| mvexpand data 
| eval _raw=data 
| kv 
| rex mode=sed field=CreatedDateTime "s/'//g" 
| eval CreatedDateTimeEpoch=strptime(CreatedDateTime, "%m/%d/%y %H:%M") 
| sort 0 SourceSystem TargetSystem CreatedDateTimeEpoch
| streamstats current=f window=1 values(CreatedDateTimeEpoch) as prevCreatedDateTimeEpoch by SourceSystem TargetSystem
| eval duration=(CreatedDateTimeEpoch-prevCreatedDateTimeEpoch)*1000
| eval _time=prevCreatedDateTimeEpoch
| where isnotnull(duration)
| table _time SourceSystem TargetSystem duration
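
As far as I can tell from the linked docs, that last table is already in the shape the Timeline viz expects: _time first, then the string field(s) to lay out as lanes, then duration in milliseconds. If you'd rather have one lane per server/system pair, you could collapse the two string fields into one before tabling (Resource is just a throwaway field name here):

| eval Resource=SourceSystem.": ".TargetSystem
| table _time Resource duration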