Splunk Search

Time duration between two events from timestamps mentioned in the events.

Fury
Loves-to-Learn Lots

Hi there,

 

I have a requirement where I need the time duration between two events in milliseconds.
The events look like this:
Event A: Processing started at : <01:00:00.100>
Event B: Processing completed at: <01:00:00.850>

The numbers at the end of each event are timestamps, and I have extracted them as the fields 'time1' and 'time2' respectively.

Is there a way to get the time duration between these two events from the extracted fields?

Using transaction gives the duration in seconds, whereas I need it in milliseconds.

Please help.


bowesmana
SplunkTrust

You can convert the values to epoch time values with

| eval t1=strptime(time1, "%H:%M:%S.%Q")

but that will give t1 as that time of day on 'today', including any time zone effects, so if you have two times where

time1=23:59:59

time2=00:00:02

then that falls down.
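As a quick illustration of that failure mode (the two time strings here are just made-up values straddling midnight):

| makeresults
| eval time1="23:59:59.900", time2="00:00:02.100"
| eval t1=strptime(time1, "%H:%M:%S.%Q"), t2=strptime(time2, "%H:%M:%S.%Q")
| eval diff_ms=round((t2-t1)*1000)

diff_ms comes out as a large negative number instead of 2200, because both values are anchored to today's date.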

Another option is to simply rex the field like this

| rex field=time_field "(?<h>[^:]*):(?<m>[^:]*):(?<s>[^\.]*)\.(?<ms>\d+)"
| eval millis=(h*3600000)+(m*60000)+(s*1000)+ms
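If you want to sanity-check that math on a throwaway value first, something like this should work (the value is Event B's timestamp from your question, and time_field is just the placeholder name used above):

| makeresults
| eval time_field="01:00:00.850"
| rex field=time_field "(?<h>[^:]*):(?<m>[^:]*):(?<s>[^\.]*)\.(?<ms>\d+)"
| eval millis=(h*3600000)+(m*60000)+(s*1000)+ms

which should give millis=3600850.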

The question is - how have you established that the two events (A and B) are the ones to be used for the duration?

If each event has a common ID, then the simplest thing to do is

| eval start=if(event_is_A,millis,null())
| eval end=if(event_is_B,millis,null())
| stats values(start) as start values(end) as end by Id
| eval duration=if(end<start,(end+86400000-start), end-start)

This handles the day wrap, and the 'event_is_X' construct is simply your way of determining whether this is the start or end event.
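For example, if the start/end events can be told apart by their message text and the correlation field is called Id, the placeholders could be filled in roughly like this (the match() patterns and the field names EVENT_MESSAGE and Id are assumptions about your data, so adjust them):

| rex field=time_field "(?<h>[^:]*):(?<m>[^:]*):(?<s>[^\.]*)\.(?<ms>\d+)"
| eval millis=(h*3600000)+(m*60000)+(s*1000)+ms
| eval start=if(match(EVENT_MESSAGE, "Processing started"), millis, null())
| eval end=if(match(EVENT_MESSAGE, "Processing completed"), millis, null())
| stats values(start) as start values(end) as end by Id
| eval duration=if(end<start, end+86400000-start, end-start)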

If you are using the 'transaction' command, then that has its own set of issues to consider and it's always good to try to avoid using transaction. stats is generally faster and has fewer issues.

However, the simple math/rex option is the easiest way.

 


Fury
Loves-to-Learn Lots

@bowesmana 

Thanks for the help. I will try the options you've provided.


And to answer your question, "how have you established the two events (A and B) as being the ones used to establish the duration?":

It was provided to me by the developer of the service/component that these events should be the ones used to establish the duration.


bowesmana
SplunkTrust

@Fury 

Connecting the events was more about how you can know which two events are part of the same transaction, i.e. if you have

event a - start
event b - start
event c - end
event d - end

do you have a correlation id that ties either a-->c or a-->d?

 


Fury
Loves-to-Learn Lots

@bowesmana  @isoutamo 

Apologies, I missed mentioning that there is a correlation id that ties the events together, like you said.

The events look something like this:

event a -  Processing started at : <timestamp> for correlationid001
event b - Processing started at : <timestamp> for correlationid002
event c - Processing completed at: <timestamp> for correlationid001
event d - Processing completed at: <timestamp> for correlationid002

So, correlationid001 ties a -->c

correlationid002 ties b-->d

I have extracted the correlation id as a field.


isoutamo
SplunkTrust
Can you post example events (and your current query), so we can more easily help you figure out this challenge?

Fury
Loves-to-Learn Lots

@isoutamo 

 

The sample events look like this:

2020-11-04 20:50:42; SOURCE="source1"; sourcetype="abc"; PLATFORM="platform1"; SERVICE="xxxx"; EVENT_MESSAGE="Processing event  with correlationId: C001 at: 20:50:42.374"

2020-11-04 20:50:43; SOURCE="source1"; sourcetype="abc"; PLATFORM="platform1"; SERVICE="xxxx"; EVENT_MESSAGE="Processing completed for event  with correlationId: C001, at: 20:50:43.382"

And my original query is as below.

index=xyz sourcetype=abc SERVICE="xxxx" "Processing event with correlationId: * at: *" OR "Processing completed for event  with correlationId: *, at: *" | table co_id, host, EVENT_MESSAGE, time1, time2 | sort co_id host | transaction co_id

 

Where co_id, time1, time2 are extracted fields.


bowesmana
SplunkTrust

@Fury 

Here is a working example using your data and stats to find the duration.

| makeresults
| eval event="2020-11-04 20:50:42; SOURCE=\"source1\"; sourcetype=\"abc\"; PLATFORM=\"platform1\"; SERVICE=\"xxxx\"; EVENT_MESSAGE=\"Processing event  with correlationId: C001 at: 20:50:42.374\"!2020-11-04 20:50:43; SOURCE=\"source1\"; sourcetype=\"abc\"; PLATFORM=\"platform1\"; SERVICE=\"xxxx\"; EVENT_MESSAGE=\"Processing completed for event  with correlationId: C001, at: 20:50:43.382\""
| eval event=split(event,"!")
| mvexpand event
| rename event as _raw
| extract
| eval _time=strptime(_raw,"%F %T")
| rex field=EVENT_MESSAGE "Processing event  with correlationId: (?<co_id>\w+) at: (?<start_time>\d+:\d+:\d+\.\d+)"
| rex field=EVENT_MESSAGE "Processing completed for event  with correlationId: (?<co_id>\w+), at: (?<end_time>\d+:\d+:\d+\.\d+)"
| rex field=_raw "(?<date>\d+-\d+-\d+)"
| eval start_time=strptime(date." ".start_time, "%F %T.%Q")
| eval end_time=strptime(date." ".end_time, "%F %T.%Q")
| stats values(start_time) as start_time values(end_time) as end_time by SOURCE sourcetype PLATFORM SERVICE co_id
| eval duration=end_time-start_time

I encourage you to avoid the idea of using transaction as stats will give you fewer headaches. There are plenty of memory and timing issues to consider with transaction that can make life difficult.

In the above, everything apart from the last two lines is setting up your data. I parse the start and end times from EVENT_MESSAGE and then use the date of the event to build full epoch timestamps, which allows for wrapping over midnight.

Then it's a simple stats command that aggregates the events together, collecting the start and end times, followed by the duration calculation at the end.
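Against your live index, the same idea minus the makeresults setup would look roughly like this (the ```-style comments need Splunk 8.0 or later, and the index/sourcetype/field names are taken from your query above, so adjust as needed):

index=xyz sourcetype=abc SERVICE="xxxx" ("Processing event" OR "Processing completed for event")
``` pull the correlation id and the embedded start/end times out of the message ```
| rex field=EVENT_MESSAGE "Processing event\s+with correlationId: (?<co_id>\w+) at: (?<start_time>\d+:\d+:\d+\.\d+)"
| rex field=EVENT_MESSAGE "Processing completed for event\s+with correlationId: (?<co_id>\w+), at: (?<end_time>\d+:\d+:\d+\.\d+)"
``` prepend each event's date so the times become absolute epoch values ```
| eval date=strftime(_time, "%F")
| eval start_time=strptime(date." ".start_time, "%F %T.%Q")
| eval end_time=strptime(date." ".end_time, "%F %T.%Q")
| stats values(start_time) as start_time values(end_time) as end_time by co_id
| eval duration_ms=round((end_time-start_time)*1000)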

Hope this helps.

 


isoutamo
SplunkTrust
Usually there should be some kind of "transaction id" which ties events together. This can be an individual field or a combination of fields, but it must be there if there can be multiple transactions at the same time. Otherwise, as @bowesmana said, you can't identify which events belong to which transaction.
r. Ismo

inventsekar
SplunkTrust

Hi @Fury, you can use your current transaction search to get the duration in seconds, and then use eval to convert that to milliseconds.

current search | eval MilliSeconds=Seconds*1000
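For reference, transaction emits a duration field (in seconds, based on _time), so with the index/sourcetype from your earlier query that would be something like:

index=xyz sourcetype=abc SERVICE="xxxx"
| transaction co_id
| eval MilliSeconds=duration*1000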

Fury
Loves-to-Learn Lots

Thanks for the reply @inventsekar.

In my current search, the timestamp, i.e. _time, is captured only to the second.

Suppose Event A is logged at 10:00:21.000 and Event B is logged at 10:00:21.450 in real time.

The timestamps in Splunk would still show as
Event A @ 10:00:21.000
Event B @ 10:00:21.000

So when I use transaction, it gives me 0 as the duration.
This is why the developer added the timestamps to the events, which, when logged in Splunk, show the times with millisecond precision.

Is there any way I could make use of the extracted timestamp fields?
