Splunk Search

How do I calculate the average response time for the date/time field in my sample data?

vrmandadi
Builder

I am trying to calculate the average response time for the below field

ENDPOINT_LOG{}.EML_RESPONSE_TIME:    2016-01-19 15:44:08.127 +00:00

Since the above field has a date along with the time, I am not able to calculate the average response time using a simple stats command:

index="main" source="dcd_sample_data_for_splunk_dashboard.txt" sourcetype="dcd" | stats avg('ENDPOINT_LOG{}.EML_RESPONSE_TIME')

I would like to know from you experts how to write the regex for this and calculate the average response time.


martin_mueller
SplunkTrust

Are you looking for a duration taken to get the response (e.g. 42 seconds), or for a point in time when the response was received (e.g. January 1st 2016, 15:42:07)?


vrmandadi
Builder

Actually I have two fields, request time and initial time:

request time=2016-01-19 15:44:07.858 +00:00
initial time= 2016-01-19 15:44:07.793 +00:00

I want to get the average time by (request_time - initial_time) / 2, so 858 - 793 = 65, and 65 divided by 2 is 32.5.
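(Not SPL, but the arithmetic described here can be sketched in plain Python. The two timestamp strings come from this post; the halving is the poster's own formula, not a standard response-time metric.)

```python
# Sketch of the arithmetic above: parse both timestamps, take the gap
# in milliseconds, then halve it per the poster's own formula.
from datetime import datetime, timedelta

FMT = "%Y-%m-%d %H:%M:%S.%f %z"  # %z accepts "+00:00" on Python 3.7+

request_time = datetime.strptime("2016-01-19 15:44:07.858 +00:00", FMT)
initial_time = datetime.strptime("2016-01-19 15:44:07.793 +00:00", FMT)

# Dividing a timedelta by a 1 ms timedelta yields the gap in milliseconds.
gap_ms = (request_time - initial_time) / timedelta(milliseconds=1)  # 65.0
half_gap_ms = gap_ms / 2                                            # 32.5
```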


Richfez
SplunkTrust

What IS that average response time of that event? 2016 years, 1 month, 19 days, 15 hours, 44 minutes and 8.127 seconds?

I suspect what you may want instead is something like subtracting _time from it (or vice versa). This is only a guess, but perhaps try things like ...

index="main" source="dcd_sample_data_for_splunk_dashboard.txt" sourcetype="dcd" 
| eval response_time='ENDPOINT_LOG{}.EML_RESPONSE_TIME'-_time
| stats avg(response_time)

But that would only help if this is only a small portion of the event and that timestamp isn't being used as the event's _time.

I'll bet you won't be able to do direct math on it, though. You'll need to convert it to epoch values. To do that, ASSUMING that nothing weird happens with the funny field name...

... | eval my_epoch=strptime('ENDPOINT_LOG{}.EML_RESPONSE_TIME', "%Y-%m-%d %H:%M:%S.%3Q")
| eval my_response_time=my_epoch-_time | ...
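(For what it's worth, the epoch-conversion idea can be mirrored outside Splunk. This is plain Python, not SPL, using the sample timestamp from the thread plus a second, hypothetical one, just to show why epoch values make the subtraction trivial.)

```python
# Parse the timestamp string into an epoch (seconds since 1970-01-01 UTC);
# once both sides are epochs, response time is a plain subtraction.
from datetime import datetime, timezone

raw = "2016-01-19 15:44:08.127 +00:00"
my_epoch = datetime.strptime(raw, "%Y-%m-%d %H:%M:%S.%f %z").timestamp()

# A second (made-up) timestamp to subtract against:
other_epoch = datetime(2016, 1, 19, 15, 44, 7, 793000,
                       tzinfo=timezone.utc).timestamp()

response_time = my_epoch - other_epoch  # roughly 0.334 seconds
```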

If this doesn't help (or help enough), is there any other timestamp in the event that you want to compare against? Do you have to pair, group or combine this event with another one to find out the actual amount of time it took? More information about your events and about what information you are trying to glean from this data would help in either of those cases.


vrmandadi
Builder

I want to separate the time from the date in the field and calculate its average

ENDPOINT_LOG{}.EML_RESPONSE_TIME: 2016-01-19 15:44:08.127 +00:00

If I simply use the table command

index="main" source="dcd_sample_data_for_splunk_dashboard.txt" sourcetype="dcd"| table ENDPOINT_LOG{}.EML_RESPONSE_TIME

It displays the below result.

ENDPOINT_LOG{}.EML_RESPONSE_TIME

2016-01-19 15:44:08.127 +00:00

2016-01-19 15:44:07.858 +00:00
2016-01-19 15:44:07.793 +00:00

2016-01-19 15:44:07.858 +00:00

2016-01-19 15:44:07.379 +00:00
2016-01-19 15:44:07.326 +00:00

2016-01-19 15:44:07.379 +00:00

For example, the second row has three results:

2016-01-19 15:44:07.858 +00:00
2016-01-19 15:44:07.793 +00:00
2016-01-19 15:44:07.858 +00:00

I want the average of (15:44:07.858+15:44:07.793+15:44:07.858) /3
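(The average asked for here, over just the time-of-day part, can be illustrated in plain Python using the three values from the table above: convert each to seconds since midnight, then take the mean.)

```python
# Average the time-of-day portion of several timestamps by converting
# each one to seconds since midnight, then taking the mean.
from datetime import datetime

samples = [
    "2016-01-19 15:44:07.858",
    "2016-01-19 15:44:07.793",
    "2016-01-19 15:44:07.858",
]

def seconds_since_midnight(s):
    dt = datetime.strptime(s, "%Y-%m-%d %H:%M:%S.%f")
    return dt.hour * 3600 + dt.minute * 60 + dt.second + dt.microsecond / 1e6

avg = sum(seconds_since_midnight(s) for s in samples) / len(samples)
# avg is about 56647.8363 seconds since midnight, i.e. about 15:44:07.836
```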


Richfez
SplunkTrust

Sorry for extra edits, copy/pasta fail, fixed.

Ah, your answer indicates you're trying to find out the time of day that most responses happen at?

If that's the case, well, many of the pieces are there. You might be able to ...

... | eval my_epoch=strptime('ENDPOINT_LOG{}.EML_RESPONSE_TIME', "%Y-%m-%d %H:%M:%S.%3Q")
| eval my_response_time=my_epoch-_time
| stats avg(my_response_time) AS usual_response_time

Richfez
SplunkTrust

Add to that ...

... | eval usual_response_time=strftime(usual_response_time, "%H:%M:%S.%3Q")

If you'd like to see it as just the hours/minutes/seconds/subseconds.
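(Splunk's %3Q gives milliseconds; standard C/Python strftime has no %3Q, so outside Splunk the closest equivalent is to format microseconds with %f and trim. A small Python sketch, with a made-up epoch value standing in for usual_response_time:)

```python
# Turn an epoch average back into an HH:MM:SS.mmm string.
# Python's strftime has no %3Q, so format %f (microseconds) and trim to 3 digits.
from datetime import datetime, timezone

usual_response_time = 1453218248.127  # epoch for 2016-01-19 15:44:08.127 UTC
dt = datetime.fromtimestamp(usual_response_time, tz=timezone.utc)
formatted = dt.strftime("%H:%M:%S.%f")[:-3]  # "15:44:08.127"
```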


vrmandadi
Builder

I tried your query but it says "Error in 'eval' command: The expression is malformed. Expected )."


Richfez
SplunkTrust

I was afraid the open and close squiggly brackets would break everything, but I think they don't. Here's what I found.

| gentimes start=2/1/2016 end=2/2/2016
| eval ENDPOINT_LOG{}.EML_RESPONSE_TIME="2016-01-19 15:44:08.127"
| eval my_epoch=strptime('ENDPOINT_LOG.EML_RESPONSE_TIME', "%Y-%m-%d %H:%M:%S.%3Q")
| stats avg(my_epoch) AS usual_response_time
| eval usual_response_time=strftime(usual_response_time, "%H:%M:%S.%3Q")

That's a run-anywhere search which I'll explain. You'll only need parts of it.

gentimes just fakes up a data point in a way Splunk likes. You don't need this line.
The first eval creates a fieldname and sets it to that date and time. You don't need this line.

The next lines are the ones you need.
The second eval creates a new field "my_epoch", which is the long field converted to a Unix epoch time (easy to do math on). NOTE that those squiggly brackets get stripped out of the field name, so they're not in there any more. I don't know if this will happen in your case or not, but it might. BUT, because of the "dot" in the middle, I had to put it in single quotes. You might have to play a bit with the quoting and squiggly brackets to get it working.

The stats takes an average of my_epoch and names it "usual_response_time". I only have one data point, so that's not interesting in my case but it's an example. 🙂
The last eval reformats the usual_response_time to be like you want.

In that case, my output is 15:44:08.127

I'm pretty sure this will work fine for what you need, but this assumes you already have the data properly piled together via some method or another (e.g. transaction or some other way to group them into groups that make sense). If you, say, only run the search over the past 8 hours of results, it should be right. If you run it over three days, well, it will be wrong. But you could run it over bins of length one day each and get it that way.

Alternatively, if you really only want to average times across all days ignoring the days (which I think you do), you may also try some mod math and convert the one eval line to

...
| eval my_epoch=strptime('ENDPOINT_LOG.EML_RESPONSE_TIME', "%Y-%m-%d %H:%M:%S.%3Q")%86400
| ...

The %86400 at the end should take care of that. Let me know how that goes. If we're getting close, I might rearrange these comments a bit and make it into a better "initial answer".
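(The %86400 trick is just modular arithmetic: an epoch mod 86400 is seconds since midnight UTC, so the date drops out before averaging. A plain-Python sketch with two made-up epochs on different days:)

```python
# epoch % 86400 = seconds since 00:00:00 UTC that day, so averaging
# across events ignores which day each event happened on.
from datetime import datetime, timezone

def time_of_day(epoch):
    return epoch % 86400

e1 = datetime(2016, 1, 19, 15, 44, 8, tzinfo=timezone.utc).timestamp()
e2 = datetime(2016, 1, 20, 15, 44, 10, tzinfo=timezone.utc).timestamp()

avg = (time_of_day(e1) + time_of_day(e2)) / 2  # 56649.0, i.e. 15:44:09 UTC
```

One caveat (it applies to the Splunk version too): this goes wrong if the times straddle midnight, since 23:59 and 00:01 would average to around noon.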


vrmandadi
Builder

Thank you for the quick response, but my question is different.

Actually I have two fields, request time and initial time:

request time=2016-01-19 15:44:07.858 +00:00
initial time= 2016-01-19 15:44:07.793 +00:00

I want to get the average time by (request_time - initial_time) / 2, so 858 - 793 = 65, and 65 divided by 2 is 32.5.
