I have logs which contain a value, time_taken. I want to work out the latency of entire transactions by summing the time_taken from each event. I'm using the transaction command to pull these related events into one event. I cannot use the "duration" field generated by the transaction command, as that is calculated from the time difference between the events; I want to sum the time_taken values from each event so that the second event's time_taken is not ignored.
A single event now looks like this (they are in JSON format):
{"field1": "1", "field2": "2", ...... ,"time_taken":"0:00:01.404100"}
{"field1": "5", "field2": "2", ...... ,"time_taken":"0:00:00.663664"}
Both "time_taken" fields are being picked up, so I have multiple time_taken values on each transaction event.
Is there a simple way to access the fields from each event? Specifically, I want to create a new field summing the time_taken values for each transaction (note that some transactions contain a single event and some contain more than two). Is this possible?
Many thanks.
You need to convert your time_taken field to seconds and then use "stats sum(time_taken_seconds), list(field1) as field1, ..., list(fieldN) as fieldN by YOUR_TRANSACTION_UNIQUE_ID"
Note there's a dur2sec function but it won't work in your case as you also have microseconds.
In summary, something like:
yoursearchhere
| yourtransactionhere
| rex field=time_taken "(?<H>\d{1,2}):(?<M>\d{2}):(?<S>\d{2})\.(?<uS>\d{6})"
| eval time_taken_seconds =
tonumber(H)*3600 +
tonumber(M)*60 +
tonumber(S) +
tonumber("0.".uS)
| stats sum(time_taken_seconds) as total_time_taken_seconds,
list(field1) as field1, ...,
list(fieldN) as fieldN
by YOUR_TRANSACTION_UNIQUE_ID
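For reference, here is the same H:M:S.microseconds arithmetic sketched in Python rather than SPL; this is only an illustration of what the rex/eval pair computes (the function name and the sample values are taken from the question's two events):

```python
import re

def duration_to_seconds(time_taken: str) -> float:
    """Convert a duration like '0:00:01.404100' (H:M:S.microseconds) to seconds."""
    m = re.match(r"(\d{1,2}):(\d{2}):(\d{2})\.(\d{6})", time_taken)
    if m is None:
        raise ValueError(f"unrecognised duration: {time_taken}")
    h, mins, s, us = m.groups()
    # Same arithmetic as the eval: H*3600 + M*60 + S + microseconds fraction
    return int(h) * 3600 + int(mins) * 60 + int(s) + int(us) / 1_000_000

# Summing the two sample events from the question:
total = duration_to_seconds("0:00:01.404100") + duration_to_seconds("0:00:00.663664")
```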
Let me know if that helps.
I've got the following code
mysearch...
| rex field=time_taken "(?<H>\d{1,2}):(?<M>\d{2}):(?<S>\d{2})\.(?<uS>\d{6})"
| eval transaction_time = tonumber(H)*3600 + tonumber(M)*60 + tonumber(S) + tonumber("0.".uS)
| stats sum(transaction_time) by DutyId
It's correctly parsing the H, M, S, uS fields; however, the eval command is creating only one time value, weirdly!
OK, I got it working: I had to run the rex command before the transaction command.
Thanks!
Try using mvexpand:
mysearch...
| mvexpand time_taken
| rex field=time_taken "(?<H>\d{1,2}):(?<M>\d{2}):(?<S>\d{2})\.(?<uS>\d{6})"
| eval transaction_time = tonumber(H)*3600 + tonumber(M)*60 + tonumber(S) + tonumber("0.".uS)
| stats sum(transaction_time) by DutyId
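The reason mvexpand helps is that after transaction, time_taken is a multivalue field, and eval on a multivalue field produces only one result per event; mvexpand splits the event into one row per value so each duration is converted. A hedged Python analogy of that split-then-sum step (the DutyId values and sample records are made up for illustration):

```python
# Hypothetical post-transaction records: time_taken is multivalued,
# mirroring the field that mvexpand would split into separate rows.
records = [
    {"DutyId": "D1", "time_taken": ["0:00:01.404100", "0:00:00.663664"]},
    {"DutyId": "D2", "time_taken": ["0:00:00.500000"]},
]

def to_seconds(value: str) -> float:
    """Parse 'H:MM:SS.ffffff' into seconds."""
    h, m, s = value.split(":")
    return int(h) * 3600 + int(m) * 60 + float(s)

totals = {}
for rec in records:
    # "mvexpand": process each value of the multivalue field separately,
    # then "stats sum(...) by DutyId": accumulate per id.
    for value in rec["time_taken"]:
        totals[rec["DutyId"]] = totals.get(rec["DutyId"], 0.0) + to_seconds(value)
```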
Just to check: do you really need transaction to collect these events together and then sum the values? If you have a common field (you must be using one for transaction to correlate events; assume it's field1), then a simple stats should work for you:
your search + field extractions from json
| stats sum(time_taken) by field1
If you still want to use transaction, use streamstats to create unique ids and then sum:
transaction field1 and other options
| streamstats count AS event_id
| stats sum(time_taken) by event_id
This creates a unique id for each of your transactions and then sums the values based on that id.
Hi jpanderson,
try like this:
Use a regular expression (rex) to extract the values that appear in your time_taken events and save them in a new field. You can then work with the new field, which will contain all the time_taken values.