So I have some data in the format of
Time | UUID | event_name_status | actual_important_log_time
---------------------------------------------------------------------------------------------------------------
2020-03-26T12:00:00 | 123456789 | car_end | 2020-03-25T16:50:30
2020-03-26T12:00:00 | 123456789 | car_mid | 2020-03-25T16:40:30
2020-03-26T12:00:00 | 123456789 | car_start | 2020-03-25T16:30:30
2020-03-26T12:00:00 | 123456788 | car_end | 2020-03-25T15:50:30
2020-03-26T12:00:00 | 123456788 | car_mid | 2020-03-25T15:20:30
2020-03-26T12:00:00 | 123456788 | car_start | 2020-03-25T14:50:30
This is a consistent pattern: each transaction has a start, mid, and end event sharing the same UUID, with a different UUID per transaction (and different vehicles for other transactions).
I currently group them into transactions using the following search command.
* | transaction UUID startswith="car_start" endswith="car_end"
This groups the events into transactions and shows how many there were in the last X length of time (could be hundreds/thousands in a day).
I need to get the duration of each transaction using the actual_important_log_time field, and then use these values to get the average time a car transaction took (this will then go in a dashboard).
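For reference, here is a minimal sketch of one way I imagine this could work. It assumes actual_important_log_time is always in the exact `%Y-%m-%dT%H:%M:%S` format shown above, and it sidesteps `transaction`'s built-in `duration` field (which is based on `_time`, and all my events share the same `_time`), instead taking min/max of the parsed log time per UUID:

```
* 
| eval log_time = strptime(actual_important_log_time, "%Y-%m-%dT%H:%M:%S")
| stats min(log_time) AS start_t, max(log_time) AS end_t BY UUID
| eval duration_secs = end_t - start_t
| stats avg(duration_secs) AS avg_duration_secs
```

The first `stats` gives one row per transaction with its duration in seconds; the second collapses those into the overall average. I'm not sure this is the idiomatic approach, so corrections welcome.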