Splunk Search

Help in calculating average response time with multivalue fields?

vrmandadi
Builder

Hello ,

I have tried my best to calculate the average response time, which is based on two other timestamps that are multivalue fields. Below is the query I ran; it returns no results:

index=test |mvexpand ENDPOINT_LOG{}.EML_RESPONSE_TIME|mvexpand ENDPOINT_LOG{}.EML_REQUEST_TIME |eval average_response_time = strptime("ENDPOINT_LOG{}.EML_RESPONSE_TIME", "%Y-%m-%d %H:%M:%S.%3N") - strptime("ENDPOINT_LOG{}.EML_REQUEST_TIME", "%Y-%m-%d%H:%M:%S.%3N") |table average_response_time 

The ENDPOINT_LOG{}.EML_REQUEST_TIME and ENDPOINT_LOG{}.EML_RESPONSE_TIME fields are multivalue, i.e. each event has two or three values of each. I am trying to calculate the average response time, but I don't get any results. Can anyone please help?

It works when I substitute literal values, but not when I use the field names.

example

index=test_dev source="/var/log/rtn/dcd_new_sample_data_for_splunk_dashboard.json"  | stats count | eval "ENDPOINT_LOG{}.EML_REQUEST_TIME"="2016-01-19 15:44:00.749 +00:00" | eval "ENDPOINT_LOG{}.EML_RESPONSE_TIME"="2016-01-19 15:44:02.366 +00:00" | eval average_response_time = strptime("2016-01-19 15:44:02.366 +00:00", "%Y-%m-%d %H:%M:%S.%3N") - strptime("2016-01-19 15:44:00.749 +00:00", "%Y-%m-%d%H:%M:%S.%3N")

output

count | ENDPOINT_LOG.EML_REQUEST_TIME | ENDPOINT_LOG.EML_RESPONSE_TIME | average_response_time
662 | 2016-01-19 15:44:00.749 +00:00 | 2016-01-19 15:44:02.366 +00:00 | 1.617000
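The arithmetic in that output row can be checked outside Splunk. Here is a minimal Python equivalent of the subtraction, using the two literal timestamps from the example (the +00:00 timezone offset is dropped for simplicity, since both values share it):

```python
from datetime import datetime

fmt = "%Y-%m-%d %H:%M:%S.%f"
# Literal timestamps from the example query above
request = datetime.strptime("2016-01-19 15:44:00.749", fmt)
response = datetime.strptime("2016-01-19 15:44:02.366", fmt)

# Same subtraction Splunk's strptime()-based eval performs on epoch values
delta = (response - request).total_seconds()
print(delta)  # 1.617
```

This matches the 1.617000 that the hard-coded query produces.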

1 Solution

MuS
Legend

Hi vrmandadi,

This was a tricky one, but I got there in the end 😉
Your problem is that the values you need all live within one event, so the SPL to get the result is a bit tricky: it involves zipping those multivalue fields together and splitting them apart again, keyed on the field ENDPOINT_LOG{}.EML_LAST_UPDATE_TS, because that value is unique per entry.

So, here is the search, which works on the provided event example indexed with the _json sourcetype:

source="foo.json" host="indexer" sourcetype="_json" 
| eval my_id = 'ENDPOINT_LOG{}.EML_ID'
| eval request= 'ENDPOINT_LOG{}.EML_REQUEST_TIME'
| eval response = 'ENDPOINT_LOG{}.EML_RESPONSE_TIME' 
| eval update_time = 'ENDPOINT_LOG{}.EML_LAST_UPDATE_TS' 
| table update_time my_id request response  | sort - update_time
| eval zipped = mvzip(update_time, my_id, "###") | eval zipped = mvzip(zipped, request, "###") | eval zipped = mvzip(zipped, response, "###") 
| mvexpand zipped | makemv delim="###" zipped 
| eval update_time = mvindex(zipped, 0) | eval my_id = mvindex(zipped, 1) | eval request = mvindex(zipped, 2) | eval response = mvindex(zipped, 3)
| fields - zipped
| eval average_response_time = strptime(response, "%Y-%m-%d %H:%M:%S.%3N") - strptime(request, "%Y-%m-%d %H:%M:%S.%3N")

What happens here is the following:

source="foo.json" host="indexer" sourcetype="_json" 

The base search to get the event

| eval my_id = 'ENDPOINT_LOG{}.EML_ID'
| eval request= 'ENDPOINT_LOG{}.EML_REQUEST_TIME'
| eval response = 'ENDPOINT_LOG{}.EML_RESPONSE_TIME' 
| eval update_time = 'ENDPOINT_LOG{}.EML_LAST_UPDATE_TS' 

Just some evals to give the fields more usable names.

| table update_time my_id request response  | sort - update_time

A little table, not needed at all, but it helps to show the information.

| eval zipped = mvzip(update_time, my_id, "###") | eval zipped = mvzip(zipped, request, "###") | eval zipped = mvzip(zipped, response, "###") 

Zip all the multivalue fields together into one new field called zipped.

| mvexpand zipped | makemv delim="###" zipped 

Expand zipped into one event per value, then split each value back apart on the ### delimiter.

| eval update_time = mvindex(zipped, 0) | eval my_id = mvindex(zipped, 1) | eval request = mvindex(zipped, 2) | eval response = mvindex(zipped, 3)

Extract the values by position from the zipped field into new single-value fields.
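For readers new to this mvzip/mvexpand pattern, here is a hypothetical sketch of the same pairing logic in Python (standing in for SPL purely as illustration), using the values from the sample event in this thread:

```python
# Multivalue fields as they appear in one event of the sample data
update_time = ["2016-01-19 15:44:02.824 +00:00", "2016-01-19 15:44:02.370 +00:00"]
my_id = ["724109", "2147483647"]
request = ["2016-01-19 15:44:00.749 +00:00", "2016-01-19 15:44:01.163 +00:00"]
response = ["2016-01-19 15:44:02.366 +00:00", "2016-01-19 15:44:02.366 +00:00"]

# mvzip applied three times: join the i-th value of each field with "###"
zipped = ["###".join(vals) for vals in zip(update_time, my_id, request, response)]

# mvexpand + makemv + mvindex: one row per zipped value, split back into fields
rows = [
    dict(zip(("update_time", "my_id", "request", "response"), z.split("###")))
    for z in zipped
]
print(rows[0]["my_id"])    # 724109
print(rows[1]["request"])  # 2016-01-19 15:44:01.163 +00:00
```

The key point is that zipping by position keeps each request paired with its own response, which plain mvexpand on each field separately does not guarantee.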

| fields - zipped

remove the zipped field and

| eval average_response_time = strptime(response, "%Y-%m-%d %H:%M:%S.%3N") - strptime(request, "%Y-%m-%d %H:%M:%S.%3N")

Do some maths to get the response time for each transaction. Ta-da 🙂 The result looks like this:

[screenshot of the resulting table]

Hope this makes sense and is useful ...

cheers, MuS


vrmandadi
Builder

Take a bow, MuS, that was awesome. I made a small change to the query for my requirement and it worked. You are the guru. Thanks a ton!


MuS
Legend

Hi vrmandadi,

If you can split the time values by _time or by host so they are no longer multivalue, you could get the average time using something like this:

index=test 
| eval response_time = strptime('ENDPOINT_LOG{}.EML_RESPONSE_TIME', "%Y-%m-%d %H:%M:%S.%3N")
| eval request_time = strptime('ENDPOINT_LOG{}.EML_REQUEST_TIME', "%Y-%m-%d %H:%M:%S.%3N") 
| stats avg(eval(response_time - request_time)) AS avg_response by _time, host

This is un-tested but should give you a starting point.
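As a sanity check of what that stats avg(...) by host clause computes, here is a rough Python equivalent; the host name and second event are invented for illustration:

```python
from collections import defaultdict
from datetime import datetime

fmt = "%Y-%m-%d %H:%M:%S.%f"
# Hypothetical (host, request_time, response_time) tuples; timestamps borrowed
# from the sample event, host name invented
events = [
    ("hostA", "2016-01-19 15:44:00.749", "2016-01-19 15:44:02.366"),
    ("hostA", "2016-01-19 15:44:01.163", "2016-01-19 15:44:02.366"),
]

# Group the per-event deltas by host, like "stats avg(...) by host"
deltas = defaultdict(list)
for host, req, res in events:
    delta = (datetime.strptime(res, fmt) - datetime.strptime(req, fmt)).total_seconds()
    deltas[host].append(delta)

avg_response = {h: sum(d) / len(d) for h, d in deltas.items()}
print(avg_response)  # {'hostA': 1.41}
```

The two deltas are 1.617 s and 1.203 s, giving an average of 1.410 s per host.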

Hope this helps ...

cheers, MuS


vrmandadi
Builder

index=test_dev source="/var/log/rtn/dcd_new_sample_data_for_splunk_dashboard.json" | mvexpand ENDPOINT_LOG{}.EML_REQUEST_TIME|mvexpand ENDPOINT_LOG{}.EML_RESPONSE_TIME | eval response_time = strptime("ENDPOINT_LOG{}.EML_RESPONSE_TIME", "%Y-%m-%d %H:%M:%S.%3N") | eval request_time = strptime("ENDPOINT_LOG{}.EML_REQUEST_TIME", "%Y-%m-%d%H:%M:%S.%3N") | stats avg(eval(response_time - request_time)) AS avg_response

I tried your query together with the mvexpand command, but there are no values for avg_response.


vrmandadi
Builder

Here is a sample event, so that you have a better idea:

{
  DATETIME: 03/04/2016 11:47:00:925
  ENDPOINT_LOG: [
    {
      EML_BUSINESS_DATE: 2016-01-19
      EML_REQUEST_TIME: 2016-01-19 15:44:00.749 +00:00
      EML_EXTERNAL_RESULT_CODE: 000
      EML_ID: 724109
      EML_RESPONSE_TIME: 2016-01-19 15:44:02.366 +00:00
      EML_RES_CONN_URI: /JSONDepositing%231
      EML_RRN: 160119154358748
      EML_STATE: 2
      EML_TQU_ID: A
    }
    {
      EML_BUSINESS_DATE: 2016-01-19
      EML_EXTERNAL_RESULT_CODE: 000
      EML_ID: 2147483647
      EML_INTERNAL_MESSAGE_TYPE: 1100
      EML_LAST_UPDATE_TS: 2016-01-19 15:44:02.370 +00:00
      EML_MESSAGE_UID: s4970qalv:DCD_DB2:201601191544000749:000000000000000001
      EML_REQUEST_TIME: 2016-01-19 15:44:01.163 +00:00
      EML_REQ_CONN_URI: connection:/s4970qalv/DCD_PB1/DCD_JSON_PayingBank_TestHarness/DCD_TestHarness_PB/JSONPayingBankDCDTestHarness%231
      EML_REQ_IAP_NAME: DCD_TestHarness_PB
      EML_REQ_RAW_MESSAGE: (BLOB)
      EML_REQ_RAW_MESSAGE_EKY_ID: 221
      EML_RESPONSE_TIME: 2016-01-19 15:44:02.366 +00:00
      EML_RES_CONN_URI: connection:/s4970qalv/PayingBank_TestHarness


MuS
Legend

Is this the complete event? It looks like some data is missing ...


vrmandadi
Builder

It's a big chunk of data, do you want the whole thing?

{"DATETIME":"03/04/2016 11:47:00:925","TRL_ID":337651,"TRL_BUSINESS_DATE":"2016-01-19","TRL_APPR_ID":977,"TRL_TSC_CODE":201,"TRL_TQU_ID":"A","TRL_TCG_ID":3,"TRL_TTY_ID":3,"TRL_FUNCTION_CODE":0,"TRL_MESSAGE_REASON_CODE":0,"TRL_RRN":"160119154358748","TRL_AMT_COMPLETED":0,"TRL_AMT_COMPLETED_PREV_TXN":0,"TRL_AMT_TXN":0,"TRL_AMT_CARDHOLDER_BILLING":0,"TRL_AMT_CARDHOLDER_BILLING_FEE":0,"TRL_CONVERSION_RATE_CARDHOLDER":0,"TRL_TXN_CUR_ISO_ID":0,"TRL_CARD_BILLING_CUR_ISO_ID":0,"TRL_PAN_CTY_ISO_ID":0,"TRL_CARD_SEQUENCE_NBR":0,"TRL_CARD_TRACK_NBR":0,"TRL_PIN_RETRY_COUNT":0,"TRL_ACQR_CTY_ISO_ID":0,"TRL_MCC_ID":0,"TRL_CARD_ACCEPTOR_MBC_ID":0,"TRL_CARD_ACPT_NAME_LOCATION":"null null null null null","TRL_AUTHORISED_BY":"DCD_JSON_PayingBank_TestHarness/DCD_TestHarness_PB","TRL_ACTION_RESPONSE_CODE":0,"TRL_MATCH_FLAG":"N","TRL_SERVICE_CODE":0,"TRL_STANDIN_REASON_INDICATOR":0,"TRL_AUTHORIZATION_LIFE_CYCLE":0,"TRL_MESSAGE_NBR":0,"TRL_ORIG_TXN_CUR_ISO_ID":0,"TRL_ORIG_TXN_AMT":0,"TRL_ACCOUNT_TYPE_1_ATP_ID":0,"TRL_ACCOUNT_1_MAX_AVAILABLE":0,"TRL_ACCOUNT_1_SPENT_AMOUNT":0,"TRL_ACCOUNT_TYPE_2_ATP_ID":0,"TRL_ACCOUNT_2_MAX_AVAILABLE":0,"TRL_ACCOUNT_2_SPENT_AMOUNT":0,"TRL_PREV_TXN_TRL_ID":0,"TRL_PREV_TXN_TSC_CODE":0,"TRL_ADDNL_ACTION_CODES":"DRC: [000] DCD_JSON_PayingBank_TestHarness;\nORC: [000] DCD_JSON_DepositingBanks;\nSTACK:\n[0] Approved IN [Auth Driver] COMMENT [Default Result 
Code];\n","TRL_CUSTOM_DATA":"838E71646DE42A09F11C4A7C0C120D5C","TRL_CARD_DATA_INPUT_CPBLTY":"0","TRL_CARDHOLDER_AUTHENT_CPBLTY":"U","TRL_CARD_CAPTURE_CPBLTY":"U","TRL_OPERATING_ENVIRONMENT":"U","TRL_CARDHOLDER_PRESENT_IND":"U","TRL_CARD_PRESENT_IND":"U","TRL_CARD_DATA_INPUT_MODE":"0","TRL_CARDHOLDER_AUTHENT_METHOD":"U","TRL_CARDHOLDER_AUTHENT_ENTITY":"U","TRL_CARD_DATA_OUTPUT_CPBLTY":"0","TRL_TERMINAL_OUTPUT_CPBLTY":"0","TRL_PIN_CAPTURE_CPBLTY":"0","TRL_DEST_STAN":"160119154358748","TRL_ORIGIN_ICH_NAME":"DCD_JSON_DepositingBanks","TRL_DEST_ICH_NAME":"DCD_JSON_PayingBank_TestHarness","TRL_AUX_MESSAGE_TYPE":1100,"TRL_ORIGIN_FEP_NBR":0,"TRL_RVRSL_TYPE":0,"TRL_TAC_ACC_SEQ":0,"TRL_BPM_BILL_SEQ":0,"TRL_EFFECTIVE_TQU_ID":"A","TRL_ORIGIN_IAP_NAME":"DCD_JSON_DB_1","TRL_DEST_IAP_NAME":"DCD_TestHarness_PB","TRL_MESSAGE_TYPE":"CMP_REQ","TRL_CARD_ADNL_STATUS_CODE":";;;","TRL_SESSION":"0","TRL_TRANSACTION_ID":"160119154358748","TRL_LOCATION_LOGGED":"Default Location","TRL_TXN_TO_SETTLE_RATE":0,"TRL_CARDBILLING_TO_SETTLE_RATE":0,"TRL_AMT_SETTLE":0,"TRL_SETTLE_CUR_ISO_ID":0,"TRL_AMT_ACCOUNT":0,"TRL_ACCOUNT_CUR_ISO_ID":0,"TRL_SETTLE_TO_ACCOUNT_RATE":0,"TRL_AMT_MERCH_SETTLE":0,"TRL_MERCH_SETTLE_CUR_ISO_ID":0,"TRL_SETTLE_TO_MER_SETTLE_RATE":0,"TRL_TIME_ZONE":"GMT-06:00","TRL_ORIGIN_RESULT_CODE":"000","TRL_DESTINATION_RESULT_CODE":"000","TRL_FRACTALS_BRIDGE_SENT":"N","TRL_MESSAGE_UID":"s4970qalv:DCD_DB2:201601191544000749:000000000000000001","TRL_DATETIME_LOCAL_TXN":"2016-01-19 15:43:58.725 +00:00","TRL_DATE_CAPTURE":"2016-01-19 00:00:00.000 +00:00","TRL_DEST_REPLY_TIME":"2016-01-19 15:44:02.366 +00:00","TRL_DEST_REQUEST_TIME":"2016-01-19 15:44:01.163 +00:00","TRL_SYSTEM_TIMESTAMP":"2016-01-19 03:44:00.749 PM","TRL_LAST_UPDATE_TS":"2016-01-19 15:44:00.749 +00:00","TRL_CUSTOM_DATA_EKY_ID":221,"EWS_BUSINESS_DATE":"2016-01-19","EWS_CPC_ID":10284,"EWS_DBP_ID":35,"EWS_TRL_ID":337649,"EWS_LAST_UPDATE_TS":"2016-01-19 15:44:02.729 
+00:00","EWS_PB_ACTIVE_FLAG":"A","EWS_DB_REQUIRED_PB_IC":"Y","EWS_DB_REQUIRED_PB_PP":"Y","EWS_DB_REQUIRED_PB_IP":"N","EWS_DB_STANDIN_OPTION":"Y","EWS_DB_ACTIVE_FLAG":"Y","EWS_PB_FORWARD_NITS":"Y","EWS_PB_ALLOWED_DB_IP":"Y","EWS_PB_ALLOWED_DB_PP":"Y","EWS_PB_ALLOWED_DB_IC":"Y","EWS_CPC_ROUTE_TO_PB_OPT":"Y","EWS_CPC_ROUTE_TO_RTDC_OPT":"Y","EWS_MESSAGE_TOKENIZED":"N","EWS_NON_DCD_PARTICIPANT":"N","EWS_MICR_EPC":"4","EWS_PB_GUARANTEE_IND":"G","EWS_DUPLICATE_SUSPECT":"N","EWS_PRIMARY_CLIENT_ID":"p111000111","EWS_MESSAGE_QUALIFIER":"CMP_REQ","EWS_OFFSET_DATE":"2016-01-19","EWS_AMOUNT":"30000","EWS_PB_GUARANTEE_IND_EXPIRY":"2016-01-20","EWS_OFFSET_ID":"160119154358748","EWS_SERIAL_NUMBER":"987654321000001","EWS_MICR_AUX_ON_US":"12345678","EWS_ITEM_ID":"160119154358748","EWS_ACCOUNT_DRAWEE":"122200000001","EWS_PAYEE_ACCOUNT_NUMBER":"111100000001","EWS_CHANNEL":"EE","EWS_DB_BANK_STATUS":"IC","EWS_PB_BANK_STATUS":"IC","EWS_USER_ID":"doej","EWS_MICR_ON_US":"122200000001-12","EWS_RTDC_RESPONSE_CODE":"000","EWS_IMAGE_BACK":"...","EWS_IMAGE_FRONT":"...","EWS_IMAGE_TYPE":"jpg","EWS_REQUEST_CONTEXT":"DEP","EWS_INSTRUMENT_TYPE":"CHK","EWS_SWITCH_LEVEL_RESP_CODE":"000","EWS_CPC_GUARANTEE_IND_U":"BOTH","EWS_DB_PROGRAM_ID":"RTDC","EWS_CPC_GUARANTEE_IND_G":"PB","EWS_CPC_GUARANTEE_IND_N":"BOTH","EWS_REQUEST_ADDENDA_1":":","EWS_SERVICE_NAME":"DCD","EWS_ROUTING_DECISION":"PB and RTDC","EWS_PC_TRAN":"12","EWS_DB_CLIENT_API":"DCDCLI","EWS_PAYEE_ACCOUNT_ABA":"111000111","EWS_MICR_ABA":"222000111","EWS_PAYING_BANK_ID":"222000111","ENDPOINT_LOG": [{"EML_RES_RAW_MESSAGE":"(BLOB)","EML_REQ_RAW_MESSAGE":"(BLOB)","EML_BUSINESS_DATE":"2016-01-19","EML_ID":724109,"EML_REQ_RAW_MESSAGE_EKY_ID":221,"EML_RES_RAW_MESSAGE_EKY_ID":221,"EML_INTERNAL_MESSAGE_TYPE":1100,"EML_STATE":2,"EML_LAST_UPDATE_TS":"2016-01-19 15:44:02.824 +00:00","EML_RESPONSE_TIME":"2016-01-19 15:44:02.366 +00:00","EML_REQUEST_TIME":"2016-01-19 15:44:00.749 
+00:00","EML_EXTERNAL_RESULT_CODE":"000","EML_TQU_ID":"A","EML_RRN":"160119154358748","EML_MESSAGE_UID":"s4970qalv:DCD_DB2:201601191544000749:000000000000000001","EML_REQ_CONN_URI":"connection:/s4970qalv/DCD_DB2/DCD_JSON_DepositingBanks/DCD_JSON_DB_1/JSONDepositingBankDCD%231","EML_RES_CONN_URI":"connection:/s4970qalv/DCD_DB2/DCD_JSON_DepositingBanks/DCD_JSON_DB_1/JSONDepositingBankDCD%231","EML_REQ_IAP_NAME":"DCD_JSON_DB_1","EML_RES_IAP_NAME":"DCD_JSON_DB_1"},{"EML_RES_RAW_MESSAGE":"(BLOB)","EML_REQ_RAW_MESSAGE":"(BLOB)","EML_BUSINESS_DATE":"2016-01-19","EML_TSC_CODE":201,"EML_ID":2147483647,"EML_REQ_RAW_MESSAGE_EKY_ID":221,"EML_RES_RAW_MESSAGE_EKY_ID":221,"EML_INTERNAL_MESSAGE_TYPE":1100,"EML_STATE":2,"EML_LAST_UPDATE_TS":"2016-01-19 15:44:02.370 +00:00","EML_RESPONSE_TIME":"2016-01-19 15:44:02.366 +00:00","EML_REQUEST_TIME":"2016-01-19 15:44:01.163 +00:00","EML_EXTERNAL_RESULT_CODE":"000","EML_STAN":"160119154358748","EML_TQU_ID":"A","EML_RRN":"160119154358748","EML_MESSAGE_UID":"s4970qalv:DCD_DB2:201601191544000749:000000000000000001","EML_REQ_CONN_URI":"connection:/s4970qalv/DCD_PB1/DCD_JSON_PayingBank_TestHarness/DCD_TestHarness_PB/JSONPayingBankDCDTestHarness%231","EML_RES_CONN_URI":"connection:/s4970qalv/DCD_PB1/DCD_JSON_PayingBank_TestHarness/DCD_TestHarness_PB/JSONPayingBankDCDTestHarness%231","EML_REQ_IAP_NAME":"DCD_TestHarness_PB","EML_RES_IAP_NAME":"DCD_TestHarness_PB"}]}
