Splunk Search

How to get single row output with fields from multiple events from multiple log files

poddraj
Explorer

Hi Team,

My scenario: I have multiple request and response XMLs, which are basically my events in the index for one circuit ID. Whenever I submit a request with the circuit ID from the UI, it creates a new transaction ID for that particular hit, which means the logs will have multiple request IDs for the same circuit ID within one day.
What I need: when I search with the circuit ID, it should give me a table output showing all the different request IDs along with their specific response fields in a single row.

My challenge here is that I am trying to show the fields from the request & response XMLs from multiple source files in a single row, but the search is returning multiple rows. Please help if there is any way to get this done.


to4kawa
Ultra Champion
index=e2etest OR index=dtix AND source IN (/opt/delphi/dtix/tomcat/logs/Messaging.log* , *NFAM.log) 2020021076664318
| eval sessions = 1
| table DELPHI_REQUEST.REQUEST.CID,DELPHI_REQUEST.REQUEST.COMMAND,DELPHI_RESPONSE.RESULTS.TRANID,DELPHI_REQUEST.CTLHDR.SYS_NAME,DELPHI_REQUEST.CTLHDR.REQ_TIME_STAMP,DELPHI_REQUEST.CTLHDR.TRANID,source, sessions
| stats list(*) as * by sessions
| table DELPHI_REQUEST.REQUEST.CID,DELPHI_REQUEST.REQUEST.COMMAND,DELPHI_RESPONSE.RESULTS.TRANID,DELPHI_REQUEST.CTLHDR.SYS_NAME,DELPHI_REQUEST.CTLHDR.REQ_TIME_STAMP,DELPHI_REQUEST.CTLHDR.TRANID,source
| rename DELPHI_REQUEST.REQUEST.CID as "CKTID",DELPHI_REQUEST.REQUEST.COMMAND as "Command",DELPHI_RESPONSE.RESULTS.TRANID as "Resp_Tranid",DELPHI_REQUEST.CTLHDR.TRANID as "Req_Tranid",DELPHI_REQUEST.CTLHDR.SYS_NAME as "System",DELPHI_REQUEST.CTLHDR.REQ_TIME_STAMP as "Req Timestamp",DELPHI_REQUEST.REQUEST.SID as "Req_SessionId",DELPHI_RESPONSE.REQUEST.SID as "resp_SessionId",DELPHI_RESPONSE.RESULTS.ANALYSIS.TRBL_CODE as "Trouble Code"

To merge fields from different events, create a common key and use stats.
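A minimal sketch of that pattern, assuming every event returned by the base search belongs to the same transaction, so a constant key is enough:

... | eval key=1
| stats values(*) as * by key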

previous answer:

| makeresults
| eval _raw="<APP_REQUEST>
     <CTLHDR>
         <DIPVER>5.0</DIPVER>
         <DOMAIN>FTTP</DOMAIN>
         <SVC_ID>|1077606440069375000|HEM</SVC_ID>
          <SYS_ID>10-118-21.xx.com</SYS_ID>
         <SVC_NAME>TEST</SVC_NAME>
         <SYS_NAME>DTI_EXPRESS</SYS_NAME>
         <DTIMEOUT>3</DTIMEOUT>
         <ATTACHMENTS>Y</ATTACHMENTS>
         <REQ_TIME_STAMP>2020-02-10T10:11:14Z</REQ_TIME_STAMP>
         <TSTMODE>REAL</TSTMODE>
     </CTLHDR>
     <REQUEST>
         **<CID>12345</CID>**
         <TSTCOND>S</TSTCOND>
         <TR_TYPE>CCON</TR_TYPE>
         <SVCTYPE>DATA</SVCTYPE>
         **<COMMAND>GET_BHR_CFG</COMMAND>
         <EMP_ID>97864</EMP_ID>**
         <WK_TYPE>M</WK_TYPE>
         <JOB_TYPE>M</JOB_TYPE>
         <REQUIRED>
             <SVCTYPE>DATA</SVCTYPE>
             <COMMAND>GET_BHR_CFG</COMMAND>
             <EMP_ID>97864</EMP_ID>
             <WK_TYPE>M</WK_TYPE>
             <JOB_TYPE>M</JOB_TYPE>
         </REQUIRED>
         <ATTRIBUTES>
             <NUM_ATTRS>0</NUM_ATTRS>
         </ATTRIBUTES>
         <SID>A76664317</SID>
     </REQUEST>
 </APP_REQUEST>
#
 <APP_RESPONSE>
     <CTLHDR>
         <DIPVER>5.0</DIPVER>
         <DOMAIN>FTTP</DOMAIN>
         <SVC_ID>|1074764670547209000|HEM</SVC_ID>
         <SYS_ID>10-118-21.xx.com</SYS_ID>
         <SVC_NAME>TEST</SVC_NAME>
         <SYS_NAME>DTI_EXPRESS</SYS_NAME>
         <DTIMEOUT>3</DTIMEOUT>
         <ATTACHMENTS>Y</ATTACHMENTS>
         <REQ_TIME_STAMP>2020-02-10T09:23:52Z</REQ_TIME_STAMP>
         <TSTMODE>REAL</TSTMODE>
     </CTLHDR>
     <REQUEST>
         <CID>12345</CID>
         <TSTCOND>S</TSTCOND>
         **<TR_TYPE>CCON</TR_TYPE>**
         <SVCTYPE>DATA</SVCTYPE>
         <COMMAND>GET_BHR_CFG</COMMAND>
         <EMP_ID> 97864</EMP_ID>
         <WK_TYPE>M</WK_TYPE>
         <JOB_TYPE>M</JOB_TYPE>
         <NETYPE>BHR</NETYPE>
         <REQUIRED>
             <SVCTYPE>DATA</SVCTYPE>
             <COMMAND>GET_BHR_CFG</COMMAND>
             <EMP_ID>96864</EMP_ID>
             <WK_TYPE>M</WK_TYPE>
             <JOB_TYPE>M</JOB_TYPE>
         </REQUIRED>
         <ATTRIBUTES>
             <NUM_ATTRS>0</NUM_ATTRS>
         </ATTRIBUTES>
         <SID>A76555190</SID>
     </REQUEST>
  <RESULTS>
   <RETC>0000</RETC>
   <RSTYPE>C</RSTYPE>
   **<INFOMSG>Request Successful</INFOMSG>**
   <TIME_SENT>2017-12-13T15:17:07Z</TIME_SENT>
   **<TRANID>2017121376436144</TRANID>**
   <SID>A76436143</SID>
   <NUM_ANA>1</NUM_ANA>
   <ANALYSIS>
    <NEID>G131500234</NEID>
    <NETYPE>BHR</NETYPE>
    **<DC>TOK</DC>**
    <DC_SSUM>Retrieve succees</DC_SSUM>
    <CKTDM>FTTP</CKTDM>
    <TRBL_LOCATION_CODE>G1A00234</TRBL_LOCATION_CODE>
    **<TRBL_CODE>TSDELTOK</TRBL_CODE>**
    <TRBL_LAYER>1</TRBL_LAYER>
    <EXPLANATION>Retrieve success</EXPLANATION>
   </ANALYSIS>
 </RESULTS>
 </APP_RESPONSE>"
 | makemv delim="#" _raw
| stats count by _raw
`comment("this is your sample, from here, the logic")`
| spath
| stats values(*) as *

poddraj
Explorer

HI to4kawa,

Thanks for your reply. I used your query with eval sessions=1, but it is still giving me multiple rows, like below.

Query:
index=fios OR index=dtix AND source IN (/opt/delphi/dtix/tomcat/logs/Messaging.log*, *NFAM.log ) "DELPHI_REQUEST.CTLHDR.DOMAIN"=FTTP OR "DELPHI_RESPONSE.CTLHDR.DOMAIN"=FTTP 84/KQXA/134861/VZVA
| eval sessions=1
| stats list(DELPHI_REQUEST.REQUEST.CID) as CKTID list(DELPHI_RESPONSE.RESULTS.TRANID) as "Tranid" list(DELPHI_REQUEST.CTLHDR.TRANID) as Req_Tranid list(DELPHI_REQUEST.REQUEST.COMMAND) as Command list(DELPHI_REQUEST.CTLHDR.REQ_TIME_STAMP) as TimeStamp list(DELPHI_RESPONSE.RESULTS.ANALYSIS.TRBL_CODE) as TroubleCode by sessions

O/P:
84/KQXA/VZVA 2020020576000004 2020020576000004 GET_TOPOLOGY 2020-02-05T07:01:11Z TSDELRVW
84/KQXA/VZVA 2020020576000004 GET_TOPOLOGY 2020-02-05T07:01:11Z TSDELRVW
84/KQXA/VZVA GET_TOPOLOGY 2020-02-05T07:01:11Z TSDELRVW
Actually, the above query gives me 6 raw events (i.e. 6 XMLs) from 2 source files. I need only 1 row of output, i.e. only the 1st row in the output above.


to4kawa
Ultra Champion
index=fios OR index=dtix AND source IN (/opt/delphi/dtix/tomcat/logs/Messaging.log*, *NFAM.log ) "DELPHI_REQUEST.CTLHDR.DOMAIN"=FTTP OR "DELPHI_RESPONSE.CTLHDR.DOMAIN"=FTTP 84/KQXA/134861/VZVA
| eval sessions=1
| stats list(DELPHI_REQUEST.REQUEST.CID) as CKTID list(DELPHI_RESPONSE.RESULTS.TRANID) as "Tranid" list(DELPHI_REQUEST.CTLHDR.TRANID) as Req_Tranid list(DELPHI_REQUEST.REQUEST.COMMAND) as Command list(DELPHI_REQUEST.CTLHDR.REQ_TIME_STAMP) as TimeStamp list(DELPHI_RESPONSE.RESULTS.ANALYSIS.TRBL_CODE) as TroubleCode by sessions
| dedup sessions

poddraj
Explorer

Thanks for your reply to4kawa.

I tried the above query and I am getting results, but I have the questions below:

1. It is not listing the column values in relation to each other.

example:
Col : Value
Tranid:123
Command:test
Circuitid:ABC

As per the logs, I have a few more tranids under the same ABC circuit, but those are not shown in separate rows; they are listed in the output, but not against the same circuit ID or command. Basically, the ordering is not correct in the output.

  2. I tried to use the sort command on top of the query, but it is not working.
  3. When I try to export, it is not exporting as rows; it exports the entire data into a single row. How do I export it as a normal Excel file?

to4kawa
Ultra Champion

You used sessions in your query, and the field conditions were not provided by you, so I used it as is.

O/P:

84/KQXA/VZVA 2020020576000004 2020020576000004 GET_TOPOLOGY 2020-02-05T07:01:11Z TSDELRVW
84/KQXA/VZVA 2020020576000004 GET_TOPOLOGY 2020-02-05T07:01:11Z TSDELRVW
84/KQXA/VZVA GET_TOPOLOGY 2020-02-05T07:01:11Z TSDELRVW

You say this is actually 6 rows. What are they?

I tried to use the sort command on top of the query but it is not working
ans: Some of the values are multivalue, so sort does not work on them.
When I try to export it is not exporting as rows but it is exporting into a single row with the entire data. How to export it as normal excel
ans: If you want to export, you should add a table command at the end.
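For example, a minimal sketch of that export step, reusing the field names from the stats above (the mvexpand line is optional; it splits the multivalue Tranid so each transaction ID exports on its own row):

... | mvexpand Tranid
| table CKTID Tranid Req_Tranid Command TimeStamp TroubleCode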


poddraj
Explorer

Hi to4kawa,

Let me put my requirement in a simple way with the help of the events and logs below.

Example:
index=test1
source:abc.log
Event 1:
<emp_request>
<name>sam</name>
<age>32</age>
<eid>123</eid>
</emp_request>

index=test2
source pqr.log
Event 1:
<emp_request>
<name>sam</name>
<age>32</age>
<eid>123</eid>
</emp_request>

Event 2:
<emp_request>
<name>sam</name>
<age>32</age>
<eid>123</eid>
<tranid>456</tranid> -- this log adds this extra field to the above XML and sends it downstream
</emp_request>

source:xyz.log
Event 1:

<emp_request>
<name>sam</name>
<age>32</age>
<eid>123</eid>
<tranid>456</tranid> 
</emp_request>


Event 2:
<emp_response>
<name>sam</name>
<age>32</age>
<eid>123</eid>
<tranid>456</tranid> 
<sal>100$</sal> -- this is a new field this log adds in response to the above request XML
<bonus>0.25</bonus> -- this is a new field this log adds in response to the above request XML
</emp_response>

The above response XML event in source xyz.log will also be available in sources abc.log & pqr.log, because it is the final response sent from xyz.log to pqr.log, which in turn sends it to abc.log.

Now I need the columns below as a single-row table output when I search with my employee name sam:
name|eid|tranid|esal|bonus

Note: I need esal & bonus to be picked from Event 2 in source xyz.log, name & eid from Event 1 in source abc.log, and tranid from Event 2 of source pqr.log.
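For this simplified example, a minimal sketch of a stats-based approach, assuming the fields are extracted with spath and that eid is the common key (test1, test2 and the tag names are the hypothetical ones from the sample above; I use the sal/bonus tag names rather than esal):

(index=test1 OR index=test2) sam
| spath
| eval eid=coalesce('emp_request.eid', 'emp_response.eid'), name=coalesce('emp_request.name', 'emp_response.name'), tranid=coalesce('emp_request.tranid', 'emp_response.tranid')
| stats values(name) as name values(tranid) as tranid values(emp_response.sal) as sal values(emp_response.bonus) as bonus by eid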


to4kawa
Ultra Champion

You say these are 6 rows of events.
In Splunk, these are multivalues, not rows.
As a single row, these are already single.
How do you want to display these?


maciep
Champion

Depending on how your search currently works and/or what the current multi-row results look like, you could maybe use chart or xyseries.

So if you're already using stats to format the data, maybe use chart instead, e.g.:

... | chart latest(result) over circuit_id by request_id

Or if you already have a table with fields of circuit_id, request_id and result, then maybe use xyseries to reformat it to a chart layout.

... | xyseries circuit_id, request_id, result

Without seeing the search, I'm not sure if those will help, but worth mentioning, I think.
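With the field names used earlier in this thread, that chart idea might look something like the sketch below, assuming the circuit ID, request transaction ID and trouble code are extracted as single-value fields named CKTID, Req_Tranid and TroubleCode:

... | chart latest(TroubleCode) over CKTID by Req_Tranid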


poddraj
Explorer

My query is as below:
index=e2etest OR index=dtix AND source IN (/opt/delphi/dtix/tomcat/logs/Messaging.log* , *NFAM.log) 2020021076664318
| table DELPHI_REQUEST.REQUEST.CID,DELPHI_REQUEST.REQUEST.COMMAND,DELPHI_RESPONSE.RESULTS.TRANID,DELPHI_REQUEST.CTLHDR.SYS_NAME,DELPHI_REQUEST.CTLHDR.REQ_TIME_STAMP,DELPHI_REQUEST.CTLHDR.TRANID,source
| rename DELPHI_REQUEST.REQUEST.CID as "CKTID",DELPHI_REQUEST.REQUEST.COMMAND as "Command",DELPHI_RESPONSE.RESULTS.TRANID as "Resp_Tranid",DELPHI_REQUEST.CTLHDR.TRANID as "Req_Tranid",DELPHI_REQUEST.CTLHDR.SYS_NAME as "System",DELPHI_REQUEST.CTLHDR.REQ_TIME_STAMP as "Req Timestamp",DELPHI_REQUEST.REQUEST.SID as "Req_SessionId",DELPHI_RESPONSE.REQUEST.SID as "resp_SessionId",DELPHI_RESPONSE.RESULTS.ANALYSIS.TRBL_CODE as "Trouble Code"

Click on the image URL below for the output of my search:

https://drive.google.com/file/d/15R077UEyxoEaP2Z7d2YdeShfXzkA7jdI/view?usp=drivesdk


maciep
Champion

You seem to be tabling out a lot of fields. Can you let us know what the fields/headers would be for the ideal "one row" of data you're looking to get?


poddraj
Explorer

Hi maciep,
My ideal output is as below, with only 6 columns, and the query I ran to get it is:

index=dtix "CID"=82/KQXA/683013/VZNY
| eval CKTID_TN=coalesce(DELPHI_RESPONSE.REQUEST.CID,DELPHI_RESPONSE.REQUEST.TN)
| table CID, CKTID_TN, TRANID, COMMAND, REQ_TIME_STAMP, TRBL_CODE, DC
Note: The CKTID_TN value is not coming up; I'm not sure why my coalesce is not working, since it should ideally return the same value as the CID column.
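One likely reason the coalesce returns nothing: in eval, field names that contain dots have to be wrapped in single quotes, otherwise the dot is read as the concatenation operator. A sketch of the corrected line:

| eval CKTID_TN=coalesce('DELPHI_RESPONSE.REQUEST.CID','DELPHI_RESPONSE.REQUEST.TN')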

CID TRANID COMMANDREQ_TIME_STAMPTRBL_CODEDC

82/KQXA/683013/VZNY 2020021376014711 GET_BHR_CFG 2020-02-13T03:21:59Z TSDELTOK TOK
82/KQXA/683013/VZNY 2020021076664318 GET_BHR_CFG 2020-02-10T10:11:14Z TSDELTOK TOK
82/KQXA/683013/VZNY 2017121376436144 GET_BHR_CFG 2020-02-10T09:23:52Z TSDELTOK TOK
82/KQXA/683013/VZNY 2020021076000001 GET_BHR_CFG 2020-02-10T07:11:55Z TSDELTOK TOK

Here, all the fields I am using are from the response XML of the circuit ID, and hence I get it in a single row; but when I try to add a few fields from the request XML, it creates 2 rows for the same circuit ID, where one row shows the fields from the request XML and the other row shows the values from the response XMLs.

I even tried to join those two indexes with the join query below, but I am still getting duplicate rows, maybe because the same request XMLs are available in both indexes?

index = dtix source=/opt/delphi/dtix/tomcat/logs/Messaging.log* 76/KQXA/266281/VZNY
| join type=left CID [search index = fios source=*NFAM.log 76/KQXA/266281/VZNY
] | table CID, COMMAND, TRANID, REQ_TIME_STAMP, TRBL_CODE, DC
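As an alternative to join, a minimal sketch of the stats-based merge suggested earlier in the thread, assuming CID, COMMAND, TRANID, REQ_TIME_STAMP, TRBL_CODE and DC are extracted under the same names in both sources:

((index=dtix source=/opt/delphi/dtix/tomcat/logs/Messaging.log*) OR (index=fios source=*NFAM.log)) 76/KQXA/266281/VZNY
| stats values(COMMAND) as COMMAND values(TRANID) as TRANID values(REQ_TIME_STAMP) as REQ_TIME_STAMP values(TRBL_CODE) as TRBL_CODE values(DC) as DC by CID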


wmyersas
Builder

Please share the search you currently have and, if possible, some sample data.


poddraj
Explorer

The query I am using is as below:

index=e2etest OR index=dtix AND source IN (/opt/delphi/dtix/tomcat/logs/Messaging.log* , *NFAM.log) 2020021076664318
| table DELPHI_REQUEST.REQUEST.CID,DELPHI_REQUEST.REQUEST.COMMAND,DELPHI_RESPONSE.RESULTS.TRANID,DELPHI_REQUEST.CTLHDR.SYS_NAME,DELPHI_REQUEST.CTLHDR.REQ_TIME_STAMP,DELPHI_REQUEST.CTLHDR.TRANID,source
| rename DELPHI_REQUEST.REQUEST.CID as "CKTID",DELPHI_REQUEST.REQUEST.COMMAND as "Command",DELPHI_RESPONSE.RESULTS.TRANID as "Resp_Tranid",DELPHI_REQUEST.CTLHDR.TRANID as "Req_Tranid",DELPHI_REQUEST.CTLHDR.SYS_NAME as "System",DELPHI_REQUEST.CTLHDR.REQ_TIME_STAMP as "Req Timestamp",DELPHI_REQUEST.REQUEST.SID as "Req_SessionId",DELPHI_RESPONSE.REQUEST.SID as "resp_SessionId",DELPHI_RESPONSE.RESULTS.ANALYSIS.TRBL_CODE as "Trouble Code"

but it is giving me multiple rows for the same transaction ID.


poddraj
Explorer

Hi wmyersas,

Please find my query below :

index=e2etest OR index=dtix AND source IN (/opt/delphi/dtix/tomcat/logs/Messaging.log* , *NFAM.log) 2020021076664318
| table DELPHI_REQUEST.REQUEST.CID,DELPHI_REQUEST.REQUEST.COMMAND,DELPHI_RESPONSE.RESULTS.TRANID,DELPHI_REQUEST.CTLHDR.SYS_NAME,DELPHI_REQUEST.CTLHDR.REQ_TIME_STAMP,DELPHI_REQUEST.CTLHDR.TRANID,DELPHI_RESPONSE.RESULTS.ANALYSIS.TRBL_CODE,source
| rename DELPHI_REQUEST.REQUEST.CID as "CKTID",DELPHI_REQUEST.REQUEST.COMMAND as "Command",DELPHI_RESPONSE.RESULTS.TRANID as "Resp_Tranid",DELPHI_REQUEST.CTLHDR.TRANID as "Req_Tranid",DELPHI_REQUEST.CTLHDR.SYS_NAME as "System",DELPHI_REQUEST.CTLHDR.REQ_TIME_STAMP as "Req Timestamp",DELPHI_REQUEST.REQUEST.SID as "Req_SessionId",DELPHI_RESPONSE.REQUEST.SID as "resp_SessionId",DELPHI_RESPONSE.RESULTS.ANALYSIS.TRBL_CODE as "Trouble Code"

The fields are from 2 sources and from 2 different events (one event from the request XML and the other from the response XML), but they relate to 1 circuit ID.
I need a single-row output with all those fields.


poddraj
Explorer

Hi,
I am adding more inputs for a better understanding of my issue. Below are my request & response XML samples. I need a tabular output in a single row for 1 circuit ID, with the highlighted fields from the request & response XMLs below.
Kindly help, as I am getting the output in 2 separate rows, where the request XML fields are in one row and the response XML fields are in another row.
Note: APP_REQUEST.REQUEST.CID & APP_RESPONSE.REQUEST.CID will always have the same value.

Request XML whose source = A
---------------------------------------
<APP_REQUEST>
    <CTLHDR>
        <DIPVER>5.0</DIPVER>
        <DOMAIN>FTTP</DOMAIN>
        <SVC_ID>|1077606440069375000|HEM</SVC_ID>
         <SYS_ID>10-118-21.xx.com</SYS_ID>
        <SVC_NAME>TEST</SVC_NAME>
        <SYS_NAME>DTI_EXPRESS</SYS_NAME>
        <DTIMEOUT>3</DTIMEOUT>
        <ATTACHMENTS>Y</ATTACHMENTS>
        <REQ_TIME_STAMP>2020-02-10T10:11:14Z</REQ_TIME_STAMP>
        <TSTMODE>REAL</TSTMODE>
    </CTLHDR>
    <REQUEST>
        **<CID>12345</CID>**
        <TSTCOND>S</TSTCOND>
        <TR_TYPE>CCON</TR_TYPE>
        <SVCTYPE>DATA</SVCTYPE>
        **<COMMAND>GET_BHR_CFG</COMMAND>
        <EMP_ID>97864</EMP_ID>**
        <WK_TYPE>M</WK_TYPE>
        <JOB_TYPE>M</JOB_TYPE>
        <REQUIRED>
            <SVCTYPE>DATA</SVCTYPE>
            <COMMAND>GET_BHR_CFG</COMMAND>
            <EMP_ID>97864</EMP_ID>
            <WK_TYPE>M</WK_TYPE>
            <JOB_TYPE>M</JOB_TYPE>
        </REQUIRED>
        <ATTRIBUTES>
            <NUM_ATTRS>0</NUM_ATTRS>
        </ATTRIBUTES>
        <SID>A76664317</SID>
    </REQUEST>
</APP_REQUEST>

Response XML whose source = B
---------------------------------
<APP_RESPONSE>
    <CTLHDR>
        <DIPVER>5.0</DIPVER>
        <DOMAIN>FTTP</DOMAIN>
        <SVC_ID>|1074764670547209000|HEM</SVC_ID>
        <SYS_ID>10-118-21.xx.com</SYS_ID>
        <SVC_NAME>TEST</SVC_NAME>
        <SYS_NAME>DTI_EXPRESS</SYS_NAME>
        <DTIMEOUT>3</DTIMEOUT>
        <ATTACHMENTS>Y</ATTACHMENTS>
        <REQ_TIME_STAMP>2020-02-10T09:23:52Z</REQ_TIME_STAMP>
        <TSTMODE>REAL</TSTMODE>
    </CTLHDR>
    <REQUEST>
        <CID>12345</CID>
        <TSTCOND>S</TSTCOND>
        **<TR_TYPE>CCON</TR_TYPE>**
        <SVCTYPE>DATA</SVCTYPE>
        <COMMAND>GET_BHR_CFG</COMMAND>
        <EMP_ID> 97864</EMP_ID>
        <WK_TYPE>M</WK_TYPE>
        <JOB_TYPE>M</JOB_TYPE>
        <NETYPE>BHR</NETYPE>
        <REQUIRED>
            <SVCTYPE>DATA</SVCTYPE>
            <COMMAND>GET_BHR_CFG</COMMAND>
            <EMP_ID>96864</EMP_ID>
            <WK_TYPE>M</WK_TYPE>
            <JOB_TYPE>M</JOB_TYPE>
        </REQUIRED>
        <ATTRIBUTES>
            <NUM_ATTRS>0</NUM_ATTRS>
        </ATTRIBUTES>
        <SID>A76555190</SID>
    </REQUEST>
 <RESULTS>
  <RETC>0000</RETC>
  <RSTYPE>C</RSTYPE>
  **<INFOMSG>Request Successful</INFOMSG>**
  <TIME_SENT>2017-12-13T15:17:07Z</TIME_SENT>
  **<TRANID>2017121376436144</TRANID>**
  <SID>A76436143</SID>
  <NUM_ANA>1</NUM_ANA>
  <ANALYSIS>
   <NEID>G131500234</NEID>
   <NETYPE>BHR</NETYPE>
   **<DC>TOK</DC>**
   <DC_SSUM>Retrieve succees</DC_SSUM>
   <CKTDM>FTTP</CKTDM>
   <TRBL_LOCATION_CODE>G1A00234</TRBL_LOCATION_CODE>
   **<TRBL_CODE>TSDELTOK</TRBL_CODE>**
   <TRBL_LAYER>1</TRBL_LAYER>
   <EXPLANATION>Retrieve success</EXPLANATION>
  </ANALYSIS>
 </RESULTS>
</APP_RESPONSE>