
How to aggregate Splunk events with different key value pairs based on event time?

ahmedhassanean
Explorer

I have logs that contain different key/value pairs in different events, but all for the same transaction.
I would like to summarize all of these key/value pairs, together with their times, into a single event like this:
time1:key1:value1:time2:key2:value2:....

Is there any way to achieve that?

Thanks in advance


woodcock
Esteemed Legend

Like this:

index=default | table TransactionID "Customer number" agentID viewduration "number of requests" "Server URL"
| stats values(*) AS * BY TransactionID

Or maybe this:

index=default | table TransactionID "Customer number" agentID viewduration "number of requests" "Server URL"
| stats list(*) AS * BY TransactionID
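
As a side note, values(*) deduplicates and sorts each field's values, while list(*) keeps one entry per event in arrival order, so the second form is usually closer to a per-event timeline. A rough variant that also carries a readable timestamp along (just a sketch, assuming the same index and field names as above) might be:

index=default
| eval event_time=strftime(_time, "%H:%M:%S")
| table event_time TransactionID "Customer number" agentID viewduration "number of requests" "Server URL"
| stats list(*) AS * BY TransactionID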

somesoni2
Revered Legend

Maybe like this:

your base search | fields _time transactionID fieldlist here | stats list(*) as * by transactionID
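
One caveat worth noting: stats wildcards normally skip fields whose names begin with an underscore, so _time usually needs to be copied into an ordinary field first, for example:

your base search
| eval event_time=strftime(_time, "%H:%M:%S")
| fields event_time transactionID fieldlist here
| stats list(*) as * by transactionID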

Richfez
SplunkTrust

ahmedhassanean, could you please supply some actual sample events, and for those events please create some sort of a mock-up of the output you'd like? This will do WONDERS for the quality of responses you get. Right now I think we're all guessing as to how to rearrange your inputs into your desired outputs.


ahmedhassanean
Explorer

If I run the SPL below:

index=default | table TransactionID "Customer number" agentID viewduration "number of requests" "Server URL"

I will get the output below:

    time  TransactionID    Customer number    agentID    viewduration    number of requests    Server URL
10:01:01          15647                 51
10:01:02          15647                            13               2
10:01:03          15647                                                                   7
10:01:03          18333                                                                       google.com
10:01:04          15647
10:01:05          15647                                                                       google.com
10:01:06          15647
10:01:07          15647                 69
10:01:08          15647
10:01:09          15647                            74
10:01:10          15647                                           10
10:01:11          15647                                                                13
10:01:12          15647
10:01:13          15647                                           14                           yahoo.com
10:01:14          15647                 
10:01:15          15647                 
10:01:16          15647                                                                14   
10:01:17          15647                 
10:01:18          15647                            10                                          yahoo.com
11:01:18          18333                                                                17
12:01:18          18333                 13
13:01:18          18333

Richfez
SplunkTrust

Thanks, that helps! Now, what would you like the output of the above events to look like?


ahmedhassanean
Explorer

A sample of the desired output:
TransactionID=15647|10:01:01|Customernumber=51|10:01:02|agentID=13|10:01:02|agentID=13

One line for each transaction.


Richfez
SplunkTrust

... stats earliest(_time) as StartTime, values(Customernumber), latest(_time) as EndTime, list(agentID), ... wait, what? You want to normalize the first half of the stuff, but explicitly not normalize the last half of the stuff? That's difficult. How many agentID values are there?

Hmm, I'll have to think on that some more. Unless...
... stats earliest(_time) as StartTime, latest(_time) as EndTime, list(agentID), list(_time) by Customernumber, TransactionID

If that's not close enough, what would you or someone else actually use this data for, formatted the way you suggested? Would another format be useful too?
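
Another rough idea (only a sketch, built from the sample table above): glue the timestamp onto each agentID value before the stats, so each value carries its own time:

index=default
| eval agent_with_time=if(isnotnull(agentID), strftime(_time, "%H:%M:%S").":agentID=".tostring(agentID), null())
| stats earliest(_time) as StartTime latest(_time) as EndTime list(agent_with_time) as agentID_timeline by TransactionID

StartTime and EndTime come back as epoch values, so they'd need their own strftime to be human-readable.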


ahmedhassanean
Explorer

No, that's not what I want.
I would like to normalize all field/value pairs of each transaction, but with the time for each field/value in the row.

I was thinking of a command like mvzip, but it would be hard to pull off because of the large number of key/value pairs.
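
To illustrate why it gets unwieldy: an mvzip pass only lines up one field at a time, and only if the events are filtered so the two lists stay the same length. A sketch for agentID alone (assuming the field names from the sample table):

index=default
| where isnotnull(agentID)
| eval t=strftime(_time, "%H:%M:%S")
| stats list(t) as times list(agentID) as agents by TransactionID
| eval agent_pairs=mvzip(times, agents, ":agentID=")

Every additional field would need its own pass like this, which is the scaling problem mentioned above.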


somesoni2
Revered Legend

Have a look at the transaction command.
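
A minimal sketch of that, reusing the sample fields from above:

index=default
| transaction TransactionID
| table _time TransactionID duration eventcount agentID viewduration "Server URL"

transaction groups every event that shares a TransactionID into one multivalue event and adds duration and eventcount fields along the way.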


ahmedhassanean
Explorer

The transaction command collects events into one big multivalue event; instead, I want to have a single-value event per row.


ahmedhassanean
Explorer

No, that's a tabular format. I would like it to be a raw event: time field value ..... etc.


ahmedhassanean
Explorer

New events as below:
time1,field1=value1:time1,field2=value2:time2,field1=value2:time2,field3=value3 and so on

To summarize: I want to collect each field/value with its time, for all field/values of the same transactionID, into one new event.
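
For what it's worth, a foreach-based sketch along those lines (field1-field4 are placeholders matching the generic example, and it assumes mvappend quietly skips null arguments, which it normally does):

index=default
| eval t=strftime(_time, "%H:%M:%S")
| foreach field1 field2 field3 field4
    [ eval pairs=mvappend(pairs, if(isnotnull('<<FIELD>>'), t . ",<<FIELD>>=" . tostring('<<FIELD>>'), null())) ]
| stats list(pairs) as pairs by TransactionID
| eval summary=mvjoin(pairs, ":")

The summary field then reads like time1,field1=value1:time2,field3=value3:... with one event per TransactionID.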


richgalloway
SplunkTrust

Can you share some sample events?

---
If this reply helps you, Karma would be appreciated.

ahmedhassanean
Explorer

event1: time1 field1=value1 field2=value2
event2: time2 field1=value2 field3=value3
event3: time3 field4=value4
event4: time4
event5: time5 event1=value1 field6=value6


woodcock
Esteemed Legend

Given these events, what would you like the output to be?
