Splunk Search

Is it possible to use the transaction command to create several transactions in parallel based on distinct fields in the same search using e.g. XOR?

hettervik
Builder

Hi everyone!

To save resources, I want to combine several scheduled alerts into one. Each of the alerts runs a search and creates transactions. What I'm wondering is whether it's possible to use the transaction command to create transactions in parallel based on distinct fields, where the starting event defines which fields are used to group events for that transaction. That probably made no sense, so I'll give an example.

eventtype=UpdateWorkerSystem1 OR eventtype=UpdateWorkerSystem2

| transaction id_worker_sys1 XOR (id_worker_sys2a id_worker_sys2b) startswith="PATCH /api/workers/" maxspan=100s connected=t

What I'm trying to do in this example is to create transactions where events are grouped based on either the id of a worker in system 1 or one of the two ids of a worker in system 2. A grouped event could contain both the id of a worker in system 1 and one of the ids of a worker in system 2, but if the first event in a transaction has a worker id from e.g. system 2, then all events in that transaction have to have a worker id from system 2 (id_worker_sys2a or id_worker_sys2b).

It seems to me that when I create a transaction like this, the transaction command only "sees" the id_worker_sys1 field. Does the transaction syntax even support brackets and/or the "exclusive or" (XOR)? If my question made sense, is there another way to achieve what I want to do?

Thanks! Best regards.

1 Solution

Richfez
SplunkTrust

Transaction alone can't do that, but used with append it can. Append takes the pile of events from one search and tacks a second pile of events from another search onto the end. Where field names overlap it "reuses" them, otherwise the appended results just go into new fields with new names (for instance, if both searches produce a field called "count", those values end up in the same column).

The idea is to create a search that pulls out all the items you want in the first transaction and build that transaction out of them, then append a second search that pulls out all the items you want in the second transaction and build that transaction out of them. The two searches are completely independent of one another and can have the same or different contents. So, in your case, something like the below (which is not exactly complete, but more of a general idea - give the techniques a try, see how far you get, and ask for more help if you hit a specific stumbling block).

eventtype=UpdateWorkerSystem1 
| transaction id_worker_sys1 startswith="PATCH /api/workers/" maxspan=100s connected=t 
| append [search eventtype=UpdateWorkerSystem2 OR (eventtype=UpdateWorkerSystem1 ..OtherSpecialInclusionCriteriaHere...)
| eval workerID=coalesce("2a".id_worker_sys2a, "2b".id_worker_sys2b) 
| transaction workerID startswith="yourCriteria" maxspan=100s connected=t ]
| ... other stuff here
  1. The first line is our base search where we ONLY include the events we want in one of our transactions.
  2. Line two creates that transaction.
  3. Line three appends data, starting from a search that pulls in all events you want in the second transaction. You should be able to add in anything you want here, though keep in mind you'll have to squeeze it into the transaction somehow.
  4. The next line creates a single field to build the transaction from: a "workerID" field that prepends "2a" or "2b" to the corresponding id_worker_sys2 field, so the ids end up in a single field you can build the transaction out of while still staying distinct, so they won't overlap. I included this as an example in case you need it - the same technique may be useful for squeezing your "outlier" event into the transaction (see the sketch after this list).
  5. Then we create a transaction out of that second set of events and close the append.
  6. Then... do whatever you want. 🙂
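
Here's a rough sketch of the key-building idea from point 4, spelled out with case() and isnotnull() (both standard eval functions). The field names come from your example; which id "wins" when an event carries more than one of them is simply the order of the case() branches, and the id_worker_sys1 branch is only there to show how you could fold other fields into the same key if you ever need to:

| eval workerID=case(isnotnull(id_worker_sys2a), "2a".id_worker_sys2a, isnotnull(id_worker_sys2b), "2b".id_worker_sys2b, isnotnull(id_worker_sys1), "1".id_worker_sys1)
| transaction workerID startswith="yourCriteria" maxspan=100s connected=t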

A tip on developing the two searches (the base and the appended): they are entirely separate, so you can develop each one on its own, and when you have each right, just paste one into an | append [search ...] on the other. That makes it much easier to get the data right in more complex examples. And if something looks odd on one side, there's no problem with copying the appended search into a new search and running it by itself.
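
For example (just a skeleton of that workflow, reusing the placeholder criteria from above), you could develop these two searches separately:

eventtype=UpdateWorkerSystem1 
| transaction id_worker_sys1 startswith="PATCH /api/workers/" maxspan=100s connected=t 

eventtype=UpdateWorkerSystem2 
| eval workerID=coalesce("2a".id_worker_sys2a, "2b".id_worker_sys2b) 
| transaction workerID startswith="yourCriteria" maxspan=100s connected=t 

and once each one returns the transactions you expect, paste the second one into an | append [search ...] at the end of the first.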


jplumsdaine22
Influencer

Presumably you thought of this already, but just in case you have control over the logs...

Any chance you can genericise your fields? I.e. instead of

id_worker_sys1=2
id_worker_sys1=1
id_worker_sys2=1
id_worker_sys2a=1

log the following
id_worker_sys=1.2
id_worker_sys=1.1
id_worker_sys=2.1
id_worker_sys=2a.1

Then you could just use | stats values(*) by id_worker_sys instead of transaction. Or better still, log the id_sys and id_worker fields separately.
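
For example, assuming the logs were changed so that a single generic id_worker_sys field exists (this is only a sketch of the idea, not something that will run against the current logs):

eventtype=UpdateWorkerSystem1 OR eventtype=UpdateWorkerSystem2 
| stats values(*) as * earliest(_time) as start latest(_time) as end by id_worker_sys 
| eval duration=end-start

The earliest/latest pair gives you roughly the same span information the transaction command would.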

hettervik
Builder

Thanks! Your solution would have worked perfectly, but unfortunately I don't have control over the logs. Our long term goal is indeed to standardize as many logs as possible, though there are a lot of different stakeholders with a lot of different systems involved in our Splunk environment, so changes take time.

I'll keep your answer in mind when we've got a more standardized set of logs.


jplumsdaine22
Influencer

No worries - rich7177's answer is the way to go then!



Richfez
SplunkTrust

And see jplumsdaine22's answer as well - if it can be done, it's probably a very good way to make this situation, and the data in general, easier and better.

hettervik
Builder

Thanks a lot! I'd started looking at the append command before your answer as well, and it seems like the way to go. The way you described the use of the append command in your example worked flawlessly for the alert I'm creating.

I encountered a new problem when appending a third search to the alert. The searches are so long that my server can't handle all three of them combined into one search three times as long. Fortunately it turned out that the third search wasn't needed, just the other two, and now everything works just fine.

Still, the two searches are very similar, and it seems kind of unnecessary to run both of them separately. I'm sure there must be a better, less resource-demanding way. I'll look more into optimizing the alert when I have more time on my hands.

Best regards!


Richfez
SplunkTrust

Hmm.

I think I understand the question, but I'm not sure: could you post a dozen events where this could be used and the desired results? That may make it a little easier to follow what it is you want.


hettervik
Builder

Okay, here are some events (I hope this doesn't get out of hand):

1. eventtype=UpdateWorkerSystem1 id_worker_sys1=1
2. eventtype=UpdateWorkerSystem2 id_worker_sys2a=1
3. eventtype=UpdateWorkerSystem1 eventtype=UpdateWorkerSystem2 id_worker_sys1=1  id_worker_sys2a=1
4. eventtype=UpdateWorkerSystem1 id_worker_sys1=1
5. eventtype=UpdateWorkerSystem1 id_worker_sys1=2
6. eventtype=UpdateWorkerSystem2 id_worker_sys2a=1 id_worker_sys2b=1
7. eventtype=UpdateWorkerSystem2 id_worker_sys2b=1
8. eventtype=UpdateWorkerSystem1 id_worker_sys1=1
9. eventtype=UpdateWorkerSystem1 id_worker_sys1=2

With my transaction expression from above I would like to get the following transactions:

Transaction 1:
1. eventtype=UpdateWorkerSystem1 id_worker_sys1=1
3. eventtype=UpdateWorkerSystem1 eventtype=UpdateWorkerSystem2 id_worker_sys1=1 id_worker_sys2a=1
4. eventtype=UpdateWorkerSystem1 id_worker_sys1=1
8. eventtype=UpdateWorkerSystem1 id_worker_sys1=1

Transaction 2:
2. eventtype=UpdateWorkerSystem2 id_worker_sys2a=1
3. eventtype=UpdateWorkerSystem1 eventtype=UpdateWorkerSystem2 id_worker_sys1=1 id_worker_sys2a=1
6. eventtype=UpdateWorkerSystem2 id_worker_sys2a=1 id_worker_sys2b=1
7. eventtype=UpdateWorkerSystem2 id_worker_sys2b=1

Transaction 3:
5. eventtype=UpdateWorkerSystem1 id_worker_sys1=2
9. eventtype=UpdateWorkerSystem1 id_worker_sys1=2

Note that with the "XOR", events 6 and 7 aren't included in transaction 1, only events with id_worker_sys1. Since transaction 1 started with an id_worker_sys1, all events in transaction 1 have to have the field id_worker_sys1.

Transaction 2 goes from having events with id_worker_sys2a to events with id_worker_sys2b. That is my intention with the brackets in the transaction expression above.

I realize now that I have another dilemma; is it possible for an event (in this case event 3) to be grouped in two transactions (in this case transaction 1 and transaction 2) in the same search?
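
For reference, here is a rough sketch of how the append approach from the accepted answer maps onto these sample events (field names as above, with the startswith criteria left out for brevity). Because the two branches are built independently, event 3 matches both searches (it carries both eventtypes) and can therefore end up in both transactions; the 2a-to-2b handoff between events 6 and 7 would still need to be handled in how workerID is built:

eventtype=UpdateWorkerSystem1 
| transaction id_worker_sys1 maxspan=100s connected=t 
| append [search eventtype=UpdateWorkerSystem2 
| eval workerID=coalesce("2a".id_worker_sys2a, "2b".id_worker_sys2b) 
| transaction workerID maxspan=100s connected=t]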
