Alerting

Can I set up email groups/aliases for multiple recipients in some clever way?

fatsug
Contributor

I am fairly confident that there is a clever workaround for this though I am not 100% sure how.

I have alerts stored in apps on a deployer which make use of the email action when triggered. If I need to add or remove recipients from an email alert, I have to manually edit several different recipient lists for several different alerts.

What I want is a clever way to set up some sort of "list" of recipients, which I could name "devops" for instance, so that instead of having 20 email addresses as recipients in the alert I could use something like "$devops$". Then I could edit the recipients in a single location for all alerts instead of in each one separately.

I hope this is a clear enough explanation for what I am hoping is possible and welcome all suggestions.

 


fatsug
Contributor

Wild idea, maybe I can do this with a macro definition?

I'll play around with it and see if it works.


PickleRick
SplunkTrust

No. Macros are expanded at search time, not while the results are being processed, so a macro used directly as the email action's recipient list won't work.

You can use result-based tokens.

https://docs.splunk.com/Documentation/Splunk/9.1.2/Alert/EmailNotificationTokens

EDIT: OK, so you can use a macro or a lookup to generate a recipients field in the search results, and then use that field as a token in the given alert setting.
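For example, a minimal sketch of the alert side, using the result tokens from the docs linked below (the stanza name is made up, the addresses are the ones from this thread):

# savedsearches.conf - sketch; "example_alert" is a hypothetical name
[example_alert]
search = index="_internal" earliest="-1s@s" latest="@s" | eval recipients="employee1@mail.se,employee2@mail.se"
action.email = 1
action.email.to = $result.recipients$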

fatsug
Contributor

It was easy enough to test the theory by just using an eval to set the recipients list:

index="_internal" earliest="-1s@s" latest="@s" 
| eval recipients="employee1@mail.se,employee2@mail.se"

Then using the $result.recipients$ token for the alert was no problem. However, trying to get the field assigned using a macro was not so easy.

So I figured I'd try the lookup approach. This works, but as I'm not strictly doing a "lookup" I had some problems getting it to work. The best I came up with was this:

| inputlookup email-recipients.csv
| eval recipients = 'email-adresses'
| append [search index="_internal" earliest="-1s@s" latest="@s" | head 1]

"email-adresses" is a comma separated list of email adresses in the lookup (just like above, one long line). Though here I have to lead with "inputlookup" which makes the "search" part a bit less aesthetically pleasing and the append produces some really nasty looking output.

I have "a" solution, though as I am often "doing it wrong" I thought I'd ask if there were any suggestions on how to improve in this solution.


PickleRick
SplunkTrust

If you had

| eval recipients="employee1@mail.se,employee2@mail.se"

working, why couldn't you just make a macro containing the whole

eval recipients="employee1@mail.se,employee2@mail.se"

expression?

This way you'd just do

<yoursearch>
| `your_addresses_macro`

And be done with it?
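For reference, a sketch of what that macro definition could look like in macros.conf (the macro name is just the example from above):

# macros.conf - example definition of the recipient macro
[your_addresses_macro]
definition = eval recipients="employee1@mail.se,employee2@mail.se"
iseval = 0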

fatsug
Contributor

Sorry for bumping this thread, I got this working just fine. I can run a search with the macro and get the email.

However, when I roll out the alert and macro from our git repo I get recipients="". The macro is there and looks correct, but when the scheduled search runs and should send an email, the content of the macro comes back empty.

The solution is still correct; I'm probably just missing something with permissions.
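If it is permissions, one guess is that the alert and the macro end up in different app or sharing contexts, so the macro may need to be exported more widely - along these lines in the app's metadata (an untested sketch, adjust to your app):

# metadata/default.meta - assumed sharing fix
[macros/your_addresses_macro]
export = system
access = read : [ * ]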


fatsug
Contributor

The 'polite' way to put it would likely be "limited experience with macros", the straight one would be "stupidity" 🙂

I'll invest a little more time into this before I settle on a solution. Thanks for the honest feedback!


PickleRick
SplunkTrust

Don't worry. Everyone starts somewhere 🙂

Glad I could help.


fatsug
Contributor

Yeah, straight up using macros did not work 😅 neither with sendemail nor with the alert action, so 100% correct.

I had not noticed the result tokens, and this will be a h*ck of a workaround. Though if I understand the suggestion correctly, I could maintain a macros.conf as a "central" "distribution list", either per app or globally, using macro definitions (the format looks OK at least) to generate a field in the alert searches containing the recipient list. Then use the result token $result.recipients$ to actually populate the recipients with the list of email addresses from the macro definition, roughly as sketched below.
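A sketch of the full wiring as I understand it (untested; "devops" is the list name from my original question and the stanza names are made up):

# macros.conf - central recipient list, one place to edit
[devops]
definition = eval recipients="employee1@mail.se,employee2@mail.se"

# savedsearches.conf - an alert consuming the macro via the result token
[example_alert]
search = index="_internal" earliest="-1s@s" latest="@s" | head 1 | `devops`
action.email = 1
action.email.to = $result.recipients$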

I'll give this the old college try and push it to the testing environment tomorrow. Thank you and fingers crossed.


PickleRick
SplunkTrust

While there are probably solutions within Splunk itself, I suppose the easiest solution to manage would be to create distribution lists in your company email system and simply manage the recipients of the reports through membership in those lists.


fatsug
Contributor

Yes, 100% agreed, and I have tried to do this, though for some reason the "splunk" sender was not allowed to send to distribution lists, and using group inboxes would not achieve the desired outcome.


PickleRick
SplunkTrust

That sounds like a problem with your email system, which should be taken up with your mail admins 🙂


fatsug
Contributor

I think it was actually a problem with the "security" angle, though I can't remember. So I'll keep my fingers crossed for some creative suggestions 😉
