Alerting

Splunk alert email messages causing reject entries in /var/log/maillog

wrangler2x
Motivator

Every time a Splunk alert triggers and sends an email, two lines like this appear in the maillog:

Sep 29 16:00:51 abc.def.uci.edu sendmail[6208]: p8TN0pr1006208: ruleset=check_rcpt, arg1=<>, relay=abc.def.uci.edu [127.0.0.1], reject=553 5.0.0 <>... User address required
Sep 29 16:00:51 abc.def.uci.edu sendmail[6208]: p8TN0pr1006208: ruleset=check_rcpt, arg1=<>, relay=abc.def.uci.edu [127.0.0.1], reject=553 5.0.0 <>... User address required

There is nothing wrong with the email list in the alert, and the email does get delivered, but these errors occur every time. We do not see them with other methods of sending mail, such as piping a file to sendmail or using mutt. What is Splunk doing here?

1 Solution

sgarvin
Engager

This is a known bug and it will be fixed in the next release of Splunk. I don't have a time frame for that release at the moment.
Basically, the Python script sendemail.py calls the SMTP "rcpt to:" command twice without an email address.
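As a toy illustration (not Splunk's actual code), an empty recipient string turned into an SMTP envelope command produces exactly the "RCPT TO:<>" that sendmail rejects with 553:

```python
# Toy illustration (not Splunk's code): an empty recipient string
# becomes the envelope command that sendmail's check_rcpt rejects.
def rcpt_command(addr):
    return "RCPT TO:<%s>" % addr

print(rcpt_command("user@uci.edu"))  # RCPT TO:<user@uci.edu>
print(rcpt_command(""))              # RCPT TO:<> -> 553 5.0.0 <>... User address required
```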

wrangler2x
Motivator

I forgot to come back and mention that when we upgraded to 4.3 the problem went away.

wrangler2x
Motivator

For anyone reading this: the problem went away in 4.3.

wrangler2x
Motivator

I verified that no email distro lists have trailing commas. But I did find that python.log has numerous entries like this:

/var/log/splunk/python.log:2011-06-10 00:00:46,157 INFO Sending email. subject="Splunk Alert: Non-responding Forwarders", results_link="https://scrubbed_URL/app/search/@go?sid=scheduler__xyz__search_Tm9uLXJlcG9ydGluZyBob3N0cyBhbGVydA_at_1307689200_9af43686a33280c4", recepients="['xyz@uci.edu', 'nummod@uci.edu', 'evigchi@uci.edu', '', '']"

The Python script for sending email is appending two null recipient fields, which matches the two rejects I see in the system log. Definitely a bug.
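A defensive workaround (a sketch only, not Splunk's code) would be to drop the empty entries before handing the list to the mailer:

```python
# The recipient list as it appears in python.log, with the two empty entries
recipients = ['xyz@uci.edu', 'nummod@uci.edu', 'evigchi@uci.edu', '', '']

# Hypothetical defensive filter: keep only non-blank addresses
clean = [r for r in recipients if r.strip()]
print(clean)  # ['xyz@uci.edu', 'nummod@uci.edu', 'evigchi@uci.edu']
```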

dwaddle
SplunkTrust
SplunkTrust

My first guess would be that one of your alerts has an "empty" mailing address in it. I would check the various savedsearches.conf config files ( splunk cmd btool --debug savedsearches list can help, but does not cover all user-private saved searches ) for an action.email.to that has a comma at the end of the line. Something like:

action.email.to = user1@company.net,user2@company.net,

Reading Splunk's sendemail.py script, it's possible that a comma at the end results in a null recipient. If you do find this to be the case, remove the comma and restart splunkd to see if your alerts continue to have this issue. If that fixes it, I'd think a Splunk support ticket would be appropriate to get the trailing comma logged as a defect. (I do not KNOW it to be a defect at this time, but suspect it based on a cursory reading of the code.)
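The trailing-comma hypothesis is easy to check against the delimiter pattern sendemail.py uses (quoted elsewhere in this thread): a trailing separator makes the split emit an empty string.

```python
import re

# Delimiter pattern from Splunk's sendemail.py
EMAIL_DELIM = re.compile(r'\s*[,;]\s*')

to = "user1@company.net,user2@company.net,"   # hypothetical trailing comma
print(EMAIL_DELIM.split(to))
# ['user1@company.net', 'user2@company.net', ''] -- the '' becomes a null recipient
```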


UPDATE: Splunkweb's UI actively defends against this type of problem (in 4.2.3, anyway). I've not yet been able to test exactly what happens if a trailing comma winds up in the config files anyway.

wrangler2x
Motivator

The support case number is 68303

dwaddle
SplunkTrust
SplunkTrust

Yeah, probably time to open a support case. For the community's benefit, please update this with a new answer once you know something. Thanks!

dwaddle
SplunkTrust
SplunkTrust

Did you ever have any success finding more information about your problem here?

dwaddle
SplunkTrust
SplunkTrust

It might be worth firing up tcpdump for a few minutes and watching some of these alerts as they go off to sendmail, to see if you can spot where in the SMTP transaction things go off the rails.

wrangler2x
Motivator

Okay, still getting the rejects in /var/log/maillog after removing the semicolons. All the email lists in all the various action.email.to lines look okay. Any other ideas?

dwaddle
SplunkTrust
SplunkTrust

Semicolons should be fine, as should spaces in between. I'm looking at $SPLUNK_HOME/etc/apps/search/bin/sendemail.py:

EMAIL_DELIM = re.compile('\s*[,;]\s*')

When I look at this script, the 'to' logic is slightly different from the cc/bcc logic. But I don't think the difference really matters, because it's a difference of header vs. envelope, and sendmail calls check_rcpt for the envelope addresses.
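Given that pattern, commas and semicolons, with or without surrounding whitespace, all split the same way; a quick check:

```python
import re

# The EMAIL_DELIM pattern from sendemail.py quoted above
EMAIL_DELIM = re.compile(r'\s*[,;]\s*')

# Comma-space, semicolon, and bare comma all produce a clean split
print(EMAIL_DELIM.split("a@uci.edu, b@uci.edu;c@uci.edu"))
# ['a@uci.edu', 'b@uci.edu', 'c@uci.edu']
```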

wrangler2x
Motivator

I used this to look at the action.email.to lines after reading your note:

find . -name savedsearches.conf | xargs grep action.email.to | grep '@' | grep -v example | less

I did not find any trailing commas. But I did find two entries where one of our users separated two email addresses with a semicolon. I'm going to go fix that now and see if that makes a difference.

Also wondering: I've been separating email addresses with a comma followed by a space, but I notice that in the example you gave there is no space... is the space optional, or should I leave it out?
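Judging from the EMAIL_DELIM pattern quoted earlier in the thread, the space after the comma should be optional, since \s* consumes any whitespace around the separator:

```python
import re

# The EMAIL_DELIM pattern from sendemail.py, quoted earlier in the thread
EMAIL_DELIM = re.compile(r'\s*[,;]\s*')

# Space or no space after the comma makes no difference to the split
with_space = EMAIL_DELIM.split("a@uci.edu, b@uci.edu")
without_space = EMAIL_DELIM.split("a@uci.edu,b@uci.edu")
print(with_space == without_space)  # True
```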
