Alerting

Email on scheduled alert trouble

derekleuridan
New Member

Hi guys and gals, I've been scratching my head on this one for days; I'm hoping to get some fresh eyes and opinions.

Three searches, all scheduled and set to email when results > 0.
Each search belongs to a different admin-roled user (and only admin-roled), so permissions should be identical.

Emails only get sent for searches created by the user "admin". Other users can create searches, which show in Alerts with proper counts and even appear in scheduler.log, but only admin's searches ever trigger emails.

scheduler.log entries for the searches:

04-24-2012 13:01:33.720 -0400 INFO SavedSplunker - savedsearch_id="derek;search;Alert Create Test Derek", user="derek", app="search", savedsearch_name="Alert Create Test Derek", status=success, digest_mode=1, scheduled_time=1335286800, dispatch_time=1335286889, run_time=2.528, result_count=26, alert_actions="email", sid="scheduler_derek_search_QWxlcnQgQ3JlYXRlIFRlc3QgRGVyZWs_at_1335286800_2d003f01685ac2cd", suppressed=0, thread_id="AlertNotifierWorker-0"

04-24-2012 13:01:39.960 -0400 INFO SavedSplunker - savedsearch_id="syed;search;Syed testing email", user="syed", app="search", savedsearch_name="Syed testing email", status=success, digest_mode=1, scheduled_time=1335286800, dispatch_time=1335286895, run_time=3.120, result_count=36, alert_actions="email", sid="scheduler_syed_search_U3llZCB0ZXN0aW5nIGVtYWls_at_1335286800_d9f615bc8f486b94", suppressed=0, thread_id="AlertNotifierWorker-2"

04-24-2012 13:00:31.242 -0400 INFO SavedSplunker - savedsearch_id="admin;search;Email Test", user="admin", app="search", savedsearch_name="Email Test", status=success, digest_mode=1, scheduled_time=1335286800, dispatch_time=1335286812, run_time=3.339, result_count=36, alert_actions="email", sid="scheduler_admin_search_RW1haWwgVGVzdA_at_1335286800_a7b1214726ce6405", suppressed=0, thread_id="AlertNotifierWorker-0"

python.log, where only the admin query is seen:

2012-04-24 13:00:28,714 DEBUG simpleRequest > GET https://127.0.0.1:8089/servicesNS/nobody/search/admin/alert_actions/email [] sessionSource=direct

2012-04-24 13:00:29,822 DEBUG simpleRequest < server responded status=200 responseTime=1.0920s
2012-04-24 13:00:29,822 DEBUG simpleRequest > GET https://127.0.0.1:8089/services/search/jobs/scheduler__admin__search_RW1haWwgVGVzdA_at_1335286800_a7... [] sessionSource=direct
2012-04-24 13:00:29,884 DEBUG simpleRequest < server responded status=200 responseTime=0.0470s
2012-04-24 13:00:29,884 DEBUG getStatus - elapsed=0.0620000362396 nextRetry=0.0500019066273
2012-04-24 13:00:30,525 INFO Sending email. subject="Splunk Alert owned by admin : Email Test", results_link="http://papp01splunk:8000/app/search/@go?sid=scheduler__admin__search_RW1haWwgVGVzdA_at_1335286800_a7...", recepients="['john@doe.com', 'jack@doe.com']"

What am I missing? What's the disconnect?


woodcock
Esteemed Legend

You did not share the searches, but I suspect that you are doing this:

sourcetype=MyEvents ...

If so, the solution is to specify the index in your search like this:

index=MyIndex sourcetype=MyEvents ...

Your admin user has more roles than the other users, and one of those roles has the important index in its "Indexes searched by default" setting, so his searches had results without having to specify an index. The other users, even though they have the admin role, lack this other crucial role.

This problem is the result of a very common but VERY bad habit. A user-level best practice is to always be as specific in your search query as possible and, to that end, always include index= and sourcetype= directives. Worse than a no-results situation is a wrong-results one, where you get one set of results but your boss gets a different set (because you are not in the same role and do not have the same "Indexes searched by default" setting).

There are two ways to preclude this problem. You can change this setting to "All non-internal indexes" in the user role, so that every new index is automatically included in non-index-specific searches without any extra administration. The better way is to set it to nothing (empty) for every role, thus forcing users to be habitually index-specific!
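For reference, the role setting described above lives in authorize.conf as srchIndexesDefault. A minimal sketch of the two approaches, with hypothetical role and file locations (your role names will differ):

```ini
# $SPLUNK_HOME/etc/system/local/authorize.conf
# Role names below are hypothetical, for illustration only.

# Option 1: search all non-internal indexes by default,
# so new indexes are picked up without extra administration.
[role_user]
srchIndexesDefault = *

# Option 2 (preferred per the answer above): no default indexes,
# forcing every search to name its index explicitly.
[role_power]
srchIndexesDefault =
```

You can inspect each role's effective value with `splunk btool authorize list --debug | grep srchIndexesDefault`, or in Splunk Web under Settings » Access controls » Roles.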
