Deployment Architecture

Multiple search heads, config replication and dealing with scheduled searches (without pooling)

Glenn
Builder

Hi,

I'd like a system with multiple dedicated search heads, but for various reasons (avoiding added complexity, dependencies on NFS, pooling across a WAN, etc.) would prefer to avoid setting up search head pooling between them. One of the search heads will function as the primary, with the other as a backup in case the primary fails.

I need to replicate the $SPLUNK_HOME/etc/users and $SPLUNK_HOME/etc/apps directories between the search heads, and have seen mention of others using rsync for this. But how do you deal with scheduled searches in this situation?
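For the replication itself I'm picturing something along these lines, run from cron on the primary (the host name, paths and schedule below are just placeholders, not a tested setup):

    # crontab on the primary search head: push apps and user config to the backup
    # "backup-sh" is a placeholder host name; adjust paths and frequency to taste
    */5 * * * * rsync -a --delete -e ssh /opt/splunk/etc/apps/  backup-sh:/opt/splunk/etc/apps/
    */5 * * * * rsync -a --delete -e ssh /opt/splunk/etc/users/ backup-sh:/opt/splunk/etc/users/

Of course that copies savedsearches.conf along with everything else, which is exactly where my scheduled-searches question comes in.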

Users are free to set up their scheduled searches, alerts and even summary indexing. I do not want to duplicate these across both search heads.
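The only idea I've had so far is to let the saved searches replicate but stop the scheduler on the backup from running them, e.g. with default-mode.conf (untested on 4.3.1, so just a sketch):

    # $SPLUNK_HOME/etc/system/local/default-mode.conf on the BACKUP search head only
    # Disables the search scheduler so replicated scheduled searches, alerts and
    # summary indexing don't run twice; remove this (and restart) if the backup
    # is ever promoted to primary.
    [pipeline:scheduler]
    disabled = true

But I'm not sure whether that's the recommended way to handle it.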

Splunk version 4.3.1.

Cheers,

Glenn

premg
Engager

Hi,

We are trying something similar.

In our system we have search heads at two different sites, and we are using Splunk authentication (no LDAP). We want to create users on each search head and replicate them across all search heads. Is it possible to do this via the deployment server? What is the best/most optimized way of doing it?

Question posted: http://answers.splunk.com/answers/125959/user-replication-in-search-heads-at-two-different-sites


rmorlen
Splunk Employee

We have gone through this same issue. We went with pooling. The apps aren't really the issue, since they probably don't change very often. The user information, however, can change. (We have over 2,000 users and get about 400 unique user logins per day, so this was an issue for us.)

Scheduled searches can be a real headache if you don't get them under control. We set up a jobs server (basically a search head that users don't access directly). We disabled scheduled searches on the search heads that users access, so that scheduled jobs don't directly affect performance for interactive users; all scheduled searches run on the jobs server. We also turned off the ability for users to schedule searches and require them to come to us to get searches scheduled. A better option is for them to create a Splunk app that contains their scheduled searches; then we can use the deployment server to push out any updates they might have.
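For the "turn off the ability to schedule searches" part, the general shape (a sketch only; the role name and search filters here are made up, not our actual config) is a role in authorize.conf that is simply never granted the schedule_search capability:

    # $SPLUNK_HOME/etc/system/local/authorize.conf (sketch)
    # A search-only role: schedule_search is deliberately not granted, so members
    # can run ad-hoc searches but cannot create scheduled searches or alerts.
    # No importRoles, to avoid inheriting schedule_search from the default user role.
    [role_interactive_user]
    search = enabled
    srchIndexesAllowed = *
    srchIndexesDefault = main

Users who need something scheduled then hand us a search (or, better, an app) and it gets scheduled on the jobs server.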

Glenn
Builder

Thanks for sharing. A bit sad for me though... I have 200 Splunk users and a Unix sysadmin job to do most of the time, so I don't want to become a bottleneck for creating saved searches. The other thing is that our apps do change all the time: people save searches, create views, etc. every day. Anyway, good to know.
