Installation

Best way to copy the $SPLUNK_HOME/etc/apps/xx/local/ directories from an existing search head cluster (7.0.x) to a new search head cluster (7.3.0) using the deployer

kchaitanya
Explorer

Can someone suggest the best approach to follow while migrating the knowledge objects from an existing search head cluster running 7.0.x to a new search head cluster running 7.3.0? I am specifically looking for information on how to make use of the deployer_push_mode setting, or any other best practices to follow.

1 Solution

gjanders
SplunkTrust

@harsmarvania57 mentioned my script https://github.com/gjanders/Splunk/blob/master/bin/transfersplunkknowledgeobjects.py - it was built for exactly this purpose prior to 7.3.x. However, 7.3.x introduces deployer push modes.

As per:
https://docs.splunk.com/Documentation/Splunk/latest/DistSearch/PropagateSHCconfigurationchanges#Set_...

You could use "full" push mode to get the local directories over, so the process would be copy from existing search heads the default/local/metadata et cetera and get that onto your deployer.
Then a full push, then personally I'd wipe the local off the deployer and switch back to merge_to_default or your preferred push mode. Note that you can control push mode per app according to the documentation...
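A rough sketch of that sequence, assuming an app named xx, a default /opt/splunk install path, and placeholder hostnames/credentials (adjust everything to your environment):

    # On an existing 7.0.x search head: bundle the app's default, local and metadata layers
    tar -czf xx_config.tgz -C /opt/splunk/etc/apps/xx default local metadata

    # On the new 7.3.0 deployer: unpack into the configuration bundle for that app
    mkdir -p /opt/splunk/etc/shcluster/apps/xx
    tar -xzf xx_config.tgz -C /opt/splunk/etc/shcluster/apps/xx

    # Set the push mode for this app only, via app.conf in the deployer's copy of the app:
    #   [shclustering]
    #   deployer_push_mode = full
    printf '[shclustering]\ndeployer_push_mode = full\n' >> /opt/splunk/etc/shcluster/apps/xx/local/app.conf

    # Push the bundle to the new cluster (target any one SH member)
    /opt/splunk/bin/splunk apply shcluster-bundle -target https://newsh1:8089 -auth admin:changeme

    # Afterwards: delete the local layer from the deployer's copy of the app and set
    # deployer_push_mode back to merge_to_default (or your preferred mode) before the next push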

If you pushed the deployer bundle and left it on the default mode of merge_to_default, the files would end up in default on the search heads, and you cannot delete saved searches (or similar objects) that live in default from the search head UI - so obviously you want to avoid that scenario.

I'm making the assumption that you want identical config on the new cluster; if you want more filtering, either use my script or be selective about what you copy over to the deployer.
Note that my script has limitations due to the API - for example, it cannot copy a lookup file - so either way lookups involve the deployer.
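If it helps, a minimal sketch of getting an app's lookup files into the deployer bundle (assuming the same placeholder app name xx, a default /opt/splunk path, and a hypothetical search head hostname):

    # Copy the app's lookup CSVs from one of the existing search heads into the deployer's bundle
    scp -r searchhead01:/opt/splunk/etc/apps/xx/lookups /opt/splunk/etc/shcluster/apps/xx/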

Good luck


kchaitanya
Explorer

Thank you very much for the reply. Yes, I believe deployer_push_mode = full will work for the local directories, and afterwards we need to wipe them off the deployer, leaving an empty local directory, and switch back to the default push mode.

The default directories on the current SHs also have some KOs merged in. How can we get those onto the new SHC? Do you suggest taking the savedsearches.conf from the app's default directory on the existing SHC and placing it in the new deployer's shcluster/apps/xx/local/, to be pushed with the default mode?


gjanders
SplunkTrust

Well, if you are happy with the way it is now, you could just pull the default directory from the search heads and push it via the deployer; that way the new cluster ends up the same as it is now.

Alternatively, you could take the default/local from the deployer and then merge in the local from the search heads. This might take a little more effort, as you might have to actually merge the files rather than just copy them - a sketch follows.
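One way to get a merged starting point rather than hand-merging files is btool, run on an existing search head. This is only a sketch: savedsearches is just an example (repeat for the other .conf types you care about), it assumes the placeholder app name xx and a default /opt/splunk path, and you should review the output before pushing it, since btool prints the effective settings rather than a curated file.

    # Print the effective merged savedsearches settings for the app (default + local layers)
    /opt/splunk/bin/splunk btool savedsearches list --app=xx

    # Capture it as a starting point for the deployer bundle, then review by hand
    /opt/splunk/bin/splunk btool savedsearches list --app=xx > savedsearches.conf.merged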


harsmarvania57
Ultra Champion

Have a look at the python script https://github.com/gjanders/Splunk/blob/master/bin/transfersplunkknowledgeobjects.py, created by @gjanders. It will help you migrate knowledge objects from one instance to another using the REST API.
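For context, the script reads and writes knowledge objects through Splunk's REST API; a quick way to see the kind of data involved is to list an app's saved searches directly (hypothetical hostname and credentials, placeholder app name xx):

    # List all saved searches in the xx app context on an existing search head
    curl -k -u admin:changeme "https://oldsearchhead:8089/servicesNS/nobody/xx/saved/searches?count=0"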
