Deployment Architecture

Is it possible to limit bucket replication with limits.conf?

mudragada
Path Finder

We recently set up a multisite cluster with replication between the sites.
The bucket replication is causing network congestion. Is there a way to limit this using something like limits.conf?


sowings
Splunk Employee

No. Splunk's data streaming and "fixup" activity in the case of a failure are designed to return the cluster to a healthy state as soon as possible. For "live data" streaming (that is, hot buckets), the source indexer sends a copy of each data slice (~128 KB by default) to as many peers as required to meet the replication factor. When downtime or another event requires "fixing" the cluster, you can throttle the number of jobs (that is, the number of simultaneous attempts to copy the data), but not the bandwidth consumed.
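As a minimal sketch of what "throttling the number of jobs" looks like, here are the fixup/replication concurrency settings in the [clustering] stanza of server.conf on the manager (master) node. The values below are illustrative only, and you should confirm the setting names and defaults against the server.conf.spec for your version:

[clustering]
mode = master
# Maximum concurrent bucket-fixup (make-searchable) jobs per peer.
max_peer_build_load = 2
# Maximum concurrent replications a peer can take part in as a target.
max_peer_rep_load = 5

Lowering these numbers reduces how many copy jobs run at once, which can smooth out bursts of fixup traffic, but it does not cap the bytes per second sent across the WAN.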

Unfortunately, if you're counting bytes on the WAN, you may not be ready for multi-site clustering.
