This blog describes how to add an arbiter to an existing MongoDB replica set. Arbiters are MongoDB instances whose primary role is to participate in replica set elections in order to break ties and select a PRIMARY. An arbiter does not hold any data and has minimal resource requirements; it does not need dedicated hardware to run. However, it is advisable to run the arbiter on a server other than the replica set members, such as an application server or a monitoring server. Further details can be found in the MongoDB documentation on replica set arbiters.
Configure Arbiter Data Storage
An arbiter does not store data. However, that does not prevent the mongod instance from starting up with a set of data files and a full-fledged journal. To minimize the data created by default, the following settings need to be made in the MongoDB configuration file, /etc/mongod.conf. For the WiredTiger storage engine, open the configuration file and set storage.journal.enabled to false. The required change in the conf file looks like this:
storage:
  dbPath: /path/to/mongo/data/directory
  journal:
    enabled: false
For the MMAPv1 storage engine, the configuration would look like this:
storage:
  dbPath: /path/to/mongo/data/directory
  mmapv1:
    smallFiles: true
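In case you are not sure which storage engine your deployment is using, it can be checked from the mongo shell on any running member. The snippet below is just a quick check; the field comes from the standard serverStatus output.

// From the mongo shell on any running MongoDB instance
db.serverStatus().storageEngine.name   // returns "wiredTiger" or "mmapv1"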
Create a Custom Data Directory for Arbiter
For the arbiter to store its data files, a separate data directory is recommended. Create a data directory such as /data/arb.
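Below is a minimal sketch of creating that directory on a Linux server, assuming mongod will run as the mongod user; adjust the path and the user/group to match your installation.

# Create the arbiter data directory and hand it over to the mongod user
sudo mkdir -p /data/arb
sudo chown -R mongod:mongod /data/arb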
Start Arbiter Instance
Start the mongod instance with the custom data directory and the --replSet option, passing the name of the replica set that the existing members are running with.
sudo mongod --dbpath /path/to/mongo/arb/data/directory --replSet vflux01
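Alternatively, the same options can be kept in the arbiter's configuration file and the instance started with the --config flag. The sketch below assumes the replica set name vflux01 and the /data/arb directory created earlier; adapt both to your setup.

# /etc/mongod.conf on the arbiter host
storage:
  dbPath: /data/arb
  journal:
    enabled: false
replication:
  replSetName: vflux01

# Start the arbiter using the configuration file
sudo mongod --config /etc/mongod.conf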
Add Arbiter to ReplicaSet
Log in to the primary MongoDB server. Add the arbiter to the replica set using the following command:
rs.addArb("ip.address.mongodb:27017");
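Once the command succeeds, the arbiter should show up in the replica set status. The helpers below are standard mongo shell commands run on the primary; look for the new member in state ARBITER, with arbiterOnly set to true in the configuration.

// Run on the primary in the mongo shell
rs.status();   // the new member should report stateStr: "ARBITER"
rs.conf();     // the new member's entry should have arbiterOnly: true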
The rs.addArb command is described in the MongoDB documentation on adding an arbiter to an existing replica set.