version 2.0
A two-way migrator (supports Elasticsearch --> MongoDB & MongoDB --> Elasticsearch)
- Migrates in both directions: Elasticsearch --> MongoDB and MongoDB --> Elasticsearch
- No external queues/message brokers needed
- Resumes from the custom built-in `mongoes_id` checkpoint in case of data-transfer failures.
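The checkpoint-based resume can be sketched roughly like this. The `mongoes_id` field name comes from this README; the `resume_from_checkpoint` helper and the filtering logic are illustrative assumptions, not the tool's actual internals:

```python
# Sketch of checkpoint-based resume. Assumes each document carries a
# monotonically increasing `mongoes_id` and that the last successfully
# committed id is persisted somewhere between runs.

def resume_from_checkpoint(docs, last_mongoes_id):
    """Yield only the documents that come after the stored checkpoint.

    `last_mongoes_id` is the last id committed successfully;
    None means start from the beginning.
    """
    for doc in docs:
        if last_mongoes_id is None or doc["mongoes_id"] > last_mongoes_id:
            yield doc

# Example: a transfer that failed after id 2 resumes at id 3.
docs = [{"mongoes_id": i, "line": f"entry {i}"} for i in range(1, 6)]
remaining = list(resume_from_checkpoint(docs, last_mongoes_id=2))
```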
- Python 3.8
- elasticsearch (Python client)
- pymongo
- Clone the repository.
- Initialize a virtual environment and install all the prerequisites:
$ python3 -m venv .env
$ source .env/bin/activate
$ pip install -r requirements.txt
- Edit the mongoes.json file according to your requirements.
If Elasticsearch is your source (EXTRACT is the source, COMMIT is the destination):
{
    "COMMIT": {
        "HOST": "localhost",
        "DATABASE": "shakespeare",
        "COLLECTION": "shakespeare",
        "USER": "mongobongo",
        "PASSWORD": "passopasso",
        "DBENGINE": "mongo",
        "PORT": 27017,
        "SSL": false
    },
    "EXTRACT": {
        "HOST": "localhost",
        "INDEX": "shakespeare4",
        "USER": "elastic",
        "PASSWORD": "passopasso",
        "DBENGINE": "elasticsearch",
        "PORT": 9200,
        "PROTOCOL": "http"
    },
    "SETTINGS": {
        "FREQUENCY": 10000
    }
}
or, if MongoDB is your source:
{
    "COMMIT": {
        "HOST": "localhost",
        "INDEX": "shakespeare4",
        "USER": "elastic",
        "PASSWORD": "passopasso",
        "DBENGINE": "elasticsearch",
        "PORT": 9200,
        "PROTOCOL": "http"
    },
    "EXTRACT": {
        "HOST": "localhost",
        "DATABASE": "shakespeare",
        "COLLECTION": "shakespeare",
        "USER": "mongobongo",
        "PASSWORD": "passopasso",
        "DBENGINE": "mongo",
        "PORT": 27017,
        "SSL": false
    },
    "SETTINGS": {
        "FREQUENCY": 10000
    }
}
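Before running, it can help to sanity-check that mongoes.json has the sections and keys the examples above use. This is a hedged sketch; the `load_config` helper and the exact set of required keys are assumptions based on the samples in this README, not part of the tool:

```python
import json

# Keys common to both directions, mirroring the sample configs above.
# Engine-specific keys (INDEX vs DATABASE/COLLECTION, PROTOCOL vs SSL)
# are not checked here for simplicity.
REQUIRED = {
    "COMMIT": {"HOST", "USER", "PASSWORD", "DBENGINE", "PORT"},
    "EXTRACT": {"HOST", "USER", "PASSWORD", "DBENGINE", "PORT"},
    "SETTINGS": {"FREQUENCY"},
}

def load_config(path="mongoes.json"):
    """Load mongoes.json and fail fast if a required key is missing."""
    with open(path) as fh:
        cfg = json.load(fh)
    for section, keys in REQUIRED.items():
        missing = keys - set(cfg.get(section, {}))
        if missing:
            raise ValueError(f"{section} is missing {sorted(missing)}")
    return cfg
```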
- Make sure that both the Elasticsearch and MongoDB services are up and running.
- Finally, fire up the migrator:
$ python3 __init__.py
- Sit back and relax; we've got you covered! By default, the migrator transfers 1000 entries per batch.
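The batched transfer can be pictured roughly like this. The `migrate` function and its `extract`/`commit` stand-ins are illustrative only; the real tool wires these to the Elasticsearch and MongoDB clients and uses the configured FREQUENCY as the batch size:

```python
# Sketch of a FREQUENCY-sized batch loop: pull documents from the
# source, buffer them, and flush each full batch to the destination.

def migrate(extract, commit, frequency=1000):
    """Move documents from `extract` (iterable) to `commit` (callable)
    in fixed-size batches; returns the number of documents moved."""
    batch = []
    moved = 0
    for doc in extract:
        batch.append(doc)
        if len(batch) == frequency:
            commit(batch)
            moved += len(batch)
            batch = []
    if batch:  # flush the final partial batch
        commit(batch)
        moved += len(batch)
    return moved
```

With 2500 source documents and a batch size of 1000, this yields two full batches and one final batch of 500.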
Happy Wrangling!!! :)