Copy all collections from one database to another on a separate server.
The workflow for this project is as follows:
- Export collections from source database as CSVs to the `backup` directory
- Start MongoDB container
- Import collections into destination database
After the first run, `entrypoint.sh` will no longer overwrite the existing data in the destination database. To overwrite it, run `docker-compose down --volumes` to remove the container(s) and volume(s), then run `docker-compose up -d` to start the container(s) again.
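The export step is performed by `export.py` (run in the steps below). As a rough illustration only, a minimal version of such a script might look like the following, assuming `pymongo` is used and one CSV is written per collection into the `backup` directory; the connection settings and variable names here are placeholders, not the project's actual code:

```python
# illustrative sketch of the export step; the real export.py may differ
import csv
import os

from pymongo import MongoClient

# Source database connection details -- names are assumptions, the real
# values come from .env
SOURCE_URI = os.environ.get("SOURCE_URI", "mongodb://localhost:27017")
SOURCE_DB = os.environ.get("SOURCE_DB", "source_db")
BACKUP_DIR = "backup"


def export_collections() -> None:
    client = MongoClient(SOURCE_URI)
    db = client[SOURCE_DB]
    os.makedirs(BACKUP_DIR, exist_ok=True)

    for name in db.list_collection_names():
        docs = list(db[name].find())
        if not docs:
            continue
        # Use the first document's keys as the CSV header; assumes a
        # reasonably uniform schema within each collection.
        fieldnames = list(docs[0].keys())
        path = os.path.join(BACKUP_DIR, f"{name}.csv")
        with open(path, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=fieldnames, extrasaction="ignore")
            writer.writeheader()
            writer.writerows(docs)


if __name__ == "__main__":
    export_collections()
```

The import side is then handled inside the container by `entrypoint.sh`, which loads these CSVs into the destination database on first start.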
- Copy `.env.example` to `.env` and update the environment variables
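The exact keys are defined in `.env.example`; the commands below reference `DB_HOST`, `PORT`, `DB_USER`, and `DB_PASS`, so an illustrative `.env` (placeholder values only) might look like:

```
# destination MongoDB connection details -- placeholder values
DB_HOST=localhost
PORT=27017
DB_USER=root
DB_PASS=changeme
```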
# setup virtual environment
python -m venv .venv
source .venv/bin/activate
# install dependencies
pip install -r requirements.txt
# export collections from source database
python export.py
# deactivate virtual environment
deactivate
# start container
docker-compose up -d
# connect to mongodb container
export $(grep -v '^#' .env | xargs)
mongosh --host "$DB_HOST" \
--port "$PORT" \
--username "$DB_USER" \
--password "$DB_PASS"
# show dbs
show dbs
# use destination database
use <DB_NAME>
# show collections
show collections
# run query
db.<COLLECTION_NAME>.find()
# quit
exit
# stop container(s)
docker-compose stop
# remove container(s)
docker-compose down

Issues
- Finish `main.py` functions to connect to MongoDB and handle exceptions based on source material (see the sketch below this list)
- Merge `copy_mongodb.py` into `main.py`
- Clean up documentation
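As a starting point for the first item, a connection helper along these lines would cover the basic failure cases. This is a sketch only, assuming `pymongo`; the environment variable name and final structure of `main.py` may differ:

```python
# illustrative sketch of connection handling for main.py; not the final implementation
import os
from typing import Optional

from pymongo import MongoClient
from pymongo.errors import ConnectionFailure, ServerSelectionTimeoutError


def get_client(uri: Optional[str] = None, timeout_ms: int = 5000) -> MongoClient:
    """Return a connected MongoClient or exit with a clear error message."""
    # MONGO_URI is an assumed variable name for illustration
    uri = uri or os.environ.get("MONGO_URI", "mongodb://localhost:27017")
    client = MongoClient(uri, serverSelectionTimeoutMS=timeout_ms)
    try:
        # "ping" forces a round trip so connection errors surface immediately
        client.admin.command("ping")
    except (ConnectionFailure, ServerSelectionTimeoutError) as exc:
        raise SystemExit(f"Could not connect to MongoDB at {uri}: {exc}")
    return client
```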
References
- bitnami/mongodb - Docker Image | Docker Hub
- Is there a sample MongoDB Database along the lines of world for MySql? - Stack Overflow