A minimal, production-ready database container setup using Docker Compose, modeling a simplified publishing domain with MySQL and a custom Python CLI migration tool.
- Docker Desktop (running)
- Python 3.9+ (for local development)
- Git
git clone <repository-url>
cd db_container_sample_1
# Generate secure environment file
python setup-environment.py
# Start the database
docker-compose up -d
# Initialize migration system
python db-migrate.py init
# Apply initial schema
python db-migrate.py migrate
# Seed with sample data
python db-migrate.py seed
# Validate database schema
python db-migrate.py validate
# Check migration status
python db-migrate.py status
- SQL Database: MySQL 8.0 container with persistent storage
- Three Core Entities: Author, Book, Publisher with proper relationships
- Schema Migration: Custom Python CLI tool for database schema management
- Data Seeding: Realistic sample data with referential integrity
- Validation: Comprehensive schema and data integrity validation
- Many-to-Many Relationships: Books ↔ Genres with junction table
- Migration Tracking: Version-controlled schema changes
- Error Handling: Robust error recovery and validation
- Security: Environment-based credential management
- Containerization: Fully containerized with Docker Compose
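How the Migration Tracking feature above records applied migrations is internal to the CLI tool and not documented in this README. A common pattern, shown here purely as an assumption (the table and column names are hypothetical), is a small bookkeeping table that `init` creates and that `migrate`/`status` consult:

```sql
-- Hypothetical bookkeeping table; the tool's real table/column names may differ.
CREATE TABLE IF NOT EXISTS schema_migrations (
    version     VARCHAR(50)  PRIMARY KEY,              -- e.g. '001'
    description VARCHAR(255) NOT NULL,                 -- e.g. 'create_schema'
    applied_at  TIMESTAMP    DEFAULT CURRENT_TIMESTAMP
);

-- After running sql/migrations/001_create_schema.sql, a row records it as applied:
INSERT INTO schema_migrations (version, description)
VALUES ('001', 'create_schema');
```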
# Initialize migration system
python db-migrate.py init
# Apply pending migrations
python db-migrate.py migrate
# Show migration status
python db-migrate.py status
# Validate database schema
python db-migrate.py validate
# Reset database (clears all data)
python db-migrate.py reset
# Reset and reseed in one command
python db-migrate.py reset --reseed
# Seed database with sample data
python db-migrate.py seed
# Force overwrite existing data
python db-migrate.py seed --force
# Create new table via migration
python db-migrate.py add-table --name "table_name" --columns "col1,col2,col3" --datatypes "INT,VARCHAR(100),TEXT"
# Example: Create genre table
python db-migrate.py add-table --name "genre" --columns "id,name,description" --datatypes "INT,VARCHAR(100),TEXT"
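The README does not show the SQL that `add-table` generates. Based only on the `--columns` and `--datatypes` arguments in the example above, the resulting migration file would plausibly contain DDL along these lines (the key, the `IF NOT EXISTS` guard, and any extra bookkeeping columns are assumptions):

```sql
-- Sketch only: column names and types come from --columns/--datatypes;
-- everything else is an assumption about the generated output.
CREATE TABLE IF NOT EXISTS genre (
    id          INT AUTO_INCREMENT PRIMARY KEY,
    name        VARCHAR(100),
    description TEXT
);
```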
db_container_sample_1/
├── docker-compose.yml            # Main orchestration
├── Dockerfile                    # Migration service container
├── requirements.txt              # Python dependencies
├── .env.example                  # Environment template
├── README.md                     # This file
│
├── python/                       # Python application code
│   ├── db_migrate.py             # Main CLI tool
│   ├── utils/
│   │   └── db_connection.py      # Database connection utility
│   └── scripts/
│       ├── setup_environment.py  # Environment setup
│       └── secure_migrate.py     # Secure migration runner
│
├── sql/                          # SQL files and migrations
│   ├── migrations/               # Schema migration files
│   │   ├── 001_create_schema.sql
│   │   ├── 003_add_genre_table.sql
│   │   └── 004_add_book_genres_table.sql
│   └── seeds/
│       └── sample_data.sql       # Sample data
│
├── config/                       # Configuration files
│   └── database.yaml
│
└── Docs/                         # Documentation
    ├── ProblemDefinition.md
    ├── FeasibilityStudy.md
    └── ImplementationPlan.md
authors:
- id (INT, PRIMARY KEY, AUTO_INCREMENT)
- name (VARCHAR(255), NOT NULL)
- country (VARCHAR(100), NOT NULL)
- created_at (TIMESTAMP)
- updated_at (TIMESTAMP)

publishers:
- id (INT, PRIMARY KEY, AUTO_INCREMENT)
- name (VARCHAR(255), NOT NULL)
- city (VARCHAR(100), NOT NULL)
- created_at (TIMESTAMP)
- updated_at (TIMESTAMP)

books:
- id (INT, PRIMARY KEY, AUTO_INCREMENT)
- title (VARCHAR(500), NOT NULL)
- publication_year (INT, CHECK constraint: 1000-2030)
- author_id (INT, FOREIGN KEY → authors.id)
- publisher_id (INT, FOREIGN KEY → publishers.id)
- created_at (TIMESTAMP)
- updated_at (TIMESTAMP)

genre:
- id (INT, PRIMARY KEY, AUTO_INCREMENT)
- name (VARCHAR(100), NOT NULL, UNIQUE)
- description (TEXT)
- created_at (TIMESTAMP)
- updated_at (TIMESTAMP)

book_genres:
- book_id (INT, FOREIGN KEY → books.id)
- genre_id (INT, FOREIGN KEY → genre.id)
- created_at (TIMESTAMP)
- PRIMARY KEY (book_id, genre_id)
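Put together, the column descriptions above map to DDL roughly like the following for the two most constrained tables, books and book_genres. This is a sketch for orientation only; the authoritative definitions are the files in `sql/migrations/`, and the TIMESTAMP defaults shown here are assumptions.

```sql
-- Sketch of the books and book_genres tables described above.
-- Assumes authors, publishers, and genre already exist.
CREATE TABLE books (
    id               INT AUTO_INCREMENT PRIMARY KEY,
    title            VARCHAR(500) NOT NULL,
    publication_year INT CHECK (publication_year BETWEEN 1000 AND 2030),
    author_id        INT,
    publisher_id     INT,
    created_at       TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    updated_at       TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
    FOREIGN KEY (author_id)    REFERENCES authors (id),
    FOREIGN KEY (publisher_id) REFERENCES publishers (id)
);

CREATE TABLE book_genres (
    book_id    INT NOT NULL,
    genre_id   INT NOT NULL,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    PRIMARY KEY (book_id, genre_id),
    FOREIGN KEY (book_id)  REFERENCES books (id),
    FOREIGN KEY (genre_id) REFERENCES genre (id)
);
```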
Create a `.env` file from `.env.example`:
# Database Configuration
MYSQL_ROOT_PASSWORD=your_secure_root_password_here
MYSQL_DATABASE=publishing_db
MYSQL_USER=app_user
MYSQL_PASSWORD=your_secure_app_password_here
# Application Configuration
DB_HOST=mysql
DB_PORT=3306
DB_NAME=publishing_db
DB_USER=app_user
DB_PASSWORD=your_secure_app_password_here
The `config/database.yaml` file contains database connection settings and migration parameters.
# Comprehensive schema validation
python db-migrate.py validate
The system includes realistic sample data:
- 10 Authors from different countries
- 8 Publishers from various cities
- 32 Books with proper relationships
- 8 Genres with many-to-many relationships
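The actual rows live in `sql/seeds/sample_data.sql`. The sketch below only illustrates the referential-integrity pattern the seed data follows; the names, values, and subquery style are illustrative and not taken from the real seed file.

```sql
-- Illustrative only; the real sample data is sql/seeds/sample_data.sql.
INSERT INTO authors    (name, country)     VALUES ('Jane Example', 'Canada');
INSERT INTO publishers (name, city)        VALUES ('Example House', 'Toronto');
INSERT INTO genre      (name, description) VALUES ('Fiction', 'Narrative prose');

-- Books reference existing authors and publishers by id...
INSERT INTO books (title, publication_year, author_id, publisher_id)
VALUES ('A Sample Novel', 2001,
        (SELECT id FROM authors    WHERE name = 'Jane Example'),
        (SELECT id FROM publishers WHERE name = 'Example House'));

-- ...and book_genres links books to genres (many-to-many).
INSERT INTO book_genres (book_id, genre_id)
VALUES ((SELECT id FROM books WHERE title = 'A Sample Novel'),
        (SELECT id FROM genre WHERE name = 'Fiction'));
```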
# Check if Docker Desktop is running
docker ps
# Start the database
docker-compose up -d
# Check container status
docker-compose ps
# View container logs
docker-compose logs mysql
# Check migration status
python db-migrate.py status
# Reset and start fresh
python db-migrate.py reset --reseed
# Make scripts executable (Linux/Mac)
chmod +x db-migrate.py setup-environment.py secure-migrate.py
# Stop containers
docker-compose down
# Remove volumes (WARNING: deletes all data)
docker-compose down -v
# Start fresh
docker-compose up -d
python db-migrate.py init
python db-migrate.py migrate
python db-migrate.py seed
- Environment Variables: All credentials stored in the `.env` file
- Secure Passwords: Auto-generated strong passwords
- Database Isolation: Containerized database with network isolation
- No Hardcoded Credentials: All sensitive data externalized
# Create migration for new table
python db-migrate.py add-table --name "new_table" --columns "id,name" --datatypes "INT,VARCHAR(100)"
# Apply migration
python db-migrate.py migrate
To add seed data:
- Create a SQL file in `sql/seeds/`
- Use `python db-migrate.py seed` to apply

To add a migration:
- Create a SQL file in `sql/migrations/` using the format `XXX_description.sql` (a sketch of such a file follows below)
- Use `python db-migrate.py migrate` to apply
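For example, a hypothetical next migration following this convention could look like the sketch below; the file name and its contents are illustrative only and are not part of this repository.

```sql
-- sql/migrations/005_add_books_isbn.sql  (hypothetical file, not in this repo)
-- One forward-only change per file, named XXX_description.sql.
ALTER TABLE books
    ADD COLUMN isbn VARCHAR(20) NULL;
```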
- SQL-based database (MySQL)
- Three entities: Author, Book, Publisher
- Initial schema with relationships
- Seed process for sample data
- Schema migration capability
- Docker Compose orchestration
- Predictable and repeatable deployments
- Schema validation process
- Fully containerized solution
- Comprehensive error handling
This project is part of a database containerization demonstration.
This is a demonstration project. For production use, consider:
- Adding rollback capabilities
- Implementing backup/restore
- Adding monitoring and logging
- Creating CI/CD pipelines
Built with ❤️ using MySQL, Python, and Docker