High-performance SQLite to MySQL database migration tool with performance optimizations, robust error handling, and automated post-migration scripts.
- Performance optimized: Batch inserts with temporary check disabling
- Real-time feedback: Live progress monitoring during migration
- Execution time tracking: Duration display for each migrated table
- Smart error handling: Detailed logging created only when necessary
- Prerequisites validation: Pre-migration requirement checks
- Connection compression: Uses MySQL compression when available
- Transaction safety: Automatic rollback on failure
- Selective migration: Migrate specific tables, multiple tables, or from file lists
- Type optimization: Analyzes real data to optimize VARCHAR/CHAR types
- Strict mode: Exact sizing for controlled data (no growth expected)
- Configurable fields: Customizable list of indexable fields
- Sync verification: Compare record counts between SQLite and MySQL
- Bulk comparison: Analyze all tables simultaneously
- Database analysis: Detailed overview of SQLite and MySQL databases
- Post-migration scripts: Automated execution of custom indexes, views, and procedures
- Automation mode: --force option for scripts and automation
- Exit codes: Specific exit codes for script integration
- DBMS compatibility: Supports MySQL and MariaDB
- Built-in help: -h command to see all parameters
- Flexibility: Option to migrate schema-only or complete data
pip install mysql-connector-python
python migrator.py -h

python -m venv venv
source venv/bin/activate  # Linux/Mac
# or
venv\Scripts\activate  # Windows
pip install mysql-connector-python

Run the command to generate an example file:
python migrator.py --create-config

Rename mysql_config.ini.example to mysql_config.ini and configure:
[mysql]
host = localhost
port = 3306
user = root
password = your_password
database = database_name
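As a minimal sketch (illustrative only, not the migrator's exact code), this configuration can be consumed with configparser and mysql-connector-python, including the protocol compression mentioned in the features:

```python
# Sketch: load mysql_config.ini and open a compressed MySQL connection.
# Illustrative only; assumes the [mysql] section shown above.
import configparser
import mysql.connector

def connect_from_config(path="mysql_config.ini"):
    cfg = configparser.ConfigParser()
    cfg.read(path)
    section = cfg["mysql"]
    return mysql.connector.connect(
        host=section.get("host", "localhost"),
        port=int(section.get("port", "3306")),
        user=section["user"],
        password=section["password"],
        database=section["database"],
        compress=True,  # MySQL/MariaDB protocol compression
    )
```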
On first run with --analyze-types, indexable_fields.txt will be created:

# Indexable Fields Configuration File
#
# List of field names that should be optimized for indexes
# One field per line
# Lines starting with # are comments
# Identification fields
id
codigo
cod
code
# Your project-specific fields:
product_code
client_ref
order_num
The system automatically creates template files for custom scripts:
- post_migration.sql - Main script (configurations, optimizations)
- indexes.sql - Custom indexes
- views.sql - System views
- procedures.sql - Stored procedures and functions
# Show all available parameters
python migrator.py -h
python migrator.py --help

python migrator.py my_database.db

python migrator.py my_database.db --schema-only

# Migrate single table
python migrator.py my_database.db --table users
# Migrate multiple tables (comma-separated)
python migrator.py my_database.db --table users,products,orders
# Migrate schema-only for multiple tables
python migrator.py my_database.db --table products,categories --schema-only

# Analyze data to optimize types (conservative mode)
python migrator.py my_database.db --table users --analyze-types
# Strict mode: exact sizing for controlled/stable data
python migrator.py my_database.db --table users --analyze-types --strict-sizing
# Example analysis output:
# Processing table: users
# Loaded 25 indexable fields from indexable_fields.txt
# Analyzing indexable field: email
# Data: 1000 records, size 12-45, CONSERVATIVE mode, suggested: 67
# Analyzing indexable field: code
# Data: 250 records, size 3-3, FIXED mode, suggested: 4
# 1,250 records migrated

# Migration with automatic scripts (default)
python migrator.py my_database.db --table users
# Skip post-migration script execution
python migrator.py my_database.db --table users --skip-post-scripts
# Example script execution:
# Executing post-migration scripts...
# Executing indexes.sql...
# 3/3 statements executed
# indexes.sql executed successfully
# Executing views.sql...
# 2/2 statements executed
# views.sql executed successfully
# Scripts executed: indexes.sql, views.sql

# Compare records for specific table
python migrator.py my_database.db --table-info users
# Example output:
# Table information: users
# --------------------------------------------------
# SQLite: 1,250 records
# MySQL:  1,180 records
# SQLite has 70 more records

# Compare all tables
python migrator.py my_database.db --table-compare

# Example output:
# SQLite vs MySQL table comparison
# ================================================================================
# Table              SQLite          MySQL           Status
# --------------------------------------------------------------------------------
# ESTABLISHMENTS     65,168,570      65,168,570      Synchronized
# COMPANIES          62,085,952      62,080,000      +5,952 in SQLite
# SIMPLE             42,772,819      N/A             Does not exist
# --------------------------------------------------------------------------------
# Summary: 3 table(s) analyzed
# Synchronized: 1
# Different: 1
# Not in MySQL: 1
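The comparison above comes down to a COUNT(*) on each side of the migration. A rough standalone equivalent (a sketch assuming open sqlite3 and mysql-connector connections, not the tool's internal code) looks like:

```python
# Sketch: compare record counts for one table between SQLite and MySQL.
# Assumes `sqlite_conn` (sqlite3) and `mysql_conn` (mysql-connector) are open.
def compare_counts(sqlite_conn, mysql_conn, table):
    sqlite_count = sqlite_conn.execute(
        f'SELECT COUNT(*) FROM "{table}"'
    ).fetchone()[0]
    cursor = mysql_conn.cursor()
    try:
        cursor.execute(f"SELECT COUNT(*) FROM `{table}`")
        mysql_count = cursor.fetchone()[0]
    except Exception:
        mysql_count = None  # table missing on the MySQL side
    finally:
        cursor.close()
    return sqlite_count, mysql_count
```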
# Show SQLite overview
python migrator.py my_database.db --info-sqlite
# Example output:
# SQLite general information
# ============================================================
# Table Records Columns
# --------------------------------------------------
# users 25,340 8
# products 12,890 12
# orders 8,750 15
# --------------------------------------------------
# TOTAL 52,376 3 tables
# File: my_database.db (127.45 MB)
# Show MySQL overview
python migrator.py my_database.db --info-mysql
# Example output:
# MySQL general information
# ============================================================
# Table Records Total MB Data MB Index MB
# ----------------------------------------------------------------------
# users 25,340 45.67 42.12 3.55
# products 12,890 23.45 20.11 3.34
# orders 8,750 15.23 13.89 1.34
# ----------------------------------------------------------------------
# TOTAL 47,136 84.47
# Database: my_app | Version: 11.6.2-MariaDB

# Replace existing tables automatically
python migrator.py my_database.db --force
python migrator.py my_database.db -f
# Combine with other options
python migrator.py my_database.db --table users --force

# Custom configuration file
python migrator.py my_database.db --config /path/to/config.ini
# Custom batch size
python migrator.py my_database.db --batch-size 5000
# Combining multiple options
python migrator.py my_database.db --schema-only --config prod_config.ini --force

After a successful migration, the system automatically:
- Checks for table-specific .sql files based on migrated tables
- Analyzes if they contain valid SQL commands (not just comments)
- Executes only relevant scripts for the migrated tables
- Falls back to general scripts only for complete migrations
- Reports execution results
| Pattern | Example | Executed When |
|---|---|---|
| `indexes_[table].sql` | `indexes_companies.sql` | COMPANIES table is migrated |
| `[table]_indexes.sql` | `companies_indexes.sql` | COMPANIES table is migrated |
| `views_[table].sql` | `views_companies.sql` | COMPANIES table is migrated |
| `[table]_views.sql` | `companies_views.sql` | COMPANIES table is migrated |
| `procedures_[table].sql` | `procedures_companies.sql` | COMPANIES table is migrated |
| `[table]_procedures.sql` | `companies_procedures.sql` | COMPANIES table is migrated |
| File | Purpose | Executed When |
|---|---|---|
| `post_migration.sql` | General configurations, optimizations | Complete migration (>5 tables) |
| `indexes.sql` | General indexes | Complete migration (>5 tables) |
| `views.sql` | General views | Complete migration (>5 tables) |
| `procedures.sql` | General procedures and functions | Complete migration (>5 tables) |
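A simplified sketch of this selection rule (illustrative only; the file patterns and the ">5 tables" threshold follow the tables above, but this is not the migrator's exact code):

```python
# Sketch: decide which post-migration scripts apply to a migration run.
# Follows the naming patterns and the ">5 tables => general scripts" rule above.
import os

GENERAL_SCRIPTS = ["post_migration.sql", "indexes.sql", "views.sql", "procedures.sql"]

def select_scripts(migrated_tables, threshold=5):
    if len(migrated_tables) > threshold:
        # complete migration: run the general scripts
        return [s for s in GENERAL_SCRIPTS if os.path.isfile(s)]
    scripts = []
    for table in migrated_tables:
        name = table.lower()
        for kind in ("indexes", "views", "procedures"):
            for candidate in (f"{kind}_{name}.sql", f"{name}_{kind}.sql"):
                if os.path.isfile(candidate):
                    scripts.append(candidate)
    return scripts
```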
python migrator.py database.db --table COMPANIES --force
Executes: Only indexes_companies.sql (if exists)

python migrator.py database.db --table COMPANIES,ESTABLISHMENTS --force
Executes: indexes_companies.sql + indexes_establishments.sql

python migrator.py database.db --force
Executes: General scripts (indexes.sql, views.sql, etc.)
On first run, if no files are found, the system creates templates with examples:
python migrator.py my_database.db --table COMPANIES
# If no scripts exist, automatically creates:
# Example files created: post_migration.sql, indexes.sql, views.sql, procedures.sql, indexes_companies.sql
# For table-specific scripts, use patterns:
# - indexes_[table].sql or [table]_indexes.sql
# - views_[table].sql or [table]_views.sql
# - procedures_[table].sql or [table]_procedures.sql

-- Table-specific indexes for COMPANIES
-- Executed only when COMPANIES table is migrated
-- Unique index on COD field (correlation key)
CREATE UNIQUE INDEX IF NOT EXISTS idx_companies_cod ON COMPANIES(COD);
-- Indexes for frequent queries
CREATE INDEX IF NOT EXISTS idx_companies_status ON COMPANIES(status);
CREATE INDEX IF NOT EXISTS idx_companies_type ON COMPANIES(company_type);
-- Composite indexes for reports
CREATE INDEX IF NOT EXISTS idx_companies_status_type ON COMPANIES(status, company_type);

-- Table-specific indexes for ESTABLISHMENTS
-- Executed only when ESTABLISHMENTS table is migrated
-- Unique index on COD field (correlation key)
CREATE UNIQUE INDEX IF NOT EXISTS idx_establishments_cod ON ESTABLISHMENTS(COD);
-- Geographic location indexes (very frequent queries)
CREATE INDEX IF NOT EXISTS idx_establishments_state ON ESTABLISHMENTS(state);
CREATE INDEX IF NOT EXISTS idx_establishments_city ON ESTABLISHMENTS(city_code);
CREATE INDEX IF NOT EXISTS idx_establishments_zip ON ESTABLISHMENTS(zip_code);
-- Correlation with other tables
CREATE INDEX IF NOT EXISTS idx_establishments_company ON ESTABLISHMENTS(company_cod);
CREATE INDEX IF NOT EXISTS idx_establishments_activity ON ESTABLISHMENTS(main_activity_code);

# Execute scripts automatically (default)
python migrator.py my_database.db
# Skip script execution
python migrator.py my_database.db --skip-post-scripts
# Table-specific execution (only relevant scripts)
python migrator.py my_database.db --table COMPANIES --force
# Output: Executing post-migration scripts...
# Executing indexes_companies.sql...
# indexes_companies.sql executed successfully

# Multiple tables (executes scripts for each table)
python migrator.py my_database.db --table COMPANIES,PRODUCTS --force
# Output: Executing post-migration scripts...
# Executing indexes_companies.sql...
# Executing indexes_products.sql...
# Scripts executed: indexes_companies.sql, indexes_products.sql

# In case of script error, asks to continue:
# Executing indexes_companies.sql...
# Error in statement 2: Table 'companies' doesn't exist
# SQL: CREATE INDEX idx_company_name ON companies(name)...
# Continue executing indexes_companies.sql? (y/N):

# OLD BEHAVIOR: Always executes all scripts
python migrator.py gov_data.db --table SMALL_TABLE --force
# Would execute indexes for ALL tables (slow!)

# NEW BEHAVIOR: Only relevant scripts
python migrator.py gov_data.db --table SMALL_TABLE --force
# Executes only indexes_small_table.sql (fast!)

# Migrate only tax classification table (small)
python migrator.py revenue.db --table TAX_CODES --force
# Executes: indexes_tax_codes.sql (< 1 second)
# Skips: indexes for COMPANIES table (would take hours!)
# Migrate companies table specifically
python migrator.py revenue.db --table COMPANIES --force
# Executes: indexes_companies.sql (optimized for this table only)
# Complete migration
python migrator.py revenue.db --force
# Executes: General scripts for all tables

| Parameter | Description | Default |
|---|---|---|
| `sqlite_db` | Path to SQLite database (required) | - |
| `-h, --help` | Show help with all parameters | - |
| `--config` | MySQL configuration file | mysql_config.ini |
| `--schema-only` | Migrate schema only, no data | False |
| `-f, --force` | No confirmations, always replace existing tables | False |
| `--table` | Migrate specific table(s); accepts a comma-separated list | None |
| `--table-info` | Show comparative information for a specific table | None |
| `--table-compare` | Compare all tables between SQLite and MySQL | False |
| `--info-sqlite` | Show SQLite database general information | False |
| `--info-mysql` | Show MySQL database general information | False |
| `--analyze-types` | Analyze real data to optimize column types | False |
| `--strict-sizing` | Use exact sizes without safety margin (controlled data only) | False |
| `--batch-size` | Batch size for insertion | 1000 |
| `--create-config` | Create example configuration file | - |
| `--skip-post-scripts` | Skip post-migration script (.sql) execution | False |
- Verify SQLite file existence
- Test connectivity with both databases
- Validate configuration file
- List all SQLite tables
- Check MySQL conflicts (existing tables)
- Map SQLite → MySQL data types
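The exact mapping lives in map_sqlite_to_mysql_type(); as an illustration only (the tool's real table may choose different target types), a SQLite-affinity-to-MySQL mapping typically looks like:

```python
# Illustrative SQLite -> MySQL type mapping; the tool's actual
# map_sqlite_to_mysql_type() may choose different target types.
def map_sqlite_to_mysql_type(sqlite_type):
    declared = (sqlite_type or "").upper()
    if "INT" in declared:
        return "BIGINT"
    if any(token in declared for token in ("REAL", "FLOA", "DOUB")):
        return "DOUBLE"
    if any(token in declared for token in ("DEC", "NUM")):
        return "DECIMAL(20,6)"
    if "BLOB" in declared:
        return "LONGBLOB"
    # TEXT and unknown declarations stay TEXT unless --analyze-types refines them
    return "TEXT"
```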
During data insertion, the system (see the sketch after this list):
- Disables FOREIGN_KEY_CHECKS
- Disables UNIQUE_CHECKS
- Uses manual transactions (AUTOCOMMIT = 0)
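A minimal sketch of that insert path (assuming a mysql-connector connection and rows already read from SQLite; not the tool's exact implementation):

```python
# Sketch: batched INSERT with checks disabled and a manual transaction.
# `mysql_conn` is an open mysql-connector connection, `rows` an iterable of tuples.
def bulk_insert(mysql_conn, table, columns, rows, batch_size=1000):
    cursor = mysql_conn.cursor()
    placeholders = ", ".join(["%s"] * len(columns))
    column_list = ", ".join(f"`{c}`" for c in columns)
    sql = f"INSERT INTO `{table}` ({column_list}) VALUES ({placeholders})"
    cursor.execute("SET FOREIGN_KEY_CHECKS = 0")
    cursor.execute("SET UNIQUE_CHECKS = 0")
    cursor.execute("SET AUTOCOMMIT = 0")
    try:
        batch = []
        for row in rows:
            batch.append(row)
            if len(batch) >= batch_size:
                cursor.executemany(sql, batch)
                batch = []
        if batch:
            cursor.executemany(sql, batch)
        mysql_conn.commit()
    except Exception:
        mysql_conn.rollback()  # automatic rollback on failure
        raise
    finally:
        cursor.execute("SET UNIQUE_CHECKS = 1")
        cursor.execute("SET FOREIGN_KEY_CHECKS = 1")
        cursor.close()
```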
- Create table structure (columns, PKs, indexes)
- Migrate data in configurable batches
- Restore default MySQL settings
- Check for valid .sql files
- Execute custom indexes
- Create views and procedures
- Apply specific optimizations
The program can analyze real data in SQLite columns to suggest more efficient MySQL types:
- No optimization: All TEXT columns remain as TEXT
- With --analyze-types: Analyzes only "indexable" fields defined in indexable_fields.txt
- With --strict-sizing: Uses near-exact sizes (ideal for controlled data)
| Mode | Recommended use | Example |
|---|---|---|
| Default | Fast migration | TEXT → TEXT |
| Conservative | Data that may grow | TEXT (3 chars) → VARCHAR(20) |
| Strict | Controlled/stable data | TEXT (3 chars) → CHAR(4) |
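An illustrative sketch of that sizing decision (the margins and thresholds here are assumptions, not necessarily the migrator's exact numbers):

```python
# Sketch: suggest a MySQL type for a TEXT column from observed value lengths.
# Margins and thresholds are illustrative assumptions.
def suggest_type(lengths, strict=False):
    if not lengths:
        return "TEXT"
    min_len, max_len = min(lengths), max(lengths)
    if strict:
        # strict sizing: fixed-width data gets CHAR, otherwise a tight VARCHAR
        if min_len == max_len:
            return f"CHAR({max_len + 1})"
        return f"VARCHAR({max_len})"
    # conservative mode: leave headroom for data that may grow
    suggested = max(int(max_len * 1.5), max_len + 10)
    return f"VARCHAR({suggested})" if suggested <= 255 else "TEXT"
```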
Edit indexable_fields.txt to define which fields should be analyzed:
# Fields that typically have indexes
id
code
cod
email
ssn
tax_id
# Your specific fields
product_code
customer_ref
establishment_code
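Parsing this file is straightforward; a load_indexable_fields()-style reader could look like the following sketch (illustrative, matching the comment and blank-line rules described above):

```python
# Sketch: read indexable_fields.txt, ignoring blank lines and '#' comments.
def load_indexable_fields(path="indexable_fields.txt"):
    fields = set()
    try:
        with open(path, encoding="utf-8") as f:
            for line in f:
                name = line.strip()
                if name and not name.startswith("#"):
                    fields.add(name.lower())
    except FileNotFoundError:
        pass  # first run: the tool creates a template instead
    return fields
```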
Example with "code" field (3 digits, 1M records):
| Type | Index space | SELECT speed | RAM usage |
|---|---|---|---|
| `TEXT` | ~45MB | 850ms | High |
| `VARCHAR(100)` | ~8MB | 120ms | Medium |
| `CHAR(4)` | ~1MB | 25ms | Low |
Gain: up to 35x faster in indexed queries!
During execution, the system displays:
- Prerequisites validation status
- Progress per table
- Processed record counters
- Post-migration script execution
- Completion confirmations
Errors are automatically logged only when they occur in:
migration_errors_YYYYMMDD_HHMMSS.log
Smart logging system (see the sketch below):
- No errors: No log file is created
- With errors: Detailed log is saved and reported to the user
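A minimal sketch of this lazy pattern (illustrative, not the tool's exact logger): the log file is opened only when the first error is actually recorded.

```python
# Sketch: create the error log file only when the first error is recorded.
from datetime import datetime

class LazyErrorLog:
    def __init__(self):
        self._file = None

    def error(self, message):
        if self._file is None:
            name = datetime.now().strftime("migration_errors_%Y%m%d_%H%M%S.log")
            self._file = open(name, "w", encoding="utf-8")
        self._file.write(message + "\n")
        self._file.flush()

    def close(self):
        if self._file is not None:  # no errors -> no file was ever created
            self._file.close()
```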
- Batch insertion: Process records in groups (default: 1000)
- Check disabling: Remove unnecessary validations during insertion
- Compression: Reduce MySQL network traffic
- Transactions: Minimize disk I/O
- Custom scripts: Specific indexes and optimizations post-migration
- Pre-validation: Avoid failures during process
- Automatic rollback: Undo changes on error
- Conflict handling: Option to overwrite existing tables
- Structure preservation: Maintain PKs, FKs, and indexes
- Controlled execution: Scripts with individual error handling
# Small database, fast migration
python migrator.py dev_app.db --batch-size 500

# Large database, no confirmations for automation
python migrator.py prod_app.db --batch-size 10000 --config prod_mysql.ini --force

# 1. Check status of all important tables
python migrator.py app.db --table-info users
python migrator.py app.db --table-info products
python migrator.py app.db --table-info orders
# 2. Migrate only outdated tables (without asking)
python migrator.py app.db --table users --force
python migrator.py app.db --table products --force
# 3. Confirm synchronization
python migrator.py app.db --table-info users

# 1. Basic migration (creates general templates)
python migrator.py app.db --table users
# 2. Create table-specific script
cat > indexes_users.sql << EOF
-- Table-specific indexes for users table
CREATE UNIQUE INDEX IF NOT EXISTS idx_users_cod ON users(COD);
CREATE INDEX IF NOT EXISTS idx_users_email ON users(email);
CREATE INDEX IF NOT EXISTS idx_users_status_date ON users(status, created_at);
EOF
# 3. Execute migration with table-specific script
python migrator.py app.db --table users --force
# Expected output:
# Executing post-migration scripts...
# Executing indexes_users.sql...
# 3/3 statements executed
# indexes_users.sql executed successfully
# Scripts executed: indexes_users.sql
# 4. Migrate different table (won't execute users scripts)
python migrator.py app.db --table products --force
# Only executes indexes_products.sql (if exists)

# 1. Create table-specific scripts for critical tables
cat > indexes_companies.sql << EOF
CREATE UNIQUE INDEX IF NOT EXISTS idx_companies_cod ON COMPANIES(COD);
CREATE INDEX IF NOT EXISTS idx_companies_status ON COMPANIES(status);
EOF
cat > indexes_establishments.sql << EOF
CREATE UNIQUE INDEX IF NOT EXISTS idx_establishments_cod ON ESTABLISHMENTS(COD);
CREATE INDEX IF NOT EXISTS idx_establishments_state ON ESTABLISHMENTS(state);
CREATE INDEX IF NOT EXISTS idx_establishments_city ON ESTABLISHMENTS(city_code);
EOF
# 2. Migrate small/medium tables with specific optimizations
python migrator.py revenue.db --table TAX_CODES,CITIES,STATES --force
# Fast execution: only relevant scripts
# 3. Migrate large tables individually with their specific indexes
python migrator.py revenue.db --table COMPANIES --force
# Executes: indexes_companies.sql only
python migrator.py revenue.db --table ESTABLISHMENTS --force
# Executes: indexes_establishments.sql only
# 4. Complete migration for remaining tables
python migrator.py revenue.db --force
# Executes: General scripts for any remaining tables

# 1. Analyze SQLite database before migration
python migrator.py app.db --info-sqlite
# 2. Check current MySQL state
python migrator.py app.db --info-mysql
# 3. Compare all tables at once
python migrator.py app.db --table-compare
# 4. Check specific tables
python migrator.py app.db --table-info users
python migrator.py app.db --table-info products
# 5. Migrate with type optimization (conservative mode)
python migrator.py app.db --table users,products --analyze-types --force
# 6. Migrate with strict optimization (controlled data)
python migrator.py app.db --table BRANCH_OFFICE --analyze-types --strict-sizing --force

# 1. First migration (creates general + specific templates)
python migrator.py app.db --table users
# 2. Customize table-specific indexes for performance
cat > indexes_users.sql << EOF
-- Indexes for frequent queries on users table
CREATE UNIQUE INDEX IF NOT EXISTS idx_users_cod ON users(COD);
CREATE INDEX IF NOT EXISTS idx_users_email ON users(email);
CREATE INDEX IF NOT EXISTS idx_users_status_date ON users(status, created_at);
CREATE INDEX IF NOT EXISTS idx_users_company ON users(company_id, active);
EOF
# 3. Create specific views for users table
cat > views_users.sql << EOF
-- Views specific to users table
CREATE VIEW v_active_users AS
SELECT u.*,
COUNT(o.id) as total_orders,
SUM(o.amount) as total_amount
FROM users u
LEFT JOIN orders o ON u.COD = o.user_cod
WHERE u.active = 1
GROUP BY u.COD;
EOF
# 4. Execute migration with table-specific scripts
python migrator.py app.db --table users --force
# Output:
# Executing post-migration scripts...
# Executing indexes_users.sql...
# 3/3 statements executed
# indexes_users.sql executed successfully
# Executing views_users.sql...
# 1/1 statements executed
# views_users.sql executed successfully
# Scripts executed: indexes_users.sql, views_users.sql
# 5. Migrate different table (independent scripts)
python migrator.py app.db --table products --force
# Only executes product-specific scripts (if they exist)
# 6. Check results
python migrator.py app.db --info-mysql

# During migration, each table shows elapsed time:
# Processing table: users
# Loaded 25 indexable fields from indexable_fields.txt
# Analyzing indexable field: email
# Data: 50000 records, size 12-45, CONSERVATIVE mode, suggested: 68
# Creating structure...
# Migrating 50,000 records...
# 50,000 records migrated
# Table users completed (15s)
#
# Executing post-migration scripts...
# Executing indexes.sql...
# 3/3 statements executed
# indexes.sql executed successfully
# Scripts executed: indexes.sql
#
# Migration completed successfully!

# Nightly complete migration with scripts
python migrator.py backup_$(date +%Y%m%d).db --force --batch-size 5000
# Optimized migration for government data
python migrator.py gov_data.db --table ESTABLISHMENTS,COMPANIES --analyze-types --strict-sizing --force
# Daily status report
echo "=== SQLite ===" > report.txt
python migrator.py app.db --info-sqlite >> report.txt
echo "=== MySQL ===" >> report.txt
python migrator.py app.db --info-mysql >> report.txt
echo "=== Comparison ===" >> report.txt
python migrator.py app.db --table-compare >> report.txt

python migrator.py --create-config
# Edit mysql_config.ini with your credentials

- Check credentials in the .ini file
- Confirm MySQL server is running
- Test network connectivity
- Increase --batch-size for large databases
- Check if compression is enabled
- Monitor MySQL server resources
# View error details in log
cat migration_errors_*.log
# Test script separately
mysql -u user -p database < indexes.sql
# Skip scripts temporarily
python migrator.py app.db --skip-post-scripts

By default, the system asks before replacing existing tables:
python migrator.py app.db
# Tables already exist in MySQL: users, products
# Continue anyway? (y/N):

Use --force for automation:
python migrator.py app.db --force
# Tables already exist in MySQL: users, products
# Force mode active: replacing tables automatically

python migrator.py app.db --table wrong_name
# Table(s) not found: wrong_name
# Available tables: users, products, orders

python migrator.py app.db --table-info users
# Possible outputs:
# Tables synchronized
# SQLite has 70 more records
# Table needs migration

- Foreign Keys: Created only in initial structure, not recreated after optimization
- Triggers: Not migrated (DBMS-specific)
- Views: Migrated via custom post-migration scripts
- Procedures/Functions: Migrated via custom post-migration scripts
migrator.py
├── SQLiteToMySQLMigrator
│   ├── __init__()                           # Initial configuration
│   ├── validate_prerequisites()             # Pre-migration validation
│   ├── connect_databases()                  # Establish connections
│   ├── get_sqlite_tables()                  # List SQLite tables
│   ├── parse_table_parameter()              # Process --table parameter
│   ├── get_table_schema()                   # Extract table structure
│   ├── map_sqlite_to_mysql_type()           # Map data types
│   ├── load_indexable_fields()              # Load indexable fields
│   ├── analyze_text_column()                # Analyze columns for optimization
│   ├── create_mysql_table()                 # Create table in MySQL
│   ├── optimize_mysql_for_insert()          # Performance optimizations
│   ├── migrate_table_data()                 # Migrate data in batches
│   ├── execute_post_migration_scripts()     # Execute .sql scripts
│   ├── has_valid_sql_content()              # Check valid SQL content
│   ├── execute_sql_script()                 # Execute individual script
│   ├── create_sample_post_migration_files() # Create templates
│   ├── show_table_info()                    # Compare records between databases
│   ├── show_sqlite_info()                   # SQLite general information
│   ├── show_mysql_info()                    # MySQL general information
│   ├── compare_all_tables()                 # Compare all tables
│   ├── format_duration()                    # Format execution time
│   └── migrate()                            # Main orchestrator
└── main()                                   # CLI interface
To contribute or customize:
- Clone the repository
- Install dependencies: pip install mysql-connector-python
- Run tests with sample databases
- Submit PRs with improvements
The program returns specific codes for different situations, enabling script integration:
| Code | Situation |
|---|---|
| 0 | Success |
| 1 | General migration error |
| 2 | SQLite file not found |
| 3 | MySQL connection error |
| 4 | Unexpected error |
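The same codes can also be checked from Python automation; a hypothetical wrapper using subprocess (not part of the tool itself), shown alongside the shell examples below:

```python
# Sketch: run the migrator from Python and branch on its exit code.
import subprocess

MESSAGES = {
    0: "Migration completed successfully",
    1: "General migration error",
    2: "SQLite file not found",
    3: "MySQL connection error",
    4: "Unexpected error",
}

result = subprocess.run(
    ["python", "migrator.py", "database.db", "--table", "users", "--force"]
)
print(MESSAGES.get(result.returncode, "Unknown exit code"))
```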
Windows Batch:
python migrator.py database.db --table users
if %ERRORLEVEL% EQU 0 (
echo Migration completed successfully
) else if %ERRORLEVEL% EQU 2 (
echo File not found
) else if %ERRORLEVEL% EQU 3 (
echo MySQL connection error
) else (
echo Migration error
)

Linux/Mac Bash:
python migrator.py database.db --table users
case $? in
0) echo "Migration completed successfully" ;;
2) echo "File not found" ;;
3) echo "MySQL connection error" ;;
*) echo "Migration error" ;;
esac

PowerShell:
python migrator.py database.db --table users
switch ($LASTEXITCODE) {
0 { Write-Host "Migration completed successfully" }
2 { Write-Host "File not found" }
3 { Write-Host "MySQL connection error" }
default { Write-Host "Migration error" }
}

- Table-specific post-migration scripts: Intelligent system for executing scripts only for migrated tables
- Performance optimization: Avoid unnecessary script execution on large tables (1+ billion records)
- Smart script selection: Automatically chooses table-specific vs. general scripts based on migration scope
- Lazy logging: Log files created only when there are actual errors
- Enhanced templates: Creation of both general and table-specific example files
- --skip-post-scripts parameter: Optional control to skip script execution
- indexes_[table].sql / [table]_indexes.sql - Table-specific indexes
- views_[table].sql / [table]_views.sql - Table-specific views
- procedures_[table].sql / [table]_procedures.sql - Table-specific procedures

- post_migration.sql - General configurations and optimizations
- indexes.sql - General indexes for multiple tables
- views.sql - General views for reports and complex queries
- procedures.sql - General stored procedures and functions
- Single table migration → Execute only table-specific scripts
- Multiple table migration → Execute scripts for each migrated table
- Complete migration (>5 tables) → Execute general scripts
- No relevant scripts → Silent execution (no unnecessary messages)
- Smart system: Log created only when necessary
- Automatic cleanup: Removes empty files on successful executions
- Clean feedback: No unnecessary messages when there are no errors
- Massive improvement for large databases with selective table migration
- Eliminates script execution on irrelevant tables
- Optimizes CI/CD pipelines with frequent single-table updates
- Reduces migration time from hours to seconds for specific table updates
# OLD: Migrating 1 small table executed scripts for ALL tables
python migrator.py huge_db.db --table CODES --force
# Previously: 30+ minutes (executed scripts for billion-record tables)

# NEW: Only relevant scripts executed
python migrator.py huge_db.db --table CODES --force
# Now: < 1 minute (only codes-specific scripts)

- Development: Table-specific scripts facilitate rapid iteration
- Production: Large tables get optimized indexes without affecting others
- CI/CD: Selective deployments execute only relevant optimizations
- Maintenance: Independent table updates with isolated script execution
- Government databases: Tax records, company registrations (billion+ records)
- E-commerce platforms: Product catalogs, user data, transaction logs
- Analytics systems: Event tracking, user behavior, reporting tables
- Legacy migrations: Gradual migration of large legacy systems
MIT License
Copyright (c) 2025 Vailton Renato
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.