A minimalistic, menu-driven interface for PostgreSQL database operations without Django ORM. This tool provides safe and user-friendly access to common database tasks including data export, import, backup, and bulk operations.
Setup & Install:
python setup.py
Test Connection:
python test_connection.py
Note: To set up the database user, run sudo -u postgres psql, then:
CREATE USER username WITH PASSWORD 'password';
Run Application:
python main.py
# or, with an environment check:
python start.py
That's it! The setup script will guide you through configuration and dependency installation.
- Data Statistics: Get comprehensive database and table statistics
- Metadata Generation: Export detailed database schema information
- Database Backup: Create full or partial database backups
- Plain CSV Export: Export tables with ID references only
- Joined CSV Export: Export with related table data joined
- Selected Columns Export: Export specific columns only
- Filtered Exports: Export with category, date range, or combined filters
- Bulk Import: Import data from CSV with column mapping
- Data Validation: Validate data before import
- Update Existing: Option to update existing records
- Bulk Update: Update multiple records with filters
- Bulk Delete: Delete records with mandatory safety filters
- Transaction Safety: All operations use database transactions
- ⚠️ Critical Operation Warnings: Clear alerts for destructive operations
- Mandatory Confirmations: Required user confirmation for critical operations
- Operation Summaries: Detailed preview before execution
- Transaction Rollback: Automatic rollback on errors
- Impact Estimation: Preview of affected rows before execution
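The transaction-rollback behaviour boils down to a commit-on-success, rollback-on-error pattern. A minimal sketch follows; it uses sqlite3 from the standard library purely for illustration (the tool itself targets PostgreSQL), and the helper name `run_in_transaction` is hypothetical:

```python
import sqlite3

def run_in_transaction(conn, statements):
    """Execute statements atomically: commit on success, roll back on any error."""
    cur = conn.cursor()
    try:
        for sql, params in statements:
            cur.execute(sql, params)
        conn.commit()
        return True
    except Exception:
        conn.rollback()  # no partial changes survive a failure
        return False

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")

# The second insert fails (duplicate primary key), so the first is rolled back too.
ok = run_in_transaction(conn, [
    ("INSERT INTO items VALUES (?, ?)", (1, "a")),
    ("INSERT INTO items VALUES (?, ?)", (1, "b")),
])
```

With psycopg2 the shape is identical: the connection buffers statements until `commit()`, and `rollback()` discards all of them.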
- Python 3.7 or higher
- PostgreSQL database
- PostgreSQL client tools (pg_dump, pg_restore) for backup operations
Run the setup script for guided installation:
python setup.py
This will:
- Check Python version and install dependencies
- Create required directories
- Set up configuration with your database settings
- Test the database connection
- Check for PostgreSQL tools
Clone or download the project:
git clone <repository-url>
cd rmis-database-direct-script
Install Python dependencies:
pip install -r requirements.txt
Configure database connection:
# Copy and edit the configuration file
cp config.json.example config.json
Update config.json with your database settings:
{
  "database": {
    "host": "localhost",
    "port": 5432,
    "database": "your_database_name",
    "username": "your_username",
    "password": "your_password"
  }
}
Create required directories (auto-created on first run):
mkdir -p logs backups exports imports
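In Python, the auto-creation on first run amounts to a few `os.makedirs` calls; a sketch (the helper name is illustrative, not the tool's actual API):

```python
import os

def ensure_directories(base=".", names=("logs", "backups", "exports", "imports")):
    """Create the working directories if they do not already exist."""
    for name in names:
        os.makedirs(os.path.join(base, name), exist_ok=True)
```

A helper like this would typically be called once at application startup.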
Choose one of these options:
# Simple launcher
python main.py
# With environment checks and setup assistance
python start.py
# Minimal launcher
python run.py
The application presents a numbered menu of available services:
Available Services:
----------------------------------------
1. Get Current Data Stats
2. Generate Database Metadata
3. Create Database Backup
4. Export Table to CSV (Plain)
5. Export Table to CSV (With Joins)
6. Export Table to CSV (Selected Columns)
7. Export Table to CSV (Category Filter)
8. Export Table to CSV (Date Range)
9. Export Table to CSV (Combined Filters)
10. ⚠️ Bulk Update Table Values
11. ⚠️ Bulk Import Data from CSV
12. ⚠️ Bulk Delete Records
13. Exit
----------------------------------------
- Select Service: Choose from the numbered menu
- Configure Parameters: Follow prompts to configure the operation
- Review Summary: Review the operation details and impact
- Confirm Execution: Approve the operation (required for critical operations)
- View Results: See the operation results and any output files
- Select "Export Table to CSV (Plain)"
- Choose the table to export
- Select columns (optional)
- Configure CSV format options
- Review and confirm
- Find the exported file in the exports/ directory
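At its core, a plain CSV export streams query results into a writer, with the column names as the header row. A minimal sketch, illustrated against sqlite3 (the tool itself runs this against PostgreSQL, and `export_rows` is a hypothetical helper name):

```python
import csv
import io
import sqlite3

def export_rows(cursor, query, out):
    """Write a header row of column names, then every result row, as CSV."""
    cursor.execute(query)
    writer = csv.writer(out)
    writer.writerow([col[0] for col in cursor.description])
    for row in cursor:
        writer.writerow(row)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO t VALUES (?, ?)", [(1, "a"), (2, "b")])

buf = io.StringIO()
export_rows(conn.cursor(), "SELECT id, name FROM t ORDER BY id", buf)
```

In the real service, `out` would be a file opened under exports/ rather than an in-memory buffer.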
- Place the CSV file in the imports/ directory
- Select "Bulk Import Data from CSV"
- Choose target table
- Map CSV columns to database columns
- Configure import options
- Review impact and confirm
- Monitor import progress
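The column-mapping step can be pictured as renaming each CSV field to its database column before the rows are inserted. A sketch using only the standard csv module (the helper name `remap_rows` is illustrative):

```python
import csv
import io

def remap_rows(csv_text, mapping):
    """Yield dicts keyed by database column names, per a CSV->DB column mapping."""
    reader = csv.DictReader(io.StringIO(csv_text))
    for row in reader:
        yield {db_col: row[csv_col] for csv_col, db_col in mapping.items()}

rows = list(remap_rows(
    "Name,Qty\nwidget,3\n",
    {"Name": "name", "Qty": "quantity"},
))
```

The remapped dicts would then feed a parameterized INSERT, one batch at a time.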
- Select "Create Database Backup"
- Choose backup type (full, schema-only, etc.)
- Select backup location
- Configure compression options
- Review and confirm
- Find the backup file in the backups/ directory
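Backups are produced by shelling out to pg_dump; assembling its argument list might look like the sketch below. The flags shown are standard pg_dump options, `build_backup_command` is a hypothetical helper, and the command is only built here, not executed:

```python
def build_backup_command(cfg, out_path, schema_only=False, fmt="custom"):
    """Assemble a pg_dump argument list from the database config."""
    cmd = [
        "pg_dump",
        "--host", cfg["host"],
        "--port", str(cfg["port"]),
        "--username", cfg["username"],
        "--format", fmt[0],   # 'c' = custom (compressed), 'p' = plain SQL
        "--file", out_path,
    ]
    if schema_only:
        cmd.append("--schema-only")
    cmd.append(cfg["database"])
    return cmd

cmd = build_backup_command(
    {"host": "localhost", "port": 5432, "username": "u", "database": "db"},
    "backups/db.dump",
)
```

In practice the command would run via subprocess with the password supplied through the PGPASSWORD environment variable rather than on the command line.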
{
"database": {
"host": "localhost",
"port": 5432,
"database": "database_name",
"username": "username",
"password": "password",
"connection_timeout": 30,
"max_connections": 5
}
}

{
"application": {
"log_level": "INFO",
"log_file": "logs/rmis_db_script.log",
"backup_directory": "backups",
"export_directory": "exports",
"import_directory": "imports"
}
}

{
"safety": {
"require_confirmation_for_critical_ops": true,
"max_bulk_operation_size": 10000,
"enable_transaction_rollback": true,
"backup_before_critical_ops": true
}
}

rmis-database-direct-script/
├── main.py                        # Main application entry point
├── config.json                    # Database and application configuration
├── requirements.txt               # Python dependencies
├── README.md                      # This file
├── utils/                         # Utility modules
│   ├── __init__.py
│   ├── config.py                  # Configuration management
│   ├── db_connection.py           # Database connection handling
│   └── logger.py                  # Logging utilities
├── services/                      # Service modules
│   ├── __init__.py                # Service manager
│   ├── data_stats_service.py      # Database statistics
│   ├── metadata_service.py        # Metadata generation
│   ├── backup_service.py          # Database backup
│   ├── csv_export_service.py      # CSV export operations
│   └── bulk_operations_service.py # Bulk operations
├── logs/                          # Application logs
├── backups/                       # Database backups
├── exports/                       # Exported CSV files
└── imports/                       # CSV files for import
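The safety limits in the configuration above can be loaded and enforced with a small guard. A sketch, assuming the config.json layout shown earlier (function names are illustrative, not the tool's actual API):

```python
import json

def load_config(path):
    """Read the JSON configuration file."""
    with open(path) as f:
        return json.load(f)

def check_bulk_size(safety_cfg, affected_rows):
    """Refuse bulk operations that exceed the configured row limit."""
    limit = safety_cfg.get("max_bulk_operation_size", 10000)
    if affected_rows > limit:
        raise ValueError(f"operation touches {affected_rows} rows, limit is {limit}")

safety = {"max_bulk_operation_size": 10000}
check_bulk_size(safety, 500)        # within the limit: no error
try:
    check_bulk_size(safety, 20000)  # exceeds the limit
    exceeded = False
except ValueError:
    exceeded = True
```

A check like this would run during the operation-summary step, before any rows are touched.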
- Database Connection Errors: Clear error messages with connection troubleshooting
- SQL Errors: Detailed error reporting with query context
- File Operations: Automatic directory creation and permission checks
- Data Validation: Pre-execution validation with clear error messages
- Transaction Failures: Automatic rollback with error reporting
All operations are logged to logs/rmis_db_script.log with:
- Operation start/end times
- User actions and parameter selections
- Query execution details
- Error messages and stack traces
- File operations (exports, imports, backups)
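A logging setup matching the behaviour above can be built with the standard logging module; a sketch (logger name, path, and format are illustrative):

```python
import logging

def make_logger(log_file):
    """File logger with timestamps and level names, as used for the audit trail."""
    logger = logging.getLogger("rmis_db_script")
    logger.setLevel(logging.INFO)
    handler = logging.FileHandler(log_file)
    handler.setFormatter(
        logging.Formatter("%(asctime)s %(levelname)s %(message)s")
    )
    logger.addHandler(handler)
    return logger
```

Each service would then call, for example, `logger.info("export started: table=%s", table)` at its start and end.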
- No Hardcoded Credentials: All credentials stored in config files
- Parameter Sanitization: SQL injection protection through parameterized queries
- Transaction Safety: All operations wrapped in transactions
- Confirmation Requirements: Critical operations require explicit confirmation
- Audit Trail: Complete logging of all operations
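Parameterized queries keep user input out of the SQL text entirely, so injection payloads arrive as plain string values. Demonstrated here with sqlite3's `?` placeholders; psycopg2 uses `%s` placeholders, but the principle is identical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

# A classic injection attempt: passed as a parameter, it is just a string
# and matches no row, instead of rewriting the WHERE clause.
payload = "alice' OR '1'='1"
rows = conn.execute(
    "SELECT * FROM users WHERE name = ?", (payload,)
).fetchall()
```

Had the payload been concatenated into the SQL string, the same query would have returned every row in the table.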
- Database Connection Failed
- Check database server is running
- Verify host, port, database name in config.json
- Check username/password credentials
- Test network connectivity
- Permission Denied Errors
- Ensure database user has required permissions
- Check file system permissions for backup/export directories
- Verify PostgreSQL client tools are installed
- Import/Export Errors
- Check CSV file format and encoding
- Verify column mappings are correct
- Ensure target table exists and has correct structure
- Memory Issues with Large Operations
- Reduce batch sizes for bulk operations
- Use row limits for large exports
- Monitor available memory during operations
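Reducing batch size comes down to processing rows in fixed-size chunks rather than materializing everything at once. A generic chunking helper (illustrative; the tool's actual batching code may differ):

```python
from itertools import islice

def chunked(iterable, size):
    """Yield successive lists of at most `size` items from the iterable."""
    it = iter(iterable)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch

batches = list(chunked(range(10), 4))
```

A bulk import would then execute one INSERT batch per chunk, keeping memory usage bounded regardless of file size.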
- Check the log files in the logs/ directory
- Review error messages for specific guidance
- Verify configuration settings
- Test database connectivity independently
This tool is designed to be minimal and focused. When extending:
- Follow the existing service pattern in the services/ directory
- Implement proper error handling and logging
- Add appropriate safety checks for destructive operations
- Update this README with new features
[Add your license information here]
- v1.0.0: Initial release with core functionality
- Database statistics and metadata generation
- CSV export with various filtering options
- Database backup functionality
- Bulk operations (update, import, delete) with safety checks