A production-ready Flask backend application demonstrating modern cloud-native development practices with AWS integration, complete with automated setup and deployment.
STATUS: WORK IN PROGRESS
This project is currently under active development. While the core functionality is complete and operational, some advanced features and documentation are still being refined. Local development is fully functional. AWS deployment workflows are being updated to use OIDC authentication and AWS Secrets Manager best practices.
- RESTful API with JWT authentication and API versioning
- PostgreSQL database with SQLAlchemy ORM and migrations
- Redis caching for optimized performance
- AWS Integration - Lambda, S3, RDS, ElastiCache, ECS Fargate
- Docker containerization with docker-compose
- Terraform infrastructure as code for AWS deployment
- CI/CD pipelines with GitHub Actions
- Comprehensive tests with pytest and coverage reporting
- Automated setup - Get started in minutes!
- macOS - Fully supported for local development
- Linux - Fully supported for local development and CI/CD
- Windows - Not supported (use WSL2 with Ubuntu for development)
Only two commands are needed:
# One-time setup
./scripts/setup.sh
# Run the application
./scripts/run.sh
The API will be available at http://localhost:5000
That's it! The setup script handles everything:
- Creates virtual environment
- Installs dependencies
- Starts PostgreSQL and Redis
- Generates secure secrets
- Initializes database
- Seeds sample data
See QUICKSTART.md for more details.
- Python 3.11 with type hints and modern syntax
- Flask web framework with blueprint architecture (see the sketch after this list)
- SQLAlchemy 2.0 ORM with relationship modeling
- JWT Authentication with access and refresh tokens
- RESTful API Design with proper HTTP methods and status codes
- Redis Caching patterns for performance optimization
- Database Migrations with Flask-Migrate / Alembic
- Input Validation and error handling
- Structured Logging with JSON formatting
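A minimal sketch of how these pieces typically fit together in an application factory. This is illustrative only: it assumes Flask-JWT-Extended and Flask-SQLAlchemy are the extensions in use and makes up the blueprint import paths, so it is not this project's exact code.

```python
# Illustrative app-factory sketch -- extension choices, config names, and
# blueprint import paths below are assumptions, not this project's exact code.
from datetime import timedelta

from flask import Flask
from flask_jwt_extended import JWTManager
from flask_sqlalchemy import SQLAlchemy

db = SQLAlchemy()
jwt = JWTManager()


def create_app(config_object: str = "config.DevelopmentConfig") -> Flask:
    app = Flask(__name__)
    app.config.from_object(config_object)

    # Access tokens stay short-lived; refresh tokens allow re-authentication.
    app.config.setdefault("JWT_ACCESS_TOKEN_EXPIRES", timedelta(minutes=15))
    app.config.setdefault("JWT_REFRESH_TOKEN_EXPIRES", timedelta(days=30))

    db.init_app(app)
    jwt.init_app(app)

    # Versioned blueprints keep /api/v1 routes grouped by concern.
    from app.api.v1.auth import auth_bp    # hypothetical import path
    from app.api.v1.items import items_bp  # hypothetical import path
    app.register_blueprint(auth_bp, url_prefix="/api/v1/auth")
    app.register_blueprint(items_bp, url_prefix="/api/v1/items")

    return app
```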
- Lambda Functions - Serverless compute for event-driven tasks
- API Gateway - HTTP API for Lambda integration
- ECS Fargate - Container orchestration without managing servers
- RDS PostgreSQL - Managed relational database
- ElastiCache Redis - Managed in-memory cache
- S3 - Object storage with encryption
- CloudWatch - Logging and monitoring
- VPC - Network isolation and security
- Terraform - Complete infrastructure as code
- Modular design for reusability
- Multiple environments (dev, staging, prod)
- State management with S3 backend
- Docker - Multi-stage builds for optimization
- docker-compose - Local development environment
- GitHub Actions - Automated CI/CD pipelines
- Linting and code quality checks
- Automated testing with coverage
- Security scanning with Trivy
- Automated deployments to AWS
- Manual deployment triggers from GitHub UI
- Makefile - Common development tasks
- pytest - Unit and integration tests (example sketch after this list)
- Test Coverage - Comprehensive test suite
- Code Formatting - Black for consistent style
- Linting - Flake8 for code quality
- Type Checking - MyPy for type safety
- CI Integration - Automated on every push
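A hedged sketch of what a test against the health endpoint could look like with Flask's test client. The create_app import, the testing config name, and the exact route are assumptions based on the project layout, not verified against the actual test suite.

```python
# Hypothetical test sketch -- fixture wiring and the health route are assumed.
import pytest

from app import create_app  # assumes the app factory is exported from app/


@pytest.fixture()
def client():
    app = create_app("config.TestingConfig")  # hypothetical test config
    with app.test_client() as client:
        yield client


def test_health_endpoint_returns_ok(client):
    response = client.get("/api/v1/health")
    assert response.status_code == 200
```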
This project implements AWS security best practices:
- No AWS Credentials in GitHub - Uses AWS OIDC provider for authentication
- Zero Long-Lived Keys - Temporary credentials expire automatically (~1 hour)
- AWS Secrets Manager - All sensitive data (DB passwords, API keys) stored in AWS (see the sketch after this list)
- Separate IAM Roles - Isolated permissions per environment (dev/staging/prod)
- Least Privilege IAM - Fine-grained permissions with explicit resource ARNs
- Dynamic Infrastructure IDs - Retrieved from Terraform outputs (no manual entry)
- CloudTrail Auditing - Complete audit trail of all AWS API calls
- Encryption at Rest - S3, RDS, and EBS volumes encrypted
- GitHub Variables - Non-sensitive config (bucket names) in repository variables
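As a sketch of the Secrets Manager item above, the application can fetch credentials at runtime with boto3. The secret name below is a placeholder, not necessarily the one this project provisions.

```python
# Sketch: fetch a JSON secret from AWS Secrets Manager with boto3.
# The secret name "flask-aws/prod/database" is hypothetical.
import json

import boto3


def get_database_secret(secret_id: str = "flask-aws/prod/database") -> dict:
    client = boto3.client("secretsmanager")
    response = client.get_secret_value(SecretId=secret_id)
    # Secrets stored as JSON key/value pairs come back in SecretString.
    return json.loads(response["SecretString"])
```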
Security Documentation:
- AWS OIDC Setup Guide - Complete setup instructions
- Security Migration - Migration from access keys
- GitHub Actions Workflows - Deployment pipelines
Security Impact: Reduced GitHub Secrets from 9 to 2 (78% reduction)
.
├── app/ # Application code
│ ├── api/v1/ # API endpoints (health, auth, items)
│ ├── models/ # Database models (User, Item)
│ ├── services/ # Business logic (caching, S3)
│ ├── utils/ # Utilities (logging, responses)
│ ├── __init__.py # App factory
│ └── main.py # Entry point
├── scripts/ # Automation scripts
│ ├── setup.sh # Complete environment setup
│ ├── run.sh # Start application
│ ├── test.sh # Run tests
│ ├── clean.sh # Clean project
│ ├── dev.sh # Development helpers
│ ├── docker-dev.sh # Docker helpers
│ └── seed_db.py # Database seeding
├── tests/ # Test suite
│ ├── unit/ # Unit tests
│ └── integration/ # Integration tests
├── terraform/ # Infrastructure as Code
│ ├── modules/ # Reusable Terraform modules
│ │ ├── vpc/ # VPC with subnets, NAT gateways
│ │ ├── rds/ # PostgreSQL database
│ │ ├── elasticache/ # Redis cluster
│ │ ├── s3/ # S3 buckets
│ │ ├── lambda/ # Lambda functions
│ │ ├── ecs/ # ECS Fargate cluster
│ │ ├── api_gateway/ # API Gateway
│ │ └── cloudwatch/ # Monitoring
│ └── main.tf # Main configuration
├── lambda/ # AWS Lambda functions
│ ├── api_handler/ # API Gateway integration
│ ├── image_processor/ # S3 event processing
│ └── scheduled_task/ # Cron jobs
├── .github/workflows/ # CI/CD pipelines
├── docs/ # Documentation
├── config.py # Configuration management
├── docker-compose.yml # Local services
├── Dockerfile # Container definition
├── Makefile # Build automation
└── requirements.txt # Python dependencies
# Register a new user
curl -X POST http://localhost:5000/api/v1/auth/register \
-H "Content-Type: application/json" \
-d '{
"email": "user@example.com",
"username": "username",
"password": "password123"
}'
# Login and get JWT token
curl -X POST http://localhost:5000/api/v1/auth/login \
-H "Content-Type: application/json" \
-d '{
"email": "user@example.com",
"password": "password123"
}'
# Get current user (requires authentication)
curl http://localhost:5000/api/v1/users/me \
-H "Authorization: Bearer YOUR_ACCESS_TOKEN"
# List items
curl http://localhost:5000/api/v1/items \
-H "Authorization: Bearer YOUR_ACCESS_TOKEN"
# Create an item
curl -X POST http://localhost:5000/api/v1/items \
-H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
-H "Content-Type: application/json" \
-d '{
"name": "Sample Item",
"description": "Item description",
"price": 99.99
}'
The setup script creates these test accounts:
Admin: admin@example.com / admin123
User 1: john@example.com / password123
User 2: jane@example.com / password123
Full API documentation: docs/API.md
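The same flow can also be driven from Python with the requests library. This is a short sketch using one of the seeded accounts; the access_token field name is an assumption about the login response shape, so adjust it if the actual payload differs.

```python
# Sketch: exercise the API from Python. Assumes the login response contains
# an "access_token" field; adjust to the actual response shape if it differs.
import requests

BASE_URL = "http://localhost:5000/api/v1"

# Log in with one of the seeded accounts.
login = requests.post(
    f"{BASE_URL}/auth/login",
    json={"email": "john@example.com", "password": "password123"},
)
login.raise_for_status()
token = login.json()["access_token"]

headers = {"Authorization": f"Bearer {token}"}

# Create an item, then list items.
requests.post(
    f"{BASE_URL}/items",
    headers=headers,
    json={"name": "Sample Item", "description": "Item description", "price": 99.99},
).raise_for_status()

print(requests.get(f"{BASE_URL}/items", headers=headers).json())
```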
# Run all tests with coverage
./scripts/test.sh
# Or manually with pytest
source venv/bin/activate
pytest --cov=app --cov-report=html tests/
# Activate virtual environment
source venv/bin/activate
# Format code
black app tests
# Lint code
flake8 app tests
# Type checking
mypy app
# Or use Makefile
make format
make lint
# Create a new migration
flask db migrate -m "Add new field"
# Apply migrations
flask db upgrade
# Rollback migration
flask db downgrade
# View logs
docker-compose logs -f
# Access database
docker-compose exec db psql -U postgres -d flask_aws_db
# Access Redis CLI
docker-compose exec redis redis-cli
# Python shell with app context
flask shell
# Development helper
./scripts/dev.sh # Show all commands
./scripts/dev.sh db-shell # PostgreSQL shell
./scripts/dev.sh status    # Check services
The project includes comprehensive GitHub Actions workflows for automated deployment:
- Go to Actions tab in GitHub
- Select "Deploy to AWS" workflow
- Click "Run workflow"
- Choose environment and components to deploy
- Click "Run workflow"
# Deploy to development (automatic)
git push origin develop
# Deploy to production (automatic)
git push origin main
# Create infrastructure only
git tag -a infra-v1.0.0 -m "Deploy infrastructure"
git push origin infra-v1.0.0
For manual deployment with Terraform, you will need:
- AWS Account
- AWS CLI configured
- Terraform installed
cd terraform
# Initialize Terraform
terraform init
# Review changes
terraform plan
# Deploy
terraform apply
See docs/DEPLOYMENT.md for detailed deployment instructions.
# View application logs
tail -f logs/app.log
# View Docker logs
docker-compose logs -f app
In AWS, logs and metrics are available in CloudWatch (a programmatic example follows this list):
- Application logs: /ecs/production
- Lambda logs: /aws/lambda/function-name
- Database metrics: RDS dashboard
- Cache metrics: ElastiCache dashboard
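As referenced above, here is a hedged sketch of pulling recent application logs from CloudWatch with boto3. The log group name is taken from the list above; the time window and result limit are arbitrary.

```python
# Sketch: fetch recent events from a CloudWatch log group with boto3.
import time

import boto3


def recent_log_events(log_group: str = "/ecs/production", minutes: int = 15):
    logs = boto3.client("logs")
    start_time = int((time.time() - minutes * 60) * 1000)  # epoch milliseconds
    response = logs.filter_log_events(
        logGroupName=log_group,
        startTime=start_time,
        limit=100,
    )
    for event in response["events"]:
        print(event["message"])
```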
┌─────────────┐ ┌──────────────┐ ┌─────────────┐
│ Browser │────▶│ Flask │────▶│ PostgreSQL │
│ /Postman │ │ (port 5000) │ │ (port 5432) │
└─────────────┘ └──────────────┘ └─────────────┘
│
▼
┌──────────────┐
│ Redis │
│ (port 6379) │
└──────────────┘
┌─────────────┐ ┌──────────────┐ ┌─────────────┐
│ Client │────▶│ ALB │────▶│ ECS/Flask │
└─────────────┘ └──────────────┘ └─────────────┘
│
┌───────────────────────────┼────────────────┐
│ │ │
▼ ▼ ▼
┌──────────────┐ ┌──────────────┐ ┌──────────────┐
│ RDS │ │ ElastiCache │ │ S3 │
│ PostgreSQL │ │ Redis │ │ Storage │
└──────────────┘ └──────────────┘ └──────────────┘
All scripts are in the scripts/ directory:
| Script | Purpose |
|---|---|
| setup.sh | Complete environment setup (run once) |
| run.sh | Start the application |
| test.sh | Run tests with coverage |
| clean.sh | Clean up generated files |
| dev.sh | Development helper commands |
| docker-dev.sh | Docker operations helper |
| seed_db.py | Seed database with sample data |
See SCRIPTS.md for detailed documentation.
- QUICKSTART.md - Get started in 5 minutes
- SCRIPTS.md - Complete scripts documentation
- docs/API.md - Complete API reference
- docs/DEPLOYMENT.md - AWS deployment guide
Problem: Setup script fails
- Make sure Python 3.11+ is installed
- Make sure Docker Desktop is running
- Try running with sudo if needed
Problem: Port already in use
docker-compose down # Stop existing containers
# Or change ports in docker-compose.yml
Problem: Database connection error
docker-compose restart db
docker-compose ps # Verify it's running
Problem: Redis connection error
docker-compose restart redis
Problem: Import errors
source venv/bin/activate # Make sure venv is activated
pip install -r requirements.txt # Reinstall dependencies
Problem: Migration errors
flask db downgrade # Rollback
flask db upgrade # Reapply
- Fork the repository
- Create a feature branch (git checkout -b feature/amazing-feature)
- Make your changes
- Run tests (./scripts/test.sh)
- Run linting (make lint)
- Commit your changes (git commit -m 'Add amazing feature')
- Push to the branch (git push origin feature/amazing-feature)
- Open a Pull Request
This project is licensed under the MIT License.
This project demonstrates best practices for:
- Modern Python web development
- Cloud-native application architecture
- Infrastructure as Code with Terraform
- CI/CD automation
- Containerization and orchestration
- RESTful API design
- Test-driven development
For issues, questions, or contributions:
- Open an issue on GitHub
- Check existing documentation
- Review CloudWatch logs (for AWS deployment)
Built with Python, Flask, PostgreSQL, Redis, Docker, Terraform, and AWS