A comprehensive, production-ready FastAPI application showcasing modern Python web development practices. Built with Python 3.13, this project demonstrates enterprise-grade API development with authentication, caching, database integration, containerization, and automated code quality tools.
- Async-First Design: Built with FastAPI and async/await throughout for maximum performance
- Clean Architecture: Layered design with clear separation of concerns (routers, services, models, schemas)
- Type Safety: Full type hints with Pydantic v2 for request/response validation
- Connection Pooling: Optimized database and Redis connection management
- Smart Caching: Redis-based caching with intelligent cache invalidation
- JWT Authentication: Secure token-based authentication with configurable expiration
- Password Security: Bcrypt hashing with salt for password storage
- CORS Protection: Configurable Cross-Origin Resource Sharing
- Security Headers: Comprehensive security headers via Nginx
- Rate Limiting: Request rate limiting to prevent abuse
- Input Validation: Comprehensive request validation with Pydantic
- PostgreSQL 16: Async database operations with SQLAlchemy 2.0 and asyncpg
- Redis 7: High-performance caching and session storage
- Database Migrations: Alembic integration for schema management
- Connection Health: Automatic health checks and reconnection
- Query Optimization: Proper indexing and query optimization
- Docker Compose: Multi-service orchestration with health checks
- Nginx Reverse Proxy: Production-ready load balancing and SSL termination
- Health Monitoring: Comprehensive health checks for all services
- Logging: Structured logging with configurable levels
- Environment Management: Secure configuration via environment variables
- Code Quality: Ruff linting and formatting with pre-commit hooks
- CI/CD: GitHub Actions workflow for automated testing and linting
- Interactive Docs: Auto-generated Swagger UI and ReDoc documentation
- Type Checking: Full type safety with modern Python typing
- Hot Reload: Development server with automatic reloading
- Python: 3.13 recommended (3.11+ supported)
- Docker: 20.10+ with Docker Compose v2
- PostgreSQL: 16+ (Alpine image)
- Redis: 7+ (Alpine image)
- Nginx: Latest Alpine (for production)
rest-python-api/
├── 📁 app/ # Main application package
│ ├── 📁 core/ # Core configuration and utilities
│ │ ├── config.py # Pydantic settings with env validation
│ │ ├── database.py # SQLAlchemy async engine & session
│ │ └── security.py # JWT, password hashing, auth deps
│ ├── 📁 models/ # SQLAlchemy ORM models
│ │ ├── user.py # User model with relationships
│ │ └── item.py # Item model with owner relationship
│ ├── 📁 schemas/ # Pydantic request/response schemas
│ │ ├── common.py # Shared schemas (health, pagination)
│ │ ├── user.py # User schemas with validation
│ │ └── item.py # Item schemas with validation
│ ├── 📁 routers/ # FastAPI route handlers
│ │ ├── auth.py # Authentication endpoints
│ │ ├── users.py # User management endpoints
│ │ ├── items.py # Item CRUD endpoints
│ │ └── health.py # Health check endpoints
│ ├── 📁 services/ # Business logic layer
│ │ ├── user_service.py # User operations & validation
│ │ ├── item_service.py # Item operations & pagination
│ │ └── cache_service.py # Redis caching operations
│ ├── 📁 utils/ # Utility functions
│ └── main.py # FastAPI app factory & middleware
├── 📁 scripts/ # Development scripts
│ └── ruff.ps1 # PowerShell script for linting
├── 📁 .github/workflows/ # CI/CD pipelines
│ └── ruff.yml # GitHub Actions linting workflow
├── 🐳 docker-compose.yml # Multi-service orchestration
├── 🐳 Dockerfile # Application container definition
├── 🔧 pyproject.toml # Ruff configuration & project metadata
├── 📋 requirements.txt # Python dependencies with versions
├── 🌐 nginx.conf # Nginx reverse proxy configuration
├── 🗄️ init-db.sql # PostgreSQL initialization script
├── ⚙️ env.example # Environment variables template
├── 🪝 .pre-commit-config.yaml # Pre-commit hooks configuration
├── 🛠️ Makefile # Development commands (Linux/macOS)
└── 📖 README.md # This comprehensive documentation
# Clone the repository
git clone <repository-url>
cd rest-python-api
# Copy environment template
cp env.example .env
# Edit .env with your settings (see Configuration section)
# Start all services (API, PostgreSQL, Redis)
docker-compose up -d
# View logs
docker-compose logs -f api
# Check service status
docker-compose ps
# Health check
curl http://localhost:8000/health/
# Create virtual environment
python -m venv .venv
source .venv/bin/activate # On Windows: .venv\Scripts\activate
# Install dependencies
pip install -r requirements.txt
# Set environment variables (or use .env file)
export DATABASE_URL="postgresql+asyncpg://user:password@localhost:5432/fastapi_db"
export REDIS_URL="redis://localhost:6379/0"
export SECRET_KEY="your-secret-key"
# Run the application
python -m app.main
# Or with uvicorn directly:
uvicorn app.main:app --reload --host 0.0.0.0 --port 8000
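Both invocation styles work because app/main.py is structured as an app factory. A minimal sketch of what such an entry point can look like (the real module also registers routers, middleware, and database/Redis lifespan hooks; this is an illustration, not the project's exact code):

```python
# Illustrative sketch of an app factory entry point; not the project's exact main.py.
from contextlib import asynccontextmanager

import uvicorn
from fastapi import FastAPI


@asynccontextmanager
async def lifespan(app: FastAPI):
    # Startup work (DB/Redis connections) goes before yield, teardown after it.
    yield


def create_app() -> FastAPI:
    app = FastAPI(title="FastAPI High-Performance API", lifespan=lifespan)
    # app.include_router(...) calls for auth, users, items, and health go here.
    return app


app = create_app()

if __name__ == "__main__":
    # Supports `python -m app.main`; uvicorn can also import app.main:app directly.
    uvicorn.run("app.main:app", host="0.0.0.0", port=8000, reload=True)
```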
- POST /auth/register - Register new user account
- POST /auth/login - Authenticate and get JWT token
- POST /auth/logout - Logout (client-side token disposal)
- GET /users/me - Get current user profile
- GET /users/me/with-items - Get current user with their items
- PUT /users/me - Update current user profile
- DELETE /users/me - Delete current user account
- POST /items/ - Create new item (authenticated)
- GET /items/ - Get all items (public, cached, paginated)
- GET /items/my-items - Get current user's items (authenticated, paginated)
- GET /items/{item_id} - Get specific item by ID (cached)
- PUT /items/{item_id} - Update item (authenticated, owner only)
- DELETE /items/{item_id} - Delete item (authenticated, owner only)
- GET /health/ - Comprehensive health check (database, Redis, app status)
- GET /docs - Interactive Swagger UI documentation
- GET /redoc - ReDoc documentation
- GET /openapi.json - OpenAPI schema
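To show how the item endpoints above fit together, here is a condensed, self-contained sketch of an items router. The in-memory store and the placeholder auth dependency (get_current_user_id) are illustrative assumptions, not the project's real service layer:

```python
# Self-contained illustration of an items router; the in-memory store and
# placeholder auth dependency are assumptions, not the project's real code.
from fastapi import APIRouter, Depends, HTTPException, status
from pydantic import BaseModel

router = APIRouter(prefix="/items", tags=["items"])
_items: dict[int, dict] = {}  # stand-in for the database


class ItemCreate(BaseModel):
    name: str
    description: str | None = None


class ItemRead(ItemCreate):
    id: int
    owner_id: int


async def get_current_user_id() -> int:
    # Placeholder for the real JWT dependency (see the security section below).
    return 1


@router.post("/", response_model=ItemRead, status_code=status.HTTP_201_CREATED)
async def create_item(payload: ItemCreate, owner_id: int = Depends(get_current_user_id)):
    item = {"id": len(_items) + 1, "owner_id": owner_id, **payload.model_dump()}
    _items[item["id"]] = item
    return item


@router.get("/", response_model=list[ItemRead])
async def list_items(page: int = 1, size: int = 10):
    # Public, paginated listing; the real endpoint also caches pages in Redis.
    start = (page - 1) * size
    return list(_items.values())[start : start + size]


@router.get("/{item_id}", response_model=ItemRead)
async def get_item(item_id: int):
    if item_id not in _items:
        raise HTTPException(status_code=404, detail="Item not found")
    return _items[item_id]
```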
Create a .env file based on env.example:
# Application Configuration
APP_NAME="FastAPI High-Performance API"
APP_VERSION="1.0.0"
DEBUG=false
# Server Configuration
HOST=0.0.0.0
PORT=8000
# Database Configuration
DATABASE_URL=postgresql+asyncpg://user:password@localhost:5432/fastapi_db
DATABASE_ECHO=false
# Redis Configuration
REDIS_URL=redis://localhost:6379/0
REDIS_EXPIRE_TIME=3600
# JWT Configuration (CHANGE IN PRODUCTION!)
SECRET_KEY=your-super-secret-key-change-this-in-production-make-it-very-long-and-random
ALGORITHM=HS256
ACCESS_TOKEN_EXPIRE_MINUTES=30
# CORS Configuration
ALLOWED_ORIGINS=["http://localhost:3000","http://localhost:8080","http://localhost:8000"]
# Rate Limiting
RATE_LIMIT_REQUESTS=100
RATE_LIMIT_WINDOW=60
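These variables are loaded and validated by app/core/config.py. A rough sketch of how that typically looks with pydantic-settings (field names mirror the variables above; the defaults and exact class shape are illustrative):

```python
# Rough sketch of settings loading with pydantic-settings; defaults are illustrative.
from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_file=".env", case_sensitive=False)

    app_name: str = "FastAPI High-Performance API"
    app_version: str = "1.0.0"
    debug: bool = False

    host: str = "0.0.0.0"
    port: int = 8000

    database_url: str = "postgresql+asyncpg://user:password@localhost:5432/fastapi_db"
    database_echo: bool = False

    redis_url: str = "redis://localhost:6379/0"
    redis_expire_time: int = 3600

    secret_key: str = "change-me"
    algorithm: str = "HS256"
    access_token_expire_minutes: int = 30

    # Complex values such as lists are parsed from JSON, matching ALLOWED_ORIGINS above.
    allowed_origins: list[str] = ["http://localhost:3000"]

    rate_limit_requests: int = 100
    rate_limit_window: int = 60


settings = Settings()
```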
The docker-compose.yml file defines:
- FastAPI App: Main application with health checks
- PostgreSQL: Database with persistent volume and initialization
- Redis: Cache with memory optimization and persistence
- Nginx: Reverse proxy with rate limiting (production profile)
# 1. Register a new user
curl -X POST "http://localhost:8000/auth/register" \
-H "Content-Type: application/json" \
-d '{
"email": "user@example.com",
"username": "testuser",
"password": "securepassword123",
"full_name": "Test User"
}'
# 2. Login to get access token
curl -X POST "http://localhost:8000/auth/login" \
-H "Content-Type: application/json" \
-d '{
"username": "testuser",
"password": "securepassword123"
}'
# Response:
# {
# "access_token": "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9...",
# "token_type": "bearer",
# "expires_in": 1800
# }
# Create an item (authenticated)
curl -X POST "http://localhost:8000/items/" \
-H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
-H "Content-Type: application/json" \
-d '{
"name": "My First Item",
"description": "This is a test item"
}'
# Get all items (public, cached)
curl "http://localhost:8000/items/?page=1&size=10"
# Get user's items (authenticated)
curl -H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
"http://localhost:8000/items/my-items?page=1&size=10"
# Update an item (authenticated, owner only)
curl -X PUT "http://localhost:8000/items/1" \
-H "Authorization: Bearer YOUR_ACCESS_TOKEN" \
-H "Content-Type: application/json" \
-d '{"name": "Updated Item Name"}'
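The same register/login/create flow can be scripted from Python. A hedged example using httpx (assumes the API is running on localhost:8000; httpx is not part of the runtime requirements, so install it separately if you want to run this):

```python
# Hedged example of the register/login/create-item flow from Python using httpx.
import httpx

BASE_URL = "http://localhost:8000"

with httpx.Client(base_url=BASE_URL) as client:
    # 1. Register (only succeeds the first time; reruns will return a 4xx).
    client.post("/auth/register", json={
        "email": "user@example.com",
        "username": "testuser",
        "password": "securepassword123",
        "full_name": "Test User",
    })

    # 2. Login and grab the bearer token.
    token = client.post("/auth/login", json={
        "username": "testuser",
        "password": "securepassword123",
    }).json()["access_token"]

    # 3. Create an item with the Authorization header set.
    item = client.post("/items/", json={
        "name": "My First Item",
        "description": "This is a test item",
    }, headers={"Authorization": f"Bearer {token}"}).json()
    print(item)
```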
# Using Python module
python -m ruff check app # Check for issues
python -m ruff check app --fix # Fix auto-fixable issues
python -m ruff format app # Format code
# Using PowerShell script (Windows)
scripts\ruff.ps1 check # Check only
scripts\ruff.ps1 fix # Fix + format
scripts\ruff.ps1 format # Format only
# Using Makefile (Linux/macOS/WSL)
make lint # Check for issues
make lint-fix # Fix + format
make fmt # Format only
# Run mypy on the project
python -m mypy app
# Using Makefile
make mypy
# Install and setup
pip install pre-commit
pre-commit install
# Run on all files
pre-commit run --all-files
# Hooks run automatically on git commit
git add .
git commit -m "Your changes" # Ruff will run automatically
The GitHub Actions workflow (.github/workflows/ruff.yml) automatically:
- Runs on push/PR to main branch
- Checks code with Ruff linting
- Validates code formatting
- Fails CI if code quality issues found
# Build and start all services
docker-compose up --build -d
# View logs for specific service
docker-compose logs -f api
docker-compose logs -f postgres
docker-compose logs -f redis
# Execute commands in containers
docker-compose exec api bash
docker-compose exec postgres psql -U fastapi_user -d fastapi_db
docker-compose exec redis redis-cli
# Restart services
docker-compose restart api
# Stop all services
docker-compose down
# Stop and remove volumes (data loss!)
docker-compose down -v
# Start with Nginx reverse proxy
docker-compose --profile production up -d
# Access via Nginx (port 80)
curl http://localhost/health/
# SSL/TLS Configuration (add to nginx.conf)
# server {
# listen 443 ssl http2;
# ssl_certificate /path/to/certificate.crt;
# ssl_certificate_key /path/to/private.key;
# # ... rest of configuration
# }
curl http://localhost:8000/health/
Response:
{
"status": "healthy",
"timestamp": "2024-01-01T12:00:00.000000",
"version": "1.0.0",
"database": "healthy",
"redis": "healthy"
}
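A simplified sketch of how a health endpoint like this can aggregate its checks (the real implementation lives in app/routers/health.py; the helper functions here are placeholders for the actual database and Redis probes):

```python
# Simplified, illustrative health endpoint; the check helpers are placeholders
# for the real database (SELECT 1) and Redis (PING) probes.
from datetime import datetime, timezone

from fastapi import APIRouter

router = APIRouter(prefix="/health", tags=["health"])


async def check_database() -> str:
    return "healthy"  # real code runs SELECT 1 on the async engine


async def check_redis() -> str:
    return "healthy"  # real code issues PING to the Redis client


@router.get("/")
async def health() -> dict:
    return {
        "status": "healthy",
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "version": "1.0.0",
        "database": await check_database(),
        "redis": await check_redis(),
    }
```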
# Check all services status
docker-compose ps
# Monitor Redis
docker-compose exec redis redis-cli monitor
docker-compose exec redis redis-cli info
# Monitor PostgreSQL
docker-compose exec postgres pg_isready -U fastapi_user
docker-compose logs postgres
# Application metrics
curl http://localhost:8000/health/
- JWT Tokens: Stateless authentication with configurable expiration
- Password Hashing: Bcrypt with salt for secure password storage
- Protected Routes: Dependency injection for authentication requirements
- User Ownership: Items are owned by users, enforced at API level
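In outline, the security helpers combine bcrypt hashing with python-jose tokens. A hedged sketch (constants are hard-coded here but come from settings in the real application; function names are illustrative):

```python
# Hedged sketch of password hashing and JWT handling with bcrypt and python-jose.
from datetime import datetime, timedelta, timezone

import bcrypt
from jose import JWTError, jwt

SECRET_KEY = "change-me-in-production"
ALGORITHM = "HS256"
ACCESS_TOKEN_EXPIRE_MINUTES = 30


def hash_password(password: str) -> str:
    # bcrypt generates a per-password salt and embeds it in the hash.
    return bcrypt.hashpw(password.encode(), bcrypt.gensalt()).decode()


def verify_password(password: str, hashed: str) -> bool:
    return bcrypt.checkpw(password.encode(), hashed.encode())


def create_access_token(subject: str) -> str:
    expire = datetime.now(timezone.utc) + timedelta(minutes=ACCESS_TOKEN_EXPIRE_MINUTES)
    return jwt.encode({"sub": subject, "exp": expire}, SECRET_KEY, algorithm=ALGORITHM)


def decode_access_token(token: str) -> str | None:
    # Returns the token subject, or None if the token is invalid or expired.
    try:
        return jwt.decode(token, SECRET_KEY, algorithms=[ALGORITHM]).get("sub")
    except JWTError:
        return None
```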
- X-Frame-Options: Prevents clickjacking
- X-XSS-Protection: XSS attack prevention
- X-Content-Type-Options: MIME type sniffing prevention
- Referrer-Policy: Controls referrer information
- Content-Security-Policy: Content security policy
- Nginx Level: 10 requests/second with burst of 20
- Application Level: Configurable rate limiting
- Health Check Bypass: Health endpoints bypass rate limiting
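At the application level, a fixed-window limiter can be built on Redis counters. A rough sketch using redis.asyncio (the project's actual middleware and key layout may differ):

```python
# Rough sketch of a fixed-window rate limit check backed by Redis counters.
import redis.asyncio as redis

RATE_LIMIT_REQUESTS = 100
RATE_LIMIT_WINDOW = 60  # seconds


async def is_allowed(client: redis.Redis, client_ip: str) -> bool:
    key = f"ratelimit:{client_ip}"
    count = await client.incr(key)  # atomically count this request
    if count == 1:
        await client.expire(key, RATE_LIMIT_WINDOW)  # open the window on the first hit
    return count <= RATE_LIMIT_REQUESTS
```

A middleware would call this once per request and return HTTP 429 when it yields False, skipping the health endpoints as noted above.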
- Connection Pooling: SQLAlchemy async connection pool
- Query Optimization: Proper indexing and query structure
- Async Operations: Non-blocking database operations
- Health Checks: Automatic connection health monitoring
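The async engine and session setup in app/core/database.py follows the standard SQLAlchemy 2.0 pattern. A hedged sketch (pool sizes here are illustrative, not the project's tuned values):

```python
# Hedged sketch of the async engine and session setup; pool sizes are illustrative.
from collections.abc import AsyncIterator

from sqlalchemy.ext.asyncio import AsyncSession, async_sessionmaker, create_async_engine

DATABASE_URL = "postgresql+asyncpg://user:password@localhost:5432/fastapi_db"

engine = create_async_engine(
    DATABASE_URL,
    pool_size=10,        # persistent connections kept in the pool
    max_overflow=20,     # extra connections allowed under burst load
    pool_pre_ping=True,  # validate connections before use
)

SessionLocal = async_sessionmaker(engine, class_=AsyncSession, expire_on_commit=False)


async def get_db() -> AsyncIterator[AsyncSession]:
    # FastAPI dependency: one session per request, closed automatically afterwards.
    async with SessionLocal() as session:
        yield session
```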
- Redis Caching: Intelligent caching with TTL
- Cache Invalidation: Smart cache invalidation on data changes
- Pagination Caching: Cached paginated results
- Cache Keys: Structured cache key generation
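The cache layer boils down to JSON get/set with a TTL plus key invalidation after writes. A hedged sketch using redis.asyncio (function names and key format are illustrative, not the exact cache_service API):

```python
# Hedged sketch of Redis caching helpers: JSON values with a TTL and
# prefix-based invalidation; key format and names are illustrative.
import json

import redis.asyncio as redis

REDIS_URL = "redis://localhost:6379/0"
DEFAULT_TTL = 3600  # seconds, mirrors REDIS_EXPIRE_TIME

client = redis.from_url(REDIS_URL, decode_responses=True)


def items_page_key(page: int, size: int) -> str:
    # Structured key generation, e.g. "items:page:1:size:10".
    return f"items:page:{page}:size:{size}"


async def cache_get(key: str):
    raw = await client.get(key)
    return json.loads(raw) if raw is not None else None


async def cache_set(key: str, value, ttl: int = DEFAULT_TTL) -> None:
    await client.set(key, json.dumps(value), ex=ttl)


async def invalidate_prefix(prefix: str) -> None:
    # Called after create/update/delete so stale paginated results are dropped.
    async for key in client.scan_iter(f"{prefix}*"):
        await client.delete(key)
```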
- Async/Await: Non-blocking I/O throughout
- Pydantic V2: Fast serialization/deserialization
- Connection Reuse: HTTP client connection pooling
- Structured Logging: Efficient logging with structured data
- FastAPI 0.119.0: Modern, fast web framework
- Uvicorn 0.37.0: ASGI server with standard extras
- Pydantic 2.12.0: Data validation and serialization
- SQLAlchemy 2.0.44: Async ORM with modern features
- asyncpg 0.30.0: Fast PostgreSQL async driver
- Redis 6.4.0: Redis client with hiredis for performance
- Alembic 1.17.0: Database migration tool
- python-jose 3.5.0: JWT token handling with cryptography
- bcrypt 5.0.0: Password hashing
- python-multipart 0.0.20: Form data parsing
- Ruff 0.14.0: Fast Python linter and formatter
- pre-commit 4.3.0: Git hooks for code quality
- pytest 8.4.2: Testing framework with async support
# Install test dependencies (included in requirements.txt)
pip install pytest pytest-asyncio pytest-cov
# Run tests
pytest
# Run with coverage
pytest --cov=app --cov-report=html
# Run specific test file
pytest tests/test_auth.py
# Run with verbose output
pytest -v
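Async endpoints can be exercised without a running server by pointing httpx's ASGITransport at the app. A hedged example (assumes pytest-asyncio is configured and that PostgreSQL/Redis are reachable, since the health check probes them):

```python
# Hedged example of an async endpoint test using httpx's ASGITransport.
import pytest
from httpx import ASGITransport, AsyncClient

from app.main import app


@pytest.mark.asyncio
async def test_health_endpoint():
    transport = ASGITransport(app=app)
    async with AsyncClient(transport=transport, base_url="http://test") as client:
        response = await client.get("/health/")
    assert response.status_code == 200
    assert response.json()["status"] == "healthy"
```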
# Check PostgreSQL status
docker-compose logs postgres
docker-compose exec postgres pg_isready -U fastapi_user
# Verify connection from app
docker-compose exec api python -c "
import asyncio
from sqlalchemy import text
from app.core.database import engine

async def test():
    async with engine.begin() as conn:
        result = await conn.execute(text('SELECT 1'))
        print('DB OK:', result.scalar())

asyncio.run(test())
"
# Check Redis status
docker-compose exec redis redis-cli ping
docker-compose logs redis
# Test from application
docker-compose exec api python -c "
import asyncio
from app.services.cache_service import get_redis_client

async def test():
    client = await get_redis_client()
    await client.ping()
    print('Redis OK')

asyncio.run(test())
"
# Check application logs
docker-compose logs api
# Verify environment variables
docker-compose exec api env | grep -E "(DATABASE_URL|REDIS_URL|SECRET_KEY)"
# Test application directly
docker-compose exec api python -c "from app.main import app; print('App created successfully')"
# Check slow queries (if enabled in init-db.sql)
docker-compose exec postgres psql -U fastapi_user -d fastapi_db -c "
SELECT query, mean_exec_time, calls
FROM pg_stat_statements
ORDER BY mean_exec_time DESC
LIMIT 10;"
# Check connection pool status
docker-compose logs api | grep -i "pool"
# Check Redis info
docker-compose exec redis redis-cli info stats
# Monitor Redis commands
docker-compose exec redis redis-cli monitor
# Check memory usage
docker-compose exec redis redis-cli info memory
- Fork and Clone
  git clone <your-fork-url>
  cd rest-python-api
- Setup Environment
  python -m venv .venv
  source .venv/bin/activate  # Windows: .venv\Scripts\activate
  pip install -r requirements.txt
- Install Pre-commit
  pre-commit install
- Run Tests
  pytest
- Linting: Code must pass Ruff linting (scripts\ruff.ps1 check)
- Formatting: Code must be formatted with Ruff (scripts\ruff.ps1 format)
- Type Hints: All functions must have proper type hints
- Documentation: All public functions must have docstrings
- Tests: New features must include tests
- Create a feature branch (git checkout -b feature/amazing-feature)
- Make changes with proper tests
- Ensure all checks pass (pre-commit run --all-files)
- Commit changes (git commit -m 'Add amazing feature')
- Push to the branch (git push origin feature/amazing-feature)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
- Documentation: Check the /docs endpoint for interactive API documentation
- Health Check: Monitor application status at the /health/ endpoint
- Issues: Create GitHub issues for bugs or feature requests
- Discussions: Use GitHub Discussions for questions and community support
Built with ❤️ using FastAPI, PostgreSQL, Redis, Docker, and modern Python practices