FastAPITemplate is a production-ready boilerplate repository designed to help developers quickly set up a FastAPI project. It provides a structured foundation for building scalable, secure, and maintainable web applications using Python and FastAPI with best practices baked in.
This template is built on FastAPI, a modern Python web framework for building APIs with automatic interactive documentation. It incorporates asynchronous database access, environment-based configuration, Docker containerization, and robust testing capabilities.
- 🚀 FastAPI Integration: Pre-configured FastAPI setup for rapid development
- ⚡ Asynchronous Architecture: Built with async/await for high-performance, non-blocking I/O
- 🧩 Modular Structure: Organized folder structure for easy scalability and maintenance
- 🔧 Environment Configuration: Support for `.env` files to manage environment variables across different environments
- 🗄️ Database Integration: Asynchronous PostgreSQL support with SQLAlchemy
- 🐳 Docker Support: Includes `Dockerfile` and `docker-compose.yml` for containerized deployment
- 🧪 Testing Ready: Pre-configured testing setup using `pytest`
- 🔄 Pre-commit Hooks: Automates code formatting, linting, and testing before committing changes
- 📚 Interactive API Documentation: Swagger UI and Redoc for exploring and testing API endpoints
- 🛢️ Database Management: Adminer for managing development and test databases through a web interface
- 🔄 Continuous Integration (CI): GitHub Actions pipeline for automated testing and building
- 🔒 Security Features: Password strength validation, CORS configuration, secure database connections
- ✅ Type Safety: Leverages Python type hints and Pydantic for data validation
- FastAPITemplate
- Overview
- Key Features
- Table of Contents
- Getting Started
- Project Architecture
- Building Scalable Applications
- Security and Privacy Features
- Environment Configuration
- Docker & Containerization
- Development Workflow
- Testing Approach
- Deployment Best Practices
- Using This Template for Your Project
- Best Practices
- Acknowledgments
- Python 3.8 or higher
- `pip` or `poetry` for dependency management
- Docker (optional, for containerized deployment)
- PostgreSQL (if not using Docker)
- Clone the repository:

  ```bash
  git clone https://github.com/your-username/FastAPITemplate.git
  cd FastAPITemplate
  ```

- Set up environment variables:
  - Copy the `example.env` file to `.env`
  - Configure your environment variables in the `.env` file

- If not using Docker, install dependencies:

  ```bash
  pip install -r backend/requirements.txt
  ```

- Build the Docker images:

  ```bash
  docker-compose build
  ```

- Start the services:

  ```bash
  ENVIRONMENT=DEV docker-compose up
  ```
Access the following services:
- Swagger UI: http://localhost:8000/docs
- ReDoc: http://localhost:8000/redoc
- Dev DB Adminer: http://localhost:8081
- Test DB Adminer: http://localhost:8082
- Stop the services:

  ```bash
  docker-compose down
  ```

- Ensure the Docker containers are running in the STAGING environment:

  ```bash
  ENVIRONMENT=STAGING docker-compose up
  ```

- Run the tests inside the Docker container:

  ```bash
  docker exec -it <container_name> pytest
  ```

- Install pre-commit hooks:

  ```bash
  pre-commit install
  ```

- Run pre-commit manually:

  ```bash
  pre-commit run --all-files
  ```
This repository includes a CI pipeline that runs on GitHub Actions. The pipeline is triggered on every pull request to `trunk`.
If a pull request only changes `.md` files, the pipeline returns a success status without running the tests.
The pipeline performs the following tasks:
- Building Docker images for the application and database services
- Running tests inside the Docker container
To enable the CI pipeline:
- Ensure you have a GitHub repository set up for your project
- Upload the variables of the `.env` file to the GitHub repository variables using the `upload_env_variables.py` script:

  ```bash
  python upload_env_variables.py
  ```
The repository is organized as follows:
```
backend/
├── src/                            # Main application source code
│   ├── main.py                     # Application entry point
│   ├── api/                        # API layer
│   │   ├── endpoints.py            # API endpoint registry
│   │   └── routes/                 # Individual route modules
│   │       └── account_router.py   # Account-related routes
│   ├── config/                     # Configuration modules
│   │   ├── logging.py              # Logging configuration
│   │   └── settings/               # Environment-based settings
│   │       ├── base.py             # Base settings class
│   │       ├── development.py      # Development environment settings
│   │       ├── production.py       # Production environment settings
│   │       ├── setup.py            # Settings initialization
│   │       └── staging.py          # Staging environment settings
│   ├── crud/                       # CRUD operation modules
│   │   └── account_crud.py         # Account CRUD operations
│   ├── models/                     # Data models
│   │   ├── db_tables/              # SQLAlchemy table definitions
│   │   │   ├── account_table.py    # Account table definition
│   │   │   └── table_collection.py # Table registry
│   │   └── schemas/                # Pydantic schemas
│   │       ├── account_schema.py   # Account data schemas
│   │       └── profile_schema.py   # Profile data schemas
│   └── utility/                    # Utility modules
│       ├── database/               # Database utilities
│       ├── events/                 # Event handlers
│       ├── formatters/             # Formatting utilities
│       └── pydantic_schema/        # Base Pydantic schemas
├── tests/                          # Test directory
│   ├── conftest.py                 # Pytest configuration
│   └── router_tests/               # API route tests
├── Dockerfile                      # Docker configuration
└── requirements.txt                # Python dependencies
```
The application is initialized in `main.py` with a modular approach that allows for clean setup of middleware, event handlers, and route registration. The app configuration is loaded from environment-specific settings.
```python
def initialize_application() -> fastapi.FastAPI:
    app = fastapi.FastAPI(**settings.set_backend_app_attributes)

    # CORS middleware
    app.add_middleware(
        CORSMiddleware,
        allow_origins=settings.ALLOWED_ORIGINS,
        allow_credentials=settings.IS_ALLOWED_CREDENTIALS,
        allow_methods=settings.ALLOWED_METHODS,
        allow_headers=settings.ALLOWED_HEADERS,
    )

    # Event handlers
    app.add_event_handler("startup", execute_backend_server_event_handler(app=app))
    app.add_event_handler("shutdown", terminate_backend_server_event_handler(app=app))

    # Router registration
    app.router.include_router(router)

    return app
```
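How the factory is consumed is up to the entry point; a minimal usage sketch (the actual code in `main.py` may differ) could look like this:

```python
# Minimal usage sketch; the template's actual entry point may differ.
import uvicorn

app = initialize_application()

if __name__ == "__main__":
    # In Docker the server is typically started by the container command instead.
    uvicorn.run(app, host="0.0.0.0", port=8000)
```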
The application uses a hierarchical settings system based on Pydantic to manage configuration across different environments. The base settings class (`base.py`) defines common settings, while environment-specific subclasses override values as needed.
Settings are loaded from `.env` files, making it easy to manage environment variables. The system supports multiple environments:
- Development (local development)
- Staging (testing environment)
- Production (live environment)
This approach allows for consistent configuration management with type validation and default values.
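As a rough sketch of this hierarchy, assuming Pydantic v1-style `BaseSettings` (the field names here are illustrative, not the template's actual settings):

```python
# Illustrative settings hierarchy; the real classes live in config/settings/
# and will differ in names and fields.
from typing import List

from pydantic import BaseSettings


class BackendBaseSettings(BaseSettings):
    TITLE: str = "FastAPITemplate"
    DEBUG: bool = False
    ALLOWED_ORIGINS: List[str] = ["http://localhost:3000"]

    class Config:
        env_file = ".env"      # values are read from the .env file
        case_sensitive = True


class BackendDevSettings(BackendBaseSettings):
    DEBUG: bool = True         # development overrides the base default
```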
The database architecture follows a modern, asynchronous approach using SQLAlchemy's async features:
- Database Class: A singleton `Database` class manages the connection to the PostgreSQL database.
- Async Engine & Session: The database uses asynchronous connections for non-blocking I/O.
- Connection Pooling: Configurable connection pooling for efficient resource utilization.
- Session Management: Dependency injection for proper session lifecycle management.
```python
import loguru
from typing import AsyncGenerator
from sqlalchemy.ext.asyncio import AsyncSession


# `db` is the shared Database singleton described above.
async def get_async_session() -> AsyncGenerator[AsyncSession, None]:
    async with db.async_session() as session:
        try:
            yield session
        except Exception as exception:
            loguru.logger.error(f"Exception in async session: {exception}")
            await session.rollback()
            raise
```
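The `Database` singleton referenced above lives in the database utilities; a simplified sketch (not the template's exact implementation) might look like this:

```python
# Illustrative sketch of a singleton-style Database helper; the template's
# actual implementation in utility/database/ may differ.
from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine
from sqlalchemy.orm import sessionmaker


class Database:
    def __init__(self, database_url: str) -> None:
        # Async engine with configurable connection pooling.
        self.async_engine = create_async_engine(
            database_url,  # e.g. postgresql+asyncpg://user:pass@host/db
            pool_size=10,
            max_overflow=20,
        )
        # Factory for the AsyncSession objects used by get_async_session().
        self.async_session = sessionmaker(
            bind=self.async_engine,
            class_=AsyncSession,
            expire_on_commit=False,
        )
```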
The API layer is organized by domain with versioning:
- Base Router (`endpoints.py`): Sets up API versioning with prefixes.
- Domain Routers: Separate router modules for different domains (e.g., `account_router.py`).
- Endpoint Documentation: Each endpoint includes OpenAPI documentation with response models and status codes.

Example endpoint from `account_router.py`:
```python
@router.post(
    path="",
    name="account:create_account",
    response_model=AccountOut,
    status_code=201,
)
async def create_account(
    new_account: AccountInAuthentication,
    db_session: SQLAlchemyAsyncSession = fastapi.Depends(get_async_session),
) -> AccountOut:
    # Implementation...
```
Data validation and serialization are handled through Pydantic schemas:
- Base Schema: Common validation logic in base schema classes.
- Input Schemas: Schemas for data coming into the API (e.g., `AccountInAuthentication`).
- Output Schemas: Schemas for responses returned to clients (e.g., `AccountOut`).
- Validation Logic: Custom validators for business rules (e.g., password strength).
```python
import pydantic
from password_strength import PasswordPolicy


class AccountInAuthentication(AccountBase):
    password: str

    @pydantic.validator("password")
    def password_strength(cls, v):
        # Require at least 8 characters, one uppercase letter,
        # one number, and one special character.
        policy = PasswordPolicy.from_names(
            length=8,
            uppercase=1,
            numbers=1,
            special=1,
        )
        if policy.test(v) != []:
            raise ValueError("Password is not strong enough")
        return v
```
CRUD (Create, Read, Update, Delete) operations are separated into dedicated modules for each entity type, providing a clean interface for database operations:
- Entity-Specific Modules: Each entity has its own CRUD module (e.g., `account_crud.py`).
- Error Handling: Proper error handling with appropriate HTTP exceptions.
- Transaction Management: Automatic transaction management with rollback on error.
Each CRUD function is asynchronous and follows a consistent pattern:
```python
async def create(account: AccountInAuthentication, db_session: SQLAlchemyAsyncSession) -> Account:
    # Implementation with error handling...
```
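As an illustration, such a `create` function could be fleshed out roughly as follows (the duplicate check, field handling, and status codes are assumptions, and password hashing is omitted):

```python
import fastapi
import sqlalchemy
from sqlalchemy.ext.asyncio import AsyncSession as SQLAlchemyAsyncSession


async def create(
    account: AccountInAuthentication, db_session: SQLAlchemyAsyncSession
) -> Account:
    # Reject duplicate usernames before inserting (illustrative check).
    statement = sqlalchemy.select(Account).where(Account.username == account.username)
    result = await db_session.execute(statement)
    if result.scalar_one_or_none() is not None:
        raise fastapi.HTTPException(status_code=409, detail="Account already exists")

    # Field mapping is illustrative; password hashing/storage is omitted here.
    new_account = Account(username=account.username)
    db_session.add(new_account)
    try:
        await db_session.commit()
        await db_session.refresh(new_account)
    except Exception:
        # Roll back the transaction on any failure.
        await db_session.rollback()
        raise
    return new_account
```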
The application uses event handlers for initialization and cleanup operations:
- Server Startup: Database connection initialization, logging setup.
- Server Shutdown: Graceful connection termination and resource cleanup.
Event handlers are registered in the main application factory and modularized for maintainability.
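A rough sketch of the handler factories registered in `initialize_application()` (the template's actual handlers in `utility/events/` may differ):

```python
# Illustrative sketch of the event-handler factories; bodies are assumptions.
import typing

import fastapi
import loguru


def execute_backend_server_event_handler(app: fastapi.FastAPI) -> typing.Callable:
    async def launch_backend_server_events() -> None:
        loguru.logger.info("Starting up ...")
        # e.g. initialize the database connection and create tables here.

    return launch_backend_server_events


def terminate_backend_server_event_handler(app: fastapi.FastAPI) -> typing.Callable:
    async def stop_backend_server_events() -> None:
        loguru.logger.info("Shutting down ...")
        # e.g. dispose of the async engine and release pooled connections here.

    return stop_backend_server_events
```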
The template is built with scalability in mind:
- Asynchronous Architecture: Non-blocking I/O for handling large numbers of concurrent connections.
- Database Connection Pooling: Configurable connection pool size and overflow settings.
- Modular Design: Clean separation of concerns for easier horizontal scaling.
- Docker Containerization: Simplified deployment to container orchestration platforms.
- Environment-Based Configuration: Easy adaptation to different deployment environments.
The application's architecture supports several scaling strategies:
- Vertical Scaling: Configure worker count and resource limits
- Horizontal Scaling: Deploy multiple instances behind a load balancer
- Database Scaling: Connection pooling and potential sharding
- Microservices Evolution: Domain-based routers can evolve into separate microservices
For large scale deployments, consider:
- Load Balancing: Deploy multiple instances behind a load balancer (e.g., Nginx, Traefik)
- Container Orchestration: Use Kubernetes or Docker Swarm for managing multiple containers
- Database Replication: Set up read replicas to distribute database load
- Caching: Implement Redis or Memcached for caching frequently accessed data (a caching sketch follows this list)
- CDN Integration: Use a Content Delivery Network for static assets
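For the caching recommendation above, a sketch using redis-py's asyncio client might look like this (Redis is not part of the template, and `read_account_by_id` is a hypothetical CRUD helper):

```python
# Illustrative caching sketch; connection URL, key format, and TTL are assumptions.
import json

import redis.asyncio as redis

cache = redis.from_url("redis://localhost:6379")


async def get_account_cached(account_id: int, db_session) -> dict:
    key = f"account:{account_id}"
    cached = await cache.get(key)
    if cached is not None:
        return json.loads(cached)  # cache hit

    account = await read_account_by_id(account_id, db_session)  # hypothetical CRUD call
    payload = {"id": account.id, "username": account.username}
    await cache.set(key, json.dumps(payload), ex=60)  # expire after 60 seconds
    return payload
```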
The template incorporates several security features:
- Password Strength Validation: Enforces strong passwords with specific requirements
- CORS Configuration: Prevents cross-origin request forgery by configuring allowed origins, methods, and headers
- Database Connection Security: Secure database connection handling
- Input Validation: Strict schema validation prevents malformed input
- Error Handling: Prevents leaking sensitive information in error responses
To enhance security and privacy further, consider implementing:
- JWT-based Authentication: Secure user authentication using JSON Web Tokens (a minimal sketch follows this list)
- Role-based Access Control: Restrict access to resources based on user roles
- Rate Limiting: Prevent abuse by limiting request rates
- Input Sanitization: Sanitize user inputs to prevent injection attacks
- Data Encryption: Encrypt sensitive data at rest and in transit
- Audit Logging: Log access and changes to sensitive data
- GDPR Compliance: Implement features for user data management and deletion
- HTTPS: Always use HTTPS in production with proper SSL/TLS configuration
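As a starting point for the JWT suggestion above, a minimal sketch using PyJWT could look like this (authentication is not included in the template; the secret handling and claims below are assumptions):

```python
# Illustrative JWT helpers; in real code, load the secret from settings/.env.
import datetime

import jwt

SECRET_KEY = "change-me"
ALGORITHM = "HS256"


def create_access_token(username: str, expires_minutes: int = 30) -> str:
    payload = {
        "sub": username,
        "exp": datetime.datetime.utcnow() + datetime.timedelta(minutes=expires_minutes),
    }
    return jwt.encode(payload, SECRET_KEY, algorithm=ALGORITHM)


def verify_access_token(token: str) -> dict:
    # Raises jwt.ExpiredSignatureError / jwt.InvalidTokenError on failure.
    return jwt.decode(token, SECRET_KEY, algorithms=[ALGORITHM])
```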
Environment variables are managed through `.env` files, with support for different deployment environments:
- Base Settings: Common settings across all environments
- Environment-Specific Override: Development, staging, and production environments
- Configuration Validation: Type checking and validation through Pydantic
Key configuration categories include:
- Server settings (host, port, workers)
- Database connection parameters
- CORS settings
- Logging configuration
- Feature flags
The template includes Docker configuration for containerized deployment:
- Dockerfile: Multi-stage build process for optimized images
- Docker Compose: Local development setup with database and admin services
- Environment Variables: Docker-compatible environment variable management
Services defined in Docker Compose:
- Application server
- PostgreSQL database
- Adminer for database management
The template supports a productive development workflow:
- Local Development: Docker Compose for local development environment
- Hot Reload: Automatic server reloading for code changes
- Testing: Pre-configured testing with pytest
- Pre-commit Hooks: Code formatting, linting, and testing before commits
- API Documentation: Interactive API documentation with Swagger UI and ReDoc
The testing approach covers multiple levels:
- Unit Tests: Testing individual components in isolation
- Integration Tests: Testing component interactions
- API Tests: Testing HTTP endpoints
- Test Database: Separate database for testing to avoid development data pollution
Tests are organized by domain and run automatically in the CI pipeline.
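As an illustration, a route test might look roughly like the following (the template's actual fixtures in `conftest.py`, import paths, and route prefix may differ):

```python
# Illustrative route test; the import path and endpoint prefix are assumptions.
from fastapi.testclient import TestClient

from src.main import initialize_application  # assumed import path

client = TestClient(initialize_application())


def test_create_account_returns_201():
    response = client.post(
        "/api/accounts",  # hypothetical versioned prefix
        json={"username": "alice", "password": "Str0ng!Pass"},
    )
    assert response.status_code == 201
    assert response.json()["username"] == "alice"
```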
For production deployments, follow these best practices:
- Environment Configuration: Use production-specific settings
- Database Security: Use strong passwords, restrict network access
- Worker Configuration: Set appropriate number of workers based on server resources
- Logging: Configure proper logging for monitoring and debugging
- Backup Strategy: Implement regular database backups
- Monitoring: Set up health checks and performance monitoring
- CI/CD Pipeline: Automate deployment process
- Blue-Green Deployment: Minimize downtime with blue-green deployment strategy
To use this template for a new project:
- Clone the Repository: Start with a fresh copy of the template
- Configure Environment: Set up environment variables in the `.env` file
- Define Your Models: Create database models in `models/db_tables/`
- Create Schemas: Define Pydantic schemas in `models/schemas/`
- Implement CRUD: Add CRUD operations in the `crud/` directory
- Create API Routes: Add new routes in `api/routes/`
- Register Routes: Register new routers in `api/endpoints.py` (see the sketch after this list)
- Write Tests: Add tests for new functionality in `tests/`
- Deploy: Use Docker to deploy your application
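For the route-registration step, `api/endpoints.py` might end up looking roughly like this after adding a new domain router (the prefix and import paths are illustrative, not the template's actual code):

```python
# Illustrative sketch of registering routers in api/endpoints.py.
import fastapi

from src.api.routes.account_router import router as account_router
from src.api.routes.item_router import router as item_router  # your new router

router = fastapi.APIRouter(prefix="/api")
router.include_router(account_router)
router.include_router(item_router)
```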
The template encourages several best practices:
- Separation of Concerns: Each module has a specific responsibility
- Dependency Injection: Facilitates testing and component substitution
- Error Handling: Comprehensive error handling with appropriate status codes
- Logging: Structured logging for monitoring and debugging
- Environment-Based Configuration: Different settings for different environments
- Database Migration: Planned support for Alembic migrations
- Testing: Comprehensive test coverage for reliability
- Documentation: Self-documenting API with OpenAPI integration
- Type Safety: Leveraging Python type hints for better code quality
By following these best practices, you can build robust, maintainable web applications that scale efficiently and securely.
- Aeternails Ingenium: For the original FastAPI template that inspired this project.