A comprehensive interview project demonstrating a production-ready microservices architecture using Python, FastAPI, AWS SQS, Lambda, PostgreSQL, and Docker.
```
┌──────────────┐     ┌──────────────┐     ┌──────────────┐
│   FastAPI    │────▶│   AWS SQS    │────▶│  AWS Lambda  │
│   Backend    │     │    Queue     │     │   Function   │
└──────┬───────┘     └──────────────┘     └──────┬───────┘
       │                                         │
       │             ┌──────────────┐            │
       └────────────▶│  PostgreSQL  │◀───────────┘
                     │   Database   │
                     └──────────────┘
```
- FastAPI Backend: RESTful API for task management
- PostgreSQL: Relational database for persistent storage
- AWS SQS: Message queue for asynchronous task processing
- AWS Lambda: Serverless function for processing tasks
- Docker: Containerization for easy deployment
- LocalStack: Local AWS services emulation for development
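In short: the API persists a task to PostgreSQL, then publishes a message to SQS; the Lambda function consumes the message and records the result back in the database. A minimal sketch of the enqueue step (illustrative only; the real client lives in `app/sqs_client.py`, and `SQS_ENDPOINT_URL` is an assumed variable for pointing at LocalStack):

```python
import json
import os

import boto3

# Endpoint override lets the same code talk to LocalStack in development.
sqs = boto3.client(
    "sqs",
    region_name=os.getenv("AWS_REGION", "us-east-1"),
    endpoint_url=os.getenv("SQS_ENDPOINT_URL"),  # e.g. http://localhost:4566
)


def enqueue_task(task_id: int, title: str) -> str:
    """Publish a task to the queue; returns the SQS message ID."""
    response = sqs.send_message(
        QueueUrl=os.environ["SQS_QUEUE_URL"],
        MessageBody=json.dumps({"task_id": task_id, "title": title}),
    )
    return response["MessageId"]
```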
- ✅ RESTful API with FastAPI
- ✅ PostgreSQL database with SQLAlchemy ORM
- ✅ Asynchronous task processing with SQS
- ✅ AWS Lambda function for message processing
- ✅ Docker Compose for local development
- ✅ Pydantic models for data validation (see the schema sketch after this list)
- ✅ Comprehensive test suite
- ✅ Health check endpoints
- ✅ Environment-based configuration
- ✅ Database migrations with Alembic
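For the validation item above, the request/response schemas might look like this (a sketch; the actual definitions live in `app/schemas.py` and may differ):

```python
from datetime import datetime
from typing import Optional

from pydantic import BaseModel


class TaskCreate(BaseModel):
    """Payload accepted by POST /tasks."""

    title: str
    description: Optional[str] = None


class TaskResponse(BaseModel):
    """Shape returned by the task endpoints."""

    id: int
    title: str
    description: Optional[str] = None
    status: str
    created_at: datetime

    class Config:
        orm_mode = True  # Pydantic v1 style; v2 uses model_config = {"from_attributes": True}
```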
- Docker and Docker Compose
- Python 3.11+ (for local development)
- AWS Account (for production deployment)
- Git
```bash
git clone <repository-url>
cd python
```

Create a `.env` file from the example:

```bash
cp .env.example .env
```

Edit `.env` with your configuration:

```
DATABASE_URL=postgresql://postgres:postgres@db:5432/interview_db
AWS_REGION=us-east-1
AWS_ACCESS_KEY_ID=your_access_key
AWS_SECRET_ACCESS_KEY=your_secret_key
SQS_QUEUE_URL=https://sqs.us-east-1.amazonaws.com/123456789/interview-queue
ENVIRONMENT=development
```
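These variables are read by `app/config.py` at startup. A minimal sketch of that pattern using plain `os.getenv` (the real module may differ, e.g. it could use Pydantic settings instead):

```python
import os


class Settings:
    """Central place for environment-based configuration."""

    database_url: str = os.getenv(
        "DATABASE_URL", "postgresql://postgres:postgres@db:5432/interview_db"
    )
    aws_region: str = os.getenv("AWS_REGION", "us-east-1")
    sqs_queue_url: str = os.getenv("SQS_QUEUE_URL", "")
    environment: str = os.getenv("ENVIRONMENT", "development")


settings = Settings()
```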
Start the stack with Docker Compose:

```bash
# Start all services
docker-compose up -d

# View logs
docker-compose logs -f

# Check service status
docker-compose ps
```

Initialize LocalStack's AWS resources:

```bash
# Make the init script executable
chmod +x localstack/init-aws.sh

# The script runs automatically when LocalStack starts
# Or run manually:
docker-compose exec localstack /etc/localstack/init/ready.d/init-aws.sh
```

Once everything is up, the services are reachable at:

- API Documentation: http://localhost:8000/docs
- Alternative API Docs: http://localhost:8000/redoc
- Health Check: http://localhost:8000/health
- PostgreSQL: localhost:5432
- LocalStack: http://localhost:4566
| Method | Endpoint | Description |
|---|---|---|
| GET | `/` | Root endpoint with API info |
| GET | `/health` | Health check |
| POST | `/tasks` | Create a new task |
| GET | `/tasks` | List all tasks (with pagination) |
| GET | `/tasks/{task_id}` | Get a specific task |
| PUT | `/tasks/{task_id}` | Update a task |
| DELETE | `/tasks/{task_id}` | Delete a task |
| POST | `/tasks/{task_id}/process` | Manually trigger task processing |
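To show how a row of this table maps to code, here is a condensed sketch of the create endpoint (not the exact contents of `app/main.py`; `get_db` is the usual session dependency and is assumed):

```python
from fastapi import Depends, FastAPI
from sqlalchemy.orm import Session

from app.database import get_db  # assumed session dependency
from app.models import Task  # SQLAlchemy model
from app.schemas import TaskCreate, TaskResponse  # Pydantic schemas

app = FastAPI()


@app.post("/tasks", response_model=TaskResponse, status_code=201)
def create_task(payload: TaskCreate, db: Session = Depends(get_db)):
    """Persist the task, then hand it off to SQS for async processing."""
    task = Task(title=payload.title, description=payload.description)
    db.add(task)
    db.commit()
    db.refresh(task)
    return task
```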
Create a task:

```bash
curl -X POST "http://localhost:8000/tasks" \
  -H "Content-Type: application/json" \
  -d '{
    "title": "Process user data",
    "description": "Extract and transform user information"
  }'
```

List tasks (with pagination):

```bash
curl "http://localhost:8000/tasks?skip=0&limit=10"
```

Get a single task:

```bash
curl "http://localhost:8000/tasks/1"
```

Update a task:

```bash
curl -X PUT "http://localhost:8000/tasks/1" \
  -H "Content-Type: application/json" \
  -d '{
    "title": "Updated title",
    "status": "completed"
  }'
```

Delete a task:

```bash
curl -X DELETE "http://localhost:8000/tasks/1"
```

Run the test suite:

```bash
# Install dependencies
pip install -r requirements.txt

# Run all tests
pytest

# Run with coverage
pytest --cov=app --cov-report=html

# Run specific test file
pytest tests/test_api.py

# Run with verbose output
pytest -v
```

The test suite includes (a sample test follows the list):
- Unit tests for API endpoints
- Integration tests with database
- Validation tests for Pydantic models
- Error handling tests
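A representative test using FastAPI's `TestClient` (a sketch in the spirit of `tests/test_api.py`; it assumes the create endpoint returns 201):

```python
from fastapi.testclient import TestClient

from app.main import app

client = TestClient(app)


def test_create_task():
    response = client.post(
        "/tasks",
        json={"title": "Test task", "description": "Created from a test"},
    )
    assert response.status_code == 201
    body = response.json()
    assert body["title"] == "Test task"
    assert "id" in body


def test_create_task_requires_title():
    # Pydantic validation should reject a payload missing required fields.
    response = client.post("/tasks", json={"description": "no title"})
    assert response.status_code == 422
```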
Manage schema changes with Alembic:

```bash
# Initialize Alembic (already done)
alembic init alembic

# Create a new migration
alembic revision --autogenerate -m "description"

# Apply migrations
alembic upgrade head

# Rollback migration
alembic downgrade -1

# View migration history
alembic history
```

Inspect the database directly:

```bash
# Connect to PostgreSQL
docker-compose exec db psql -U postgres -d interview_db
```

```
# Common SQL commands
\dt            # List tables
\d tasks       # Describe tasks table
SELECT * FROM tasks;
```
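The `tasks` table corresponds to a SQLAlchemy model along these lines (a sketch; the authoritative definition is `app/models.py`):

```python
from datetime import datetime

from sqlalchemy import Column, DateTime, Integer, String, Text

from app.database import Base  # declarative base, assumed to live in app/database.py


class Task(Base):
    __tablename__ = "tasks"

    id = Column(Integer, primary_key=True, index=True)
    title = Column(String(255), nullable=False)
    description = Column(Text, nullable=True)
    status = Column(String(50), default="pending", nullable=False)
    created_at = Column(DateTime, default=datetime.utcnow)
```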
Project layout:

```
.
├── app/
│   ├── __init__.py
│   ├── main.py             # FastAPI application
│   ├── config.py           # Configuration management
│   ├── database.py         # Database connection
│   ├── models.py           # SQLAlchemy models
│   ├── schemas.py          # Pydantic schemas
│   └── sqs_client.py       # AWS SQS client
├── lambda/
│   ├── lambda_function.py  # Lambda handler
│   ├── requirements.txt    # Lambda dependencies
│   └── Dockerfile          # Lambda container
├── localstack/
│   └── init-aws.sh         # LocalStack initialization
├── tests/
│   ├── __init__.py
│   ├── conftest.py         # Test configuration
│   └── test_api.py         # API tests
├── docker-compose.yml      # Docker services
├── Dockerfile              # API container
├── requirements.txt        # Python dependencies
├── .env.example            # Environment template
├── .gitignore
├── INTERVIEW_QUESTIONS.md  # Interview questions
└── README.md
```
To run the API locally without Docker:

```bash
# Create virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt

# Set environment variables
export DATABASE_URL="postgresql://postgres:postgres@localhost:5432/interview_db"
export AWS_REGION="us-east-1"
# ... other variables

# Run the application
uvicorn app.main:app --reload --host 0.0.0.0 --port 8000
```
Deploy the Lambda function to AWS:

```bash
# Build Lambda Docker image
cd lambda
docker build -t interview-lambda .

# Tag and push to ECR
aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin <account-id>.dkr.ecr.us-east-1.amazonaws.com
docker tag interview-lambda:latest <account-id>.dkr.ecr.us-east-1.amazonaws.com/interview-lambda:latest
docker push <account-id>.dkr.ecr.us-east-1.amazonaws.com/interview-lambda:latest

# Create Lambda function
aws lambda create-function \
  --function-name interview-processor \
  --package-type Image \
  --code ImageUri=<account-id>.dkr.ecr.us-east-1.amazonaws.com/interview-lambda:latest \
  --role arn:aws:iam::<account-id>:role/lambda-execution-role
```
Create the queue and wire it to the Lambda:

```bash
# Create queue
aws sqs create-queue --queue-name interview-queue

# Configure Lambda trigger
aws lambda create-event-source-mapping \
  --function-name interview-processor \
  --event-source-arn arn:aws:sqs:us-east-1:<account-id>:interview-queue \
  --batch-size 10
```
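With the event source mapping in place, Lambda receives up to 10 messages per invocation. A handler sketch (illustrative; the actual code is `lambda/lambda_function.py`):

```python
import json
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)


def lambda_handler(event, context):
    """Process a batch of SQS messages; one record per queued task."""
    for record in event["Records"]:
        body = json.loads(record["body"])
        task_id = body["task_id"]
        logger.info("Processing task %s", task_id)
        # ... do the actual work, then mark the task completed in PostgreSQL.
    return {"processed": len(event["Records"])}
```

Note that with this default setup, an unhandled exception makes the whole batch visible on the queue again, so processing should be idempotent.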
Deploy the API itself:

```bash
# Build and push API image
docker build -t interview-api .
docker tag interview-api:latest <account-id>.dkr.ecr.us-east-1.amazonaws.com/interview-api:latest
docker push <account-id>.dkr.ecr.us-east-1.amazonaws.com/interview-api:latest

# Deploy to ECS (example)
# Create task definition, service, and configure load balancer
```
Provision the production database:

```bash
# Create RDS instance
aws rds create-db-instance \
  --db-instance-identifier interview-db \
  --db-instance-class db.t3.micro \
  --engine postgres \
  --master-username postgres \
  --master-user-password <password> \
  --allocated-storage 20
```

Security checklist for production:

- ✅ Use AWS Secrets Manager for sensitive credentials (see the sketch after this list)
- ✅ Implement JWT authentication for API endpoints
- ✅ Enable HTTPS/TLS in production
- ✅ Use IAM roles instead of access keys
- ✅ Enable VPC for database and Lambda
- ✅ Implement rate limiting
- ✅ Regular security updates
- ✅ Input validation with Pydantic
- ✅ SQL injection prevention with ORM
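For the first item, fetching credentials at startup might look like this (a sketch; the secret name and its JSON layout are assumptions):

```python
import json

import boto3


def get_database_url(secret_id: str = "interview/database") -> str:
    """Read DB credentials from Secrets Manager instead of environment variables."""
    client = boto3.client("secretsmanager", region_name="us-east-1")
    secret = json.loads(client.get_secret_value(SecretId=secret_id)["SecretString"])
    return (
        f"postgresql://{secret['username']}:{secret['password']}"
        f"@{secret['host']}:{secret['port']}/{secret['dbname']}"
    )
```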
Enable basic logging in both components:

```python
# Add to Lambda function
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)
```

```python
# Add to FastAPI
import logging

logging.basicConfig(level=logging.INFO)
```

Key metrics to monitor (a custom-metric sketch follows the list):

- API response times
- Database connection pool
- SQS queue depth
- Lambda execution duration
- Error rates
- Task processing success/failure rates
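Most of these come free from CloudWatch; application-level counters such as task success/failure rates can be published as custom metrics. A sketch (the namespace and metric names here are illustrative):

```python
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")


def record_task_result(succeeded: bool) -> None:
    """Publish a success/failure counter for dashboards and alarms."""
    cloudwatch.put_metric_data(
        Namespace="InterviewProject",
        MetricData=[
            {
                "MetricName": "TaskProcessed",
                "Dimensions": [
                    {"Name": "Outcome", "Value": "success" if succeeded else "failure"}
                ],
                "Unit": "Count",
                "Value": 1.0,
            }
        ],
    )
```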
Database connection issues:

```bash
# Check if database is running
docker-compose ps db

# Check database logs
docker-compose logs db

# Restart database
docker-compose restart db
```

SQS/LocalStack issues:

```bash
# Check LocalStack logs
docker-compose logs localstack

# Verify queue exists
docker-compose exec localstack awslocal sqs list-queues

# Recreate queue
docker-compose exec localstack awslocal sqs create-queue --queue-name interview-queue
```

If the Lambda is not processing messages:

- Check Lambda logs in CloudWatch
- Verify SQS trigger is configured
- Check IAM permissions
- Verify database connectivity from Lambda
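The queue and trigger checks above can be scripted with boto3 (a sketch; point `endpoint_url` at LocalStack when debugging locally):

```python
import boto3

sqs = boto3.client("sqs", endpoint_url="http://localhost:4566")  # LocalStack
lambda_client = boto3.client("lambda")

# Is the queue backing up? A growing backlog means the consumer is stuck.
queue_url = sqs.get_queue_url(QueueName="interview-queue")["QueueUrl"]
attrs = sqs.get_queue_attributes(
    QueueUrl=queue_url, AttributeNames=["ApproximateNumberOfMessages"]
)
print("Messages waiting:", attrs["Attributes"]["ApproximateNumberOfMessages"])

# Is the SQS trigger attached and enabled?
for mapping in lambda_client.list_event_source_mappings(
    FunctionName="interview-processor"
)["EventSourceMappings"]:
    print(mapping["EventSourceArn"], mapping["State"])
```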
If the API container misbehaves:

```bash
# Check logs
docker-compose logs api

# Rebuild container
docker-compose build api
docker-compose up -d api
```

To contribute:

- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
See `INTERVIEW_QUESTIONS.md` for comprehensive interview questions covering:
- Python & FastAPI
- PostgreSQL & SQLAlchemy
- AWS Services (SQS, Lambda)
- Docker & DevOps
- System Design & Architecture
- Practical Coding Tasks
- Debugging Scenarios
This project is created for interview and educational purposes.
- FastAPI Documentation
- SQLAlchemy Documentation
- AWS SQS Documentation
- AWS Lambda Documentation
- PostgreSQL Documentation
- Docker Documentation
- LocalStack Documentation
For questions or feedback about this project, please open an issue in the repository.
Happy Coding!