A scalable distributed task queue system built with FastAPI, Celery, Redis, and PostgreSQL. This system provides reliable asynchronous task execution with monitoring, retry mechanisms, and horizontal scaling capabilities.
- RESTful API - Submit, monitor, and manage tasks via HTTP endpoints
- Asynchronous Processing - Background task execution with Celery workers
- Reliable Message Broker - Redis for task queuing and result storage
- Persistent Storage - PostgreSQL for task metadata and results
- Horizontal Scaling - Multiple worker processes and API instances
- Retry Logic - Configurable retry policies for failed tasks
- Task Monitoring - Real-time task status and queue statistics
- Priority Queues - Support for different task priorities
- Scheduled Tasks - Cron-like scheduling with Celery Beat
- Health Checks - Service health monitoring endpoints
- Containerized - Docker and Docker Compose support
```
┌─────────────────┐    ┌─────────────────┐    ┌─────────────────┐
│   FastAPI API   │    │  Celery Worker  │    │  Celery Worker  │
│                 │    │                 │    │                 │
│ - Task Submit   │    │ - Task Execute  │    │ - Task Execute  │
│ - Task Status   │    │ - Result Store  │    │ - Result Store  │
│ - Monitoring    │    │                 │    │                 │
└────────┬────────┘    └────────┬────────┘    └────────┬────────┘
         │                      │                      │
         └──────────────────────┼──────────────────────┘
                                │
                  ┌─────────────┴───────────┐
                  │      Redis Broker       │
                  │                         │
                  │  - Task Queues          │
                  │  - Result Backend       │
                  │  - Worker Coordination  │
                  └─────────────┬───────────┘
                                │
                  ┌─────────────┴───────────┐
                  │      PostgreSQL DB      │
                  │                         │
                  │  - Task Metadata        │
                  │  - Execution History    │
                  │  - Queue Statistics     │
                  └─────────────────────────┘
```
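This topology maps naturally onto Compose services. The sketch below is illustrative only: the service names, images, and commands are assumptions, and the repository's own docker-compose.yml is authoritative.

```yaml
# Illustrative sketch only - see the repository's docker-compose.yml
services:
  api:
    build: .
    command: uvicorn src.main:app --host 0.0.0.0 --port 8000
    ports:
      - "8000:8000"
    depends_on: [postgres, redis]
  worker:
    build: .
    command: celery -A src.worker.main worker --loglevel=info
    depends_on: [postgres, redis]
  redis:
    image: redis:7
  postgres:
    image: postgres:15
    environment:
      POSTGRES_DB: taskqueue        # assumed database name
      POSTGRES_PASSWORD: example    # placeholder credential
```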
Prerequisites:

- Python 3.11+
- Docker and Docker Compose
- PostgreSQL 15+
- Redis 7+
To get started with Docker:

- Clone and set up the project:

  ```bash
  git clone <repository-url>
  cd distributed-task-queue
  cp .env.example .env
  ```
- Start all services:

  ```bash
  docker-compose up -d
  ```
- Check service health:

  ```bash
  curl http://localhost:8000/health
  ```
- Access the services:

  - API: http://localhost:8000
  - API Documentation: http://localhost:8000/docs
  - Task Monitor (Flower): http://localhost:5555
For local development without Docker:

- Create a virtual environment:

  ```bash
  python -m venv venv
  source venv/bin/activate  # Windows: venv\Scripts\activate
  ```
- Install dependencies:

  ```bash
  pip install -r requirements.txt
  ```
- Set up the environment:

  ```bash
  cp .env.example .env
  # Edit .env with your configuration
  ```
- Start external services:

  ```bash
  docker-compose up -d postgres redis
  ```
- Run database migrations:

  ```bash
  alembic upgrade head
  ```
- Start the services:

  ```bash
  # Terminal 1: API Server
  uvicorn src.main:app --reload --port 8000

  # Terminal 2: Celery Worker
  celery -A src.worker.main worker --loglevel=info

  # Terminal 3: Celery Beat (for scheduled tasks)
  celery -A src.worker.main beat --loglevel=info

  # Terminal 4: Task Monitor (optional)
  celery -A src.worker.main flower
  ```
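Celery Beat runs whatever is defined in the app's `beat_schedule`. A minimal sketch, assuming the Celery app object is created in `src.worker.main` as the commands above suggest; the entry name, task name, and interval are illustrative:

```python
# src/worker/main.py (illustrative sketch; the real module layout may differ)
from celery import Celery

celery_app = Celery(
    "distributed_task_queue",
    broker="redis://localhost:6379/0",   # matches REDIS_URL in .env
    backend="redis://localhost:6379/0",
)

# Cron-like schedule consumed by `celery -A src.worker.main beat`
celery_app.conf.beat_schedule = {
    "heartbeat-every-minute": {
        "task": "example_task",               # assumed registered task name
        "schedule": 60.0,                     # seconds; celery.schedules.crontab also works
        "args": ({"message": "scheduled"},),
    },
}
```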
Submit a task:

```bash
curl -X POST "http://localhost:8000/api/v1/tasks/" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "example_task",
    "payload": {"message": "Hello, World!"},
    "priority": "normal",
    "queue_name": "default"
  }'
```
curl "http://localhost:8000/api/v1/tasks/{task_id}"
curl "http://localhost:8000/api/v1/tasks/?status=pending&limit=10"
curl "http://localhost:8000/api/v1/monitoring/queues"
Run the test suite with pytest:

```bash
# Run all tests
pytest

# Run with coverage
pytest --cov=src

# Run specific test file
pytest tests/test_api/test_tasks.py -v

# Run specific test
pytest -k "test_create_task"
```
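A test in `tests/test_api/test_tasks.py` might look roughly like this; the sketch uses FastAPI's `TestClient`, and the asserted response field is an assumption about the API schema:

```python
# Illustrative sketch; the response shape ("task_id") is assumed
from fastapi.testclient import TestClient

from src.main import app  # the application object served by uvicorn above

client = TestClient(app)

def test_create_task():
    response = client.post("/api/v1/tasks/", json={
        "name": "example_task",
        "payload": {"message": "hi"},
        "priority": "normal",
        "queue_name": "default",
    })
    assert response.status_code in (200, 201)
    assert "task_id" in response.json()  # assumed response field
```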
Code quality checks:

```bash
# Format code
black src/ tests/

# Sort imports
isort src/ tests/

# Lint code
flake8 src/ tests/

# Type checking
mypy src/
```
Database migrations:

```bash
# Create new migration
alembic revision --autogenerate -m "Add new field"

# Apply migrations
alembic upgrade head

# Downgrade migration
alembic downgrade -1
```
To add a new task type:

- Create a task handler in `src/worker/task_handlers.py`
- Register the task with Celery in `src/worker/main.py`
- Add a task schema in `src/schemas/task.py`
- Update the API endpoint if needed
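The first two steps might look like the following sketch; the decorator arguments shown (retry count, delay) are illustrative stand-ins for the retry policy configured via the environment variables below:

```python
# src/worker/task_handlers.py (illustrative sketch)
from src.worker.main import celery_app  # assumed name of the Celery app instance

@celery_app.task(
    name="example_task",
    bind=True,               # exposes `self` so the task can retry itself
    max_retries=3,           # illustrative; see MAX_RETRY_ATTEMPTS below
    default_retry_delay=30,  # seconds to wait between retry attempts
)
def example_task(self, payload: dict) -> dict:
    try:
        # Do the actual work with the submitted payload here
        return {"status": "ok", "echo": payload.get("message")}
    except Exception as exc:
        # Re-enqueue the task until max_retries is exhausted
        raise self.retry(exc=exc)
```

Defining the task with the app's decorator typically registers it as well, so step two usually amounts to making sure `src/worker/main.py` imports the handler module.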
Key configuration options in `.env`:

- `DATABASE_URL` - PostgreSQL connection string
- `REDIS_URL` - Redis connection string
- `API_PORT` - API server port (default: 8000)
- `WORKER_CONCURRENCY` - Number of worker processes
- `DEFAULT_TASK_TIMEOUT` - Task execution timeout
- `MAX_RETRY_ATTEMPTS` - Maximum retry attempts for failed tasks
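For orientation, a filled-in `.env` might look like this; every value below is a placeholder except the documented default port:

```env
# Illustrative values only - adjust for your environment
DATABASE_URL=postgresql://taskqueue:taskqueue@localhost:5432/taskqueue
REDIS_URL=redis://localhost:6379/0
API_PORT=8000
WORKER_CONCURRENCY=4
DEFAULT_TASK_TIMEOUT=300
MAX_RETRY_ATTEMPTS=3
```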
Health check endpoints:

- API Health: `GET /health`
- Database Health: `GET /health/db`
- Redis Health: `GET /health/redis`
The system provides metrics for monitoring:
- Task throughput and latency
- Queue depths and processing rates
- Worker utilization and status
- Error rates and retry statistics
Access metrics at: `GET /api/v1/monitoring/metrics`
Use Celery Flower for real-time task monitoring:
- Web UI: http://localhost:5555
- Worker status, task history, and performance metrics
For production deployment:

- Use environment-specific `.env` files
- Set up proper logging and monitoring
- Configure load balancing for API instances
- Set up database backups and replication
- Monitor resource usage and scale workers accordingly
- Implement proper security measures (authentication, HTTPS)
To contribute:

- Fork the repository
- Create a feature branch
- Make your changes
- Add tests for new functionality
- Run the test suite and ensure all tests pass
- Submit a pull request
This project is licensed under the MIT License - see the LICENSE file for details.