A distributed job queue system built from scratch with Python, Redis, PostgreSQL, and Docker.
Inspired by how tools like Celery, Sidekiq, and AWS SQS work under the hood.
- Clients submit tasks via a REST API
- Jobs are stored in PostgreSQL and queued in Redis by priority
- Worker processes pull jobs asynchronously and execute them
- Results and errors are persisted back to PostgreSQL
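The flow above can be sketched end to end with in-memory stand-ins for Redis and PostgreSQL (names like `submit` and `worker_tick` are illustrative, not the project's actual API):

```python
import heapq
import itertools

# In-memory stand-ins (hypothetical sketch): a heap plays the role of the
# Redis priority queue, a dict plays the role of the PostgreSQL results table.
_counter = itertools.count()   # tie-breaker so equal priorities stay FIFO
queue = []                     # heap of (priority, seq, job)
results = {}                   # job_id -> outcome

def submit(job_id, task_type, payload, priority=10):
    """What the API does on POST /jobs: enqueue the job by priority."""
    job = {"id": job_id, "task_type": task_type, "payload": payload}
    heapq.heappush(queue, (priority, next(_counter), job))

def worker_tick(handlers):
    """One worker iteration: pop the highest-priority job, run it,
    and persist either the result or the error."""
    if not queue:
        return None
    _, _, job = heapq.heappop(queue)
    try:
        out = handlers[job["task_type"]](job["payload"])
        results[job["id"]] = {"status": "done", "result": out}
    except Exception as exc:
        results[job["id"]] = {"status": "failed", "error": str(exc)}
    return job["id"]

# Usage: a lower number means a higher priority, so "b" runs first.
submit("a", "echo", {"msg": "low"}, priority=5)
submit("b", "echo", {"msg": "urgent"}, priority=1)
worker_tick({"echo": lambda p: p["msg"]})
```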
| Task | Description |
|---|---|
| `resize_image` | Resize an image to target dimensions using Pillow |
| `send_email` | Send transactional email via SMTP |
| `ml_inference` | Run sentiment analysis / classification inference |
| `generate_report` | Generate sales or analytics reports in JSON or CSV |
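A worker typically routes each job to its handler through a dispatch table keyed by task type. A minimal sketch, with hypothetical handler bodies standing in for the real Pillow/SMTP logic:

```python
# Hypothetical dispatch table; function names and payload fields are
# illustrative, not the project's actual handler signatures.

def resize_image(payload):
    # Real handler would use Pillow, e.g. Image.open(...).resize((w, h))
    w, h = payload["width"], payload["height"]
    return {"resized_to": [w, h]}

def send_email(payload):
    # Real handler would speak SMTP via smtplib
    return {"sent_to": payload["to"]}

HANDLERS = {
    "resize_image": resize_image,
    "send_email": send_email,
    # "ml_inference": ...,  "generate_report": ... would register the same way
}

def dispatch(task_type, payload):
    """Look up and run the handler for a task type; unknown types fail fast."""
    try:
        handler = HANDLERS[task_type]
    except KeyError:
        raise ValueError(f"unknown task_type: {task_type}")
    return handler(payload)
```

Keeping the mapping in one table means adding a task type is a one-line registration rather than a change to the worker loop.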
```
Client → FastAPI → Redis (priority queue) → Worker → PostgreSQL
                                                   ↑
                                           (persists results)
```
- Python — core language
- FastAPI — REST API
- Redis — priority job queue (sorted sets)
- PostgreSQL — job persistence and history
- Docker Compose — orchestration
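The priority queue rests on Redis sorted sets: members are kept ordered by score, so enqueuing with `ZADD` (score = priority) and dequeuing with `ZPOPMIN` yields priority ordering. A minimal in-memory emulation of just those two commands:

```python
# Emulates the two sorted-set commands the queue relies on. The command
# names are Redis's; the class itself is an illustrative stand-in, not
# the project's broker.

class MiniSortedSet:
    def __init__(self):
        self._scores = {}  # member -> score

    def zadd(self, member, score):
        """Insert or update a member with the given score (priority)."""
        self._scores[member] = score

    def zpopmin(self):
        """Remove and return the (member, score) with the lowest score.
        Ties break lexicographically by member, as in Redis."""
        if not self._scores:
            return None
        member = min(self._scores, key=lambda m: (self._scores[m], m))
        return member, self._scores.pop(member)

q = MiniSortedSet()
q.zadd("job:42", score=5)   # normal priority
q.zadd("job:43", score=1)   # urgent: lower score pops first
```

With the real `redis-py` client the same pattern is roughly `r.zadd("jobs", {"job:42": 5})` followed by `r.zpopmin("jobs")`.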
```bash
docker compose up --build
```

This starts all four services: Postgres, Redis, API, and Worker.
API available at: http://localhost:8000
Interactive docs at: http://localhost:8000/docs
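For orientation, the compose file wires up roughly these four services. This is a hypothetical sketch: image tags, build contexts, and credentials below are illustrative, not the contents of the project's actual docker-compose.yml.

```yaml
services:
  postgres:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: dev-only   # local development only
  redis:
    image: redis:7
  api:
    build: {context: ., dockerfile: docker/Dockerfile.api}
    ports: ["8000:8000"]
    depends_on: [postgres, redis]
  worker:
    build: {context: ., dockerfile: docker/Dockerfile.worker}
    depends_on: [postgres, redis]
```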
| Method | Path | Description |
|---|---|---|
| POST | /jobs | Submit a new job |
| GET | /jobs | List all jobs |
| GET | /jobs/{id} | Get job status and result |
| GET | /stats | Queue depth and stats |
```bash
curl -X POST http://localhost:8000/jobs \
  -H "Content-Type: application/json" \
  -d '{
    "task_type": "ml_inference",
    "payload": {
      "model_name": "sentiment",
      "input_text": "This project was a great learning experience!"
    }
  }'
```

```
api/      REST API (FastAPI)
shared/   Shared models, Redis broker, PostgreSQL manager
worker/   Worker process and task handlers
docker/   Dockerfiles for API and worker
```
Database credentials in docker-compose.yml are for local development only.