Task Queue System

A production-grade distributed task queue and job scheduler microservice built with Python, FastAPI, Celery, PostgreSQL, and Redis.

Features

  • Async Task Processing: Distribute heavy workloads to background workers
  • Job Persistence: Store task history and results in PostgreSQL
  • Message Queue: Redis-backed Celery for reliable task distribution
  • REST API: Complete API for task management and monitoring
  • Health Checks: Built-in health monitoring for database and Redis
  • Retry Logic: Automatic task retries with exponential backoff
  • Priority Levels: Support for multiple task priority levels
  • Docker Support: Docker and Docker Compose for easy deployment
  • Comprehensive Tests: Unit and integration tests with pytest
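
The retry behaviour listed above typically means doubling the delay between attempts up to a cap. A minimal sketch of that delay calculation (the function name, base delay, and cap are illustrative, not taken from this repository):

```python
def retry_delay(attempt: int, base: float = 1.0, cap: float = 60.0) -> float:
    """Return the delay in seconds before retry number `attempt`.

    Doubles the base delay with each attempt, capped at `cap` seconds.
    """
    return min(base * (2 ** attempt), cap)

# Early retries back off quickly; later ones hit the cap:
# attempt 0 -> 1s, attempt 1 -> 2s, attempt 2 -> 4s, attempt 6 -> 60s (capped)
```

Celery can apply the same policy declaratively via its task retry options; the sketch just shows the arithmetic behind "exponential backoff".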

Tech Stack

  • Framework: FastAPI
  • Task Queue: Celery with Redis
  • Database: PostgreSQL with SQLAlchemy ORM
  • Validation: Pydantic
  • Testing: Pytest
  • Containerization: Docker & Docker Compose

Project Structure

task-queue-system/
├── app/
│   ├── api/                 # API endpoints
│   │   ├── health.py        # Health check endpoints
│   │   └── tasks.py         # Task CRUD endpoints
│   ├── core/                # Core configuration
│   │   ├── config.py        # Settings management
│   │   └── database.py      # Database configuration
│   ├── models/              # SQLAlchemy models
│   │   └── task.py          # Task model
│   ├── schemas/             # Pydantic schemas
│   │   └── task.py          # Request/response schemas
│   ├── services/            # Business logic
│   │   └── task_service.py  # Task operations
│   ├── workers/             # Celery workers
│   │   └── celery_app.py    # Celery configuration
│   └── main.py              # FastAPI application
├── tests/                   # Test suite
│   ├── unit/                # Unit tests
│   └── integration/         # Integration tests
├── requirements.txt         # Python dependencies
├── docker-compose.yml       # Docker Compose configuration
├── Dockerfile               # Docker image
└── README.md                # This file

Quick Start

Prerequisites

  • Docker & Docker Compose (recommended)
  • Python 3.11+
  • PostgreSQL
  • Redis

Using Docker Compose

# Copy environment configuration
cp .env.example .env

# Start all services
docker-compose up -d

# Create database tables
docker-compose exec api python -c "from app.core.database import Base, engine; Base.metadata.create_all(bind=engine)"

# Access API
curl http://localhost:8000/health

Local Development

# Create virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt

# Set up environment
cp .env.example .env

# Create database tables
python -c "from app.core.database import Base, engine; Base.metadata.create_all(bind=engine)"

# Start API server
uvicorn app.main:app --reload

# In another terminal, start Celery worker
celery -A app.workers.celery_app worker --loglevel=info

API Endpoints

Health Check

  • GET /health - Check application health status

Tasks

  • POST /api/v1/tasks - Create a new task
  • GET /api/v1/tasks - List tasks (with filtering)
  • GET /api/v1/tasks/{task_id} - Get task details
  • POST /api/v1/tasks/{task_id}/cancel - Cancel a task
  • DELETE /api/v1/tasks/{task_id} - Delete a task

Task Creation Example

curl -X POST http://localhost:8000/api/v1/tasks \
  -H "Content-Type: application/json" \
  -d '{
    "name": "data_processing",
    "description": "Process user records",
    "payload": {
      "user_ids": [1, 2, 3],
      "operation": "export"
    },
    "priority": "high",
    "max_retries": 3
  }'
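
The same request can be issued from Python using only the standard library (the URL and body match the curl example above; nothing beyond the stdlib is assumed):

```python
import json
from urllib import request

body = {
    "name": "data_processing",
    "description": "Process user records",
    "payload": {"user_ids": [1, 2, 3], "operation": "export"},
    "priority": "high",
    "max_retries": 3,
}

req = request.Request(
    "http://localhost:8000/api/v1/tasks",
    data=json.dumps(body).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Requires the API to be running:
# with request.urlopen(req) as resp:
#     task = json.load(resp)
```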

Task Status

  • pending - Task created, waiting to be processed
  • processing - Task is being executed
  • completed - Task finished successfully
  • failed - Task execution failed
  • cancelled - Task was cancelled
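
These states map naturally onto a small string-backed enum. A hedged sketch of how the model might define them (the actual names in app/models/task.py may differ):

```python
from enum import Enum

class TaskStatus(str, Enum):
    """Lifecycle states a task moves through."""
    PENDING = "pending"        # created, waiting to be processed
    PROCESSING = "processing"  # being executed by a worker
    COMPLETED = "completed"    # finished successfully
    FAILED = "failed"          # execution failed (after retries)
    CANCELLED = "cancelled"    # cancelled before completion

# A str-backed enum compares equal to its raw value, which keeps
# API payloads and database columns simple: TaskStatus.PENDING == "pending"
```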

Running Tests

# Run all tests
pytest

# Run with coverage
pytest --cov=app tests/

# Run specific test file
pytest tests/unit/test_task_service.py

# Run integration tests only
pytest tests/integration/

Configuration

Edit the .env file to customize:

  • DATABASE_URL - PostgreSQL connection string
  • REDIS_URL - Redis connection URL
  • API_HOST / API_PORT - API server address
  • LOG_LEVEL - Logging verbosity
  • DEBUG - Development mode
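
Settings like these are typically resolved from the environment once at startup. A minimal stdlib sketch of the idea (the real app/core/config.py uses Pydantic for settings management, so the names and defaults here are illustrative):

```python
import os
from dataclasses import dataclass

@dataclass(frozen=True)
class Settings:
    """Application settings resolved from environment variables."""
    database_url: str
    redis_url: str
    log_level: str = "INFO"
    debug: bool = False

def load_settings() -> Settings:
    """Read settings from the environment, falling back to local defaults."""
    return Settings(
        database_url=os.environ.get("DATABASE_URL", "postgresql://localhost/tasks"),
        redis_url=os.environ.get("REDIS_URL", "redis://localhost:6379/0"),
        log_level=os.environ.get("LOG_LEVEL", "INFO"),
        debug=os.environ.get("DEBUG", "false").lower() == "true",
    )
```

Freezing the dataclass mirrors the usual practice of treating configuration as immutable after startup.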

Architecture

See ARCHITECTURE.md for detailed architecture documentation.

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes and add tests
  4. Ensure tests pass: pytest
  5. Format code: black . && isort .
  6. Submit a pull request

License

MIT License - see LICENSE file for details

Support

For issues and questions, please open an issue on GitHub.
