Job Processing API & Worker

A lightweight Python FastAPI application with a background worker that processes jobs asynchronously. Everything’s Dockerized, non-root, and ready for a production-ish environment — because, let’s be honest, who wants to deploy sloppy containers?

This project demonstrates:

  • API endpoints to create and track jobs

  • A worker that pulls jobs from Postgres and processes them

  • Logging for both API and worker

  • Healthchecks for production reliability

  • Docker Compose orchestration with non-root containers

  • Image vulnerability scanning (Trivy)

  • Versioned container images for deployment

Table of Contents

  • Features

  • Architecture

  • Setup

  • Usage

  • Docker & Deployment

  • Logging & Healthchecks

  • Security Notes

  • Future Improvements

FEATURES

FastAPI endpoints:

POST /jobs → submit a new job

GET /jobs/{job_id} → check job status

GET /health → API + DB healthcheck
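
A minimal sketch of those three endpoints, using an in-memory dict instead of Postgres purely to show the request/response shapes. The real app persists jobs in the DB; names here are illustrative, not the repo's exact code.

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

# Stand-in for the Postgres jobs table, keyed by job id.
jobs: dict[int, dict] = {}

class JobIn(BaseModel):
    payload: str

@app.post("/jobs")
def create_job(job: JobIn):
    job_id = len(jobs) + 1
    jobs[job_id] = {"id": job_id, "payload": job.payload, "status": "pending"}
    return jobs[job_id]

@app.get("/jobs/{job_id}")
def get_job(job_id: int):
    if job_id not in jobs:
        raise HTTPException(status_code=404, detail="job not found")
    return jobs[job_id]

@app.get("/health")
def health():
    # The real endpoint also pings the DB; see Logging & Healthchecks.
    return {"status": "ok"}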

Background worker:

Continuously polls the DB for pending jobs

Marks them processing → completed

Logs every step for easy debugging
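
A sketch of what that polling loop can look like, assuming a jobs table like the one shown under Architecture and a DATABASE_URL environment variable. The FOR UPDATE SKIP LOCKED claim is one safe way to avoid two workers grabbing the same row, not necessarily the repo's approach.

import os
import time
import logging

import psycopg2

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("worker")

def main() -> None:
    conn = psycopg2.connect(os.environ["DATABASE_URL"])
    conn.autocommit = True
    log.info("Worker started")
    while True:
        with conn.cursor() as cur:
            # Atomically claim one pending job and mark it processing.
            cur.execute(
                """
                UPDATE jobs SET status = 'processing'
                WHERE id = (SELECT id FROM jobs WHERE status = 'pending'
                            ORDER BY id FOR UPDATE SKIP LOCKED LIMIT 1)
                RETURNING id
                """
            )
            row = cur.fetchone()
        if row is None:
            time.sleep(2)  # nothing pending; back off briefly before polling again
            continue
        job_id = row[0]
        log.info("Processing job %s", job_id)
        # ... the actual work happens here ...
        with conn.cursor() as cur:
            cur.execute("UPDATE jobs SET status = 'completed' WHERE id = %s", (job_id,))
        log.info("Completed job %s", job_id)

if __name__ == "__main__":
    main()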

Dockerized non-root containers:

API and worker run as appuser, not root

Isolated from host OS, safer for production

Persistent Postgres storage:

DB stored in Docker volume

Worker and API communicate via DB, decoupled

Healthchecks:

API /health endpoint

Worker heartbeat file for liveness

ARCHITECTURE

Think of it like a small factory:

API → intake department. Takes requests, stores jobs in the DB.

Postgres DB → warehouse. Holds jobs, tracks status.

Worker → processing department. Picks up pending jobs, does the work, updates status.

No direct API-to-worker calls. They don’t chat; they just meet in the DB like coworkers passing notes. Decoupled, simple, scalable.

Client --> API --> Postgres <-- Worker
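
To make that handoff concrete, here is one plausible shape for the shared jobs table, as a SQLAlchemy model. Column names and types are assumptions based on this README, not the repo's actual schema.

from sqlalchemy import Integer, String, Text
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column

class Base(DeclarativeBase):
    pass

class Job(Base):
    __tablename__ = "jobs"

    id: Mapped[int] = mapped_column(Integer, primary_key=True)  # job id returned by POST /jobs
    payload: Mapped[str] = mapped_column(Text)                  # whatever the client submitted
    status: Mapped[str] = mapped_column(String(20), default="pending")  # pending -> processing -> completed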

SETUP

Requirements

Docker ≥ 24

Docker Compose ≥ 2

Python ≥ 3.12 (if running locally)

Steps

Clone the repo

git clone https://github.com/DaniAzure29/job-processing-api.git
cd job-processing-api

Copy .env.example → .env and fill in credentials

Build and start containers

docker-compose build
docker-compose up -d

Check running containers

docker ps

USAGE

Test the API

Healthcheck:

curl http://localhost:8000/health

Submit job (JSON payload):

curl -X POST http://localhost:8000/jobs \
  -H "Content-Type: application/json" \
  -d '{"payload": "Do something important"}'

Check job status:

curl http://localhost:8000/jobs/1
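
The response is a small JSON object; with the schema sketched above it would look something like {"id": 1, "payload": "Do something important", "status": "completed"} (field names are assumptions, not the repo's exact contract).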

Worker

Check logs:

docker-compose logs -f worker

You should see:

Worker started
Processing job 1
Completed job 1

DOCKER AND DEPLOYMENT

Docker Compose handles multi-container setup: API, Worker, Postgres

Non-root containers for security (appuser)

Healthchecks for API and worker (via heartbeat)

Versioned images:

docker tag job-processing-api_api:latest your-dockerhub-username/job-api:v1.0
docker push your-dockerhub-username/job-api:v1.0

Do the same for the worker image.

Using versioned images lets you roll back, deploy to multiple environments, and avoid “latest” nightmares. Trust me, you don’t want a broken “latest” in production.

LOGGING AND HEALTHCHECKS

Logging:

API: logger.info("API startup...")

Worker: logs every job processing step

Simple, lightweight, and enough to debug with; no need to turn it into an ELK nightmare
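
A minimal setup of that kind (the format string here is an assumption, not the repo's exact configuration):

import logging

logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s %(name)s %(levelname)s %(message)s",
)
logger = logging.getLogger("api")
logger.info("API startup...")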

Healthchecks:

API: /health → checks DB connection

Worker: writes heartbeat file every few seconds (/tmp/worker_heartbeat.txt)

Not fancy, but enough to integrate with Docker/K8s liveness probes.
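
A sketch of that heartbeat mechanism, with a companion liveness check a Docker/K8s probe could run. The 30-second staleness threshold is an assumption.

import sys
import time
from pathlib import Path

HEARTBEAT = Path("/tmp/worker_heartbeat.txt")

def beat() -> None:
    # Called once per worker loop iteration: overwrite the file with "now".
    HEARTBEAT.write_text(str(time.time()))

def check() -> None:
    # Run as the liveness probe: exit 0 if the heartbeat is fresh, 1 otherwise.
    try:
        age = time.time() - float(HEARTBEAT.read_text())
    except (FileNotFoundError, ValueError):
        sys.exit(1)
    sys.exit(0 if age < 30 else 1)

if __name__ == "__main__":
    check()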

SECURITY NOTES

Containers run as a non-root user → limits system-level damage if a container is compromised

Python packages installed system-wide and read-only → prevents runtime tampering

Secrets should not be baked into images

DB credentials and sensitive info should be injected via .env or a secrets manager

Image scanning done with Trivy → ensures you’re not shipping vulnerable base images

Non-root isn’t magic: if your app logic is hacked, an attacker can still see jobs and secrets inside the container. But it confines the damage to the container, not your host.

FUTURE IMPROVEMENTS

Add a Redis or RabbitMQ queue instead of DB polling for higher throughput

CI/CD workflow for automated build, scan, and push

Metrics + Prometheus/Grafana integration

Retry/backoff logic for failed jobs

Better API validation & authentication

Bottom line:

This repo is small but production-minded. It’s simple, decoupled, and demonstrates best practices in Docker, FastAPI, background workers, and security hygiene — without over-engineering.
