This repository contains a full-stack event-driven data processing pipeline.
- API: FastAPI service that accepts incoming data jobs and returns job status.
- Queue: Redis list used as a message broker between the API and Worker.
- Worker: Python background worker that picks up jobs from the queue, processes them, and saves the results back to Redis.
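The queue contract between the API and the Worker can be sketched as follows. The key names (`job_queue`, `job:<id>`) and the JSON job shape are illustrative assumptions, and a plain Python list/dict stands in for Redis so the sketch is self-contained:

```python
import json
import uuid

# In-memory stand-ins for Redis; the key names below are illustrative
# assumptions, not necessarily what the real services use.
queue = []   # Redis list "job_queue": API LPUSHes, worker RPOPs
store = {}   # Redis keys "job:<id>" holding status/result JSON

def submit_job(data):
    """Roughly what the API does on POST /process: enqueue a job, return its id."""
    job_id = str(uuid.uuid4())
    store[f"job:{job_id}"] = json.dumps({"status": "queued", "result": None})
    queue.insert(0, json.dumps({"job_id": job_id, "data": data}))  # LPUSH
    return job_id

def worker_step():
    """One iteration of the worker loop: pop a job, process it, save the result."""
    if not queue:
        return None
    job = json.loads(queue.pop())  # RPOP: oldest job first
    result = job["data"].upper()   # placeholder for the real processing
    store[f"job:{job['job_id']}"] = json.dumps({"status": "done", "result": result})
    return job["job_id"]

job_id = submit_job("test payload")
worker_step()
print(json.loads(store[f"job:{job_id}"]))  # {'status': 'done', 'result': 'TEST PAYLOAD'}
```

In the real pipeline the worker would block on the queue (e.g. `BRPOP`) instead of polling an in-memory list.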
- Ensure Docker Desktop is running.
- Run `docker-compose up --build -d` to start the Redis, API, and Worker containers.
- The API will be available at http://localhost:8000.
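The compose file that wires the three containers together would look roughly like this; the service names, build paths (`./api`, `./worker`), and image tag are assumptions about this repository's layout:

```yaml
# Illustrative docker-compose.yml sketch, not the repository's actual file.
version: "3.8"
services:
  redis:
    image: redis:7-alpine
  api:
    build: ./api          # assumed path to the FastAPI service
    ports:
      - "8000:8000"
    depends_on:
      - redis
  worker:
    build: ./worker       # assumed path to the background worker
    depends_on:
      - redis
```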
- Submit a job:

  ```bash
  curl -X POST http://localhost:8000/process -H "Content-Type: application/json" -d '{"data": "test payload"}'
  ```

  This returns a `job_id`.
- Check job status:

  ```bash
  curl http://localhost:8000/status/<job_id>
  ```

To run the pipeline integration test without Docker (using fakeredis):
```bash
pip install -r api/requirements.txt
pip install -r worker/requirements.txt
pip install fakeredis httpx pytest
python test_integration.py
```

This repository includes a Terraform script to deploy the stack to an AWS EC2 instance.
- Navigate to the `terraform` directory.
- Run `terraform init`.
- Run `terraform apply` to provision the VM and install Docker.
- SSH into the VM and run the `docker-compose` command.
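The core of such a Terraform script is a single EC2 instance with a bootstrap script; the region, instance type, tag, and script name below are illustrative assumptions, not the repository's actual configuration:

```hcl
# Illustrative sketch only; see the terraform/ directory for the real script.
provider "aws" {
  region = "us-east-1" # adjust to your region
}

resource "aws_instance" "pipeline" {
  ami           = "ami-..."                  # an AMI id for your region
  instance_type = "t3.small"                 # assumed size
  user_data     = file("install_docker.sh")  # hypothetical Docker bootstrap script

  tags = {
    Name = "event-pipeline"
  }
}
```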