Spread is a microservices-based financial modeling platform that pairs a FastAPI web surface with a background agent worker that builds spreadsheet-ready models, persists results, and evaluates outputs. The system leans on Redis for queues, PostgreSQL for persistence, and MinIO/OneDrive for storing generated workbooks.
```
services/
  rest/           FastAPI app, DB models, and Alembic migrations
  agent_worker/   RQ-powered background worker, tools, and DSL tests
  evals/          Batch evaluation harness for generated models
  shared/         Cross-service helpers for spreadsheets, storage, and artifacts
  misc/           Scratch files and exploratory notebooks
docker-compose.yml             Local orchestration for the REST API + worker
pyrightconfig.json, ruff.toml  Repo-wide static analysis config
```
Each service folder contains a more detailed README with service-specific commands.
Prerequisites:

- Python 3.11 (matches the base images used by `services/rest` and `services/agent_worker`)
- Docker + Docker Compose for running the full stack locally
- Redis and PostgreSQL instances (Docker or managed)
- Optional: micromamba/conda for the `spread` environment mentioned in the service docs
Create a .env file in the project root (or export the variables in your shell) with at least:
```
DATABASE_URL=postgresql://...
REDIS_URL=redis://...
MINIO_*   (endpoint, credentials)
CLERK_*   (auth)
<LLM provider keys: OPENAI_API_KEY, ANTHROPIC_API_KEY, etc.>
```
The committed `.env` shows the required keys but contains placeholder values that you should replace with your own secrets. Never commit real credentials.
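If you export variables in your shell instead of editing `.env`, a local-development setup might look roughly like the sketch below. The hostnames, ports, database name, and credentials are illustrative placeholders, and the exact `MINIO_*`/`CLERK_*` key names should be copied from the committed `.env`:

```
# Illustrative local-development values (not the repo's actual defaults)
export DATABASE_URL=postgresql://spread:spread@localhost:5432/spread
export REDIS_URL=redis://localhost:6379/0
export OPENAI_API_KEY=sk-...
export ANTHROPIC_API_KEY=sk-ant-...
# MINIO_* and CLERK_* keys: copy the exact names from the committed .env
```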
Install dependencies for each service into a local virtualenv:

```
python -m venv .venv
source .venv/bin/activate
pip install -r services/rest/requirements.txt
pip install -r services/agent_worker/requirements.txt
pip install -r services/evals/requirements.txt
```

Pyright and Ruff pick up their config from the repo root.
To run the full stack with Docker Compose:

```
docker compose up --build
```

This builds and starts the FastAPI app on `localhost:8000` and the agent worker. `REDIS_URL` and `DATABASE_URL` are read from your `.env`.
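As a quick smoke check once the containers are up, you can hit the interactive API docs (this assumes the app keeps FastAPI's default `/docs` route enabled):

```
# Should return HTTP 200 and the Swagger UI page
curl -i http://localhost:8000/docs
```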
To run the REST API locally without Docker:

```
uvicorn services.rest.app.main:app --reload --port 8000
```

The worker consumes Redis jobs to build spreadsheets:

```
python -m services.agent_worker.enqueue        # enqueue a demo job
python -m services.agent_worker.core.worker    # run the worker (inside the repo virtualenv)
```

Container workflow:
```
docker buildx build --platform linux/amd64 -t spread_agent_worker -f services/agent_worker/Dockerfile .
docker run spread_agent_worker
```
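With a worker running (locally or in a container), the `rq` command-line tool that ships with the RQ package can list queues, queued jobs, and active workers; the queue names it reports depend on how the worker is configured:

```
# Inspect RQ queues and workers on the Redis instance the worker uses
rq info -u "$REDIS_URL"
```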
To run the batch evaluation harness:

```
export REDIS_URL=redis://localhost:6379
python -m services.evals.main services/evals/inputs/evalset_1.csv --minio
```

Database migrations are managed with Alembic:

```
micromamba activate spread                  # optional, if you use the shared env
cd services/rest
alembic revision -m "short message"
alembic upgrade head                        # development DB
alembic -c alembic.prod.ini upgrade head    # production config
```

Migration files live in `services/rest/alembic/versions/`.
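Standard Alembic housekeeping commands work against the same configs (run from `services/rest`, like the commands above) if you need to inspect or roll back schema state:

```
alembic current        # show the revision the database is currently on
alembic history        # list known revisions
alembic downgrade -1   # roll back the most recent migration
```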
- Lint & format: `ruff check .` and `ruff format .`
- Type checking: `mypy services/agent_worker`
- REST module smoke test: `python -m services.rest.app.logic.statement_generator`
- Agent DSL tests: `python -m services.agent_worker.lib.tests.test`
- Artifact tool tests: `python -m tests.test_create_artifact`
Please lint whenever you touch code (see CLAUDE.md).
- To rebuild the REST API image: `docker compose build web_app`
- To publish to GitHub Container Registry:

  ```
  docker tag spread_web_app ghcr.io/mimouncadosch/spread_web_app:latest
  docker push ghcr.io/mimouncadosch/spread_web_app:latest
  ```
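Pushing to GHCR requires an authenticated Docker session; the usual route is a GitHub personal access token with the `write:packages` scope (the token variable below is illustrative):

```
# Authenticate to ghcr.io once before pushing
echo "$GITHUB_TOKEN" | docker login ghcr.io -u <your-github-username> --password-stdin
```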
Keep `.env` and cloud credentials out of commits, and prefer the `services/shared` utilities for interacting with spreadsheets, MinIO, and artifact storage so that logic isn't duplicated across services.