Crawls a website and generates a spec-compliant llms.txt file.
| Tool | Install | Purpose |
|---|---|---|
| Python 3.12+ | python.org | Runtime |
| uv | `brew install uv` | Python package manager |
| AWS CLI | `brew install awscli` | AWS access |
| AWS SAM CLI | `brew install aws-sam-cli` | Build and deploy |
| Docker | Docker Desktop | Lambda-compatible builds |
```sh
make setup                          # configures git hooks
cd backend && uv sync --all-groups  # installs dependencies
```

Run the dev server:

```sh
cd backend && uv run fastapi dev app/main.py  # starts dev server at http://localhost:8000
```

Open `frontend/index.html` directly in a browser. `API_BASE` at the top of the script tag points to http://localhost:8000.
First-time deploy:

```sh
aws configure        # enter Access Key ID, Secret Key, region (e.g. eu-west-1), output format (json)
make deploy-guided   # interactive: creates samconfig.toml, provisions all AWS resources
make deploy-frontend # uploads frontend to S3
```

Subsequent deploys:

```sh
make deploy          # builds and deploys backend
make deploy-frontend # uploads frontend to S3 with the live API URL substituted in
```

Teardown:

```sh
make delete          # deletes the entire CloudFormation stack
```

The stack also deploys automatically on every push to `main` via GitHub Actions.
```
Browser → API Gateway → ApiFunction (Lambda)
                             │ SQS
                             ▼
                      WorkerFunction (Lambda)
                             │
                             ├─→ DynamoDB (job status)
                             └─→ S3 (results)

Browser → S3 static website (frontend)
```
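Read the queue-driven path as: the API function records the job and enqueues a message, and the worker consumes it, crawls, and persists status and output. A minimal sketch of that worker shape, assuming boto3; the env var names, the `job_id`/`url` message fields, and the `generate_llms_txt` helper are hypothetical, not the repo's actual code:

```python
import json
import os

import boto3

dynamodb = boto3.resource("dynamodb")
s3 = boto3.client("s3")

JOBS_TABLE = os.environ["JOBS_TABLE"]          # hypothetical env var names
RESULTS_BUCKET = os.environ["RESULTS_BUCKET"]


def generate_llms_txt(url: str) -> str:
    # Placeholder for the real crawl + generation logic
    return f"# llms.txt for {url}\n"


def set_status(table, job_id: str, status: str) -> None:
    # "status" is a DynamoDB reserved word, hence the expression aliases
    table.update_item(
        Key={"job_id": job_id},
        UpdateExpression="SET #s = :s",
        ExpressionAttributeNames={"#s": "status"},
        ExpressionAttributeValues={":s": status},
    )


def handler(event, context):
    """SQS-triggered entry point: one crawl job per queue message."""
    table = dynamodb.Table(JOBS_TABLE)
    for record in event["Records"]:            # SQS delivers a batch of messages
        job = json.loads(record["body"])
        job_id, url = job["job_id"], job["url"]
        try:
            set_status(table, job_id, "crawling")
            text = generate_llms_txt(url)
            s3.put_object(Bucket=RESULTS_BUCKET, Key=f"{job_id}.txt",
                          Body=text.encode("utf-8"))
            set_status(table, job_id, "complete")
        except Exception:
            set_status(table, job_id, "failed")
```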
- `POST /generate` — returns `job_id`, enqueues crawl job
- `GET /job_status/{job_id}` — poll for status (queued → crawling → generating → complete / failed)
- `GET /results/{job_id}` — fetch plain-text llms.txt (local dev only; on AWS the status response includes a pre-signed S3 URL)
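Together these endpoints form a submit-then-poll flow. A minimal client sketch against the local dev server, assuming `requests` is installed; the `{"url": ...}` request body and the `status` response field are assumed payload shapes:

```python
import time

import requests

API_BASE = "http://localhost:8000"  # local dev server from above

# Submit a crawl job (the {"url": ...} body shape is an assumption)
resp = requests.post(f"{API_BASE}/generate", json={"url": "https://example.com"})
resp.raise_for_status()
job_id = resp.json()["job_id"]

# Poll until the job settles (the "status" field name is an assumption)
while True:
    status = requests.get(f"{API_BASE}/job_status/{job_id}").json()["status"]
    if status in ("complete", "failed"):
        break
    time.sleep(2)

# Local dev only: fetch the plain-text result; on AWS, use the pre-signed
# S3 URL included in the status response instead.
if status == "complete":
    print(requests.get(f"{API_BASE}/results/{job_id}").text)
```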