Nidus is an open-source Applicant Tracking System that combines deterministic matching (TF-IDF) with AI reasoning (Llama 3 via Groq) to analyze CVs, calculate job-fit scores, and generate exportable PDF reports.
- 📄 CV Parsing — PDF, DOCX and TXT support (up to 2 MB)
- 🤖 AI Analysis — Llama 3 via Groq extracts candidate name, skills, experience, and generates a professional summary
- 📊 Job Match Score — TF-IDF matching against a pasted job description
- 🔑 Keyword Detection — identifies missing key terms
- 📑 PDF Export — professional branded report with candidate card, match bar, and recommendations
- 🌐 i18n — Spanish / English UI
- ⚡ Zero-dependency local dev — SQLite + Celery eager mode (no Redis needed)
- 🚦 Rate limiting — slowapi-based protection
- 🗄 Alembic migrations — schema versioning for production deployments
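The job-fit score idea can be illustrated with a dependency-free cosine-similarity sketch. This is a simplified term-frequency version for illustration only; the real matching service lives in `app/services` (cv_engine) and applies full TF-IDF weighting.

```python
# Illustrative sketch of a cosine-similarity match score (simplified:
# plain term frequencies; the real engine uses TF-IDF).
import math
import re
from collections import Counter

def _term_counts(text: str) -> Counter:
    """Lowercase tokenization; keeps alphanumerics plus + and # (e.g. c++, c#)."""
    return Counter(re.findall(r"[a-z0-9+#]+", text.lower()))

def match_score(cv_text: str, job_description: str) -> float:
    """Return a 0-100 fit score from cosine similarity of term vectors."""
    a, b = _term_counts(cv_text), _term_counts(job_description)
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return round(100 * dot / (norm_a * norm_b), 2)

score = match_score(
    "Python developer with FastAPI, SQL and Docker experience",
    "Hiring a Python backend engineer (FastAPI, SQL)",
)
```

Identical texts score 100; disjoint vocabularies score 0, which is also why the keyword-detection feature reports the job-description terms missing from the CV.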
```mermaid
graph LR
    Client[React App] -->|Upload CV| API[FastAPI]
    subgraph "API Layer"
        API -->|Extract text| Parser[File Handler]
        Parser --> API
    end
    API -->|"DEBUG=True (eager)"| DirectResult[Result to client]
    API -->|"DEBUG=False (async)"| Redis[(Redis Queue)]
    Redis --> Worker[Celery Worker]
    subgraph "Background Worker"
        Worker -->|TF-IDF match| Matcher[Matching Service]
        Worker -->|AI inference| Groq[Groq API / Llama 3]
        Worker -->|Persist| DB[(SQLite / PostgreSQL)]
    end
```
Local dev runs with `DEBUG=True`: Celery tasks execute synchronously and the result is returned directly in the upload response. No Redis needed.
| Layer | Technology |
|---|---|
| Backend | Python 3.13 · FastAPI · SQLAlchemy |
| Queue | Celery 5 · Redis (prod only) |
| Database | SQLite (dev) · PostgreSQL (prod) |
| Frontend | React 18 · TailwindCSS · TanStack Query |
| AI | Groq API (`llama3-70b-8192`) |
| PDF Export | ReportLab Platypus |
| Migrations | Alembic |
| Rate Limiting | slowapi |
Double-click `start_dev.bat` to install dependencies and start backend + frontend.
Backend:

```bash
cd backend
pip install -r requirements.txt
uvicorn app.main:app --reload
# → http://localhost:8000
```

Frontend:

```bash
cd frontend
npm install
npm start
# → http://localhost:3000
```

Copy `backend/.env.example` to `backend/.env` and fill in:
```env
DATABASE_URL=sqlite:///./nidus.db  # or postgresql://...
SECRET_KEY=change-me-in-production
DEBUG=True
# Optional — enables AI features. Can also be passed per-request via X-Groq-Api-Key header.
GROQ_API_KEY=gsk_...
# Required in production only
REDIS_URL=redis://localhost:6379/0
```

```bash
cd backend
# First time — stamp existing DB as up to date
alembic stamp head
# After changing a model
alembic revision --autogenerate -m "describe change"
alembic upgrade head
```

```bash
cd backend
pytest tests/ -v
```

Tests use an in-memory SQLite database and cover:
- `GET /health` — healthcheck and database connectivity
- `GET /cvs` — empty list retrieval
- `POST /upload-cv` — TXT upload, bad extension
| Method | Endpoint | Auth | Description |
|---|---|---|---|
| POST | /upload-cv | ❌ | Upload & analyze CV |
| GET | /cvs | ❌ | List CVs with pagination (?skip=0&limit=20) |
| GET | /export-pdf/{id} | ❌ | Download branded PDF report |
| GET | /tasks/{task_id} | ❌ | Poll async task status |
| GET | /health | ❌ | Healthcheck status and DB connectivity |
| Concern | Implementation |
|---|---|
| File uploads | Max 2 MB, only PDF/DOCX/TXT |
| Rate limiting | 200 req/min global |
| Secrets | .env file — never committed |
| AI key | Server-side GROQ_API_KEY; optional X-Groq-Api-Key header override |
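The upload limits above can be sketched as a simple guard (illustrative; the real checks live in the upload/document service):

```python
# Illustrative upload guard matching the documented limits:
# 2 MB cap, PDF/DOCX/TXT only.
ALLOWED_EXTENSIONS = {".pdf", ".docx", ".txt"}
MAX_BYTES = 2 * 1024 * 1024  # 2 MB

def validate_upload(filename: str, size: int) -> None:
    """Raise ValueError for disallowed extensions or oversized files."""
    ext = "." + filename.rsplit(".", 1)[-1].lower() if "." in filename else ""
    if ext not in ALLOWED_EXTENSIONS:
        raise ValueError(f"Unsupported file type: {ext or 'none'}")
    if size > MAX_BYTES:
        raise ValueError(f"File too large: {size} bytes (max {MAX_BYTES})")

validate_upload("cv.pdf", 150_000)  # passes silently
```

Rejecting on extension and size before any parsing keeps malformed or oversized files out of the text-extraction and AI pipeline.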
```
Nidus/
├── backend/
│   ├── app/
│   │   ├── api/v1/endpoints/   # router, cvs, upload, tasks, health
│   │   ├── core/               # config, database, logging_config
│   │   ├── models/             # SQLAlchemy models
│   │   ├── repositories/       # DB access layer
│   │   ├── schemas/            # Pydantic schemas
│   │   ├── services/           # cv_engine (AI + logic), document_engine (PDF + uploads)
│   │   └── tasks/              # Celery tasks
│   ├── migrations/             # Alembic
│   └── tests/
├── frontend/
│   └── src/
│       ├── components/         # AnalysisPanel, Dropzone, Layout, ...
│       └── i18n.js             # All UI strings (ES / EN)
└── docs/
    ├── agent.md                # Generalist agent spec
    └── skills.md               # Super-Skills catalogue
```