150 applications. One offer. Each application took 5+ manual steps.
Separate tools, separate tabs, separate sites — none of them talking to each other. Generic output. Over an hour per application.
Paste a job description — or pull it from any job site with the Chrome extension — and five AI agents run an orchestrated pipeline in under 30 seconds: analyzing the role, scoring your fit, researching the company, writing a targeted cover letter, and tailoring your resume. Sequential where it needs to be, parallel where it can be, each agent's output feeding the next.
Also includes a dashboard to track every application. And tools for everything around it: interview prep with mock sessions, salary negotiation, job comparison, follow-ups, thank you notes, and references.
Runs on your machine. No subscriptions, no data stored on our servers — just your own Gemini API key connecting directly to Google.
Here's what a completed application looks like:
Six AI Agents · Career Tools · Quick Start · Gemini API Key · Chrome Extension · Highlights · Optional Features · Developer Setup · Environment Variables · How It Works · Project Structure · Contributing · License
Paste a job description and the pipeline runs automatically:
| Agent | What it produces |
|---|---|
| Job Analyzer | Structured breakdown of requirements, skills, and ATS keywords |
| Profile Matcher | Fit score, strengths to highlight, gaps to address, application strategy |
| Company Research | Culture, leadership style, interview approach, watch-out notes |
| Resume Advisor | Per-bullet rewrites, ATS alignment score, before-you-submit checklist |
| Cover Letter Writer | Personalized cover letter, regenerate with one click |
| Interview Prep (standalone) | Role-specific questions, model answers, full mock interview session |
Standalone tools you can use any time — no job description needed:
| Tool | What it does |
|---|---|
| Follow-up Email | Post-application and post-interview follow-ups |
| Thank You Note | Interviewer thank you note, ready to send |
| Salary Coach | Negotiation script based on your offer and market data |
| Rejection Analyzer | Lessons learned and re-application strategy from a rejection email |
| Reference Request | Professional reference request for a specific contact |
| Job Comparison | Side-by-side comparison of 2–3 open roles |
Three ways to run it — pick the one that suits you:
| | Option A: Docker (all platforms) | Option B: No Docker (macOS) | Option C: Manual |
|---|---|---|---|
| Command | make start | make start-local | make dev |
| Requires | Docker Desktop | macOS only | PostgreSQL + Redis running yourself |
| First run | ~2 min (builds Docker image) | ~3 min (installs Postgres + Redis) | Depends on your setup |
| Subsequent runs | ~5 sec | ~5 sec | ~5 sec |
What you need: Docker Desktop installed and running (installs WSL2 automatically on Windows). make start will tell you if it isn't running.
macOS / Linux — make is pre-installed:
git clone https://github.com/eliornl/applypilot.git
cd applypilot
make start

Windows — install just (winget install Casey.Just) instead of make. It works natively in PowerShell and cmd — no WSL2 needed:
git clone https://github.com/eliornl/applypilot.git
cd applypilot
just start

Both commands do the same thing on first run:
- Copies .env.local.example → .env and fills in strong random secrets automatically
- Builds the Docker image (takes ~2 min, only on the first run)
- Starts PostgreSQL, Redis, and the app at http://localhost:8000
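Those auto-filled secrets are just high-entropy random strings. If you ever need to rotate JWT_SECRET or ENCRYPTION_KEY by hand, Python's stdlib generates a suitable value (a sketch; the setup script may format its secrets differently):

```python
import secrets

def make_secret(nbytes: int = 32) -> str:
    # URL-safe random string, ~43 chars for 32 bytes of entropy.
    # Illustrative only: not necessarily the exact format the setup script uses.
    return secrets.token_urlsafe(nbytes)

print(make_secret())
```

Paste the output into .env as the new value, then restart the app.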
make start-d / just start-d # run in background
make docker-logs / just docker-logs # watch the log
make docker-down / just docker-down # stop everything (data preserved)
make docker-reset / just docker-reset # stop and wipe all data

What you need: macOS. No Docker, no manual installs — make start-local installs everything it needs (Homebrew, Python 3, Node.js, PostgreSQL, Redis) automatically on the first run. If Homebrew isn't installed yet, you'll be prompted for your sudo password once in the terminal — this is normal and required to install Homebrew.
git clone https://github.com/eliornl/applypilot.git
cd applypilot
make start-local

make start-local handles everything on the first run:
- Installs Homebrew, Python 3, and Node.js if not already present
- Creates venv, installs Python and Node dependencies, builds the frontend
- Copies .env.local.example → .env and fills in strong random secrets automatically
- Installs PostgreSQL 17 and Redis via Homebrew (first run only)
- Creates the database and user, runs migrations
- Starts the app at http://localhost:8000
make start-local # start everything
make stop-local # stop PostgreSQL and Redis when done
make dev # restart just the app (when services are already running)

Use this if you already have PostgreSQL and Redis running (any platform, any setup). If you're on macOS and don't have them, use Option B instead — it installs everything for you.
Step 1 — Clone and set up the project
macOS / Linux:
git clone https://github.com/eliornl/applypilot.git
cd applypilot
make setup # creates venv, installs deps, builds frontend, generates .env

Windows — install just (winget install Casey.Just) first:
git clone https://github.com/eliornl/applypilot.git
cd applypilot
just setup

Step 2 — Create the database user and database
Connect to PostgreSQL as a superuser (usually postgres) and run:
CREATE USER applypilot WITH PASSWORD 'applypilot';
CREATE DATABASE applypilot OWNER applypilot;

You can run these with psql -U postgres or any PostgreSQL client (pgAdmin, TablePlus, etc.).
Tip: Using applypilot as the password matches the default in .env — you can skip Step 3 entirely. If you choose a different password, update DATABASE_URL in Step 3.
Step 3 — Edit .env with your connection strings (skip if you used the default password above)
Open .env and update DATABASE_URL to match the password you chose:
DATABASE_URL=postgresql+asyncpg://applypilot:yourpassword@localhost:5432/applypilot
REDIS_URL=redis://localhost:6379/0

Step 4 — Run migrations and start the app
make migrate / just migrate # creates all database tables
make dev / just dev # start the app at http://localhost:8000

From then on, as long as PostgreSQL and Redis are running, make dev / just dev is all you need.
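If the app can't connect, a quick way to sanity-check the DATABASE_URL you put in .env is to parse it into its parts (plain Python, illustrative only):

```python
from urllib.parse import urlsplit

def check_db_url(url: str) -> dict:
    """Split a SQLAlchemy-style URL into its parts for a quick eyeball check."""
    parts = urlsplit(url)
    return {
        "driver": parts.scheme,             # expect postgresql+asyncpg
        "user": parts.username,             # expect applypilot
        "host": parts.hostname,             # expect localhost
        "port": parts.port,                 # expect 5432
        "database": parts.path.lstrip("/")  # expect applypilot
    }

print(check_db_url("postgresql+asyncpg://applypilot:applypilot@localhost:5432/applypilot"))
```

A typo in the port, host, or database name shows up immediately in the printed dict.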
INFO: Application startup complete.
Open http://localhost:8000 in your browser and create your account. During profile setup you'll be prompted to add your Gemini API key — or you can add it later in Settings → AI Setup.
AI features require a key from Google AI Studio.
- Go to aistudio.google.com/api-keys
- Sign in with your Google account
- Click Create API key — copy the entire key string (Google may show different formats over time).
- Paste it in ApplyPilot — you'll be prompted during profile setup, or add it later via Settings → AI Setup
For personal use that's all — no .env editing needed. Each user stores their own key, encrypted in the database.
For multi-user hosting: add GEMINI_API_KEY=<your key> to .env to set a shared server-side key so users don't need to provide their own.
Extract any job posting with one click — works on any job site or company careers page.
- Open chrome://extensions in Chrome
- Enable Developer Mode (toggle, top-right corner)
- Click Load unpacked
- Select the extension/ folder from this repo
The extension appears in your Chrome toolbar. Navigate to any job posting and click it — the role is extracted and sent to ApplyPilot automatically.
- Local-first — PostgreSQL, Redis, and the app all run on your machine. One command to start, no external services required.
- Full profile system — work experience, skills, career preferences; agents use your profile in every output.
- BYOK AI keys — each user adds their own Gemini key via Settings, or the admin sets one server-wide key.
- Google OAuth — optional "Continue with Google" alongside standard email/password.
- Multi-user ready — JWT auth, encrypted key storage, rate limiting per user, soft delete.
- No analytics by default — PostHog is disabled unless you explicitly enable it in .env.
- Data ownership — everything lives in your local PostgreSQL database. Delete the volume and it's gone.
SMTP email is optional; for a personal single-user setup it's usually not needed. To enable password-reset and verification emails:
# Add to .env:
SMTP_HOST=smtp.gmail.com
SMTP_PORT=587
SMTP_USERNAME=your-gmail@gmail.com
SMTP_PASSWORD=your-app-password # myaccount.google.com/apppasswords
SMTP_FROM_EMAIL=your-gmail@gmail.com
SMTP_FROM_NAME=ApplyPilot
DISABLE_EMAIL_VERIFICATION=false # require email verification on sign-up

Google OAuth is optional. To enable "Continue with Google":

- Google Cloud Console → APIs & Services → Credentials
- Create an OAuth 2.0 Client ID (Web application)
- Set authorized redirect URI: http://localhost:8000/api/v1/auth/google/callback
- Add to .env:
GOOGLE_CLIENT_ID=your-client-id.apps.googleusercontent.com
GOOGLE_CLIENT_SECRET=your-client-secret

PostHog analytics is disabled by default. To enable:
- Create a free project at posthog.com
- Add to .env:
POSTHOG_ENABLED=true
POSTHOG_API_KEY=phc_your-api-key
POSTHOG_HOST=https://us.i.posthog.com # or your self-hosted instance

Vertex AI is for server admins: use it if you have a Google Cloud project and want Vertex AI instead of a direct Gemini API key. End users are not affected — they still add their own Google AI Studio key via Settings.
USE_VERTEX_AI=true
VERTEX_AI_PROJECT=your-gcp-project-id
VERTEX_AI_LOCATION=global # required for gemini-3-* models

Requires Application Default Credentials (gcloud auth application-default login) or a service account in the environment.
macOS (no Docker) — see Option B in Quick Start. After the first run, restarting the app is just:
make dev # restart the FastAPI server (Postgres + Redis already running)

Frontend changes — after editing any JS or CSS file, rebuild assets and hard-refresh:
make build-frontend # rebuilds dist/ and updates manifest.json
# Then Cmd+Shift+R in the browser (no server restart needed in dev mode)

Linux / custom setup — see Option C in Quick Start.
| Command | What it does |
|---|---|
| make start-local | No Docker: install services + setup + migrate + start app (macOS) |
| make stop-local | Stop PostgreSQL and Redis Homebrew services |
| make start / just start | Docker: generate .env + start all services (foreground) |
| make start-d / just start-d | Docker: generate .env + start all services (background) |
| make docker-down / just docker-down | Stop Docker services, keep data |
| make docker-reset / just docker-reset | Stop Docker services, wipe data volumes |
| make docker-logs / just docker-logs | Tail the Docker app log |
| make setup / just setup | Dev setup: venv + Python/Node deps + frontend build |
| make dev / just dev | Start FastAPI dev server with auto-reload (services must be running) |
| make migrate / just migrate | Run Alembic database migrations |
| make build-frontend / just build-frontend | Compile and content-hash JS/CSS assets |
| make test / just test | Run the test suite |
| make lint / just lint | Run ruff linter |
| make clean | Remove venv and compiled artefacts |
.env is created and populated automatically by make start, make start-local, or make setup. You normally don't need to touch it.
| Variable | Default | Description |
|---|---|---|
| JWT_SECRET | Auto-generated | Signs auth tokens |
| ENCRYPTION_KEY | Auto-generated | Encrypts stored API keys |
| DATABASE_URL | Set automatically | PostgreSQL connection |
| REDIS_URL | Set automatically | Redis connection |
| GEMINI_API_KEY | (empty) | Server-wide AI key — users can add their own during profile setup or via Settings → AI Setup |
| GEMINI_MODEL | gemini-3-flash-preview | AI model to use — users can change this in Settings → AI Setup |
| BASE_URL | http://localhost:8000 | Used in password-reset and verification email links |
| DISABLE_EMAIL_VERIFICATION | true | Set false when SMTP is configured |
| GOOGLE_CLIENT_ID | (empty) | Enables "Continue with Google" |
| SMTP_HOST | (empty) | Enables password-reset emails |
| DEBUG | true | Set false in any shared or public environment |
| USE_VERTEX_AI | false | Server-admin: use Google Cloud Vertex AI instead of a direct API key |
Full reference with comments: .env.local.example
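To make JWT_SECRET concrete: signing a token is essentially an HMAC over the payload. A stripped-down sketch (the app uses a real JWT library; actual JWTs also encode a header and expiry):

```python
import base64, hashlib, hmac, json

def sign(payload: dict, secret: str) -> str:
    """HMAC-sign a payload, JWT-style but simplified (no header, no expiry)."""
    body = base64.urlsafe_b64encode(json.dumps(payload).encode()).rstrip(b"=")
    sig = hmac.new(secret.encode(), body, hashlib.sha256).digest()
    return body.decode() + "." + base64.urlsafe_b64encode(sig).rstrip(b"=").decode()

def verify(token: str, secret: str) -> bool:
    body, _, sig = token.partition(".")
    expected = hmac.new(secret.encode(), body.encode(), hashlib.sha256).digest()
    expected_b64 = base64.urlsafe_b64encode(expected).rstrip(b"=").decode()
    return hmac.compare_digest(sig, expected_b64)

token = sign({"sub": "user-1"}, "my-secret")
print(verify(token, "my-secret"), verify(token, "wrong"))  # True False
```

This is why a leaked JWT_SECRET lets anyone forge tokens, and why it's auto-generated rather than shipped with a default.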
Browser / Chrome Extension
│
▼
┌──────────────────────────────┐
│ FastAPI app │ Python 3.13, async
│ uvicorn · port 8000 │
└──────────┬───────────────────┘
│
├── PostgreSQL users, profiles, job applications, workflow sessions, agent outputs
├── Redis caching, rate limiting, auth state, background task locks
│
└── Five-Agent Pipeline (Google Gemini + LangGraph)
Job Analyzer
↓
Profile Matcher ← gates on low fit score
↓
Company Research
↓
Resume Advisor + Cover Letter Writer (parallel)
Interview Prep ← standalone, runs on demand
Six career tools (Follow-up Email, Thank You Note, Salary Coach,
Rejection Analyzer, Reference Request, Job Comparison)
← standalone, no job description needed
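The sequential-then-parallel shape of the pipeline above maps naturally onto asyncio. A stripped-down sketch (placeholder agent functions and a hypothetical min_fit threshold, not the project's LangGraph code):

```python
import asyncio

# Placeholder agents: each takes the accumulated state and returns its output.
async def job_analyzer(state): return {"analysis": f"analyzed {state['job']}"}
async def profile_matcher(state): return {"fit_score": 82}
async def company_research(state): return {"culture": "research notes"}
async def resume_advisor(state): return {"resume_tips": "..."}
async def cover_letter_writer(state): return {"cover_letter": "..."}

async def run_pipeline(job_description: str, min_fit: int = 50) -> dict:
    state = {"job": job_description}
    state |= await job_analyzer(state)        # sequential stages
    state |= await profile_matcher(state)
    if state["fit_score"] < min_fit:          # gate on low fit score
        return state
    state |= await company_research(state)
    resume, letter = await asyncio.gather(    # final stage runs in parallel
        resume_advisor(state), cover_letter_writer(state)
    )
    return state | resume | letter

result = asyncio.run(run_pipeline("Senior Backend Engineer"))
print(sorted(result))
```

LangGraph adds state schemas, retries, and conditional edges on top of the same idea.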
Frontend: server-rendered HTML + vanilla JS, no framework. Assets are compiled and content-hashed with esbuild. The Chrome extension uses Manifest V3 and posts directly to your local server.
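Content-hashing just means embedding a digest of the file's bytes in its name, so browsers can cache aggressively and still pick up new builds. The idea in miniature (plain Python, not the project's esbuild setup):

```python
import hashlib
from pathlib import Path

def hashed_name(path: Path) -> str:
    """Return e.g. app.<8 hex chars>.js for app.js, derived from the file's contents."""
    digest = hashlib.sha256(path.read_bytes()).hexdigest()[:8]
    return f"{path.stem}.{digest}{path.suffix}"

# Usage sketch: a manifest maps original names ("app.js") to hashed names
# so templates can reference the current build's file.
```

Changing one byte of the source changes the name, which is why a hard refresh after make build-frontend always loads the new assets.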
applypilot/
├── main.py # FastAPI app entry point
├── agents/ # 5 workflow agents + interview prep + 6 career tool agents
├── workflows/ # LangGraph pipeline orchestration and state schema
├── api/ # FastAPI route handlers
├── config/ # Settings (Pydantic BaseSettings + .env)
├── models/ # SQLAlchemy ORM models and database setup
├── utils/ # Auth, email, Redis, encryption, LLM client helpers
├── alembic/ # Database migrations
├── extension/ # Chrome Extension (Manifest V3)
├── ui/ # HTML templates + JS + CSS
│ ├── index.html # Landing page
│ ├── dashboard/ # All dashboard pages
│ ├── auth/ # Login, register, verify
│ ├── profile/ # Profile setup
│ ├── partials/ # Shared template fragments
│ └── static/ # Compiled assets (esbuild output)
├── tests/ # Unit + integration tests (pytest)
│ ├── test_agents/ # Agent unit tests
│ └── test_api/ # API integration tests (no live server needed)
├── e2e/ # Playwright end-to-end tests
├── docs/ # Demo GIF and logo assets
├── docker-compose.yml # Local: postgres + redis + app
├── Dockerfile # Multi-stage build: Node (frontend) → Python
├── Makefile # Dev workflow shortcuts (macOS / Linux)
├── Justfile # Same shortcuts for Windows (just)
├── requirements.txt # Python dependencies
├── CHANGELOG.md # Version history
├── CONTRIBUTING.md # Contribution guide
├── USER_GUIDE.md # End-user documentation
└── .env.local.example # Config template (make start copies this to .env)
Contributions are welcome. Open an issue first to discuss what you'd like to change.
- Fork the repo
- Create a feature branch: git checkout -b feature/my-feature
- Make your changes and run the tests: make test
- Open a pull request
MIT — use it, fork it, modify it, self-host it.
