SkillSync is an AI-powered resume analysis and optimization platform that helps job seekers tailor their resumes to specific job descriptions using GPT-4o.
- 🤖 AI-Powered Analysis: Uses OpenAI GPT-4o to analyze resumes against job descriptions
- 📄 PDF Processing: Extracts text from PDF resumes for analysis
- 💡 Smart Suggestions: Provides categorized recommendations (keyword gaps, bullet updates, section additions/removals, wording improvements)
- 📊 Analysis History: Track all your resume analyses in one place
- 🎯 Quantifiable Metrics: Get suggestions that include measurable achievements
- 🔒 Secure Authentication: Built with Supabase Auth and Row Level Security
- 📋 Copy to Clipboard: Easily copy AI suggestions to update your resume
Frontend:

- React 18 with Vite
- React Router v6 for navigation
- TailwindCSS for styling
- Material Symbols icons
- Supabase client for authentication

Backend:

- Node.js with Express
- Supabase for database and authentication
- OpenAI GPT-4o API for AI analysis
- pdf.js-extract for PDF text extraction
- Multer for file uploads

Database:

- PostgreSQL (via Supabase)
- Row Level Security (RLS) policies
- Real-time capabilities
Prerequisites:

- Node.js 18+ and npm
- Supabase account and project
- OpenAI API key with GPT-4o access
- Two terminals (frontend + backend)
```bash
# Install root dependencies
npm install

# Install backend dependencies
cd apps/backend
npm install

# Install frontend dependencies
cd ../frontend
npm install
```

Configure the backend environment:

```bash
cd apps/backend
cp .env.example .env
```

Edit `apps/backend/.env` and add your credentials:
```
SUPABASE_URL=your_supabase_project_url
SUPABASE_ANON_KEY=your_supabase_anon_key
SUPABASE_SERVICE_ROLE_KEY=your_supabase_service_role_key
OPENAI_API_KEY=your_openai_api_key
PORT=4000
```

Configure the frontend environment:

```bash
cd apps/frontend
cp .env.example .env
```

Edit `apps/frontend/.env` and add your Supabase credentials:
```
VITE_SUPABASE_URL=your_supabase_project_url
VITE_SUPABASE_ANON_KEY=your_supabase_anon_key
```

Set up the database:

- Go to your Supabase project dashboard
- Navigate to the SQL Editor
- Copy the contents of `supabase/schema.sql` and execute it
- This creates all necessary tables, policies, and indexes
```bash
# Terminal 1 - Start backend
npm run dev:backend
# Backend runs on http://localhost:4000

# Terminal 2 - Start frontend
npm run dev:frontend
# Frontend runs on http://localhost:5173
```

- Open http://localhost:5173 in your browser
- Click "Sign Up" to create an account
- After signing in, you'll be redirected to the dashboard
- Navigate to "Analyze" to upload a resume and start analysis
```
skillsync/
├── apps/
│   ├── backend/                       # Express API server
│   │   ├── src/
│   │   │   ├── lib/
│   │   │   │   └── supabase.js        # Supabase clients (anon + admin)
│   │   │   ├── routes/
│   │   │   │   └── analyses.js        # Analysis CRUD endpoints
│   │   │   ├── services/
│   │   │   │   ├── openai.js          # GPT-4o integration
│   │   │   │   └── pdfParser.js       # PDF text extraction
│   │   │   └── server.js              # Express app entry point
│   │   ├── .env.example
│   │   ├── package.json
│   │   └── README.md
│   │
│   └── frontend/                      # React SPA
│       ├── src/
│       │   ├── auth/
│       │   │   ├── AuthContext.jsx    # Auth state management
│       │   │   └── ProtectedRoute.jsx # Route protection
│       │   ├── components/            # Reusable UI components
│       │   ├── layouts/               # Page layouts
│       │   ├── pages/
│       │   │   ├── Landing.jsx        # Public landing page
│       │   │   ├── Login.jsx          # Login page
│       │   │   ├── Register.jsx       # Registration page
│       │   │   ├── Dashboard.jsx      # Main dashboard
│       │   │   ├── Analyze.jsx        # Resume upload page
│       │   │   ├── History.jsx        # Analysis history
│       │   │   └── Results.jsx        # Analysis results viewer
│       │   ├── App.jsx                # Main app component
│       │   └── main.jsx               # App entry point
│       ├── .env.example
│       ├── package.json
│       └── README.md
│
├── supabase/
│   ├── schema.sql                     # Complete database schema
│   └── policies.sql                   # Row Level Security policies
│
├── package.json                       # Root package.json with scripts
└── README.md                          # This file
```
How it works:

- Upload: User uploads a PDF resume and provides a job title and description
- Extract: Backend extracts text from PDF using pdf.js-extract
- Analyze: Text is sent to GPT-4o for analysis against job requirements
- Store: Analysis and suggestions are stored in PostgreSQL
- Display: Frontend polls for status and displays results when complete (see the polling sketch below)
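A minimal sketch of that polling step, assuming the status endpoint returns JSON like `{ status: 'pending' | 'complete' | 'failed' }` (the field name and retry limits here are assumptions, not the project's actual code):

```js
// Poll GET /api/analyses/:id/status until the analysis finishes.
// Assumed response shape: { status: 'pending' | 'complete' | 'failed' }.
async function pollAnalysisStatus(analysisId, { intervalMs = 2000, maxAttempts = 30 } = {}) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    // credentials: 'include' sends the httpOnly auth cookie with the request
    const res = await fetch(`/api/analyses/${analysisId}/status`, { credentials: 'include' });
    if (!res.ok) throw new Error(`Status check failed: ${res.status}`);
    const { status } = await res.json();
    if (status !== 'pending') return status; // 'complete' or 'failed'
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error('Analysis timed out');
}
```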
Suggestions are grouped into five categories:

- `keyword_gap`: Missing skills or keywords from the job description
- `bullet_update`: Improvements to existing bullet points (add metrics, action verbs)
- `section_add`: New sections to add (e.g., relevant skills, summary)
- `section_remove`: Sections that don't align with the job
- `wording_tone`: Better phrasing or professional tone improvements
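For illustration, a stored suggestion might look like the record below. Every field name here is hypothetical; the authoritative shape is whatever `supabase/schema.sql` defines:

```js
// Hypothetical suggestion record, for illustration only.
const exampleSuggestion = {
  type: 'bullet_update', // one of the five categories above
  section: 'Experience',
  original: 'Responsible for managing the team',
  suggestion: 'Led a 5-person engineering team, cutting release cycle time by 30%',
  rationale: 'Replaces a passive phrase with an action verb and a quantifiable metric',
};
```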
Authentication flow:

- Frontend: Supabase client handles signup/signin
- Backend Cookie: After auth, frontend sends token to backend to set httpOnly cookie
- Dual Clients: Backend uses both anon key (for auth) and service role key (for RLS bypass)
- Middleware: `withAuth` middleware validates the JWT on protected routes (sketched below)
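A minimal sketch of what such a middleware could look like, assuming `cookie-parser` is mounted and the token cookie is named `sb-access-token`; both names are assumptions, and the real implementation lives in the backend source:

```js
// Illustrative withAuth middleware: validates the Supabase JWT carried in an
// httpOnly cookie. Assumes app.use(cookieParser()) has been registered.
import { createClient } from '@supabase/supabase-js';

const supabase = createClient(process.env.SUPABASE_URL, process.env.SUPABASE_ANON_KEY);

export async function withAuth(req, res, next) {
  const token = req.cookies['sb-access-token']; // cookie name is an assumption
  if (!token) return res.status(401).json({ error: 'Not authenticated' });

  // getUser(jwt) verifies the token against Supabase and returns the user
  const { data, error } = await supabase.auth.getUser(token);
  if (error || !data?.user) return res.status(401).json({ error: 'Invalid or expired token' });

  req.user = data.user; // downstream handlers can read the authenticated user
  next();
}
```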
See `apps/backend/README.md` for complete API documentation.
Key endpoints:

- `POST /api/auth/set-cookie` - Set authentication cookie
- `GET /api/session` - Get current session
- `POST /api/analyses` - Create new analysis (multipart/form-data)
- `GET /api/analyses` - List all user analyses
- `GET /api/analyses/:id` - Get specific analysis with suggestions
- `GET /api/analyses/:id/status` - Poll analysis status
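As a sketch, creating an analysis from the browser might look like the call below; the multipart field names (`resume`, `jobTitle`, `jobDescription`) are assumptions, so check `apps/backend/README.md` for the actual contract:

```js
// Upload a resume PDF plus job details to POST /api/analyses.
async function createAnalysis(pdfFile, jobTitle, jobDescription) {
  const form = new FormData();
  form.append('resume', pdfFile); // File object from an <input type="file">
  form.append('jobTitle', jobTitle);
  form.append('jobDescription', jobDescription);

  const res = await fetch('http://localhost:4000/api/analyses', {
    method: 'POST',
    body: form,             // multipart/form-data; the browser sets the boundary
    credentials: 'include', // required so the httpOnly auth cookie is sent
  });
  if (!res.ok) throw new Error(`Analysis request failed: ${res.status}`);
  return res.json(); // e.g. the created analysis record with its id
}
```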
Cost considerations:

- OpenAI GPT-4o pricing: ~$0.02-$0.05 per resume analysis
- Supabase: the free tier supports up to a 500MB database and 2GB of bandwidth
- Recommended: monitor OpenAI usage and set spending limits
Security measures:

- HttpOnly cookies for authentication tokens
- Row Level Security (RLS) on all database tables
- File size limits (5MB max for PDFs)
- PDF validation before processing
- CORS configuration for frontend origin
- Service role key used only on backend (never exposed to client)
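A sketch of how the CORS and upload limits above might be wired into Express; the option values mirror this list, but the actual configuration in `apps/backend/src/server.js` may differ:

```js
// Illustrative server config: CORS locked to the frontend origin, cookies allowed,
// and Multer capped at 5MB PDF uploads.
import express from 'express';
import cors from 'cors';
import multer from 'multer';

const app = express();

app.use(cors({
  origin: 'http://localhost:5173', // only the frontend origin may call the API
  credentials: true,               // let the browser send the auth cookie
}));

const upload = multer({
  limits: { fileSize: 5 * 1024 * 1024 }, // 5MB cap, matching the limit above
  fileFilter: (req, file, cb) => cb(null, file.mimetype === 'application/pdf'), // PDFs only
});

// e.g. app.post('/api/analyses', upload.single('resume'), handler);
```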
Troubleshooting:

Backend won't start:

- Check that port 4000 is not in use: `lsof -ti:4000 | xargs kill -9`
- Verify all environment variables are set correctly
- Ensure the Supabase service role key is configured

Frontend can't reach the backend:

- Verify the backend is running on http://localhost:4000
- Check the browser console for CORS errors
- Ensure `credentials: 'include'` is set on fetch requests

Suggestions not appearing:

- This was an RLS issue, fixed by using the `supabaseAdmin` client on the backend
- Verify `SUPABASE_SERVICE_ROLE_KEY` is set in the backend `.env`

Login or session problems:

- Check that cookies are being set (DevTools → Application → Cookies)
- Verify `/api/auth/set-cookie` is called after login
- Check that `AuthContext` wraps the entire app in `main.jsx` (see the sketch below)
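For reference, "wrapping the entire app" in `main.jsx` looks roughly like this; the `AuthProvider` export name is assumed from `src/auth/AuthContext.jsx` and may differ:

```jsx
// Illustrative main.jsx: the auth provider must sit above every route so that
// ProtectedRoute and the pages can read auth state from context.
import React from 'react';
import ReactDOM from 'react-dom/client';
import { BrowserRouter } from 'react-router-dom';
import { AuthProvider } from './auth/AuthContext'; // export name is an assumption
import App from './App';

ReactDOM.createRoot(document.getElementById('root')).render(
  <React.StrictMode>
    <AuthProvider>
      <BrowserRouter>
        <App />
      </BrowserRouter>
    </AuthProvider>
  </React.StrictMode>
);
```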
To contribute:

- Create a new branch for your feature
- Make changes and test thoroughly
- Update relevant README files
- Submit a pull request with a clear description
License: MIT
For issues or questions, please open an issue on GitHub.