An intelligent, AI-powered portfolio website that showcases projects, skills, and experience through an interactive conversational interface. Built with Next.js 14 and FastAPI, featuring real-time AI chat powered by LangChain and OpenAI.
AI Portfolio is a full-stack application that transforms the traditional portfolio experience into an engaging conversation. The platform features:
- AI-Powered Chatbot: Intelligent assistant that answers questions about projects, skills, and experience using RAG (Retrieval-Augmented Generation)
- Interactive Tour Guide: Step-by-step guided tour optimized for both desktop and mobile devices
- Real-time Streaming: Server-Sent Events (SSE) for smooth, responsive chat interactions
- Dynamic Content Display: Project showcases, skills visualization, and resume integration
- Smart Contact System: Direct email integration with resume delivery functionality
- Command Palette: Quick navigation with keyboard shortcuts (`/projects`, `/skills`, `/resume`)
```
Portfolio/
├── frontend/                     # Next.js application
│   ├── app/                      # Next.js App Router
│   │   ├── globals.css           # Global styles and animations
│   │   ├── layout.tsx            # Root layout with providers
│   │   └── page.tsx              # Main chat page
│   ├── components/               # Reusable UI components
│   │   └── ui/                   # UI primitives (buttons, dialogs, etc.)
│   ├── features/                 # Feature-based modules
│   │   ├── chat/                 # Chat functionality
│   │   │   ├── components/       # Chat UI components
│   │   │   ├── context/          # Chat state management
│   │   │   ├── hooks/            # Chat-related hooks
│   │   │   └── lib/              # Chat utilities (streaming, etc.)
│   │   ├── tour/                 # Interactive tour guide
│   │   ├── sidebar/              # Navigation sidebar
│   │   ├── projects/             # Project showcase
│   │   ├── skills/               # Skills visualization
│   │   ├── resume/               # Resume display
│   │   ├── contact/              # Contact form
│   │   └── command-palette/      # Keyboard shortcuts
│   ├── lib/                      # Utilities and helpers
│   │   ├── session.ts            # Session ID management for rate limiting
│   │   └── utils.ts              # Helper functions
│   ├── services/                 # API client services
│   └── public/                   # Static assets (resume.pdf, images)
│
├── backend/                      # FastAPI application
│   ├── app/
│   │   ├── core/                 # Configuration and logging
│   │   │   ├── config.py         # Environment settings
│   │   │   ├── logging.py        # Logging configuration
│   │   │   ├── limiter.py        # Rate limiting configuration
│   │   │   └── exceptions.py     # Custom exception handlers
│   │   ├── routers/              # API endpoints
│   │   │   ├── chat.py           # Streaming chat endpoint
│   │   │   └── contact.py        # Contact form endpoint
│   │   ├── services/             # Business logic
│   │   │   ├── agent_service.py  # LangChain AI agent
│   │   │   ├── rag_service.py    # Vector DB & retrieval
│   │   │   └── contact_service.py # Email service
│   │   ├── models/               # Data models
│   │   ├── data/                 # Knowledge base documents
│   │   │   ├── resume.txt
│   │   │   ├── projects.txt
│   │   │   ├── skills.txt
│   │   │   └── bio.txt
│   │   └── main.py               # Application entry point
│   ├── requirements.txt          # Python dependencies
│   ├── Dockerfile                # Docker configuration
│   └── .env.example              # Environment template
│
└── docker-compose.yml            # Multi-service orchestration
```
- Next.js 14 - React framework with App Router
- React 18 - UI library with TypeScript
- Tailwind CSS 4 - Utility-first styling
- Framer Motion - Smooth animations and transitions
- Radix UI - Accessible component primitives
- React Query - Data fetching and caching
- React Markdown - Markdown rendering
- Embla Carousel - Touch-friendly carousels
- Lucide Icons - Icon library
- FastAPI - High-performance Python web framework
- LangChain - AI agent orchestration and tooling
- OpenAI GPT-4 - Language model for intelligent responses
- PostgreSQL + pgvector - Vector database for semantic search
- Redis - Chat history and session storage
- Uvicorn - ASGI server
- Docker & Docker Compose - Containerization
- Server-Sent Events (SSE) - Real-time streaming
- FastAPI Mail - Email service integration
- Session-Based Rate Limiting - API protection with SlowAPI (per-user tracking)
- Node.js 18+ and npm
- Python 3.10+ and pip
- Docker and Docker Compose (for local database services)
- OpenAI API Key - Get from OpenAI Platform
- SMTP Email Credentials - For contact form (Gmail, SendGrid, etc.)
This option uses Docker to run PostgreSQL and Redis locally, making setup much easier.
git clone https://github.com/royamit1/portfolio.git
cd Portfolio
# Install frontend dependencies
cd frontend
npm install
# Install backend dependencies
cd ../backend
pip install -r requirements.txt

Copy the environment template and configure it:

cp .env.example .env

(Optional) Configure frontend environment:

cp frontend/.env.example frontend/.env

Edit `.env` and fill in your API keys:
# Required: Get from https://platform.openai.com/api-keys
OPENAI_API_KEY="sk-..."
# Required: Your email for contact form
MAIL_USERNAME="your.email@gmail.com"
MAIL_PASSWORD="your_app_password"
OWNER_EMAIL="your.email@gmail.com"
# The database and Redis URLs are pre-configured for Docker
# No changes needed for DATABASE_URL and REDIS_URL if using Docker

- Backend Data: Edit the files in `backend/app/data/` (resume.txt, projects.txt, etc.) to match your profile.
- Configuration: Edit `backend/app/core/config.py` to set `PORTFOLIO_OWNER` and `RESUME_LINK`.
- PDF Resume: Replace `frontend/public/resume.pdf` with your own file.
- Images: Replace `frontend/public/profile.jpg` with your photo.
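For illustration, a stdlib-only sketch of the kind of environment-driven settings object `backend/app/core/config.py` exposes. The field set, env-var fallbacks, and defaults here are assumptions; the real module may hardcode values or use pydantic instead.

```python
# Illustrative settings sketch, not the repo's actual config.py.
import os
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Settings:
    # Values come from the environment, with placeholder defaults.
    portfolio_owner: str = field(
        default_factory=lambda: os.environ.get("PORTFOLIO_OWNER", "Your Name")
    )
    resume_link: str = field(
        default_factory=lambda: os.environ.get("RESUME_LINK", "/resume.pdf")
    )

settings = Settings()  # read once at import, like a typical config module
```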
# Edit these files with your information:
# - resume.txt (Your professional summary, experience, education)
# - projects.txt (Your projects with descriptions and tech stacks)
# - skills.txt (Your technical skills and expertise)
# - bio.txt (Your background and story)

# Go back to project root
cd ../..
# Start PostgreSQL and Redis in the background
docker-compose up -d postgres redis
# Check that services are running
docker-compose ps

cd backend
uvicorn app.main:app --reload --port 8000

The backend will automatically ingest your portfolio data into the vector database on first startup.
Open a new terminal:
cd frontend
npm run dev

The application will be available at http://localhost:3000.
When you're done developing:
# Stop the application (Ctrl+C in both terminals)
# Stop Docker services
docker-compose down
# To also remove volumes (delete all data):
# docker-compose down -v

If you prefer to use external managed services, as in production (Neon, Upstash, etc.):

- Set up your PostgreSQL database on Neon or Supabase
- Enable the `pgvector` extension
- Set up your Redis instance on Upstash
- Copy `backend/.env.example` to `backend/.env`
- Replace `DATABASE_URL` and `REDIS_URL` with your external service URLs
- Continue with steps 1-3 and 5-6 from Option 1 (skip step 4)
To run the optimized production build (frontend + backend + DBs) on a single server (VPS):
- Clone repo and setup environment (steps 1-2 from Option 1).
- Run with the production compose file:
docker-compose -f docker-compose.prod.yml up -d --build

This uses the multi-stage `frontend/Dockerfile` (target: `runner`) to build a standalone, optimized image (~100MB).
Note: All docker-compose commands should be run from the project root directory.
| Command | Description |
|---|---|
| `docker-compose up` | Start all services |
| `docker-compose up --build` | Rebuild and start all services |
| `docker-compose up -d` | Start services in background (detached) |
| `docker-compose down` | Stop all services |
| `docker-compose down -v` | Stop and remove volumes (delete data) |
| `docker-compose logs -f` | View logs from all services |
| `docker-compose logs -f backend` | View backend logs only |
| `docker-compose ps` | Check service status |
| `docker-compose restart backend` | Restart backend service only |
The backend uses LangChain to create an intelligent AI agent with two primary tools:
- Portfolio Knowledge Base: Vector search using RAG to retrieve relevant information from your portfolio documents
- Resume Email Tool: Sends PDF resume via email to interested parties
```
User Message → Frontend → FastAPI Backend → LangChain Agent
                                                   │
                                           [Tool Selection]
                                                   │
                     ┌─────────────────────────────┴─────────────────────────────┐
                     │                                                           │
         Portfolio Knowledge Base                                        Resume Email Tool
       (Vector Search in PostgreSQL)                                   (SMTP Email Service)
                     │                                                           │
             Retrieved Context                                              Email Sent
                     │                                                           │
                     └─────────────────────── GPT-4 Response ────────────────────┘
                                                   │
                                       Stream to Frontend (SSE)
                                                   │
                                           Live Chat Display
```
On startup, the backend automatically:
- Reads documents from `backend/app/data/`
- Splits the text into semantic chunks
- Generates embeddings using OpenAI
- Stores them in PostgreSQL with the pgvector extension
- Enables semantic search for relevant information retrieval
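As a stand-in for the ingestion flow above, a stdlib-only sketch of the reading and chunking steps; the real service presumably uses LangChain's splitters rather than this fixed-window chunker, the chunk sizes are illustrative, and the embedding/storage steps are left as hypothetical helper calls in comments.

```python
# Simplified ingestion sketch; the chunker and helper names are assumptions.
from pathlib import Path

def split_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    # Fixed-size sliding-window chunking with overlap between chunks.
    chunks, start = [], 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap
    return chunks

def load_chunks(data_dir: str = "backend/app/data") -> list[str]:
    chunks: list[str] = []
    for path in sorted(Path(data_dir).glob("*.txt")):  # resume.txt, bio.txt, ...
        chunks.extend(split_text(path.read_text()))
    return chunks

# chunks = load_chunks()
# vectors = embed_with_openai(chunks)    # embedding step (hypothetical helper)
# store_in_pgvector(chunks, vectors)     # storage step (hypothetical helper)
```

The overlap keeps sentences that straddle a chunk boundary retrievable from either side.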
The application is designed to be deployed with:
Both support automatic deployment on every push to the main branch.
Note: For production, use external database services like Neon for PostgreSQL and Upstash for Redis.
cd backend
python -m pytest
# With coverage
pytest --cov=app tests/

- Next.js Documentation - Next.js features and API
- FastAPI Documentation - FastAPI framework
- LangChain Documentation - AI agent framework
- OpenAI API Reference - GPT models
- PostgreSQL + pgvector - Vector similarity search
- Docker Compose - Multi-container applications
- Tailwind CSS - Styling framework
- Framer Motion - Animation library
This project is licensed under the MIT License - see the LICENSE file for details.
Roy Amit
- Portfolio: royamit.vercel.app
- GitHub: @royamit1
- LinkedIn: Roy Amit
β Star this repo if you found it helpful!