# Panathos AI

AI-powered search and research assistant built with Next.js, Supabase, and Gemini 2.5.
## Features

- 🔍 Multi-Stage Search: 4 search modes, from instant answers to deep research
- 🤖 AI-Powered: Gemini 2.5 Flash via LiteLLM for intelligent responses
- 📄 File Analysis: Upload and analyze PDFs, documents, and more
- 🔐 Secure Auth: Email, Google, and GitHub authentication via Supabase
- 📊 Usage Tracking: Monitor your searches and API usage
- 🎨 Beautiful UI: Modern design with dark/light mode support
## Tech Stack

- Frontend: Next.js 14, React, TypeScript, Tailwind CSS
- Backend: Next.js API Routes, LiteLLM (Python FastAPI)
- Database: Supabase (PostgreSQL with pgvector)
- Auth: Supabase Auth
- Search: Tavily API
- AI: Google Gemini 2.5 Flash
## Prerequisites

- Node.js 18+
- Python 3.11+
- Supabase account
- Tavily API key
- Google AI API key
## Getting Started

1. Clone the repository:

   ```bash
   git clone https://github.com/yourusername/panathos-ai.git
   cd panathos-ai
   ```

2. Install dependencies:

   ```bash
   npm install
   ```

3. Set up environment variables:

   ```bash
   cp .env.example .env.local
   ```

   Fill in your API keys and configuration.

4. Start the development server:

   ```bash
   npm run dev
   ```

5. (Optional) Start the LiteLLM backend:

   ```bash
   cd backend
   pip install -r requirements.txt
   uvicorn main:app --reload --port 8000
   ```
## Environment Variables

See `.env.example` for required environment variables.
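A minimal `.env.local` sketch is shown below. The variable names are assumptions based on the stack described in this README (only `LITELLM_API_URL` appears elsewhere in the doc); always check `.env.example` for the authoritative list:

```env
# Supabase (names assumed; verify against .env.example)
NEXT_PUBLIC_SUPABASE_URL=https://your-project.supabase.co
NEXT_PUBLIC_SUPABASE_ANON_KEY=your-anon-key

# Search and AI providers (names assumed)
TAVILY_API_KEY=your-tavily-key
GOOGLE_API_KEY=your-google-ai-key

# Optional LiteLLM backend
LITELLM_API_URL=http://localhost:8000
```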
## Project Structure

```
├── app/              # Next.js App Router
│   ├── api/          # API routes
│   ├── auth/         # Auth pages
│   ├── chat/         # Chat interface
│   ├── dashboard/    # Usage dashboard
│   ├── files/        # File management
│   ├── research/     # Research projects
│   └── settings/     # User settings
├── components/       # React components
│   ├── chat/         # Chat-specific components
│   ├── layout/       # Layout components
│   ├── providers/    # Context providers
│   └── ui/           # shadcn/ui components
├── lib/              # Utility libraries
│   ├── context/      # Context management
│   ├── prompts/      # System prompts
│   ├── search/       # Tavily integration
│   ├── supabase/     # Supabase client
│   └── usage/        # Usage tracking
├── backend/          # LiteLLM Python service
└── docs/             # Documentation
```
## Search Modes

| Mode | Time | Stages | Use Case |
|---|---|---|---|
| Instant | ~2s | 1 | Quick facts and definitions |
| Fast | ~10s | 2-3 | Simple research questions |
| Medium | ~30s | 4-6 | Thorough analysis |
| Slow | ~60s | 6-8 | Comprehensive research |
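The mode table above could be encoded in the frontend roughly as follows. This is an illustrative sketch; the names `SearchMode`, `ModeBudget`, and `MODE_BUDGETS` are assumptions, not identifiers from the codebase:

```typescript
// Hypothetical helper: maps a search mode to the stage budget and
// rough latency target from the table above.
type SearchMode = "instant" | "fast" | "medium" | "slow";

interface ModeBudget {
  maxStages: number;     // upper bound on pipeline stages
  targetSeconds: number; // approximate end-to-end latency target
}

const MODE_BUDGETS: Record<SearchMode, ModeBudget> = {
  instant: { maxStages: 1, targetSeconds: 2 },
  fast:    { maxStages: 3, targetSeconds: 10 },
  medium:  { maxStages: 6, targetSeconds: 30 },
  slow:    { maxStages: 8, targetSeconds: 60 },
};

function budgetFor(mode: SearchMode): ModeBudget {
  return MODE_BUDGETS[mode];
}

console.log(budgetFor("medium").maxStages); // 6
```

Keeping the budgets in one lookup table makes it easy to tune stage counts per mode without touching the pipeline code.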
## Deployment

### Frontend (Vercel)

1. Push to GitHub
2. Import the project into Vercel
3. Add environment variables
4. Deploy

### Backend (LiteLLM)

1. Deploy the `backend/` directory
2. Set environment variables
3. Update `LITELLM_API_URL` in Vercel
## Contributing

Contributions are welcome! Please read our contributing guidelines first.
## License

MIT License - see `LICENSE` for details.