Cortex AI is an AI-powered search and chat platform that integrates multiple language models and search providers for enhanced information retrieval and interaction.
- Multiple Search Providers: SearxNG, Tavily, Serper, Bing
- Multiple LLM Support: OpenAI, Groq, Azure, Ollama, and custom models
- Docker Integration: Fully dockerized setup for easy deployment
- Modern UI: Sleek interface built with Next.js and Tailwind CSS
- Local Modes: Support for running with local models
```
├── src/
│   ├── backend/                # Python FastAPI backend
│   │   ├── alembic/            # Database migrations
│   │   ├── db/                 # Database models and utilities
│   │   ├── llm/                # LLM integrations
│   │   ├── search/             # Search provider integrations
│   │   └── main.py             # Main FastAPI application
│   │
│   └── frontend/               # Next.js frontend
│       ├── app/                # Next.js app router
│       ├── components/         # UI components
│       ├── hooks/              # React hooks
│       ├── lib/                # Utility functions
│       └── styles/             # CSS styles
│
├── docker-compose.dev.yaml     # Docker Compose configuration
├── standalone.Dockerfile       # All-in-one Docker setup
├── .env-example                # Environment variables template
└── start.sh                    # Startup script
```
- Docker and Docker Compose
- Node.js (for local development)
- Python 3.11+ (for local development)
- Clone the repository:

  ```bash
  git clone https://github.com/i-anubhav-anand/Cortex_AI.git
  cd Cortex_AI
  ```

- Copy the environment template:

  ```bash
  cp .env-example .env
  ```

- Update the `.env` file with your API keys and preferences.

- Run using Docker Compose:

  ```bash
  ./start.sh
  ```

  Or manually:

  ```bash
  docker-compose -f docker-compose.dev.yaml up --build
  ```

- Access the application:
  - Frontend: http://localhost:3000
  - Backend API: http://localhost:8000
  - SearxNG: http://localhost:8080
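Once the containers are up, a quick way to confirm that all three services are listening is a small TCP check. This is a convenience sketch, not part of the repository; the ports match the URLs above.

```python
import socket

def is_up(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Ports used by the Docker Compose setup described above.
SERVICES = {
    "frontend": 3000,  # Next.js
    "backend": 8000,   # FastAPI
    "searxng": 8080,   # SearxNG
}

if __name__ == "__main__":
    for name, port in SERVICES.items():
        status = "up" if is_up("localhost", port) else "down"
        print(f"{name:8s} localhost:{port} -> {status}")
```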
- Navigate to the backend directory:

  ```bash
  cd src/backend
  ```

- Create a virtual environment and install dependencies:

  ```bash
  python -m venv venv
  source venv/bin/activate  # On Windows: venv\Scripts\activate
  pip install poetry
  poetry install
  ```

- Start the backend server:

  ```bash
  uvicorn main:app --reload --host 0.0.0.0 --port 8000
  ```
- Navigate to the frontend directory:

  ```bash
  cd src/frontend
  ```

- Install dependencies:

  ```bash
  pnpm install
  ```

- Start the development server:

  ```bash
  pnpm dev
  ```
Contributions are welcome! Please feel free to submit a Pull Request.
This project is licensed under the MIT License - see the LICENSE file for details.