Local Model Context Protocol (MCP) server with semantic search (RAG) over code repositories. It exposes the context of multiple projects to external AI clients (Claude.ai, Gemini, Claude Code), eliminating the need to re-supply context manually at the start of each session.
Stack: Python · FastMCP · Qdrant · sentence-transformers · Next.js · Docker · Cloudflare Tunnel
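The retrieval flow behind the semantic search can be sketched in miniature: embed the query, score it against stored chunk embeddings, and return the best match. The vectors below are hand-made stand-ins for real `sentence-transformers` embeddings, and the in-memory list stands in for Qdrant; this is an illustration, not the server's actual code:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# (chunk text, embedding) pairs; in the real server these live in Qdrant
index = [
    ("def connect_db(): ...", [0.9, 0.1, 0.0]),
    ("def render_page(): ...", [0.1, 0.9, 0.1]),
    ("def close_db(): ...", [0.7, 0.3, 0.2]),
]

query_vec = [0.85, 0.15, 0.05]  # pretend embedding of "database connection"
ranked = sorted(index, key=lambda item: cosine(query_vec, item[1]), reverse=True)
print(ranked[0][0])  # the most relevant chunk is returned as context
```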
Make sure the items below are installed before continuing:
| Tool | Minimum version | Check |
|---|---|---|
| Python | 3.11 | `python3 --version` |
| Docker | 24.0 | `docker --version` |
| Docker Compose | 2.20 | `docker compose version` |
| Node.js | 18.0 | `node --version` |
| npm | 9.0 | `npm --version` |
| Git | any | `git --version` |
| cloudflared | any | `cloudflared --version` |
A Cloudflare account with an active domain is required for remote connectivity (Phase 3).
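The checks in the table can also be scripted. Below is a hypothetical preflight helper (not part of the repository) that verifies each tool is on `PATH` and that its reported version meets the minimum:

```python
import re
import shutil
import subprocess

# Minimum versions from the prerequisites table (subset for illustration)
MINIMUMS = {"python3": (3, 11), "node": (18, 0), "docker": (24, 0)}

def parse_version(text):
    """Extract the first X.Y version number from a tool's --version output."""
    match = re.search(r"(\d+)\.(\d+)", text)
    return (int(match.group(1)), int(match.group(2))) if match else None

def check(tool, minimum):
    if shutil.which(tool) is None:
        return f"{tool}: NOT FOUND"
    out = subprocess.run([tool, "--version"], capture_output=True, text=True)
    version = parse_version(out.stdout or out.stderr)
    status = "ok" if version and version >= minimum else f"needs >= {minimum}"
    return f"{tool}: {version} {status}"

if __name__ == "__main__":
    for tool, minimum in MINIMUMS.items():
        print(check(tool, minimum))
```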
```shell
# 1. Clone the repository
git clone https://github.com/ericlimabr/mcp-context.git
cd mcp-context

# 2. Copy the environment variables file
cp .env.example .env

# 3. Run setup: installs Python dependencies via uv and starts Qdrant
make setup
```

Note on `GITHUB_TOKEN`: the `.env` file includes a `GITHUB_TOKEN` variable for read access to repositories. To generate this token, go to GitHub Settings > Developer settings > Personal access tokens, create a new token, and paste it into your `.env` file.
Edit the `.env` file created in the previous step:

```shell
# Root path where your projects are stored on the machine
PROJECTS_ROOT=/home/your-user/projects

# Access password for the administrative panel
ADMIN_PASSWORD=choose-a-strong-password
```

Next, authenticate cloudflared and create the tunnel:

```shell
# Authenticate cloudflared with your Cloudflare account
cloudflared tunnel login

# Create the permanent tunnel
cloudflared tunnel create mcp-context

# Associate the tunnel with your subdomain (replace yourdomain.com)
cloudflared tunnel route dns mcp-context mcp.yourdomain.com
```

Create `cloudflared/config.yml` in the project root (this file is gitignored). Replace `<id>` with the UUID printed by the `tunnel create` command above:
```yaml
tunnel: mcp-context
credentials-file: /home/your-user/.cloudflared/<id>.json

ingress:
  - hostname: mcp.yourdomain.com
    service: http://localhost:17800
  - service: http_status:404
```

Next, set up the frontend. This step is only needed once; it creates the Next.js project and removes its internal `.git/` automatically:

```shell
make frontend-setup
```

Open the frontend, add a project path, and trigger indexing:
```shell
make frontend
# Access http://localhost:17801
```

Or run the indexer directly from the terminal:

```shell
make index
```

To start the MCP server with auto-reload (recommended while developing):
```shell
make dev
```

This starts Qdrant, the Cloudflare Tunnel, and the MCP server with hot-reload enabled. The server restarts automatically on every file change in `server/`.
To start all services in the background (MCP server, frontend, tunnel, Qdrant):
```shell
make server-dev
```

Available services:
| Service | URL |
|---|---|
| MCP Server | http://localhost:17800 |
| Frontend (admin + dashboard) | http://localhost:17801 |
| Qdrant (API) | http://localhost:17810 |
Real-time logs are available in `logs/`:

```shell
tail -f logs/server.log    # MCP server
tail -f logs/frontend.log  # frontend
tail -f logs/tunnel.log    # Cloudflare Tunnel
make qdrant-logs           # Qdrant
```

To stop everything:

```shell
make stop
```

To run the project fully containerized, useful for simulating the production environment or having everything start together with Docker:
```shell
# First time: build the images
make prod-build

# Start all containers
make prod-up

# Stop everything
make prod-down

# Real-time logs
make prod-logs
```

Warning: make sure `PROJECTS_ROOT` in `.env` points to the correct directory before running `prod-up`. The MCP server container mounts that path to access local files.
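One way to catch a bad mount early is a small preflight check before `prod-up`. This helper is illustrative and not shipped with the repository:

```python
import tempfile
from pathlib import Path

def validate_projects_root(value):
    """Fail fast if PROJECTS_ROOT is unset or does not point at a directory."""
    if not value:
        raise ValueError("PROJECTS_ROOT is not set in .env")
    path = Path(value).expanduser()
    if not path.is_dir():
        raise ValueError(f"PROJECTS_ROOT does not exist: {path}")
    return path

# Demo with a throwaway directory standing in for a real projects root
demo_root = tempfile.mkdtemp()
print(validate_projects_root(demo_root))
```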
| Command | Description |
|---|---|
| `make setup` | First run: installs deps via uv and starts Qdrant |
| `make frontend-setup` | Creates the Next.js project in `apps/frontend/` (once) |
| `make dev` | Starts Qdrant + tunnel + MCP server with auto-reload |
| `make server-dev` | Starts all services in the background |
| `make stop` | Stops all dev services |
| `make server` | Starts only the MCP server (port 17800) |
| `make frontend` | Starts only the Next.js frontend (port 17801) |
| `make tunnel` | Starts only the Cloudflare Tunnel |
| `make index` | Runs the indexing worker |
| `make qdrant-up` | Starts Qdrant via Docker Compose |
| `make qdrant-down` | Stops Qdrant |
| `make qdrant-logs` | Qdrant logs in real time |
| `make prod-build` | Builds all Docker images |
| `make prod-up` | Starts all containers |
| `make prod-down` | Stops all containers |
| `make prod-logs` | Logs of all containers in real time |
```
mcp-context/
├── server/                    # MCP Server (FastMCP/SSE, port 17800)
│   ├── main.py                # Entrypoint and MCP configuration
│   ├── tools/                 # MCP tool definitions (planned)
│   ├── resources/             # MCP resource definitions (planned)
│   └── embeddings.py          # Local embedding model loading (planned)
├── indexer/                   # Project indexing worker (planned)
│   ├── worker.py              # Orchestrates indexing
│   ├── chunker.py             # Function-scope chunking
│   └── qdrant_client.py       # Qdrant client abstraction
├── apps/
│   └── frontend/              # Next.js app: admin panel + dashboard (port 17801)
│                              # Generated by make frontend-setup
├── cloudflared/               # Cloudflare Tunnel config (gitignored)
│   └── config.yml
├── docs/
│   ├── ARCHITECTURE.md        # Detailed system architecture
│   ├── DECISIONS.md           # Architecture decisions
│   ├── ENDPOINTS.md           # Endpoints documentation
│   └── ROADMAP.md             # Implementation phases
├── logs/                      # Service logs in dev (gitignored)
├── .pids/                     # Process PIDs in dev (gitignored)
├── qdrant_data/               # Persisted Qdrant data (gitignored)
├── config.json                # Project configuration (gitignored)
├── .env                       # Environment variables (gitignored)
├── .env.example               # Environment variables template
├── docker-compose.yml         # Qdrant for development
├── docker-compose.prod.yml    # All services for local deploy
├── Makefile                   # Command shortcuts
├── pyproject.toml             # Python project and dependencies (uv)
├── CONTEXT.md                 # Project context for LLMs
└── README.md                  # This file
```
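The function-scope chunking that `chunker.py` is planned to implement can be sketched with the standard-library `ast` module: each top-level function or class becomes one retrievable chunk. This is an illustrative stand-in, not the repository's code:

```python
import ast

def chunk_python_source(source):
    """Split Python source into (name, code) chunks, one per top-level def/class."""
    tree = ast.parse(source)
    chunks = []
    for node in tree.body:
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef, ast.ClassDef)):
            chunks.append((node.name, ast.get_source_segment(source, node)))
    return chunks

sample = """\
def add(a, b):
    return a + b

class Greeter:
    def hello(self):
        return "hi"
"""
print(chunk_python_source(sample))
```

Chunking at function scope keeps each embedded vector focused on one coherent unit of behavior, which makes retrieval results easier to use as LLM context than fixed-size line windows.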
- The server depends on the local machine staying on to work remotely via Cloudflare Tunnel
- The Qdrant index reflects the state of the code at the time of the last indexing; configure a `post-commit` hook to keep the index updated automatically after each commit
- Binary files, images, and assets are not indexed
- The embedding model (`jina-embeddings-v2-base-code`, ~160 MB) is downloaded automatically on the first run via `sentence-transformers`
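The `post-commit` hook mentioned above can be sketched as a small script saved at `.git/hooks/post-commit` and made executable with `chmod +x`. Here `make index` is the project's own indexing command; everything else is illustrative, and a two-line shell script works just as well:

```python
#!/usr/bin/env python3
# Hypothetical .git/hooks/post-commit: git runs this after every commit,
# so re-running the indexer here keeps the Qdrant index in sync with HEAD.
import subprocess

INDEX_COMMAND = ["make", "index"]

def reindex():
    """Run the indexing worker from the repository root."""
    return subprocess.run(INDEX_COMMAND).returncode

# The real hook would end with:  raise SystemExit(reindex())
```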