
A Python-based server implementation of the Model Context Protocol (MCP) designed for Retrieval-Augmented Generation (RAG) workflows. It manages the flow of context, tool invocation, and model output in one unified service. Supports modular "Models" and "Modules" directories for extensible integrations, environment configuration, and easy deployment.

NSANTRA/RAG-MCP-Server



Python LangChain Claude Desktop Cursor IDE ChromaDB HuggingFace GPU License: MIT


TL;DR:

  • This project implements a Retrieval-Augmented Generation (RAG) MCP Server using LangChain wrappers for ChromaDB and Hugging Face models.
  • Designed for seamless integration with Claude Desktop and Cursor IDE as the MCP client.
  • Uses a single persistent Chroma vector database with multiple collections (domains).
  • Automatically retrieves and ranks the most relevant context for Claude, enabling domain-aware reasoning and citation-based responses.



PROJECT OVERVIEW

This project implements a LangChain-powered Retrieval-Augmented Generation (RAG) pipeline hosted as a FastMCP server for integration with Claude Desktop and Cursor IDE.

It uses:

  • langchain_chroma.Chroma for persistent, domain-based vector stores.
  • langchain_huggingface.HuggingFaceEmbeddings for local or HuggingFace embedding models.
  • langchain_community.cross_encoders.HuggingFaceCrossEncoder for local or Hugging Face Hub reranking models that improve retrieval relevance.
  • FastMCP, a lightweight Python framework with a FastAPI-style decorator interface, which exposes LangChain-based retrieval tools to any MCP client such as Claude Desktop or Cursor IDE.

Each Chroma collection represents a distinct knowledge domain or document. Claude queries are routed to the appropriate collection, which retrieves top-k results and returns relevant context and citations.
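Conceptually, the retrieval step ranks a collection's stored chunks by embedding similarity to the query and keeps the top-k. A minimal stdlib sketch of that idea, using toy two-dimensional vectors in place of real embeddings (this is an illustration of the principle, not the actual Chroma API):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def top_k(query_vec, chunks, k=2):
    """Return the k chunk texts most similar to the query vector.

    chunks: list of (text, embedding) pairs, as a vector store would hold them.
    """
    ranked = sorted(chunks, key=lambda c: cosine(query_vec, c[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

# Toy "collection": three chunks with hand-written embeddings.
docs = [
    ("about cats", [1.0, 0.0]),
    ("about dogs", [0.0, 1.0]),
    ("cats and dogs", [0.7, 0.7]),
]
print(top_k([1.0, 0.1], docs, k=2))  # → ['about cats', 'cats and dogs']
```

In the real pipeline, the query embedding comes from the configured HuggingFaceEmbeddings model and the similarity search runs inside Chroma, but the ranking idea is the same.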

⚡ Workflow:

```mermaid
flowchart TD
  Claude[Claude Desktop]
  MCP[MCP Server: FastMCP + LangChain]
  LangChain[LangChain Wrappers → ChromaDB + HuggingFace]
  Claude --> MCP --> LangChain --> Claude
```

FEATURES

  • PDF Embedding: Add PDFs locally or via URL directly into a chosen collection.
  • Smart Retrieval: Retrieve context chunks per collection or across multiple collections.
  • Reranking Support: Uses a HuggingFace cross-encoder reranker for better document relevance.
  • Document Management: List, rename, and inspect metadata for locally stored documents.
  • Collection Management: Create, list, and delete ChromaDB collections dynamically.
  • Citation Provider: Citations are generated from document metadata (e.g., page number, source document, and file path).
  • Self-Describing Tools: describeTools() lists all available MCP tools dynamically for introspection.
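To illustrate the Citation Provider feature, a citation string can be assembled directly from a chunk's metadata. The field names below (`source`, `page`) are assumptions mirroring typical LangChain PDF-loader metadata, not necessarily this project's exact schema:

```python
def format_citation(meta):
    """Build a short citation string from chunk metadata.

    Assumed keys: 'source' (document name/path) and optional 'page'.
    """
    src = meta.get("source", "unknown source")
    page = meta.get("page")
    return f"[{src}, p. {page}]" if page is not None else f"[{src}]"

print(format_citation({"source": "paper.pdf", "page": 3}))  # → [paper.pdf, p. 3]
print(format_citation({"source": "notes.pdf"}))             # → [notes.pdf]
```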

AVAILABLE TOOLS

This MCP server exposes a set of tools that an MCP client can invoke to perform document and collection operations, including embedding, retrieval, metadata management, and citation generation.

For a full list of available tools, their arguments, and example usage, see the dedicated documentation:
View All Tools → TOOLS.md
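For a sense of how a self-describing tool list like describeTools() can work, here is a framework-free sketch of a decorator registry built on Python's inspect module. The tool name retrieveContext and its parameters are hypothetical, and the real server registers tools through FastMCP rather than this hand-rolled registry:

```python
import inspect

TOOLS = {}

def tool(fn):
    """Register a function's name, docstring, and signature for introspection."""
    TOOLS[fn.__name__] = {
        "doc": inspect.getdoc(fn),
        "signature": str(inspect.signature(fn)),
    }
    return fn

@tool
def retrieveContext(query: str, collection: str, k: int = 5):
    """Return the top-k chunks for a query from one collection."""
    ...

def describeTools():
    """List all registered tools with their docs and signatures."""
    return TOOLS

print(describeTools()["retrieveContext"]["signature"])
# → (query: str, collection: str, k: int = 5)
```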


GETTING STARTED

🔧 Prerequisites

Important

  • Ensure Anaconda and Git are installed. If not, download them from the official Anaconda and Git websites.

⚙️ Installation

  1. Create and Activate Conda Environment

```shell
conda create -n MCP python=3.11.13 -y
conda activate MCP
```

  2. Clone the Repository

```shell
git clone https://github.com/NSANTRA/RAG-MCP-Server.git
cd RAG-MCP-Server
```

  3. Install Dependencies

```shell
pip install -r requirements.txt
```

  4. Configure .env

```
# Example:

# If your system has the Nvidia CUDA Toolkit set up, set DEVICE to "cuda"; otherwise use "cpu"
DEVICE = "cuda"

DOCUMENT_DIR = "C:/Users/<yourusername>/Projects/RAG-MCP-Server/Documents"
CHROMA_DB_PERSIST_DIR = "C:/Users/<yourusername>/Projects/RAG-MCP-Server/Databases"

EMBEDDING_MODEL = "C:/Users/<yourusername>/Projects/RAG-MCP-Server/Models/MiniLM"
RERANKER_MODEL = "C:/Users/<yourusername>/Projects/RAG-MCP-Server/Models/MiniLM-Reranker"
```

Caution

Use absolute paths wherever a path is required.

Tip

  • The configuration above uses locally downloaded models. You can fetch them with the Download Model.py script, and change the models if needed.
  • You can also point EMBEDDING_MODEL or RERANKER_MODEL at any Hugging Face model instead of a local path.
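For reference, a .env file like the one above can be parsed with a few lines of stdlib Python. The real project may rely on a library such as python-dotenv instead; this sketch only shows the key=value convention the file follows (the sample path is a placeholder):

```python
def parse_env(text):
    """Parse simple KEY = "value" lines, skipping blanks and # comments."""
    cfg = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        cfg[key.strip()] = value.strip().strip('"')
    return cfg

sample = '''
# device selection
DEVICE = "cuda"
DOCUMENT_DIR = "C:/Users/me/Projects/RAG-MCP-Server/Documents"
'''
cfg = parse_env(sample)
print(cfg["DEVICE"])  # → cuda
```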

INTEGRATIONS

Important

You need to install the Claude Desktop app or Cursor IDE before running the MCP server, since it requires an MCP client.

These MCP clients launch the RAG MCP Server automatically once it is registered in their MCP configuration file.
You do not need to run the Python script manually.

Claude Desktop Integration

🛠️ Setup Instructions

  • Add the following entry to your Claude MCP configuration file (typically located in your Claude Desktop settings folder).
  • You can open the MCP configuration file via Settings → Developer → Edit Config.
  • Then add the following JSON config:
```json
{
  "mcpServers": {
    "RAG": {
      "command": "C:/Users/<yourusername>/anaconda3/envs/MCP/python.exe",
      "args": ["<absolute path to Main.py>"],
      "options": {
        "cwd": "<absolute path to the project root directory>"
      }
    }
  }
}
```

⚠️ Common Issue: If Claude fails to start the MCP server, ensure that:

  • The Python path points to your Conda environment’s executable.
  • Main.py has no syntax errors and dependencies are installed.
  • The cwd option matches your project root directory.
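A quick way to catch two of these issues is to load the config and check that the referenced paths actually exist on disk. A small stdlib sketch (the paths below are placeholders; substitute your own before running):

```python
import json
import pathlib

# Paste your mcpServers entry here; these values are placeholders.
config_text = '''{
  "mcpServers": {
    "RAG": {
      "command": "C:/Users/me/anaconda3/envs/MCP/python.exe",
      "args": ["C:/Users/me/Projects/RAG-MCP-Server/Main.py"],
      "options": {"cwd": "C:/Users/me/Projects/RAG-MCP-Server"}
    }
  }
}'''

cfg = json.loads(config_text)  # raises an error if the JSON is malformed
server = cfg["mcpServers"]["RAG"]

# Collect any referenced paths that do not exist on this machine.
missing = [p for p in (server["command"], *server["args"], server["options"]["cwd"])
           if not pathlib.Path(p).exists()]
print("paths not found:", missing)
```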

Cursor IDE Integration

🛠️ Setup Instructions

  • Open your project in Cursor IDE and go to File → Preferences → Cursor Settings → Tools & MCP → New MCP Server to open your MCP configuration file.
  • Add the following JSON entry under the "mcpServers" section (adjusting paths as needed):
```json
{
  "mcpServers": {
    "RAG": {
      "command": "C:/Users/<yourusername>/anaconda3/envs/MCP/python.exe",
      "args": ["<absolute path to Main.py>"],
      "options": {
        "cwd": "<absolute path to the project root directory>"
      }
    }
  }
}
```

PROJECT STRUCTURE

```
├── Main.py                 # Entry point - starts the FastMCP server
│
├── Modules/
│ ├── Config.py             # Loads env vars, sets up embeddings & reranker
│ ├── Core.py               # Document-level utilities (metadata, citation, rename)
│ ├── Database.py           # ChromaDB logic for embedding/retrieval
│ ├── Utils.py              # Helper functions (file ops, reranking)
│ └── ToolDefinition.py     # MCP tool manifests and argument schemas
│
├── .env                    # Environment configuration
├── requirements.txt        # Dependencies
└── README.md
```

REFERENCES

  1. LangChain RAG Workflow
    LangChain Documentation — RAG

  2. Chroma Vector Database
    Chroma Docs

  3. HuggingFace Embeddings and Cross-Encoders
    Sentence Transformers
    Cross-Encoder Models

  4. Anthropic MCP & Claude Desktop
    Model Context Protocol Official Site
    Claude Desktop Overview


LICENSE

MIT License

Copyright (c) 2025 Neelotpal Santra

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
