
PathRAG - Path-based Retrieval Augmented Generation

PathRAG is an implementation of the paper "PathRAG: Pruning Graph-based Retrieval Augmented Generation with Relational Paths". This system improves retrieval-augmented generation by extracting and utilizing key relational paths from an indexing graph.

System Overview

The application consists of four main components running in Docker containers:

  1. Frontend (React + Vite)

    • Modern UI built with Material-UI
    • Interactive knowledge graph visualization
    • File upload and query interface
    • Real-time model selection and results display
  2. Backend (FastAPI)

    • RESTful API endpoints for data processing
    • Knowledge graph construction
    • Path-based retrieval implementation
    • Integration with Weaviate and Ollama
  3. Weaviate (Vector Database)

    • Stores and indexes knowledge graph nodes
    • Enables semantic similarity search
    • Manages relationships between entities
  4. Ollama (LLM Service)

    • Local LLM deployment
    • Handles query processing
    • Generates natural language responses

Features

  • Document Upload

    • Support for PDF, TXT, and DOCX files
    • Raw text input option
    • Multiple file upload capability
    • Progress tracking
  • Knowledge Base Management

    • Create and manage multiple knowledge bases
    • Add descriptions and metadata
    • Organize documents by topic or domain
  • Knowledge Graph Visualization

    • Interactive graph display
    • Node and relationship exploration
    • Path visualization for query results
  • Query Interface

    • Natural language querying
    • Model selection dropdown
    • Query history tracking
    • Detailed response visualization

Installation

  1. Prerequisites

    • Docker and Docker Compose
    • Git
    • 8GB+ RAM recommended
    • NVIDIA GPU (optional, for improved LLM performance)
  2. Clone the Repository

    git clone https://github.com/Anubis-Labs/PathRAG-System.git
    cd PathRAG-System
  3. Start the Services

    # Start all services
    docker-compose up -d
  4. Access the Application

    Open http://localhost in your browser. The frontend is served by Nginx on port 80 and proxies API requests to the backend.
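
After the stack is up, it helps to confirm each service is reachable before uploading documents. The snippet below is a minimal sketch using Python and requests; it assumes the default ports from the Architecture section and that the Weaviate and Ollama ports are published to the host (the compose excerpt later in this README only publishes the frontend port, so adjust as needed).

# check_services.py - quick reachability check after `docker-compose up -d`
import requests

CHECKS = {
    "frontend": "http://localhost/",                           # Nginx-served UI
    "weaviate": "http://localhost:8080/v1/.well-known/ready",  # Weaviate readiness probe
    "ollama":   "http://localhost:11434/api/tags",             # lists locally pulled models
}

for name, url in CHECKS.items():
    try:
        status = requests.get(url, timeout=5).status_code
        print(f"{name:10s} {url} -> HTTP {status}")
    except requests.RequestException as exc:
        print(f"{name:10s} {url} -> unreachable ({type(exc).__name__})")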

Architecture

Frontend (Port 80)

  • Vite + React application
  • Material-UI components
  • Nginx for static file serving and API proxying
  • Environment configuration via VITE_API_URL

Backend (Port 8000)

  • FastAPI application
  • Uvicorn ASGI server
  • File processing and KG construction
  • Path-based retrieval implementation

Weaviate (Port 8080)

  • Vector database for KG storage
  • RESTful and GraphQL APIs
  • Configurable vectorizer modules
  • Persistent data storage
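
The backend owns the knowledge graph schema, but Weaviate's standard REST API makes it easy to peek at what has been indexed. A minimal sketch, assuming the Weaviate port is published on 8080; the class and property names you see depend on how the backend builds the graph:

# inspect_schema.py - list Weaviate classes and their properties
import requests

resp = requests.get("http://localhost:8080/v1/schema", timeout=5)
resp.raise_for_status()

for cls in resp.json().get("classes", []):
    props = [p["name"] for p in cls.get("properties", [])]
    print(f"{cls['class']}: {props}")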

Ollama (Port 11434)

  • Local LLM service
  • Multiple model support
  • REST API for inference
  • Model management
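
The backend talks to Ollama over this REST API, and you can call it directly to sanity-check a model before routing queries through PathRAG. A minimal sketch, assuming the Ollama port is published on 11434 and that a model (llama3 is used here purely as an example) has already been pulled:

# ollama_smoke_test.py - one-shot generation against the local Ollama service
import requests

payload = {
    "model": "llama3",   # any model already pulled into the Ollama container
    "prompt": "In one sentence, what is retrieval-augmented generation?",
    "stream": False,     # return a single JSON object instead of a stream
}
resp = requests.post("http://localhost:11434/api/generate", json=payload, timeout=120)
resp.raise_for_status()
print(resp.json()["response"])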

API Endpoints

Upload API

  • POST /upload/files - Upload documents
  • POST /upload/knowledge_base - Create knowledge base
  • GET /query/kbs - List knowledge bases
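
A document upload against these endpoints might look like the sketch below. The multipart field name ("files") and the direct use of port 8000 are assumptions rather than a documented contract; in the compose excerpt further down the backend is only exposed internally, so you may need to go through the Nginx proxy on port 80 instead.

# upload_example.py - illustrative multipart upload to the backend
import requests

API = "http://localhost:8000"   # or the proxied path behind the frontend on port 80

with open("paper.pdf", "rb") as f:
    resp = requests.post(
        f"{API}/upload/files",
        files={"files": ("paper.pdf", f, "application/pdf")},  # field name is illustrative
    )
resp.raise_for_status()
print(resp.json())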

Query API

  • GET /query/models - List available models
  • POST /query/query - Execute queries
  • GET /query/graph - Retrieve graph data
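
A query round-trip might then look like the following sketch; the request-body keys ("query", "model") are illustrative assumptions, not confirmed by this README.

# query_example.py - list models, then run a natural-language query
import requests

API = "http://localhost:8000"   # see the note above about the proxied backend

models = requests.get(f"{API}/query/models", timeout=10).json()
print("available models:", models)

payload = {
    "query": "Which relational paths connect entity A to entity B?",
    "model": "llama3",           # field names are assumptions for illustration
}
answer = requests.post(f"{API}/query/query", json=payload, timeout=120)
answer.raise_for_status()
print(answer.json())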

Configuration

Docker Compose

version: '3.8'
services:
  frontend:
    build: ./frontend
    ports:
      - "80:80"
    environment:
      - VITE_API_URL=

  backend:
    build: ./backend
    expose:
      - "8000"
    environment:
      - WEAVIATE_URL=http://weaviate:8080
      - OLLAMA_URL=http://ollama:11434

  weaviate:
    image: semitechnologies/weaviate:1.24.1
    environment:
      - QUERY_DEFAULTS_LIMIT=20
      - AUTHENTICATION_ANONYMOUS_ACCESS_ENABLED=true

  ollama:
    image: ollama/ollama:latest
    volumes:
      - ollama-models:/root/.ollama

# Top-level declaration required for the named volume referenced above
volumes:
  ollama-models:

Development

Frontend Development

cd frontend
npm install
npm run dev

Backend Development

cd backend
pip install -r requirements.txt
uvicorn main:app --reload

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Commit your changes
  4. Push to the branch
  5. Create a Pull Request

License

[Add your license information here]

References

  • "PathRAG: Pruning Graph-based Retrieval Augmented Generation with Relational Paths" (the paper this system implements)