📄 RAG-Powered PDF QA App

This project is an end-to-end Retrieval-Augmented Generation (RAG) system that lets users:

  • Upload a PDF
  • Index it using FAISS
  • Ask natural language questions about its content
  • Get answers using LLMs (via OpenRouter)

Built with FastAPI, LangChain, FAISS, Streamlit, and Docker.
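
Under the hood, the flow is: parse the PDF into chunks, embed and index the chunks with FAISS, retrieve the most relevant chunks for a question, and hand them to the LLM. The sketch below illustrates the ingestion-and-retrieval half of that flow; the loader, chunking parameters, and embedding model are assumptions for illustration, not necessarily what app/services/ actually uses.

# Illustrative ingestion + retrieval sketch; the real logic lives in app/services/.
# The loader, chunking parameters, and embedding model here are assumptions.
from langchain_community.document_loaders import PyPDFLoader
from langchain_community.embeddings import HuggingFaceEmbeddings
from langchain_community.vectorstores import FAISS
from langchain_text_splitters import RecursiveCharacterTextSplitter

def index_pdf(path: str) -> FAISS:
    pages = PyPDFLoader(path).load()                      # one Document per page
    chunks = RecursiveCharacterTextSplitter(
        chunk_size=1000, chunk_overlap=100
    ).split_documents(pages)                              # split pages into overlapping chunks
    embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")
    return FAISS.from_documents(chunks, embeddings)       # build the FAISS index

def retrieve(store: FAISS, question: str, k: int = 4):
    return store.similarity_search(question, k=k)         # top-k chunks to feed the LLM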


🚀 Features

  • 📄 PDF Upload: Upload and process PDF documents
  • 🔎 Vector Search: Index and search text chunks using FAISS
  • 🤖 LLM Answers: Generate responses using OpenRouter-compatible models
  • 🌐 Frontend UI: Streamlit-based interface for uploads and queries
  • 🐳 Dockerized: Fully containerized backend with Docker Compose
  • 🧪 Tested: Unit and integration tests using pytest

🧠 Tech Stack

  • Backend: FastAPI, LangChain, FAISS
  • LLM: OpenRouter (LLaMA 3.3 8B Instruct)
  • Frontend: Streamlit
  • Containerization: Docker + Docker Compose
  • Testing: Pytest + pytest-asyncio

🏗️ Project Structure

├── app/                    # FastAPI backend
│   ├── main.py             # API entrypoint
│   ├── routes/             # Upload, query, metadata APIs
│   ├── services/           # Ingestion, retrieval, generation logic
│   └── utils/              # PDF parser
├── frontend/               # Streamlit frontend
│   └── frontend.py
├── tests/                  # Unit and integration tests
│   └── test_retrieval_query.py
├── Dockerfile              # Backend Docker config
├── docker-compose.yml      # Orchestration
├── requirements.txt        # Python dependencies
└── README.md               # This file
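
app/main.py wires the route modules into a single FastAPI app. The snippet below is only a sketch of that wiring; the actual module and router names inside app/routes/ may differ.

# Sketch of app/main.py; module and router names in app/routes/ are assumptions.
from fastapi import FastAPI

from app.routes import metadata, query, upload  # assumed module names

app = FastAPI(title="RAG-Powered PDF QA API")

app.include_router(upload.router, tags=["upload"])
app.include_router(query.router, tags=["query"])
app.include_router(metadata.router, tags=["metadata"])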

⚙️ Setup Instructions

🔧 Clone the repo

git clone https://github.com/AnuragSingh457/RAGProject.git
cd RAGProject

🐳 Backend: Docker Setup

docker-compose up --build

➡️ Backend will be live at: http://localhost:8000
➡️ Test it via the interactive docs: http://localhost:8000/docs
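
The Swagger UI lists the actual routes. As a rough example of driving the API from Python once the container is up, something like the following should work; the /upload and /query paths and payload shapes are assumptions here, so check /docs for the real ones.

# Example client calls; endpoint paths and payload shapes are assumptions,
# verify them against http://localhost:8000/docs.
import requests

BASE = "http://localhost:8000"

with open("sample.pdf", "rb") as f:
    upload = requests.post(f"{BASE}/upload",
                           files={"file": ("sample.pdf", f, "application/pdf")})
upload.raise_for_status()

answer = requests.post(f"{BASE}/query",
                       json={"question": "What is this document about?"})
print(answer.json())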

🌐 Frontend: Streamlit App

cd frontend
pip install -r requirements.txt  # or: pip install streamlit requests
streamlit run frontend.py

➡️ App will run at: http://localhost:8501
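
frontend.py is a thin Streamlit client over the backend API. The sketch below shows the general shape of such a client; the widget layout, endpoint paths, and response fields are assumptions rather than the exact contents of the file.

# Sketch of a Streamlit client; endpoint paths and response fields are assumptions.
import requests
import streamlit as st

API_URL = "http://localhost:8000"

st.title("RAG-Powered PDF QA")

pdf = st.file_uploader("Upload a PDF", type="pdf")
if pdf is not None:
    resp = requests.post(f"{API_URL}/upload",
                         files={"file": (pdf.name, pdf.getvalue(), "application/pdf")})
    if resp.ok:
        st.success("PDF indexed")
    else:
        st.error(f"Upload failed: {resp.status_code}")

question = st.text_input("Ask a question about the document")
if st.button("Ask") and question:
    resp = requests.post(f"{API_URL}/query", json={"question": question})
    st.write(resp.json().get("answer", resp.text))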


🔐 Environment Setup

Create a .env file (for OpenRouter LLM):

OPENAI_API_KEY=your_openrouter_api_key
OPENAI_API_BASE=https://openrouter.ai/api/v1
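
These variable names follow the OpenAI-compatible convention, so an OpenAI-style client can be pointed at OpenRouter by reading them. A minimal sketch, assuming langchain-openai and an illustrative model id (the repo's actual client wiring and Llama variant may differ):

# Sketch of an OpenRouter-backed LLM client; the model id is illustrative and
# the repo's actual wiring in app/services/ may differ.
import os

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="meta-llama/llama-3.1-8b-instruct",  # example OpenRouter model id
    api_key=os.environ["OPENAI_API_KEY"],      # OpenRouter key from .env
    base_url=os.environ["OPENAI_API_BASE"],    # https://openrouter.ai/api/v1
)

print(llm.invoke("Answer using only the provided context: ...").content)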

🧪 Run Tests

docker exec -it rag_api env PYTHONPATH=/app pytest tests/

Includes:

  • PDF upload and FAISS ingestion test
  • Querying + LLM answer generation
  • API error handling
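
As a rough illustration of the style of these tests (endpoint paths, required fields, and expected status codes are assumptions, not copied from tests/test_retrieval_query.py):

# Illustrative tests in the spirit of tests/test_retrieval_query.py;
# endpoint paths, required fields, and status codes are assumptions.
from fastapi.testclient import TestClient

from app.main import app

client = TestClient(app)

def test_query_requires_a_question():
    # Assuming the query route validates a required "question" field.
    resp = client.post("/query", json={})
    assert resp.status_code == 422

def test_upload_rejects_non_pdf(tmp_path):
    # Assuming the upload route checks the file type before ingestion.
    bad_file = tmp_path / "notes.txt"
    bad_file.write_text("plain text, not a PDF")
    with bad_file.open("rb") as f:
        resp = client.post("/upload", files={"file": ("notes.txt", f, "text/plain")})
    assert resp.status_code in (400, 415, 422)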

🌍 Deployment Options

You can deploy this project to:

  • 🚀 Streamlit Cloud (for the frontend)
  • 🐳 Render / Railway / Fly.io (for Dockerized backend)

🙌 Credits

  • Built with ❤️ using FastAPI, LangChain, Streamlit, and FAISS
  • LLMs powered by OpenRouter

You can see the app live at: https://ragproject-mewe49drlug4gkgrqprnjh.streamlit.app/
