
🧠 RAG Chatbot with LLaMA 3 & Gradio


A lightweight and interactive chatbot powered by Meta's LLaMA 3 and enhanced through Retrieval-Augmented Generation (RAG) for domain-specific answers. Comes with a clean Gradio UI for easy access and testing.


🚀 Features

  • 🔍 RAG architecture for better factual accuracy
  • 🤖 Powered by Meta's LLaMA 3 model
  • 🎛️ Easy-to-use Gradio interface
  • 📚 Plug in your own data (PDFs, text files, etc.)
  • 🌐 Local or cloud-hosted deployment

📸 Demo Screenshot

Case Study: We provide the details of an object in plain text and ask questions about its properties. The LLM must reply based only on the provided context.
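For illustration only (a hypothetical example, not the actual contents of data/demo.txt): if the provided text states "The artifact is a bronze astrolabe built in 1580", then asking "What is the artifact made of?" should yield "bronze", while a question whose answer is not in the text should be declined rather than guessed.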


πŸ“ Project Structure

rag-llama3-gradio/
├── main.py                # Entry point: runs the Gradio app
├── loader.py              # Loads and chunks documents for processing
├── embedder.py            # Handles embeddings and indexing using ChromaDB
├── rag_engine.py          # Retrieval + Generation logic using LLaMA 3
├── requirements.txt       # Python dependencies
├── README.md              # Project documentation
├── data/                  # Source documents for retrieval
│   └── demo.txt           # Sample file for testing
└── .gitignore             # Files and folders to exclude from Git
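
At a high level, the modules fit together roughly as in the sketch below. This is a minimal illustration, not the repository's actual code: the function names, the chunking strategy, and the use of ChromaDB's default embedding model are assumptions, and loader.py, embedder.py, and rag_engine.py may implement these steps differently.

import chromadb

# loader.py: read a document and split it into fixed-size chunks (assumed strategy)
with open("data/demo.txt", encoding="utf-8") as f:
    text = f.read()
chunks = [text[i:i + 500] for i in range(0, len(text), 500)]

# embedder.py: index the chunks in ChromaDB (its default embedding model vectorizes them)
client = chromadb.Client()
collection = client.create_collection(name="docs")
collection.add(documents=chunks, ids=[f"chunk-{i}" for i in range(len(chunks))])

# rag_engine.py: retrieve the most relevant chunks and build a context-bound prompt
question = "What is the object made of?"
results = collection.query(query_texts=[question], n_results=3)
context = "\n".join(results["documents"][0])
prompt = (
    "Answer using only the context below.\n\n"
    f"Context:\n{context}\n\nQuestion: {question}"
)
# The prompt is then sent to LLaMA 3, and the answer is shown in the Gradio UI.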

🛠️ Installation

git clone https://github.com/raf-init/rag-llama3-gradio.git
cd rag-llama3-gradio
pip install -r requirements.txt
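
Then start the Gradio app (assuming main.py launches it as described above):

python main.py

Gradio will print a local URL (typically http://127.0.0.1:7860) that you can open in your browser.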
