A Retrieval-Augmented Generation (RAG) chatbot built with Streamlit and open-source models.
- Retrieval-Augmented Generation (RAG) for context-aware responses.
- DeepSeek R1 as the LLM for generating answers.
- ChromaDB as the vector store for efficient document retrieval.
- Nomic Embed Text for embedding knowledge into a searchable format.
- Streamlit UI for easy interaction with the AI chatbot.
Project structure:

```
.
├── app.py
├── agents.py     # Main Streamlit app
├── nature.pdf    # Sample knowledge document
└── README.md     # Project documentation
```
Clone the repository:

```
git clone https://github.com/RAJA102002/ragbasedchatbot.git
```

Prerequisites:
- Python 3.8+
- Ollama

Make sure Ollama is installed and running, then pull the models:
Download DeepSeek R1:
```
ollama pull deepseek-llm:latest
```
Download the Nomic Embed Text model:
```
ollama pull nomic-embed-text:latest
```
Install the required Python packages:

```
pip install streamlit chromadb openai
```
The app talks to Ollama through its OpenAI-compatible endpoint, so point the OpenAI SDK at the local server and set a placeholder API key:

```
export OPENAI_BASE_URL=http://localhost:11434/v1
export OPENAI_API_KEY=fake-key
```

Then launch the app:

```
streamlit run agents.py
```

How It Works
- Loads Knowledge – Uses a sample PDF (e.g. nature.pdf) for retrieval-based answering.
- Embeds Data – Uses Nomic Embed Text to convert the document into a vectorized, searchable format.
- Retrieves Relevant Info – Searches ChromaDB for the most relevant content.
- Generates Responses – Feeds retrieved data into DeepSeek R1 for contextual answers.
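The retrieval step above can be sketched with a small, dependency-free example. The bag-of-words "embedding", the chunk texts, and all function names here are illustrative stand-ins for Nomic Embed Text and the real document; the final DeepSeek R1 call is left out.

```python
# Minimal sketch of the retrieve-then-generate loop: embed the query,
# rank stored chunks by cosine similarity, and hand the best chunk to
# the LLM as context.  The "embedding" is a toy word-count vector.
import math
import re
from collections import Counter

def embed(text):
    """Toy embedding: lowercase word counts (placeholder for a real model)."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, chunks, k=1):
    """Return the k chunks most similar to the query."""
    q = embed(query)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]

chunks = [
    "Photosynthesis converts sunlight into chemical energy in plants.",
    "Rivers shape landscapes through erosion over long time scales.",
]
context = retrieve("How do plants use sunlight?", chunks)
print(context[0])  # the chunk that would be passed to the LLM as context
```

In the real app, ChromaDB replaces the `sorted` scan and the model's embeddings replace the word counts, but the control flow is the same.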
Example Usage
- Run the app and open the Streamlit UI.
- Ask a question related to the uploaded document.
- Get AI-generated responses based on retrieved knowledge!