RAG-based chatbot with LLM and Qdrant
Clone the repo:
git clone https://github.com/Coding-Rod/delve_final_project.git
Install python dependencies:
pip install -r requirements.txt
Get a Groq API key from here
Create a .env file in the root of the project with the following content:
GROQ_API_KEY="YOUR API KEY"
Start project:
streamlit run src/app.py
Extracts text from PDF documents and creates chunks (using semantic and character splitters) that are stored in a vector database.
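A minimal sketch of what this ingestion step could look like, assuming `pypdf`, a LangChain character splitter, `sentence-transformers`, and `qdrant-client`; the function name, embedding model, chunk sizes, and collection name are illustrative, not the project's exact implementation:

```python
# Hypothetical ingestion sketch: extract PDF text, split into chunks,
# embed them, and upsert into a Qdrant collection.
from pypdf import PdfReader
from langchain_text_splitters import RecursiveCharacterTextSplitter
from qdrant_client import QdrantClient
from qdrant_client.models import Distance, VectorParams, PointStruct
from sentence_transformers import SentenceTransformer

def ingest_pdf(path: str, collection: str = "documents") -> None:
    # Extract raw text from every page of the PDF
    text = "\n".join(page.extract_text() or "" for page in PdfReader(path).pages)

    # Character-based splitting; the project also mentions a semantic splitter
    splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=200)
    chunks = splitter.split_text(text)

    # Embed each chunk (model choice is a placeholder)
    model = SentenceTransformer("all-MiniLM-L6-v2")
    vectors = model.encode(chunks)

    # Store vectors and the original chunk text in Qdrant
    client = QdrantClient(":memory:")  # or point this at a running Qdrant instance
    client.recreate_collection(
        collection_name=collection,
        vectors_config=VectorParams(size=vectors.shape[1], distance=Distance.COSINE),
    )
    client.upsert(
        collection_name=collection,
        points=[
            PointStruct(id=i, vector=v.tolist(), payload={"text": c})
            for i, (v, c) in enumerate(zip(vectors, chunks))
        ],
    )
```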
Given a query, searches for similar documents, reranks the results, and applies an LLM chain filter before returning the response.
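A rough sketch of the retrieval path under the same assumptions, with a cross-encoder standing in for the reranker; the LLM chain filter mentioned above would run on the reranked chunks afterwards and is omitted here:

```python
# Hypothetical retrieval sketch: embed the query, search Qdrant for similar
# chunks, then rerank the candidates with a cross-encoder.
from qdrant_client import QdrantClient
from sentence_transformers import SentenceTransformer, CrossEncoder

def retrieve(query: str, collection: str = "documents",
             top_k: int = 10, final_k: int = 3) -> list[str]:
    embedder = SentenceTransformer("all-MiniLM-L6-v2")
    client = QdrantClient(":memory:")  # same Qdrant instance used at ingestion time

    # Vector similarity search over the stored chunks
    hits = client.search(
        collection_name=collection,
        query_vector=embedder.encode(query).tolist(),
        limit=top_k,
    )
    texts = [hit.payload["text"] for hit in hits]

    # Rerank candidates by relevance to the query
    reranker = CrossEncoder("cross-encoder/ms-marco-MiniLM-L-6-v2")
    scores = reranker.predict([(query, t) for t in texts])
    ranked = [t for _, t in sorted(zip(scores, texts), reverse=True)]
    return ranked[:final_k]
```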
Combines the LLM with the retriever to answer a given user question.
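A hedged sketch of the final answering step using the Groq Python SDK; the model name and prompt wording are placeholders, not the project's exact chain:

```python
# Hypothetical answering sketch: stuff the retrieved chunks into a prompt
# and ask a Groq-hosted LLM for the final response.
import os
from groq import Groq

def answer(question: str, context_chunks: list[str]) -> str:
    client = Groq(api_key=os.environ["GROQ_API_KEY"])
    prompt = (
        "Answer the question using only the context below.\n\n"
        "Context:\n" + "\n\n".join(context_chunks) +
        f"\n\nQuestion: {question}"
    )
    response = client.chat.completions.create(
        model="llama-3.1-8b-instant",  # any Groq-hosted chat model
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content
```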