The Knowledge Bot is a web-based chatbot that answers questions about any data supplied to it as context, built on a Retrieval-Augmented Generation (RAG) architecture. It uses the llama_index
library for data indexing and OpenAI's GPT-3.5-Turbo model for generating responses.
The chatbot is designed to assist users in finding information by answering questions based on indexed documents.
- Ask questions related to your indexed documents.
- Receive informative responses based on indexed data.
- Convenient web-based interface powered by Streamlit.
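The retrieve-then-generate flow behind the bot can be sketched without any libraries. In this toy example, a keyword-overlap retriever stands in for llama_index's vector search and a formatted prompt string stands in for the GPT-3.5-Turbo call; it only illustrates the control flow, not the actual implementation in this repo.

```python
# Toy sketch of Retrieval-Augmented Generation (RAG):
# 1) retrieve the documents most relevant to the question,
# 2) pass them to a language model as context alongside the question.
# Keyword overlap substitutes for llama_index's vector index here.

def retrieve(question: str, docs: list[str], top_k: int = 1) -> list[str]:
    """Rank documents by word overlap with the question."""
    q_words = set(question.lower().split())
    scored = sorted(
        docs,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_prompt(question: str, docs: list[str]) -> str:
    """Assemble the prompt a real RAG system would send to the LLM."""
    context = "\n".join(retrieve(question, docs))
    return f"Context:\n{context}\n\nQuestion: {question}"

docs = [
    "The Knowledge Bot indexes documents with llama_index.",
    "Streamlit provides the web frontend.",
]
print(build_prompt("Which library indexes the documents?", docs))
```

In the real application, `retrieve` is replaced by a similarity search over the vector index and the prompt is sent to the OpenAI API instead of being printed.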
To run the Knowledge Bot locally with Docker, follow these steps:
- Clone this repository to your local machine:

  ```shell
  git clone https://github.com/PatrickPT/RAG_LLM_example.git
  ```
- Create your OpenAI API key and store it in `.streamlit/secrets.toml`:

  ```shell
  cd RAG_LLM_example
  nano .streamlit/secrets.toml
  # Insert your API key as openai_key = "API Key" and save
  ```
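For reference, `secrets.toml` needs only the single key named above; the value shown is a placeholder for your own key:

```toml
# .streamlit/secrets.toml -- read by Streamlit's st.secrets
openai_key = "YOUR_API_KEY"
```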
- Add your documents, or change the `input_dir` parameter in `config.yaml` to point at your own folder (which must be accessible from the Docker container):

  ```shell
  cd data
  # Place the contextual documents the LLM should use in this folder
  ```
- Adjust `config.yaml` to reflect the changes you made above.
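A minimal `config.yaml` might look like the fragment below; `input_dir` is the only parameter named in this README, so treat any other keys in the shipped file as-is:

```yaml
# Folder the indexer reads context documents from
# (must be reachable from inside the Docker container)
input_dir: ./data
```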
- Run Docker Compose:

  ```shell
  docker compose up -d
  ```
PS: The contents of `/.streamlit` and `/data` are ignored by git.
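To see how those two git-ignored folders reach the container, a Compose file for this kind of setup could look roughly like the sketch below. The service name, port, and mount paths here are assumptions for illustration, not the repo's actual `docker-compose.yml`; 8501 is Streamlit's default port.

```yaml
# Hypothetical sketch -- check the repo's real docker-compose.yml.
services:
  knowledge-bot:
    build: .
    ports:
      - "8501:8501"                     # Streamlit's default port
    volumes:
      - ./.streamlit:/app/.streamlit    # API key (git-ignored)
      - ./data:/app/data                # context documents (git-ignored)
```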
Contributions are welcome! If you'd like to contribute to this project, feel free to reach out.
This project relies on the llama_index library for data indexing and retrieval and on Streamlit for the frontend. It uses OpenAI's GPT-3.5-Turbo model for natural language understanding and generation.
If you have any questions or feedback, feel free to reach out.