This project is a question-and-answer chatbot powered by a Large Language Model (LLM). The current implementation is designed around the Brazilian 'Código de Proteção e Defesa do Consumidor' (Consumer Protection Code). However, the Retrieval-Augmented Generation (RAG) pipeline can easily be customized and adapted to suit a variety of other scenarios.
- Python
- LangChain
- Streamlit
- Hugging Face
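The core idea of the RAG pipeline is to retrieve passages relevant to the user's question and prepend them to the LLM prompt. The sketch below shows only the retrieval step with a toy word-overlap score and a made-up two-passage corpus; the actual project embeds the law's text with a Hugging Face model via LangChain, which this sketch does not reproduce.

```python
def tokenize(text):
    """Lowercase and split a passage into a set of words (toy tokenizer)."""
    return set(text.lower().replace("?", "").replace(".", "").split())

def retrieve(question, corpus, k=1):
    """Return the k passages sharing the most words with the question.

    A real RAG setup would rank by embedding similarity in a vector
    store instead of raw word overlap.
    """
    q = tokenize(question)
    scored = sorted(corpus, key=lambda p: len(q & tokenize(p)), reverse=True)
    return scored[:k]

# Hypothetical stand-in passages, not text from the actual statute.
corpus = [
    "The consumer may cancel a purchase within seven days.",
    "The supplier must provide clear product information.",
]
context = retrieve("How many days to cancel a purchase?", corpus)
# The retrieved passage is then placed in the prompt sent to the LLM.
```

Swapping the corpus for a different document collection is what makes the pipeline adaptable to other scenarios.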
Follow these steps to install and run this project:
- Clone the repository to your local machine using Git.
- Navigate to the project directory.
- Install the required dependencies using pip:

```shell
pip install -r requirements.txt
```
- Export your HuggingFace token. This token is necessary to access certain features of the HuggingFace API. Replace `your-token` with your actual token:

```shell
export HF_TOKEN=your-token
```
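Inside the app, the exported token is read from the environment before any HuggingFace call is made. The snippet below is a minimal sketch of that check; whether `app.py` reads the variable itself or lets the HuggingFace client pick it up automatically is an assumption here (the `"your-token"` fallback exists only so the sketch runs standalone).

```python
import os

# Fallback placeholder so this sketch runs even without the export step;
# in the real app the token must come from the environment.
os.environ.setdefault("HF_TOKEN", "your-token")

token = os.environ["HF_TOKEN"]
if token == "your-token":
    print("Warning: HF_TOKEN looks like a placeholder; export a real token.")
```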
- Run the Streamlit app:

```shell
streamlit run app.py
```
After running the Streamlit app, a user interface opens in your default web browser, where you can interact with the chatbot. Simply type your question into the input field and press Enter to receive an answer.