
feat: Add Quivr chatbot example #2827

Merged · 5 commits · Jul 10, 2024
4 changes: 4 additions & 0 deletions .gitignore
Expand Up @@ -89,3 +89,7 @@ backend/.env.test
**/*.egg-info

.coverage
backend/core/examples/chatbot/.files/*
backend/core/examples/chatbot/.python-version
backend/core/examples/chatbot/.chainlit/config.toml
backend/core/examples/chatbot/.chainlit/translations/en-US.json
45 changes: 45 additions & 0 deletions backend/core/examples/chatbot/README.md
@@ -0,0 +1,45 @@
# Quivr Chatbot Example

This example demonstrates how to create a simple chatbot using Quivr and Chainlit. The chatbot allows users to upload a text file and then ask questions about its content.

## Prerequisites

- Python 3.8 or higher

## Installation

1. Clone the repository or navigate to the `backend/core/examples/chatbot` directory.

2. Install the required dependencies:

```sh
pip install -r requirements.txt
```

## Running the Chatbot

1. Start the Chainlit server:

```sh
chainlit run main.py
```

2. Open your web browser and go to the URL displayed in the terminal (usually `http://localhost:8000`).

## Using the Chatbot

1. When the chatbot interface loads, you will be prompted to upload a text file.

2. Click on the upload area and select a `.txt` file from your computer. The file size should not exceed 20MB.

3. After uploading, the chatbot will process the file and inform you when it's ready.

4. You can now start asking questions about the content of the uploaded file.

5. Type your questions in the chat input and press Enter. The chatbot will respond based on the information in the uploaded file.

## How It Works

The chatbot uses the Quivr library to create a "brain" from the uploaded text file. This brain is then used to answer questions about the file's content. The Chainlit library provides the user interface and handles the chat interactions.

Enjoy chatting with your documents!
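The "brain" flow described above comes down to two calls in `main.py`: `Brain.from_files(...)` at upload time and `brain.ask_streaming(...)` per question. The streaming-consumption pattern can be sketched with a stdlib-only stub (`fake_ask_streaming` and `Chunk` are stand-ins, not Quivr APIs; the real chunks expose an `.answer` attribute, as used in `main.py`):

```python
import asyncio
from dataclasses import dataclass


@dataclass
class Chunk:
    answer: str


async def fake_ask_streaming(question: str):
    # Stand-in for Brain.ask_streaming: yields the answer in small chunks
    for token in ["Quivr ", "answers ", "from ", "your ", "file."]:
        yield Chunk(answer=token)


async def consume() -> str:
    parts = []
    async for chunk in fake_ask_streaming("What is this file about?"):
        parts.append(chunk.answer)  # in Chainlit: await msg.stream_token(chunk.answer)
    return "".join(parts)


result = asyncio.run(consume())
print(result)  # → Quivr answers from your file.
```

Streaming token-by-token like this is what lets the UI render a partial answer immediately instead of waiting for the full response.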
45 changes: 45 additions & 0 deletions backend/core/examples/chatbot/chainlit.md
@@ -0,0 +1,45 @@
# Quivr Chatbot Example

This example demonstrates how to create a simple chatbot using Quivr and Chainlit. The chatbot allows users to upload a text file and then ask questions about its content.

## Prerequisites

- Python 3.8 or higher

## Installation

1. Clone the repository or navigate to the `backend/core/examples/chatbot` directory.

2. Install the required dependencies:

```sh
pip install -r requirements.txt
```

## Running the Chatbot

1. Start the Chainlit server:

```sh
chainlit run main.py
```

2. Open your web browser and go to the URL displayed in the terminal (usually `http://localhost:8000`).

## Using the Chatbot

1. When the chatbot interface loads, you will be prompted to upload a text file.

2. Click on the upload area and select a `.txt` file from your computer. The file size should not exceed 20MB.

3. After uploading, the chatbot will process the file and inform you when it's ready.

4. You can now start asking questions about the content of the uploaded file.

5. Type your questions in the chat input and press Enter. The chatbot will respond based on the information in the uploaded file.

## How It Works

The chatbot uses the Quivr library to create a "brain" from the uploaded text file. This brain is then used to answer questions about the file's content. The Chainlit library provides the user interface and handles the chat interactions.

Enjoy chatting with your documents!
63 changes: 63 additions & 0 deletions backend/core/examples/chatbot/main.py
@@ -0,0 +1,63 @@
```python
import tempfile

import chainlit as cl
from quivr_core import Brain


@cl.on_chat_start
async def on_chat_start():
    files = None

    # Wait for the user to upload a file
    while files is None:
        files = await cl.AskFileMessage(
            content="Please upload a text .txt file to begin!",
            accept=["text/plain"],
            max_size_mb=20,
            timeout=180,
        ).send()

    file = files[0]

    msg = cl.Message(content=f"Processing `{file.name}`...", disable_feedback=True)
    await msg.send()

    with open(file.path, "r", encoding="utf-8") as f:
        text = f.read()

    with tempfile.NamedTemporaryFile(
        mode="w", suffix=".txt", delete=False
    ) as temp_file:
        temp_file.write(text)
        temp_file.flush()
        temp_file_path = temp_file.name

    brain = Brain.from_files(name="user_brain", file_paths=[temp_file_path])

    # Store the file path in the session
    cl.user_session.set("file_path", temp_file_path)

    # Let the user know that the system is ready
    msg.content = f"Processing `{file.name}` done. You can now ask questions!"
    await msg.update()

    cl.user_session.set("brain", brain)


@cl.on_message
async def main(message: cl.Message):
    brain = cl.user_session.get("brain")  # type: Brain

    if brain is None:
        await cl.Message(content="Please upload a file first.").send()
        return

    # Prepare the message for streaming
    msg = cl.Message(content="")
    await msg.send()

    # Use the ask_streaming method for streaming responses
    async for chunk in brain.ask_streaming(message.content):
        await msg.stream_token(chunk.answer)

    await msg.send()
```

> **Collaborator review comment** (on the `tempfile.NamedTemporaryFile` call): tempfiles are usually garbage collected as soon as the context is closed. I think we can pass that `file.path` directly to the brain 🤔?
>
> If this doesn't work, maybe have the logic inside the context manager.
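The reviewer's concern about `tempfile` lifetime can be checked with a stdlib-only sketch: with `delete=False` (as used in `main.py`) the file outlives the `with` block, whereas the default `delete=True` removes it as soon as the context closes.

```python
import os
import tempfile

# delete=False (as in main.py): the file survives the context manager,
# so its path can safely be handed to Brain.from_files afterwards
with tempfile.NamedTemporaryFile(mode="w", suffix=".txt", delete=False) as tf:
    tf.write("hello")
    kept_path = tf.name
assert os.path.exists(kept_path)

# default delete=True: the file is removed when the context closes
with tempfile.NamedTemporaryFile(mode="w", suffix=".txt") as tf:
    gone_path = tf.name
assert not os.path.exists(gone_path)

os.remove(kept_path)  # clean up
```

Note that nothing in the example deletes the `delete=False` file, so the reviewer's alternative of passing `file.path` directly would also avoid leaving a stray temp file behind.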
5 changes: 5 additions & 0 deletions backend/core/examples/chatbot/requirements.txt
@@ -0,0 +1,5 @@
```
quivr-core==0.0.7
langchain-community==0.2.6
faiss-cpu==1.8.0
langchain-openai==0.1.14
chainlit==1.0.0
```

> **Collaborator review comment** (on `quivr-core==0.0.7`): `quivr-core[base]` should probably work, which removes `faiss`, `langchain-community`, ...
>
> ```sh
> pip install "quivr-core[base]"
> ```
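If the reviewer's suggestion pans out, the requirements file could shrink to something like the following (unverified sketch; it assumes the `base` extra pulls in the `faiss` and `langchain-community` functionality itself, while `langchain-openai` and `chainlit` are still needed directly):

```
quivr-core[base]==0.0.7
langchain-openai==0.1.14
chainlit==1.0.0
```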