Adds GraphDB Conversational Agent example
With help from the FalkorDB folks we added a GraphDB
example to Burr.

This uses Hamilton to ingest data and push it to FalkorDB, which
you run locally in a Docker container. You can then
run a Burr application that powers the conversational agent.

I decided to add to the conversational agent directory
because that seems like the more logical place to curate
conversational agent examples.

So I moved the original example to simple_example and updated
links accordingly. As long as we linked to the repository example
rather than directly to the notebook, we should be good.
skrawcz committed Jun 1, 2024
1 parent 01c7cfa commit 229b788
Showing 30 changed files with 12,207 additions and 1,696 deletions.
2 changes: 1 addition & 1 deletion README.md
@@ -57,7 +57,7 @@ Burr includes:
Burr can be used to power a variety of applications, including:

1. [A simple gpt-like chatbot](examples/multi-modal-chatbot)
- 2. [A stateful RAG-based chatbot](examples/conversational-rag)
+ 2. [A stateful RAG-based chatbot](examples/conversational-rag/simple_example)
3. [A machine learning pipeline](examples/ml-training)
4. [A simulation](examples/simulation)

2 changes: 1 addition & 1 deletion docs/examples/chatbot.rst
@@ -15,7 +15,7 @@ See `github repository example <https://github.com/DAGWorks-Inc/burr/tree/main/e

Conversational RAG chatbot
--------------------------
- See `github example <https://github.com/DAGWorks-Inc/burr/tree/main/examples/conversational-rag>`_.
+ See `github example <https://github.com/DAGWorks-Inc/burr/tree/main/examples/conversational-rag/simple_example>`_.

Accompanying video walkthrough:

2 changes: 1 addition & 1 deletion docs/getting_started/up-next.rst
@@ -55,7 +55,7 @@ This is a toy interactive RAG example. You'll ask questions in the terminal abou

.. code-block:: bash
- cd examples/conversational-rag
+ cd examples/conversational-rag/simple_example
pip install -r requirements.txt
python application.py
2 changes: 1 addition & 1 deletion docs/main.rst
@@ -6,7 +6,7 @@ Welcome to Burr's documentation.

For a quick overview of Burr, watch `this walkthrough <https://www.loom.com/share/a10f163428b942fea55db1a84b1140d8?sid=1512863b-f533-4a42-a2f3-95b13deb07c9>`_
or read `our blog post <https://blog.dagworks.io/p/burr-develop-stateful-ai-applications?r=2cg5z1&utm_campaign=post&utm_medium=web>`_. The following video is
- a longer demo of building a simple chatbot application with Burr using `this notebook <https://github.com/DAGWorks-Inc/burr/blob/main/examples/conversational-rag/notebook.ipynb>`_:
+ a longer demo of building a simple chatbot application with Burr using `this notebook <https://github.com/DAGWorks-Inc/burr/blob/main/examples/conversational-rag/simple_example/notebook.ipynb>`_:

.. raw:: html

2 changes: 1 addition & 1 deletion examples/README.md
@@ -22,7 +22,7 @@ Note we have a few more in [other-examples](other-examples/), but those do not y
# Index

- [simple-chatbot-intro](simple-chatbot-intro/) - This is a simple chatbot that shows how to use Burr to create a simple chatbot. This is a good starting point for understanding how to use Burr -- the notebook follows the original [blog post](https://blog.dagworks.io/p/burr-develop-stateful-ai-applications).
- - [conversational-rag](conversational-rag/) - This example shows how to use Burr to create a conversational RAG chatbot. This shows how to use state/prior knowledge to augment your LLM call with Burr.
+ - [conversational-rag](conversational-rag/) - This shows multiple examples of how to use Burr to create a conversational RAG chatbot. This shows how to use state/prior knowledge to augment your LLM call with Burr.
- [hello-world-counter](hello-world-counter/) - This is an example of a simple state machine, used in the docs.
- [llm-adventure-game](llm-adventure-game/) - This is an example of a simple text-based adventure game using LLMs -- it shows how to progress through hidden states while reusing components.
- [ml-training](ml-training/) - This is an example of a simple ML training pipeline. It shows how to use Burr to track the training of a model. This is not complete.
Empty file added examples/__init__.py
Empty file.
51 changes: 8 additions & 43 deletions examples/conversational-rag/README.md
@@ -1,49 +1,14 @@
- # Conversational RAG with memory
+ # Conversational RAG examples
Here we curate different examples of how to build a Conversational RAG agent using different approaches/backends.

## [Simple Example](simple_example/)
This example demonstrates how to build a conversational RAG agent with "memory".

The "memory" here is stored in state, which Burr then can help you track,
manage, and introspect.

The setup of this example is as follows:

1. You start with some initial "documents", i.e. knowledge.
2. We bootstrap a vector store with these documents.
3. We then have a pipeline that uses the vector store for a RAG query. This example uses a [pre-made conversational RAG pipeline](https://hub.dagworks.io/docs/DAGWorks/conversational_rag/); the prompt isn't hidden under layers of abstraction.
4. We hook everything together with Burr, which manages the state of the conversation and asks for user inputs.
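The four steps above can be sketched without any framework. This is a minimal, illustrative stand-in: the toy keyword-overlap retriever and canned reply are assumptions for the sketch, not the example's actual pipeline or Burr's API.

```python
# Framework-free sketch: state carries the chat history ("memory"),
# and retrieval augments each question before a reply is produced.
DOCUMENTS = [
    "Burr manages application state as an explicit state machine.",
    "A vector store retrieves documents relevant to a query.",
]

def retrieve(question: str, docs: list[str], k: int = 1) -> list[str]:
    """Rank documents by naive keyword overlap with the question."""
    words = set(question.lower().split())
    return sorted(docs, key=lambda d: -len(words & set(d.lower().split())))[:k]

def turn(state: dict, question: str) -> dict:
    """One conversational turn: retrieve, answer, update memory in state."""
    context = retrieve(question, DOCUMENTS)
    reply = f"Based on: {context[0]}"  # a real LLM call would go here
    history = state["chat_history"] + [("user", question), ("assistant", reply)]
    return {**state, "chat_history": history}  # new state; old one untouched

state = {"chat_history": []}
state = turn(state, "How does Burr manage state?")
```

Because the memory lives in an explicit state object rather than in a closure, a framework like Burr can track, persist, and introspect it between turns.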

To run this example, install Burr and the necessary dependencies:

```bash
pip install "burr[start]" -r requirements.txt
```

Then run the server in the background:

```bash
burr
```

Make sure you have an `OPENAI_API_KEY` set in your environment.

Then run
```bash
python application.py
```

You'll then have a text terminal where you can interact. Type `exit` to stop.

# Application That's Defined:
![Application Image](statemachine.png)

# Video Walkthrough via Notebook
Open the notebook <a target="_blank" href="https://colab.research.google.com/github/dagworks-inc/burr/blob/main/examples/conversational-rag/notebook.ipynb">
<img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/>
</a>

Watch the video walkthrough with the notebook (1.5x+ speed recommended):


<a href="http://www.youtube.com/watch?feature=player_embedded&v=t54DCiOH270" target="_blank">
<img src="http://img.youtube.com/vi/t54DCiOH270/hqdefault.jpg" alt="Watch the video" border="10" />
</a>
## [Graph DB Example](graph_db_example/)
This demo illustrates how to build a RAG Q&A AI agent over the [UFC stats dataset](https://www.kaggle.com/datasets/rajeevw/ufcdata).
This one uses a Knowledge Graph that is stored in [FalkorDB](https://www.falkordb.com/) to query for
information about UFC fighters and fights.
125 changes: 125 additions & 0 deletions examples/conversational-rag/graph_db_example/README.md
@@ -0,0 +1,125 @@
# Conversational agent over UFC Knowledge graph

## Introduction
This demo illustrates how to build a RAG Q&A AI agent over the [UFC stats dataset](https://www.kaggle.com/datasets/rajeevw/ufcdata).
It uses a Knowledge Graph stored in [FalkorDB](https://www.falkordb.com/) to query
for information about UFC fighters and fights.

Thanks to the folks at [FalkorDB](https://www.falkordb.com/) for helping set up this example.

## Data
The [UFC](http://ufc.com) publicly offers statistics for each fight it has held, in addition to individual fighters'
personal statistics, on [UFC stats](http://ufcstats.com/statistics/events/completed).

This information includes, among other details:
* Where and when an event was held
* Details and statistics of a fight
* Who won a fight
* How long a fight lasted
* A fighter's reach

We have pulled some data and stored it in the `/data` folder.
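As a rough sketch of how these fields might be modeled before they become graph entities, here is one possible shape. The field names and reach values are illustrative assumptions, not the dataset's actual columns.

```python
# Hypothetical record types for the stats listed above: fighters become
# nodes, fights become relationships between two fighter nodes.
from dataclasses import dataclass

@dataclass(frozen=True)
class Fighter:
    name: str
    reach_cm: float  # illustrative field; the dataset has many more stats

@dataclass(frozen=True)
class Fight:
    red: Fighter
    blue: Fighter
    winner: str
    duration_seconds: int
    event: str  # where and when the event was held

masvidal = Fighter("Jorge Masvidal", 188.0)
askren = Fighter("Ben Askren", 191.0)
fight = Fight(masvidal, askren, winner="Jorge Masvidal",
              duration_seconds=5, event="UFC 239, 2019")
```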


# Querying the AI agent
Once the data is loaded into the Knowledge Graph, users can start asking the AI agent questions. For example:

```
Which fighter holds the fastest win?
The fighter who holds the fastest win is Jorge Masvidal, with a win in just 5 second
Who did he win against ?
Jorge Masvidal won against Ben Askren in the fight where he secured the fastest win.
List fighters who had a trilogy match
The only fighters specifically identified in the data having a trilogy (i.e., three matches against the same opponent) are:
- Frankie Edgar and BJ Penn
- Randy Couture and Vitor Belfort
- BJ Penn and Frankie Edgar
- Cain Velasquez and Junior Dos Santos
...
Who has a 50% win percentage?
Yes, there are fighters with a 50% win percentage in the dataset. Here are a few of them:
- Joe Slick: 1 win out of 2 fights (50%)
- Ken Stone: 2 wins out of 4 fights (50%)
- Tim Boetsch: 12 wins out of 24 fights (50%)
- Jordan Wright: 1 win out of 2 fights (50%)
- Travis Fulton: 1 win out of 2 fights (50%)
```
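Under the hood, each question is answered by querying the Knowledge Graph. As a minimal, framework-free sketch (no FalkorDB or Cypher here, and the fight records are a small hand-picked illustrative subset), the first question above amounts to a `min()` over fight durations:

```python
# Toy "graph": a list of fight records standing in for the real
# Knowledge Graph. The records are illustrative, not the full dataset.
fights = [
    {"winner": "Jorge Masvidal", "loser": "Ben Askren", "seconds": 5},
    {"winner": "Conor McGregor", "loser": "Jose Aldo", "seconds": 13},
    {"winner": "Ronda Rousey", "loser": "Cat Zingano", "seconds": 14},
]

def fastest_win(records):
    """Answer 'Which fighter holds the fastest win?' over the records."""
    best = min(records, key=lambda f: f["seconds"])
    return f"{best['winner']}, with a win in just {best['seconds']} seconds"
```

In the actual demo, the LLM translates the natural-language question into a graph query against FalkorDB instead of this hand-written aggregation.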

# Running the demo

## Prerequisites

Install Python modules
```sh
pip install -r requirements.txt
```

Run FalkorDB
```sh
docker run -p 6379:6379 -p 3000:3000 -it --rm falkordb/falkordb:edge
```
Note: at the time of writing, this image did not persist data.

## Ingest data
We first need to create the Knowledge Graph.

**Ingest data using the command line**:

```sh
python hamilton_ingest.py
```
This will run the following two pipelines:

![ingest fighters](ingest_fighters.png)
![ingest fights](ingest_fights.png)

**Note:** [Hamilton](https://github.com/dagworks-inc/hamilton) also comes with a UI that you can use to visualize the pipeline and
track execution information about it. See hamilton_ingest.py or ingest_notebook.ipynb for more information.
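Conceptually, the ingest step turns tabular fight rows into graph writes. Here is a hedged sketch with assumed column names and Cypher-like text; the real pipelines in `hamilton_ingest.py` handle the actual dataset schema and push to FalkorDB.

```python
# Sketch: turn CSV fight rows into graph write statements. Column
# names ("red", "blue", "winner") and the Cypher shape are assumptions
# for illustration only; no database connection is made here.
import csv
import io

raw = io.StringIO(
    "red,blue,winner\n"
    "Jorge Masvidal,Ben Askren,Jorge Masvidal\n"
)

def row_to_cypher(row: dict) -> str:
    """One MERGE per fighter node plus one FOUGHT edge per fight row."""
    return (
        f"MERGE (r:Fighter {{name: '{row['red']}'}}) "
        f"MERGE (b:Fighter {{name: '{row['blue']}'}}) "
        f"MERGE (r)-[:FOUGHT {{winner: '{row['winner']}'}}]->(b)"
    )

statements = [row_to_cypher(row) for row in csv.DictReader(raw)]
```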

**Ingest data using a notebook**:

```sh
pip install jupyter
jupyter notebook
# select ingest_notebook.ipynb and follow the instructions there
```

## Run the QA agent via the notebook:
```sh
export OPENAI_API_KEY="YOUR_OPENAI_KEY"
pip install jupyter
jupyter notebook
# select notebook.ipynb and follow the instructions there
```

## Run the QA agent via the command line:
```sh
export OPENAI_API_KEY="YOUR_OPENAI_KEY"
python application.py
```

Knowledge Graph generated:

![knowledge graph](UFC_Graph.png)

Application Graph generated:
![application graph](statemachine.png)

## See the trace of the QA agent with the Burr UI
In a terminal run:
```sh
burr
```
Then open a browser and go to `http://localhost:7241` to see the Burr UI.

You can then navigate to the `ufc-falkor` project and see the trace of the QA agent.
