Examples merged into main repo #286

Draft · wants to merge 18 commits into base: dev
1 change: 1 addition & 0 deletions MANIFEST.in
@@ -7,6 +7,7 @@ include dff/context_storages/protocols.json
exclude makefile

recursive-exclude tests *
recursive-exclude examples *
recursive-exclude tutorials *
recursive-exclude * __pycache__
recursive-exclude * *.py[co]
19 changes: 18 additions & 1 deletion docs/source/examples.rst
@@ -1,4 +1,21 @@
Examples
--------

Examples are available in this `repository <https://github.com/deeppavlov/dialog_flow_demo>`_.
:doc:`FAQ bot <./examples/faq_bot>`
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

FAQ bot for DeepPavlov users built using `DFF`.
It can be run with Telegram or with a web interface.

:doc:`Customer service bot <./examples/customer_service_bot>`
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Customer service bot built using `DFF`.
This bot is designed to answer user questions of any kind within a limited business domain (a book shop).
It uses a Telegram interface.

.. toctree::
   :hidden:

   examples/faq_bot
   examples/customer_service_bot
67 changes: 67 additions & 0 deletions docs/source/examples/customer_service_bot.rst
@@ -0,0 +1,67 @@
Customer service bot
--------------------

Customer service bot built using `DFF`.
This bot is designed to answer user questions of any kind within a limited business domain (a book shop).
It uses a Telegram interface.

You can read more about deploying the project in its README file.

[Review comment (Collaborator, Author): Link to readme file?]


Project structure
~~~~~~~~~~~~~~~~~

While DFF allows you to choose any structure for your own projects,
we propose a scheme for meaningfully splitting project files
into services and modules.

* In our projects, we opt for Docker-based deployment due to its scalability and universal
applicability. If you choose the same deployment scheme, you will always
have at least one service that wraps your bot.

* Neural network models that you run locally can be factored out into a separate service.
This way your main service, i.e. the service wrapping the bot, won't crash if something
unexpected happens with the model.

* In the main service directory, we make a separate package for all DFF-related abstractions.
There, we put the script into a separate module, also creating modules for
`processing, condition, and response functions <../user_guides/basic_conceptions#>`__.

* The rest of the project-related Python code is factored out into other packages.

* We also create 'run.py' and 'test.py' at the project root. These files import the ready pipeline
and execute it to test or run the service (a minimal sketch of 'run.py' follows the directory tree below).

[Review comment (Collaborator, Author): Link to the files?]

.. code-block:: shell

    examples/customer_service_bot/
    ├── docker-compose.yml        # docker-compose orchestrates the services
    ├── bot                       # main docker service
    │   ├── api
    │   │   ├── __init__.py
    │   │   ├── chatgpt.py
    │   │   └── intent_catcher.py
    │   ├── dialog_graph          # Separate package for DFF-related abstractions
    │   │   ├── __init__.py
    │   │   ├── conditions.py     # Condition callbacks
    │   │   ├── consts.py         # Constant values for keys
    │   │   ├── processing.py     # Processing callbacks
    │   │   ├── response.py       # Response callbacks
    │   │   └── script.py         # DFF script and pipeline are constructed here
    │   ├── dockerfile            # The dockerfile takes care of setting up the project. See the file for more details
    │   ├── requirements.txt
    │   ├── run.py
    │   └── test.py
    └── intent_catcher            # intent catching model wrapped as a docker service
        ├── dockerfile
        ├── requirements.txt
        ├── server.py
        └── test_server.py
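
A minimal sketch of such an entry point, assuming that ``dialog_graph/script.py`` exposes a ready
``pipeline`` object (the import path is illustrative, not necessarily the project's exact code):

.. code-block:: python

    # run.py: import the ready pipeline and start the bot service
    from dialog_graph.script import pipeline

    if __name__ == "__main__":
        pipeline.run()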

Models
~~~~~~

Two differently designed models are used to power the customer service bot: an intent classifier and a generative model.
The classifier is deployed as a separate service, while ChatGPT is accessed through its API (a rough sketch of such a call follows the list below).

* `DeepPavlov Intent Catcher <https://docs.deeppavlov.ai/en/0.14.1/features/models/intent_catcher.html>`__ is used for intent retrieval.
* `ChatGPT <https://openai.com/pricing#language-models>`__ is used for context-based question answering.
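
A rough sketch of how the generative model can be queried for context-based answers, assuming the
OpenAI chat completions endpoint and the ``OPENAI_API_TOKEN`` variable from the project's ``.env``
file (the function name is illustrative and not the project's actual code):

.. code-block:: python

    import os

    import requests

    OPENAI_URL = "https://api.openai.com/v1/chat/completions"

    def answer_question(question: str) -> str:
        """Ask ChatGPT to answer a user question within the book shop domain."""
        headers = {"Authorization": f"Bearer {os.environ['OPENAI_API_TOKEN']}"}
        payload = {
            "model": "gpt-3.5-turbo",
            "messages": [
                {"role": "system", "content": "You are a customer service assistant for a book shop."},
                {"role": "user", "content": question},
            ],
        }
        response = requests.post(OPENAI_URL, json=payload, headers=headers, timeout=30)
        return response.json()["choices"][0]["message"]["content"]
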
61 changes: 61 additions & 0 deletions docs/source/examples/faq_bot.rst
@@ -0,0 +1,61 @@
FAQ Bot
-------

FAQ bot for DeepPavlov users built using `DFF`.
It can be run with Telegram or with a web interface.

You can read more about deploying the project in its README file.

[Review comment (Collaborator, Author): Link to the readme file?]


Project structure
~~~~~~~~~~~~~~~~~

* In our projects, we opt for Docker-based deployment due to its scalability and universal
applicability. If you choose the same deployment scheme, you will always
have at least one service that wraps your bot.

* In the main service directory, we make a separate package for all DFF-related abstractions.
There, we put the `script <#>`__ into a separate module, also creating modules for
`condition and response functions <#>`__.

* We also create a separate package for `pipeline services <#>`__.

* The rest of the project-related Python code is factored out into other packages.


.. code-block:: shell

    examples/frequently_asked_question_bot/
    ├── README.md
    ├── compose.yml                # docker compose file orchestrates the services
    ├── nginx.conf                 # web service proxy configurations
    └── web
        ├── Dockerfile
        ├── app.py
        ├── bot
        │   ├── dialog_graph       # A separate module for DFF-related abstractions
        │   │   ├── responses.py
        │   │   └── script.py      # DFF script is constructed here
        │   ├── faq_model          # model-related code
        │   │   ├── faq_dataset_sample.json
        │   │   ├── model.py
        │   │   ├── request_translations.json
        │   │   └── response_translations.json
        │   ├── pipeline.py
        │   ├── pipeline_services  # Separately stored pipeline service functions
        │   │   └── pre_services.py
        │   └── test.py
        ├── requirements.txt
        └── static
            ├── LICENSE.txt
            ├── index.css
            ├── index.html
            └── index.js

Models
~~~~~~~

The project makes use of the `clips/mfaq <https://huggingface.co/clips/mfaq>`__ model, which powers the bot's ability to understand queries in multiple languages.
A number of techniques are employed to make model usage more efficient.

* The project's Dockerfile illustrates caching a model with SentenceTransformer in a Docker container.
The model is constructed during the image build, so that the weights the Hugging Face library fetches from the web are downloaded in advance. At runtime, the cached weights are quickly read from disk (see the sketch below).
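
As a sketch of this technique (illustrative, not the project's exact Dockerfile contents), the build
stage can simply instantiate the model once so that the weights end up inside the image:

.. code-block:: python

    # cache_model.py: executed at image build time (e.g. from a RUN instruction),
    # so the clips/mfaq weights are downloaded into the image ahead of time.
    from sentence_transformers import SentenceTransformer

    SentenceTransformer("clips/mfaq")

At container runtime, constructing ``SentenceTransformer("clips/mfaq")`` again loads the cached weights from disk instead of downloading them.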
5 changes: 5 additions & 0 deletions examples/README.md
@@ -0,0 +1,5 @@
# DFF example projects

[Review comment (Member): TODO: update the 'examples' section in index.rst with new links]

This directory contains examples of bots built using [DFF](https://github.com/deeppavlov/dialog_flow_framework) (Dialog Flow Framework).

The example projects include a FAQ bot for potential Linux users and a customer service bot for a book shop. Both bots use Telegram as an interface.
2 changes: 2 additions & 0 deletions examples/customer_service_bot/.env
@@ -0,0 +1,2 @@
TG_BOT_TOKEN=bot_token
OPENAI_API_TOKEN=openai_api_token
67 changes: 67 additions & 0 deletions examples/customer_service_bot/README.md
@@ -0,0 +1,67 @@
## Description

### Customer service bot

Customer service bot built using `DFF`. Uses Telegram as an interface.
This bot is designed to answer user questions of any kind within a limited business domain (a book shop).

* [DeepPavlov Intent Catcher](https://docs.deeppavlov.ai/en/0.14.1/features/models/intent_catcher.html) is used for intent retrieval.
* [ChatGPT](https://openai.com/pricing#language-models) is used for context based question answering.

### Intent Catcher

Intent catcher is a DistilBERT-based classifier for user intent classes.
We use the DeepPavlov library for a seamless training and inference experience.
Sample code for training the model can be found in `Training_intent_catcher.ipynb`.
The model is deployed as a separate microservice running at port 4999.

The bot interacts with the container via the `/respond` endpoint.
The API expects a JSON object with the dialog history passed as an array under the 'dialog_contexts' key. Intents are extracted from the last utterance.

```json
{
"dialog_contexts": ["phrase_1", "phrase_2"]
}
```

The API responds with a nested array containing `label - score` pairs.

```json
[["no",0.3393537402153015]]
```

[Review comment (Collaborator, Author), suggested change: `[["no", 0.3393537402153015]]` (add a space after the comma)]
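
For illustration, a request of this shape can be sent from Python as follows (a sketch assuming the service is exposed locally on port 4999; not part of the project code):

```python
import requests

# Send the dialog history; the intent is extracted from the last utterance.
payload = {"dialog_contexts": ["phrase_1", "phrase_2"]}
response = requests.post("http://localhost:4999/respond", json=payload)

label, score = response.json()[0]  # e.g. "no", 0.3393537402153015
print(label, score)
```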

Run the intent catcher:
```commandline
docker compose up --build --abort-on-container-exit --exit-code-from intent_client
```

[Review comment (Member): I suppose this command should be `docker compose up --build --abort-on-container-exit --exit-code-from intent_catcher` for it to run. Anyways, the command still runs with errors.]

## Running the bot

### Step 1: Configuring the docker services
To interact with external APIs, the bot requires API tokens that can be set through the [.env](.env) file. Update it by replacing the placeholders with actual token values.
```
TG_BOT_TOKEN=***
OPENAI_API_TOKEN=***
```

### Step 2: Launching the project
*The commands below need to be run from the /examples/customer_service_bot directory*

Provided that the environment variables have been configured correctly, building the bot and launching it in the background can be done with a single command. You can then immediately interact with your bot in Telegram.
```commandline
docker-compose up -d
```

[Review comment (Member): Make the commands uniform: use either `docker-compose` or `docker compose` throughout.]

If any of the source files have received updates, you can rebuild and sync the bot using the `docker compose build` command.
```commandline
docker compose build
```
In case of bugs, you can test whether the bot correctly handles basic functionality using the following command:
```commandline
docker compose run assistant pytest test.py
```

The bot can also be run as a standalone service, i.e. without the intent catcher, for a less resource-demanding workflow:
```commandline
docker compose run assistant python run.py
```