Fix huggingface example notebook and updated model id (#78)
* Fix huggingface notebook and updated model id

* removed HF warnings from output

---------

Co-authored-by: Raja Sekhar Rao Dheekonda <rdheekonda@microsoft.com>
rdheekonda and Raja Sekhar Rao Dheekonda authored Mar 1, 2024
1 parent c5a030c commit b9e40e8
Showing 3 changed files with 84 additions and 43 deletions.
59 changes: 47 additions & 12 deletions doc/code/huggingface_endpoints.ipynb
@@ -9,20 +9,18 @@
"source": [
"# Introduction\n",
"\n",
"This code shows how to use Hugging Face managed online endpoints with PyRIT. Hugging Face support is currently experimental\n",
"This code shows how to use Hugging Face models with PyRIT. Hugging Face support is currently experimental\n",
"and may not work as intended.\n",
"\n",
"## Prerequisites\n",
"\n",
"Before beginning, ensure that you have the model id obtained from the Hugging Face as shown below.\n",
"<br> <img src=\"./../../assets/huggingface_model_id.png\" alt=\"huggingface_model_id.png\" height=\"400\"/> <br>\n",
"\n",
"After deploying a model and populating your env file, creating an endpoint is as simple as the following"
"<br> <img src=\"./../../assets/huggingface_model_id.png\" alt=\"huggingface_model_id.png\" height=\"400\"/> <br>"
]
},
{
"cell_type": "code",
"execution_count": null,
"execution_count": 4,
"id": "89aad9fe",
"metadata": {},
"outputs": [],
@@ -33,10 +31,10 @@
"from pyrit.models import ChatMessage\n",
"from pyrit.chat import HuggingFaceChat\n",
"\n",
"\n",
"default_values.load_default_env()\n",
"\n",
"model_id = \"Fredithefish/Guanaco-3B-Uncensored-v2\"\n",
"# The first execution of this code cell will download the model from HuggingFace hub, taking about 5 minutes. \n",
"# Subsequent runs will load the model from your disk, making them much quicker.\n",
"model_id = \"TinyLlama/TinyLlama-1.1B-Chat-v1.0\"\n",
"huggingface_chat_engine = HuggingFaceChat(model_id=model_id)"
]
},
@@ -46,25 +44,62 @@
"metadata": {},
"source": [
"\n",
"After the model is created, you can use it like any other `ChatSupport` object. For example, you can complete a chat as shown below."
"After the model is loaded, you can use it like any other `ChatSupport` object. For example, you can complete a chat as shown below."
]
},
{
"cell_type": "code",
"execution_count": null,
"execution_count": 5,
"id": "789ffb7c",
"metadata": {},
"outputs": [],
"outputs": [
{
"data": {
"text/plain": [
"'Bye!USER:Bye!USER:Bye!USER:Bye!USER:Bye!USER:Bye!USER:Bye!USER:Bye!USER:Bye!USER:Bye!USER:Bye!USER:Bye!USER:Bye!USER:Bye!USER'"
]
},
"execution_count": 5,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"huggingface_chat_engine.complete_chat(messages=[ChatMessage(role=\"user\", content=\"Hello world!\")])"
"huggingface_chat_engine.complete_chat(messages=[ChatMessage(role=\"system\", content=\"You are a helpful AI assistant\"), \n",
" ChatMessage(role=\"user\", content=\"Hello world!\")])"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "b9ece9ee",
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"jupytext": {
"cell_metadata_filter": "-all",
"main_language": "python",
"notebook_metadata_filter": "-all"
},
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.13"
}
},
"nbformat": 4,
37 changes: 37 additions & 0 deletions doc/code/huggingface_endpoints.py
@@ -0,0 +1,37 @@
# %% [markdown]
# # Introduction
#
# This code shows how to use Hugging Face models with PyRIT. Hugging Face support is currently experimental
# and may not work as intended.
#
# ## Prerequisites
#
# Before beginning, ensure that you have the model id obtained from the Hugging Face as shown below.
# <br> <img src="./../../assets/huggingface_model_id.png" alt="huggingface_model_id.png" height="400"/> <br>

# %%

from pyrit.common import default_values

from pyrit.models import ChatMessage
from pyrit.chat import HuggingFaceChat

default_values.load_default_env()
# The first execution of this code cell will download the model from HuggingFace hub, taking about 5 minutes.
# Subsequent runs will load the model from your disk, making them much quicker.
model_id = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"
huggingface_chat_engine = HuggingFaceChat(model_id=model_id)

# %% [markdown]
#
# After the model is loaded, you can use it like any other `ChatSupport` object. For example, you can complete a chat as shown below.

# %%
huggingface_chat_engine.complete_chat(
messages=[
ChatMessage(role="system", content="You are a helpful AI assistant"),
ChatMessage(role="user", content="Hello world!"),
]
)

# %%
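A minimal multi-turn sketch, built only on the interfaces this commit shows (`HuggingFaceChat`, `ChatMessage`, and `complete_chat`), might look like the following. It is not part of the commit: it assumes `complete_chat` returns the reply as a plain string (as the notebook output above suggests) and that `ChatMessage` accepts an `assistant` role; verify both against the PyRIT version in use.

from pyrit.common import default_values
from pyrit.models import ChatMessage
from pyrit.chat import HuggingFaceChat

default_values.load_default_env()

# First run downloads TinyLlama from the Hugging Face hub; later runs load it from disk.
huggingface_chat_engine = HuggingFaceChat(model_id="TinyLlama/TinyLlama-1.1B-Chat-v1.0")

messages = [
    ChatMessage(role="system", content="You are a helpful AI assistant"),
    ChatMessage(role="user", content="Hello world!"),
]

# First turn: complete_chat is assumed to return the model's reply as a string.
reply = huggingface_chat_engine.complete_chat(messages=messages)

# Second turn: append the reply (assistant role is an assumption) plus a follow-up, then ask again.
messages.append(ChatMessage(role="assistant", content=reply))
messages.append(ChatMessage(role="user", content="Can you summarize that in one sentence?"))
print(huggingface_chat_engine.complete_chat(messages=messages))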
31 changes: 0 additions & 31 deletions doc/code/huggingface_endpoints.py.tt

This file was deleted.
