
add streaming to chat engines (#6717)
logan-markewich committed Jul 7, 2023
1 parent a1f29ee commit d164316
Showing 10 changed files with 454 additions and 99 deletions.
3 changes: 2 additions & 1 deletion CHANGELOG.md
@@ -3,9 +3,10 @@
 ## Unreleased

 ### New Features
-- Sub question query engine returns source nodes of sub questions in `response.metadata['sources']` (#6745)
+- Sub question query engine returns source nodes of sub questions in the callback manager (#6745)

 ### Bug Fixes / Nits
+- Added/Fixed streaming support to simple and condense chat engines (#6717)
 - fixed `response_mode="no_text"` response synthesizer (#6755)
 - fixed error setting `num_output` and `context_window` in service context (#6766)
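The streaming change in #6717 means `stream_chat` hands back a response whose tokens arrive through a generator rather than as a finished string. A minimal sketch of that interface shape — `StreamingChatResponse` and `fake_llm_stream` are illustrative stand-ins, not the library's actual classes:

```python
from dataclasses import dataclass
from typing import Iterator


@dataclass
class StreamingChatResponse:
    # Stand-in for the object returned by `stream_chat`; the notebooks
    # below consume its `response_gen` attribute token by token.
    response_gen: Iterator[str]

    def collect(self) -> str:
        # Exhaust the generator and return the full response text.
        return "".join(self.response_gen)


def fake_llm_stream(prompt: str) -> Iterator[str]:
    # Stand-in for a streaming LLM call: yields the answer in chunks
    # instead of blocking until the whole completion is ready.
    for chunk in ["Streaming ", "lets you ", "print tokens ", "as they arrive."]:
        yield chunk


response = StreamingChatResponse(response_gen=fake_llm_stream("hello"))
full_text = response.collect()
```

The same generator can instead be iterated with `print(token, end="")` to render tokens incrementally, which is exactly the pattern the updated notebooks use.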
115 changes: 101 additions & 14 deletions docs/examples/chat_engine/chat_engine_condense_question.ipynb
@@ -45,7 +45,7 @@
"tags": []
},
"source": [
"### Get started in 5 lines of code"
"## Get started in 5 lines of code"
]
},
{
@@ -65,16 +65,7 @@
"metadata": {
"tags": []
},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"/Users/suo/miniconda3/envs/llama/lib/python3.9/site-packages/deeplake/util/check_latest_version.py:32: UserWarning: A newer version of deeplake (3.6.7) is available. It's recommended that you update to the latest version using `pip install -U deeplake`.\n",
" warnings.warn(\n"
]
}
],
"outputs": [],
"source": [
"from llama_index import VectorStoreIndex, SimpleDirectoryReader\n",
"\n",
@@ -303,13 +294,109 @@
"source": [
"print(response)"
]
},
{
"cell_type": "markdown",
"id": "a65ad1a2",
"metadata": {},
"source": [
"## Streaming Support"
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "ad272dfe",
"metadata": {},
"outputs": [],
"source": [
"from llama_index import ServiceContext, VectorStoreIndex, SimpleDirectoryReader\n",
"from llama_index.llms import OpenAI\n",
"\n",
"service_context = ServiceContext.from_defaults(\n",
" llm=OpenAI(model=\"gpt-3.5-turbo-0613\", temperature=0)\n",
")\n",
"\n",
"data = SimpleDirectoryReader(input_dir=\"../data/paul_graham/\").load_data()\n",
"\n",
"index = VectorStoreIndex.from_documents(data, service_context=service_context)"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "22605caa",
"metadata": {},
"outputs": [],
"source": [
"chat_engine = index.as_chat_engine(verbose=True, streaming=True)"
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "250abd43",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Querying with: What did Paul Graham do after YC? Can you write a poem to describe his activities?\n",
"After YC, Paul Graham took a break,\n",
"To focus on something new, for his own sake.\n",
"He decided to paint, to see how good he could be,\n",
"And for a while, it consumed him completely.\n",
"\n",
"He spent 2014 with brush in hand,\n",
"Creating art, in a world so grand.\n",
"But then, in the middle of a painting, he lost his drive,\n",
"Finishing it felt like a chore, he couldn't revive.\n",
"\n",
"So he put down his brushes, and turned to writing,\n",
"Essays flowed from his mind, thoughts exciting.\n",
"But that wasn't enough, he needed more,\n",
"And so, he returned to Lisp, a language to explore.\n",
"\n",
"Lisp, a language of computation and creation,\n",
"Brought him back to his coding fascination.\n",
"He worked on YC's internal software, with Arc in his hand,\n",
"But gradually, his focus shifted, as YC's demands expanded.\n",
"\n",
"He realized that YC had become his life's work,\n",
"And it was time for a change, a new perk.\n",
"So he handed over the reins to Sam Altman,\n",
"And stepped back, as a new chapter began.\n",
"\n",
"Now, he writes and advises, with wisdom to share,\n",
"His experiences and insights, he's always there.\n",
"Paul Graham, a man of many talents and dreams,\n",
"Continues to inspire, through his words and schemes."
]
}
],
"source": [
"response = chat_engine.stream_chat(\n",
" \"What did Paul Graham do after YC? Write a poem with your answer\"\n",
")\n",
"for token in response.response_gen:\n",
" print(token, end=\"\")"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "5e92a544",
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"display_name": "venv",
"language": "python",
"name": "python3"
"name": "venv"
},
"language_info": {
"codemirror_mode": {
@@ -321,7 +408,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.16"
"version": "3.9.6"
}
},
"nbformat": 4,
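The condense-question engine's streaming path can be pictured in two steps: rewrite the incoming message against the chat history into a standalone question (the "Querying with: …" line in the notebook output), then stream the query engine's answer. The sketch below is a simplified toy model of that flow — it yields tokens directly rather than returning a response object as the library does, and none of these class or method names come from llama_index:

```python
from typing import Iterator, List, Tuple


class CondenseQuestionSketch:
    """Toy model of a condense-question chat engine with streaming."""

    def __init__(self) -> None:
        self.chat_history: List[Tuple[str, str]] = []

    def _condense(self, message: str) -> str:
        # The real engine asks an LLM to fold the history into a
        # standalone question; here we just tag the message.
        if not self.chat_history:
            return message
        return f"(in light of earlier turns) {message}"

    def _stream_answer(self, question: str) -> Iterator[str]:
        # Stand-in for a streaming query: yield the answer word by word.
        for word in f"Answering: {question}".split():
            yield word + " "

    def stream_chat(self, message: str) -> Iterator[str]:
        question = self._condense(message)
        tokens: List[str] = []
        for token in self._stream_answer(question):
            tokens.append(token)
            yield token
        # Record the turn only after the full answer has streamed.
        self.chat_history.append((message, "".join(tokens)))


engine = CondenseQuestionSketch()
first = "".join(engine.stream_chat("What did Paul Graham do after YC?"))
second = "".join(engine.stream_chat("Summarize that as a poem."))
```

Because the second call sees a non-empty history, its question is condensed first — mirroring how the notebook's follow-up prompts get rewritten before retrieval.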
102 changes: 100 additions & 2 deletions docs/examples/chat_engine/chat_engine_repl.ipynb
@@ -318,12 +318,110 @@
"chat_engine.chat_repl()"
]
},
{
"cell_type": "markdown",
"id": "accae591",
"metadata": {},
"source": [
"## Streaming Support"
]
},
{
"cell_type": "code",
"execution_count": null,
"execution_count": 1,
"id": "cd62a9f1-ed27-4730-82de-9a89750c08fd",
"metadata": {},
"outputs": [],
"source": [
"from llama_index.llms import OpenAI\n",
"from llama_index import ServiceContext\n",
"\n",
"service_context = ServiceContext.from_defaults(\n",
" llm=OpenAI(temperature=0.0, model=\"gpt-3.5-turbo-0613\")\n",
")"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "6ef942f9",
"metadata": {},
"outputs": [],
"source": [
"from llama_index.chat_engine import SimpleChatEngine\n",
"\n",
"chat_engine = SimpleChatEngine.from_defaults(service_context=service_context)"
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "e0c69d90",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"In a world where whimsy takes its flight,\n",
"Where dreams and reality intertwine,\n",
"A tale unfolds, both strange and bright,\n",
"Of raining cats and dogs, so divine.\n",
"\n",
"From the heavens, a tempest brews,\n",
"Clouds gather, dark and thick,\n",
"And as the wind begins to choose,\n",
"The sky releases a whimsical trick.\n",
"\n",
"Down they fall, with paws and tails,\n",
"Cats and dogs, in a watery dance,\n",
"Tiny meows and barks prevail,\n",
"As they descend in a wild romance.\n",
"\n",
"The felines, graceful, land with poise,\n",
"Their fur glistening, sleek and fine,\n",
"With eyes that gleam like emerald joys,\n",
"They prance and purr, in a feline line.\n",
"\n",
"The canines, playful, splash and bound,\n",
"Their wagging tails a joyful sight,\n",
"With tongues that pant and ears that sound,\n",
"They frolic and bark, with all their might.\n",
"\n",
"Together they create a symphony,\n",
"A chorus of meows and barks,\n",
"A spectacle for all to see,\n",
"As they dance upon the parks.\n",
"\n",
"Children giggle, adults stare,\n",
"Amazed by this peculiar sight,\n",
"For in this moment, they're all aware,\n",
"Of the magic raining from the height.\n",
"\n",
"And as the storm begins to wane,\n",
"The cats and dogs return above,\n",
"Leaving behind a world untamed,\n",
"A memory of a rain so rare and of love.\n",
"\n",
"So, let us cherish this whimsical tale,\n",
"Of raining cats and dogs, so grand,\n",
"For in the extraordinary, we prevail,\n",
"And find enchantment in the palm of our hand."
]
}
],
"source": [
"response = chat_engine.stream_chat(\"Write me a poem about raining cats and dogs.\")\n",
"for token in response.response_gen:\n",
" print(token, end=\"\")"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "c33715a9",
"metadata": {},
"outputs": [],
"source": []
}
],
@@ -343,7 +441,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.16"
"version": "3.9.6"
}
},
"nbformat": 4,
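The simple chat engine has no retrieval step: it just forwards the running message list to the LLM and streams the reply. One design point worth noting is that with streaming, the assistant's turn can only be appended to the history after the generator is fully consumed, since only then is the complete text known. A hedged sketch under that assumption — again yielding tokens directly rather than returning a response object, with all names being illustrative:

```python
from typing import Iterator, List


class SimpleChatSketch:
    """Toy model of a simple (no-retrieval) chat engine with streaming."""

    def __init__(self) -> None:
        self.messages: List[dict] = []  # [{"role": ..., "content": ...}]

    def _llm_stream(self, messages: List[dict]) -> Iterator[str]:
        # Stand-in for a streaming chat-completion call.
        last = messages[-1]["content"]
        for chunk in ["You said: ", last]:
            yield chunk

    def stream_chat(self, message: str) -> Iterator[str]:
        self.messages.append({"role": "user", "content": message})
        parts: List[str] = []
        for token in self._llm_stream(self.messages):
            parts.append(token)
            yield token
        # Store the assistant turn only once the stream is exhausted,
        # because the full text does not exist before that point.
        self.messages.append({"role": "assistant", "content": "".join(parts)})


engine = SimpleChatSketch()
reply = "".join(engine.stream_chat("Write me a poem."))
```

If a caller abandons the loop early, the partial reply is never recorded — which is why consuming `response_gen` to completion matters before issuing the next message.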
