Revert PanelCallbackHandler change #5710

Merged (9 commits) on Oct 23, 2023
Binary file modified examples/assets/panel_callback_handler_agent.png
Binary file modified examples/assets/panel_callback_handler_retriever.png
2 changes: 1 addition & 1 deletion examples/reference/chat/ChatFeed.ipynb
@@ -31,7 +31,7 @@
"\n",
"Check out the [panel-chat-examples](https://holoviz-topics.github.io/panel-chat-examples/) docs to see applicable examples related to [LangChain](https://python.langchain.com/docs/get_started/introduction), [OpenAI](https://openai.com/blog/chatgpt), [Mistral](https://www.google.com/url?sa=t&rct=j&q=&esrc=s&source=web&cd=&ved=2ahUKEwjZtP35yvSBAxU00wIHHerUDZAQFnoECBEQAQ&url=https%3A%2F%2Fdocs.mistral.ai%2F&usg=AOvVaw2qpx09O_zOzSksgjBKiJY_&opi=89978449), [Llama](https://ai.meta.com/llama/), etc. If you have an example to demo, we'd love to add it to the panel-chat-examples gallery!\n",
"\n",
"![Chat Design Specification](../../assets/ChatDesignSpecification.png)\n",
"<img alt=\"Chat Design Specification\" src=\"../../assets/ChatDesignSpecification.png\"></img>\n",
"\n",
"#### Parameters:\n",
"\n",
5 changes: 3 additions & 2 deletions examples/reference/chat/ChatInterface.ipynb
@@ -29,7 +29,8 @@
"\n",
"Check out the [panel-chat-examples](https://holoviz-topics.github.io/panel-chat-examples/) docs to see applicable examples related to [LangChain](https://python.langchain.com/docs/get_started/introduction), [OpenAI](https://openai.com/blog/chatgpt), [Mistral](https://www.google.com/url?sa=t&rct=j&q=&esrc=s&source=web&cd=&ved=2ahUKEwjZtP35yvSBAxU00wIHHerUDZAQFnoECBEQAQ&url=https%3A%2F%2Fdocs.mistral.ai%2F&usg=AOvVaw2qpx09O_zOzSksgjBKiJY_&opi=89978449), [Llama](https://ai.meta.com/llama/), etc. If you have an example to demo, we'd love to add it to the panel-chat-examples gallery!\n",
"\n",
"![Chat Design Specification](../../assets/ChatDesignSpecification.png)\n",
"<img alt=\"Chat Design Specification\" src=\"../../assets/ChatDesignSpecification.png\"></img>\n",
"\n",
"#### Parameters:\n",
"\n",
"##### Core\n",
@@ -406,5 +407,5 @@
}
},
"nbformat": 4,
"nbformat_minor": 2
"nbformat_minor": 4
}
2 changes: 1 addition & 1 deletion examples/reference/chat/ChatMessage.ipynb
@@ -31,7 +31,7 @@
"\n",
"See [`ChatInterface`](ChatInterface.ipynb) for a high-level, *easy to use*, *ChatGPT like* interface.\n",
"\n",
"![Chat Design Specification](../../assets/ChatDesignSpecification.png)\n",
"<img alt=\"Chat Design Specification\" src=\"../../assets/ChatDesignSpecification.png\"></img>\n",
"\n",
"#### Parameters:\n",
"\n",
19 changes: 10 additions & 9 deletions examples/reference/chat/PanelCallbackHandler.ipynb
@@ -4,7 +4,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"The [Langchain](https://python.langchain.com/docs/get_started/introduction) `PanelCallbackHandler` is useful for rendering and streaming *chain of thought* from Langchain objects like Tools, Agents, and Chains. It inherits from Langchain's [BaseCallbackHandler](https://python.langchain.com/docs/modules/callbacks/).\n",
"The [Langchain](https://python.langchain.com/docs/get_started/introduction) `PanelCallbackHandler` is useful for rendering and streaming the *chain of thought* from Langchain objects like Tools, Agents, and Chains. It inherits from Langchain's [BaseCallbackHandler](https://python.langchain.com/docs/modules/callbacks/).\n",
"\n",
"Check out the [panel-chat-examples](https://holoviz-topics.github.io/panel-chat-examples/langchain/) docs to see more examples on how to use `PanelCallbackHandler`. If you have an example to demo, we'd love to add it to the panel-chat-examples gallery!\n",
"\n",
@@ -46,7 +46,7 @@
"pn.chat.ChatInterface(callback=callback).servable()\n",
"```\n",
"\n",
"![Panel Callback Handler Basic](../../assets/panel_callback_handler_basic.png)\n",
"<img alt=\"Panel Callback Handler Basic\" src=\"../../assets/panel_callback_handler_basic.png\"></img>\n",
"\n",
"This example shows the response from the `llm` only. An `llm` by itself does not show any *chain of thought*. Later we will build an agent that uses tools; that will show the *chain of thought*."
]
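For context, this hunk only shows the tail of the basic example, so here is a minimal, self-contained sketch of what such a callback might look like; the imports and the body of `callback` are elided from the diff, so everything beyond the final `servable()` line is an assumption:

```python
import panel as pn
from langchain.llms import OpenAI

pn.extension()

# Assumes an OPENAI_API_KEY is available in the environment.
llm = OpenAI(temperature=0)

def callback(contents, user, instance):
    # The handler renders the LLM's response into the ChatInterface,
    # so the callback itself does not need to return anything.
    callback_handler = pn.chat.langchain.PanelCallbackHandler(instance)
    llm.predict(contents, callbacks=[callback_handler])

pn.chat.ChatInterface(callback=callback).servable()
```

Because the handler's `on_llm_start`/`on_llm_end` hooks create and update the `ChatMessage`, the callback only needs to construct the handler and invoke the model with it.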
@@ -74,7 +74,7 @@
"pn.chat.ChatInterface(callback=callback).servable()\n",
"```\n",
"\n",
"![Panel Callback Handler Basic](../../assets/panel_callback_handler_basic.png)"
"<img alt=\"Panel Callback Handler Basic\" src=\"../../assets/panel_callback_handler_basic.png\"></img>"
]
},
{
@@ -114,7 +114,7 @@
"source": [
"### Agents with Tools\n",
"\n",
"Again, `async` is not required, but more efficient.\n",
"Tools can also be used; their inputs and outputs will be streamed to the chat feed by passing the `callback_handler` to `callbacks`.\n",
"\n",
"```python\n",
"from langchain.llms import OpenAI\n",
@@ -127,9 +127,9 @@
" tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True, \n",
")\n",
"\n",
"async def callback(contents, user, instance):\n",
"def callback(contents, user, instance):\n",
" callback_handler = pn.chat.langchain.PanelCallbackHandler(instance)\n",
" await agent.arun(contents, callbacks=[callback_handler])\n",
" return agent.run(contents, callbacks=[callback_handler])\n",
"\n",
"pn.chat.ChatInterface(callback=callback).servable()\n",
"```\n",
@@ -210,7 +210,8 @@
"async def callback(contents, user, instance):\n",
" callback_handler = pn.chat.langchain.PanelCallbackHandler(instance)\n",
" chain = get_chain(callbacks=[callback_handler])\n",
" await chain.ainvoke(contents)\n",
" response = await chain.ainvoke(contents)\n",
" return response.content\n",
"\n",
"\n",
"pn.chat.ChatInterface(callback=callback).servable()\n",
@@ -219,7 +220,7 @@
"Please note that the `hack` above is needed because retrievers currently do not call any `CallbackHandler`s themselves.\n",
"See [LangChain Issue#7290](https://github.com/langchain-ai/langchain/issues/7290).\n",
"\n",
"![PanelCallbackHandler with retriever](../../assets/panel_callback_handler_retriever.png)"
"<img alt=\"PanelCallbackHandler with retriever\" src=\"../../assets/panel_callback_handler_retriever.png\"></img>"
]
}
],
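The notebook's `get_chain` helper is elided from this diff, so, purely as a rough sketch, a helper of that shape, including the `hack` the text refers to, might look like the following; the FAISS store, the sample text, the prompt, and every name other than `get_chain` are assumptions rather than what the example actually uses:

```python
from uuid import uuid4

from langchain.chat_models import ChatOpenAI
from langchain.embeddings import OpenAIEmbeddings
from langchain.prompts import ChatPromptTemplate
from langchain.schema.runnable import RunnablePassthrough
from langchain.vectorstores import FAISS  # assumes faiss-cpu is installed


def get_chain(callbacks):
    db = FAISS.from_texts(
        ["Panel is a HoloViz library for building data apps in Python."],
        OpenAIEmbeddings(),
    )
    retriever = db.as_retriever()

    def hack(docs):
        # Retrievers do not call callback handlers themselves (see
        # langchain-ai/langchain#7290), so invoke on_retriever_end by hand.
        for callback in callbacks:
            callback.on_retriever_end(docs, run_id=uuid4())
        return docs

    def format_docs(docs):
        return "\n\n".join(doc.page_content for doc in docs)

    prompt = ChatPromptTemplate.from_template(
        "Answer using only this context:\n\n{context}\n\nQuestion: {question}"
    )
    model = ChatOpenAI(callbacks=callbacks)
    return (
        {"context": retriever | hack | format_docs, "question": RunnablePassthrough()}
        | prompt
        | model
    )
```

With a `ChatOpenAI` model at the end of the chain, `await chain.ainvoke(contents)` returns an `AIMessage`, which is why the callback in the hunk above returns `response.content`.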
@@ -230,5 +231,5 @@
}
},
"nbformat": 4,
"nbformat_minor": 2
"nbformat_minor": 4
}
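The *Agents with Tools* hunks above elide the imports and the tool setup, so here is a minimal, self-contained sketch of the synchronous agent pattern this PR reverts to; the `llm-math` tool is an assumption rather than the notebook's actual choice:

```python
import panel as pn
from langchain.agents import AgentType, initialize_agent, load_tools
from langchain.llms import OpenAI

pn.extension()

# Assumes an OPENAI_API_KEY is available in the environment.
llm = OpenAI(temperature=0)
tools = load_tools(["llm-math"], llm=llm)  # assumed tool; the notebook's choice is elided
agent = initialize_agent(
    tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True,
)

def callback(contents, user, instance):
    # Passing the handler via `callbacks` streams the agent's intermediate
    # steps (the chain of thought) into the chat feed as they happen.
    callback_handler = pn.chat.langchain.PanelCallbackHandler(instance)
    return agent.run(contents, callbacks=[callback_handler])

pn.chat.ChatInterface(callback=callback).servable()
```

The handler's `on_tool_start`/`on_tool_end` hooks stream each intermediate tool invocation into the chat feed as the agent works.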
32 changes: 19 additions & 13 deletions panel/chat/langchain.py
@@ -22,7 +22,8 @@
class PanelCallbackHandler(BaseCallbackHandler):
"""
The Langchain `PanelCallbackHandler` itself is not a widget or pane, but is useful for rendering
and streaming output from Langchain Tools, Agents, and Chains as `ChatMessage` objects.
and streaming the *chain of thought* from Langchain Tools, Agents, and Chains
as `ChatMessage` objects.

Reference: https://panel.holoviz.org/reference/chat/PanelCallbackHandler.html

@@ -67,13 +68,20 @@ def _update_active(self, avatar: str, label: str):
if f"- {label}" not in self._active_user:
self._active_user = f"{self._active_user} - {label}"

def _reset_active(self):
self._active_user = self._input_user
self._active_avatar = self._input_avatar
self._message = None

def _stream(self, message: str):
return self.instance.stream(
message,
user=self._active_user,
avatar=self._active_avatar,
message=self._message,
)
if message.strip():
return self.instance.stream(
message,
user=self._active_user,
avatar=self._active_avatar,
message=self._message,
)
return self._message

def on_llm_start(self, serialized: Dict[str, Any], *args, **kwargs):
model = kwargs.get("invocation_params", {}).get("model_name", "")
@@ -99,9 +107,7 @@ def on_llm_end(self, response: LLMResult, *args, **kwargs):
respond=False,
)

self._active_user = self._input_user
self._active_avatar = self._input_avatar
self._message = None
self._reset_active()
return super().on_llm_end(response, *args, **kwargs)

def on_llm_error(self, error: Union[Exception, KeyboardInterrupt], *args, **kwargs):
@@ -117,10 +123,12 @@ def on_tool_start(
self, serialized: Dict[str, Any], input_str: str, *args, **kwargs
):
self._update_active(DEFAULT_AVATARS["tool"], serialized["name"])
self._stream(f"Tool input: {input_str}")
return super().on_tool_start(serialized, input_str, *args, **kwargs)

def on_tool_end(self, output: str, *args, **kwargs):
self._stream(output)
self._reset_active()
return super().on_tool_end(output, *args, **kwargs)

def on_tool_error(
@@ -136,9 +144,7 @@ def on_chain_start(
return super().on_chain_start(serialized, inputs, *args, **kwargs)

def on_chain_end(self, outputs: Dict[str, Any], *args, **kwargs):
if 'output' in outputs: # The chain is finished. Report the result
self.instance.disabled = self._disabled_state
self._stream(outputs['output'])
self.instance.disabled = self._disabled_state
return super().on_chain_end(outputs, *args, **kwargs)

def on_retriever_error(
4 changes: 2 additions & 2 deletions panel/tests/chat/test_langchain.py
@@ -30,7 +30,7 @@ async def callback(contents, user, instance):
tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, callbacks=[callback_handler]
)
instance.send("2 + 2")
assert len(instance.objects) == 4
assert len(instance.objects) == 3
assert instance.objects[1].object == "Action: Python REPL\nAction Input: print(2 + 2)"
assert instance.objects[2].object == "Final Answer: 4"
assert instance.objects[3].object == "4"
assert not instance.disabled