Add simple fix to update container status after llm complete #8311
Conversation
@KedoKudo Thanks for the contribution. I believe the relevant logic lives in streamlit/lib/streamlit/external/langchain/streamlit_callback_handler.py, lines 238 to 261 (at ef2851f). I'm wondering if we should just call the following here: `complete(self._labeler.get_final_agent_thought_label())`
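For context, a minimal self-contained sketch of what that call would do. The labeler, state enum, and `on_agent_finish` hook below are stand-ins, not the real Streamlit internals:

```python
from enum import Enum


class LLMThoughtState(Enum):
    THINKING = "THINKING"
    COMPLETE = "COMPLETE"
    ERROR = "ERROR"


class FakeLabeler:
    """Stand-in for the handler's labeler object."""

    def get_final_agent_thought_label(self) -> str:
        return "**Complete!**"


class FakeThought:
    """Stand-in for LLMThought; only the pieces discussed above."""

    def __init__(self) -> None:
        self._labeler = FakeLabeler()
        self._state = LLMThoughtState.THINKING
        self.label = None

    def complete(self, final_label: str) -> None:
        # In the real handler this also updates the expander/status widget.
        self.label = final_label
        self._state = LLMThoughtState.COMPLETE

    def on_agent_finish(self) -> None:
        # The suggested one-liner: close the thought with the final label.
        self.complete(self._labeler.get_final_agent_thought_label())


thought = FakeThought()
thought.on_agent_finish()
```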
Another aspect that also doesn't seem to work well right now is if the LLM runs into an error. It would be awesome if you could change `on_llm_error` as well:

```python
def on_llm_error(self, error: BaseException, *args: Any, **kwargs: Any) -> None:
    self._container.exception(error)
    self._state = LLMThoughtState.ERROR
    self.complete("LLM encountered an error...")
```
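Sketched end to end with stand-in container and state objects (the real ones live in Streamlit's handler, so everything below except the `on_llm_error` body is a mock):

```python
from enum import Enum
from typing import Any


class LLMThoughtState(Enum):
    RUNNING = "RUNNING"
    ERROR = "ERROR"


class FakeContainer:
    """Stand-in for the Streamlit container; records the rendered error."""

    def __init__(self) -> None:
        self.shown_exception = None

    def exception(self, error: BaseException) -> None:
        self.shown_exception = error


class FakeThought:
    def __init__(self) -> None:
        self._container = FakeContainer()
        self._state = LLMThoughtState.RUNNING
        self.final_label = None

    def complete(self, final_label: str) -> None:
        # The real code collapses the status widget; we just record the label.
        self.final_label = final_label

    def on_llm_error(self, error: BaseException, *args: Any, **kwargs: Any) -> None:
        # The suggested change: show the error, flag the state, close the status.
        self._container.exception(error)
        self._state = LLMThoughtState.ERROR
        self.complete("LLM encountered an error...")


thought = FakeThought()
thought.on_llm_error(ValueError("boom"))
```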
The suggested change has been added to this PR.
LGTM 👍
Describe your changes
This PR fixes an issue when using LangChain with `StreamlitCallbackHandler`. Currently, the container status is stuck at "running" after all content has been displayed. This PR adds one more step to update the container status once all content is displayed:
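Conceptually, the added step amounts to the following. The class and method names here are illustrative stand-ins for Streamlit's handler, not the actual diff:

```python
class FakeStatus:
    """Stand-in for the status container; tracks its display state."""

    def __init__(self) -> None:
        self.state = "running"

    def update(self, *, state: str) -> None:
        self.state = state


class Handler:
    """Illustrative handler showing the change in behavior."""

    def __init__(self) -> None:
        self._status = FakeStatus()

    def on_agent_finish(self) -> None:
        # Previously the method returned without this call, leaving the
        # container stuck at state == "running". The added step flips it:
        self._status.update(state="complete")


handler = Handler()
handler.on_agent_finish()
```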
GitHub Issue Link (if applicable)
This issue is reported in langchain-ai/langchain#11398
Testing Plan
Contribution License Agreement
By submitting this pull request you agree that all contributions to this project are made under the Apache 2.0 license.