
StreamlitCallbackHandler #6315

Merged

Conversation

tconkling
Contributor

A new implementation of StreamlitCallbackHandler. It formats Agent thoughts into Streamlit expanders.

You can see the handler in action here: https://langchain-mrkl.streamlit.app/
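
For reference, a minimal usage sketch in the spirit of the linked demo app (the SerpAPI/OpenAI agent setup and the required API keys are illustrative assumptions, not part of this PR):

```python
import streamlit as st
from langchain.agents import AgentType, initialize_agent, load_tools
from langchain.callbacks.streamlit import StreamlitCallbackHandler
from langchain.llms import OpenAI

# Assumes OPENAI_API_KEY and SERPAPI_API_KEY are set in the environment.
llm = OpenAI(temperature=0, streaming=True)
tools = load_tools(["serpapi", "llm-math"], llm=llm)
agent = initialize_agent(tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION)

prompt = st.text_input("Ask a question")
if prompt:
    # Each Agent thought (LLM reasoning plus tool call) renders as an expander
    # inside this container, updating live while the agent runs.
    st_callback = StreamlitCallbackHandler(st.container())
    st.write(agent.run(prompt, callbacks=[st_callback]))
```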

Per a discussion with Harrison, we'll be adding a StreamlitCallbackHandler implementation to an upcoming Streamlit release as well, and will be updating it as we add new LLM- and LangChain-specific features to Streamlit.

The idea with this PR is that the LangChain StreamlitCallbackHandler will "auto-update" in a way that keeps it forward- (and backward-) compatible with Streamlit. If the user has an older Streamlit version installed, the LangChain StreamlitCallbackHandler will be used; if they have a newer Streamlit version that has an updated StreamlitCallbackHandler, that implementation will be used instead.
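
To make the auto-update behavior concrete, here is a rough sketch of the dispatch pattern described above (the Streamlit-side module path and the internal fallback path are assumptions for illustration, not necessarily the exact code in this PR):

```python
def StreamlitCallbackHandler(parent_container, **kwargs):
    """Factory that prefers Streamlit's own handler implementation when available."""
    try:
        # Assumed module path: if the installed Streamlit ships a (possibly newer)
        # implementation, use it so behavior tracks Streamlit releases.
        from streamlit.external.langchain import (
            StreamlitCallbackHandler as _StreamlitImplementation,
        )

        return _StreamlitImplementation(parent_container, **kwargs)
    except ImportError:
        # Otherwise fall back to the copy bundled with LangChain.
        from langchain.callbacks.streamlit.streamlit_callback_handler import (
            StreamlitCallbackHandler as _InternalImplementation,
        )

        return _InternalImplementation(parent_container, **kwargs)
```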

(I'm opening this as a draft to get the conversation going and make sure we're on the same page. We're really excited to land this into LangChain!)

Who can review?

@agola11, @hwchase17

@vercel

vercel bot commented Jun 16, 2023

@tconkling is attempting to deploy a commit to the LangChain Team on Vercel.

A member of the Team first needs to authorize it.

@dev2049 dev2049 added the 03 enhancement (Enhancement of existing functionality) and callbacks labels Jun 17, 2023
* master: (101 commits)
  add FunctionMessage support to `_convert_dict_to_message()` in OpenAI chat model (langchain-ai#6382)
  bump version to 206 (langchain-ai#6465)
  fix neo4j schema query (langchain-ai#6381)
  Update serpapi.py Support baidu list type answer_box (langchain-ai#6386)
  fix: llm caching for replicate (langchain-ai#6396)
  feat: use latest duckduckgo_search API to call (langchain-ai#6409)
  Harrison/unstructured page number (langchain-ai#6464)
  Improve error message (langchain-ai#6275)
  Fix the issue where ANTHROPIC_API_URL set in environment is not takin… (langchain-ai#6400)
  Fix broken links in autonomous agents docs (langchain-ai#6398)
  Update SinglStoreDB vectorstore (langchain-ai#6423)
  Fix for langchain-ai#6431 - chatprompt template with partial variables giing validation error (langchain-ai#6456)
  Harrison/functions in retrieval (langchain-ai#6463)
  Incorrect argument count handling (langchain-ai#5543)
  Fixed a link typo /-/route -> /-/routes. and change endpoint format (langchain-ai#6186)
  docs `retrievers` fixes (langchain-ai#6299)
  Update introduction.mdx (langchain-ai#6425)
  Fix Custom LLM Agent example (langchain-ai#6429)
  Remove backticks without clear purpose from docs (langchain-ai#6442)
  Update web_base.ipynb (langchain-ai#6430)
  ...
Contributor

@vowelparrot vowelparrot left a comment


I like this on a high level! Some things that come to mind as I skim this:

  • It's explicitly targeting agents (or rather the agent executor state machine). That's great, I think, especially at first, but I wonder if we want to indicate this in any way via the callback naming since it doesn't support arbitrary "chain" or other workflows.
  • I may be misreading, but it seems like it supports at most one thought at a time, right? I think that's fine to start; I just want to understand the limits a bit here.

Re: testing, an integration test would be great. If we could mock a couple of the imports, that would be even better, but I'm not sure we are quite that rigorous on the callback integrations yet.

cc @agola11

langchain/callbacks/streamlit/__init__.py (review thread, resolved)
langchain/callbacks/streamlit/mutable_expander.py (review thread, resolved)
langchain/callbacks/streamlit/mutable_expander.py (review thread, resolved)
from enum import Enum
from typing import Any, NamedTuple

from streamlit.delta_generator import DeltaGenerator
Contributor

I think we'll have to lazy import these
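
For reference, one common way to defer these imports so that `import langchain` does not require streamlit to be installed (a sketch, not necessarily how the final code handles it):

```python
from __future__ import annotations

from enum import Enum
from typing import TYPE_CHECKING, Any, NamedTuple

if TYPE_CHECKING:
    # Imported only for type checking; no runtime dependency on streamlit here.
    from streamlit.delta_generator import DeltaGenerator


def _runtime_delta_generator():
    # Deferred runtime import: the ImportError surfaces only when the
    # Streamlit callback handler is actually used, not at module import time.
    from streamlit.delta_generator import DeltaGenerator

    return DeltaGenerator
```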

@sfc-gh-jcarroll
Contributor

Thanks @vowelparrot! I have been working on this with Tim. On the high level feedback:

It's explicitly targeting agents (or rather the agent executor state machine). That's great, I think, especially at first, but I wonder if we want to indicate this in any way via the callback naming since it doesn't support arbitrary "chain" or other workflows.

Yep, we mentioned that in the docstring and would want to call it out in the docs & examples we share too.

  • My only concern about changing the name is that we DO want to make it more general later, so would that be an annoying name change to deprecate / break in a later version?
  • BTW, any ideas on what the right flow / UI would be for this in a non-agent Chain (like a raw SQLChain or something)? We racked our brains a bit and weren't totally sure.

I may be misreading, but it seems like it supports at most one thought at a time, right? I think that's fine to start; I just want to understand the limits a bit here.

I'm not sure I understand what you mean here; could you share a little more detail or an example? The callback integration should support one thought or arbitrarily many thoughts (as seen in the example app). That said, a single thought with no tool use would render fine but look a bit awkward (related to the chain comment above), and we're very open to ideas on how to support this.

Thank you!

* master: (28 commits)
  [Feature][VectorStore] Support StarRocks as vector db (langchain-ai#6119)
  Relax string input mapper check (langchain-ai#6544)
  bump to ver 208 (langchain-ai#6540)
  Harrison/multi tool (langchain-ai#6518)
  Infino integration for simplified logs, metrics & search across LLM data & token usage (langchain-ai#6218)
  Update model token mappings/cost to include 0613 models (langchain-ai#6122)
  Fix issue with non-list `To` header in GmailSendMessage Tool (langchain-ai#6242)
  Integrate Rockset as Vectorstore (langchain-ai#6216)
  Feat: Add a prompt template parameter to qa with structure chains (langchain-ai#6495)
  Add async support for HuggingFaceTextGenInference (langchain-ai#6507)
  Be able to use Codey models on Vertex AI (langchain-ai#6354)
  Add KuzuQAChain (langchain-ai#6454)
  Update index.mdx (langchain-ai#6326)
  Export trajectory eval fn (langchain-ai#6509)
  typo(llamacpp.ipynb): 'condiser' -> 'consider' (langchain-ai#6474)
  Fix typo in docstring of format_tool_to_openai_function (langchain-ai#6479)
  Make streamlit import optional (langchain-ai#6510)
  Fixed: 'readible' -> readable (langchain-ai#6492)
  Documentation Fix: Correct the example code output in the prompt templates doc (langchain-ai#6496)
  Fix link (langchain-ai#6501)
  ...
@vercel

vercel bot commented Jun 21, 2023

The latest updates on your projects.

1 Ignored Deployment: langchain ⬜️ Ignored (updated Jun 22, 2023 5:15pm UTC)

@tconkling
Contributor Author

Thanks for the review, @vowelparrot! I'll take care of the nits today, and get started on integration tests.

I'm not able to add Streamlit as an optional dependency because Streamlit's supported Python version range excludes Python 3.9.7 (due to an issue we had with that release), and Poetry won't resolve Streamlit into LangChain's dependency set because of that 3.9.7 exclusion.

@tconkling tconkling marked this pull request as ready for review June 22, 2023 16:43
* master:
  MD header text splitter returns Documents (langchain-ai#6571)
  Fix callback forwarding in async plan method for OpenAI function agent (langchain-ai#6584)
  bump 209 (langchain-ai#6593)
  Clarifai integration (langchain-ai#5954)
  Add missing word in comment (langchain-ai#6587)
  Add AzureML endpoint LLM wrapper (langchain-ai#6580)
  Add OpenLLM wrapper(langchain-ai#6578)
  feat: interfaces for async embeddings, implement async openai (langchain-ai#6563)
  Upgrade the version of AwaDB and add some new interfaces (langchain-ai#6565)
  add motherduck docs (langchain-ai#6572)
  Detailed using the Twilio tool to send messages with 3rd party apps incl. WhatsApp (langchain-ai#6562)
  Change Data Loader Namespace (langchain-ai#6568)
  Remove duplicate databricks entries in ecosystem integrations (langchain-ai#6569)
  Fix whatsappchatloader - enable parsing new datetime format on WhatsApp chat (langchain-ai#6555)
  Wait for all futures (langchain-ai#6554)
  feat: faiss filter from list (langchain-ai#6537)
  update pr tmpl (langchain-ai#6552)
  Remove unintended double negation in docstring (langchain-ai#6541)
  Minor Grammar Fixes in Docs and Comments (langchain-ai#6536)
@tconkling
Contributor Author

tconkling commented Jun 22, 2023

@vowelparrot (and cc @sfc-gh-jcarroll) - I'm removing the draft status from this PR, and would love another review if you have a chance.

Some notes:

  • There are a few unit tests for the "auto-upgrading callback handler" feature, which pulls StreamlitCallbackHandler from the streamlit package itself if that package ships a newer version of the callback handler.
  • There's a very basic integration test for the handler. It just runs the "Who is Olivia Wilde's boyfriend" agent query that I saw used in other tests. Obviously, this test requires network access, but:
    • I wrote a (very simple!) tool for testing CallbackHandlers that saves an Agent's callback invocations to a pickle file and lets you "replay" those callbacks against any set of CallbackHandlers without hitting the network (or spending your token budget); a rough sketch of the idea follows this list. If this would be useful, I'd be happy to add it to the PR and use it in the integration test. (It's just a single small Python file.)
  • Re: naming - happy to rename this to StreamlitAgentCallbackHandler or whatever you think is appropriate; just let me know and I'll push the change.
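
For illustration only, the record-and-replay idea could look roughly like this (class and function names are hypothetical; this is not the file from the PR, and only a few callback methods are shown):

```python
import pickle
from typing import Any, List, Tuple

from langchain.callbacks.base import BaseCallbackHandler


class CallbackRecorder(BaseCallbackHandler):
    """Records callback invocations as (method name, args, kwargs) tuples."""

    def __init__(self) -> None:
        self.records: List[Tuple[str, tuple, dict]] = []

    def on_llm_start(self, *args: Any, **kwargs: Any) -> None:
        self.records.append(("on_llm_start", args, kwargs))

    def on_llm_end(self, *args: Any, **kwargs: Any) -> None:
        self.records.append(("on_llm_end", args, kwargs))

    def on_tool_start(self, *args: Any, **kwargs: Any) -> None:
        self.records.append(("on_tool_start", args, kwargs))

    def on_tool_end(self, *args: Any, **kwargs: Any) -> None:
        self.records.append(("on_tool_end", args, kwargs))

    # ...same pattern for on_agent_action, on_agent_finish, on_llm_new_token, etc.

    def save(self, path: str) -> None:
        with open(path, "wb") as f:
            pickle.dump(self.records, f)


def replay(path: str, handlers: List[BaseCallbackHandler]) -> None:
    """Re-invokes recorded callbacks against other handlers, no network needed."""
    with open(path, "rb") as f:
        records = pickle.load(f)
    for method_name, args, kwargs in records:
        for handler in handlers:
            getattr(handler, method_name)(*args, **kwargs)
```

The idea would be to run the real agent once with the recorder attached, pickle the result, and then drive StreamlitCallbackHandler (or any other handler) offline from the recording.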

Thanks!

@vowelparrot
Contributor

Will review in an hour!

@vowelparrot
Contributor

@sfc-gh-jcarroll I think this looks good as is, and with the current name.

In terms of other non-agent flows that are common - I'd say:

  • Some fixed retrieval -> response-synthesis flow for Q&A or summarization, e.g. a map-reduce chain or SQL DB chain
  • Information extraction
  • Some of the experimental agents (BabyAGI, Generative agents, etc.)

You know your users better than I do, so maybe the first two are less important for visualizing in Streamlit. Perhaps verifying that this works reasonably well with custom agents that don't use the existing agent executor would be good, even if it doesn't impact this PR directly.

@sfc-gh-jcarroll
Contributor

Awesome!! Anything else needed on our end before this can be merged and released?

We will give those other workflows a try and see what viz makes sense with this callback handler in Streamlit (future work). Thank you!!

@vowelparrot vowelparrot merged commit c28990d into langchain-ai:master Jun 22, 2023
14 checks passed
@tconkling tconkling deleted the tim/StreamlitCallbackHandler branch June 22, 2023 21:11
This was referenced Jun 25, 2023
hwchase17 pushed a commit that referenced this pull request Jun 27, 2023
**Description:** Add a documentation page for the Streamlit Callback
Handler integration (#6315)

Notes:
- Implemented as a markdown file instead of a notebook since example
code runs in a Streamlit app (happy to discuss / consider alternatives
now or later)
- Contains an embedded Streamlit app ->
https://mrkl-minimal.streamlit.app/ Currently this app is hosted out of
a Streamlit repo but we're working to migrate the code to a LangChain
owned repo


![streamlit_docs](https://github.com/hwchase17/langchain/assets/116604821/0b7a6239-361f-470c-8539-f22c40098d1a)

cc @dev2049 @tconkling
tconkling added a commit to streamlit/streamlit that referenced this pull request Jun 27, 2023
Add our LangChain `StreamlitCallbackHandler` (also present in the [LangChain repo](langchain-ai/langchain#6315)), along with some Streamlit-specific tests.

When used from LangChain, this callback handler is an "auto-updating API". That is, a LangChain user can do

```python
from langchain.callbacks.streamlit import StreamlitCallbackHandler
callback = StreamlitCallbackHandler(st.container())
```

and if they have a recent version of Streamlit installed in their environment, Streamlit's copy of the callback handler will be used instead of the LangChain-internal one. This allows us to update and improve `StreamlitCallbackHandler` independently of LangChain, and LangChain users of the callback will see those changes automatically.

In other words, while `StreamlitCallbackHandler` is not part of the public Streamlit `st` API, it _is_ part of LangChain's public API, and we need to keep it stable. (This PR contains a few tests that assert its API stability.)
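
As a concrete, hypothetical example of the kind of stability check described above, a test might pin the constructor's parameter names (the import path and parameter names here are assumptions for illustration):

```python
import inspect

# Assumed import path for Streamlit's copy of the handler.
from streamlit.external.langchain import StreamlitCallbackHandler


def test_callback_handler_signature_is_stable() -> None:
    # LangChain constructs the handler with these parameters, so renaming or
    # removing any of them would silently break LangChain callers.
    params = inspect.signature(StreamlitCallbackHandler.__init__).parameters
    for name in ("parent_container", "expand_new_thoughts", "collapse_completed_thoughts"):
        assert name in params, f"expected stable parameter: {name}"
```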
kacperlukawski pushed a commit to kacperlukawski/langchain that referenced this pull request Jun 29, 2023
kacperlukawski pushed a commit to kacperlukawski/langchain that referenced this pull request Jun 29, 2023
vowelparrot pushed a commit that referenced this pull request Jul 4, 2023
aerrober pushed a commit to aerrober/langchain-fork that referenced this pull request Jul 24, 2023
aerrober pushed a commit to aerrober/langchain-fork that referenced this pull request Jul 24, 2023
eric-skydio pushed a commit to eric-skydio/streamlit that referenced this pull request Dec 20, 2023
zyxue pushed a commit to zyxue/streamlit that referenced this pull request Mar 22, 2024
zyxue pushed a commit to zyxue/streamlit that referenced this pull request Apr 16, 2024