Add shiny.ui.Chat #1453

Merged: 101 commits from chat-llms merged into main on Jul 3, 2024

Conversation

@cpsievert (Collaborator) commented on Jun 5, 2024

Adds a shiny.ui.Chat class, designed to support any LLM provider (i.e., response generator) of your choosing (e.g., OpenAI, Anthropic, Ollama, LangChain, Google, etc.). To get started, consider this bare-bones Chat example that uses no provider at all: it displays a starting message, and then, for each input message, responds with "You said: " followed by the user's message:

App code
from shiny.express import ui

ui.page_opts(title="Hello Shiny Chat")

# Create a chat instance, with an initial message
chat = ui.Chat(
    id="chat",
    messages=[
        {"content": "Hello! How can I help you today?", "role": "assistant"},
    ],
)

# Display the chat
chat.ui()

# Define a callback to run when the user submits a message
@chat.on_user_submit
async def _():
    user_msg = chat.get_user_input()
    await chat.append_message(f"You said: {user_msg}")

[Screenshot of the resulting chat UI]
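
For reference, a core (non-Express) version of the same app might look like the sketch below. This is only a sketch: it assumes that ui.chat_ui() is the UI-side counterpart of the chat.ui() call shown above, and otherwise mirrors the Express example.

from shiny import App, ui

# UI-side placeholder for the chat component (assumes ui.chat_ui() is the
# core-API counterpart of chat.ui() in the Express example above)
app_ui = ui.page_fillable(
    ui.chat_ui("chat"),
)


def server(input, output, session):
    # Create the chat instance with the same initial message
    chat = ui.Chat(
        id="chat",
        messages=[
            {"content": "Hello! How can I help you today?", "role": "assistant"},
        ],
    )

    # Echo back whatever the user submits
    @chat.on_user_submit
    async def _():
        user_msg = chat.get_user_input()
        await chat.append_message(f"You said: {user_msg}")


app = App(app_ui, server)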

An "actual" LLM-powered chat bot might look something more like this (this example requires an OpenAI API key to run). Note also that, by default, responses are interpreted as markdown strings, and code blocks are rendered with code highlighting + copy/paste:

App code
# ------------------------------------------------------------------------------------
# A basic Shiny Chat example powered by OpenAI via LangChain.
# To run it, you'll need an OpenAI API key.
# To get one, follow the instructions at https://platform.openai.com/docs/quickstart
# To use other providers/models via LangChain, see https://python.langchain.com/v0.1/docs/modules/model_io/chat/quick_start/
# ------------------------------------------------------------------------------------
import os

from langchain_openai import ChatOpenAI

from shiny.express import ui

# Provide your API key here (or set the environment variable)
llm = ChatOpenAI(api_key=os.environ.get("OPENAI_API_KEY"))

# Set some Shiny page options
ui.page_opts(
    title="Hello LangChain Chat Models",
    fillable=True,
    fillable_mobile=True,
)

# Create and display an empty chat UI
chat = ui.Chat(id="chat")
chat.ui()


# Define a callback to run when the user submits a message
@chat.on_user_submit
async def _():
    # Get messages currently in the chat
    messages = chat.get_messages()
    # Create a response message stream
    response = llm.astream(messages)
    # Append the response stream into the chat
    await chat.append_message_stream(response)

[Screenshot of the resulting chat UI]
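
The stream passed to chat.append_message_stream() doesn't have to come from an LLM client. Here is a minimal sketch that streams a hand-built Markdown response instead; it assumes the method accepts any async generator of string chunks, the same shape produced by llm.astream() above.

import asyncio

from shiny.express import ui

ui.page_opts(title="Streaming without a provider")

chat = ui.Chat(id="chat")
chat.ui()


# Hand-rolled "response" stream: yields Markdown chunks with a small delay,
# so the message appears incrementally in the chat UI
async def fake_stream():
    chunks = [
        "Here is some **Markdown** with a code block:\n\n",
        "```python\n",
        "print('hello from the stream')\n",
        "```\n",
    ]
    for chunk in chunks:
        await asyncio.sleep(0.2)
        yield chunk


@chat.on_user_submit
async def _():
    await chat.append_message_stream(fake_stream())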

A whole collection of other examples is available under the examples/chat directory. See the basic and enterprise sub-directories to get started with various providers.
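
Since the submit callback only talks to the LangChain chat-model interface (llm.astream()), switching providers mostly amounts to constructing a different model. Below is a sketch of the same app backed by Anthropic; it assumes the langchain_anthropic package is installed, that the ANTHROPIC_API_KEY environment variable is set, and that the model name shown is available (it is illustrative only).

from langchain_anthropic import ChatAnthropic

from shiny.express import ui

# Construct an Anthropic-backed LangChain chat model
# (reads the ANTHROPIC_API_KEY environment variable; model name is illustrative)
llm = ChatAnthropic(model="claude-3-5-sonnet-20240620")

# Set some Shiny page options
ui.page_opts(
    title="Hello Anthropic Chat",
    fillable=True,
    fillable_mobile=True,
)

# Create and display an empty chat UI
chat = ui.Chat(id="chat")
chat.ui()


# Same callback as the OpenAI example: stream the model's response into the chat
@chat.on_user_submit
async def _():
    messages = chat.get_messages()
    response = llm.astream(messages)
    await chat.append_message_stream(response)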

Follow-up tasks

  • Tests

Resolved review threads (outdated): examples/chat/openai/app.py, shiny/ui/_chat.py
@cpsievert cpsievert enabled auto-merge July 3, 2024 21:03
@cpsievert cpsievert added this pull request to the merge queue Jul 3, 2024
@github-merge-queue github-merge-queue bot removed this pull request from the merge queue due to failed status checks Jul 3, 2024
@cpsievert cpsievert merged commit 1a5db90 into main Jul 3, 2024
31 checks passed
@cpsievert cpsievert deleted the chat-llms branch July 3, 2024 21:41
schloerke added a commit to machow/py-shiny that referenced this pull request Jul 5, 2024
* main:
  test(controllers): Refactor column sort and filter methods for Dataframe class (posit-dev#1496)
  Follow up to posit-dev#1453: allow user roles when normalizing a dictionary (posit-dev#1495)
  fix(layout_columns): Fix coercion of scalar row height to list for python <= 3.9 (posit-dev#1494)
  Add `shiny.ui.Chat` (posit-dev#1453)
  docs(Theme): Fix example and clarify usage (posit-dev#1491)
  chore(pyright): Pin pyright version to `1.1.369` to avoid CI failures (posit-dev#1493)
  tests(dataframe): Add additional tests for dataframe (posit-dev#1487)
  bug(data frame): Export `render.StyleInfo` (posit-dev#1488)
schloerke added a commit that referenced this pull request Jul 9, 2024
* main:
  feat(data frame): Support `polars` (#1474)
  api(playwright): Code review of complete playwright API (#1501)
  fix: Move `www/shared/py-shiny` to `www/py-shiny` (#1499)
  test(controllers): Refactor column sort and filter methods for Dataframe class (#1496)
  Follow up to #1453: allow user roles when normalizing a dictionary (#1495)
  fix(layout_columns): Fix coercion of scalar row height to list for python <= 3.9 (#1494)
  Add `shiny.ui.Chat` (#1453)
  docs(Theme): Fix example and clarify usage (#1491)
  chore(pyright): Pin pyright version to `1.1.369` to avoid CI failures (#1493)