
Gradio components in gr.Chatbot() #8131

Merged Jun 14, 2024 · 79 commits (changes shown from 40 commits)

Commits:
803a059
chatbot components
dawoodkhan82 Apr 24, 2024
eaf3737
demoi
dawoodkhan82 Apr 29, 2024
0e161e7
merge
dawoodkhan82 Apr 29, 2024
539ff05
add changeset
gradio-pr-bot Apr 29, 2024
fb2b107
preprocess fix
dawoodkhan82 Apr 29, 2024
54af1b5
Merge branch 'main' into chatbot-components
dawoodkhan82 Apr 29, 2024
d1b9604
add changeset
gradio-pr-bot Apr 29, 2024
838965f
Make guide for tailwind more verbose (#8152)
duerrsimon Apr 30, 2024
d3a11ca
Lite wheel optimization (#7855)
whitphx Apr 30, 2024
80b868e
Add ETag to `/custom_component` route to control browser caching (#8170)
freddyaboulton Apr 30, 2024
ce4d348
Implement JS Client tests (#8109)
hannahblair Apr 30, 2024
e4bd0c4
Remove hatch installation in js/app/package.json which is no longer n…
whitphx Apr 30, 2024
e94d7da
Update test-functional.yml - Fix vulnerability code injection (#8145)
h2oa May 1, 2024
9d8ead8
rework upload to be a class method + pass client into each component …
pngwn May 1, 2024
7d1c87b
chore(deps): update pnpm to v9 (#8123)
renovate[bot] May 1, 2024
1b3d2b6
Use workspace version for code in _website (#8189)
aliabd May 1, 2024
1f0483f
Pass Error status in /dev/reload stream (#8106)
freddyaboulton May 1, 2024
1b1ed31
Convert sse calls in client from async to sync (#8182)
abidlabs May 2, 2024
0430f79
run python reload only if python file changed (#8194)
jameszhou02 May 2, 2024
8e0d09d
fix: handling SIGINT correctly in reload.py, single entrance of block…
Tiger3018 May 2, 2024
1d7fc6f
Add eventsource polyfill for Node.js and browser environments (#8118)
hannahblair May 2, 2024
c5277a3
Ensure connectivity to private HF spaces with SSE protocol (#8181)
hannahblair May 2, 2024
6cac294
Support custom components in gr.load (#8200)
freddyaboulton May 2, 2024
6b102f8
Refactor analytics to not use api.gradio.app (#8180)
freddyaboulton May 3, 2024
7d1a125
Specify the fastapi version on Lite to avoid ujson installation which…
whitphx May 3, 2024
9533ea6
Set the show_api flag on Lite (#8205)
whitphx May 3, 2024
5678375
Extend Interface.from_pipeline() to support Transformers.js.py pipeli…
whitphx May 3, 2024
078df9c
merge
dawoodkhan82 May 6, 2024
7e65963
allow the canvas size to be set on the `ImageEditor` (#8127)
pngwn May 3, 2024
70783cf
Rename `eventSource_Factory` and `fetch_implementation` (#8209)
hannahblair May 3, 2024
4183dd8
remove redundant event source logic (#8211)
hannahblair May 3, 2024
c96ef0d
Only connect to heartbeat if needed (#8169)
freddyaboulton May 3, 2024
1b2fa59
chore: update versions (#8172)
pngwn May 3, 2024
68c70df
merge
dawoodkhan82 May 9, 2024
190edf7
fixes
dawoodkhan82 May 9, 2024
2b4c079
type fixes
dawoodkhan82 May 12, 2024
1a9318d
Merge branch 'main' into chatbot-components
dawoodkhan82 May 12, 2024
ba056cb
type fixes
dawoodkhan82 May 12, 2024
c3ba20a
Merge branch 'main' into chatbot-components
dawoodkhan82 May 13, 2024
5955494
notebook fix
dawoodkhan82 May 13, 2024
daeb642
type ignore
dawoodkhan82 May 13, 2024
2fc0ba9
data object model
dawoodkhan82 May 15, 2024
d179f7e
Merge branch 'main' into chatbot-components
dawoodkhan82 May 15, 2024
649c566
remove component in tuple
dawoodkhan82 May 15, 2024
42144cb
more fixes
dawoodkhan82 May 16, 2024
688e576
Merge branch 'main' into chatbot-components
dawoodkhan82 May 16, 2024
a7ca154
Merge branch 'main' into chatbot-components
dawoodkhan82 May 19, 2024
411e736
extend components
dawoodkhan82 May 22, 2024
5f3321b
merge
dawoodkhan82 May 23, 2024
80a3324
remove test var
dawoodkhan82 May 24, 2024
61ae7c4
extend to all components backend
dawoodkhan82 May 29, 2024
c0ed798
Merge branch 'main' into chatbot-components
dawoodkhan82 May 29, 2024
acc78b9
remove loading data models
dawoodkhan82 May 29, 2024
89540fb
merge
dawoodkhan82 Jun 4, 2024
620e7d9
conflict fix
dawoodkhan82 Jun 4, 2024
6ccc896
test and type fixes
dawoodkhan82 Jun 4, 2024
f8a1b2b
Merge branch 'main' into chatbot-components
dawoodkhan82 Jun 4, 2024
2ef7bd9
playwright test
dawoodkhan82 Jun 4, 2024
37bddec
PR fixes
dawoodkhan82 Jun 6, 2024
71d3d40
Merge branch 'main' into chatbot-components
dawoodkhan82 Jun 6, 2024
3433694
Merge branch 'main' into chatbot-components
freddyaboulton Jun 10, 2024
82220e2
final changes
freddyaboulton Jun 10, 2024
a713691
Merge branch 'main' into chatbot-components
freddyaboulton Jun 10, 2024
825ee2b
Add pltly for 2e2 test
freddyaboulton Jun 10, 2024
60dd746
pass loader to Gradio helper class
pngwn Jun 12, 2024
448dad1
fix things
pngwn Jun 12, 2024
95ef23a
add changeset
gradio-pr-bot Jun 12, 2024
af62d2c
checks
pngwn Jun 12, 2024
e5e7e7b
Merge branch 'main' into chatbot-components
pngwn Jun 12, 2024
247f1e8
more fixy
pngwn Jun 12, 2024
57962c8
more fixy
pngwn Jun 12, 2024
e3075d3
more fixy
pngwn Jun 12, 2024
bb76e5e
guess what? more fixy
pngwn Jun 12, 2024
d8e44fc
fix storybook
pngwn Jun 13, 2024
e83e688
add changeset
gradio-pr-bot Jun 13, 2024
8cb6a3d
formatting
pngwn Jun 13, 2024
95fd6b0
timeout
pngwn Jun 13, 2024
ea43e95
Merge branch 'main' into chatbot-components
pngwn Jun 14, 2024
f4282fb
fixyfixfix
pngwn Jun 14, 2024
9 changes: 9 additions & 0 deletions .changeset/bitter-goats-chew.md
@@ -0,0 +1,9 @@
---
"@gradio/chatbot": minor
"@gradio/image": minor
"@gradio/multimodaltextbox": minor
"@gradio/plot": minor
"gradio": minor
---

feat:Gradio components in `gr.Chatbot()`
Binary file added demo/chatbot_multimodal/files/cantina.wav (not shown)
Binary file added demo/chatbot_multimodal/files/cheetah.jpg (not shown)
Binary file added demo/chatbot_multimodal/files/world.mp4 (not shown)
Binary file added demo/chatbot_multimodal/files/zebra.jpg (not shown)
2 changes: 1 addition & 1 deletion demo/chatbot_multimodal/run.ipynb
@@ -1 +1 @@
{"cells": [{"cell_type": "markdown", "id": "302934307671667531413257853548643485645", "metadata": {}, "source": ["# Gradio Demo: chatbot_multimodal"]}, {"cell_type": "code", "execution_count": null, "id": "272996653310673477252411125948039410165", "metadata": {}, "outputs": [], "source": ["!pip install -q gradio "]}, {"cell_type": "code", "execution_count": null, "id": "288918539441861185822528903084949547379", "metadata": {}, "outputs": [], "source": ["# Downloading files from the demo repo\n", "import os\n", "os.mkdir('files')\n", "!wget -q -O files/avatar.png https://github.com/gradio-app/gradio/raw/main/demo/chatbot_multimodal/files/avatar.png\n", "!wget -q -O files/lion.jpg https://github.com/gradio-app/gradio/raw/main/demo/chatbot_multimodal/files/lion.jpg"]}, {"cell_type": "code", "execution_count": null, "id": "44380577570523278879349135829904343037", "metadata": {}, "outputs": [], "source": ["import gradio as gr\n", "import os\n", "import time\n", "\n", "# Chatbot demo with multimodal input (text, markdown, LaTeX, code blocks, image, audio, & video). 
Plus shows support for streaming text.\n", "\n", "\n", "def print_like_dislike(x: gr.LikeData):\n", " print(x.index, x.value, x.liked)\n", "\n", "def add_message(history, message):\n", " for x in message[\"files\"]:\n", " history.append(((x,), None))\n", " if message[\"text\"] is not None:\n", " history.append((message[\"text\"], None))\n", " return history, gr.MultimodalTextbox(value=None, interactive=False)\n", "\n", "def bot(history):\n", " response = \"**That's cool!**\"\n", " history[-1][1] = \"\"\n", " for character in response:\n", " history[-1][1] += character\n", " time.sleep(0.05)\n", " yield history\n", "\n", "with gr.Blocks() as demo:\n", " chatbot = gr.Chatbot(\n", " [],\n", " elem_id=\"chatbot\",\n", " bubble_full_width=False\n", " )\n", "\n", " chat_input = gr.MultimodalTextbox(interactive=True, file_types=[\"image\"], placeholder=\"Enter message or upload file...\", show_label=False)\n", "\n", " chat_msg = chat_input.submit(add_message, [chatbot, chat_input], [chatbot, chat_input])\n", " bot_msg = chat_msg.then(bot, chatbot, chatbot, api_name=\"bot_response\")\n", " bot_msg.then(lambda: gr.MultimodalTextbox(interactive=True), None, [chat_input])\n", "\n", " chatbot.like(print_like_dislike, None, None)\n", "\n", "demo.queue()\n", "if __name__ == \"__main__\":\n", " demo.launch()\n"]}], "metadata": {}, "nbformat": 4, "nbformat_minor": 5}
{"cells": [{"cell_type": "markdown", "id": "302934307671667531413257853548643485645", "metadata": {}, "source": ["# Gradio Demo: chatbot_multimodal"]}, {"cell_type": "code", "execution_count": null, "id": "272996653310673477252411125948039410165", "metadata": {}, "outputs": [], "source": ["!pip install -q gradio "]}, {"cell_type": "code", "execution_count": null, "id": "288918539441861185822528903084949547379", "metadata": {}, "outputs": [], "source": ["# Downloading files from the demo repo\n", "import os\n", "os.mkdir('files')\n", "!wget -q -O files/avatar.png https://github.com/gradio-app/gradio/raw/main/demo/chatbot_multimodal/files/avatar.png\n", "!wget -q -O files/cantina.wav https://github.com/gradio-app/gradio/raw/main/demo/chatbot_multimodal/files/cantina.wav\n", "!wget -q -O files/cheetah.jpg https://github.com/gradio-app/gradio/raw/main/demo/chatbot_multimodal/files/cheetah.jpg\n", "!wget -q -O files/lion.jpg https://github.com/gradio-app/gradio/raw/main/demo/chatbot_multimodal/files/lion.jpg\n", "!wget -q -O files/world.mp4 https://github.com/gradio-app/gradio/raw/main/demo/chatbot_multimodal/files/world.mp4\n", "!wget -q -O files/zebra.jpg https://github.com/gradio-app/gradio/raw/main/demo/chatbot_multimodal/files/zebra.jpg"]}, {"cell_type": "code", "execution_count": null, "id": "44380577570523278879349135829904343037", "metadata": {}, "outputs": [], "source": ["import gradio as gr\n", "import plotly.express as px\n", "\n", "# Chatbot demo with multimodal input (text, markdown, LaTeX, code blocks, image, audio, & video). 
Plus shows support for streaming text.\n", "\n", "def random_plot():\n", " df = px.data.iris()\n", " fig = px.scatter(df, x=\"sepal_width\", y=\"sepal_length\", color=\"species\",\n", " size='petal_length', hover_data=['petal_width'])\n", " return fig\n", "\n", "def print_like_dislike(x: gr.LikeData):\n", " print(x.index, x.value, x.liked)\n", "\n", "def add_message(history, message):\n", " for x in message[\"files\"]:\n", " history.append(((x,), None))\n", " if message[\"text\"] is not None:\n", " history.append((message[\"text\"], None))\n", " return history, gr.MultimodalTextbox(value=None, interactive=False)\n", "\n", "def bot(history):\n", " history[-1][1] = \"Cool!\"\n", " return history\n", "\n", "fig = random_plot()\n", "\n", "with gr.Blocks(fill_height=True) as demo:\n", " chatbot = gr.Chatbot(\n", " [[\"Image\", gr.Image(value=\"files/avatar.png\", render=False)],\n", " [\"Video\", gr.Video(value=\"files/world.mp4\", render=False)],\n", " [\"Audio\", gr.Audio(value=\"files/cantina.wav\", render=False)],\n", " [\"Plot\", gr.Plot(value=fig, render=False)],\n", " [\"Gallery\", gr.Gallery(value=[\"files/lion.jpg\", \"files/cheetah.jpg\", \"files/zebra.jpg\"], render=False)]],\n", " elem_id=\"chatbot\",\n", " bubble_full_width=False,\n", " scale=1,\n", " )\n", "\n", " chat_input = gr.MultimodalTextbox(interactive=True, file_types=[\"image\"], placeholder=\"Enter message or upload file...\", show_label=False)\n", "\n", " chat_msg = chat_input.submit(add_message, [chatbot, chat_input], [chatbot, chat_input])\n", " bot_msg = chat_msg.then(bot, chatbot, chatbot, api_name=\"bot_response\")\n", " bot_msg.then(lambda: gr.MultimodalTextbox(interactive=True), None, [chat_input])\n", "\n", " chatbot.like(print_like_dislike, None, None)\n", "\n", "demo.queue()\n", "if __name__ == \"__main__\":\n", " demo.launch()\n"]}], "metadata": {}, "nbformat": 4, "nbformat_minor": 5}
31 changes: 19 additions & 12 deletions demo/chatbot_multimodal/run.py
@@ -1,9 +1,13 @@
import gradio as gr
import os
import time
import plotly.express as px

# Chatbot demo with multimodal input (text, markdown, LaTeX, code blocks, image, audio, & video). Plus shows support for streaming text.

def random_plot():
df = px.data.iris()
fig = px.scatter(df, x="sepal_width", y="sepal_length", color="species",
size='petal_length', hover_data=['petal_width'])
return fig

def print_like_dislike(x: gr.LikeData):
print(x.index, x.value, x.liked)
@@ -16,18 +20,21 @@ def add_message(history, message):
return history, gr.MultimodalTextbox(value=None, interactive=False)

def bot(history):
response = "**That's cool!**"
history[-1][1] = ""
for character in response:
history[-1][1] += character
time.sleep(0.05)
yield history

with gr.Blocks() as demo:
history[-1][1] = "Cool!"
return history

fig = random_plot()

with gr.Blocks(fill_height=True) as demo:
chatbot = gr.Chatbot(
[],
[["Image", gr.Image(value="files/avatar.png", render=False)],
["Video", gr.Video(value="files/world.mp4", render=False)],
["Audio", gr.Audio(value="files/cantina.wav", render=False)],
["Plot", gr.Plot(value=fig, render=False)],
["Gallery", gr.Gallery(value=["files/lion.jpg", "files/cheetah.jpg", "files/zebra.jpg"], render=False)]],
Review thread:

dawoodkhan82 (author): @freddyaboulton Any suggestions on how to simplify this so render=False is not necessary? I tried passing it as a constructor arg during postprocessing, but the component would still render.

Collaborator: Can you call unrender for each component in the initial value in the chatbot init?

Collaborator: Wondering if it would work.

elem_id="chatbot",
bubble_full_width=False
bubble_full_width=False,
scale=1,
)

chat_input = gr.MultimodalTextbox(interactive=True, file_types=["image"], placeholder="Enter message or upload file...", show_label=False)
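The tuple-style history that the updated `add_message` builds can be sketched without Gradio itself. This is a plain-Python stand-in: the dict-shaped `message` mirrors what `gr.MultimodalTextbox` submits, and the real function additionally returns a reset textbox component, which is omitted here.

```python
def add_message(history, message):
    """Append uploaded files and text to a tuple-style chat history.

    Sketch of the demo's add_message: file messages are stored as
    1-tuples of paths, plain text as strings; the bot slot starts empty.
    (The real Gradio version also returns a cleared MultimodalTextbox.)
    """
    for path in message["files"]:
        history.append(((path,), None))  # file message: 1-tuple of path
    if message["text"] is not None:
        history.append((message["text"], None))  # text message: plain string
    return history

history = add_message([], {"files": ["cat.png"], "text": "hi"})
```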
77 changes: 62 additions & 15 deletions gradio/components/chatbot.py
@@ -10,7 +10,12 @@
from gradio_client.documentation import document

from gradio import utils
from gradio.components import (
Component as GradioComponent,
)
from gradio.components.base import Component
from gradio.components.gallery import Gallery, GalleryData
from gradio.components.plot import Plot, PlotData
from gradio.data_classes import FileData, GradioModel, GradioRootModel
from gradio.events import Events

@@ -21,7 +26,12 @@ class FileMessage(GradioModel):


class ChatbotData(GradioRootModel):
root: List[Tuple[Union[str, FileMessage, None], Union[str, FileMessage, None]]]
root: List[
Tuple[
Union[str, FileMessage, List[FileMessage], PlotData, GalleryData, None],
Union[str, FileMessage, List[FileMessage], PlotData, GalleryData, None],
]
]


@document()
@@ -40,7 +50,9 @@ class Chatbot(Component):

def __init__(
self,
value: list[list[str | tuple[str] | tuple[str | Path, str] | None]]
value: list[
list[str | GradioComponent | tuple[str] | tuple[str | Path, str] | None]
]
| Callable
| None = None,
*,
@@ -139,24 +151,37 @@ def __init__(
self.placeholder = placeholder

def _preprocess_chat_messages(
self, chat_message: str | FileMessage | None
) -> str | tuple[str | None] | tuple[str | None, str] | None:
self,
chat_message: str
| FileMessage
| List[FileMessage]
| GalleryData
| PlotData
| None,
) -> str | GradioComponent | tuple[str | None] | tuple[str | None, str] | None:
if chat_message is None:
return None
elif isinstance(chat_message, FileMessage):
if chat_message.alt_text is not None:
return (chat_message.file.path, chat_message.alt_text)
else:
return (chat_message.file.path,)
elif isinstance(chat_message, str):
elif isinstance(chat_message, (str)):
return chat_message
elif isinstance(chat_message, GalleryData):
value = Gallery().preprocess(chat_message)
gallery = Gallery(value=value)
return gallery
elif isinstance(chat_message, PlotData):
plot = Plot(value=chat_message)
return plot
else:
raise ValueError(f"Invalid message for Chatbot component: {chat_message}")

def preprocess(
self,
payload: ChatbotData | None,
) -> list[list[str | tuple[str] | tuple[str, str] | None]] | None:
) -> list[list[str | GradioComponent | tuple[str] | tuple[str, str] | None]] | None:
"""
Parameters:
payload: data as a ChatbotData object
@@ -184,18 +209,36 @@ def preprocess(
return processed_messages

def _postprocess_chat_messages(
self, chat_message: str | tuple | list | None
) -> str | FileMessage | None:
if chat_message is None:
return None
elif isinstance(chat_message, (tuple, list)):
filepath = str(chat_message[0])

self, chat_message: str | tuple | list | GradioComponent | None
) -> str | FileMessage | List[FileMessage] | PlotData | GalleryData | None:
def create_file_message(chat_message, filepath):
mime_type = client_utils.get_mimetype(filepath)
return FileMessage(
file=FileData(path=filepath, mime_type=mime_type),
alt_text=chat_message[1] if len(chat_message) > 1 else None,
alt_text=chat_message[1]
if not isinstance(chat_message, GradioComponent)
and len(chat_message) > 1
else None,
)

Review thread:

Collaborator: Wondering if this can be simplified to the following:

    component = chat_message.__class__(**chat_message.constructor_args)
    config = component.get_config()
    return ComponentMessage(
        component=type(chat_message).__name__.lower(),
        value=config.get("value", None),
        constructor_args=[chat_message.constructor_args]
    )

dawoodkhan82 (author): This breaks for the plot component, so left it as is for now.

Collaborator: OK, I can't repro that if I make this change with the demo code I linked (chatbot_components_demo). Also, I'm thinking that constructor_args does not need to be a list.

if chat_message is None:
return None
elif isinstance(chat_message, (tuple, list)):
if isinstance(chat_message[0], GradioComponent):
return type(chat_message[0]).postprocess(
chat_message[0], chat_message[0]._constructor_args[1]["value"]
)
else:
filepath = str(chat_message[0])
return create_file_message(chat_message, filepath)
elif isinstance(chat_message, GradioComponent):
if type(chat_message) == Plot or type(chat_message) == Gallery:
return type(chat_message).postprocess(
chat_message, chat_message._constructor_args[1]["value"]
) # type: ignore
else:
filepath = chat_message._constructor_args[1]["value"]
return create_file_message(chat_message, filepath)
elif isinstance(chat_message, str):
chat_message = inspect.cleandoc(chat_message)
return chat_message
@@ -204,7 +247,10 @@ def _postprocess_chat_messages(

def postprocess(
self,
value: list[list[str | tuple[str] | tuple[str, str] | None] | tuple] | None,
value: list[
list[str | GradioComponent | tuple[str] | tuple[str, str] | None] | tuple
]
| None,
) -> ChatbotData:
"""
Parameters:
@@ -214,6 +260,7 @@ def postprocess(
"""
if value is None:
return ChatbotData(root=[])

processed_messages = []
for message_pair in value:
if not isinstance(message_pair, (tuple, list)):
2 changes: 2 additions & 0 deletions gradio/components/plot.py
@@ -124,6 +124,8 @@ def postprocess(self, value: Any) -> PlotData | None:

if value is None:
return None
if isinstance(value, PlotData):
return value
if isinstance(value, (ModuleType, matplotlib.figure.Figure)): # type: ignore
dtype = "matplotlib"
out_y = processing_utils.encode_plot_to_base64(value, self.format)
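The two-line guard added to `Plot.postprocess` makes the method idempotent: a payload that is already a `PlotData` passes through untouched, so components embedded in chatbot messages are not double-processed. A toy version, where the wrapping branch is deliberately simplified and only the guard mirrors the actual change:

```python
class PlotData:
    """Illustrative stand-in for Gradio's PlotData payload model."""
    def __init__(self, type, plot):
        self.type = type
        self.plot = plot

def postprocess_plot(value):
    """Wrap a raw figure in PlotData, passing processed payloads through."""
    if value is None:
        return None
    if isinstance(value, PlotData):
        return value  # idempotent guard: never wrap twice
    # Simplified wrapping; the real method dispatches on figure type
    return PlotData(type="plotly", plot=str(value))
```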
49 changes: 40 additions & 9 deletions js/chatbot/Index.svelte
@@ -16,8 +16,18 @@
export let elem_classes: string[] = [];
export let visible = true;
export let value: [
string | { file: FileData; alt_text: string | null } | null,
string | { file: FileData; alt_text: string | null } | null
(
| string
| { file: FileData | FileData[]; alt_text: string | null }
| { type: string; plot: string | null }
| null
),
(
| string
| { file: FileData | FileData[]; alt_text: string | null }
| { type: string; plot: string | null }
| null
)
][] = [];
export let scale: number | null = null;
export let min_width: number | undefined = undefined;
@@ -50,19 +60,35 @@
export let avatar_images: [FileData | null, FileData | null] = [null, null];

let _value: [
string | { file: FileData; alt_text: string | null } | null,
string | { file: FileData; alt_text: string | null } | null
(
| string
| { file: FileData | FileData[]; alt_text: string | null }
| { type: string; plot: string | null }
| null
),
(
| string
| { file: FileData | FileData[]; alt_text: string | null }
| { type: string; plot: string | null }
| null
)
][];

const redirect_src_url = (src: string): string =>
src.replace('src="/file', `src="${root}file`);

function normalize_messages(
message: { file: FileData; alt_text: string | null } | null
): { file: FileData; alt_text: string | null } | null {
message: { file: FileData | FileData[]; alt_text: string | null } | null
): { file: FileData | FileData[]; alt_text: string | null } | null {
if (message === null) {
return message;
}
if (Array.isArray(message)) {
return {
file: message.map((file: FileData) => file as FileData),
alt_text: message?.alt_text
};
}
return {
file: message?.file as FileData,
alt_text: message?.alt_text
@@ -73,10 +99,14 @@
? value.map(([user_msg, bot_msg]) => [
typeof user_msg === "string"
? redirect_src_url(user_msg)
: normalize_messages(user_msg),
typeof bot_msg === "string"
: user_msg != null && "plot" in user_msg
? user_msg
: normalize_messages(user_msg),
typeof bot_msg === "string" && bot_msg != null
? redirect_src_url(bot_msg)
: normalize_messages(bot_msg)
: bot_msg != null && "plot" in bot_msg
? bot_msg
: normalize_messages(bot_msg)
])
: [];

@@ -137,6 +167,7 @@
{line_breaks}
{layout}
{placeholder}
upload={gradio.client.upload}
/>
</div>
</Block>
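The `normalize_messages` change lets the `file` field carry either a single `FileData` or a list of them (for galleries), so the UI can treat both uniformly. In Python terms the normalization amounts to the following sketch (a stand-in for the Svelte helper, not the actual frontend code):

```python
def normalize_message(message):
    """Ensure the `file` field is always a list, so single files and
    galleries (lists of files) take the same rendering path."""
    if message is None:
        return None
    files = message["file"]
    if not isinstance(files, list):
        files = [files]  # promote a single file to a 1-element list
    return {"file": files, "alt_text": message.get("alt_text")}
```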
2 changes: 2 additions & 0 deletions js/chatbot/package.json
@@ -10,9 +10,11 @@
"@gradio/atoms": "workspace:^",
"@gradio/audio": "workspace:^",
"@gradio/client": "workspace:^",
"@gradio/gallery": "workspace:^",
"@gradio/icons": "workspace:^",
"@gradio/image": "workspace:^",
"@gradio/markdown": "workspace:^",
"@gradio/plot": "workspace:^",
"@gradio/statustracker": "workspace:^",
"@gradio/theme": "workspace:^",
"@gradio/upload": "workspace:^",