Fix how files are processed in gr.ChatInterface (#7875)
* submit button icon

* loader

* add changeset

* pr fix

* paste

* format

* add changeset

* remove path

* revert

* reset cset

* add changeset

* add changeset

* guide

* guide

* fix

---------

Co-authored-by: Dawood <dawoodkhan82@gmail.com>
Co-authored-by: gradio-pr-bot <gradio-pr-bot@users.noreply.github.com>
3 people committed Mar 28, 2024
1 parent b9dbcf7 commit e6d051d
Showing 4 changed files with 23 additions and 21 deletions.
5 changes: 5 additions & 0 deletions .changeset/sixty-paths-push.md
@@ -0,0 +1,5 @@
---
"gradio": patch
---

feat:Fix how files are processed in `gr.ChatInterface`
4 changes: 2 additions & 2 deletions gradio/chat_interface.py
@@ -467,7 +467,7 @@ def _append_multimodal_history(
        history: list[list[str | tuple | None]],
    ):
        for x in message["files"]:
            history.append([(x["path"],), None])
            history.append([(x,), None])
        if message["text"] is not None and isinstance(message["text"], str):
            history.append([message["text"], response])

@@ -544,7 +544,7 @@ async def _stream_fn(
            first_response = await async_iteration(generator)
            if self.multimodal and isinstance(message, dict):
                for x in message["files"]:
                    history.append([(x["path"],), None])
                    history.append([(x,), None])
                update = history + [[message["text"], first_response]]
                yield update, update
            else:
23 changes: 9 additions & 14 deletions guides/04_chatbots/01_creating-a-chatbot-fast.md
@@ -87,7 +87,8 @@ def slow_echo(message, history):
gr.ChatInterface(slow_echo).launch()
```

Notice that we've [enabled queuing](/guides/key-features#queuing), which is required to use generator functions. While the response is streaming, the "Submit" button turns into a "Stop" button that can be used to stop the generator function. You can customize the appearance and behavior of the "Stop" button using the `stop_btn` parameter.

Tip: While the response is streaming, the "Submit" button turns into a "Stop" button that can be used to stop the generator function. You can customize the appearance and behavior of the "Stop" button using the `stop_btn` parameter.
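
For example, a minimal sketch reusing the `slow_echo` generator above (this assumes `stop_btn` accepts a label string, which is not shown in this guide):

```python
import time
import gradio as gr

def slow_echo(message, history):
    for i in range(len(message)):
        time.sleep(0.3)
        yield "You typed: " + message[: i + 1]

# Relabel the button shown while the response is streaming.
gr.ChatInterface(slow_echo, stop_btn="Stop generating").launch()
```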

## Customizing your chatbot

@@ -128,31 +129,25 @@ gr.ChatInterface(

You may want to add multimodal capability to your chatbot. For example, you may want users to be able to easily upload images or files to your chatbot and ask questions about it. You can make your chatbot "multimodal" by passing in a single parameter (`multimodal=True`) to the `gr.ChatInterface` class.

```python
import gradio as gr

chat_input = gr.MultimodalTextbox(file_types=["image"], placeholder="Enter message or upload file...")
```

`gr.ChatInterface` also supports multimodality; simply pass in the `multimodal` parameter as `True`:

```python
import gradio as gr
import time

def echo(message, history):
    t = x['text']
    for i in range(len(t)):
        time.sleep(0.5)
        yield t[:i+1]
def count_files(message, history):
    num_files = len(message["files"])
    return f"You uploaded {num_files} files"

demo = gr.ChatInterface(fn=echo, examples=["hello", "hola", "merhaba"], title="Echo Bot", multimodal=True)
demo = gr.ChatInterface(fn=count_files, examples=[{"text": "Hello", "files": []}], title="Echo Bot", multimodal=True)

demo.launch()
```

When `multimodal=True`, the first parameter of your function should receive a dictionary consisting of the submitted text and uploaded files that looks like this: `{"text": "user input", "files": ["file_path1", "file_path2", ...]}`.
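
For example, a minimal sketch of a handler that reads both keys of this dictionary (the field contents follow the format described above):

```python
import gradio as gr

def describe_upload(message, history):
    # message is a dict: {"text": "user input", "files": ["file_path1", ...]}
    text = message["text"]
    files = message["files"]
    if not files:
        return f"You said: {text}"
    return f"You said: {text} and uploaded {len(files)} file(s)"

gr.ChatInterface(describe_upload, multimodal=True).launch()
```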


Tip: If you'd like to customize the UI/UX of the textbox for your multimodal chatbot, you should pass in an instance of `gr.MultimodalTextbox` to the `textbox` argument of `ChatInterface` instead of an instance of `gr.Textbox`.
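
For example, a minimal sketch that reuses the `MultimodalTextbox` options shown earlier (exact parameters may differ in your Gradio version):

```python
import gradio as gr

def count_files(message, history):
    num_files = len(message["files"])
    return f"You uploaded {num_files} files"

# Pass a MultimodalTextbox instance to customize the input area.
chat_input = gr.MultimodalTextbox(
    file_types=["image"], placeholder="Enter message or upload file..."
)

gr.ChatInterface(fn=count_files, textbox=chat_input, multimodal=True).launch()
```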

## Additional Inputs

You may want to add additional parameters to your chatbot and expose them to your users through the Chatbot UI. For example, suppose you want to add a textbox for a system prompt, or a slider that sets the number of tokens in the chatbot's response. The `ChatInterface` class supports an `additional_inputs` parameter which can be used to add additional input components.
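
For example, a minimal sketch with a system-prompt textbox and a token slider (the extra values are passed to your function after `message` and `history`; the component settings here are illustrative):

```python
import gradio as gr

def respond(message, history, system_prompt, max_tokens):
    # Additional inputs arrive in the same order as `additional_inputs`.
    return f"[{system_prompt}] (up to {max_tokens} tokens) You said: {message}"

gr.ChatInterface(
    respond,
    additional_inputs=[
        gr.Textbox("You are a helpful assistant.", label="System Prompt"),
        gr.Slider(10, 100, value=50, label="Max new tokens"),
    ],
).launch()
```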
12 changes: 7 additions & 5 deletions test/test_processing_utils.py
@@ -96,11 +96,7 @@ def test_save_url_to_cache(self, gradio_temp_dir):
        f = processing_utils.save_url_to_cache(url2, cache_dir=gradio_temp_dir)
        assert len([f for f in gradio_temp_dir.glob("**/*") if f.is_file()]) == 2

    def test_save_url_to_cache_with_spaces(self, gradio_temp_dir):
        url = "https://huggingface.co/datasets/freddyaboulton/gradio-reviews/resolve/main00015-20230906102032-7778-Wonderwoman VintageMagStyle _lora_SDXL-VintageMagStyle-Lora_1_, Very detailed, clean, high quality, sharp image.jpg"
        processing_utils.save_url_to_cache(url, cache_dir=gradio_temp_dir)
        assert len([f for f in gradio_temp_dir.glob("**/*") if f.is_file()]) == 1

    @pytest.mark.flaky
    def test_save_url_to_cache_with_redirect(self, gradio_temp_dir):
        url = "https://huggingface.co/datasets/Xenova/transformers.js-docs/resolve/main/bread_small.png"
        processing_utils.save_url_to_cache(url, cache_dir=gradio_temp_dir)
@@ -377,3 +373,9 @@ def test_add_root_url():
    assert (
        processing_utils.add_root_url(expected, new_root_url, root_url) == new_expected
    )


def test_hash_url_encodes_url():
    assert processing_utils.hash_url(
        "https://www.gradio.app/image 1.jpg"
    ) == processing_utils.hash_bytes(b"https://www.gradio.app/image 1.jpg")
