
Add support for async functions and async generators to gr.ChatInterface #5116

Merged
merged 12 commits into main from asynchat on Aug 8, 2023

Conversation

abidlabs
Member

@abidlabs abidlabs commented Aug 7, 2023

Added async support for all of the routes: the submit function, the example function, and the API functions. Test with any async function or generator, e.g.:

import asyncio

import gradio as gr

async def slow_echo(message, history):
    for i in range(len(message)):
        await asyncio.sleep(0.05)  # non-blocking sleep inside the async generator
        yield "You typed: " + message[: i + 1]

demo = gr.ChatInterface(slow_echo, examples=["abcdef"], cache_examples=True).queue()

demo.launch()

Closes: #5115

@vercel

vercel bot commented Aug 7, 2023

The latest updates on your projects:

gradio: ✅ Ready — preview deployed, updated Aug 8, 2023 7:44pm (UTC)

@gradio-pr-bot
Contributor

gradio-pr-bot commented Aug 7, 2023

🦄 change detected

This Pull Request includes changes to the following packages.

Package Version
gradio minor
  • Maintainers can select this checkbox to manually select packages to update.

With the following changelog entry.

Add support for async functions and async generators to gr.ChatInterface

Maintainers or the PR author can modify the PR title to modify this entry.

Something isn't right?

  • Maintainers can change the version label to modify the version bump.
  • If the bot has failed to detect any changes, or if this pull request needs to update multiple packages to different versions or requires a more comprehensive changelog entry, maintainers can update the changelog file directly.

@gradio-pr-bot
Contributor

gradio-pr-bot commented Aug 7, 2023

All the demos for this PR have been deployed at https://huggingface.co/spaces/gradio-pr-deploys/pr-5116-all-demos


You can install the changes in this PR by running:

pip install https://gradio-builds.s3.amazonaws.com/e2fb94d7f53fdfc3050a910de93c5a3b5a080530/gradio-3.39.0-py3-none-any.whl

if self.is_async:
    first_response = await async_iteration(generator)
else:
    first_response = next(generator)
Collaborator


Since we're switching everything to coroutines, any potentially blocking non-async code needs to run in a separate thread; otherwise we'll block the event loop. For example, use anyio.to_thread.run_sync to run the non-async code.

It might be better to use utils.SyncToAsyncIterator for the generators, to match what we do in blocks.

Otherwise this looks good!
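The pattern this review suggests can be sketched with stdlib equivalents. This is an illustrative approximation only: `asyncio.to_thread` stands in for gradio's actual `anyio.to_thread.run_sync`, and the `SyncToAsyncIterator` below is a hypothetical minimal version of gradio's `utils.SyncToAsyncIterator`, not its real source.

```python
import asyncio
import time

_SENTINEL = object()  # marks exhaustion, since StopIteration can't cross the thread boundary

def blocking_fn(message):
    # Simulates a non-async user function that would otherwise block the event loop.
    time.sleep(0.01)
    return "You typed: " + message

class SyncToAsyncIterator:
    """Consume a sync iterator with `async for`, pulling each item in a
    worker thread so the event loop stays free (illustrative stand-in for
    gradio's utils.SyncToAsyncIterator, which is built on anyio)."""

    def __init__(self, iterator):
        self.iterator = iterator

    def __aiter__(self):
        return self

    async def __anext__(self):
        def _next():
            try:
                return next(self.iterator)
            except StopIteration:
                return _SENTINEL

        item = await asyncio.to_thread(_next)
        if item is _SENTINEL:
            raise StopAsyncIteration
        return item

async def main():
    # Plain function: run it in a thread so the loop stays responsive.
    response = await asyncio.to_thread(blocking_fn, "hello")
    # Plain generator: wrap it so it can be iterated asynchronously.
    items = [item async for item in SyncToAsyncIterator(iter(["a", "b"]))]
    return response, items

response, items = asyncio.run(main())
```

The sentinel object avoids raising `StopIteration` inside the worker thread, which would otherwise surface as a `RuntimeError` when it crosses the coroutine boundary.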

Member Author


Ah nice, I was going to ask why we use utils.SyncToAsyncIterator in blocks.py.

@abidlabs
Member Author

abidlabs commented Aug 8, 2023

Ok I've updated to use SyncToAsyncIterator @freddyaboulton. Let me know if you see anything else or if this is good to merge.

if self.is_async:
    response = await self.fn(message, history, *args, **kwargs)
else:
    response = self.fn(message, history, *args, **kwargs)
Collaborator


We need to run the non-coroutine in a separate thread; otherwise we'll block the event loop, since _submit_fn is now async.

If you run this code on your branch, two events can no longer run concurrently:

import random
import gradio as gr
import time

def random_response(message, history):
    time.sleep(5)
    return random.choice(["Yes", "No"])

demo = gr.ChatInterface(random_response)

if __name__ == "__main__":
    demo.launch()

Note how the textbox submit event has to wait (recording: chatbot_concurrency); compare to main (recording: chatbot_concurrency_main).
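The fix this review asks for can be sketched as a dispatch on whether the user function is a coroutine function. Names here (`submit_fn` and the sample handlers) are illustrative, not gradio's actual internals, and the stdlib `asyncio.to_thread` stands in for `anyio.to_thread.run_sync`:

```python
import asyncio
import inspect
import time

async def submit_fn(fn, message, history):
    # Coroutine functions are awaited directly; plain functions are moved to
    # a worker thread, so a blocking call like time.sleep no longer stalls
    # the event loop while other events are running.
    if inspect.iscoroutinefunction(fn):
        return await fn(message, history)
    return await asyncio.to_thread(fn, message, history)

def slow_sync(message, history):
    time.sleep(0.05)  # stands in for a slow, blocking user function
    return "sync: " + message

async def fast_async(message, history):
    return "async: " + message

async def main():
    # Two blocking calls now overlap instead of running back to back.
    start = time.perf_counter()
    a, b = await asyncio.gather(
        submit_fn(slow_sync, "x", []),
        submit_fn(slow_sync, "y", []),
    )
    elapsed = time.perf_counter() - start
    c = await submit_fn(fast_async, "z", [])
    return a, b, c, elapsed

a, b, c, elapsed = asyncio.run(main())
```

With the dispatch in place, the two `slow_sync` calls complete in roughly one sleep interval rather than two, which is the concurrency behavior the recordings above compare.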

Member Author

@abidlabs abidlabs Aug 8, 2023


Ah nice thanks @freddyaboulton, will fix

@gradio-pr-bot
Contributor

gradio-pr-bot commented Aug 8, 2023

🎉 Chromatic build completed!

There are 0 visual changes to review.
There are 0 failed tests to fix.

@abidlabs
Member Author

abidlabs commented Aug 8, 2023

This should now be fixed. I tested with regular functions as well as with iterators (the latter requires enabling queuing and setting a concurrency count > 1), and non-async functions no longer block.

Collaborator

@freddyaboulton freddyaboulton left a comment


Thanks for the enhancement @abidlabs !!

@abidlabs
Member Author

abidlabs commented Aug 8, 2023

Appreciate the guidance around non-blocking code, thanks @freddyaboulton!

@abidlabs abidlabs merged commit 0dc49b4 into main Aug 8, 2023
16 checks passed
@abidlabs abidlabs deleted the asynchat branch August 8, 2023 19:57
@pngwn pngwn mentioned this pull request Aug 9, 2023
Development

Successfully merging this pull request may close these issues.

Add support for async fn to ChatInterface