Add support for async functions and async generators to gr.ChatInterface
#5116
Conversation
All the demos for this PR have been deployed at https://huggingface.co/spaces/gradio-pr-deploys/pr-5116-all-demos. You can install the changes in this PR by running:

```
pip install https://gradio-builds.s3.amazonaws.com/e2fb94d7f53fdfc3050a910de93c5a3b5a080530/gradio-3.39.0-py3-none-any.whl
```
`gradio/chat_interface.py`

```python
if self.is_async:
    first_response = await async_iteration(generator)
else:
    first_response = next(generator)
```
Since we're switching everything to coroutines, any possibly blocking code that's not async needs to run in a separate thread; otherwise we'll block the event loop. So, for example, use `anyio.to_thread.run_sync` to run the non-async code.

It might be better to use `utils.SyncToAsyncIterator` for the generators, to match what we do in blocks.

Otherwise this looks good though!
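The pattern described above can be sketched without gradio. This is an illustrative stand-in only: the names `maybe_await` and `SyncToAsyncIterator` here are hypothetical re-implementations (the PR itself uses `anyio.to_thread.run_sync` and gradio's own `utils.SyncToAsyncIterator`; the standard-library `asyncio.to_thread` is used here to keep the sketch dependency-free):

```python
import asyncio
import inspect

_SENTINEL = object()

async def maybe_await(fn, *args):
    # Await coroutine functions directly; run plain functions in a
    # worker thread so they cannot block the event loop.
    if inspect.iscoroutinefunction(fn):
        return await fn(*args)
    return await asyncio.to_thread(fn, *args)

class SyncToAsyncIterator:
    # Minimal stand-in for gradio's utils.SyncToAsyncIterator: each
    # next() call runs in a worker thread. Passing a default to next()
    # avoids raising StopIteration across the thread boundary.
    def __init__(self, iterator):
        self._iterator = iterator

    def __aiter__(self):
        return self

    async def __anext__(self):
        value = await asyncio.to_thread(next, self._iterator, _SENTINEL)
        if value is _SENTINEL:
            raise StopAsyncIteration
        return value
```

With wrappers like these, a single async code path can consume both sync and async user callables without stalling the loop.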
Ah nice, I was going to ask why we use `utils.SyncToAsyncIterator` in `blocks.py`. Ok, I've updated to use it.
`gradio/chat_interface.py`

```python
if self.is_async:
    response = await self.fn(message, history, *args, **kwargs)
else:
    response = self.fn(message, history, *args, **kwargs)
```
We need to run the non-coroutine function in a separate thread; otherwise we'll block the event loop, because `_submit_fn` is now async.

If you run this code on your branch, two events can't run concurrently anymore:

```python
import random
import time

import gradio as gr

def random_response(message, history):
    time.sleep(5)
    return random.choice(["Yes", "No"])

demo = gr.ChatInterface(random_response)

if __name__ == "__main__":
    demo.launch()
```
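The concurrency behavior at stake can be demonstrated outside gradio. In this sketch (illustrative names; `asyncio.to_thread` stands in for the `anyio.to_thread.run_sync` call the reviewer suggests), two slow sync handlers run concurrently because each one is pushed onto a worker thread rather than executed on the event loop:

```python
import asyncio
import time

def blocking_response(message, history):
    # Stands in for slow, non-async user code.
    time.sleep(0.2)
    return "Yes"

async def submit(message, history):
    # Running the sync fn in a worker thread keeps the event loop
    # free, so other events can still be processed concurrently.
    return await asyncio.to_thread(blocking_response, message, history)

async def main():
    start = time.perf_counter()
    replies = await asyncio.gather(submit("hi", []), submit("hey", []))
    return replies, time.perf_counter() - start
```

If `submit` instead called `blocking_response` directly, the two events would serialize and take roughly twice as long.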
Ah nice thanks @freddyaboulton, will fix
Now it should be fixed. Tested with regular functions as well as with iterators (the latter requires enabling queuing and setting a concurrency count > 1); non-async functions no longer block.
Thanks for the enhancement @abidlabs!!
Appreciate the guidance around non-blocking, thanks @freddyaboulton!
Added async support for all of the routes: the submit function, the example function, and the API functions. Test with any async function or generator, e.g.:
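The following sketch (function names and delays are made up for illustration) shows the two new kinds of callables, an async function and an async generator, of the shape `gr.ChatInterface` now accepts:

```python
import asyncio

async def slow_echo(message, history):
    # Plain async function: awaited directly by the submit coroutine.
    await asyncio.sleep(0.05)
    return f"You said: {message}"

async def stream_echo(message, history):
    # Async generator: yields growing partial responses for streaming.
    partial = ""
    for ch in message:
        await asyncio.sleep(0.01)
        partial += ch
        yield partial
```

Either can then be passed straight to the interface, e.g. `gr.ChatInterface(stream_echo).queue().launch()` (queuing enabled for the streaming case).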
Closes: #5115