KeyError: 'heartbeat' when python client request took more than approximately 16 seconds #6319

Closed · 1 task done
LeoAtlanto opened this issue Nov 7, 2023 · 12 comments · Fixed by #6693
Assignees
Labels
bug (Something isn't working) · gradio_client (Related to one of the gradio client libraries)

Comments

@LeoAtlanto

Describe the bug

Bug Info
I'm working on a project where several hosted Gradio apps are connected through the Python gradio client. One of the hosted apps is fairly complex and takes about 30 seconds to finish a request. Under these circumstances, the error "KeyError: 'heartbeat'" is raised and the request fails.

Solution
/usr/local/lib/python3.10/dist-packages/gradio_client/utils.py
Add a key-value pair to the msg_to_status function below line 136, like this:

    @staticmethod
    def msg_to_status(msg: str) -> Status:
        """Map the raw message from the backend to the status code presented to users."""
        return {
            "send_hash": Status.JOINING_QUEUE,
            "queue_full": Status.QUEUE_FULL,
            "estimation": Status.IN_QUEUE,
            "send_data": Status.SENDING_DATA,
            "process_starts": Status.PROCESSING,
            "process_generating": Status.ITERATING,
            "process_completed": Status.FINISHED,
            "progress": Status.PROGRESS,
            "heartbeat": Status.PROGRESS,      # add this line
        }[msg]
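
If you are patching this locally, a slightly more defensive variant is to fall back to a generic status for any message type the client does not recognize, instead of raising. This is only a sketch of a local workaround, not the upstream fix; using Status.PROGRESS as the fallback is an assumption:

    @staticmethod
    def msg_to_status(msg: str) -> Status:
        """Map the raw message from the backend to the status code presented to users."""
        status_map = {
            "send_hash": Status.JOINING_QUEUE,
            "queue_full": Status.QUEUE_FULL,
            "estimation": Status.IN_QUEUE,
            "send_data": Status.SENDING_DATA,
            "process_starts": Status.PROCESSING,
            "process_generating": Status.ITERATING,
            "process_completed": Status.FINISHED,
            "progress": Status.PROGRESS,
        }
        # Treat unknown protocol messages (e.g. "heartbeat") as progress updates
        # rather than letting the whole prediction crash on a KeyError.
        return status_map.get(msg, Status.PROGRESS)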

Have you searched existing issues? 🔎

  • I have searched and found no existing issues

Reproduction

1. Create two scripts.

(1) host.py

import time
import gradio as gr

def greet(name):
    time.sleep(20)  # simulate a slow request, long enough for the server to send heartbeat messages
    return "Hello " + name + "!"

demo = gr.Interface(fn=greet, inputs="text", outputs="text")

if __name__ == "__main__":
    demo.launch(show_api=False, server_name='0.0.0.0', server_port=12345)

(2) client.py

import gradio as gr
from gradio_client import Client

def greet(name):
    client = Client(src="http://xxx.xxx.xxx.xxx:12345")  # address of the host app
    job = client.submit(name, fn_index=0)
    return job.result()  # blocks until the host finishes; fails with KeyError: 'heartbeat'

demo = gr.Interface(fn=greet, inputs="text", outputs="text")

if __name__ == "__main__":
    demo.launch(show_api=False, server_name='0.0.0.0', server_port=12346)

2. Launch the two apps.

python host.py
python client.py

3. Use the client app to connect to the host app.
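
The second Gradio app is not strictly necessary to trigger the error; calling the slow host directly with gradio_client from a plain Python script should reproduce it as well (a sketch, assuming host.py is already running at the same address):

from gradio_client import Client

client = Client(src="http://xxx.xxx.xxx.xxx:12345")  # the running host.py app
job = client.submit("world", fn_index=0)
print(job.result())  # per this report, raises KeyError: 'heartbeat' once the call runs long enough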

Screenshot

No response

Logs

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/gradio/queueing.py", line 427, in call_prediction
    output = await route_utils.call_process_api(
  File "/usr/local/lib/python3.10/dist-packages/gradio/route_utils.py", line 232, in call_process_api
    output = await app.get_blocks().process_api(
  File "/usr/local/lib/python3.10/dist-packages/gradio/blocks.py", line 1484, in process_api
    result = await self.call_function(
  File "/usr/local/lib/python3.10/dist-packages/gradio/blocks.py", line 1106, in call_function
    prediction = await anyio.to_thread.run_sync(
  File "/usr/local/lib/python3.10/dist-packages/anyio/to_thread.py", line 33, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "/usr/local/lib/python3.10/dist-packages/anyio/_backends/_asyncio.py", line 877, in run_sync_in_worker_thread
    return await future
  File "/usr/local/lib/python3.10/dist-packages/anyio/_backends/_asyncio.py", line 807, in run
    result = context.run(func, *args)
  File "/usr/local/lib/python3.10/dist-packages/gradio/utils.py", line 665, in wrapper
    response = f(*args, **kwargs)
  File "/project/client.py", line 7, in greet
    return job.result()
  File "/usr/local/lib/python3.10/dist-packages/gradio_client/client.py", line 1456, in result
    return super().result(timeout=timeout)
  File "/usr/lib/python3.10/concurrent/futures/_base.py", line 458, in result
    return self.__get_result()
  File "/usr/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
    raise self._exception
  File "/usr/lib/python3.10/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/usr/local/lib/python3.10/dist-packages/gradio_client/client.py", line 869, in _inner
    predictions = _predict(*data)
  File "/usr/local/lib/python3.10/dist-packages/gradio_client/client.py", line 894, in _predict
    result = utils.synchronize_async(self._sse_fn, data, hash_data, helper)
  File "/usr/local/lib/python3.10/dist-packages/gradio_client/utils.py", line 664, in synchronize_async
    return fsspec.asyn.sync(fsspec.asyn.get_loop(), func, *args, **kwargs)  # type: ignore
  File "/usr/local/lib/python3.10/dist-packages/fsspec/asyn.py", line 106, in sync
    raise return_result
  File "/usr/local/lib/python3.10/dist-packages/fsspec/asyn.py", line 61, in _runner
    result[0] = await coro
  File "/usr/local/lib/python3.10/dist-packages/gradio_client/client.py", line 1075, in _sse_fn
    return await utils.get_pred_from_sse(
  File "/usr/local/lib/python3.10/dist-packages/gradio_client/utils.py", line 341, in get_pred_from_sse
    return task.result()
  File "/usr/local/lib/python3.10/dist-packages/gradio_client/utils.py", line 377, in stream_sse
    code=Status.msg_to_status(resp["msg"]),
  File "/usr/local/lib/python3.10/dist-packages/gradio_client/utils.py", line 128, in msg_to_status
    return {
KeyError: 'heartbeat'

System Info

【pip list|grep gradio】
gradio                    4.1.1
gradio_client             0.7.0

【gradio environment】
Gradio Environment Information:
------------------------------
Operating System: Linux
gradio version: 4.1.1
gradio_client version: 0.7.0

------------------------------------------------
gradio dependencies in your environment:

aiofiles: 23.2.1
altair: 5.1.2
fastapi: 0.104.1
ffmpy: 0.3.1
gradio-client==0.7.0 is not installed.
httpx: 0.25.0
huggingface-hub: 0.17.3
importlib-resources: 6.1.0
jinja2: 3.1.2
markupsafe: 2.1.3
matplotlib: 3.7.3
numpy: 1.22.2
orjson: 3.9.10
packaging: 23.1
pandas: 1.5.3
pillow: 9.2.0
pydantic: 2.3.0
pydub: 0.25.1
python-multipart: 0.0.6
pyyaml: 6.0.1
requests: 2.31.0
semantic-version: 2.10.0
tomlkit==0.12.0 is not installed.
typer: 0.9.0
typing-extensions: 4.8.0
uvicorn: 0.23.2
websockets: 11.0.3
authlib; extra == 'oauth' is not installed.
itsdangerous; extra == 'oauth' is not installed.


gradio_client dependencies in your environment:

fsspec: 2023.6.0
httpx: 0.25.0
huggingface-hub: 0.17.3
packaging: 23.1
requests: 2.31.0
typing-extensions: 4.8.0
websockets: 11.0.3

Severity

I can work around it

@LeoAtlanto added the bug (Something isn't working) label on Nov 7, 2023
@freddyaboulton
Collaborator

Do you want to open a PR to fix it, @LeoAtlanto?

@abidlabs added the gradio_client (Related to one of the gradio client libraries) label on Nov 7, 2023
@LeoAtlanto
Author

Do you want to open a PR to fix it, @LeoAtlanto?

I'd like to, and I can do that tomorrow.

@stokesj0001

I was able to reproduce this bug and validate the fix.

@ryanchesler

I've hit a similar error, except the missing entry is "log"; I made a similar fix. Should I open a PR for this if no one else has yet?

@zetyquickly

I tried the fix, and it does solve the issue at that line of code. But we need to make sure that the "heartbeat" response is handled correctly everywhere else in the repo.

After implementing the workaround, I started getting JSONDecodeError exceptions in gradio_client, while the client's stdout shows "heartbeat" response entries.
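
For illustration, the kind of handling being asked for here would have to live in the client's SSE loop rather than only in msg_to_status. A rough sketch of the idea, assuming each SSE "data:" line carries one JSON payload; the function below is illustrative and not part of the actual gradio_client code:

import json

def handle_sse_line(line: str):
    """Parse one SSE 'data:' line and decide whether it is a real status update."""
    if not line.startswith("data:"):
        return None  # ignore empty/comment lines used purely as keep-alives
    payload = json.loads(line[len("data:"):])
    if payload.get("msg") in ("heartbeat", "log"):
        return None  # protocol-level events: nothing to surface to the caller
    return payload  # real updates would then go through Status.msg_to_status(payload["msg"])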

@pseudotensor
Contributor

pseudotensor commented Nov 28, 2023

It's been 2-3 weeks with this bug; can we have a fix, please? Thanks!

I'm unable to work around it, and it blocks all usage of gradio 4 after I spent some time upgrading.

This should be marked as a major regression.

@pseudotensor
Contributor

pseudotensor commented Nov 28, 2023

With "work-around" yes jsondecodeerror:

"""
Traceback (most recent call last):
  File "/data/conda/h2ogpt/lib/python3.10/concurrent/futures/process.py", line 243, in _process_worker
    r = call_item.fn(*call_item.args, **call_item.kwargs)
  File "/home/jon/h2ogpt/src/utils.py", line 963, in _traced_func
    return func(*args, **kwargs)
  File "/home/jon/h2ogpt/tests/test_client_calls.py", line 3901, in test_client1_tts
    res = client.predict(str(dict(kwargs)), api_name='/submit_nochat_api')
  File "/data/conda/h2ogpt/lib/python3.10/site-packages/gradio_client/client.py", line 305, in predict
    return self.submit(*args, api_name=api_name, fn_index=fn_index).result()
  File "/data/conda/h2ogpt/lib/python3.10/site-packages/gradio_client/client.py", line 1456, in result
    return super().result(timeout=timeout)
  File "/data/conda/h2ogpt/lib/python3.10/concurrent/futures/_base.py", line 445, in result
    return self.__get_result()
  File "/data/conda/h2ogpt/lib/python3.10/concurrent/futures/_base.py", line 390, in __get_result
    raise self._exception
  File "/data/conda/h2ogpt/lib/python3.10/concurrent/futures/thread.py", line 52, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/data/conda/h2ogpt/lib/python3.10/site-packages/gradio_client/client.py", line 869, in _inner
    predictions = _predict(*data)
  File "/data/conda/h2ogpt/lib/python3.10/site-packages/gradio_client/client.py", line 894, in _predict
    result = utils.synchronize_async(self._sse_fn, data, hash_data, helper)
  File "/data/conda/h2ogpt/lib/python3.10/site-packages/gradio_client/utils.py", line 665, in synchronize_async
    return fsspec.asyn.sync(fsspec.asyn.get_loop(), func, *args, **kwargs)  # type: ignore
  File "/data/conda/h2ogpt/lib/python3.10/site-packages/fsspec/asyn.py", line 103, in sync
    raise return_result
  File "/data/conda/h2ogpt/lib/python3.10/site-packages/fsspec/asyn.py", line 56, in _runner
    result[0] = await coro
  File "/data/conda/h2ogpt/lib/python3.10/site-packages/gradio_client/client.py", line 1075, in _sse_fn
    return await utils.get_pred_from_sse(
  File "/data/conda/h2ogpt/lib/python3.10/site-packages/gradio_client/utils.py", line 342, in get_pred_from_sse
    return task.result()
  File "/data/conda/h2ogpt/lib/python3.10/site-packages/gradio_client/utils.py", line 374, in stream_sse
    resp = json.loads(line[5:])
  File "/data/conda/h2ogpt/lib/python3.10/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
  File "/data/conda/h2ogpt/lib/python3.10/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/data/conda/h2ogpt/lib/python3.10/json/decoder.py", line 353, in raw_decode
    obj, end = self.scan_once(s, idx)
json.decoder.JSONDecodeError: Unterminated string starting at: line 1 column 52 (char 51)
"""

Seems like the message is truncated to 65k.

@pseudotensor
Contributor

Seems like the message is truncated to 65k.

So isn't that a server problem, not a client problem?

pseudotensor added a commit to h2oai/h2ogpt that referenced this issue Nov 28, 2023
…However, return results still truncated to 65k making audio return impossible
@pseudotensor
Contributor

I added the heartbeat work-around, but the truncation is super critical. Any chance of a fix soon?

@pseudotensor
Contributor

Note that this doesn't just affect audio; the truncation applies to all messages. I'll post a separate issue.

@djaym7

djaym7 commented Nov 30, 2023

I'm getting this error as well.

@iosonopersia

iosonopersia commented Dec 1, 2023

I'm getting the same error as @ryanchesler; it appeared after I added a gr.Info("...") message to the function in the Gradio app backend.

I've hit a similar error, except the missing entry is "log"; I made a similar fix. Should I open a PR for this if no one else has yet?
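
As the two comments above suggest, gr.Info appears to result in a "log" event being sent to connected clients. A minimal way to reproduce that variant would be to add a gr.Info call to the slow function from host.py above (a sketch; the message text is arbitrary):

import time
import gradio as gr

def greet(name):
    gr.Info("Starting the slow step...")  # surfaces a message to clients; reportedly triggers the "log" event
    time.sleep(20)
    return "Hello " + name + "!"

demo = gr.Interface(fn=greet, inputs="text", outputs="text")

if __name__ == "__main__":
    demo.launch(show_api=False, server_name="0.0.0.0", server_port=12345)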
