
Fix bug in openai_aip.py caused by Pydantic #406

Closed
wants to merge 1 commit into from
Conversation

chzhyang

Fix bug in chunk.json() caused by Pydantic
#341
#353

Environment

Python 3.9 and Python 3.10

Reproduce

Send a streaming request with client.py:

import openai
if __name__ == "__main__":
    openai.api_base = "http://localhost:8000/v1"
    openai.api_key = "none"
    for chunk in openai.ChatCompletion.create(
        model="chatglm2-6b",
        messages=[
            {
                "role": "system",
                "content": "You are a helpful assistant."
            },
            {
                "role": "user",
                "content": "Hello!"
            }
        ],
        stream=True
    ):
        if hasattr(chunk.choices[0].delta, "content"):
            print(chunk.choices[0].delta.content, end="", flush=True)
$ python client.py 
Traceback (most recent call last):
  File "/home/sdp/miniconda3/envs/chatglm/lib/python3.9/site-packages/urllib3/response.py", line 710, in _error_catcher
    yield
  File "/home/sdp/miniconda3/envs/chatglm/lib/python3.9/site-packages/urllib3/response.py", line 1077, in read_chunked
    self._update_chunk_length()
  File "/home/sdp/miniconda3/envs/chatglm/lib/python3.9/site-packages/urllib3/response.py", line 1012, in _update_chunk_length
    raise InvalidChunkLength(self, line) from None
urllib3.exceptions.InvalidChunkLength: InvalidChunkLength(got length b'', 0 bytes read)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/sdp/miniconda3/envs/chatglm/lib/python3.9/site-packages/requests/models.py", line 816, in generate
    yield from self.raw.stream(chunk_size, decode_content=True)
  File "/home/sdp/miniconda3/envs/chatglm/lib/python3.9/site-packages/urllib3/response.py", line 937, in stream
    yield from self.read_chunked(amt, decode_content=decode_content)
  File "/home/sdp/miniconda3/envs/chatglm/lib/python3.9/site-packages/urllib3/response.py", line 1106, in read_chunked
    self._original_response.close()
  File "/home/sdp/miniconda3/envs/chatglm/lib/python3.9/contextlib.py", line 137, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/home/sdp/miniconda3/envs/chatglm/lib/python3.9/site-packages/urllib3/response.py", line 727, in _error_catcher
    raise ProtocolError(f"Connection broken: {e!r}", e) from e
urllib3.exceptions.ProtocolError: ("Connection broken: InvalidChunkLength(got length b'', 0 bytes read)", InvalidChunkLength(got length b'', 0 bytes read))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/sdp/ChatGLM2-6B/client.py", line 5, in <module>
    for chunk in openai.ChatCompletion.create(
  File "/home/sdp/miniconda3/envs/chatglm/lib/python3.9/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 166, in <genexpr>
    return (
  File "/home/sdp/miniconda3/envs/chatglm/lib/python3.9/site-packages/openai/api_requestor.py", line 692, in <genexpr>
    return (
  File "/home/sdp/miniconda3/envs/chatglm/lib/python3.9/site-packages/openai/api_requestor.py", line 115, in parse_stream
    for line in rbody:
  File "/home/sdp/miniconda3/envs/chatglm/lib/python3.9/site-packages/requests/models.py", line 865, in iter_lines
    for chunk in self.iter_content(
  File "/home/sdp/miniconda3/envs/chatglm/lib/python3.9/site-packages/requests/models.py", line 818, in generate
    raise ChunkedEncodingError(e)
requests.exceptions.ChunkedEncodingError: ("Connection broken: InvalidChunkLength(got length b'', 0 bytes read)", InvalidChunkLength(got length b'', 0 bytes read))
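The client-side ChunkedEncodingError is a symptom rather than the root cause: the server's SSE generator raises while serializing a chunk (Pydantic v2 deprecated BaseModel.json() and rejects the old v1 keyword arguments such as ensure_ascii), so the connection is torn down before the zero-length terminating chunk of the HTTP chunked encoding is sent. A minimal sketch of that failure mode, with hypothetical stream contents and no real server:

```python
def sse_stream():
    """Hypothetical SSE generator that dies mid-stream, the way a
    Pydantic v2 serialization error would kill the real one."""
    yield "data: {\"delta\": \"Hello\"}\n\n"
    # Stand-in for e.g. the TypeError raised when v1-style
    # chunk.json(ensure_ascii=False) is called on a Pydantic v2 model:
    raise TypeError("serialization failed")


received = []
try:
    for event in sse_stream():
        received.append(event)
except TypeError:
    pass

# Only the first event arrives; over HTTP, the client then hits the
# half-closed connection and reports InvalidChunkLength(got length b'').
print(received)
```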

@laoshancun

#415
Maybe model_dump_json() is better than json.dumps(chunk.model_dump()).
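One version-agnostic way to sidestep the rename (a sketch, not this PR's actual patch; the dump_json name and the exclude_unset flag are illustrative choices, not taken from the serving code):

```python
def dump_json(model) -> str:
    """Serialize a Pydantic model under either major version.

    Pydantic v2 renamed BaseModel.json() to model_dump_json() and
    dropped keyword arguments such as ensure_ascii, so serving code
    written against v1 breaks once v2 is installed.
    """
    if hasattr(model, "model_dump_json"):  # Pydantic v2
        return model.model_dump_json(exclude_unset=True)
    return model.json(exclude_unset=True, ensure_ascii=False)  # Pydantic v1
```

Calling model_dump_json() directly is also cheaper than json.dumps(chunk.model_dump()), since it serializes in one pass instead of building an intermediate dict first.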

@chzhyang chzhyang closed this by deleting the head repository Jan 10, 2024