
[BUG/Help] <openai_api> does not work in my case #353

Open
1 task done
I-E-E-E opened this issue Jul 21, 2023 · 1 comment

Comments

@I-E-E-E

I-E-E-E commented Jul 21, 2023

Is there an existing issue for this?

  • I have searched the existing issues

Current Behavior

```
Traceback (most recent call last):
  File "/opt/conda/envs/g/lib/python3.9/site-packages/urllib3/response.py", line 710, in _error_catcher
    yield
  File "/opt/conda/envs/g/lib/python3.9/site-packages/urllib3/response.py", line 1077, in read_chunked
    self._update_chunk_length()
  File "/opt/conda/envs/g/lib/python3.9/site-packages/urllib3/response.py", line 1012, in _update_chunk_length
    raise InvalidChunkLength(self, line) from None
urllib3.exceptions.InvalidChunkLength: InvalidChunkLength(got length b'', 0 bytes read)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/conda/envs/g/lib/python3.9/site-packages/requests/models.py", line 816, in generate
    yield from self.raw.stream(chunk_size, decode_content=True)
  File "/opt/conda/envs/g/lib/python3.9/site-packages/urllib3/response.py", line 937, in stream
    yield from self.read_chunked(amt, decode_content=decode_content)
  File "/opt/conda/envs/g/lib/python3.9/site-packages/urllib3/response.py", line 1106, in read_chunked
    self._original_response.close()
  File "/opt/conda/envs/g/lib/python3.9/contextlib.py", line 137, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/opt/conda/envs/g/lib/python3.9/site-packages/urllib3/response.py", line 727, in _error_catcher
    raise ProtocolError(f"Connection broken: {e!r}", e) from e
urllib3.exceptions.ProtocolError: ("Connection broken: InvalidChunkLength(got length b'', 0 bytes read)", InvalidChunkLength(got length b'', 0 bytes read))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/share/ad/wangsiyuan09/test.py", line 5, in <module>
    for chunk in openai.ChatCompletion.create(
  File "/opt/conda/envs/g/lib/python3.9/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 166, in <genexpr>
    return (
  File "/opt/conda/envs/g/lib/python3.9/site-packages/openai/api_requestor.py", line 692, in <genexpr>
    return (
  File "/opt/conda/envs/g/lib/python3.9/site-packages/openai/api_requestor.py", line 115, in parse_stream
    for line in rbody:
  File "/opt/conda/envs/g/lib/python3.9/site-packages/requests/models.py", line 865, in iter_lines
    for chunk in self.iter_content(
  File "/opt/conda/envs/g/lib/python3.9/site-packages/requests/models.py", line 818, in generate
    raise ChunkedEncodingError(e)
requests.exceptions.ChunkedEncodingError: ("Connection broken: InvalidChunkLength(got length b'', 0 bytes read)", InvalidChunkLength(got length b'', 0 bytes read))
```

Expected Behavior

No response

Steps To Reproduce

```python
import openai

if __name__ == "__main__":
    openai.api_base = "http://localhost:8000/v1"
    openai.api_key = "none"
    for chunk in openai.ChatCompletion.create(
        model="chatglm2-6b",
        messages=[
            {"role": "user", "content": "你好"}
        ],
        stream=True
    ):
        if hasattr(chunk.choices[0].delta, "content"):
            print(chunk.choices[0].delta.content, end="", flush=True)
```

Environment

- OS: 
- Python:
- Transformers:
- PyTorch:
- CUDA Support (`python -c "import torch; print(torch.cuda.is_available())"`) :

Anything else?

No response

@TGLTommy

TGLTommy commented Jul 29, 2023

@I-E-E-E

This is mainly caused by a pydantic version change: some parameters of the `.json()` method were changed or removed. The fix is as follows:
Edit the script `openai_api.py` in 3 places:
Replace every occurrence of: `yield "{}".format(chunk.json(exclude_unset=True, ensure_ascii=False))`
with: `yield json.dumps(chunk.dict(exclude_unset=True), ensure_ascii=False)`
Note: you also need to add `import json` at the top of the file.
Then run it again and everything should work :)
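The patched line can be sketched without pydantic installed. The `chunk_dict` below is a hypothetical stand-in for what `chunk.dict(exclude_unset=True)` returns (the field names mirror the OpenAI-style streaming chunk consumed in the repro script); the point is that plain `json.dumps` with `ensure_ascii=False` produces valid, readable JSON regardless of the installed pydantic version:

```python
import json

# Hypothetical stand-in for chunk.dict(exclude_unset=True) -- field names
# mirror the OpenAI-style streaming response used in the issue's repro code.
chunk_dict = {
    "object": "chat.completion.chunk",
    "choices": [{"index": 0, "delta": {"content": "你好"}}],
}

# The patched yield line: serialize the plain dict with json.dumps instead
# of pydantic's .json(); ensure_ascii=False keeps the Chinese text readable
# in the SSE stream instead of escaping it to \uXXXX sequences.
payload = json.dumps(chunk_dict, ensure_ascii=False)
print(payload)
```

Because `json.dumps` only sees a plain dict, this sidesteps the pydantic v1→v2 API change entirely.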
