
[BUG] Using the Tongyi Qianwen (Qwen) API mode for LLM chat returns the error: API communication encountered an error: peer closed connection without sending complete message body (incomplete chunked read) #1986

Closed
Jackle910 opened this issue Nov 7, 2023 · 2 comments
Labels
bug Something isn't working

Comments


Jackle910 commented Nov 7, 2023

Problem Description
Using the Tongyi Qianwen (Qwen) API mode for LLM chat returns the error: API communication encountered an error: peer closed connection without sending complete message body (incomplete chunked read)

Steps to Reproduce
Start the service and begin a chat.

Expected Result
/

Actual Result
Error: API communication encountered an error: peer closed connection without sending complete message body (incomplete chunked read)

Environment Information

OS: Linux-4.19.91-009.ali4000.alios7.x86_64-x86_64-with-glibc2.17
Python version: 3.8.16 (default, Jun 12 2023, 18:09:05)
[GCC 11.2.0]
Project version: v0.2.7-preview
langchain version: 0.0.331; fastchat version: 0.2.32


Current text splitter: ChineseRecursiveTextSplitter
Currently running LLM model: ['qwen-api'] @ cpu
{'api_key': 'sk-**************',
 'device': 'auto',
 'host': '0.0.0.0',
 'infer_turbo': False,
 'online_api': True,
 'port': 21006,
 'provider': 'QwenWorker',
 'version': 'qwen-turbo',
 'worker_class': <class 'server.model_workers.qwen.QwenWorker'>}
Current embeddings model: m3e-base @ cpu

Additional Information
curl-ing local port 21006 directly returns: {"detail":"Not Found"}
Installed dependency: pip install dashscope
Console error output:

ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/opt/conda/lib/python3.8/site-packages/uvicorn/protocols/http/httptools_impl.py", line 426, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "/opt/conda/lib/python3.8/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
    return await self.app(scope, receive, send)
  File "/opt/conda/lib/python3.8/site-packages/fastapi/applications.py", line 1106, in __call__
    await super().__call__(scope, receive, send)
  File "/opt/conda/lib/python3.8/site-packages/starlette/applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/opt/conda/lib/python3.8/site-packages/starlette/middleware/errors.py", line 184, in __call__
    raise exc
  File "/opt/conda/lib/python3.8/site-packages/starlette/middleware/errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "/opt/conda/lib/python3.8/site-packages/starlette/middleware/exceptions.py", line 79, in __call__
    raise exc
  File "/opt/conda/lib/python3.8/site-packages/starlette/middleware/exceptions.py", line 68, in __call__
    await self.app(scope, receive, sender)
  File "/opt/conda/lib/python3.8/site-packages/fastapi/middleware/asyncexitstack.py", line 20, in __call__
    raise e
  File "/opt/conda/lib/python3.8/site-packages/fastapi/middleware/asyncexitstack.py", line 17, in __call__
    await self.app(scope, receive, send)
  File "/opt/conda/lib/python3.8/site-packages/starlette/routing.py", line 718, in __call__
    await route.handle(scope, receive, send)
  File "/opt/conda/lib/python3.8/site-packages/starlette/routing.py", line 276, in handle
    await self.app(scope, receive, send)
  File "/opt/conda/lib/python3.8/site-packages/starlette/routing.py", line 69, in app
    await response(scope, receive, send)
  File "/opt/conda/lib/python3.8/site-packages/starlette/responses.py", line 277, in __call__
    await wrap(partial(self.listen_for_disconnect, receive))
  File "/opt/conda/lib/python3.8/site-packages/anyio/_backends/_asyncio.py", line 597, in __aexit__
    raise exceptions[0]
  File "/opt/conda/lib/python3.8/site-packages/starlette/responses.py", line 273, in wrap
    await func()
  File "/opt/conda/lib/python3.8/site-packages/starlette/responses.py", line 262, in stream_response
    async for chunk in self.body_iterator:
  File "/mnt/workspace/Langchain-Chatchat/server/chat/chat.py", line 39, in chat_iterator
    model = get_ChatOpenAI(
  File "/mnt/workspace/Langchain-Chatchat/server/utils.py", line 46, in get_ChatOpenAI
    model = ChatOpenAI(
  File "/opt/conda/lib/python3.8/site-packages/langchain/load/serializable.py", line 97, in __init__
    super().__init__(**kwargs)
  File "pydantic/main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for ChatOpenAI
__root__
  `openai` has no `ChatCompletion` attribute, this is likely due to an old version of the openai package. Try upgrading it with `pip install --upgrade openai`. (type=value_error)
2023-11-07 16:49:46,949 - utils.py[line:187] - ERROR: RemoteProtocolError: API通信遇到错误:peer closed connection without sending complete message body (incomplete chunked read)
{'base_url': 'http://127.0.0.1:7861', 'timeout': 300.0, 'proxies': {'all://127.0.0.1': None, 'all://localhost': None, 'http://127.0.0.1': None, 'http://': None, 'https://': None, 'all://': None}}
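Note that the pydantic ValidationError in the traceback, not the chunked-read error, is the likely root cause: langchain 0.0.331's ChatOpenAI wrapper looks for the legacy module-level `openai.ChatCompletion` attribute, which was removed in openai >= 1.0.0. The error message suggests upgrading openai, but the opposite (pinning it below 1.0) is a commonly reported workaround for this langchain version; this is my reading of the traceback, not something confirmed in this thread. A minimal sketch of the version check:

```python
# Sketch (assumption): the installed openai package only exposes the
# legacy `openai.ChatCompletion` attribute below version 1.0.0, which
# is what langchain 0.0.331's ChatOpenAI wrapper validates against.

def uses_legacy_chat_completion(openai_version: str) -> bool:
    """Return True if this openai version still ships the legacy
    ChatCompletion interface, i.e. its major version is below 1."""
    major = int(openai_version.split(".", 1)[0])
    return major < 1

if __name__ == "__main__":
    # openai 0.28.x keeps ChatCompletion; 1.x removes it.
    for version in ("0.28.1", "1.1.1"):
        print(version, uses_legacy_chat_completion(version))
```

If the check reports an incompatible (>= 1.0) install, `pip install "openai<1.0"` is one reported workaround; updating Langchain-Chatchat itself, as the maintainers suggest in the replies, is the other route.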
@Jackle910 Jackle910 added the bug Something isn't working label Nov 7, 2023

lifx2015 commented Nov 7, 2023

I ran into this problem as well.

@zRzRzRzRzRzRzR (Collaborator)

Update to 0.2.7 or the dev branch to resolve this issue; it runs correctly for us on dev.
