
[FEATURE] When will qwen-vl-chat be supported? Either the OpenAI API or the Chatchat API would work; could someone point out where to change the code to add support? #2969

Closed
Liam-Ji opened this issue Feb 8, 2024 · 9 comments
Labels: enhancement (New feature or request)

Liam-Ji commented Feb 8, 2024

Feature Description
Support qwen-vl-chat.

@Liam-Ji Liam-Ji added the enhancement New feature or request label Feb 8, 2024
dosubot bot commented Feb 8, 2024

🤖

Hi @Liam-Ji! Good to see you again; hope everything has been going well!

You can add support for qwen-vl-chat in Langchain-Chatchat's configuration file. Specifically, add "qwen-vl-chat" to the LLM_MODELS list, and add the corresponding model path to the "llm_model" dictionary. For example:

LLM_MODELS = ["chatglm3-6b", "zhipu-api", "openai-api", "qwen-vl-chat"]

"llm_model": {
    ...
    "qwen-vl-chat": "Qwen/qwen-vl-chat",
    ...
}

If you also want to support the OpenAI API or the ChatChat API, add the corresponding API configuration to the ONLINE_LLM_MODEL dictionary. For example:

ONLINE_LLM_MODEL = {
    ...
    "openai-api": {
        "model_name": "gpt-4",
        "api_base_url": "https://api.openai.com/v1",
        "api_key": "your_openai_api_key",
        "openai_proxy": "",
    },
    "chatchat-api": {
        "model_name": "chatchat",
        "api_base_url": "https://api.chatchat.com/v1",
        "api_key": "your_chatchat_api_key",
        "chatchat_proxy": "",
    },
    ...
}

Note that you need to replace "your_openai_api_key" and "your_chatchat_api_key" with your actual API keys.
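As an aside, a common pattern (not something the Langchain-Chatchat config requires) is to read keys from the environment instead of hardcoding them into the config file. A minimal sketch, with illustrative names only:

```python
import os

# Read the API key from the environment so it never lives in the config file.
# OPENAI_API_KEY is a conventional variable name, not one this project defines.
openai_key = os.environ.get("OPENAI_API_KEY", "")

ONLINE_LLM_MODEL = {
    "openai-api": {
        "model_name": "gpt-4",
        "api_base_url": "https://api.openai.com/v1",
        "api_key": openai_key,
        "openai_proxy": "",
    },
}
```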

This information can be found in configs/model_config.py.example.

If you run into any problems while implementing this, feel free to ask.


@zRzRzRzRzRzRzR zRzRzRzRzRzRzR self-assigned this Feb 8, 2024
zRzRzRzRzRzRzR (Collaborator)

But there is no multimodal knowledge base, so what would be the point of adding a multimodal model?

Liam-Ji (Author) commented Feb 8, 2024

Here's my scenario: I want to recognize the content of an image scene and use it as a label. qwen-vl-chat cannot output the label directly, but it can describe the image's content, and that description can then be matched against the knowledge base to obtain the label. Step one uses qwen-vl-chat to get a description; step two uses qwen-vl-chat plus the knowledge base to get the label. So I'd like to know whether LangChain-Chatchat can support qwen-vl-chat.
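The two-step workflow described above can be sketched as follows. This is a hypothetical outline, not LangChain-Chatchat code: `describe_image` is a stub standing in for a qwen-vl-chat call, and the knowledge-base match uses `difflib.SequenceMatcher` where a real system would use embedding similarity.

```python
from difflib import SequenceMatcher

def describe_image(image_path: str) -> str:
    # Stand-in for step one: a real implementation would send the image to
    # qwen-vl-chat and return the model's textual description of the scene.
    return "a cat sitting on a red sofa"

def match_label(description: str, knowledge_base: dict[str, str]) -> str:
    # Step two: pick the label whose reference text is most similar to the
    # description. SequenceMatcher is a toy stand-in for vector retrieval.
    def score(text: str) -> float:
        return SequenceMatcher(None, description, text).ratio()
    return max(knowledge_base, key=lambda label: score(knowledge_base[label]))

knowledge_base = {
    "indoor-pet": "a cat sitting on furniture indoors",
    "street-scene": "cars and people on a city street",
}

description = describe_image("scene.png")   # hypothetical file name
label = match_label(description, knowledge_base)
```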

Liam-Ji (Author) commented Feb 8, 2024

A multimodal knowledge base isn't needed; the existing text knowledge base is enough.

Liam-Ji (Author) commented Feb 8, 2024

After sending "你是谁" ("Who are you?"), the backend produced the error below, and only the first two characters of the reply, "我是" ("I am"), came back.
2024-02-08 22:37:29 | INFO | httpx | HTTP Request: POST http://127.0.0.1:20002/worker_generate_stream "HTTP/1.1 200 OK"
JILI******************:stream1: 我是
ERROR: Exception in ASGI application
Traceback (most recent call last):
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/responses.py", line 259, in call
await wrap(partial(self.listen_for_disconnect, receive))
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/responses.py", line 255, in wrap
await func()
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/responses.py", line 232, in listen_for_disconnect
message = await receive()
^^^^^^^^^^^^^^^
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 587, in receive
await self.message_event.wait()
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/asyncio/locks.py", line 213, in wait
await fut
asyncio.exceptions.CancelledError: Cancelled by cancel scope 7fb2109ea910

During handling of the above exception, another exception occurred:

  • Exception Group Traceback (most recent call last):
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 426, in run_asgi
    | result = await app( # type: ignore[func-returns-value]
    | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in call
    | return await self.app(scope, receive, send)
    | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in call
    | await super().call(scope, receive, send)
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/applications.py", line 116, in call
    | await self.middleware_stack(scope, receive, send)
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in call
    | raise exc
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in call
    | await self.app(scope, receive, _send)
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 62, in call
    | await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/_exception_handler.py", line 55, in wrapped_app
    | raise exc
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/_exception_handler.py", line 44, in wrapped_app
    | await app(scope, receive, sender)
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/routing.py", line 746, in call
    | await route.handle(scope, receive, send)
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/routing.py", line 288, in handle
    | await self.app(scope, receive, send)
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/routing.py", line 75, in app
    | await wrap_app_handling_exceptions(app, request)(scope, receive, send)
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/_exception_handler.py", line 55, in wrapped_app
    | raise exc
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/_exception_handler.py", line 44, in wrapped_app
    | await app(scope, receive, sender)
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/routing.py", line 73, in app
    | await response(scope, receive, send)
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/responses.py", line 252, in call
    | async with anyio.create_task_group() as task_group:
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 678, in aexit
    | raise BaseExceptionGroup(
    | ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
    +-+---------------- 1 ----------------
    | Traceback (most recent call last):
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/responses.py", line 255, in wrap
    | await func()
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/responses.py", line 244, in stream_response
    | async for chunk in self.body_iterator:
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/concurrency.py", line 57, in iterate_in_threadpool
    | yield await anyio.to_thread.run_sync(_next, iterator)
    | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/anyio/to_thread.py", line 56, in run_sync
    | return await get_async_backend().run_sync_in_worker_thread(
    | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 2134, in run_sync_in_worker_thread
    | return await future
    | ^^^^^^^^^^^^
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 851, in run
    | result = context.run(func, *args)
    | ^^^^^^^^^^^^^^^^^^^^^^^^
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/concurrency.py", line 47, in _next
    | return next(iterator)
    | ^^^^^^^^^^^^^^
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/fastchat/serve/model_worker.py", line 109, in generate_stream_gate
    | for output in self.generate_stream_func(
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/torch/utils/_contextlib.py", line 56, in generator_context
    | response = gen.send(request)
    | ^^^^^^^^^^^^^^^^^
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/fastchat/serve/inference.py", line 160, in generate_stream
    | out = model(
    | ^^^^^^
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
    | return self._call_impl(*args, **kwargs)
    | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
    | return forward_call(*args, **kwargs)
    | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    | File "/home/liam/.cache/huggingface/modules/transformers_modules/Qwen-VL-Chat/modeling_qwen.py", line 856, in forward
    | transformer_outputs = self.transformer(
    | ^^^^^^^^^^^^^^^^^
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
    | return self._call_impl(*args, **kwargs)
    | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
    | return forward_call(*args, **kwargs)
    | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    | File "/home/liam/.cache/huggingface/modules/transformers_modules/Qwen-VL-Chat/modeling_qwen.py", line 704, in forward
    | outputs = block(
    | ^^^^^^
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
    | return self._call_impl(*args, **kwargs)
    | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
    | return forward_call(*args, **kwargs)
    | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    | File "/home/liam/.cache/huggingface/modules/transformers_modules/Qwen-VL-Chat/modeling_qwen.py", line 371, in forward
    | attn_outputs = self.attn(
    | ^^^^^^^^^^
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
    | return self._call_impl(*args, **kwargs)
    | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
    | return forward_call(*args, **kwargs)
    | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    | File "/home/liam/.cache/huggingface/modules/transformers_modules/Qwen-VL-Chat/modeling_qwen.py", line 303, in forward
    | attn_output, attn_weight = self._attn(
    | ^^^^^^^^^^^
    | File "/home/liam/.cache/huggingface/modules/transformers_modules/Qwen-VL-Chat/modeling_qwen.py", line 170, in _attn
    | attn_weights = attn_weights + attention_mask
    | ~~~~~~~~~~~~~^~~~~~~~~~~~~~~~
    | TypeError: unsupported operand type(s) for +: 'Tensor' and 'NoneType'
    +------------------------------------
    2024-02-08 22:37:30 | ERROR | stderr | ERROR: Exception in ASGI application
    2024-02-08 22:37:30 | ERROR | stderr | Traceback (most recent call last):
    2024-02-08 22:37:30 | ERROR | stderr | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/responses.py", line 259, in call
    2024-02-08 22:37:30 | ERROR | stderr | await wrap(partial(self.listen_for_disconnect, receive))
    2024-02-08 22:37:30 | ERROR | stderr | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/responses.py", line 255, in wrap
    2024-02-08 22:37:30 | ERROR | stderr | await func()
    2024-02-08 22:37:30 | ERROR | stderr | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/responses.py", line 232, in listen_for_disconnect
    2024-02-08 22:37:30 | ERROR | stderr | message = await receive()
    2024-02-08 22:37:30 | ERROR | stderr | ^^^^^^^^^^^^^^^
    2024-02-08 22:37:30 | ERROR | stderr | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 587, in receive
    2024-02-08 22:37:30 | ERROR | stderr | await self.message_event.wait()
    2024-02-08 22:37:30 | ERROR | stderr | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/asyncio/locks.py", line 213, in wait
    2024-02-08 22:37:30 | ERROR | stderr | await fut
    2024-02-08 22:37:30 | ERROR | stderr | asyncio.exceptions.CancelledError: Cancelled by cancel scope 7f3f62825590
    2024-02-08 22:37:30 | ERROR | stderr |
    2024-02-08 22:37:30 | ERROR | stderr | During handling of the above exception, another exception occurred:
    2024-02-08 22:37:30 | ERROR | stderr |
    2024-02-08 22:37:30 | ERROR | stderr | + Exception Group Traceback (most recent call last):
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 426, in run_asgi
    2024-02-08 22:37:30 | ERROR | stderr | | result = await app( # type: ignore[func-returns-value]
    2024-02-08 22:37:30 | ERROR | stderr | | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in call
    2024-02-08 22:37:30 | ERROR | stderr | | return await self.app(scope, receive, send)
    2024-02-08 22:37:30 | ERROR | stderr | | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in call
    2024-02-08 22:37:30 | ERROR | stderr | | await super().call(scope, receive, send)
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/applications.py", line 116, in call
    2024-02-08 22:37:30 | ERROR | stderr | | await self.middleware_stack(scope, receive, send)
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in call
    2024-02-08 22:37:30 | ERROR | stderr | | raise exc
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in call
    2024-02-08 22:37:30 | ERROR | stderr | | await self.app(scope, receive, _send)
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/middleware/cors.py", line 83, in call
    2024-02-08 22:37:30 | ERROR | stderr | | await self.app(scope, receive, send)
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 62, in call
    2024-02-08 22:37:30 | ERROR | stderr | | await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/_exception_handler.py", line 55, in wrapped_app
    2024-02-08 22:37:30 | ERROR | stderr | | raise exc
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/_exception_handler.py", line 44, in wrapped_app
    2024-02-08 22:37:30 | ERROR | stderr | | await app(scope, receive, sender)
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/routing.py", line 746, in call
    2024-02-08 22:37:30 | ERROR | stderr | | await route.handle(scope, receive, send)
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/routing.py", line 288, in handle
    2024-02-08 22:37:30 | ERROR | stderr | | await self.app(scope, receive, send)
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/routing.py", line 75, in app
    2024-02-08 22:37:30 | ERROR | stderr | | await wrap_app_handling_exceptions(app, request)(scope, receive, send)
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/_exception_handler.py", line 55, in wrapped_app
    2024-02-08 22:37:30 | ERROR | stderr | | raise exc
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/_exception_handler.py", line 44, in wrapped_app
    2024-02-08 22:37:30 | ERROR | stderr | | await app(scope, receive, sender)
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/routing.py", line 73, in app
    2024-02-08 22:37:30 | ERROR | stderr | | await response(scope, receive, send)
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/responses.py", line 252, in call
    2024-02-08 22:37:30 | ERROR | stderr | | async with anyio.create_task_group() as task_group:
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 678, in aexit
    2024-02-08 22:37:30 | ERROR | stderr | | raise BaseExceptionGroup(
    2024-02-08 22:37:30 | ERROR | stderr | | ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
    2024-02-08 22:37:30 | ERROR | stderr | +-+---------------- 1 ----------------
    2024-02-08 22:37:30 | ERROR | stderr | | Traceback (most recent call last):
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/httpcore/_exceptions.py", line 10, in map_exceptions
    2024-02-08 22:37:30 | ERROR | stderr | | yield
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/httpcore/_async/http11.py", line 209, in _receive_event
    2024-02-08 22:37:30 | ERROR | stderr | | event = self._h11_state.next_event()
    2024-02-08 22:37:30 | ERROR | stderr | | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/h11/_connection.py", line 469, in next_event
    2024-02-08 22:37:30 | ERROR | stderr | | event = self._extract_next_receive_event()
    2024-02-08 22:37:30 | ERROR | stderr | | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/h11/_connection.py", line 419, in _extract_next_receive_event
    2024-02-08 22:37:30 | ERROR | stderr | | event = self._reader.read_eof() # type: ignore[attr-defined]
    2024-02-08 22:37:30 | ERROR | stderr | | ^^^^^^^^^^^^^^^^^^^^^^^
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/h11/_readers.py", line 204, in read_eof
    2024-02-08 22:37:30 | ERROR | stderr | | raise RemoteProtocolError(
    2024-02-08 22:37:30 | ERROR | stderr | | h11._util.RemoteProtocolError: peer closed connection without sending complete message body (incomplete chunked read)
    2024-02-08 22:37:30 | ERROR | stderr | |
    2024-02-08 22:37:30 | ERROR | stderr | | The above exception was the direct cause of the following exception:
    2024-02-08 22:37:30 | ERROR | stderr | |
    2024-02-08 22:37:30 | ERROR | stderr | | Traceback (most recent call last):
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/httpx/_transports/default.py", line 66, in map_httpcore_exceptions
    2024-02-08 22:37:30 | ERROR | stderr | | yield
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/httpx/_transports/default.py", line 249, in aiter
    2024-02-08 22:37:30 | ERROR | stderr | | async for part in self._httpcore_stream:
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/httpcore/_async/connection_pool.py", line 361, in aiter
    2024-02-08 22:37:30 | ERROR | stderr | | async for part in self._stream:
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/httpcore/_async/http11.py", line 337, in aiter
    2024-02-08 22:37:30 | ERROR | stderr | | raise exc
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/httpcore/_async/http11.py", line 329, in aiter
    2024-02-08 22:37:30 | ERROR | stderr | | async for chunk in self._connection._receive_response_body(**kwargs):
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/httpcore/_async/http11.py", line 198, in _receive_response_body
    2024-02-08 22:37:30 | ERROR | stderr | | event = await self._receive_event(timeout=timeout)
    2024-02-08 22:37:30 | ERROR | stderr | | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/httpcore/_async/http11.py", line 208, in _receive_event
    2024-02-08 22:37:30 | ERROR | stderr | | with map_exceptions({h11.RemoteProtocolError: RemoteProtocolError}):
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/contextlib.py", line 158, in exit
    2024-02-08 22:37:30 | ERROR | stderr | | self.gen.throw(typ, value, traceback)
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
    2024-02-08 22:37:30 | ERROR | stderr | | raise to_exc(exc) from exc
    2024-02-08 22:37:30 | ERROR | stderr | | httpcore.RemoteProtocolError: peer closed connection without sending complete message body (incomplete chunked read)
    2024-02-08 22:37:30 | ERROR | stderr | |
    2024-02-08 22:37:30 | ERROR | stderr | | The above exception was the direct cause of the following exception:
    2024-02-08 22:37:30 | ERROR | stderr | |
    2024-02-08 22:37:30 | ERROR | stderr | | Traceback (most recent call last):
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/responses.py", line 255, in wrap
    2024-02-08 22:37:30 | ERROR | stderr | | await func()
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/starlette/responses.py", line 244, in stream_response
    2024-02-08 22:37:30 | ERROR | stderr | | async for chunk in self.body_iterator:
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/fastchat/serve/openai_api_server.py", line 472, in chat_completion_stream_generator
    2024-02-08 22:37:30 | ERROR | stderr | | async for content in generate_completion_stream(gen_params, worker_addr):
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/fastchat/serve/openai_api_server.py", line 657, in generate_completion_stream
    2024-02-08 22:37:30 | ERROR | stderr | | async for raw_chunk in response.aiter_raw():
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/httpx/_models.py", line 990, in aiter_raw
    2024-02-08 22:37:30 | ERROR | stderr | | async for raw_stream_bytes in self.stream:
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/httpx/_client.py", line 146, in aiter
    2024-02-08 22:37:30 | ERROR | stderr | | async for chunk in self._stream:
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/httpx/_transports/default.py", line 248, in aiter
    2024-02-08 22:37:30 | ERROR | stderr | | with map_httpcore_exceptions():
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/contextlib.py", line 158, in exit
    2024-02-08 22:37:30 | ERROR | stderr | | self.gen.throw(typ, value, traceback)
    2024-02-08 22:37:30 | ERROR | stderr | | File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/httpx/_transports/default.py", line 83, in map_httpcore_exceptions
    2024-02-08 22:37:30 | ERROR | stderr | | raise mapped_exc(message) from exc
    2024-02-08 22:37:30 | ERROR | stderr | | httpx.RemoteProtocolError: peer closed connection without sending complete message body (incomplete chunked read)
    2024-02-08 22:37:30 | ERROR | stderr | +------------------------------------
    2024-02-08 22:37:30,098 - utils.py[line:25] - ERROR: peer closed connection without sending complete message body (incomplete chunked read)
    Traceback (most recent call last):
    File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/httpcore/_exceptions.py", line 10, in map_exceptions
    yield
    File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/httpcore/_async/http11.py", line 209, in _receive_event
    event = self._h11_state.next_event()
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/h11/_connection.py", line 469, in next_event
    event = self._extract_next_receive_event()
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/h11/_connection.py", line 419, in _extract_next_receive_event
    event = self._reader.read_eof() # type: ignore[attr-defined]
    ^^^^^^^^^^^^^^^^^^^^^^^
    File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/h11/_readers.py", line 204, in read_eof
    raise RemoteProtocolError(
    h11._util.RemoteProtocolError: peer closed connection without sending complete message body (incomplete chunked read)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/httpx/_transports/default.py", line 66, in map_httpcore_exceptions
yield
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/httpx/_transports/default.py", line 249, in aiter
async for part in self._httpcore_stream:
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/httpcore/_async/connection_pool.py", line 361, in aiter
async for part in self._stream:
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/httpcore/_async/http11.py", line 337, in aiter
raise exc
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/httpcore/_async/http11.py", line 329, in aiter
async for chunk in self._connection._receive_response_body(**kwargs):
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/httpcore/_async/http11.py", line 198, in _receive_response_body
event = await self._receive_event(timeout=timeout)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/httpcore/_async/http11.py", line 208, in _receive_event
with map_exceptions({h11.RemoteProtocolError: RemoteProtocolError}):
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/contextlib.py", line 158, in exit
self.gen.throw(typ, value, traceback)
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
raise to_exc(exc) from exc
httpcore.RemoteProtocolError: peer closed connection without sending complete message body (incomplete chunked read)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/home/liam/workspace/Langchain-Chatchat/server/utils.py", line 23, in wrap_done
await fn
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/langchain/chains/base.py", line 385, in acall
raise e
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/langchain/chains/base.py", line 379, in acall
await self._acall(inputs, run_manager=run_manager)
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/langchain/chains/llm.py", line 275, in _acall
response = await self.agenerate([inputs], run_manager=run_manager)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/langchain/chains/llm.py", line 142, in agenerate
return await self.llm.agenerate_prompt(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 553, in agenerate_prompt
return await self.agenerate(
^^^^^^^^^^^^^^^^^^^^^
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 513, in agenerate
raise exceptions[0]
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 616, in _agenerate_with_cache
return await self._agenerate(
^^^^^^^^^^^^^^^^^^^^^^
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/langchain_community/chat_models/openai.py", line 522, in _agenerate
return await agenerate_from_stream(stream_iter)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 86, in agenerate_from_stream
async for chunk in stream:
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/langchain_community/chat_models/openai.py", line 488, in _astream
async for chunk in await acompletion_with_retry(
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/openai/_streaming.py", line 116, in aiter
async for item in self._iterator:
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/openai/_streaming.py", line 129, in stream
async for sse in iterator:
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/openai/_streaming.py", line 120, in _iter_events
async for sse in self._decoder.aiter(self.response.aiter_lines()):
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/openai/_streaming.py", line 231, in aiter
async for line in iterator:
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/httpx/_models.py", line 966, in aiter_lines
async for text in self.aiter_text():
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/httpx/_models.py", line 953, in aiter_text
async for byte_content in self.aiter_bytes():
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/httpx/_models.py", line 932, in aiter_bytes
async for raw_bytes in self.aiter_raw():
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/httpx/_models.py", line 990, in aiter_raw
async for raw_stream_bytes in self.stream:
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/httpx/_client.py", line 146, in aiter
async for chunk in self._stream:
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/httpx/_transports/default.py", line 248, in aiter
with map_httpcore_exceptions():
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/contextlib.py", line 158, in exit
self.gen.throw(typ, value, traceback)
File "/home/liam/miniconda3/envs/LangChain-Chatchat/lib/python3.11/site-packages/httpx/_transports/default.py", line 83, in map_httpcore_exceptions
raise mapped_exc(message) from exc
httpx.RemoteProtocolError: peer closed connection without sending complete message body (incomplete chunked read)
2024-02-08 22:37:30,102 - utils.py[line:28] - ERROR: RemoteProtocolError: Caught exception: peer closed connection without sending complete message body (incomplete chunked read)
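The root cause in the traceback above is the line `attn_weights = attn_weights + attention_mask` in modeling_qwen.py's `_attn`, which fails because `attention_mask` arrives as `None`. A minimal sketch of the kind of guard a local patch could use, shown without torch so the shape of the logic is clear. This is illustrative only, not the upstream fix:

```python
def apply_attention_mask(attn_weights, attention_mask):
    # The traceback shows attention_mask arriving as None, which makes
    # "attn_weights + attention_mask" raise:
    #   TypeError: unsupported operand type(s) for +: 'Tensor' and 'NoneType'
    # Only add the mask when one was actually provided.
    if attention_mask is None:
        return attn_weights
    return attn_weights + attention_mask
```

In the real model code both arguments would be tensors; the guard itself is the whole point.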

Zhiqfu commented Feb 19, 2024

Has your problem been solved? I'm running into the same issue.

Liam-Ji (Author) commented Feb 20, 2024

Not yet.

jackeyke commented Mar 1, 2024

Same problem here. Has anyone solved it?

seabean1999 commented Mar 13, 2024

For scenarios that introduce multimodality, the design approach described in this article should cover the solution: 如何借助LLMs构建支持文本、表格、图片的知识库 - 知乎 (a Zhihu article on building a knowledge base supporting text, tables, and images with LLMs).
