
[BUG] Baichuan online API fails to run #1638

Closed
szdengdi opened this issue Oct 2, 2023 · 3 comments
Labels
bug Something isn't working

Comments

@szdengdi

szdengdi commented Oct 2, 2023

Problem Description
1. model_config.py includes "baichuan-api"; after configuring it, startup reports "provider for online model 'baichuan-api' is not configured correctly".
2. server_config.py has no port configuration for baichuan-api.

Steps to Reproduce
1. With "baichuan-api" configured in model_config.py, startup reports "| ERROR | root | AttributeError: provider for online model 'baichuan-api' is not configured correctly".
2. Commenting out "provider" still does not allow startup.
3. server_config.py has no port configuration for baichuan-api.
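The AttributeError is consistent with the startup code resolving the `provider` string to a worker class via an attribute lookup on the `server.model_workers` package: if `BaiChuanWorker` is never imported in that package's `__init__.py`, the lookup fails even though model_config.py is correct. A minimal sketch of this failure mode (the `model_workers` namespace, `resolve_provider` helper, and error text here are stand-ins for illustration, not the project's actual code):

```python
import types

# Stand-in for the server.model_workers package; BaiChuanWorker is
# deliberately missing, mirroring the reported bug.
model_workers = types.SimpleNamespace(ZhipuWorker=object)

def resolve_provider(config: dict):
    """Resolve a worker class from its provider name via attribute lookup."""
    provider = config["provider"]
    if not hasattr(model_workers, provider):
        # This is the failure mode behind the reported error message.
        raise AttributeError(
            f"provider for online model is not configured correctly: {provider}"
        )
    return getattr(model_workers, provider)

# Fails until the worker class is exported from the package:
try:
    resolve_provider({"provider": "BaiChuanWorker"})
except AttributeError as e:
    print("lookup failed:", e)

# After the equivalent of `from .baichuan import BaiChuanWorker`
# in the package's __init__.py, the lookup succeeds:
model_workers.BaiChuanWorker = object  # stand-in for the real class
resolve_provider({"provider": "BaiChuanWorker"})
```

This also explains step 2: commenting out "provider" only changes which attribute is looked up (or makes the key missing), so startup still fails.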

Expected Result
baichuan-api runs.

Actual Result
Startup errors out and nothing is shown in the webui.

Environment Information

  • langchain-ChatGLM version/commit: 0.2.5
  • Docker deployment (yes/no): no
  • Model used (ChatGLM2-6B / Qwen-7B, etc.): baichuan-api
  • Operating system and version: Windows 11
  • Python version: 3.10.9

Additional Information
2023-10-02 10:33:07 | ERROR | root | AttributeError: provider for online model 'baichuan-api' is not configured correctly
{'api_key': '65eab5fb 7640855c04685d2',
'device': 'cpu',
'host': '127.0.0.1',
'infer_turbo': False,
'model_path': None,
'online_api': True,
'port': 21007,
'provider': 'BaiChuanWorker',
'secret_key': 'iJ0 62yuGAc=',
'version': 'Baichuan2-53B'}
Current Embeddings model: m3e-base @ cpu
==============================Langchain-Chatchat Configuration==============================

2023-10-02 10:33:07 | INFO | root | Starting services:
2023-10-02 10:33:07 | INFO | root | To view llm_api logs, go to C:\tmp\Langchain-Chatchat\logs
2023-10-02 10:33:07 | ERROR | root | AttributeError: provider for online model 'baichuan-api' is not configured correctly
2023-10-02 10:33:07 | ERROR | root | AttributeError: provider for online model 'baichuan-api' is not configured correctly
2023-10-02 10:33:12 | WARNING | root | Sending SIGKILL to {'qianfan-api': , 'zhipu-api': , 'minimax-api': , 'qwen-api': , 'xinghuo-api': , 'fangzhou-api': }
Traceback (most recent call last):
File "C:\tmp\Langchain-Chatchat\startup.py", line 705, in start_main_server
controller_started.wait() # wait for the controller to finish starting
File "C:\Users\13510.conda\envs\chatchat\lib\multiprocessing\managers.py", line 1093, in wait
return self._callmethod('wait', (timeout,))
File "C:\Users\13510.conda\envs\chatchat\lib\multiprocessing\managers.py", line 818, in _callmethod
kind, result = conn.recv()
File "C:\Users\13510.conda\envs\chatchat\lib\multiprocessing\connection.py", line 250, in recv
buf = self._recv_bytes()
File "C:\Users\13510.conda\envs\chatchat\lib\multiprocessing\connection.py", line 305, in _recv_bytes
waitres = _winapi.WaitForMultipleObjects(
File "C:\tmp\Langchain-Chatchat\startup.py", line 573, in f
raise KeyboardInterrupt(f"{signalname} received")

@szdengdi added the bug label Oct 2, 2023
@szdengdi
Author

szdengdi commented Oct 4, 2023

1. In server\model_workers\__init__.py, add: from .baichuan import BaiChuanWorker
2. In configs\server_config.py, add:
"baichuan-api": {
"port": 21007,
},
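Pulled together, the two manual edits look roughly like this (file paths are from the comment above; the enclosing dict name `FSCHAT_MODEL_WORKERS` is an assumption about server_config.py's layout in 0.2.x, not quoted from the issue):

```python
# 1) server/model_workers/__init__.py — export the worker class so the
#    provider string "BaiChuanWorker" can be resolved by name:
# from .baichuan import BaiChuanWorker

# 2) configs/server_config.py — give the online API a worker port.
#    FSCHAT_MODEL_WORKERS is an assumed name for the enclosing config dict;
#    merge the entry into the existing dict rather than replacing it.
FSCHAT_MODEL_WORKERS = {
    "baichuan-api": {
        "port": 21007,
    },
}
```

The port value 21007 matches the one already present in the poster's logged model config, so both halves of the configuration agree.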

@liunux4odoo
Collaborator

Thanks for the report. This is already fixed in the development branch; for now you can apply @szdengdi's manual edits above.

@liurr9810


Hello, how did you obtain the secret key?
