
Support launching multiple models simultaneously via a config option; add the Wiki to the samples knowledge base #2002

Merged
2 commits merged on Nov 9, 2023

Conversation

liunux4odoo
Collaborator

New features:

  • Change the LLM_MODEL config option to an LLM_MODELS list, so that multiple models can be launched at the same time
  • Add the wiki to the samples knowledge base
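The config change above can be sketched as follows. This is a hypothetical excerpt of a model config file; the model names are illustrative, not taken from the PR.

```python
# Before this PR: a single model name (shown for contrast)
# LLM_MODEL = "chatglm2-6b"

# After this PR: a list; every model listed here is launched at startup.
# The entries below are example names, not part of the PR itself.
LLM_MODELS = ["chatglm2-6b", "qwen-api"]
```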

Dependency changes:

  • Pin streamlit~=1.27.0: version 1.26.0 raises a rerun error, and 1.28.0 has an infinite-refresh bug
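In requirements.txt terms, the pin above would read (a hypothetical excerpt, matching the version constraint stated in the PR):

```
streamlit~=1.27.0
```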

Fixes and optimizations:

  • Improve the get_default_llm_model logic
  • Adapt to the Qwen online API's limit of at most 25 rows per Embeddings request
  • Skip files whose names start with . when listing knowledge-base files on disk
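The Qwen embeddings adaptation amounts to batching requests so no call exceeds the row limit. A minimal sketch, assuming a hypothetical `embed_fn` callable that accepts up to `max_rows` strings per call (the helper name and signature are illustrative, not the PR's actual code):

```python
from typing import Callable, List

QWEN_EMBED_MAX_ROWS = 25  # per-request limit stated in the PR description

def embed_in_batches(texts: List[str],
                     embed_fn: Callable[[List[str]], List[List[float]]],
                     max_rows: int = QWEN_EMBED_MAX_ROWS) -> List[List[float]]:
    """Call embed_fn on chunks of at most max_rows texts, then concatenate."""
    vectors: List[List[float]] = []
    for start in range(0, len(texts), max_rows):
        vectors.extend(embed_fn(texts[start:start + max_rows]))
    return vectors
```

Any online embeddings API with a per-request row cap can be wrapped this way; only the constant changes.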

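The dot-file fix can be sketched as a directory walk that prunes hidden names. `list_kb_files` is a hypothetical helper name, not the PR's actual function; `os.walk` is standard library:

```python
import os
from typing import List

def list_kb_files(root: str) -> List[str]:
    """Return relative paths of files under root, skipping names starting with '.'."""
    files: List[str] = []
    for dirpath, dirnames, filenames in os.walk(root):
        # Prune hidden directories in place so os.walk does not descend into them
        dirnames[:] = [d for d in dirnames if not d.startswith(".")]
        for name in filenames:
            if name.startswith("."):
                continue  # skip hidden files such as .DS_Store or .gitignore
            files.append(os.path.relpath(os.path.join(dirpath, name), root))
    return files
```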
@liunux4odoo liunux4odoo merged commit b51ba11 into chatchat-space:dev Nov 9, 2023
@tanglu86

Does this mean multiple models can now only be started together at launch, and no longer started later when switching? Could a switch be added for that?
