
[Bug] OLLAMA configuration mismatch #1351

Closed
QIN2DIM opened this issue Feb 22, 2024 · 11 comments · Fixed by #1397
Assignees
Labels
🐛 Bug Something isn't working | 缺陷 Model Provider 模型服务商

Comments


QIN2DIM commented Feb 22, 2024

💻 Operating System

Ubuntu

📦 Environment

Docker

🌐 Browser

Firefox

🐛 Bug Description

The default Ollama model cards are displayed incorrectly.

(screenshot: PixPin_2024-02-22_17-43-33)

(base) root@x# ollama ls
NAME                    ID              SIZE    MODIFIED     
deepseek-coder:33b      acec7c0b0fd9    18 GB   3 weeks ago 
deepseek-coder:6.7b     ce298d984115    3.8 GB  3 weeks ago 
gemma:latest            cb9e0badc99d    4.8 GB  16 hours ago
llava:34b-v1.6          3d2d24f46674    20 GB   3 weeks ago 
yi:34b-chat             5f8365d57cb8    19 GB   3 weeks ago

The models shown by default are not deployed on the machine, and selecting one to start a chat raises an error. For example, the llama2 model has not been pulled.

(screenshot: PixPin_2024-02-22_17-43-21)

After adding the custom model yi:34b-chat (already pulled), chat works normally:

(screenshot)

I tried setting the environment variable CUSTOM_MODELS: -llama2 in docker-compose.yaml, but it does not seem to affect the Ollama model cards.
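For context, the CUSTOM_MODELS variable uses a comma-separated syntax where "-id" removes a model, "+id" adds one, and "id=Name" sets a display name. A minimal TypeScript sketch of that filtering logic follows; applyCustomModels is a hypothetical helper for illustration, not the project's actual implementation:

```typescript
// Illustrative sketch of CUSTOM_MODELS-style filtering; applyCustomModels is
// a hypothetical helper, not LobeChat's actual code. Assumed syntax: "-all"
// clears the defaults, "-id" removes a model, "+id" adds one, "id=Name"
// sets a display name.
interface ChatModel {
  id: string;
  displayName: string;
}

function applyCustomModels(defaults: ChatModel[], spec: string): ChatModel[] {
  let models = [...defaults];
  for (const raw of spec.split(',').map((s) => s.trim()).filter(Boolean)) {
    if (raw === '-all') {
      models = []; // drop every default model card
    } else if (raw.startsWith('-')) {
      const id = raw.slice(1);
      models = models.filter((m) => m.id !== id); // remove a single model
    } else if (raw.includes('=')) {
      const [id, displayName] = raw.split('=');
      const hit = models.find((m) => m.id === id);
      if (hit) hit.displayName = displayName; // rename an existing model
      else models.push({ id, displayName }); // or add it with that name
    } else {
      const id = raw.startsWith('+') ? raw.slice(1) : raw;
      models.push({ id, displayName: id }); // add a model by its raw id
    }
  }
  return models;
}
```

Under this syntax, a value like CUSTOM_MODELS: -llama2,+yi:34b-chat would hide the undeployed default card and surface the locally pulled model, assuming the variable were actually wired through to the Ollama provider (which is what this issue reports as broken).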

🚦 Expected Behavior

No response

📷 Recurrence Steps

No response

📝 Additional Information

No response

@QIN2DIM QIN2DIM added the 🐛 Bug Something isn't working | 缺陷 label Feb 22, 2024
@lobehubbot (Member)

👀 @QIN2DIM

Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible. Please make sure you have given us as much context as possible.

@QIN2DIM commented Feb 22, 2024 (Author)

Setting CUSTOM_MODELS separately in the OLLAMA section of /settings/llm works, but this is not the expected behavior.

(screenshot)

After reducing the model list:

(screenshot)


@arvinxx commented Feb 22, 2024 (Contributor)

@sjy Looks like we need to add an OLLAMA_CUSTOM_MODELS environment variable.


@QIN2DIM commented Feb 22, 2024 (Author)

@arvinxx Perhaps, as in #1352, the list of pulled models could be scanned from the /api/tags endpoint that Ollama exposes, though that approach would introduce new problems of its own.

The upside is that the model_id for the ModelProviderCard can be taken directly from the response, so there would be no parameter-passing issues when invoking a model.

The inconvenient part is that the scanned model_id is, after all, a code-style name; displaying it on the frontend would probably need a separate display name to look good, and settings like logo, vision, and functionCall are even harder to derive.

That said, Ollama also lets developers pull open-source models from platforms like Hugging Face and build their own quantized models, so the naming varies wildly. If lobe-chat could use Ollama's infrastructure directly, the barrier to integration would be much lower.

ModelProviderCard

https://github.com/lobehub/lobe-chat/blob/main/src/config/modelProviders/ollama.ts

chatModels:
  - displayName: Qwen Chat 70B
    functionCall: false
    hidden: true
    id: qwen:70b-chat
    tokens: 32768
    vision: false
  - displayName: Mistral
    functionCall: false
    id: mistral
    tokens: 4800
    vision: false

Sample /api/tags response:

{
  "name": "yi:34b-chat",
  "model": "yi:34b-chat",
  "modified_at": "2024-01-31T01:00:36.520299656+08:00",
  "size": 19466547560,
  "digest": "5f8365d57cb897be38692650353fe3eb98bf89bf8d1260fb24cd66a2c2122d70",
  "details": {
    "parent_model": "",
    "format": "gguf",
    "family": "llama",
    "families": null,
    "parameter_size": "34B",
    "quantization_level": "Q4_0"
  }
}
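The scanning idea above can be sketched as follows. Note that the real /api/tags endpoint wraps entries in a {"models": [...]} array (the sample above is a single entry). tagsToCards and scanOllamaModels are hypothetical names, the tokens default is a placeholder since /api/tags does not expose the context window, and fetch assumes Node 18+ or a browser:

```typescript
// Hypothetical sketch of auto-discovering pulled Ollama models and mapping
// them onto minimal chatModels entries in the ModelProviderCard shape.
// Not LobeChat's actual implementation.
interface OllamaTag {
  name: string; // e.g. "yi:34b-chat"
  model: string;
  details: { family: string; parameter_size: string };
}

interface ChatModelCard {
  id: string;
  displayName: string;
  functionCall: boolean;
  vision: boolean;
  tokens: number;
}

// Pure mapping step: the scanned name doubles as both id and displayName,
// which is exactly the "code-style naming" drawback mentioned above.
function tagsToCards(tags: OllamaTag[]): ChatModelCard[] {
  return tags.map((t) => ({
    id: t.name,          // usable directly as model_id when invoking the model
    displayName: t.name, // a prettier name would need a manual mapping table
    functionCall: false, // not derivable from /api/tags; default conservatively
    vision: false,
    tokens: 4096,        // placeholder: context window is not in /api/tags
  }));
}

// /api/tags returns { "models": [ ... ] }; the sample above is one entry.
async function scanOllamaModels(
  baseUrl = 'http://127.0.0.1:11434',
): Promise<ChatModelCard[]> {
  const res = await fetch(`${baseUrl}/api/tags`);
  if (!res.ok) throw new Error(`GET /api/tags failed: ${res.status}`);
  const data = (await res.json()) as { models: OllamaTag[] };
  return tagsToCards(data.models);
}
```

Splitting the pure mapping from the network call keeps the id-to-card logic testable without a running Ollama instance; per-model metadata such as vision support would still need a manual override layer on top.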

@arvinxx arvinxx assigned arvinxx and unassigned arvinxx Feb 22, 2024
@arvinxx arvinxx added the Model Provider 模型服务商 label Feb 22, 2024
@arvinxx commented Feb 22, 2024 (Contributor)

@sjy Isn't this close to what I said? Everyone is thinking along the same lines.


@sjy commented Feb 23, 2024 (Contributor)

> Isn't this close to what I said? Everyone is thinking along the same lines.

Yes, we arrived at it almost at the same time.


@lobehubbot (Member)

@QIN2DIM

This issue is closed. If you have any questions, you can comment and reply.

4 participants