Feedback for “在 LobeChat 中使用 Ollama” (Using Ollama in LobeChat) #1933
Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible.
Same problem here: the Ollama connectivity check fails, but after modifying the environment variables during deployment, Ollama can be used.
Thanks for the reply. Does this mean modifying LobeChat's environment variables? Are the parameters specified in the docker run command?
I use the compose method here; the environment variables are defined in the yml configuration file.
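For reference, a minimal sketch of what such a compose file might look like. The host IP 192.168.1.100 is a placeholder, and whether the `/v1` suffix is required depends on your LobeChat version:

```yaml
# docker-compose.yml — minimal sketch; IP and suffix are assumptions
services:
  lobe-chat:
    image: lobehub/lobe-chat
    ports:
      - "3210:3210"
    environment:
      # Point LobeChat at Ollama on the Docker host; replace 192.168.1.100
      # with your host's LAN IP (not 127.0.0.1 — see the bridge-mode note below).
      - OLLAMA_PROXY_URL=http://192.168.1.100:11434/v1
```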
Connectivity check fails, but it actually works
I'm using a WSL environment; Ollama is installed directly, and LobeChat is deployed with Docker.
Hello, after the Docker container is running, it can still reach the host machine's IP and services by default. You only need to set OLLAMA_PROXY_URL to your host machine's IP. It cannot be 127.0.0.1: in the default bridge mode, 127.0.0.1 points to the container itself.
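A hedged docker run equivalent of that advice, with 192.168.50.100 standing in for the host's LAN IP (an assumed value):

```bash
# Look up the host's LAN IP first (interface name varies by system).
ip addr show

# Run LobeChat with OLLAMA_PROXY_URL pointing at the host's LAN IP,
# not 127.0.0.1 (which would resolve to the container itself).
docker run -d --name lobe-chat -p 3210:3210 \
  -e OLLAMA_PROXY_URL=http://192.168.50.100:11434/v1 \
  lobehub/lobe-chat
```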
For LobeChat deployed in Docker on a local Windows machine, try using host.docker.internal instead of the IP.
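On Docker Desktop for Windows (and macOS), host.docker.internal resolves to the host from inside a container, so the suggestion above might look like this sketch:

```bash
# Docker Desktop resolves host.docker.internal to the host machine,
# so no hard-coded LAN IP is needed.
docker run -d --name lobe-chat -p 3210:3210 \
  -e OLLAMA_PROXY_URL=http://host.docker.internal:11434/v1 \
  lobehub/lobe-chat
```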
LobeHub runs in Unraid on the bond0 network via docker run, at 192.168.50.152:3210. Ollama also runs via docker run on the bond0 network, at 192.168.50.151:11434. I have already proxied Ollama with npm; the proxy address is, for example, ollama.xxxx.com, and it tells me "Ollama is running". But when I check Ollama on the settings page with the interface proxy address set to https://ollama.xxxx.com/v1, it tells me the Ollama service was not detected and asks me to check whether it started properly.
You can try docker exec -it into the container running Lobe Chat, then ping the Lobe Chat server's IP; if the ping succeeds, try curl 192.168.50.151:11434 to see whether the Ollama service is reachable. Remember to change the apt install source so that apt update and install run faster.
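Spelled out, those diagnostic steps might look like the following; the container name lobe-chat is an assumption, and ping may need to be installed in the container first:

```bash
# Open a shell in the running LobeChat container (name is a placeholder).
docker exec -it lobe-chat sh

# Inside the container: check basic network reachability of the Ollama host.
ping -c 3 192.168.50.151

# Then check whether the Ollama service itself answers on its port.
curl http://192.168.50.151:11434
# A reachable Ollama instance responds with "Ollama is running".
```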
Yes, the check keeps failing, but it actually works.
@sjy I'm wondering whether the earlier approach of using the model list for the check was really a good idea. Should we switch back to making an actual model call?
This issue is closed. If you have any questions, you can comment and reply.
🎉 This issue has been resolved in version 0.149.0 🎉
The release is available on:
Your semantic-release bot 📦🚀
After deploying the container according to the tutorial, I still cannot connect to the local Ollama port. Is any special configuration required?
I have confirmed that Ollama is running and the service is normal.
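As a first diagnostic for this, it may help to confirm that the Ollama port answers from the host and then from inside the container; the container name and the IP below are placeholders, and this assumes curl is available in the image:

```bash
# On the host: Ollama's root endpoint should answer "Ollama is running".
curl http://127.0.0.1:11434

# From inside the container, 127.0.0.1 is the container itself, so test the
# host's LAN IP instead (or host.docker.internal on Docker Desktop).
docker exec -it lobe-chat curl http://192.168.0.10:11434
```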