
Feedback for “在 LobeChat 中使用 Ollama” (Using Ollama in LobeChat) #1933

Closed
ysyx2008 opened this issue Apr 9, 2024 · 28 comments · Fixed by #2168


ysyx2008 commented Apr 9, 2024

After deploying the container per the tutorial, I still can't connect to the local Ollama port. Is any special configuration needed?

Ollama itself is confirmed to be running and serving normally.

[image]

@lobehubbot (Member)

👀 @ysyx2008

Thank you for raising an issue. We will investigate into the matter and get back to you as soon as possible.
Please make sure you have given us as much context as possible.


njuFerret commented Apr 9, 2024

After deploying the container per the tutorial, I still can't connect to the local Ollama port. Is any special configuration needed?

Ollama itself is confirmed to be running and serving normally.

[image]

Same problem here: the Ollama connectivity check fails, but after changing this environment variable at deploy time, Ollama works:

   environment:
       - 'OLLAMA_PROXY_URL=http://ip:port/v1'



ysyx2008 commented Apr 9, 2024

Same problem here: the Ollama connectivity check fails, but after changing this environment variable at deploy time, Ollama works:

   environment:
       - 'OLLAMA_PROXY_URL=http://ip:port/v1'

Thanks for the reply. This means changing LobeChat's environment variables, right? Are the parameters specified in the docker run command?


@njuFerret

Thanks for the reply. This means changing LobeChat's environment variables, right? Are the parameters specified in the docker run command?

This is the compose method: the variable is defined in the compose yml configuration file.
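
For reference, a minimal sketch of passing the same variable with docker run instead of compose; the container name, port, IP, and image tag below are placeholders, not values confirmed in this thread:

    # Hypothetical docker run equivalent of the compose snippet above.
    # Replace 192.168.1.10 with your host's LAN IP (not 127.0.0.1).
    docker run -d --name lobe-chat -p 3210:3210 \
      -e OLLAMA_PROXY_URL=http://192.168.1.10:11434/v1 \
      lobehub/lobe-chat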


@MrOops1985

The connectivity check fails, but it actually works.



ysyx2008 commented Apr 9, 2024

The connectivity check fails, but it actually works.

I tested it; it doesn't work for me:
[image]

Ollama itself is running normally:
[image]


ysyx2008 commented Apr 9, 2024

I'm using a WSL environment: Ollama is installed directly (natively), and LobeChat is deployed with Docker.
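
A possible culprit in this setup, offered as a hedged aside: by default Ollama binds only to 127.0.0.1, which a Docker container cannot reach. A minimal sketch, assuming a stock Ollama install inside WSL, of exposing and verifying the service:

    # Assumption: stock Ollama install, default bind address 127.0.0.1.
    # Bind to all interfaces so Docker containers can reach it:
    OLLAMA_HOST=0.0.0.0:11434 ollama serve
    # Verify from another machine or container; <wsl-ip> is a placeholder.
    # A healthy service responds with "Ollama is running":
    curl http://<wsl-ip>:11434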


@GitHUB-ZYD

Hi, after the docker container starts it can still reach the host machine's IP and services by default. You just need to set OLLAMA_PROXY_URL to your host's IP; it cannot be 127.0.0.1, because in the default bridge mode 127.0.0.1 points to the container itself.
Once it's set, the check will still fail, but conversations work.
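
A quick way to see the bridge-mode behavior described above, assuming curl is available inside the image and the container is named lobe-chat (both placeholders):

    # 127.0.0.1 inside the container is the container itself, so this
    # fails even though Ollama is running on the host:
    docker exec -it lobe-chat curl http://127.0.0.1:11434
    # Pointing at the host's LAN IP (placeholder) should succeed:
    docker exec -it lobe-chat curl http://192.168.1.10:11434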


@MapleEve (Contributor)

[image]
This is how I set it up (Windows), then exposed it over the LAN to the Docker LobeChat on the NAS, so it can reach the local large model through this machine's LAN IP (don't forget the firewall settings).

@xinsheng2008

For LobeChat deployed with Docker on the local Windows machine, try using host.docker.internal instead of the IP.
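
host.docker.internal resolves to the host out of the box on Docker Desktop (Windows/macOS); on plain Linux it needs an explicit mapping. A sketch, with the image and ports as placeholders:

    docker run -d -p 3210:3210 \
      --add-host=host.docker.internal:host-gateway \
      -e OLLAMA_PROXY_URL=http://host.docker.internal:11434/v1 \
      lobehub/lobe-chat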


@chuan1127

LobeHub runs in unraid on the bond0 network via docker run, at 192.168.50.152:3210. Ollama also runs via docker run on the bond0 network, at 192.168.50.151:11434. I've already proxied Ollama with npm (Nginx Proxy Manager); the proxy address is, for example, ollama.xxxx.com, and it tells me "Ollama is running". But when I check Ollama on the settings page, with the interface proxy address set to https://ollama.xxxx.com/v1, it reports that no Ollama service was detected and asks me to check whether it started correctly:

{
  "host": "ollama.xxx.com",
  "message": "please check whether your ollama service is available",
  "provider": "ollama"
}

Since I'm running in unraid on the bond0 network, what do I need to do? Or do I need to run on the host network?

@GitHUB-ZYD

Since I'm running in unraid on the bond0 network, what do I need to do? Or do I need to run on the host network?

You can try docker exec -it into the container running Lobe Chat, then ping the Lobe Chat server's IP; if the ping succeeds, try curl 192.168.50.151:11434 to see whether the Ollama service is reachable. Remember to change the apt sources first so that apt update and apt install run faster.
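
Spelled out, the suggested checks might look like this; the container name is a placeholder, the IP is from the thread, and a shell being present in the image is an assumption:

    # Open a shell in the LobeChat container:
    docker exec -it <lobe-chat-container> /bin/sh
    # From inside the container, test basic reachability:
    ping 192.168.50.151
    # Then probe the Ollama port directly (expects "Ollama is running"):
    curl http://192.168.50.151:11434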

@chuan1127

[image: ollama]
[image: chat]
This is my configuration in unraid. They can each run on their own, and the proxy is set up as the official site requires. The configuration is as follows:
location / {
    proxy_pass $forward_scheme://$server:$port;
    proxy_set_header Host $http_host;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
    proxy_set_header X-Forwarded-For $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Host $http_host;
    proxy_set_header X-Forwarded-Proto $scheme;
    proxy_set_header X-Forwarded-Method $request_method;
    proxy_set_header X-Forwarded-Ssl on;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Original-URL $scheme://$http_host$request_uri;
    proxy_set_header X-Forwarded-Uri $request_uri;
    # Add CORS headers, if needed
    add_header 'Access-Control-Allow-Origin' '*';
    add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS';
    proxy_buffering off;
    proxy_cache off;
    send_timeout 600;
    proxy_connect_timeout 600;
    proxy_send_timeout 600;
    proxy_read_timeout 600;
}
But the check reports that no Ollama service was detected and asks me to check whether it started correctly:

{
  "host": "ollama.xxx.com",
  "message": "please check whether your ollama service is available",
  "provider": "ollama"
}
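
Two hedged sanity checks against the proxied endpoint (the domain is the placeholder from the comment above): the root path should return "Ollama is running", and /api/tags should return a JSON model list. If both succeed from the LobeChat host while the in-app check still fails, the failure is likely in the check itself rather than in the proxy:

    curl https://ollama.xxxx.com            # expect: Ollama is running
    curl https://ollama.xxxx.com/api/tags   # expect: JSON list of models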

@MapleEve (Contributor)

(quoting the nginx proxy configuration and check failure above)

I've deployed Ollama on Ubuntu inference and training machines as well as on macOS and never reproduced this. Since I deploy and make requests entirely within the LAN, I have no way to reproduce external-network access on my side; someone else may need to look at what the problem is.

@bigsuperangel

The connectivity check fails, but it actually works.

Yes, the check keeps failing, but it actually works.



arvinxx commented Apr 19, 2024

@sjy I'm wondering whether the earlier idea of using the model list for the check was really that good. Should we switch back to checking via an actual model call?
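
For context, the two probe styles being compared, as hedged curl sketches against Ollama's public API (the model name is a placeholder):

    # 1) model-list check (the approach the check reportedly used):
    curl http://localhost:11434/api/tags
    # 2) actual model call, closer to what chat itself exercises:
    curl http://localhost:11434/api/generate \
      -d '{"model": "llama3", "prompt": "hi", "stream": false}'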


@chuan1127

The connectivity check fails, but it actually works.

Yes, the check keeps failing, but it actually works.

For me it isn't actually working here, but when I use Ollama on its own, it works fine.

@lobehubbot (Member)

@ysyx2008

This issue is closed, If you have any questions, you can comment and reply.

@lobehubbot (Member)

🎉 This issue has been resolved in version 0.149.0 🎉

The release is available on:

Your semantic-release bot 📦🚀
