[BUG] Ollama running on separate docker #2241
Comments
@netgfx What is the Ollama URL you are using? If it's running in Docker, please use `host.docker.internal:11434` as the URL. Hope this helps.
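A quick way to verify that suggestion is to make the same request from inside the Flowise container, since that is where the URL gets resolved. A minimal sketch, assuming the container is named `flowise` and has `curl` available (both are assumptions about your setup):

```sh
# Check whether the Flowise container can reach Ollama via host.docker.internal.
# "flowise" is an assumed container name; adjust it to match `docker ps`.
docker exec -it flowise curl http://host.docker.internal:11434/api/tags
```

If this returns the model list as JSON, the same base URL should work in the Ollama chat node; if it fails, the problem is container-to-host networking rather than Flowise itself.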
I tried that but it didn't work. I'm also not sure how it would work, since Ollama is hosted on a separate server (in Docker, yes) under a different domain altogether.
@netgfx Find the IP address of the Docker container (run `docker inspect`) and try to ping it. If you are able to ping the container, use that IP as the URL with port 11434 and it will work. If the ping fails, there is a network issue: check the subnets of the machine you are pinging from and of the Ollama container. If they are on different subnets, set up NAT translation so the container is reachable. Hope this helps.
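A sketch of those steps, assuming the Ollama container is named `ollama` (adjust the name to whatever `docker ps` shows):

```sh
# Get the Ollama container's IP address on its Docker network
docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' ollama

# Check reachability from the machine (or container) running Flowise,
# then hit the Ollama API directly on port 11434
ping <container-ip>
curl http://<container-ip>:11434/api/tags
```

Note that a container IP is normally only reachable from other containers on the same Docker network or from the Docker host; across machines you would route to the host and publish the port instead.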
@netgfx Can you try changing the URL to `http://host.docker.internal:11434`?
@HenryHengZJ I did try it, but nothing changed; I still get the `fetch failed` error.
Yep, this solved my issue: Ollama on WSL and Flowise in a container on the bridge network; it would only work by using `http://host.docker.internal:11434`!
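Worth noting: `host.docker.internal` resolves out of the box on Docker Desktop (Windows, macOS, WSL2), but on a plain Linux host it usually has to be mapped explicitly. A hedged example, assuming Flowise is started with `docker run` on its default port:

```sh
# On Linux, map host.docker.internal to the host gateway so the Flowise
# container can reach services (like Ollama) listening on the host
docker run -d --name flowise \
  --add-host=host.docker.internal:host-gateway \
  -p 3000:3000 \
  flowiseai/flowise
```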
It also fails with a non-dockerized instance of Ollama and a dockerized instance of Flowise, both on the same Windows machine. Meanwhile, on the same machine, a dockerized instance of open-webui fetches the models served by Ollama without any problem. I found the solution here: https://github.com/ollama/ollama/blob/main/docs/faq.md. It works fine now.
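For anyone else landing here: the relevant part of that FAQ is making Ollama listen on all interfaces instead of only `127.0.0.1`, so that containers or other machines can reach it. A minimal sketch (on Windows, `OLLAMA_HOST` is set as a user environment variable and the Ollama app is restarted instead):

```sh
# Make the Ollama server listen on all interfaces (the default is 127.0.0.1 only),
# then verify it answers on an address the container can reach
OLLAMA_HOST=0.0.0.0 ollama serve
curl http://<host-ip>:11434/api/tags
```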
Describe the bug
I get the `fetch failed` error when trying to use a dockerized Ollama instance. The API is working fine on its own; only when used from Flowise does it fail to connect, for some reason. I have tried with the `11434` port and without it (as NGinx passes the connection through). The error on the Flowise docker (a separate instance) is:
To Reproduce
[Screenshot 2024-04-23 112914 (attached image; original link expired)]
Simple setup
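To narrow down whether this is a network problem or a Flowise problem, one option is to issue the same request Flowise would make, but from inside the Flowise container. A sketch, where `flowise` and the domain are placeholders for your actual container name and NGinx-proxied Ollama endpoint:

```sh
# Reproduce the fetch from inside the Flowise container; substitute your own
# container name and the domain NGinx is serving Ollama on
docker exec -it flowise curl -v https://ollama.example.com/api/tags
```

If this also fails, the `fetch failed` error is coming from the container's network path to the proxy rather than from Flowise itself.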
Expected behavior
Connection with Ollama to be successful, since the API is working, and the chat to return a result.
Screenshots
Added above
Flow
If applicable, add an exported flow to help replicate the problem.
Setup
Installation [e.g. docker, `npx flowise start`, `pnpm start`]

Additional context