
[BUG] Ollama running on separate docker #2241

Open
netgfx opened this issue Apr 23, 2024 · 8 comments

Comments

netgfx commented Apr 23, 2024

Describe the bug
I get a "fetch failed" error when trying to use a dockerized Ollama instance. The API works fine on its own; only when used from Flowise does it fail to connect for some reason. I have tried both with the 11434 port and without it (since Nginx passes the connection through).

The error on the Flowise docker (separate instance) is:

TypeError: fetch failed
at Object.fetch (node:internal/deps/undici/undici:11731:11)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async createOllamaStream (/usr/local/lib/node_modules/flowise/node_modules/@langchain/community/dist/utils/ollama.cjs:12:22)
at async createOllamaChatStream (/usr/local/lib/node_modules/flowise/node_modules/@langchain/community/dist/utils/ollama.cjs:60:5)
at async ChatOllama._streamResponseChunks (/usr/local/lib/node_modules/flowise/node_modules/@langchain/community/dist/chat_models/ollama.cjs:378:30)
at async ChatOllama._call (/usr/local/lib/node_modules/flowise/node_modules/@langchain/community/dist/chat_models/ollama.cjs:486:26)
at async ChatOllama._generate (/usr/local/lib/node_modules/flowise/node_modules/@langchain/core/dist/language_models/chat_models.cjs:352:22)
at async Promise.allSettled (index 0)
at async ChatOllama._generateUncached (/usr/local/lib/node_modules/flowise/node_modules/@langchain/core/dist/language_models/chat_models.cjs:114:25)
at async ChatOllama.invoke (/usr/local/lib/node_modules/flowise/node_modules/@langchain/core/dist/language_models/chat_models.cjs:54:24)
at async RunnableSequence.invoke (/usr/local/lib/node_modules/flowise/node_modules/@langchain/core/dist/runnables/base.cjs:1027:33)
at async ConversationChain_Chains.run (/usr/local/lib/node_modules/flowise/node_modules/flowise-components/dist/nodes/chains/ConversationChain/ConversationChain.js:103:19)
at async App.buildChatflow (/usr/local/lib/node_modules/flowise/dist/index.js:1530:19)
at async /usr/local/lib/node_modules/flowise/dist/index.js:1066:13

To Reproduce
Simple setup
[Screenshot attached: 2024-04-23 112914]

Expected behavior
The connection to Ollama should succeed, since the API is working, and the chat should return a result.

Screenshots
Added above

Flow
If applicable, add an exported flow to help replicate the problem.

Setup

  • Installation: docker
  • Flowise Version: 1.5.1
  • Node Version: 18.19.1
  • OS: Linux
  • Browser: Chrome (any)

Additional context

netgfx changed the title from "[BUG] Ollama runnin on separate docker" to "[BUG] Ollama running on separate docker" on Apr 24, 2024

prithvi151080 commented Apr 24, 2024

@netgfx what is the Ollama URL that you are using? If it's running in Docker, please use host.docker.internal:11434 as the URL. Hope this helps.
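One quick way to confirm whether the Flowise container can actually reach that address is to open a shell inside it and hit the Ollama API directly. A minimal sketch, assuming the Flowise container is named flowise (the name is an assumption; use wget or curl, whichever the image ships with):

    # Open a shell inside the Flowise container (name "flowise" is assumed)
    docker exec -it flowise sh

    # From inside the container, hit Ollama's model-list endpoint.
    # If this fails, the "fetch failed" error is plain network/DNS trouble,
    # not something Flowise-specific.
    wget -qO- http://host.docker.internal:11434/api/tags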


netgfx commented Apr 24, 2024

I tried that but it didn't work. I'm also not sure how it would, since Ollama is hosted on a separate server (in Docker, yes) under a different domain altogether.

@prithvi151080

@netgfx find the IP address of the Docker container (run docker inspect) and try to ping it. If you are able to ping the container, use that IP as the URL with port 11434 and it will work. If the ping fails, there is a network issue somewhere: check the subnets of the machine you are pinging from and of the Ollama container. If they are on different subnets, use NAT so that the ping succeeds. Hope this helps.
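For reference, a minimal sketch of the inspect-and-ping check described above (the container name ollama and the example IP are assumptions; adjust to your setup):

    # Print the Ollama container's IP address on its Docker network
    docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' ollama

    # Suppose it prints 172.17.0.2 (example value): check reachability, then the API
    ping -c 3 172.17.0.2
    curl http://172.17.0.2:11434/api/tags   # should return the locally available models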

@HenryHengZJ (Contributor)

@netgfx can you try changing the URL to http://host.docker.internal:11434?


netgfx commented Apr 26, 2024

@HenryHengZJ I did try it, but nothing changed; I still get the fetch failed error.


bigsk1 commented May 25, 2024

Yep, this solved my issue: Ollama on WSL and Flowise in a container on the bridge network. Only by using http://host.docker.internal:11434 would it work!


netgfx commented Jun 5, 2024

I tried host.docker.internal:11434 and even went and set up both images from my local environment (WSL Ubuntu), but it still fails. This is my setup:
[Screenshot attached: 2024-06-05 232411]

I can ping host.docker.internal:11434 from the Docker CLI, and I can also access it from the browser; it seems only Flowise isn't able to connect to it for some reason :(
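One detail worth checking in a Linux/WSL setup: host.docker.internal resolves automatically inside containers on Docker Desktop, but with a plain Docker Engine install it usually has to be mapped explicitly via host-gateway. A minimal sketch, assuming Flowise is started with docker run (the container name, port, and image tag are illustrative):

    # Map host.docker.internal to the host gateway so the Flowise container
    # can reach services listening on the host
    docker run -d --name flowise \
      --add-host=host.docker.internal:host-gateway \
      -p 3000:3000 \
      flowiseai/flowise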



Z3n7r4ck3r commented Jun 7, 2024

It also fails with a non-dockerized instance of Ollama and a dockerized instance of Flowise, both on the same Windows machine. Moreover, on the same machine, a dockerized instance of Open WebUI fetches the models served by Ollama perfectly.

Found the solution here: https://github.com/ollama/ollama/blob/main/docs/faq.md

It works fine now.
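For anyone landing here: the relevant part of that FAQ is making Ollama listen on all interfaces instead of only 127.0.0.1, via the OLLAMA_HOST environment variable, so other containers and machines can reach it. A rough sketch for a non-dockerized install (service name and shell vary by platform):

    # Linux (systemd): add the variable to the service and restart it
    sudo systemctl edit ollama.service
    #   [Service]
    #   Environment="OLLAMA_HOST=0.0.0.0"
    sudo systemctl restart ollama

    # Windows / macOS: set the variable in the environment and restart the
    # Ollama app, e.g. in PowerShell:
    #   $env:OLLAMA_HOST = "0.0.0.0"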
