Unable to connect Dify to Ollama for Windows #2516
Comments
🤖 Hey there, @jafrank88! 👋 Great to see you diving into another adventure with us. How's everything going on your end?

Based on the information you've provided, it seems like you're trying to connect Dify, running in a Docker container, to Ollama, running natively on Windows. If Ollama is running on the same machine and listening on port 3001, you should be able to reach it from the Docker container using the host machine's IP address, or localhost if the container is running in host network mode. However, the context does not describe how to configure Dify to connect to Ollama, or whether Dify supports connecting to Ollama at all, so you may need to refer to Dify's documentation or support for specific instructions.

If you are still unable to connect Dify to Ollama, it could be due to network restrictions, firewall settings, or other configuration issues, and you may need to troubleshoot these to establish a successful connection. Please note that the context provides no information about AnythingLLM, so it's unclear why you can connect to Ollama with AnythingLLM but not with Dify.

Regarding the error message you're receiving, I'm unfortunately unable to view the image you've linked. Could you please provide the exact error message text? This will help me better understand the issue you're facing.

As for the similar issue you found, using LiteLLM with Ollama reportedly works according to that issue, though it's unclear whether that solution applies to your situation, as the context does not provide enough information.

I hope this information is helpful. If you have any more details, or if there's anything else you'd like to know, please feel free to ask.
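A quick way to check the container-to-host path described above is to probe Ollama from inside the Dify container. The sketch below assumes the Dify API container is named `docker-api-1` (your compose project may name it differently), that `curl` is available in the image, and that Ollama listens on its default port 11434:

```sh
# On Docker Desktop for Windows, host.docker.internal resolves to the host
# machine, so a container can reach services running natively on Windows.
docker exec -it docker-api-1 curl http://host.docker.internal:11434
# If Ollama is reachable, it answers with the plain-text banner "Ollama is running".
```

If that request succeeds, `http://host.docker.internal:11434` should also work as the base URL in Dify's Ollama model provider settings.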
I am not having any problems reaching Dify, so I don't think it is a container issue, but wanted to raise it in case I am wrong.
@jafrank88 https://github.com/ollama/ollama/blob/main/docs/faq.md#setting-environment-variables-on-windows Please try the steps in this doc and see if the issue persists.
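For reference, the linked FAQ comes down to making Ollama listen on more than just 127.0.0.1 so other machines and Docker containers can reach it. A minimal sketch of those steps on Windows, assuming the standard tray-app install:

```sh
# Quit Ollama from the system tray first, then persistently set OLLAMA_HOST
# so the server binds to all interfaces instead of only 127.0.0.1:
setx OLLAMA_HOST "0.0.0.0"
# Open a NEW terminal (setx does not affect the current session) and start
# Ollama again so it picks up the variable:
ollama serve
```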
Self Checks
Dify version
0.5.6
Cloud or Self Hosted
Self Hosted (Docker)
Steps to reproduce
When using Dify in Docker, I am unable to connect to Ollama running natively in Windows.
Start Dify, go to Settings > Model Provider > Ollama, and try to connect to Ollama running on Windows. Do I need to add `-p` when running Dify in Docker, such as `docker compose -p 3001:3001 up -d`? (See the note after this section.)
I am able to connect to Ollama with AnythingLLM, so wonder if it is a Dify issue or because Dify is in a container.
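One note on the `-p` question: with `docker compose`, `-p` is the project-name flag, not a port mapping, and publishing a port would not help here anyway, since Dify needs to make an outbound connection from its container to Ollama. A short illustration of the distinction:

```sh
# `docker compose -p` sets the *project name*, not a port mapping:
docker compose -p dify up -d
# Port publishing lives in docker-compose.yml under `ports:` (e.g. "3001:3001"),
# and it only exposes container ports to the host. Outbound connections from
# the Dify container to Ollama on the host need no port mapping at all.
```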
✔️ Expected Behavior
I was expecting to be able to use the llm model (Mistral) and the embedding model from Ollama with my self-hosted instance of Dify.
❌ Actual Behavior
I received an error suggesting something was wrong, as seen here https://cdn.discordapp.com/attachments/1187025321447403530/1209926509368967259/image.png?ex=65e8b2df&is=65d63ddf&hm=3ecdaf1b9099b302287a9fb56b25e2529c21297a63aa8daff54cd04496024be3&