
Unable to connect dify to Ollama for Windows #2516

Closed
3 tasks done
jafrank88 opened this issue Feb 22, 2024 · 3 comments
Assignees
Labels
🐞 bug Something isn't working

Comments

@jafrank88

Self Checks

  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • Please do not modify this template :) and fill in all the required fields.

Dify version

0.5.6

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

When using Dify in Docker, I am unable to connect to Ollama running natively in Windows.

Start Dify, go to Settings → Model Provider → Ollama, and try to connect to Ollama running on Windows. Do I need to add -p when running Dify in Docker, such as docker compose -p 3001:3001 up -d?

I am able to connect to Ollama with AnythingLLM, so I wonder whether it is a Dify issue or a consequence of Dify running in a container.
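A quick way to narrow this down is to test connectivity from inside the Dify container itself. This is a diagnostic sketch, not from the report: the container name docker-api-1 and Ollama's default port 11434 are assumptions, and host.docker.internal is the hostname Docker Desktop provides for reaching the Windows host from a container.

```shell
# Inside the container, localhost refers to the container itself, so the
# host-side Ollama must be addressed via host.docker.internal instead:
docker exec -it docker-api-1 curl http://host.docker.internal:11434

# If that request succeeds, the Base URL in Dify's Ollama provider settings
# would be:
#   http://host.docker.internal:11434
# rather than http://localhost:11434.
```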

✔️ Expected Behavior

I was expecting to be able to use the llm model (Mistral) and the embedding model from Ollama with my self-hosted instance of Dify.

❌ Actual Behavior

I received an error suggesting something was wrong, as seen here https://cdn.discordapp.com/attachments/1187025321447403530/1209926509368967259/image.png?ex=65e8b2df&is=65d63ddf&hm=3ecdaf1b9099b302287a9fb56b25e2529c21297a63aa8daff54cd04496024be3&

@jafrank88 jafrank88 added the 🐞 bug Something isn't working label Feb 22, 2024

dosubot bot commented Feb 22, 2024

🤖

Hey there, @jafrank88! 👋 Great to see you diving into another adventure with us. How's everything going on your end?

Based on the information you've provided, it seems like you're trying to connect Dify, running in a Docker container, to Ollama, running natively on Windows.

With docker run, the -p option publishes a container's port to the host, making that port accessible on the host machine at the specified port; for example, -p 3001:3001 exposes the container's port 3001 on the host's port 3001. Note, however, that with docker compose the -p flag sets the project name, not a port mapping; port mappings for Compose are declared under ports: in docker-compose.yml.

If Ollama is running on the same machine, you should be able to reach it from the Docker container using the host machine's IP address, or via host.docker.internal on Docker Desktop. Note that localhost inside the container refers to the container itself, not the host, unless the container runs in host network mode (which is not available on Docker Desktop for Windows).
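For reference, the two meanings of -p can be sketched as follows (an illustrative sketch; the image name, project name, and port are placeholders):

```shell
# With docker run, -p publishes a container port on the host:
docker run -p 3001:3001 some-image

# With docker compose, -p sets the project name, not a port mapping.
# Port mappings belong in docker-compose.yml instead:
#   services:
#     api:
#       ports:
#         - "3001:3001"
docker compose -p myproject up -d
```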

However, the context does not provide specific information about how to configure Dify to connect to Ollama, or whether Dify supports connecting to Ollama at all. Therefore, you may need to refer to Dify's documentation or support for specific instructions on how to configure it to connect to Ollama.

If you are still unable to connect Dify to Ollama, it could be due to network restrictions, firewall settings, or other configuration issues. You may need to troubleshoot these potential issues to establish a successful connection.

Please note that the context does not provide any information about AnythingLLM, so it's unclear why you can connect to Ollama with AnythingLLM but not with Dify.

In conclusion, while the -p option can be used to expose a Docker container's port to the host, the context does not provide sufficient information to fully answer your question about connecting Dify to Ollama.

Regarding the error message you're receiving, unfortunately, I'm unable to view the image you've linked. Could you please provide the exact error message text? This will help me better understand the issue you're facing.

As for the similar issue you found, it seems that using LiteLLM with Ollama is working according to this issue. However, it's unclear if this solution applies to your situation as the context does not provide enough information.

I hope this information is helpful. If you have any more details or if there's anything else you'd like to know, please feel free to ask.

Sources

About Dosu

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.

@jafrank88
Author

I am not having any problems reaching Dify, so I don't think it is a container issue, but wanted to raise it in case I am wrong.

@crazywoola
Member

crazywoola commented Feb 23, 2024

@jafrank88 https://github.com/ollama/ollama/blob/main/docs/faq.md#setting-environment-variables-on-windows Please try these docs to see if the issue persists.

On Windows, Ollama inherits your user and system environment variables.
First Quit Ollama by clicking on it in the task bar
Edit system environment variables from the control panel
Edit or create New variable(s) for your user account for OLLAMA_HOST, OLLAMA_MODELS, etc.
Click OK/Apply to save
Run ollama from a new terminal window
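The steps above can be sketched from a terminal as well. This is a minimal, unverified sketch assuming Ollama's default port 11434 and Docker Desktop (where host.docker.internal resolves to the Windows host); setx is the standard Windows command for writing a persistent user environment variable.

```shell
# On the Windows host, make Ollama listen on all interfaces rather than
# only 127.0.0.1, so a Docker container can reach it. Quit and restart
# Ollama from a new terminal afterwards, since setx does not affect
# already-running processes.
setx OLLAMA_HOST "0.0.0.0"

# From the host, confirm Ollama is reachable (it should answer with a
# short "Ollama is running" message):
curl http://localhost:11434

# Then, in Dify's Ollama provider settings, point the Base URL at the
# host alias instead of localhost:
#   http://host.docker.internal:11434
```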
