
Could not integrate LiteLLM into Dify via OpenAI-API-Compatible #4460

Closed
4 tasks done
tuanlv14 opened this issue May 16, 2024 · 2 comments
Labels
🐞 bug Something isn't working

Comments


Self Checks

  • This is only for bug reports; if you would like to ask a question, please head to Discussions.
  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • Please do not modify this template :) and fill in all the required fields.

Dify version

0.6.8

Cloud or Self Hosted

Cloud

Steps to reproduce

I use the OpenAI-API-Compatible provider to add my local LiteLLM proxy, with Base_URL = http://localhost:4000, the LiteLLM API key, and the model name (masked, proxied) by LiteLLM. LiteLLM itself works correctly when tested with curl and Python, but I cannot add the local LiteLLM endpoint to Dify.
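The curl/Python test mentioned above would look roughly like this minimal sketch of an OpenAI-compatible request against a LiteLLM proxy. The API key and model name are placeholders (the real ones are masked in the report), and the default proxy port 4000 is assumed:

```python
import json
import urllib.request

BASE_URL = "http://localhost:4000"  # assumption: default LiteLLM proxy port
API_KEY = "sk-placeholder"          # placeholder for the LiteLLM proxy key
MODEL = "my-proxied-model"          # placeholder for the masked/proxied model name

def build_payload(model, prompt):
    """Build the OpenAI-compatible chat completion payload LiteLLM expects."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def chat_completion(base_url, api_key, model, prompt):
    """POST to the proxy's /chat/completions endpoint and return the reply text."""
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(build_payload(model, prompt)).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    # Requires a LiteLLM proxy actually listening on BASE_URL.
    print(chat_completion(BASE_URL, API_KEY, MODEL, "Hello"))
```

If this succeeds from the host but Dify still reports a connection error, the problem is network reachability from Dify's environment, not the proxy itself.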

✔️ Expected Behavior

I expect LiteLLM to work well with Dify.

❌ Actual Behavior

The error: An error occurred during credentials validation: HTTPConnectionPool(host='localhost', port=4000): Max retries exceeded with url: /chat/completions (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f966023a4d0>: Failed to establish a new connection: [Errno 111] Connection refused'))

dosubot bot commented May 16, 2024

To resolve the issue where Dify cannot connect to LiteLLM running on localhost, follow these steps:

  1. Change LiteLLM Binding: Ensure LiteLLM is bound to 0.0.0.0 instead of localhost to make it accessible over the network.

  2. Update Base URL in Dify: If Dify is running in Docker and LiteLLM is on the host, use http://host.docker.internal:4000 as the Base URL instead of localhost.

  3. Test Network Accessibility: From within the Dify Docker container, use curl to test connectivity to the LiteLLM service.

  4. Restart Both Services: After making these adjustments, restart LiteLLM and Dify.
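Steps 1–3 above can be sketched as shell commands. The container name and config path are assumptions about a typical Dify Docker Compose setup; list your actual containers with `docker ps`:

```shell
# Step 1: start the LiteLLM proxy bound to all interfaces, not just loopback,
# so it is reachable from inside Docker containers.
litellm --config config.yaml --host 0.0.0.0 --port 4000

# Step 3: from inside the Dify API container (name is an assumption),
# verify the proxy is reachable via the Docker host alias.
docker exec docker-api-1 curl -s http://host.docker.internal:4000/health
```

If the curl from inside the container fails while the same request works on the host, the binding or the hostname is the problem, not LiteLLM.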

This approach addresses the common issue where Docker containers cannot directly access services on the host using localhost. Adjusting the service binding and using host.docker.internal for Docker setups should help Dify connect to LiteLLM.
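The localhost-inside-Docker pitfall can be captured in a small illustrative helper (the function name is my own, not part of Dify or LiteLLM): inside a container, `localhost` resolves to the container itself, so a base URL pointing at a host-side service must be rewritten to `host.docker.internal`:

```python
from urllib.parse import urlparse, urlunparse

def rewrite_for_docker(base_url: str) -> str:
    """Rewrite a localhost base URL so it resolves to the Docker host.

    Inside a container, 'localhost' is the container itself; a service
    running on the host machine must be addressed as 'host.docker.internal'
    (built in on Docker Desktop; on Linux it needs an extra_hosts entry).
    """
    parts = urlparse(base_url)
    if parts.hostname in ("localhost", "127.0.0.1"):
        port = f":{parts.port}" if parts.port else ""
        parts = parts._replace(netloc=f"host.docker.internal{port}")
    return urlunparse(parts)
```

On Linux, `host.docker.internal` is not defined by default; adding `extra_hosts: ["host.docker.internal:host-gateway"]` to the Dify service in docker-compose.yml makes it resolve to the host gateway.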


@dosubot dosubot bot added the 🐞 bug Something isn't working label May 16, 2024

tuanlv14 commented Jun 1, 2024

I can confirm that your suggestion works.

@tuanlv14 tuanlv14 closed this as completed Jun 1, 2024