LiteLLM Example Configs #1038
Replies: 12 comments 23 replies
-
ahhhh yes ty!
-
this is amazing @justinh-rahb - if you have any feedback for how we can improve our own docs for this, let me know - https://docs.litellm.ai/docs/
-
No, in v0.1.115 (the latest version) Open WebUI still cannot make any Claude 3 model work through LiteLLM, unless you upgrade LiteLLM to the latest version, v1.34.12:
Replace litellm==1.30.7 with litellm==1.34.12 in the ./backend/requirements.txt file.
Then build and use the locally created Docker image ghcr.io/open-webui/open-webui:latest. After that you can use the Claude 3 models added via LiteLLM.
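The version pin described above can be sketched as follows. The file path and version numbers come from the thread; the throwaway file setup is only for illustration (in a real open-webui checkout you would run just the sed line against the existing file, then rebuild):

```shell
# Illustrative only: recreate the relevant requirements line on a scratch copy.
mkdir -p backend
printf 'litellm==1.30.7\n' > backend/requirements.txt

# Swap the pinned LiteLLM version, as suggested in the thread
sed -i 's/^litellm==1\.30\.7$/litellm==1.34.12/' backend/requirements.txt
cat backend/requirements.txt

# Then rebuild the local image from the repo root, e.g.:
# docker build -t ghcr.io/open-webui/open-webui:latest .
```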
-
I deleted the local open-webui images; after restarting with ghcr.io/open-webui/open-webui:main, I can confirm that the newly added Claude 3 models run correctly.
Your response and reminder are greatly appreciated. Thank you. @justinh-rahb
-
Thanks a lot for the elements you shared. I am facing a difficulty in setting up Open WebUI's access to LiteLLM.
-
Hi, thanks for the support! But I'm not clear on how to set it up in Open WebUI. Is there a step-by-step tutorial somewhere? :)
-
How do I get the LiteLLM API key? What is the LiteLLM API base URL? Are we supposed to install LiteLLM before adding values here? How do I install it and connect it to Open WebUI? What value should I give for RPM?
-
Dear all,
If I understand correctly, Open WebUI comes with its own LiteLLM; it is built in.
Perhaps they could eventually allow referencing an external LiteLLM, which would be better practice.
François
-
I recently updated Open WebUI and tried to add my Azure API key and endpoint, but with no results. The LiteLLM option that used to be there is nowhere to be seen.
-
Please have a look at the new version (> 0.2) and use the environment variables:
https://docs.openwebui.com/getting-started/env-configuration#litellm
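With LiteLLM now external, one way to wire the two together is via Open WebUI's OpenAI-compatible connection variables. A minimal docker-compose sketch, assuming a LiteLLM proxy reachable at http://litellm:4000 (the URL and key here are illustrative, not from the thread):

```yaml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      # Point the OpenAI-compatible connection at the LiteLLM proxy
      - OPENAI_API_BASE_URL=http://litellm:4000
      - OPENAI_API_KEY=sk-1234  # whatever key your proxy expects
```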
On Thu, Jun 6, 2024, 09:24, JuanjoJG-WATA wrote:
I made it work after a few tweaks. @justinh-rahb @krrishdholakia @tjbck
First I installed LiteLLM as Krish said in the comment above this one:
pip install 'litellm[proxy]'
Then I changed directory to open-webui, created a YAML file called config.yaml, and filled it with the first 6 lines of this example:
https://docs.litellm.ai/docs/proxy/quick_start#create-a-config-for-litellm-proxy
Then I ran the .yaml with:
litellm --config open-webui/config.yaml
In the Open WebUI interface I went to Settings -> Connections and made a new OpenAI API connection with http://localhost:4000, and also tried the proper IP followed by the port to make sure it worked remotely.
I closed both the YAML process and Open WebUI and ran them again to commit the changes.
And voilà, it's working.
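For reference, the config.yaml from the LiteLLM quick start linked above has roughly this shape; the model name, deployment, and credential fields are placeholders from the docs, not values from this thread:

```yaml
model_list:
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: azure/<your-deployment-name>
      api_base: <your-azure-api-base>
      api_key: <your-azure-api-key>
```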
-
I just realized that this is a minor step back. It shouldn't have been removed just yet. I had Anthropic and OpenAI added as LiteLLM models, and it worked great for both image and text; however, that's not working anymore. I now have three options:
Honestly, all three don't sound great haha
-
Caution
NONE OF THE BELOW IS RELEVANT ANYMORE, LITELLM HAS BEEN REMOVED FROM OPEN WEBUI
https://docs.openwebui.com/migration#migrating-from-internal-to-external-litellm
Anthropic
Model strings:
claude-2
claude-2.1
claude-instant-1.2
claude-3-sonnet-20240229
claude-3-opus-20240229
claude-3-haiku-20240307
Claude 2.1
Claude 3 "Sonnet"
Claude 3 "Opus"
Groq
Model strings:
mixtral-8x7b-32768
llama2-70b-4096
Mixtral 8x7B
Llama2 70B
Google Gemini
Model strings:
gemini-pro
Gemini Pro
Mistral
Model strings:
open-mistral-7b (aka mistral-tiny-2312)
open-mixtral-8x7b (aka mistral-small-2312)
mistral-small-latest (aka mistral-small-2402)
mistral-medium-latest (aka mistral-medium-2312)
mistral-large-latest (aka mistral-large-2402)
Open Mixtral 8x7B (formerly mistral-small)
Mistral Medium
Mistral Large
Azure OpenAI
Model strings:
gpt35turbo
gpt4
GPT 3.5 Turbo
OpenAI
Model strings:
gpt-3.5-turbo
gpt-4
gpt-4-turbo-preview
gpt-4-vision-preview
GPT 4 Turbo
"OpenAI-compatible" endpoints
Ollama
Llama2
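As a rough sketch of how the model strings above slot into a LiteLLM proxy config: each entry maps a display alias to a provider model string. The aliases and env-var names below are my own choices for illustration, not from the thread:

```yaml
model_list:
  - model_name: claude-3-opus        # alias shown to Open WebUI
    litellm_params:
      model: claude-3-opus-20240229  # Anthropic model string from the list above
      api_key: os.environ/ANTHROPIC_API_KEY
  - model_name: mixtral-8x7b
    litellm_params:
      model: groq/mixtral-8x7b-32768 # Groq model string, routed via the groq/ prefix
      api_key: os.environ/GROQ_API_KEY
```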