Closed as not planned
What happened?
litellm/litellm:v1.59.8-stable (and older)
webui -> /health Models
Incorrect base_url is used when checking health status for deepseek models: the 401 in the log below is an OpenAI error that points to https://platform.openai.com/account/api-keys, so the health check appears to hit the OpenAI endpoint instead of DeepSeek's. Calling the same model from code (normal completion requests) works correctly.
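For comparison, a minimal sketch of the path that does work (assumes DEEPSEEK_API_KEY is set in the environment; the prompt is illustrative):

import litellm

# A direct completion against the same model succeeds, which suggests the
# deepseek base_url is resolved correctly outside of the health check.
# Sketch only: assumes DEEPSEEK_API_KEY is exported in the environment.
response = litellm.completion(
    model="deepseek/deepseek-reasoner",
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)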
Relevant log output
"unhealthy_endpoints": [
{
"custom_llm_provider": "deepseek",
"model": "deepseek/deepseek-reasoner",
"cache": {
"no-cache": true
},
"error": "litellm.AuthenticationError: AuthenticationError: DeepseekException - Error code: 401 - {'error': {'message': 'Incorrect API key provided: xxxxxxxxxxxxxxxxxxxx. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}\nHave you set 'mode' - https://docs.litellm.ai/docs/proxy/health#embedding-models\nstack trace: Traceback (most recent call last):\n File \"/usr/lib/python3.13/site-packages/litellm/llms/openai/openai.py\", line 770, in acompletion\n headers, response = await self.make_openai_chat_completion_request(\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n ...<4 lines>...\n )\n ^\n File \"/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/logging_utils.py\", line 131, in async_wrapper\n result = await func(*args, **kwargs)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/usr/lib/python3.13/site-packages/litellm/llms/openai/openai.py\", line 418, in make_openai_chat_completion_request\n raise e\n File \"/usr/lib/python3.13/site-packages/litellm/llms/openai/openai.py\", line 400, in make_openai_chat_completion_request\n await openai_aclient.chat.completions.with_raw_response.create(\n **data, timeout=timeout\n )\n File \"/usr/lib/python3.13/site-packages/openai/_legacy_response.py\", line 373, in wrapped\n return cast(LegacyAPIResponse[R],"
},
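The unhealthy entry above can be reproduced by querying the proxy's /health endpoint directly. A sketch, assuming the proxy listens on localhost:4000 and sk-1234 stands in for the master key (both values are placeholders, not from this report):

import requests

# Ask the proxy to health-check every configured model and print the
# failures. The URL and bearer token are placeholders for a local deployment.
resp = requests.get(
    "http://localhost:4000/health",
    headers={"Authorization": "Bearer sk-1234"},
    timeout=30,
)
for endpoint in resp.json().get("unhealthy_endpoints", []):
    print(endpoint["model"], "->", endpoint.get("error", "")[:120])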
Are you a ML Ops Team?
Yes
What LiteLLM version are you on ?
v1.59.8-stable
Twitter / LinkedIn details
No response