What happened?
Since updating to 1.72.x, I've encountered SSL errors when connecting to Azure AI models. (Everything worked fine before 1.69.)
Attempts to bypass verification using:
litellm.client_session = httpx.Client(verify=False)
litellm.aclient_session = httpx.AsyncClient(verify=False)
solves the problem for Azure OpenAI models, but it does not work for the Mistral and DeepSeek async clients (the sync clients work perfectly).
Error:
SSLCertificateVerificationError: unable to get local issuer certificate
Adding litellm.ssl_verify = False leads to a new error about an invalid input format:
Error:
BadRequestError: invalid input error - Input should be a valid dictionary/object, received string.
The problem seems to affect models other than OpenAI's Azure models.
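For context, passing verify=False to an httpx client disables TLS verification via the standard-library ssl module. The sketch below shows the equivalent stdlib configuration; it is an illustration of what the workaround above does under the hood, not an exercise of litellm itself, and whether litellm's Mistral/DeepSeek async paths actually pick up the injected client is exactly what this issue questions.

```python
import ssl

def unverified_context() -> ssl.SSLContext:
    """Build an SSLContext with verification disabled,
    mirroring what httpx does for verify=False."""
    ctx = ssl.create_default_context()
    # check_hostname must be disabled before verify_mode can be CERT_NONE
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    return ctx

# Such a context could be passed to httpx.AsyncClient(verify=ctx),
# which is the same effect as verify=False.
ctx = unverified_context()
```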
Are you a ML Ops Team?
Yes
What LiteLLM version are you on ?
v1.72.2
Mazyod