
[Bug]: SSL verification issues with Azure AI client starting from litellm 1.69 #11591

@AmineDjeghri

Description


What happened?

Since updating to 1.72.x, I've encountered SSL errors when connecting to Azure AI models (before 1.69 everything worked fine).
Attempts to bypass verification using:

litellm.client_session = httpx.Client(verify=False)  
litellm.aclient_session = httpx.AsyncClient(verify=False)

solves the problem for Azure OpenAI models, but does not work for the Mistral and DeepSeek async clients (the sync clients work perfectly).
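For context, passing verify=False to an httpx client amounts to using an unverified SSL context at the standard-library level. A minimal sketch (stdlib only, no litellm or httpx involved) of what that setting disables:

```python
import ssl

# Roughly what httpx.Client(verify=False) configures under the hood:
# an SSL context with hostname checking and certificate verification
# both turned off. check_hostname must be disabled first, or setting
# verify_mode to CERT_NONE raises a ValueError.
ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE

print(ctx.verify_mode == ssl.CERT_NONE)  # True
```

My guess (unverified) is that the async code path for the Mistral and DeepSeek providers constructs its own client instead of reusing litellm.aclient_session, so the verify=False setting above never reaches it.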

Error:

SSLCertificateVerificationError: unable to get local issuer certificate

Adding litellm.ssl_verify = False leads to a new error regarding invalid input format:

Error:

BadRequestError: invalid input error - Input should be a valid dictionary/object, received string.

The problem seems to affect models other than OpenAI's Azure models.

Are you a ML Ops Team?

Yes

What LiteLLM version are you on ?

v1.72.2
