Skip SSL verification option missing while creating model client in v0.4.x #5795

Open
avinashmihani opened this issue Mar 3, 2025 · 2 comments · May be fixed by #5811

Comments

@avinashmihani

What happened?

Describe the bug
I was using v0.2.x earlier for an agentic implementation. The agent interacting with the LLM expected an llm_config, which had an option to skip SSL verification via a custom http_client. In v0.4.x, the model client initialization has no option to skip verification.

To Reproduce
To reproduce, use the following in v0.2:

import httpx

# Returning self from __deepcopy__ lets the same client instance be reused
# when the config is copied internally.
class MyHttpClient(httpx.Client):
    def __deepcopy__(self, memo):
        return self

llm_config = {
    "config_list": [
        {
            "model": "some-model",
            "api_key": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
            "base_url": "https://genai-api.example.com/v1",
            "http_client": MyHttpClient(verify=False),  # skip SSL verification
        }
    ],
}
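
For context, a minimal sketch of how this config was consumed by a v0.2 agent (the agent name is illustrative):

import autogen  # pyautogen v0.2

# The custom http_client with verify=False travels inside llm_config.
assistant = autogen.AssistantAgent(
    name="assistant",
    llm_config=llm_config,
)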

In v0.4, the model client is initialized as below.

from autogen_ext.models.openai import OpenAIChatCompletionClient

model_client = OpenAIChatCompletionClient(
    model=model,
    api_key=api_key,
    base_url=base_url,
    model_capabilities={"vision": True, "function_calling": True, "json_output": True},
)

Expected behavior
It is expected that v0.4 offers an option on OpenAIChatCompletionClient, similar to llm_config in v0.2, to supply a custom http_client.

Sample below:

import httpx
from autogen_ext.models.openai import OpenAIChatCompletionClient

class MyHttpClient(httpx.Client):
    def __deepcopy__(self, memo):
        return self

model_client = OpenAIChatCompletionClient(
    model=model,
    api_key=api_key,
    base_url=base_url,
    model_capabilities={"vision": True, "function_calling": True, "json_output": True},
    http_client=MyHttpClient(verify=False),  # proposed option to skip SSL verification
)

Screenshots
Below is the error:

httpx.ConnectError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1006)

Additional context
We need to bypass SSL verification when integrating a custom model endpoint with our agent implementation from a local machine.

Which package was the bug in?

Python AgentChat (autogen-agentchat>=0.4.0)

AutoGen library version.

Python 0.4.7

Other library version.

No response

Model used

No response

Model provider

None

Other model provider

No response

Python version

None

.NET version

None

Operating system

None

@ekzhu
Collaborator

ekzhu commented Mar 3, 2025

Thanks for the issue. I think we can fix this by adding http_client to the fields of BaseOpenAIClientConfiguration:

https://github.com/microsoft/autogen/blob/main/python/packages/autogen-ext/src/autogen_ext/models/openai/config/__init__.py#L35-L45
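
A rough sketch of the shape of that change, assuming the configuration remains a TypedDict and the value is forwarded to the underlying openai.AsyncOpenAI client (which already accepts an http_client argument); everything except the new field is abbreviated, and note the async SDK expects an httpx.AsyncClient rather than an httpx.Client:

# Hedged sketch of the proposed fix, not the actual autogen-ext code.
import httpx
from typing_extensions import TypedDict

class BaseOpenAIClientConfiguration(TypedDict, total=False):
    model: str
    api_key: str
    base_url: str
    http_client: httpx.AsyncClient  # new field, e.g. httpx.AsyncClient(verify=False)

# At client construction, the value would then be forwarded to the SDK, roughly:
#   openai.AsyncOpenAI(api_key=..., base_url=..., http_client=config.get("http_client"))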

@avinashmihani can you submit a PR to fix this?

@ekzhu ekzhu added this to the 0.4.x-python milestone Mar 3, 2025
@avinashmihani
Author

@ekzhu I have opened a PR with the change you mentioned.
