Description
What happened?
I was using v0.2.x for an agentic implementation. The agent interacting with the LLM expected an `llm_config`, which had an option to skip SSL verification via a custom `http_client`. In v0.4.x, model client initialization no longer has an option for skipping verification.
To Reproduce
In v0.2, SSL verification can be skipped by passing a custom `http_client` in `llm_config`:

```python
import httpx

class MyHttpClient(httpx.Client):
    def __deepcopy__(self, memo):
        # llm_config is deep-copied internally; return self so the
        # live client (with verify=False) survives the copy.
        return self

llm_config = {
    "config_list": [
        {
            "model": "some-model",
            "api_key": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
            "base_url": "https://genai-api.example.com/v1",
            "http_client": MyHttpClient(verify=False),
        }
    ],
}
```
In v0.4, the model client is initialized as below, with no way to supply an `http_client`:

```python
model_client = OpenAIChatCompletionClient(
    model=model,
    api_key=api_key,
    base_url=base_url,
    model_capabilities={"vision": True, "function_calling": True, "json_output": True},
)
```
Expected behavior
In v0.4, `OpenAIChatCompletionClient` should accept an `http_client` option to cover what `llm_config` handled in v0.2. Sample below:
```python
class MyHttpClient(httpx.Client):
    def __deepcopy__(self, memo):
        return self

model_client = OpenAIChatCompletionClient(
    model=model,
    api_key=api_key,
    base_url=base_url,
    model_capabilities={"vision": True, "function_calling": True, "json_output": True},
    http_client=MyHttpClient(verify=False),
)
```
Screenshots
Below is the error:

```
httpx.ConnectError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1006)
```
Additional context
We need to bypass SSL verification to integrate a custom model with our agent implementation from a local machine.
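For context, `verify=False` in httpx effectively disables both hostname checking and certificate validation on the underlying `ssl.SSLContext`. An equivalent context can be built by hand with only the standard library (a sketch; the exact context httpx builds internally may differ):

```python
import ssl

# Build a context roughly equivalent to what verify=False produces:
# no hostname check and no certificate validation.
# Note: check_hostname must be disabled before verify_mode is relaxed,
# or ssl raises a ValueError.
ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE

# httpx also accepts such a context via Client(verify=ctx), which is
# useful when finer control (e.g. trusting a private CA bundle) is
# preferable to disabling verification outright.
print(ctx.verify_mode == ssl.CERT_NONE)  # True
```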
Which packages was the bug in?
Python AgentChat (autogen-agentchat>=0.4.0)
AutoGen library version.
Python 0.4.7
Other library version.
No response
Model used
No response
Model provider
None
Other model provider
No response
Python version
None
.NET version
None
Operating system
None