What happened?
Describe the bug
I was using v0.2.x earlier for an agentic implementation. The agent interacting with the LLM expected an LLM_CONFIG, which had an option to skip SSL verification. In v0.4.x, the model client initialization does not provide an option to skip verification.
To Reproduce
To reproduce, configure the client in v0.2 as below:
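A minimal sketch of the v0.2-style configuration being described; the model name, key, and endpoint are placeholders, and the `http_client` entry assumes the config list forwards it to the underlying OpenAI client:

```python
import httpx
from autogen import AssistantAgent

# v0.2-style llm_config; model/key/endpoint are hypothetical placeholders.
# An httpx client with verify=False is passed through to skip SSL
# verification against a self-signed local endpoint.
llm_config = {
    "config_list": [
        {
            "model": "my-custom-model",
            "api_key": "sk-placeholder",
            "base_url": "https://localhost:8000/v1",
            "http_client": httpx.Client(verify=False),
        }
    ]
}

assistant = AssistantAgent(name="assistant", llm_config=llm_config)
```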
In v0.4, the model client is initialized as below.
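A sketch of the v0.4-style initialization in question (names and endpoint are placeholders); note there is no parameter here for supplying a custom HTTP client to disable SSL verification:

```python
from autogen_ext.models.openai import OpenAIChatCompletionClient

# v0.4-style client initialization; model/key/endpoint are hypothetical
# placeholders. Unlike the v0.2 llm_config, there is no option to pass
# a custom http client with verify=False.
model_client = OpenAIChatCompletionClient(
    model="my-custom-model",
    api_key="sk-placeholder",
    base_url="https://localhost:8000/v1",
)
```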
Expected behavior
In v0.4, the OpenAIChatCompletionClient module is expected to offer an option, like llm_config in v0.2, to accept a custom http_client.
Sample below:
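A sketch of the requested API shape; `http_client` is hypothetical here (it is not an existing parameter of OpenAIChatCompletionClient), and the model name, key, and endpoint are placeholders:

```python
import httpx
from autogen_ext.models.openai import OpenAIChatCompletionClient

# Proposed (hypothetical) API: accept an http_client argument so SSL
# verification can be skipped, mirroring the v0.2 llm_config behavior.
model_client = OpenAIChatCompletionClient(
    model="my-custom-model",
    api_key="sk-placeholder",
    base_url="https://localhost:8000/v1",
    http_client=httpx.AsyncClient(verify=False),  # proposed option
)
```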
Screenshots
Below is the error:
Additional context
We need to bypass SSL verification for a custom model integration with our agent implementation when running from a local machine.
Which packages was the bug in?
Python AgentChat (autogen-agentchat>=0.4.0)
AutoGen library version.
Python 0.4.7
Other library version.
No response
Model used
No response
Model provider
None
Other model provider
No response
Python version
None
.NET version
None
Operating system
None