Is your feature request related to a problem? Please describe.
Right now, if an agent's `llm_config` is set to `None` (the default in `ConversableAgent` and `AssistantAgent`), or if `config_list` is empty:
```python
from autogen import UserProxyAgent, AssistantAgent

# assistant = AssistantAgent("assistant", llm_config={"config_list": []})
assistant = AssistantAgent("assistant")
proxy = UserProxyAgent("user")
proxy.initiate_chat(assistant)
```
we get a confusing error message from the OpenAI client:

```
TypeError: Missing required arguments; Expected either ('messages' and 'model') or ('messages', 'model' and 'stream') arguments to be given
```
Describe the solution you'd like
We should raise an error early if `llm_config` is `None` or `config_list` is empty, since this is guaranteed to fail later because no model is specified.
Alternatively, we could set the default `llm_config` to `False` in `ConversableAgent` and make `llm_config` a required argument in `AssistantAgent`, since the current default of `None` always causes an error.
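A minimal sketch of the early check, assuming a hypothetical `validate_llm_config` helper called from `ConversableAgent.__init__`; the name and placement are illustrative, not AutoGen's actual API:

```python
def validate_llm_config(llm_config):
    """Fail fast when no model could ever be resolved from llm_config."""
    if llm_config is False:
        # Explicitly disabled: the agent makes no LLM calls, which is valid.
        return
    if llm_config is None:
        raise ValueError(
            "llm_config is None: pass an llm_config with a config_list, "
            "or set llm_config=False to disable LLM-based replies."
        )
    if not llm_config.get("config_list") and not llm_config.get("model"):
        raise ValueError(
            "llm_config has an empty config_list and no model specified."
        )
```

Raising `ValueError` at construction time points users at the misconfiguration directly, instead of the opaque `TypeError` surfacing from deep inside the OpenAI client.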
Additional context
Related:
#1210