Confusing error message when llm_config is not initialized or is None. #1522

@ekzhu

Description

Is your feature request related to a problem? Please describe.

Right now, if an agent's llm_config is set to None (the default in ConversableAgent and AssistantAgent), or if config_list is empty, initiating a chat fails with an unhelpful error. For example:

from autogen import UserProxyAgent, AssistantAgent

# assistant = AssistantAgent("assistant", llm_config={"config_list": []})
assistant = AssistantAgent("assistant")
proxy = UserProxyAgent("user")

proxy.initiate_chat(assistant)

We get a confusing error message from the OpenAI client:

TypeError: Missing required arguments; Expected either ('messages' and 'model') or ('messages', 'model' and 'stream') arguments to be given
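For reference, a configuration that avoids this error supplies at least one entry with a model. The model name and API key below are placeholders, not values from this issue:

```python
# Placeholder values; substitute a real model name and API key.
llm_config = {
    "config_list": [
        {"model": "gpt-4", "api_key": "sk-..."}
    ]
}
```

Passing this dict as `llm_config=llm_config` to AssistantAgent gives the client the model it needs.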

Describe the solution you'd like

We should raise an error early when llm_config is None or config_list is empty, since the chat is guaranteed to fail later because no model is specified.

Alternatively, we could default llm_config to False in ConversableAgent and make llm_config a required argument in AssistantAgent, since the current default of None always leads to an error.

Additional context

Related:

#1210
