
[Issue]: chat_messages usage in 2 agents with tools #2968

Closed
@kk2491

Description

Describe the issue

First of all, thank you all for making this awesome tool.

I recently started using autogen, and the use case I am trying to achieve is running a sequence of related queries, one after the other, in separate chats. I am using only two agents (a user proxy and an assistant).

For example, the first question would be: "create a resource in x". The assistant and the user proxy work together, use the tools, and create the resource. The second, follow-up question would be: "update the resource you just created". How can I provide the previous chat history so that the agents know the context?

As per the documentation, I see that chat_messages can be used to provide the previous conversation. However, I am not able to figure out exactly how to populate it; could you please help me here?
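For clarity on what I think the expected shape is: from reading the ConversableAgent source, chat_messages appears to map each peer agent object to a list of OpenAI-style message dicts. Here is a minimal, library-free sketch of that structure (FakeAgent and the messages are placeholders I made up, not autogen API):

```python
# Library-free sketch of the structure I believe chat_messages expects:
# a dict mapping each peer agent (the actual agent object) to a list of
# OpenAI-style message dicts with at least 'content' and 'role' keys.

class FakeAgent:
    """Stand-in for the real user_proxy object that would be the dict key."""
    def __init__(self, name):
        self.name = name

user_proxy = FakeAgent("User")

prior_history = [
    {"content": "create a resource in x", "role": "assistant"},
    {"content": "The resource has been created in x. "
                "The ID is 66729022a205c5a5975e76f8.", "role": "user"},
]

# The shape I am trying to pass as ConversableAgent(chat_messages=...):
chat_messages = {user_proxy: prior_history}

print(len(chat_messages[user_proxy]))  # 2 messages carried over
```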

I am using the Python script below:

import autogen  
from autogen.coding import LocalCommandLineCodeExecutor
from external_tools import ( 
    create_abc_resource,
    update_abc_resource,
)

def fetch_chat_history(chat_history):
    # Placeholder: intended to reformat the history before reuse;
    # for now it returns the list unchanged.
    # for each_chat in chat_history:
    return chat_history

def run_autogen():

    config_list = [
        {
            "model" : "llama3-70b-8192",
            "api_type" : "open_ai",
            "base_url" : "API_URL",
            "api_key" : "XYZ"
        }
    ]

    llm_config = {
        "timeout" : 600,
        "seed" : 42, 
        "config_list" : config_list,
        "temperature" : 0
    }

    chat_history_list = []

    while True:

        user_proxy = autogen.ConversableAgent(
            name="User",
            llm_config=False,
            is_termination_msg=lambda msg: msg.get("content") is not None and "TERMINATE" in msg["content"],
            human_input_mode="NEVER",
        )

        assistant = autogen.ConversableAgent(
            name="Assistant",
            system_message="You are a helpful assistant. "
            "You can help with the simple calculations. "
            "Return 'TERMINATE' when the task is done. ",
            llm_config=llm_config,
            chat_messages={user_proxy : chat_history_list}
        )

        assistant.register_for_llm(name="create_abc_resource", description="Create a resource in ABC")(create_abc_resource)
        assistant.register_for_llm(name="update_abc_resource", description="Update a resource in ABC given the id")(update_abc_resource)        
        
        user_proxy.register_for_execution(name="create_abc_resource")(create_abc_resource)
        user_proxy.register_for_execution(name="update_abc_resource")(update_abc_resource)

        question = input("Enter Query : ")
        chat_result = user_proxy.initiate_chat(
            assistant,
            message=question,
        )
        print("=== chat result ===")   
        print(chat_result) 

        current_chat_history = fetch_chat_history(chat_result.chat_history)
        chat_history_list = chat_history_list + current_chat_history

    return

if __name__ == "__main__":
    run_autogen()

After the first query, the chat_result output is as given below:

ChatResult(
	chat_id=None, 
	chat_history=[
		{'content': 'create a resource in x', 'role': 'assistant'},
		{'tool_calls': [{'id': 'call_fyzx', 'function': {'arguments': '{"input":{"name":"name"}}', 'name': 'create_abc_resource'}, 'type': 'function'}], 'content': None, 'role': 'assistant'},
		{'content': '{"id": "66729022a205c5a5975e76f8", "name": "name", "createdTimestamp": "2024-06-19T08:00:34.862Z", "lastUpdatedTimestamp": "2024-06-19T08:00:34.862Z"}', 'tool_responses': [{'tool_call_id': 'call_fyzx', 'role': 'tool', 'content': '{"id": "66729022a205c5a5975e76f8", "name": "name", "createdTimestamp": "2024-06-19T08:00:34.862Z", "lastUpdatedTimestamp": "2024-06-19T08:00:34.862Z"}'}], 'role': 'tool'},
		{'content': 'The resource has been created in x. The ID is 66729022a205c5a5975e76f8.', 'role': 'user'},
		{'content': '', 'role': 'assistant'},
		{'content': 'TERMINATE', 'role': 'user'}
	], 
	summary='', 
	cost={'usage_including_cached_inference': {'total_cost': 0, 'llama3-70b-8192': {'cost': 0, 'prompt_tokens': 3767, 'completion_tokens': 122, 'total_tokens': 3889}}, 'usage_excluding_cached_inference': {'total_cost': 0, 'llama3-70b-8192': {'cost': 0, 'prompt_tokens': 3767, 'completion_tokens': 122, 'total_tokens': 3889}}}, 
	human_input=[]
)
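For reference, this is the kind of filtering I have been experimenting with inside fetch_chat_history (a standalone sketch with made-up data; whether the tool_calls/tool messages can safely be replayed into a new chat is exactly what I am unsure about):

```python
def filter_history(chat_history):
    """Keep only plain text messages; drop tool_calls entries, tool
    responses, empty messages, and the TERMINATE marker, since I am
    not sure they replay cleanly into a new chat."""
    kept = []
    for msg in chat_history:
        content = msg.get("content")
        if "tool_calls" in msg or msg.get("role") == "tool":
            continue
        if not content or content.strip() == "TERMINATE":
            continue
        kept.append(msg)
    return kept

# Abbreviated version of the chat_history shown above:
history = [
    {"content": "create a resource in x", "role": "assistant"},
    {"tool_calls": [{"id": "call_fyzx", "type": "function"}],
     "content": None, "role": "assistant"},
    {"content": '{"id": "66729022a205c5a5975e76f8"}', "role": "tool"},
    {"content": "The resource has been created in x. "
                "The ID is 66729022a205c5a5975e76f8.", "role": "user"},
    {"content": "", "role": "assistant"},
    {"content": "TERMINATE", "role": "user"},
]

print(filter_history(history))  # keeps only the two plain text messages
```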

How can I populate the chat_messages field in the agents, so that the agents know the previous context when I ask a new question?

Thank you,
KK

Steps to reproduce

No response

Screenshots and logs

No response

Additional Information

No response


    Labels

    0.2 (Issues which are related to the pre 0.4 codebase), needs-triage, question (Further information is requested)
