[Feature Request]: Resumable GroupChat #2359

Closed
Tracked by #2358
ekzhu opened this issue Apr 11, 2024 · 21 comments
Labels
enhancement (New feature or request), group chat (group-chat-related issues)

Comments

@ekzhu
Collaborator

ekzhu commented Apr 11, 2024

Is your feature request related to a problem? Please describe.

Currently, when a group chat terminates, there isn't a straightforward way to resume it, either by resuming the current GroupChat object or by creating a new GroupChat object and starting from the previous message history.

The GroupChat class already accepts messages in its constructor, so we can inject the previous message history. The pain point, however, is how to restart the conversation.

See discussion on this here: #2301
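
To illustrate the pain point, here is a minimal sketch (the agent names, llm_config, and message contents below are illustrative assumptions, not from this issue): the prior messages can be injected into GroupChat, but nothing replays them to the agents or continues from the last speaker.

from autogen import AssistantAgent, UserProxyAgent, GroupChat, GroupChatManager

llm_config = {"config_list": [{"model": "gpt-4"}]}  # illustrative; supply your own config

# Message history saved from an earlier, terminated group chat.
previous_messages = [
    {"role": "user", "name": "user_proxy", "content": "Plot the NVDA stock price for the last month."},
    {"role": "assistant", "name": "coder", "content": "Here is a plan and some starter code..."},
]

coder = AssistantAgent("coder", llm_config=llm_config)
user_proxy = UserProxyAgent("user_proxy", code_execution_config=False, human_input_mode="NEVER")

# The constructor happily accepts the prior messages...
groupchat = GroupChat(agents=[user_proxy, coder], messages=previous_messages, max_round=10)
manager = GroupChatManager(groupchat=groupchat, llm_config=llm_config)

# ...but the individual agents have empty chat histories, and there is no obvious
# call that replays previous_messages and continues from the last speaker.
user_proxy.initiate_chat(manager, message="Please continue.", clear_history=False)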

Describe the solution you'd like

No response

Additional context

No response

@ekzhu added the enhancement (New feature or request) and group chat (group-chat-related issues) labels on Apr 11, 2024
@marklysze
Collaborator

I'll start testing out methods to resume a GroupChat and publish any questions and findings here.

@hardchor
Collaborator

Check the implementation of ResumableGroupChatManager; it was incredibly helpful for my own use case

@marklysze
Collaborator

Check the implementation of ResumableGroupChatManager; it was incredibly helpful for my own use case

Thanks for noting that, @hardchor, I'll check it out! I take it it's from here.

@hardchor
Collaborator

Yes, sorry, I meant to include that link. There were two notable omissions from the example code in that blog:

  1. When broadcasting the message, it was checking whether the agent to broadcast to was self (i.e. the GroupChatManager) instead of the speaker
  2. It wasn't triggering the .send(...) method from the speaker's POV. I only found that out after hours of head scratching over why user.initiate_chat(...) wasn't returning the full history.

So, my full implementation is:

from typing import Dict, List, Optional
from autogen.agentchat import GroupChatManager, GroupChat


class ResumableGroupChatManager(GroupChatManager):
    def __init__(self, groupchat: GroupChat, history: Optional[List[Dict]] = None, **kwargs):
        super().__init__(groupchat, **kwargs)

        if history:
            self.restore_from_history(history)

    def restore_from_history(self, history) -> None:
        for message in history:
            # Broadcast the message to all agents except the speaker. This is the same way
            # GroupChat handles new messages in AutoGen; this method simply replays old messages first.
            speaker_name = message.get("name", self.groupchat.admin_name)
            if speaker_name is None:
                raise ValueError("Speaker name is missing in the message and no admin name is set in the group chat")
            speaker = self.groupchat.agent_by_name(name=speaker_name)
            if speaker is None:
                raise ValueError(f"Speaker {speaker_name} not found in the group chat")

            self.groupchat.append(message, speaker)

            for agent in self.groupchat.agents:
                if agent != speaker:
                    self.send(message, agent, request_reply=False, silent=False)

            speaker.send(message, self, request_reply=False)

To get the results, I then call

group_chat_manager = ResumableGroupChatManager(
    ....
    history=history,
)

chat_result = initiating_agent.initiate_chat(..., clear_history=False)
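
For context, a fuller version of that call might look like the sketch below; the agents, llm_config, and the contents of history are placeholder assumptions, not taken from this thread.

from autogen import AssistantAgent, UserProxyAgent, GroupChat

llm_config = {"config_list": [{"model": "gpt-4"}]}  # illustrative only

coder = AssistantAgent("coder", llm_config=llm_config)
reviewer = AssistantAgent("reviewer", llm_config=llm_config)
user = UserProxyAgent("user", code_execution_config=False, human_input_mode="NEVER")

groupchat = GroupChat(agents=[user, coder, reviewer], messages=[], max_round=12)

# Message dicts saved from the earlier run (e.g. loaded from a database).
history = [
    {"role": "user", "name": "user", "content": "Write a short report on resumable group chats."},
    {"role": "assistant", "name": "coder", "content": "Draft report: ..."},
]

group_chat_manager = ResumableGroupChatManager(
    groupchat=groupchat,
    llm_config=llm_config,
    history=history,
)

# clear_history=False keeps the replayed messages in the initiating agent's context.
chat_result = user.initiate_chat(
    group_chat_manager,
    message="Please continue from where we left off.",
    clear_history=False,
)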

@marklysze
Collaborator

marklysze commented Apr 30, 2024

Thanks for detailing your implementation, @hardchor :)... very helpful...

I've been testing a bit with the code from your previous link, with amendments. It was helpful having the code, and your updated code will also be helpful.

At this stage I'm testing without creating a new class, but either a function on an existing class or a new class, as you've done, may be the way to go...

I'll update with my findings and code once I've tested it in a few group chat scenarios...

@ekzhu
Collaborator Author

ekzhu commented Apr 30, 2024

Thank you @hardchor ! This is very helpful.

@marklysze
Collaborator

marklysze commented May 1, 2024

I've tested some code with the GroupChat notebooks in the documentation and I think the following works to resume GroupChats. I believe creating the necessary resume function shouldn't be too difficult. Thanks again to @hardchor for your code!

Here's the approach I took to test:

  1. Take the notebook code as is and reduce the max_rounds to a point that we can then resume from (e.g. 2 rounds)
  2. After the notebook code, we save the messages from the GroupChat object. This is our saved state.
# --- Copy state out of original chat, it's just the messages
messages_og = []
for message in groupchat.messages:
    messages_og.append(message)
  3. We then delete and recreate the whole chat to ensure all objects are new and in their newly created state
# Clear out all the objects so we can be sure we're not using them. E.g.
del user_proxy
del engineer
del scientist
del planner
del executor
del critic
del groupchat
del manager

<Paste in the original code where agents, group chat, and manager are created>
  4. Restore the state into the new objects. This comprises putting the saved messages into the agents and getting the last speaking agent
# ---- Bring previous state into new chat

# Put previous messages into new agents
last_speaker_name = ""
last_message = ""
for message in messages_og:

    message_speaker_agent = groupchat.agent_by_name(message["name"])

    # Add previous messages to each agent (except their own messages)
    for agent in groupchat.agents:
        if agent.name != message["name"]:
            manager.send(message, groupchat.agent_by_name(agent.name), request_reply=False, silent=True)

    # Add previous message to the new groupchat
    groupchat.append(message, message_speaker_agent)

    # Last speaker agent
    last_speaker_name = message["name"]

    # Last message to check for termination (we could avoid this by ignoring termination check for resume in the future)
    last_message = message

# Get last speaker as agent
previous_last_agent = groupchat.agent_by_name(name=last_speaker_name)
  5. Cater for termination. This isn't something the final code would need, but at the moment the run_chat function runs the termination check right away, so if the last message satisfies the termination function the chat won't resume. So, for testing, we check for and remove the termination keyword.
# Check if the last message contains termination
if manager._is_termination_msg(last_message):
    # Remove termination
    print("Last message contains termination, removing termination keyword")
    messages_og[-1]["content"] = last_message["content"].replace("TERMINATE", "")

  6. Resume the chat
manager.run_chat(messages=messages_og, config=groupchat, sender=previous_last_agent)

This worked with all notebooks (tweaking the termination check if needed). Resumption picked up code and ran it, continued conversations, etc. I tested on my own code as well (debating) and it continued the conversation okay.

Here's a summary of the tests and links to the Python files and outputs

| Notebook | Notes | Files |
| --- | --- | --- |
| agentchat_groupchat_customized | Resumes, picks up the previous code issue, and fixes it. | Code, Output |
| agentchat_groupchat_finite_state_machine | Resumes with the history of the number of chocolates from the original chat. Correctly carries out the sequence and counts until the end. | Code, Output |
| agentchat_groupchat_RAG | Tested 3 versions (RAG, NORAG, RAG CHAT); resumes with messages going to the LLM containing the history. For RAG/NORAG, terminates with the next message as the task was already completed in the first section. For RAG CHAT, picked up the suggested function call from the original chat and executed it in the resumed chat. | Code, NoRAG Output, RAG Output, RAG Chat Output |
| agentchat_groupchat_research | Resumes and carries across the code generated by one agent, which is then executed by another agent. | Code, Output |
| agentchat_groupchat_stateflow | Resumes and carries across the generated code and executes it. | Code, Output |

Here's a notebook with the code in it


At this stage, I believe it's possible that with just the message history we can resume the group chat. This could be handled with run_chat or a new resume_chat function that operates in a similar way.

It would need to take in the messages, assign the messages to the agents, ignore the initial termination check, and include all the validations to make sure the agents in the messages exist, etc. See steps 2 and 4 above.
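
As a rough illustration, a method on GroupChatManager along these lines could cover steps 4 and 5 above; the class name, method name, and the remove_termination_string parameter below are hypothetical, not an agreed design.

from typing import Dict, List, Optional, Tuple

from autogen import Agent, GroupChatManager


class GroupChatManagerWithResume(GroupChatManager):
    def resume(
        self, messages: List[Dict], remove_termination_string: Optional[str] = None
    ) -> Tuple[Agent, Dict]:
        """Replay a saved message history, returning the last speaker and last message."""
        if not messages:
            raise ValueError("No messages to resume from")

        for i, message in enumerate(messages):
            speaker = self.groupchat.agent_by_name(message.get("name", ""))
            if speaker is None:
                raise ValueError(f"Agent {message.get('name')!r} is not in this group chat")

            # Optionally strip the termination keyword from the last message so the
            # built-in termination check does not end the chat straight away.
            if i == len(messages) - 1 and remove_termination_string:
                message["content"] = message["content"].replace(remove_termination_string, "")

            # Replay: record the message in the group chat and broadcast it to the other agents.
            self.groupchat.append(message, speaker)
            for agent in self.groupchat.agents:
                if agent != speaker:
                    self.send(message, agent, request_reply=False, silent=True)

        return speaker, message

The caller could then continue from the returned speaker, for example with something like last_agent.initiate_chat(manager, message=last_message["content"], clear_history=False).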

Happy to continue the conversation from here...

@ekzhu
Collaborator Author

ekzhu commented May 1, 2024

Thanks @marklysze for the update. The test results are very useful! I have two questions:

  1. If we were to add an API for resuming the group chat, should this be a method on GroupChat or GroupChatManager?
  2. In your notebook example: https://github.com/marklysze/AutoGenCodeTesting/blob/master/groupchat_resume/agentchat_groupchat_research_resume.ipynb, the group chat is restarted using the run_chat method in the GroupChatManager. Is there a way to start a group chat through initiate_chat but with previous state in the group chat? The reason I ask is because initiate_chat has the functionalities to perform summarization on the chat history, so it might be useful to be able to do that.

For termination message removal, I think this can be part of the API for resuming a group chat, and the caller can specify what to do with the termination messages, as in some cases the termination message may not be "TERMINATE". If the user does not specify what to do with the termination messages and the history contains one, we can raise a warning about the existence of termination messages in the history and note that the group chat might terminate early.

@marklysze
Collaborator

Thanks @ekzhu...

  1. If we were to add an API for resuming the group chat, should this be a method on GroupChat or GroupChatManager?

Logically I would initially say GroupChat; however, there's a case for consistency with resuming other, non-group, types of chats, so perhaps agent-side resuming (e.g. on GroupChatManager) is better?

A lot of chats are also initiated through an agent, so resuming through an agent makes it feel similar. I also like that the manager agent is responsible for resuming, which feels more agentic (?).

The one caveat here, though, is I don't think we want the user to resume a group chat through any agent other than the group chat manager.

  2. In your notebook example: https://github.com/marklysze/AutoGenCodeTesting/blob/master/groupchat_resume/agentchat_groupchat_research_resume.ipynb, the group chat is restarted using the run_chat method in the GroupChatManager. Is there a way to start a group chat through initiate_chat but with previous state in the group chat? The reason I ask is because initiate_chat has the functionalities to perform summarization on the chat history, so it might be useful to be able to do that.

Let me check that out, I haven't used summarisation so I'll have a look at that, too. If it makes sense to use initiate_chat as a basis for resuming, we can base the resume function on that rather than run_chat.

For termination message removal, I think this can be part of the API for resuming a group chat, and caller can specify what to do with the termination messages, as in some cases the termination message may not be "TERMINATE". If user does not specify what to do with the termination messages, and the history contains it, we can raise a warning to notify the existence of termination messages in the history and warn that the group chat might be terminated early.

Yep, sounds fair.

@ekzhu
Collaborator Author

ekzhu commented May 1, 2024

Sounds like we can have a GroupChatManager.recover method to recover from a previous chat history.

@Mai0313
Collaborator

Mai0313 commented May 2, 2024

For #2359 (comment)

It seems reasonable enough, but when I tried something similar with my own project at the company, I had to consider the token limit for GPT-3.5/GPT-4.

A few months ago, I saved all conversations into Redis. After doing so, I realized that this approach might either reach the token limit or consume a significant number of tokens. While I'm not concerned about the tokens spent, I do want to avoid hitting the limit.

For instance, with a lengthy conversation, it's not clear to me how you would segment it into manageable chunks.

In my project, I discovered two methods to circumvent this issue (both sketched after this list):

  1. Introduce a summary agent to condense the conversation before saving it to the database.
  2. Use [:TOKEN_LIMIT] to simply split the string (this is crude but straightforward).
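
A rough sketch of both options follows; the summary_agent system message, the TOKEN_LIMIT value, and the helper names are illustrative placeholders.

from typing import Dict, List

from autogen import AssistantAgent

TOKEN_LIMIT = 4000  # placeholder budget (counted in characters here for simplicity)

# Option 1: a summary agent that condenses the conversation before it is saved.
summary_agent = AssistantAgent(
    "summary_agent",
    system_message="Summarize the conversation you are given in a few sentences.",
    llm_config={"config_list": [{"model": "gpt-3.5-turbo"}]},  # illustrative
)

def condense_before_saving(messages: List[Dict]) -> str:
    transcript = "\n".join(f"{m.get('name', m['role'])}: {m['content']}" for m in messages)
    reply = summary_agent.generate_reply(messages=[{"role": "user", "content": transcript}])
    return reply if isinstance(reply, str) else (reply or {}).get("content", "")

# Option 2: simply slice the string to the budget (crude but straightforward).
def crude_truncate(text: str) -> str:
    return text[:TOKEN_LIMIT]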

@marklysze
Collaborator

@Mai0313, thanks for highlighting that, it's definitely something I've been encountering in my general use of LLMs and particularly with long conversations.

The summary agent sounds interesting; does that condense the live chat messages, or condense them for saving only?

I assume managing the total conversation length/tokens is going to be an issue independent of the resuming functionality?

@ekzhu
Collaborator Author

ekzhu commented May 2, 2024

Long context handling inside a group chat is definitely something we need to add to the library. For a single agent, the current way to do it is to use the transform message capability: https://microsoft.github.io/autogen/docs/topics/long_contexts
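
For reference, the linked page applies the capability to a single agent roughly like this; the assistant agent and the limits are assumptions for the example.

from autogen import AssistantAgent
from autogen.agentchat.contrib.capabilities import transform_messages, transforms

assistant = AssistantAgent("assistant", llm_config={"config_list": [{"model": "gpt-4"}]})

context_handling = transform_messages.TransformMessages(
    transforms=[
        transforms.MessageHistoryLimiter(max_messages=10),  # keep only the last 10 messages
        transforms.MessageTokenLimiter(max_tokens=1000),    # cap the tokens sent to the LLM
    ]
)
context_handling.add_to_agent(assistant)  # applied before each of this agent's LLM calls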

Currently the capability does not work for GroupChat or GroupChatManager, as group chat speaker selection uses all messages. However, @marklysze has made a recent change to convert the speaker selection into a two-agent chat, so we can potentially apply the transformation capability to the agents in that inner chat.

cc @WaelKarkoub for awareness

@marklysze
Collaborator

Ah, that transform message capability is very interesting...

Being able to transform messages for the select speaker inner chat would be very useful. Let me know if you have any specific approaches and I'm happy to incorporate.

In line with that, in my separate, and much earlier, testing I found that using an LLM to summarise each message was also a good way of maintaining the context of the message while significantly reducing its length. I'm not sure whether introducing an LLM-based TransformMessages capability would work though, as it appears to run for the full set of messages each time (which will reach token limits).

So, how about an option to summarise each message when it comes in, by sending it back to the LLM prefixed with a summarise prompt, and storing it alongside the content (e.g. a message has both "content" and "summarised")? This would be similar to the reflection_with_llm summary method but on an individual message instead of the whole conversation.
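
As a hedged sketch of that idea: a custom transform could summarise each message once and cache the result, so repeated transform runs don't re-hit the LLM for the same message. The class, cache, and prompt below are hypothetical, not an existing AutoGen transform.

import copy
from typing import Dict, List, Tuple

from autogen import OpenAIWrapper


class PerMessageSummarizer:
    """Summarise each message individually, calling the LLM only once per unique message."""

    def __init__(self, llm_config: dict, prompt: str = "Summarise this message in one sentence:"):
        self._client = OpenAIWrapper(**llm_config)
        self._prompt = prompt
        self._cache: Dict[str, str] = {}  # original content -> summary

    def apply_transform(self, messages: List[Dict]) -> List[Dict]:
        messages = copy.deepcopy(messages)
        for message in messages:
            content = message.get("content")
            if not isinstance(content, str) or not content:
                continue  # skip tool calls and other non-text messages
            summary = self._cache.get(content)
            if summary is None:  # only hit the LLM the first time we see this message
                response = self._client.create(
                    messages=[{"role": "user", "content": f"{self._prompt}\n\n{content}"}]
                )
                summary = response.choices[0].message.content
                self._cache[content] = summary
            message["content"] = summary
        return messages

    def get_logs(self, pre_transform_messages: List[Dict], post_transform_messages: List[Dict]) -> Tuple[str, bool]:
        return "Summarised messages individually.", pre_transform_messages != post_transform_messages

Such a transform could then be passed to transform_messages.TransformMessages(transforms=[...]) alongside the built-in ones.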

@WaelKarkoub
Collaborator

I'm currently reviewing the group chat implementation and noticed that it doesn't expose the inner agent externally, which could complicate integration with transform messages. Also, the roles of the group chat and group chat manager seem to have become blurred. Do you think it would be beneficial to refactor these components to better define their functions? I propose modeling the group chat primarily as a data store, with the group chat manager facilitating agent-to-agent communication.

A while back I suggested creating a protocol for an agent selector, #1791 (comment). With this setup, we could implement an AutoSelector that accepts an agent in the constructor. This would allow users to seamlessly equip the agent with the transform messages capability. What are everyone's thoughts on this approach?

In line with that, in my separate, and much earlier, testing I found that using an LLM to summarise each message was also a good way of maintaining the context of the message while significantly reducing its length. I'm not sure whether introducing an LLM-based TransformMessages capability would work though, as it appears to run for the full set of messages each time (which will reach token limits).

It depends on how you design the transform. For example, you can create a transform that summarizes only the last message, or only if the message reaches a certain token count. The design for the transform message is pretty flexible and should allow us to create any behavior we would like. I'm more than happy to collaborate on adding a new transform that summarizes messages (or any other transform ideas you might have)

@ekzhu
Collaborator Author

ekzhu commented May 3, 2024

we could implement an AutoSelector that accepts an agent in the constructor. This would allow users to seamlessly equip the agent with the transform messages capability. What are everyone's thoughts on this approach?

Do you think we don't necessarily need to apply the capability concept here? It is basically the chat history we want to transform; can we just add a "message transformer" to the constructor?

@marklysze
Collaborator

I've been quite focused on auto group chat and need to expand my view on the architecture of the GroupChat. I know it's a topic that has been highlighted for discussion and I would like clarity around GroupChat vs GroupChatManager, as well.

I'm happy to press forward with the resuming functionality, e.g. GroupChatManager.recover/resume but equally fine with holding off if further discussion would change that implementation considerably.

For the message transformations on the select speaker messages - would be great to work together @WaelKarkoub. I'd be keen to try and keep LLM summarisation at a message-level occurring only once per message to minimise LLM-hits. Should I create an issue?

@WaelKarkoub
Collaborator

we could implement an AutoSelector that accepts an agent in the constructor. This would allow users to seamlessly equip the agent with the transform messages capability. What are everyone's thoughts on this approach?

Do you think we don't necessarily need to apply the capability concept here? It is basically the chat history we want to transform; can we just add a "message transformer" to the constructor?

@ekzhu interesting idea, I didn't think about it that way. My goal was to keep it consistent with how we use message transform, but I do like your approach better.

For the message transformations on the select speaker messages - would be great to work together @WaelKarkoub. I'd be keen to try and keep LLM summarisation at a message-level occurring only once per message to minimise LLM-hits. Should I create an issue?

@marklysze Yup sounds good 👍

@ekzhu
Collaborator Author

ekzhu commented May 3, 2024

I'm happy to press forward with the resuming functionality, e.g. GroupChatManager.recover/resume

Let's go forward with the functionality first before refining it.

I would like clarity around GroupChat vs GroupChatManager, as well.

The separation is certainly a bit unclear. At this stage, we still want to understand the use cases and functionalities, so let's keep it this way for the time being.

There is an experimental branch where we can explore improved architecture.

@marklysze
Collaborator

For the message transformations on the select speaker messages - would be great to work together @WaelKarkoub. I'd be keen to try and keep LLM summarisation at a message-level occurring only once per message to minimise LLM-hits. Should I create an issue?

@marklysze Yup sounds good 👍

I've created issue #2583 to address this separate capability and will work on the PR, thanks @WaelKarkoub for the chat around the implementation of this.

I'll work on the resuming group chat functionality for this issue and create a new PR when I have a first pass.

marklysze mentioned this issue May 8, 2024
@ekzhu
Collaborator Author

ekzhu commented May 13, 2024

This has been addressed, thanks @marklysze!

@ekzhu ekzhu closed this as completed May 13, 2024