
[Bug]: Registering a function (tool or function) wipes out any register_custom_client class in place #2930

Open
scruffynerf opened this issue Jun 12, 2024 · 1 comment
Labels
bug Something isn't working

Comments

scruffynerf commented Jun 12, 2024

Describe the bug

This is an edge case, but it exposes an incorrect assumption, and I'm unsure where it should be fixed: either in `update_tool_signature` (and `update_function_signature`, which has the same problem) or in the custom client setup process.

In conversable_agent.py, calling `register_function`(s) ends up calling `update_tool_signature`, whose last line is: `self.client = OpenAIWrapper(**self.llm_config)`

Normally that's fine, but if you've already set up a custom model and run `register_model_client()`, this wipes out the class setup, and at runtime it complains that the class isn't being used and asks you to fix that.

Registering a custom client is a two-step process. First, you add a pointer to the config, like so:
`{"model_client_cls": "CustomModelClient"}`

This creates a 'placeholder' in the agent's client, not an actual class in use (yet).

The class isn't actually applied until you run:

`assistant.register_model_client(model_client_cls=CustomModelClient)`

This replaces the placeholder with the class.
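The two-step flow can be sketched with a minimal, self-contained mock (hypothetical names; this is not the actual autogen/OpenAIWrapper internals, just an illustration of placeholder-then-class):

```python
class CustomModelClient:
    """Stands in for a user-defined model client class."""


class MockWrapper:
    def __init__(self, **llm_config):
        # Step 1: the config only names the class as a string, so the
        # wrapper holds a placeholder, not a usable client.
        self.client_cls = llm_config.get("model_client_cls")

    def register_model_client(self, model_client_cls):
        # Step 2: the placeholder is swapped for the real class.
        self.client_cls = model_client_cls


wrapper = MockWrapper(model_client_cls="CustomModelClient")
assert wrapper.client_cls == "CustomModelClient"  # still just a placeholder

wrapper.register_model_client(CustomModelClient)
assert wrapper.client_cls is CustomModelClient  # now the real class
```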

The bug: if you later run something like

`assistant.register_for_llm(name="calculator", description="A calculator tool that accepts nested expression as input")(calculator)`

or

`assistant.register_for_execution(name="calculator")(calculator)`

the CustomModelClient class is removed from the agent. (Rerunning `register_model_client` shows that's all that's needed to restore it.)

Why? Because of the assumption above: rebuilding the client via `OpenAIWrapper` from the llm_config resets everything back to the first step, restoring the placeholder.
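The reset can be reproduced with a self-contained mock (illustrative names only; `update_tool_signature` here just mimics the problematic `OpenAIWrapper(**self.llm_config)` rebuild, not autogen's real implementation):

```python
class CustomModelClient:
    pass


class MockAgent:
    def __init__(self, llm_config):
        self.llm_config = llm_config
        # Initial build from config: string placeholder only.
        self.client_cls = llm_config["model_client_cls"]

    def register_model_client(self, model_client_cls):
        self.client_cls = model_client_cls

    def update_tool_signature(self, tool):
        # ... tool is merged into llm_config here ...
        # Then the problematic rebuild from config, which can only
        # ever see the string, not the registered class:
        self.client_cls = self.llm_config["model_client_cls"]


agent = MockAgent({"model_client_cls": "CustomModelClient"})
agent.register_model_client(CustomModelClient)
assert agent.client_cls is CustomModelClient  # class is in place

agent.update_tool_signature({"name": "calculator"})
assert agent.client_cls == "CustomModelClient"  # back to the placeholder
```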

Workaround: always call `register_model_client()` LAST, and if you update or change the tool config, you may need to re-register.

Fix: either make the rebuild from the llm_config preserve the registered client class, OR change the custom client registration process to work solely from the llm_config, without the need for a second function call.
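One possible shape for the second suggested fix, as a self-contained sketch (hypothetical registry and function names, not a proposed patch to autogen): resolve the class purely from the llm_config, so a rebuild can never lose it.

```python
CLIENT_REGISTRY = {}  # maps class-name strings to class objects


def register_client_cls(cls):
    """Register a client class under its name so configs can refer to it."""
    CLIENT_REGISTRY[cls.__name__] = cls
    return cls


@register_client_cls
class CustomModelClient:
    pass


def build_client(llm_config):
    # Rebuilding from config is now harmless: the string placeholder
    # always resolves to the registered class, no second call needed.
    return CLIENT_REGISTRY[llm_config["model_client_cls"]]


llm_config = {"model_client_cls": "CustomModelClient"}
assert build_client(llm_config) is CustomModelClient
# A later rebuild (e.g. after registering a tool) resolves identically:
assert build_client(llm_config) is CustomModelClient
```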

Steps to reproduce

  1. Set up an agent with a custom model client, e.g. following https://github.com/microsoft/autogen/blob/main/notebook/agentchat_custom_model.ipynb
    This is not a local setup issue or an LLM model issue; it's the custom client class that matters here.
  2. Set up the two parts: the llm_config in the agent, then register the custom model client.
    That works.
  3. Add a tool/function and register it with the agent.
    You'll get the error that the custom class is in the config but not registered.
  4. Reorder the calls, so the tool/function is registered first and the custom class second:
    no error.

Model Used

Any. Tested with multiple custom client classes (Instructor and Ollama raw, neither using the standard OpenAI client). It's not the model, it's the client, despite `register_custom_model` being the name of the function affected.

Expected Behavior

It should respect the existing client config, including the registered custom model class.

Screenshots and logs

No response

Additional Information

No response

scruffynerf (Author) commented:

Hopefully this will be fixed in #2946
