Issue: Tool is always called with a string parameter instead of Model despite using OPENAI_FUNCTIONS agent #8042

@emmanuelvisage

Description

Issue you'd like to raise.

Hi,

I have defined an OPENAI_FUNCTIONS agent and created a tool from a function whose input parameter is defined by a Pydantic BaseModel:

from langchain.agents import AgentType, Tool, initialize_agent
from langchain.memory import ConversationBufferMemory
from langchain.prompts import MessagesPlaceholder
from pydantic import BaseModel


class FooInputModel(BaseModel):
    id: str
    name: str

agent_kwargs = {
    "extra_prompt_messages": [MessagesPlaceholder(variable_name="memory")]
}
memory = ConversationBufferMemory(memory_key="memory", return_messages=True)

tool = Tool.from_function(
    name="FooGenerator",
    description="Foo the bar",
    func=foo,
    args_schema=FooInputModel,
)

agent = initialize_agent(
    [tool],
    llm,
    agent=AgentType.OPENAI_FUNCTIONS,
    agent_kwargs=agent_kwargs,
    memory=memory,
)

My function foo is properly called when necessary; however, the input is always a string, whereas I would like it to be a FooInputModel. How can I achieve this? Also, how can I tell whether the agent is actually using OpenAI function calling? I have doubts that it is, because when I print the agent I don't see any FunctionMessage in the history.
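
A minimal sketch of one likely fix, assuming a LangChain version from around the time this issue was filed: as far as I know, Tool wraps single-string-input functions, so its string calling convention takes precedence over args_schema, whereas StructuredTool.from_function honors the schema and calls the function with the schema fields as keyword arguments (not as a FooInputModel instance). The body of foo below is hypothetical:

from langchain.tools import StructuredTool

# Hypothetical foo: StructuredTool passes the schema fields as keyword
# arguments rather than a FooInputModel instance, so rebuild the model
# inside the function if you need one.
def foo(id: str, name: str) -> str:
    model = FooInputModel(id=id, name=name)
    return f"foo called with {model!r}"

tool = StructuredTool.from_function(
    func=foo,
    name="FooGenerator",
    description="Foo the bar",
    args_schema=FooInputModel,
)

To check whether the agent really issues OpenAI function calls, a sketch using LangChain's built-in logging switches: verbose=True prints each tool invocation, and langchain.debug additionally dumps the raw LLM requests and responses, including any function_call payloads:

import langchain

langchain.debug = True  # dump raw LLM requests/responses, incl. function_call

agent = initialize_agent(
    [tool],
    llm,
    agent=AgentType.OPENAI_FUNCTIONS,
    agent_kwargs=agent_kwargs,
    memory=memory,
    verbose=True,  # print each tool call and its parsed input
)

Note that, as far as I can tell, the FunctionMessage exchanges only live in the agent's scratchpad during a run and are not written to ConversationBufferMemory, so their absence from the printed memory does not by itself mean function calling is disabled.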

Thanks

Suggestion:

No response
