Functions not called when I use langchain_openai.ChatOpenAI as llm
#16139
This is my code:

```python
llm = ChatOpenAI(openai_api_key="xxx", temperature=0.5)
tools = [get_band_packages]
prompt = ChatPromptTemplate.from_messages(
    [
        ("system", system_msg),
        MessagesPlaceholder(variable_name="chat_history"),
        ("user", "{input}"),
        MessagesPlaceholder(variable_name="agent_scratchpad"),
    ]
)
chat_history = []
functions = [format_tool_to_openai_function(t) for t in tools]
llm_with_tools = llm.bind(functions=functions)
agent = (
    {
        "input": lambda x: x["input"],
        "agent_scratchpad": lambda x: format_to_openai_function_messages(
            x["intermediate_steps"]
        ),
        "chat_history": lambda x: x["chat_history"],
    }
    | prompt
    | llm_with_tools
    | OpenAIFunctionsAgentOutputParser()
)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)
agent_executor.invoke({"input": "xxxxxxx", "chat_history": []})
```

And here is how `bind_functions` is defined in the official source:

```python
def bind_functions(
    self,
    functions: Sequence[Union[Dict[str, Any], Type[BaseModel], Callable]],
    function_call: Optional[str] = None,
    **kwargs: Any,
) -> Runnable[LanguageModelInput, BaseMessage]:
    """Bind functions (and other objects) to this chat model.

    Args:
        functions: A list of function definitions to bind to this chat model.
            Can be a dictionary, pydantic model, or callable. Pydantic
            models and callables will be automatically converted to
            their schema dictionary representation.
        function_call: Which function to require the model to call.
            Must be the name of the single provided function or
            "auto" to automatically determine which function to call
            (if any).
        kwargs: Any additional parameters to pass to the
            :class:`~langchain.runnable.Runnable` constructor.
    """
    formatted_functions = [convert_to_openai_function(fn) for fn in functions]
    if function_call is not None:
        if len(formatted_functions) != 1:
            raise ValueError(
                "When specifying `function_call`, you must provide exactly one "
                "function."
            )
        if formatted_functions[0]["name"] != function_call:
            raise ValueError(
                f"Function call {function_call} was specified, but the only "
                f"provided function was {formatted_functions[0]['name']}."
            )
        function_call_ = {"name": function_call}
        kwargs = {**kwargs, "function_call": function_call_}
    return super().bind(
        functions=formatted_functions,
        **kwargs,
    )
```
Replies: 2 comments
🤖

Hello @Dovahkiin8625! I'm Dosu, a friendly bot here to assist you in resolving issues, answering queries, and helping you contribute more efficiently to the LangChain project. While we're waiting for a human maintainer to join us, I'm here to make your coding journey smoother. Let's make some software magic together!

Based on the information you've provided, it seems like you're trying to use the `functions` parameter with the `ChatOpenAI` class. However, as per issue #12415 in the LangChain repository, the `functions` parameter has been deprecated and is no longer recognized by OpenAI. Instead, you should use the `bind_functions` method of the `ChatOpenAI` class to bind your functions to the chat model.

Here's an example of how you can use the `bind_functions` method:

```python
llm = ChatOpenAI(openai_api_key="xxx", temperature=0.5)
tools = [get_band_packages]
prompt = ChatPromptTemplate.from_messages(
    [
        ("system", system_msg),
        MessagesPlaceholder(variable_name="chat_history"),
        ("user", "{input}"),
        MessagesPlaceholder(variable_name="agent_scratchpad"),
    ]
)
chat_history = []
functions = [format_tool_to_openai_function(t) for t in tools]
llm_with_tools = llm.bind_functions(functions=functions)
agent = (
    {
        "input": lambda x: x["input"],
        "agent_scratchpad": lambda x: format_to_openai_function_messages(
            x["intermediate_steps"]
        ),
        "chat_history": lambda x: x["chat_history"],
    }
    | prompt
    | llm_with_tools
    | OpenAIFunctionsAgentOutputParser()
)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)
agent_executor.invoke({"input": "xxxxxxx", "chat_history": []})
```

In this example, the `bind_functions` method is used in place of `bind`. Please try this approach and let me know if it resolves your issue. If you have any further questions or need additional clarification, feel free to ask.
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
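For context on what `convert_to_openai_function` does with a plain callable, here is a simplified, self-contained sketch. The helper name `callable_to_openai_function` is hypothetical; LangChain's real converter is far more thorough (pydantic models, docstring-based argument descriptions, nested types):

```python
import inspect
from typing import Any, Callable, Dict

# Minimal mapping from Python annotations to JSON Schema type names;
# anything unrecognized falls back to "string" in this sketch.
_JSON_TYPES = {int: "integer", float: "number", str: "string", bool: "boolean"}


def callable_to_openai_function(fn: Callable[..., Any]) -> Dict[str, Any]:
    """Build a rough OpenAI function definition from a callable's signature."""
    sig = inspect.signature(fn)
    properties: Dict[str, Any] = {}
    required = []
    for name, param in sig.parameters.items():
        properties[name] = {"type": _JSON_TYPES.get(param.annotation, "string")}
        if param.default is inspect.Parameter.empty:
            # parameters without defaults are treated as required
            required.append(name)
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        "parameters": {
            "type": "object",
            "properties": properties,
            "required": required,
        },
    }
```

Inspecting the dictionaries produced this way (or by `format_tool_to_openai_function` on your actual tools) is a quick sanity check that the model is being sent the function names and parameters you expect.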