I want to know how I can run a ChatModel query that can use tools, but without putting this inside an Agent. For example, something like this:

```python
# Create a simple tool
@tool
def myfunc(x):
    """This is a simple function that returns myfunc(x)."""
    return x ** 0.5

# Create the chat model
llm = ChatOpenAI(tool=myfunc)

# Ask the llm to use the tool
messages = [
    HumanMessage(content="Hello! Can you calculate myfunc(4)?"),
]
llm(messages)
```

where the output should be the JSON function-call object. More broadly, tools are passed directly into the OpenAI API create/predict call, so it seems to me that it would make the most sense for tools to be usable at that level. The LangChain tutorial, on the other hand, introduces Tools alongside Agents, which is several layers of abstraction above the simple prediction call. This seems like a mistake, right?
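For concreteness, the "JSON object function call" being asked for is the OpenAI function-calling schema that gets sent alongside the messages. The sketch below shows, in plain Python with no LangChain dependency, roughly how a function like `myfunc` maps onto that schema; the `function_schema` helper and its assumption that all parameters are numeric are purely illustrative, not a real LangChain API.

```python
import inspect
import json


def function_schema(fn):
    """Build an OpenAI-style function-calling schema from a plain Python
    function. Illustrative only: assumes every parameter is numeric."""
    sig = inspect.signature(fn)
    params = {name: {"type": "number"} for name in sig.parameters}
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        "parameters": {
            "type": "object",
            "properties": params,
            "required": list(params),
        },
    }


def myfunc(x):
    """This is a simple function that returns myfunc(x)."""
    return x ** 0.5


# The dict below is what would be passed to the API's `functions` parameter
print(json.dumps(function_schema(myfunc), indent=2))
```

This is the payload the model sees; its reply then names one of these functions and supplies JSON arguments, which your own code is responsible for executing.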
Replies: 1 comment
🤖

Hello,

Thank you for your question. You can indeed use tools directly with the OpenAI API create/predict call, without embedding them in an Agent. This can be done using the `predict_messages` method of the `ChatOpenAI` class. Here's an example:

```python
# Import necessary modules
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage
from langchain.tools import MoveFileTool, format_tool_to_openai_function

# Initialize the chat model
model = ChatOpenAI(model="gpt-3.5-turbo-0613")

# Define your tools and convert them to OpenAI function schemas
tools = [MoveFileTool()]
functions = [format_tool_to_openai_function(t) for t in tools]

# Define your messages
messages = [HumanMessage(content="move file foo to bar")]

# Call the predict_messages method, passing the functions directly
message = model.predict_messages(messages, functions=functions)

# Print the function call the assistant chose
print(message.additional_kwargs["function_call"])
```

In this example, `format_tool_to_openai_function` converts each tool into an OpenAI function schema, and the schemas are passed straight to `predict_messages`; the model's reply carries the chosen function call in `message.additional_kwargs["function_call"]`. Please note that `predict_messages` is not asynchronous.

I hope this helps! If you have any more questions, feel free to ask.

Best,
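One point worth spelling out: the model only *selects* a function and produces JSON arguments; your code still has to execute the call. A minimal dispatch sketch, assuming a reply shaped like `message.additional_kwargs["function_call"]` (the `move_file` stand-in and its `source_path`/`destination_path` argument names are assumptions for illustration):

```python
import json

# Hypothetical reply payload, shaped like message.additional_kwargs["function_call"]
function_call = {
    "name": "move_file",
    "arguments": json.dumps({"source_path": "foo", "destination_path": "bar"}),
}


# Stand-in for a real tool implementation
def move_file(source_path, destination_path):
    return f"moved {source_path} -> {destination_path}"


# Registry mapping function names the model may choose to callables
registry = {"move_file": move_file}

# Parse the model's JSON arguments and dispatch to the matching tool
args = json.loads(function_call["arguments"])
result = registry[function_call["name"]](**args)
print(result)  # moved foo -> bar
```

In an Agent this dispatch loop is handled for you; calling the model directly, as above, means you own it.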