Using a model from Ollama in ChatOpenAI doesn't invoke the tools with bind_tools #21907
Replies: 5 comments 5 replies
-
To address the issue of invoking tools with `bind_tools`, try the following:

```python
from langchain_openai import ChatOpenAI
from langchain_community.tools import MoveFileTool

llm = ChatOpenAI(
    api_key="ollama",  # placeholder; Ollama's OpenAI-compatible endpoint ignores the key
    model="llama3:8b-instruct-fp16",
    base_url="http://localhost:11434/v1",
)

tools = [MoveFileTool()]
# bind_tools converts the tools to the OpenAI tool schema internally,
# so there is no need to call format_tool_to_openai_function yourself.
llm_with_tools = llm.bind_tools(tools)
```

If following these steps doesn't resolve your issue, please provide more details about the errors or behavior you're experiencing for further assistance.
-
Hi @AtmehEsraa, You cannot use …
-
See this guide for adding ad-hoc tool-calling capability to models that do not support it natively: https://python.langchain.com/v0.2/docs/how_to/tools_prompting/. Keep in mind that the quality gap between a model that has been fine-tuned for tool calling and one that hasn't can be very large. For how to use tool-calling models, see https://python.langchain.com/v0.2/docs/how_to/tool_calling/, and a list of such models here: https://python.langchain.com/v0.2/docs/integrations/chat/
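The ad-hoc approach in that guide boils down to prompting the model to emit a JSON-formatted tool call and then parsing and dispatching it yourself. A minimal, framework-free sketch of the dispatch step (the `multiply` tool and the JSON shape here are illustrative stand-ins, not part of the LangChain API):

```python
import json

# Hypothetical tool registry: name -> callable. A real setup would register
# actual tools such as MoveFileTool here.
def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b

TOOLS = {"multiply": multiply}

def dispatch_tool_call(model_output: str):
    """Parse a JSON tool call emitted by the model and invoke the matching tool.

    Expects output shaped like: {"name": "multiply", "arguments": {"a": 3, "b": 4}}
    """
    call = json.loads(model_output)
    tool = TOOLS[call["name"]]  # raises KeyError if the model names an unknown tool
    return tool(**call["arguments"])

# Example: the kind of string a suitably prompted model might return.
print(dispatch_tool_call('{"name": "multiply", "arguments": {"a": 3, "b": 4}}'))  # 12
```

The fragile part in practice is the model reliably producing valid JSON, which is exactly why fine-tuned tool-calling models tend to work much better.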
-
Getting a similar error when using …
-
Is there any good solution?