Are there models in ollama that support function calling or object return? #194
@chaos369 Looks like ollama does support tool calls, but litellm currently has some bugs with how it parses these. I've opened an issue there: BerriAI/litellm#3333. When that is fixed you should be able to use ollama models via `LitellmChatModel`. Something like this (based on the example from #50 (comment)):

```python
from magentic import prompt
from magentic.chat_model.litellm_chat_model import LitellmChatModel


@prompt(
    "Count to 5",
    model=LitellmChatModel("ollama_chat/llama2", api_base="http://localhost:11434"),
)
def test() -> list[int]: ...


test()
```

@knoopx you might be interested in following this issue
Thank you.
I've opened a PR on litellm that should fix this: BerriAI/litellm#3469. Just need to bump the litellm dependency version in magentic once that's merged and released.
@chaos369 @DevAseel I have just published https://github.com/jackmpcollins/magentic/releases/tag/v0.23.0 which enables structured outputs and function calling with ollama 🦙 Depending on what model you use, you might need to prompt it to "use the tool" or add more details of the response format in the prompt. Example:

```python
from magentic import prompt
from magentic.chat_model.litellm_chat_model import LitellmChatModel


@prompt(
    "Count to {n}. Use the tool to return in the format [1, 2, 3, ...]",
    model=LitellmChatModel("ollama_chat/llama2", api_base="http://localhost:11434"),
)
def count_to(n: int) -> list[int]: ...


count_to(5)
# > [1, 2, 3, 4, 5]
```

Please let me know if you hit any issues with this. Thanks
It seems hard to get it to return an object. @jackmpcollins

Code for testing:

```python
wizardlm2 = LitellmChatModel("ollama/wizardlm2:7b")
llm_model = wizardlm2

class Superhero(BaseModel): ...

@prompt(
    "Create a Superhero named {name}, use the tool to return a Superhero struct.",
    model=llm_model,
)
def create_superhero(name: str) -> Superhero: ...

hero = create_superhero("Garden Man")
```

The result when `llm_model = wizardlm2`:

```
Traceback (most recent call last):
```

The result when `llm_model = llama3`:

```
Traceback (most recent call last):
```
@chaos369 I think the first issue is caused by the model returning an invalid function name.
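For context on that failure mode, here is a minimal stdlib-only sketch (not magentic's or litellm's actual implementation; the tool name and registry are illustrative) of why a hallucinated function name breaks tool-call parsing: the client looks the name returned by the model up against its registered functions, and an unknown name has no handler or schema to dispatch to.

```python
import json

# Hypothetical registry mapping tool names to argument parsers (names are made up).
TOOLS = {"return_list_of_int": lambda args: [int(x) for x in args["value"]]}

def parse_tool_call(raw: str):
    """Dispatch a model-produced tool call; fail loudly on unknown names."""
    call = json.loads(raw)
    name = call["name"]
    if name not in TOOLS:
        # This is the situation described above: the model invented a function name.
        raise ValueError(f"Model returned unknown tool name: {name!r}")
    return TOOLS[name](call["arguments"])

print(parse_tool_call('{"name": "return_list_of_int", "arguments": {"value": [1, 2, 3]}}'))
```

In practice the fix lands either on the parsing side (as in the litellm PR above) or in prompting the model more explicitly about which tool to call, as in the `count_to` example.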