Can this be used with Ollama? #50
Hi @carlosalvidrez, it should be possible to use magentic with Ollama. Also relevant: the README's notes on which magentic features each backend supports.
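The README notes referenced above are not reproduced here. Separately from the LiteLLM route described in the next comment, one possible approach is to point magentic's OpenAI backend at Ollama's OpenAI-compatible endpoint. The sketch below is not from this thread: the base_url argument, the http://localhost:11434/v1 address, and the llama2 model name are assumptions to verify against your setup and magentic version.

```python
import os

# Ollama ignores the API key, but the underlying OpenAI client expects one to exist.
os.environ.setdefault("OPENAI_API_KEY", "ollama")

from magentic import OpenaiChatModel, prompt


@prompt(
    "Talk to me! ",
    # Assumes OpenaiChatModel accepts a base_url argument and that an Ollama
    # server with the llama2 model pulled is running locally.
    model=OpenaiChatModel("llama2", base_url="http://localhost:11434/v1"),
)
def say_hello() -> str:
    ...


say_hello()
```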
Hi @carlosalvidrez I just released https://github.com/jackmpcollins/magentic/releases/tag/v0.9.0 which adds LiteLLM as a backend and allows you to use Ollama (and many other providers and LLMs) with magentic. The above note about magentic feature support is still relevant. The following code should work for you after upgrading:

```python
from magentic import prompt
from magentic.chat_model.litellm_chat_model import LitellmChatModel


@prompt(
    "Talk to me! ",
    model=LitellmChatModel("ollama/llama2"),
)
def say_hello() -> str:
    ...


say_hello()
```

See the Backend/LLM Configuration section of the README for how to set the LiteLLM backend and Ollama as the default using environment variables. Please let me know if you run into any issues. Thanks!
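For reference, a minimal sketch of the environment-variable configuration mentioned above, assuming the variable names MAGENTIC_BACKEND and MAGENTIC_LITELLM_MODEL from the README's Backend/LLM Configuration section (verify them against the version you install):

```python
import os

# Assumed variable names; normally you would export these in your shell
# rather than set them in code.
os.environ["MAGENTIC_BACKEND"] = "litellm"
os.environ["MAGENTIC_LITELLM_MODEL"] = "ollama/llama2"

from magentic import prompt  # imported after the env vars so the defaults pick them up


@prompt("Talk to me! ")  # no explicit model: the configured default backend is used
def say_hello() -> str:
    ...


say_hello()
```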
Will give it a shot, thank you!!
LiteLLM is kind of meh; it has the prompt template hardcoded in the library and basically only supports llama2...
@knoopx Is there something that you haven't been able to do using magentic with local models? I'm very happy to look into a specific issue if you want to open a new GitHub issue for it!
Wondering if you guys plan to allow this to work with local Ollama models.
Thanks!