Can this be used with Ollama? #50

Closed

carlosalvidrez opened this issue Nov 1, 2023 · 5 comments

Comments
@carlosalvidrez

Wondering if you guys plan to allow this to work with local Ollama models.
Thanks!

@jackmpcollins
Owner

Hi @carlosalvidrez, it should be possible to use magentic with Ollama by using a proxy/adapter that makes it compatible with the OpenAI API. See, for example, the instructions in PR #43, which use litellm. I tried to get this working just now but am getting errors from litellm. Please let me know if you get it working.

Also, some relevant notes from the README:

Since magentic uses the openai Python package, setting the OPENAI_API_BASE environment variable or openai.api_base in code allows you to use it with any OpenAI-compatible API e.g. Azure OpenAI Service, LocalAI. Note that if the API does not support function calling then you will not be able to create prompt-functions that return Python objects, but other features of magentic will still work.
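For illustration, here is a minimal sketch of that approach. The server URL and placeholder API key are assumptions; substitute the address of whatever OpenAI-compatible server you are running (the snippet also assumes the pre-1.0 openai package interface that magentic used at the time):

import openai

# Point the openai package at a local OpenAI-compatible server, e.g. LocalAI.
# The URL and key below are placeholders for your own setup.
openai.api_base = "http://localhost:8080/v1"
openai.api_key = "not-needed"  # local servers usually ignore this, but the package requires a value

from magentic import prompt


@prompt("Say hello in one short sentence.")
def hello() -> str:
    ...


print(hello())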

@jackmpcollins
Owner

jackmpcollins commented Nov 6, 2023

Hi @carlosalvidrez I just released https://github.com/jackmpcollins/magentic/releases/tag/v0.9.0 which adds LiteLLM as a backend and allows you to use Ollama (and many other providers and LLMs) with magentic. The above note about magentic feature support is still relevant.

The following code should work for you after pip install --upgrade magentic[litellm] and assuming Ollama is installed and running.

from magentic import prompt
from magentic.chat_model.litellm_chat_model import LitellmChatModel


@prompt(
    "Talk to me! ",
    # Route the prompt through LiteLLM to a locally running Ollama llama2 model
    model=LitellmChatModel("ollama/llama2"),
)
def say_hello() -> str:
    ...  # magentic generates the function's behavior from the prompt at call time


say_hello()

See the Backend/LLM Configuration section of the README for how to set the LiteLLM backend and Ollama as the default using environment variables.
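As a sketch of that configuration, assuming the MAGENTIC_BACKEND and MAGENTIC_LITELLM_MODEL variable names from the README (verify them against your installed version), the same default can also be set from Python before magentic is imported:

import os

# Assumed variable names from the README's Backend/LLM Configuration section.
os.environ["MAGENTIC_BACKEND"] = "litellm"
os.environ["MAGENTIC_LITELLM_MODEL"] = "ollama/llama2"

from magentic import prompt  # import after setting the variables


@prompt("Talk to me! ")  # no explicit model argument; the default comes from the env vars
def say_hello() -> str:
    ...


print(say_hello())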

Please let me know if you run into any issues. Thanks!

@carlosalvidrez
Author

Will give it a shot, thank you!!

@knoopx

knoopx commented Apr 26, 2024

litellm is kind of meh; it has the prompt template hardcoded in the library and basically only supports llama2...

@jackmpcollins
Owner

litellm is kind of meh; it has the prompt template hardcoded in the library and basically only supports llama2...

@knoopx Is there something that you haven't been able to do using magentic? For local models, I think LitellmChatModel + Ollama is one option, and OpenaiChatModel + an OpenAI-compatible API is another. For example, with llama.cpp: https://github.com/abetlen/llama-cpp-python/blob/main/examples/notebooks/Functions.ipynb
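To sketch that second option: the model name, base_url, and placeholder API key below are assumptions for a llama-cpp-python (or other OpenAI-compatible) server running locally, and it assumes OpenaiChatModel accepts a base_url argument in your magentic version:

from magentic import OpenaiChatModel, prompt


@prompt(
    "Tell me a one-line joke.",
    model=OpenaiChatModel(
        "local-model",                        # whatever model name your local server expects
        base_url="http://localhost:8000/v1",  # llama-cpp-python server's default address
        api_key="not-needed",                 # local servers typically ignore the key
    ),
)
def tell_joke() -> str:
    ...


print(tell_joke())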

I'd be very happy to look into a specific issue if you want to open a new GitHub issue for it!
