Add support for Ollama (via openai compatible APIs) #16

Open
wants to merge 1 commit into base: main

Conversation


@raff raff commented Jul 14, 2024

Use `-cm ollama` to select the default model (llama3) or `-cm ollama:{model}` to select a different model.
Ollama should be running locally on the default port `http://localhost:11434`.
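For context, "openai compatible" here means that Ollama serves the standard OpenAI chat-completions endpoint on that port, so an OpenAI-style client only needs a different base URL and no API key. A minimal, self-contained Go sketch of such a request (illustrative only, not the code in this PR; it assumes Ollama is running locally with `llama3` pulled):

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	// Build an OpenAI-style chat-completions request body.
	body, _ := json.Marshal(map[string]any{
		"model": "llama3",
		"messages": []map[string]string{
			{"role": "user", "content": "Say hello in one word."},
		},
	})

	// Ollama's OpenAI-compatible endpoint on the default local port.
	resp, err := http.Post("http://localhost:11434/v1/chat/completions",
		"application/json", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// Decode just the part of the response we care about.
	var out struct {
		Choices []struct {
			Message struct {
				Content string `json:"content"`
			} `json:"message"`
		} `json:"choices"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		panic(err)
	}
	if len(out.Choices) > 0 {
		fmt.Println(out.Choices[0].Message.Content)
	}
}
```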
Owner

@baalimago baalimago left a comment


Looks good, and many thanks for the addition! One request though: could you please update the README.md with an entry similar to the other vendors? I'm thinking of a link that briefly explains how to get started with ollama, plus a note that the default ollama port is required.

Also, I don't have a beefy enough computer to run any models supplied via ollama, but I trust you've verified the functionality yourself.

Author

raff commented Jul 15, 2024

Sure, I'll update the readme. And yes, I did verify it myself (chat and query commands).

Owner

@baalimago baalimago left a comment


This also needs to get updated: /baalimago/clai/blob/main/internal/text/querier_setup.go#L20, in order to support the config system for the different models. Right now, I suspect the configuration files for any ollama models will overwrite each other at <os-config-dir>/.clai/VENDOR_NOT_FOUND.json.

I think <os-config-dir>/.clai/ollama_ollama_{model} would be an appropriate format, with {model} being just ollama for the default case.
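Roughly what I have in mind, as a sketch only (hypothetical package and helper names, not the actual querier_setup.go code):

```go
package text

import "strings"

// ollamaConfigFile sketches the proposed naming: it maps a -cm value such as
// "ollama" or "ollama:llama3" to a per-model config file name, so that
// different ollama models no longer share VENDOR_NOT_FOUND.json.
func ollamaConfigFile(chatModel string) string {
	_, model, found := strings.Cut(chatModel, ":")
	if !found {
		// Plain "-cm ollama": use "ollama" as the model part for the default case.
		model = "ollama"
	}
	return "ollama_ollama_" + model + ".json"
}
```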

You can verify the problem+solution by checking out <os-config-dir>/.clai/ (probably ~/.config/.clai), or by running clai setup -> 1 (model files) -> c (configure) and seeing the ollama_ollama_... config files (and the lack thereof, until the change).

EDIT: I think I'll refactor this so that it lives inside the respective vendors, but I'll do so after this PR to make it less confusing.
