To use local models with ollama, a sample configuration is:

`config.yaml`:

```yaml
llm_api_key: no need
llm_base_url: http://localhost:11434
llm_custom_provider: null
llm_model: ollama/mistral
```
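As a minimal sketch of what this config points at: Ollama serves an HTTP API on port 11434, and a client would POST a JSON body to it with the model name (the part after the `ollama/` prefix). The snippet below only builds the request payload so it runs without a live server; the endpoint path and field names follow Ollama's `/api/generate` API, not this project's internals.

```python
import json

# Values mirroring the sample config.yaml above.
base_url = "http://localhost:11434"  # llm_base_url
model = "ollama/mistral".split("/", 1)[1]  # strip the provider prefix -> "mistral"

# Request body for Ollama's /api/generate endpoint.
payload = {
    "model": model,
    "prompt": "Say hello",  # hypothetical prompt, for illustration only
    "stream": False,        # ask for a single JSON response instead of a stream
}
body = json.dumps(payload)
print(body)

# To actually send it (requires `ollama serve` and `ollama pull mistral`):
# import urllib.request
# req = urllib.request.Request(
#     f"{base_url}/api/generate",
#     data=body.encode(),
#     headers={"Content-Type": "application/json"},
# )
# print(urllib.request.urlopen(req).read().decode())
```

Since requests go to localhost, `llm_api_key` is unused, which is why the sample config sets it to a placeholder.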
Very cool project
Thanks, I updated the readme.