
Conversation

@jlandowner (Contributor)

Hi, I'm running Ollama on a remote server and I was a bit confused: there are `--anthropic-url` and `--openai-url` flags, but no flag for the Ollama base URL.

Looking at the code, I found that it uses Ollama's `ClientFromEnvironment()`, so the standard Ollama environment variables (such as `OLLAMA_HOST`) apply:

https://github.com/mark3labs/mcphost/blob/v0.4.4/pkg/llm/ollama/provider.go#L27

I think it would be good to document this for newcomers.
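For example, a minimal sketch of how this could look (the hostname, port, and model name below are placeholders, and the `--model provider:model` syntax is assumed from the project's usage examples):

```shell
# Point mcphost's embedded Ollama client at a remote server using the
# standard Ollama environment variable read by ClientFromEnvironment().
# "remote-server" and the model tag are placeholders.
export OLLAMA_HOST=http://remote-server:11434
mcphost --model ollama:llama3.2
```

Since no `--ollama-url` flag exists, the environment variable is the only way to override the default `127.0.0.1:11434` endpoint.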

@ezynda3 ezynda3 merged commit f1ed1fc into mark3labs:main Apr 11, 2025