Add Ollama Cloud support with Bearer token auth#610

Merged
mehmetozguldev merged 2 commits into master from feat/ollama-cloud-support
Apr 16, 2026
Conversation


@mehmetozguldev mehmetozguldev commented Apr 15, 2026

Summary

Extends the Ollama provider so it works against both local Ollama servers and Ollama Cloud. Previously the provider hardcoded a no-auth local endpoint, so pointing it at https://ollama.com silently failed.

  • Refactor OllamaProvider to hold an optional API key and attach Authorization: Bearer … to streaming requests, model discovery (/api/tags), and connection checks.
  • Add isOllamaCloudUrl / OLLAMA_CLOUD_BASE_URL helpers; reuse them across ai-settings, ai-chat-service, and the provider itself.
  • AI Settings → Ollama: new Local/Cloud preset buttons and a password-style API key field wired through the existing secure-token service (storeProviderApiToken / getProviderApiToken / removeProviderApiToken).
  • Sync the stored key into the provider singleton at startup via settings-effects so getModels and the connection indicator work on first load.
  • Chat stream guard: if the endpoint is cloud but no key is stored, fail fast with a message that points back to settings.
  • getModels now appends the parameter size (e.g. llama3 (8B)) when the server reports it.
  • Rename provider label from "Ollama (Local)" to "Ollama" since it covers both modes.
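As a rough sketch of the first bullet, the provider might carry the optional key like this. The names `OllamaProvider` and the `/api/tags` endpoint come from the PR; `buildHeaders`, `setApiKey`, and the class internals are assumptions, not the actual diff:

```typescript
// Build request headers, attaching a Bearer token only when a key is set.
// buildHeaders is a hypothetical helper; the PR's real internals may differ.
export function buildHeaders(apiKey?: string): Record<string, string> {
  const headers: Record<string, string> = { "Content-Type": "application/json" };
  if (apiKey) headers["Authorization"] = `Bearer ${apiKey}`;
  return headers;
}

// The refactored provider holds an optional key and reuses the same
// header logic for streaming, model discovery, and connection checks.
export class OllamaProvider {
  constructor(
    private baseUrl: string,
    private apiKey?: string, // undefined for local servers
  ) {}

  setApiKey(key?: string): void {
    this.apiKey = key;
  }

  // Connection check via model discovery on /api/tags, authenticated
  // when a key is present.
  async checkConnection(): Promise<boolean> {
    const res = await fetch(`${this.baseUrl}/api/tags`, {
      headers: buildHeaders(this.apiKey),
    });
    return res.ok;
  }
}
```

Because every request path funnels through one header builder, local (no auth) and cloud (Bearer auth) stay on the same code path.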

Docs referenced: https://docs.ollama.com/api/introduction, https://docs.ollama.com/cloud.
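A plausible shape for the URL helpers and the chat-stream guard. The exported names `isOllamaCloudUrl` and `OLLAMA_CLOUD_BASE_URL` match the PR; the hostname check, the guard function name, and the error wording are assumptions:

```typescript
export const OLLAMA_CLOUD_BASE_URL = "https://ollama.com";

// Treat any URL whose host is ollama.com as the cloud endpoint.
export function isOllamaCloudUrl(url: string): boolean {
  try {
    return new URL(url).hostname === "ollama.com";
  } catch {
    return false; // malformed URLs are never "cloud"
  }
}

// Chat-stream guard: fail fast before opening a stream when the endpoint
// is cloud but no key is stored, instead of surfacing an opaque 401 later.
export function assertCloudKeyPresent(baseUrl: string, apiKey?: string): void {
  if (isOllamaCloudUrl(baseUrl) && !apiKey) {
    throw new Error(
      "Ollama Cloud requires an API key. Add one in Settings → AI → Ollama.",
    );
  }
}
```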

Test plan

  • Local Ollama: open Settings → AI → Ollama, status indicator goes green, models populate from /api/tags, chat streams.
  • Click "Cloud": URL switches to https://ollama.com, warning shows until a key is entered.
  • Paste an Ollama Cloud key, Save → toast confirms, status turns green, cloud model list loads, chat streams.
  • Remove key → status flips to error, clear remediation text shown.
  • Switch back to Local → everything works without the key.
  • Search Settings for "ollama api key" → new entry appears.

Fixes #372
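The startup sync described above could be as small as this. `getProviderApiToken` is named in the PR; the effect's signature, the `"ollama"` provider id, and the `setApiKey` surface are assumptions:

```typescript
// Hypothetical provider surface: only setApiKey matters for the sync.
interface KeyedProvider {
  setApiKey(key?: string): void;
}

// Run once from settings-effects at startup: pull the stored key out of
// the secure-token service and push it into the provider singleton so
// getModels and the connection indicator authenticate on first load.
export async function syncOllamaApiKey(
  getProviderApiToken: (providerId: string) => Promise<string | undefined>,
  provider: KeyedProvider,
): Promise<void> {
  const key = await getProviderApiToken("ollama");
  provider.setApiKey(key); // undefined clears the key (local mode)
}
```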

Ollama previously only worked against local servers with no
authentication. This extends the provider so the same integration can
point at Ollama Cloud (https://ollama.com) by sending the user's API
key as a Bearer token.

- Refactor OllamaProvider to hold an optional API key and include an
  Authorization header on streaming requests, model discovery, and
  connection checks.
- Add isOllamaCloudUrl helper plus OLLAMA_CLOUD_BASE_URL and
  DEFAULT_OLLAMA_BASE_URL exports for reuse across settings and chat service.
- Surface Local/Cloud preset buttons and a password-style API key input
  in AI Settings, wired through the existing secure-token service
  (store / retrieve / remove).
- Sync the stored key into the provider singleton at startup via
  settings-effects so getModels and connection checks work immediately
  after reload.
- Guard chat streams: if the endpoint is cloud but no key is stored,
  surface a clear error pointing back to settings.
- Improve getModels output to include parameter size when the server
  reports it, and rename provider label from "Ollama (Local)" to
  "Ollama" now that it supports both modes.
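The model-label change is a pure formatting step; a sketch of it might look like this (the field names follow Ollama's /api/tags response shape, while `formatModelLabel` itself is a hypothetical name):

```typescript
// Hypothetical shape of one entry in the /api/tags response; only the
// fields used here are modeled.
interface OllamaModelTag {
  name: string;
  details?: { parameter_size?: string };
}

// Append the parameter size when the server reports it, e.g. "llama3 (8B)";
// fall back to the bare name otherwise.
export function formatModelLabel(tag: OllamaModelTag): string {
  const size = tag.details?.parameter_size;
  return size ? `${tag.name} (${size})` : tag.name;
}
```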
@mehmetozguldev mehmetozguldev added the AI AI features and agents label Apr 15, 2026
@mehmetozguldev mehmetozguldev self-assigned this Apr 15, 2026
@mehmetozguldev mehmetozguldev added this to the v0.4.6 milestone Apr 15, 2026
@mehmetozguldev mehmetozguldev merged commit 7098881 into master Apr 16, 2026
1 of 2 checks passed
