
fix(ollama): guard nil filter in galleryop.ListModels (#9817) #9836

Merged

mudler merged 1 commit into master from worktree-fix-9817-ollama-endpoint on May 15, 2026

Conversation

@localai-bot
Collaborator

Summary

  • Nil-guard the filter argument inside galleryop.ListModels, mirroring the same guard already present in ModelConfigLoader.GetModelConfigsByFilter (see the sketch after this list).
  • Add a Ginkgo regression test that fails on master and passes with the fix.
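The guard itself is small. A minimal sketch, assuming the filter's signature is func(name string, cfg *config.ModelConfig) bool (the exact parameter types in galleryop may differ):

```go
// In galleryop.ListModels, before the loop over loose files in ModelsPath:
// if the caller passed no filter, substitute an accept-all one so the
// unconditional filter(m, nil) call inside the loop can never panic.
if filter == nil {
	filter = func(name string, cfg *config.ModelConfig) bool { return true }
}
```

Centralizing the guard here, rather than requiring every caller to remember config.NoFilterFn, is what keeps future call sites safe.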

Root cause (closes #9817)

The Ollama /api/tags handler (core/http/endpoints/ollama/models.go:21) calls:

galleryop.ListModels(bcl, ml, nil, galleryop.SKIP_IF_CONFIGURED)

ListModels then iterates over loose files in ModelsPath and invokes filter(m, nil) unconditionally. With filter == nil, the first non-skipped file in the models directory triggers a nil-function panic. Echo's recover middleware aborts the response mid-flight, which Home Assistant's httpx/ollama client surfaces as:

httpx.RemoteProtocolError: Server disconnected without sending a response.

(See traceback in #9817 — fails inside client.list() -> GET /api/tags.)
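The failure mode is plain Go: invoking a nil function value panics at the call site. A self-contained illustration (not LocalAI code; the filter type is assumed for the example):

```go
package main

// filterFn mimics the shape of the filter argument; the real type in
// galleryop may differ.
type filterFn func(name string, cfg any) bool

func main() {
	var filter filterFn // nil, like the Ollama handler's third argument

	// panics: "runtime error: invalid memory address or nil pointer dereference"
	filter("loose-model.gguf", nil)
}
```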

Every other caller of ListModels already supplies a non-nil filter (config.NoFilterFn, BuildNameFilterFn, BuildUsecaseFilterFn), so the Ollama endpoint was the only path hitting this. Fixing it in ListModels itself prevents the same footgun from recurring at any future call site.

Test plan

  • go test ./core/services/galleryop/... — passes
  • go test ./core/http/endpoints/ollama/... — passes
  • Reverting the fix while keeping the new test in place produces the expected failure (verified locally)
  • Manual: point Home Assistant's Ollama integration at a LocalAI instance with a loose file in the models directory and confirm client.list() no longer disconnects
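For context, a hypothetical sketch of such a regression test (the fixture names bcl and ml, the temp-dir setup, and the galleryop import path are illustrative, not the actual test; only the ListModels call mirrors the handler):

```go
package galleryop_test

import (
	"os"
	"path/filepath"

	. "github.com/onsi/ginkgo/v2"
	. "github.com/onsi/gomega"
	// import of galleryop omitted here; use the repo's actual module path
)

var _ = Describe("ListModels", func() {
	It("does not panic when given a nil filter and a loose file in ModelsPath", func() {
		// Hypothetical fixtures: modelsDir stands in for ModelsPath; bcl and
		// ml for whatever gallery/model-loader objects the suite constructs.
		modelsDir := GinkgoT().TempDir()
		Expect(os.WriteFile(filepath.Join(modelsDir, "loose.gguf"), nil, 0o644)).To(Succeed())

		// Same call shape as the Ollama /api/tags handler. On master this
		// panicked on the first loose file; with the guard it returns normally.
		Expect(func() {
			galleryop.ListModels(bcl, ml, nil, galleryop.SKIP_IF_CONFIGURED)
		}).ToNot(Panic())
	})
})
```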

The Ollama /api/tags handler passes a nil filter to galleryop.ListModels.
When ModelsPath contains any non-skipped loose file the function then
calls filter(name, nil) and panics, which Echo surfaces to clients as
"Server disconnected without sending a response" - the exact failure
Home Assistant's Ollama integration reports against LocalAI.

Mirror the nil guard already present in
ModelConfigLoader.GetModelConfigsByFilter so every caller is safe, and
add a regression test that exercises the loose-file path with a nil
filter.

Assisted-by: claude:claude-opus-4-7 [Claude Code]
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
mudler merged commit c33d36b into master on May 15, 2026 (14 of 20 checks passed)
mudler deleted the worktree-fix-9817-ollama-endpoint branch on May 15, 2026 at 08:07
mudler added the bug label on May 15, 2026

Labels: bug (Something isn't working)

Closes: Home Assistant can't connect to Ollama endpoint (#9817)