
fix(ollama): accept prompt alias on /api/embed for Ollama parity #9780

Merged

mudler merged 1 commit into master from worktree-fix-ollama-embed-prompt on May 12, 2026
Conversation

@localai-bot (Collaborator)

Summary

Test plan

  • New Ginkgo specs in `core/schema/ollama_test.go` covering both single-string and array `prompt` payloads, plus precedence when both keys are set.
  • Pre-existing `input` (string / []string / []any) behavior preserved.
  • `go test ./core/schema/...` — 30/30 pass.
  • `go vet ./core/schema/... ./core/http/endpoints/ollama/...` — clean.
  • Manual smoke (reviewer): `curl -s localhost:8080/api/embed -d '{"model":"<m>","prompt":"hello"}'` now returns embeddings instead of 400.

Ollama's embedding endpoint accepts both `input` and `prompt` as the
input string value (see ollama/ollama docs/api.md#generate-embeddings).
LocalAI only accepted `input`, which broke client libraries that send
the `prompt` form.

Add `Prompt` to `OllamaEmbedRequest` and have `GetInputStrings` fall back
to it when `Input` is unset. `Input` still wins when both are provided.

Fixes #9767.
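The fallback described above can be sketched roughly as follows. This is a simplified, hypothetical version of the schema, not LocalAI's actual `OllamaEmbedRequest` (the real type and its `GetInputStrings` live in `core/schema`); the field names `Input` and `Prompt` come from the PR, everything else is assumed for illustration:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Simplified sketch of the embed request. Input stays `any` because
// clients may send a string, []string, or a JSON array (decoded as []any).
type OllamaEmbedRequest struct {
	Model  string `json:"model"`
	Input  any    `json:"input,omitempty"`
	Prompt string `json:"prompt,omitempty"`
}

// GetInputStrings normalizes Input to []string and falls back to Prompt
// only when Input is unset — so Input wins when both keys are provided.
func (r *OllamaEmbedRequest) GetInputStrings() []string {
	switch v := r.Input.(type) {
	case string:
		return []string{v}
	case []string:
		return v
	case []any:
		out := make([]string, 0, len(v))
		for _, e := range v {
			if s, ok := e.(string); ok {
				out = append(out, s)
			}
		}
		return out
	}
	if r.Prompt != "" {
		return []string{r.Prompt}
	}
	return nil
}

func main() {
	// Prompt-only payload (the case this PR fixes).
	var req OllamaEmbedRequest
	json.Unmarshal([]byte(`{"model":"m","prompt":"hello"}`), &req)
	fmt.Println(req.GetInputStrings()) // [hello]

	// Both keys set: input takes precedence.
	var both OllamaEmbedRequest
	json.Unmarshal([]byte(`{"model":"m","input":"a","prompt":"b"}`), &both)
	fmt.Println(both.GetInputStrings()) // [a]
}
```

Keeping the fallback inside the accessor (rather than copying `Prompt` into `Input` at decode time) leaves the wire schema untouched and makes the precedence rule testable in one place.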

Assisted-by: Claude:claude-opus-4-7 [Claude Code]
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
@mudler mudler added the bug Something isn't working label May 12, 2026
@mudler mudler merged commit a57e736 into master May 12, 2026
55 checks passed
@mudler mudler deleted the worktree-fix-ollama-embed-prompt branch May 12, 2026 15:21

Labels

bug Something isn't working

Projects

None yet

Development

Successfully merging this pull request may close these issues.

Accept both "prompt" and "input" as input string value for interacting with the Ollama embedding API endpoint

2 participants