
fix: pass fully-qualified provider/model ID to modelcaps.Load #2738

Merged
simonferquel merged 5 commits into docker:main from simonferquel-clanker:fix/modelcaps-qualified-model-id
May 11, 2026

Conversation

@simonferquel-clanker
Contributor

Problem

modelcaps.Load requires a provider/model identifier (e.g. anthropic/claude-sonnet-4-6) to look up model capabilities in the models.dev database. All provider call sites were passing just the bare model name (e.g. claude-sonnet-4-6), so the lookup always missed and image/PDF attachments were silently dropped with StrategyDrop.

Fixes #2737.

Changes

  • pkg/model/provider/anthropic/client.go: pass Provider+"/"+Model to convertDocument in the standard (non-beta) path
  • pkg/model/provider/anthropic/beta_converter.go: same fix for the beta API path
  • pkg/model/provider/openai/client.go: fix both oaistream.ConvertMessages and convertDocumentToResponseInput calls
  • pkg/model/provider/dmr/client.go: fix oaistream.ConvertMessages call
  • pkg/model/provider/bedrock/client.go: fix convertMessages call
  • pkg/model/provider/gemini/client.go: fix convertMessagesToGemini call
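The failure mode can be sketched with a minimal lookup; the map keys and capability struct below are illustrative stand-ins, not the real models.dev schema or the modelcaps API:

```go
package main

import "fmt"

// caps is a stand-in for the capabilities record modelcaps resolves.
type caps struct{ Vision bool }

// db mimics a models.dev-style database keyed by "provider/model".
var db = map[string]caps{
	"anthropic/claude-sonnet-4-6": {Vision: true},
}

// load is a stand-in for modelcaps.Load: unknown IDs fall back to
// conservative text-only capabilities, so attachments get dropped.
func load(id string) caps {
	if c, ok := db[id]; ok {
		return c
	}
	return caps{} // miss: text-only, attachments silently dropped
}

func main() {
	fmt.Println(load("claude-sonnet-4-6").Vision)           // bare name: always a miss
	fmt.Println(load("anthropic/claude-sonnet-4-6").Vision) // qualified ID: hit
}
```

A bare model name can never match a "provider/model" key, so every call site that passed one always fell through to the conservative default.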

Testing

  • Existing tests for all affected packages pass: go test ./pkg/model/provider/... ./pkg/attachment/...

@simonferquel-clanker simonferquel-clanker requested a review from a team as a code owner May 11, 2026 09:00
Contributor

@simonferquel left a comment

I think we are missing indirect calls to it in the chat package, aren't we?

Contributor

@simonferquel left a comment

To avoid similar issues in the future, refactor modelcaps to be type-safe:

  • introduce an ID struct (Provider, Model)
  • introduce a NewID constructor (Provider, Model string)
  • and a ParseID func (singleParamWithSlashSeparator string) (ID, error)

Make all modelcaps functions that take IDs as parameters take this ID struct instead.
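A minimal sketch of the suggested API; the names follow the bullets above, but the validation rules and error message are assumptions:

```go
package main

import (
	"fmt"
	"strings"
)

// ID is a structured provider/model identifier.
type ID struct {
	Provider string
	Model    string
}

// NewID builds an ID from its two components.
func NewID(provider, model string) ID {
	return ID{Provider: provider, Model: model}
}

// ParseID splits a "provider/model" string into an ID,
// rejecting bare model names that lack the provider prefix.
func ParseID(s string) (ID, error) {
	provider, model, ok := strings.Cut(s, "/")
	if !ok || provider == "" || model == "" {
		return ID{}, fmt.Errorf("modelcaps: %q is not a provider/model identifier", s)
	}
	return ID{Provider: provider, Model: model}, nil
}

// String returns the "provider/model" representation of the ID.
func (id ID) String() string {
	return id.Provider + "/" + id.Model
}

func main() {
	id, err := ParseID("anthropic/claude-sonnet-4-6")
	fmt.Println(id, err)
}
```

With Load taking an ID instead of a string, the compiler rejects call sites that only have a bare model name in hand.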

@dgageot
Member

dgageot commented May 11, 2026

@simonferquel are there tests for that?

@simonferquel-clanker
Contributor Author

Yes — tests were added as part of the refactor pushed in the latest two commits:

  • TestLoadFromStore_QualifiedIDRequired (regression test, pkg/attachment/modelcaps/): directly locks in the #2737 fix by asserting that a bare model name without provider prefix (NewID("", "claude-3-5-sonnet")) returns conservative text-only caps, while the fully-qualified ID (NewID("anthropic", "claude-3-5-sonnet")) correctly resolves to vision-capable.
  • TestNewID, TestParseID_Valid, TestParseID_Invalid: cover the new modelcaps.ID type — construction, round-trip via String(), and error cases.
  • All existing attachments_test.go and messages_test.go files updated to use the new typed API (modelcaps.ID{} / modelcaps.NewID(...)).

Comment thread pkg/attachment/modelcaps/modelcaps.go Outdated
}

// String returns the "provider/model" representation of the ID.
func (id ID) String() string {
Member


Every provider already has an ID() method; why do we need this? It feels like a duplicate.

Contributor Author


Good question. There are two reasons this isn't a duplicate of base.Config.ID():

  1. Semantics differ. base.Config.ID() uses DisplayOrModel() — it returns the user-configured alias when present (e.g. "anthropic/my-sonnet"). That's correct for display, routing, and telemetry. modelcaps.ID always carries the resolved Model name (e.g. "anthropic/claude-sonnet-4-5") because models.dev keys are pinned identifiers — a user alias won't be found in the database.

  2. Circular dependency. modelcaps lives in pkg/attachment/modelcaps and is imported by every provider package. base.Config lives in pkg/model/provider/base, which itself imports modelcaps. Reusing base.Config here would create a cycle. The ID type has to live in modelcaps (or a shared leaf package) so both sides can reference it — base.Config.ModelCapsID() returns a modelcaps.ID, and modelcaps.Load consumes one.

Contributor


ID is the new structured type for the Provider+Model pair.

Member


Sure, but it's really only used for String(). I haven't yet found the place where having a type for this is useful; gotta look closer at the code, I guess.

Contributor


The ID method that existed did return a string like provider/model. It now returns the structured ID.

The problem is mostly that modelcaps.Load and co took a single string as the identifier, which led me into thinking something like "opus-4.7" would work. It does not.

Contributor


No, String is just there for the string representation of the type. modelcaps.Load now takes an ID.

Contributor


Ah, no, I see your point: actually ID should belong to the modelsdev package rather than modelcaps, so that the type safety goes down to the modelsdev store (and modelsdev consumers also benefit from it).

Member


the ID method that existed did return

It still exists

Contributor


Yes, it now returns a type-safe ID instead of a string

Contributor Author


Done — moved in commit ef439d0. modelsdev.ID now lives in pkg/modelsdev/id.go alongside NewID, ParseID, ParseIDOrZero, and String(). Store.GetModel takes modelsdev.ID directly, so the type safety extends all the way to the store layer. modelcaps no longer defines any ID type — it just consumes modelsdev.ID in Load and LoadFromStore.

@aheritier added the area/providers/openai, area/providers/anthropic, area/providers/bedrock, area/providers/gemini, area/providers/docker-model-runner, and priority:high labels May 11, 2026
@docker-agent

PR Review Failed — The review agent encountered an error and could not complete the review.

Comment thread pkg/rag/strategy/semantic_embeddings.go Outdated
// calculateSemanticUsageCost calculates cost for semantic LLM usage.
func calculateSemanticUsageCost(modelsStore modelStore, modelID string, usage *chat.Usage) float64 {
if usage == nil || modelsStore == nil || modelID == "" || strings.HasPrefix(modelID, "dmr/") {
func calculateSemanticUsageCost(modelsStore modelStore, modelIDStr string, usage *chat.Usage) float64 {
Contributor


modelID should be of type modelsdev.ID, and provider.Provider.ID() should return a modelsdev.ID

Contributor Author


Done in commit 99a88db. calculateSemanticUsageCost now takes modelsdev.ID directly (no intermediate string). Provider.ID() returns modelsdev.ID across the full interface — all 50+ call sites updated: slog/fmt use the Stringer impl transparently, struct comparisons work directly, and string-consuming functions (AgentInfo, event constructors) call .String().

Comment thread pkg/runtime/runtime.go Outdated
if sess != nil && (sess.InputTokens > 0 || sess.OutputTokens > 0) {
var contextLimit int64
if m, err := r.modelsStore.GetModel(ctx, modelID); err == nil && m != nil {
if m, err := r.modelsStore.GetModel(ctx, modelsdev.ParseIDOrZero(modelID)); err == nil && m != nil {
Contributor


could we also make getEffectiveModelID return a modelsdev.ID?

Contributor Author


Done in commit 99a88db. getEffectiveModelID (and getAgentModelID) now return modelsdev.ID. The GetModel call at this line now passes the ID directly — no more ParseIDOrZero wrapper needed.

Contributor

@simonferquel left a comment

This is too big a change. Reset the PR to just fix the original issue, and file an issue to introduce type-safe model IDs (provider + model name struct) through the whole code base as a change suggestion.

modelcaps.Load requires a "provider/model" identifier (e.g.
"anthropic/claude-sonnet-4-6") to look up model capabilities in the
models.dev database. All provider call sites were passing just the bare
model name (e.g. "claude-sonnet-4-6"), so the lookup always missed and
attachments (images, PDFs) were silently dropped with StrategyDrop.

Fix all affected call sites across anthropic, openai, dmr, bedrock,
and gemini providers to pass c.ModelConfig.Provider+"/"+c.ModelConfig.Model.
@simonferquel-clanker
Contributor Author

Reset. Force-pushed to reduce the PR to just the original one-line bugfix (commit 446b4e0 — 6 files, 7 insertions/deletions).

Filed #2743 to track the full type-safe modelsdev.ID refactor (Provider.ID(), Store.GetModel, modelcaps.Load, all downstream callers) as a standalone follow-up.

Covers the bug where callers passed a bare model name (e.g.
"claude-sonnet-4-6") to modelcaps.Load instead of a fully-qualified
"provider/model" identifier, causing all image/PDF attachments to be
silently dropped.

Three tests added:
- modelcaps.TestLoadFromStore_QualifiedIDRequired: verifies at the
  modelcaps layer that a bare model name does NOT resolve to vision caps
  while the qualified ID does.
- anthropic.TestConvertDocumentAnthropic_QualifiedIDRequired: end-to-end
  through convertDocumentWithCaps; image is dropped for bare ID and
  preserved as a native image block for the qualified ID.
- oaistream.TestConvertDocument_QualifiedIDRequired: same for the OpenAI/
  oaistream path; image is dropped for bare ID and preserved as an image
  URL part for the qualified ID.

All three inject a fake modelsdev.Store (no network) and use the
LoadFromStore helper so the tests are deterministic and fast.
@simonferquel-clanker
Contributor Author

Added regression tests in commit 1dbac3d (3 files, no production code changes):

  • modelcaps.TestLoadFromStore_QualifiedIDRequired — at the modelcaps layer: bare model name "claude-sonnet-4-6" returns text-only caps; qualified "anthropic/claude-sonnet-4-6" returns vision+pdf caps
  • anthropic.TestConvertDocumentAnthropic_QualifiedIDRequired — end-to-end through convertDocumentWithCaps: image dropped for bare ID, native image block produced for qualified ID
  • oaistream.TestConvertDocument_QualifiedIDRequired — same for the OpenAI/oaistream path: image dropped for bare ID, image URL part produced for qualified ID

All three use an injected fake modelsdev.Store (no network, deterministic).

// It uses LoadFromStore (fake in-memory store) to avoid network calls, and
// feeds the resulting caps into convertDocumentWithCaps so we can verify the
// full chain: qualified ID -> vision caps -> image block produced.
func TestConvertDocumentAnthropic_QualifiedIDRequired(t *testing.T) {
Contributor


That is not really what we want to test: we want to test that the fully-qualified ID is passed when resolving the model caps.

Contributor Author


Done in commit 9e4db92. Added convertDocumentFromStore(ctx, doc, modelID, store) to anthropic/attachments.go as the store-injectable path; convertDocument delegates to it. The test now calls convertDocumentFromStore directly, exercising the full modelID → LoadFromStore → caps → blocks chain: bare "claude-sonnet-4-6" misses the store (text-only, image dropped), qualified "anthropic/claude-sonnet-4-6" hits it (vision caps, image block produced).
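The store-injectable delegation described here can be sketched with stand-in types; the real convertDocumentFromStore signature, document type, and *modelsdev.Store live in the PR, not in this sketch:

```go
package main

import "fmt"

// store is a stand-in for *modelsdev.Store, keyed by "provider/model".
type store map[string]bool // modelID -> vision-capable

// convertFromStore is the store-injectable path: tests pass a fake store,
// exercising the full modelID -> caps lookup -> blocks chain.
func convertFromStore(doc, modelID string, s store) []string {
	if s[modelID] { // caps resolved only for a qualified ID present in the store
		return []string{"image:" + doc}
	}
	return nil // caps miss: attachment dropped
}

// convert is the production path, delegating with the real store.
func convert(doc, modelID string) []string {
	prod := store{"anthropic/claude-sonnet-4-6": true}
	return convertFromStore(doc, modelID, prod)
}

func main() {
	fake := store{"anthropic/claude-sonnet-4-6": true}
	fmt.Println(convertFromStore("cat.png", "claude-sonnet-4-6", fake))           // bare ID: dropped
	fmt.Println(convertFromStore("cat.png", "anthropic/claude-sonnet-4-6", fake)) // qualified: kept
	_ = convert
}
```

Because the test drives convertFromStore with the model ID rather than pre-resolved caps, a regression back to bare model names would make the lookup miss and the assertion fail.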

// TestConvertDocument_QualifiedIDRequired is the regression test for the bug
// where callers passed a bare model name instead of a "provider/model" ID,
// causing modelcaps.Load to miss every model and silently drop image/PDF.
func TestConvertDocument_QualifiedIDRequired(t *testing.T) {
Contributor


Same here: we want to validate that the fully-qualified model ID is passed when resolving caps.

Contributor Author


Done in commit 9e4db92. Same pattern applied to oaistream: added convertDocumentFromStore(ctx, doc, modelID, store) to oaistream/attachments.go; the test calls it directly with a fake store, exercising modelID → LoadFromStore → caps → parts: bare "gpt-4o" misses the store (image dropped), qualified "openai/gpt-4o" hits it (image URL part produced).

The previous regression tests called convertDocumentWithCaps with
pre-resolved caps, bypassing the modelID → store lookup path. The
reviewer asked to test that the fully-qualified ID is what gets passed
to the caps resolver.

Add convertDocumentFromStore(ctx, doc, modelID, store) to anthropic and
oaistream attachments — the store-injectable path that convertDocument
delegates to. The two regression tests now call convertDocumentFromStore
directly, which exercises the full:

  modelID string → LoadFromStore(store, modelID) → ModelCapabilities → blocks

chain without hitting the network:

- bare "claude-sonnet-4-6" → store miss → text-only caps → image dropped
- qualified "anthropic/claude-sonnet-4-6" → store hit → vision caps → image block

Same pattern for oaistream (gpt-4o / openai/gpt-4o).
Comment thread pkg/model/provider/anthropic/client.go Outdated
case chat.MessagePartTypeDocument:
if part.Document != nil {
docBlocks, err := convertDocument(ctx, *part.Document, c.ModelConfig.Model)
docBlocks, err := convertDocument(ctx, *part.Document, c.ModelConfig.Provider+"/"+c.ModelConfig.Model)
Contributor


maybe use c.ID() ?

Contributor Author


Done in commit bda5311 — using c.ID() across all call sites.

Comment thread pkg/model/provider/dmr/client.go Outdated
// system/user messages, which is needed by some local models run by DMR.
func (c *Client) convertMessages(ctx context.Context, messages []chat.Message) []openai.ChatCompletionMessageParamUnion {
openaiMessages := oaistream.ConvertMessages(ctx, messages, c.ModelConfig.Model)
openaiMessages := oaistream.ConvertMessages(ctx, messages, c.ModelConfig.Provider+"/"+c.ModelConfig.Model)
Contributor


c.ID()

Contributor Author


Done in commit bda5311 — using c.ID() across all call sites.

Comment thread pkg/model/provider/gemini/client.go Outdated
}

contents := convertMessagesToGemini(ctx, messages, c.ModelConfig.Model)
contents := convertMessagesToGemini(ctx, messages, c.ModelConfig.Provider+"/"+c.ModelConfig.Model)
Contributor


c.ID()

Contributor Author


Done in commit bda5311 — using c.ID() across all call sites.

Comment thread pkg/model/provider/openai/client.go Outdated
case chat.MessagePartTypeDocument:
if part.Document != nil {
docParts, err := convertDocumentToResponseInput(ctx, *part.Document, c.ModelConfig.Model)
docParts, err := convertDocumentToResponseInput(ctx, *part.Document, c.ModelConfig.Provider+"/"+c.ModelConfig.Model)
Contributor


c.ID()

Contributor Author


Done in commit bda5311 — using c.ID() across all call sites.

…stability

Per reviewer request, replace all c.ModelConfig.Provider+"/"+c.ModelConfig.Model
expressions with c.ID() at the modelcaps lookup call sites in every provider.

Also wire store injection so regression tests can verify the full path:

anthropic/client.go:
  - Add modelsStore *modelsdev.Store field (nil = use real store)
  - Add convertDoc() method routing through store when set, using c.ID()
  - convertUserMultiContent now calls c.convertDoc()

anthropic/beta_converter.go:
  - convertBetaUserMultiContent calls c.convertDoc()

openai/client.go:
  - Add modelsStore field; convertMessages routes through ConvertMessagesFromStore
  - Responses API inline call uses c.ID()

oaistream/messages.go:
  - Add ConvertMessagesFromStore / ConvertMultiContentFromStore (store-injectable)
  - Internal convertMessagesWithStore / convertMultiContentWithStore helpers

dmr, gemini, bedrock:
  - Use c.ID() directly

Tests updated:
  - anthropic test constructs Client{modelsStore: store} and calls
    convertUserMultiContent — image present only if c.ID() is used
  - oaistream test calls ConvertMultiContentFromStore with qualified vs
    bare ID — image present only when provider prefix is included
@@ -23,7 +24,18 @@ import (
// - text with InlineText → TextBlockParam with TXTEnvelope
// - unsupported / no content → nil (logged as warning)
func convertDocument(ctx context.Context, doc chat.Document, modelID string) ([]anthropic.ContentBlockParamUnion, error) {
Contributor


Remove that function (only keep the one that takes a modelsdev.Store as a parameter). Make sure Client construction always initializes a modelsdev.Store.

Apply the same in the oaistream package.

Contributor Author


Done in commit a490b33. convertDocument (bare, network-calling) removed from all provider packages — only convertDocumentFromStore remains. All provider clients (anthropic, openai, dmr, bedrock, gemini) now initialise modelsStore in NewClient (falling back to an empty store on error). Same applied to oaistream: bare ConvertMessages/ConvertMultiContent removed; only ConvertMessagesFromStore/ConvertMultiContentFromStore remain.

Contributor

@simonferquel left a comment

Simplify the tests: I do not care about making sure that not passing the full ID skips images; I only care about the end-to-end behaviour: a client configured with a Provider/Model that has the image caps should not skip images.

…re at construction

Per reviewer request, eliminate the dual-path (bare store + injected store)
in favour of a single path that always goes through a *modelsdev.Store.

Changes:
- anthropic/attachments.go: remove convertDocument(); keep only
  convertDocumentFromStore(ctx, doc, modelID, store)
- oaistream/attachments.go: same
- openai/attachments.go: remove convertDocumentToResponseInput(); keep only
  convertDocumentToResponseInputFromStore(ctx, doc, modelID, store)
- bedrock/attachments.go: same pattern
- gemini/attachments.go: same pattern

All provider clients now initialise modelsStore in NewClient:
- anthropic, openai, dmr, bedrock, gemini each call modelsdev.NewStore()
  at construction; on error they fall back to an empty store (text-only caps)
  rather than a nil store
- convertDoc / convertMessages / convertMessagesToGemini / convertMessages
  (bedrock) all pass c.modelsStore through — no nil checks

oaistream/messages.go:
- Remove ConvertMessages / ConvertMultiContent (bare); keep only
  ConvertMessagesFromStore / ConvertMultiContentFromStore
- convertMultiContentWithStore always passes store to convertDocumentFromStore

All test files updated to use FromStore / WithCaps variants and import
modelsdev where needed.
simonferquel previously approved these changes May 11, 2026

slog.DebugContext(ctx, "OpenAI client created successfully", "model", cfg.Model)

store, err := modelsdev.NewStore()
Member


All of the clients make their own store; let's drill one down, please.

Contributor Author


Done in commit 2bc0b53. Added options.WithModelsStore(store) to the options package. createDirectProvider in factory.go (and factory_js.go) now creates the store once and injects it via that option. Each provider's NewClient checks globalOptions.ModelsStore() first and only falls back to modelsdev.NewStore() when called directly (not through the factory), so backward compatibility is preserved.
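The injection described above follows Go's functional-options pattern. A stand-in sketch (the real options package, globalOptions, and *modelsdev.Store differ; the names here are illustrative):

```go
package main

import "fmt"

// Store is a stand-in for *modelsdev.Store.
type Store struct{ name string }

type options struct{ modelsStore *Store }

// Opt mutates provider construction options.
type Opt func(*options)

// WithModelsStore injects a shared store into a provider client.
func WithModelsStore(s *Store) Opt {
	return func(o *options) { o.modelsStore = s }
}

// NewClient uses the injected store when present; when constructed
// directly (not through the factory), it falls back to its own store.
func NewClient(opts ...Opt) *Store {
	var o options
	for _, opt := range opts {
		opt(&o)
	}
	if o.modelsStore != nil {
		return o.modelsStore // factory-injected, shared across providers
	}
	return &Store{name: "fresh"} // direct-construction fallback
}

func main() {
	shared := &Store{name: "shared"}
	fmt.Println(NewClient(WithModelsStore(shared)).name)
	fmt.Println(NewClient().name)
}
```

The fallback branch is what preserves backward compatibility for callers that build a client without going through the factory.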

@simonferquel-clanker
Contributor Author

CI failure on 2bc0b53 was the pre-existing flaky TestAttachedServer_DeleteSessionStopsEventStream (2-second timing test in pkg/server, zero overlap with our changes).

Also pushed commit 24ce3c0 addressing rumpl's 'drill one store down' feedback more directly: instead of creating the store in the factory, the code now passes the store that callers already hold:

  • teamloader: getModelsForAgent and getFallbackModelsForAgent already call modelsdev.NewStore() before their provider loops — now pass that store via options.WithModelsStore
  • runtime/model_switcher.go: createProviderFromConfig extracts the underlying *modelsdev.Store from the runtime's lazyModelStore (new rawStore() helper) and passes it via options.WithModelsStore
  • factory.go / factory_js.go: store creation removed (providers still fall back to the singleton modelsdev.NewStore() when no store is injected)

@simonferquel-clanker simonferquel-clanker force-pushed the fix/modelcaps-qualified-model-id branch from 24ce3c0 to a490b33 Compare May 11, 2026 12:49
@simonferquel simonferquel merged commit edef1e5 into docker:main May 11, 2026
12 checks passed
dgageot added a commit to dgageot/cagent that referenced this pull request May 11, 2026
Rebase PR docker#2746 on top of origin/main, resolving conflicts with PR docker#2738.

Key changes:
- Add RuntimeConfig.ModelsDevStore() lazy accessor with sync.Once caching
- Add options.WithModelsDevStore to thread the store through provider options
- Merge modelcaps.Load and LoadFromStore into a single Load(store, modelID)
- Remove per-client modelsStore fields from all provider clients (anthropic,
  openai, gemini, bedrock, dmr) — they now get the store via ModelOptions
- Convert modelsdev.NewStore from sync.OnceValues singleton to a plain
  function since RuntimeConfig handles the caching
- Wire RuntimeConfig.ModelsDevStore through teamloader, cmd/root/models,
  and rag/strategy/embedding
- Update detectCachingSupport (bedrock) to accept store parameter
- Rename oaistream ConvertMessagesFromStore/ConvertMultiContentFromStore
  to ConvertMessages/ConvertMultiContent
- Keep c.ID() (qualified provider/model) from PR docker#2738 at all call sites

Development

Successfully merging this pull request may close these issues.

Anthropic client passes bare model name to modelcaps.Load, causing image attachments to be dropped

6 participants