fix: use configured model in llm-slug-generator instead of hardcoded …#23286

Open
wsman wants to merge 1 commit into openclaw:main from wsman:fix/llm-slug-generator-use-configured-model
Conversation

@wsman

@wsman wsman commented Feb 22, 2026

Summary

  • Problem: The llm-slug-generator function was using a hardcoded anthropic/claude-opus-4-6 model instead of respecting the user's configured default model (agents.defaults.model.primary).
  • Why it matters: Users without an Anthropic API key would encounter errors when the session memory hook tried to generate slugs: FailoverError: No API key found for provider "anthropic".
  • What changed: Added resolveDefaultModelForAgent to resolve the user's configured default model and pass provider/model parameters to runEmbeddedPiAgent.
  • What did NOT change: The slug generation logic itself; only the model selection was fixed.
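The shape of the fix can be sketched as follows. `resolveDefaultModelForAgent` and `runEmbeddedPiAgent` are the real helpers named in this PR, but the bodies below are hypothetical stand-ins that only illustrate the provider/model plumbing:

```typescript
// Hypothetical sketch of the fix: resolve the user's configured default model
// instead of hardcoding "anthropic/claude-opus-4-6". The helper names match
// the PR; their implementations here are illustrative stand-ins only.

interface ModelRef {
  provider: string;
  model: string;
}

interface Config {
  agents?: { defaults?: { model?: { primary?: string } } };
}

// Stand-in for resolveDefaultModelForAgent: read agents.defaults.model.primary
// (assumed here to be a "provider/model" string) and split it into its parts,
// keeping the old hardcoded value only as a last-resort default.
function resolveDefaultModelForAgent(params: { cfg: Config; agentId: string }): ModelRef {
  const primary = params.cfg.agents?.defaults?.model?.primary ?? "anthropic/claude-opus-4-6";
  const slash = primary.indexOf("/");
  return { provider: primary.slice(0, slash), model: primary.slice(slash + 1) };
}

// Before the fix, the embedded agent call hardcoded the Anthropic model; after
// the fix, the resolved provider/model pair is passed through instead:
function buildAgentParams(cfg: Config, agentId: string) {
  const modelRef = resolveDefaultModelForAgent({ cfg, agentId });
  return {
    provider: modelRef.provider,
    model: modelRef.model,
    // ...other runEmbeddedPiAgent parameters unchanged
  };
}
```

With `agents.defaults.model.primary: "openai/gpt-4.1-mini"`, the sketch resolves `provider: "openai"` and `model: "gpt-4.1-mini"`, so no Anthropic key is ever required.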

Change Type (select all)

  • Bug fix
  • Feature
  • Refactor
  • Docs
  • Security hardening
  • Chore/infra

Scope (select all touched areas)

  • Gateway / orchestration
  • Skills / tool execution
  • Auth / tokens
  • Memory / storage
  • Integrations
  • API / contracts
  • UI / DX
  • CI/CD / infra

Linked Issue/PR

User-visible / Behavior Changes

  • The llm-slug-generator (used by session-memory hook) now respects agents.defaults.model.primary configuration
  • Users with non-Anthropic providers configured will no longer see "No API key found for provider anthropic" errors during slug generation

Security Impact (required)

  • New permissions/capabilities? No
  • Secrets/tokens handling changed? No
  • New/changed network calls? No
  • Command/tool execution surface changed? No
  • Data access scope changed? No

Repro + Verification

Environment

  • OS: Linux
  • Runtime/container: Node.js 22
  • Model/provider: Any non-Anthropic provider (e.g., OpenAI, Google)
  • Relevant config: agents.defaults.model.primary: "openai/gpt-4.1-mini"

Steps

  1. Configure a non-Anthropic model as primary (e.g., openai/gpt-4.1-mini)
  2. Do NOT configure any Anthropic API key
  3. Enable session-memory hook
  4. Have a conversation that triggers slug generation
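For step 1, the relevant configuration fragment (the key path matches the PR summary; the file format and surrounding structure are assumptions for illustration) might look like:

```json5
{
  agents: {
    defaults: {
      model: {
        // Non-Anthropic primary model; no Anthropic API key configured anywhere.
        primary: "openai/gpt-4.1-mini"
      }
    }
  }
}
```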

Expected

  • Slug generation uses the configured OpenAI model
  • No errors about missing Anthropic API key

Actual (before fix)

  • Error: [llm-slug-generator] Failed to generate slug: FailoverError: No API key found for provider "anthropic"

Evidence

  • Failing test/log before + passing after
  • Trace/log snippets
  • Screenshot/recording
  • Perf numbers (if relevant)

Build and tests pass:

  • pnpm build
  • pnpm test src/hooks/bundled/session-memory ✅ (15 tests passed)

Human Verification (required)

  • Verified scenarios: Code review, build passes, related tests pass
  • Edge cases checked: Uses resolveDefaultModelForAgent which handles all model resolution edge cases (aliases, agent-specific overrides, fallbacks)
  • What you did not verify: Live testing with actual LLM calls

Compatibility / Migration

  • Backward compatible? Yes
  • Config/env changes? No
  • Migration needed? No

Failure Recovery (if this breaks)

  • How to disable/revert this change quickly: git revert <commit-sha>
  • Files/config to restore: None needed
  • Known bad symptoms reviewers should watch for: If slug generation fails, it returns null and falls back to timestamp-based filenames (existing graceful degradation)

Risks and Mitigations

  • Risk: Users with misconfigured agents.defaults.model.primary might see different errors
    • Mitigation: Error handling already exists; slug generation returns null on failure and falls back gracefully
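The graceful degradation described above (slug generation returning null, with a timestamp-based filename as the fallback) can be sketched as follows; the function names and timestamp format here are illustrative assumptions, not the PR's actual code:

```typescript
// Illustrative sketch of the existing graceful degradation: if the LLM call
// fails for any reason (e.g. a missing API key), the slug generator returns
// null and the caller falls back to a timestamp-based filename.

async function generateSlug(call: () => Promise<string>): Promise<string | null> {
  try {
    const slug = await call();
    return slug.trim() || null;
  } catch (err) {
    console.warn(`[llm-slug-generator] Failed to generate slug: ${err}`);
    return null;
  }
}

function memoryFilename(slug: string | null, now: Date = new Date()): string {
  // Fall back to a filesystem-safe timestamp when no slug is available.
  return slug ?? now.toISOString().replace(/[:.]/g, "-");
}
```

Because every failure path collapses to `null` before filename construction, a misconfigured `agents.defaults.model.primary` degrades to timestamp filenames rather than crashing the session-memory hook.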

Greptile Summary

This PR fixes a critical bug where the llm-slug-generator was hardcoded to use anthropic/claude-opus-4-6 instead of respecting the user's configured default model. The fix correctly adds resolveDefaultModelForAgent to resolve the configured model and passes both provider and model parameters to runEmbeddedPiAgent.

Key changes:

  • Added import of resolveDefaultModelForAgent from ../agents/model-selection.js
  • Added model resolution call: const modelRef = resolveDefaultModelForAgent({ cfg: params.cfg, agentId })
  • Passes provider: modelRef.provider and model: modelRef.model to runEmbeddedPiAgent

Impact:

  • Users without Anthropic API keys will no longer see FailoverError: No API key found for provider "anthropic" when session memory hooks generate slugs
  • The slug generator now respects agents.defaults.model.primary configuration and agent-specific overrides
  • Backward compatible: users with Anthropic configured will continue to work as before

Confidence Score: 5/5

  • This PR is safe to merge with no risks
  • The fix is minimal, targeted, and uses existing well-tested infrastructure (resolveDefaultModelForAgent). The change correctly addresses the root cause by allowing the model selection logic to respect user configuration instead of hardcoding values. The implementation follows the codebase's existing patterns and has proper error handling already in place (slug generation returns null on failure with graceful fallback).
  • No files require special attention

Last reviewed commit: d2e2309

