
fix(core): eagerly null-out LLM client when host bridge is missing #1558

Merged
hijzy merged 1 commit into MemTensor:mem-agent-0424 from hijzy:fix/host-bridge-missing
Apr 28, 2026

Conversation

@hijzy
Collaborator

@hijzy hijzy commented Apr 28, 2026

Summary

Fixes an issue where, under llm.provider: host (the OpenClaw default config), HostLlmBridge is never injected, silently breaking the entire self-evolution chain.

Problem

OpenClaw's default config sets llm.provider: host, which means "let the host application call the AI model on our behalf". In practice, however, OpenClaw's Plugin SDK exposes no LLM completion interface, so the adapter does not (and cannot) call registerHostLlmBridge().

As a result:

  1. createLlmClient({provider: "host"}) successfully creates a client object (the constructor does not check for the bridge)
  2. Downstream modules see llm !== null and assume the LLM is available
  3. Only at call time does HostLlmProvider.complete() discover the bridge is null and throw LLM_UNAVAILABLE
  4. human-scorer / L2 induce / Skill crystallize / L3 abstract all fall into their catch-and-degrade paths
  5. reward stays heuristic forever (rHuman=0); experience, skill, and environment-cognition updates are all skipped
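The failure pattern above can be sketched as follows. The names scoreEpisode and heuristicScore are hypothetical illustrations; only the shape (non-null client, per-call LLM_UNAVAILABLE, catch-and-degrade on every episode) comes from the PR description.

```typescript
type LlmClient = { complete(prompt: string): Promise<string> };

function heuristicScore(): string {
  return "heuristic"; // rHuman stays 0 on this path
}

async function scoreEpisode(llm: LlmClient | null): Promise<string> {
  if (llm !== null) {
    // The client object exists, so this branch is taken on every episode...
    try {
      return await llm.complete("score this episode");
    } catch {
      // ...and this catch fires every time when provider=host has no
      // registered bridge, producing misleading "llm failed" warns
      // instead of a clean "no LLM" skip.
      return heuristicScore();
    }
  }
  return heuristicScore(); // the clean path the fix restores
}

// A provider=host client whose bridge was never registered:
const brokenHostClient: LlmClient = {
  async complete() {
    throw new Error("LLM_UNAVAILABLE");
  },
};
```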

Fix

In core/pipeline/memory-core.ts, check immediately after the LLM client is created:

  • If provider=host and getHostLlmBridge() returns null
  • Emit a clear warn log, llm.host_bridge_missing, explaining the impact and the remedy
  • Set llm to null so downstream modules take the clean "no LLM" path
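A minimal sketch of the eager check. The helper names (createLlmClient, getHostLlmBridge, initLlm) and the log wording mirror the PR description but are simplified assumptions, not the actual memory-core.ts implementation.

```typescript
type LlmClient = { complete(prompt: string): Promise<string> };

let hostBridge: LlmClient | null = null; // adapter never registers one

function getHostLlmBridge(): LlmClient | null {
  return hostBridge;
}

function createLlmClient(cfg: { provider: string }): LlmClient {
  // Construction never consults the bridge, so it always "succeeds".
  return {
    async complete(prompt: string) {
      if (cfg.provider === "host") {
        const bridge = getHostLlmBridge();
        if (bridge === null) throw new Error("LLM_UNAVAILABLE");
        return bridge.complete(prompt);
      }
      return "direct-provider-response";
    },
  };
}

function initLlm(cfg: { provider: string }): LlmClient | null {
  let llm: LlmClient | null = createLlmClient(cfg);
  if (cfg.provider === "host" && getHostLlmBridge() === null) {
    // Warn once at startup with the impact and the remedy...
    console.warn(
      "llm.host_bridge_missing: no HostLlmBridge registered; " +
        "LLM-dependent stages will be skipped. Set llm.provider to a " +
        "direct provider in config.yaml.",
    );
    // ...then null out the client so downstream code sees "no LLM".
    llm = null;
  }
  return llm;
}
```

Nulling the client at startup moves the failure from every invocation to a single, explicit decision point.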

Users need to change llm.provider in config.yaml to a directly usable provider (such as openai_compatible, anthropic, or gemini) and configure an API key and endpoint.
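For illustration, a hypothetical config.yaml fragment; the exact key names below (api_key, base_url) are assumptions and should be checked against the project's config schema:

```yaml
llm:
  provider: openai_compatible      # was: host
  api_key: ${OPENAI_API_KEY}       # key name is an assumption
  base_url: https://api.example.com/v1  # endpoint key name is an assumption
```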

Test plan

  • Default host provider + no bridge → the startup log should contain an llm.host_bridge_missing warn
  • openai_compatible provider + a valid key → works normally, no new warns
  • The health API should report llm.available as false (when provider=host with no bridge)

When llm.provider=host (the OpenClaw default) but no adapter has called
registerHostLlmBridge(), the LLM client is technically non-null but
every call fails at runtime with LLM_UNAVAILABLE. This caused:

- reward pipeline: hasLlm=true → try LLM → catch → heuristic fallback
  on every single episode, burning cycles and producing misleading
  "score.llm_failed" warns instead of a clean "no LLM" skip
- L2/Skill/L3: same pattern — always attempt then fail

Root cause: the OpenClaw plugin SDK (OpenClawPluginApi) does not expose
a completion method, so the adapter has no way to wire a HostLlmBridge.
The bridge registration mechanism exists but was never called.

Fix: after createLlmClient(), check if provider=host and bridge is null.
If so, emit a clear "llm.host_bridge_missing" warn explaining the
impact and the fix (switch to a direct provider), then set llm=null so
all downstream modules cleanly skip LLM paths instead of failing on
every invocation.
@hijzy hijzy merged commit d48c2d3 into MemTensor:mem-agent-0424 Apr 28, 2026
@hijzy hijzy deleted the fix/host-bridge-missing branch April 29, 2026 02:43
