fix(core): eagerly null-out LLM client when host bridge is missing #1558
Merged
Conversation
When `llm.provider=host` (the OpenClaw default) but no adapter has called `registerHostLlmBridge()`, the LLM client is technically non-null, yet every call fails at runtime with `LLM_UNAVAILABLE`. This caused:

- reward pipeline: `hasLlm=true` → try LLM → catch → heuristic fallback on every single episode, burning cycles and producing misleading `score.llm_failed` warns instead of a clean "no LLM" skip
- L2/Skill/L3: same pattern: always attempt, then fail

Root cause: the OpenClaw plugin SDK (`OpenClawPluginApi`) does not expose a completion method, so the adapter has no way to wire a `HostLlmBridge`. The bridge registration mechanism exists but was never called.

Fix: after `createLlmClient()`, check whether `provider=host` and the bridge is null. If so, emit a clear `llm.host_bridge_missing` warn explaining the impact and the remedy (switch to a direct provider), then set `llm=null` so all downstream modules cleanly skip LLM paths instead of failing on every invocation.
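The check described above can be sketched as follows. `createLlmClient`, `getHostLlmBridge`, and the `llm.host_bridge_missing` event name come from this PR's description; the types, function bodies, and `initLlm` wrapper are illustrative assumptions, not the actual implementation.

```typescript
// Minimal sketch of the eager null-out check, assuming simplified types.

type LlmClient = { complete(prompt: string): Promise<string> };
interface LlmConfig { provider: string }

// Stand-in for the bridge registry: nothing has registered a bridge.
const hostBridge: LlmClient | null = null;
function getHostLlmBridge(): LlmClient | null {
  return hostBridge;
}

// Stand-in factory: constructs a client without checking the bridge,
// so a host-provider client only fails once complete() is invoked.
function createLlmClient(_cfg: LlmConfig): LlmClient {
  return {
    complete: async () => {
      throw new Error("LLM_UNAVAILABLE");
    },
  };
}

function warn(event: string, msg: string): void {
  console.warn(`[${event}] ${msg}`);
}

// The fix: detect the missing bridge once at startup and null out the
// client, so downstream modules skip LLM paths instead of failing per call.
function initLlm(cfg: LlmConfig): LlmClient | null {
  const llm = createLlmClient(cfg);
  if (cfg.provider === "host" && getHostLlmBridge() === null) {
    warn(
      "llm.host_bridge_missing",
      "provider=host but no HostLlmBridge was registered; every LLM " +
        "call would fail with LLM_UNAVAILABLE. Switch llm.provider to " +
        "a direct provider. Disabling LLM paths."
    );
    return null; // downstream sees "no LLM" instead of per-call errors
  }
  return llm;
}
```

With this shape, downstream modules only need the existing `llm !== null` guard; no per-module changes are required.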
Summary
Fixes the issue where, under `llm.provider: host` (the OpenClaw default), an uninjected `HostLlmBridge` silently breaks the entire self-evolution chain.

Problem

OpenClaw's default config is `llm.provider: host`, which means "let the host application make AI model calls on our behalf." In practice, the OpenClaw Plugin SDK does not expose an LLM completion interface, so the adapter does not (and cannot) call `registerHostLlmBridge()`. As a result:

- `createLlmClient({provider: "host"})` successfully creates a client object (the constructor does not check the bridge)
- `llm !== null`, so the system believes an LLM is available
- only when `HostLlmProvider.complete()` runs does it discover the bridge is null and throw `LLM_UNAVAILABLE`

Fix

In `core/pipeline/memory-core.ts`, check immediately after the LLM client is created:

- if `provider=host` and `getHostLlmBridge()` is null
- emit a `warn` log `llm.host_bridge_missing` describing the impact and the remedy
- set `llm` to `null` so downstream modules cleanly take the "no LLM" path

Users need to change `llm.provider` in `config.yaml` to a directly usable provider (such as `openai_compatible`, `anthropic`, or `gemini`) and configure an API key and endpoint.

Test plan

- `host` provider + no bridge → startup log should show a `llm.host_bridge_missing` warn
- `openai_compatible` provider + valid key → works normally, no new warns
- `llm.available` should be false (when provider=host with no bridge)
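Switching away from the host provider, as the fix log recommends, might look like the fragment below. This is a hypothetical sketch: apart from the `llm.provider` key and the provider names mentioned above, the field names are assumptions, not the verified config schema.

```yaml
# Hypothetical config.yaml sketch; keys other than llm.provider are assumed.
llm:
  provider: openai_compatible      # instead of the default "host"
  api_key: ${OPENAI_API_KEY}       # assumed env-var substitution
  endpoint: https://api.example.com/v1
```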