
fix(openclaw): detect Anthropic provider and use correct API format#1260

Merged
syzsunshine219 merged 2 commits into MemTensor:openclaw-local-plugin-20260317 from zerone0x:fix/anthropic-endpoint-normalization
Mar 19, 2026

Conversation

@zerone0x
Contributor

Summary

  • Anthropic API endpoints were incorrectly normalized by appending /chat/completions (OpenAI format), causing 404 errors during skill evolution
  • Added provider detection logic (from config key name or base URL) to route Anthropic providers to /v1/messages with correct headers (x-api-key, anthropic-version) and response parsing
  • Fixed three locations: src/shared/llm-call.ts, src/ingest/providers/index.ts, and scripts/refresh-summaries.ts

Fixes #1254

Test plan

  • TypeScript compilation passes (tsc --noEmit)
  • All 101 existing tests pass (vitest run)
  • Manual test: configure Anthropic provider in openclaw.json and verify skill evolution succeeds
  • Manual test: verify OpenAI-compatible providers continue to work unchanged

🤖 Generated with Claude Code

When Anthropic (or Anthropic-compatible) endpoints are configured, the
endpoint URL was incorrectly normalized by appending `/chat/completions`
(OpenAI format), causing 404 errors during skill evolution. The fix
detects the provider type from the config key name or base URL and uses
the correct endpoint path (`/v1/messages`) and request/response format
(x-api-key header, Anthropic message schema) for Anthropic providers.

Three locations fixed:
- src/shared/llm-call.ts: loadOpenClawFallbackConfig, callLLMOnce
- src/ingest/providers/index.ts: loadOpenClawFallbackConfig
- scripts/refresh-summaries.ts: callLLM
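
The endpoint normalization described above can be sketched as follows. The name `normalizeEndpointForProvider` appears later in the review discussion, but this body is an assumption for illustration, not the PR's exact code:

```typescript
// Sketch of provider-aware endpoint normalization, assuming a two-provider
// split for brevity; the PR also handles Gemini and Bedrock.
type SummaryProvider = "anthropic" | "openai_compatible";

function normalizeEndpointForProvider(provider: SummaryProvider, baseUrl: string): string {
  const stripped = baseUrl.replace(/\/+$/, "");
  if (provider === "anthropic") {
    // The Anthropic Messages API lives at /v1/messages, not /chat/completions.
    if (stripped.endsWith("/v1/messages")) return stripped;
    return `${stripped}/v1/messages`;
  }
  // OpenAI-compatible providers keep the previous behavior.
  if (stripped.endsWith("/chat/completions")) return stripped;
  return `${stripped}/chat/completions`;
}

console.log(normalizeEndpointForProvider("anthropic", "https://api.anthropic.com/"));
// https://api.anthropic.com/v1/messages
```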

Fixes MemTensor#1254

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Copilot AI review requested due to automatic review settings March 17, 2026 07:29

Copilot AI left a comment


Pull request overview

Fixes OpenClaw “native model” fallback and related tooling to correctly route Anthropic providers to the Anthropic Messages API format (instead of incorrectly normalizing to OpenAI’s /chat/completions), addressing 404s during skill evolution.

Changes:

  • Added provider detection (by provider key name / base URL) and provider-specific endpoint normalization.
  • Implemented Anthropic request/response formatting (/v1/messages, x-api-key, anthropic-version, and content[] parsing) for direct LLM calls.
  • Updated the refresh-summaries script to support Anthropic-formatted calls.
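
The Anthropic request/response handling the review describes can be sketched roughly as below. The helper names and config plumbing are assumptions; only the endpoint path, the `x-api-key` and `anthropic-version` headers, and the `content[]` parsing come from the PR itself:

```typescript
// Hedged sketch of an Anthropic Messages API call; helper names are hypothetical.
interface AnthropicContentBlock { type: string; text?: string }
interface AnthropicResponse { content: AnthropicContentBlock[] }

// Pure parsing step: Anthropic returns text in content[] blocks,
// not in choices[0].message.content like OpenAI-compatible APIs.
function parseAnthropicResponse(data: AnthropicResponse): string {
  return data.content
    .filter((b) => b.type === "text")
    .map((b) => b.text ?? "")
    .join("");
}

async function callAnthropic(baseUrl: string, apiKey: string, model: string, prompt: string): Promise<string> {
  const res = await fetch(`${baseUrl.replace(/\/+$/, "")}/v1/messages`, {
    method: "POST",
    headers: {
      "content-type": "application/json",
      "x-api-key": apiKey,               // Anthropic auth header, not Authorization: Bearer
      "anthropic-version": "2023-06-01", // required version header
    },
    body: JSON.stringify({
      model,
      max_tokens: 1024,
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!res.ok) throw new Error(`Anthropic API error ${res.status}`);
  return parseAnthropicResponse(await res.json());
}
```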

Reviewed changes

Copilot reviewed 3 out of 3 changed files in this pull request and generated 4 comments.

Reviewed files:

  • apps/memos-local-openclaw/src/shared/llm-call.ts: Adds provider detection plus Anthropic vs OpenAI-compatible dispatch for one-off LLM calls and OpenClaw fallback config endpoint normalization.
  • apps/memos-local-openclaw/src/ingest/providers/index.ts: Updates OpenClaw fallback config creation to detect Anthropic and normalize endpoints accordingly.
  • apps/memos-local-openclaw/scripts/refresh-summaries.ts: Updates the maintenance script to call Anthropic with /v1/messages format and parse Anthropic responses.


```typescript
function defaultEndpointForProvider(provider: SummaryProvider, baseUrl: string): string {
  const stripped = baseUrl.replace(/\/+$/, "");
  if (provider === "anthropic") {
    if (stripped.endsWith("/v1/messages")) return stripped;
    // … (excerpt truncated)
```
Comment on lines +65 to +72
```typescript
const isAnthropic = cfg.provider === "anthropic"
  || cfg.endpoint?.toLowerCase().includes("anthropic");

console.log(`Summarizer: ${cfg.provider} / ${cfg.model}`);

let endpoint = cfg.endpoint.replace(/\/+$/, "");
if (!endpoint.endsWith("/chat/completions")) endpoint += "/chat/completions";
if (isAnthropic) {
  if (!endpoint.endsWith("/v1/messages") && !endpoint.endsWith("/messages")) {
    // … (excerpt truncated)
```
Comment on lines +5 to +17
```typescript
/**
 * Detect provider type from provider key name or base URL.
 */
function detectProvider(providerKey: string | undefined, baseUrl: string): SummaryProvider {
  const key = providerKey?.toLowerCase() ?? "";
  const url = baseUrl.toLowerCase();
  if (key.includes("anthropic") || url.includes("anthropic")) return "anthropic";
  if (key.includes("gemini") || url.includes("generativelanguage.googleapis.com")) {
    return "gemini";
  }
  if (key.includes("bedrock") || url.includes("bedrock")) return "bedrock";
  return "openai_compatible";
}
```
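
A quick usage check of the `detectProvider()` logic quoted above (restated here, with an assumed `SummaryProvider` union, so the snippet runs standalone):

```typescript
// Restates the quoted detectProvider() so this snippet is self-contained;
// the SummaryProvider union is inferred from the quoted code.
type SummaryProvider = "anthropic" | "gemini" | "bedrock" | "openai_compatible";

function detectProvider(providerKey: string | undefined, baseUrl: string): SummaryProvider {
  const key = providerKey?.toLowerCase() ?? "";
  const url = baseUrl.toLowerCase();
  if (key.includes("anthropic") || url.includes("anthropic")) return "anthropic";
  if (key.includes("gemini") || url.includes("generativelanguage.googleapis.com")) return "gemini";
  if (key.includes("bedrock") || url.includes("bedrock")) return "bedrock";
  return "openai_compatible";
}

console.log(detectProvider("anthropic-main", "https://api.anthropic.com")); // anthropic
console.log(detectProvider(undefined, "https://api.openai.com/v1"));        // openai_compatible
```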
@syzsunshine219 syzsunshine219 changed the base branch from main to openclaw-local-plugin-20260317 March 19, 2026 02:36
@syzsunshine219
Copy link
Collaborator

The incoming block (from the plugin branch) is a partial Anthropic detection that leaked in from #1262/#1264. It should be discarded because:

  • Incorrect endpoint: it appends /messages instead of /v1/messages, which will cause 404 errors against the Anthropic API.
  • No code reuse: the detection logic is inlined rather than using the shared detectProvider() / normalizeEndpointForProvider() helpers that PR #1260 introduces (and that llm-call.ts also uses).
  • Missing Gemini support: PR #1260's detectProvider() also handles Gemini providers; the incoming version does not.

Accept Current: PR #1260 provides the complete and correct fix for #1254.
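
The reviewer's endpoint point can be illustrated with two hypothetical normalizers, one mirroring the leaked partial fix and one mirroring the behavior this PR describes (both function bodies are illustrative assumptions):

```typescript
// Illustrates the review comment: appending "/messages" (the leaked partial
// fix) yields a path the Anthropic API does not serve; "/v1/messages" works.
function leakedNormalize(base: string): string {
  const e = base.replace(/\/+$/, "");
  return e.endsWith("/messages") ? e : `${e}/messages`;
}

function correctNormalize(base: string): string {
  const e = base.replace(/\/+$/, "");
  return e.endsWith("/v1/messages") ? e : `${e}/v1/messages`;
}

console.log(leakedNormalize("https://api.anthropic.com"));  // .../messages (404)
console.log(correctNormalize("https://api.anthropic.com")); // .../v1/messages
```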

@syzsunshine219 syzsunshine219 merged commit 4276223 into MemTensor:openclaw-local-plugin-20260317 Mar 19, 2026


Development

Successfully merging this pull request may close these issues.

fix: Anthropic API endpoint incorrectly normalized to OpenAI format, causing skill evolution to fail

3 participants