fix: Add Anthropic-format support for third-party API relays #24
📝 Walkthrough
Modified the Anthropic provider implementation to normalize base URLs, prefer default model settings, simplify message formatting to a single-message payload, and parse responses by iterating content blocks. Implemented dynamic model fetching with endpoint-aware URL construction and fallback retry logic for robustness.
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
🚥 Pre-merge checks: ✅ 1 passed | ❌ 2 failed (1 warning, 1 inconclusive)
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@interfaces/api/v1/core/settings.py`:
- Around line 161-171: The current exception handling in the model-fetch block
drops exception context and reports the wrong error when the OpenAI fallback
fails; update the two raises to preserve chaining and include both errors: when
catching Exception as exc and immediately raising HTTPException (the branch
where not base_url), re-raise using "from exc" so the original stack is
preserved; in the OpenAI fallback, catch the second exception (e.g., exc2) and
raise HTTPException with a combined message referencing both exc and exc2 (e.g.,
"Failed to fetch Anthropic models: {exc}; OpenAI fallback failed: {exc2}") and
chain it using "from exc2"; reference _fetch_openai_models, the local variables
exc and fallback_base, and the HTTPException raises to locate and apply these
changes.
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro Plus
Run ID: 8c92fb89-ae1e-4d7f-bbc9-80f55e45fdab
📒 Files selected for processing (2)
- infrastructure/ai/providers/anthropic_provider.py
- interfaces/api/v1/core/settings.py
```python
except Exception as exc:
    if not base_url:
        raise HTTPException(502, f"Failed to fetch Anthropic models: {exc}")
    logger.info("Anthropic-style fetch failed, trying OpenAI-style: %s", exc)
    try:
        fallback_base = base_url.rstrip("/")
        if not fallback_base.endswith("/v1"):
            fallback_base += "/v1"
        return await _fetch_openai_models(api_key, fallback_base)
    except Exception:
        raise HTTPException(502, f"Failed to fetch models (tried both Anthropic and OpenAI format): {exc}")
```
Add exception chaining and fix the error message on fallback failure.
Two issues:
- Missing `from exc` / `from None` on the `raise HTTPException` statements loses stack-trace context for debugging.
- Line 171 reports the original Anthropic exception `exc`, but the OpenAI fallback also failed. The error message should include both failures for better diagnostics.
🔧 Proposed fix
```diff
 except Exception as exc:
     if not base_url:
-        raise HTTPException(502, f"Failed to fetch Anthropic models: {exc}")
+        raise HTTPException(502, f"Failed to fetch Anthropic models: {exc}") from None
     logger.info("Anthropic-style fetch failed, trying OpenAI-style: %s", exc)
     try:
         fallback_base = base_url.rstrip("/")
         if not fallback_base.endswith("/v1"):
             fallback_base += "/v1"
         return await _fetch_openai_models(api_key, fallback_base)
-    except Exception:
-        raise HTTPException(502, f"Failed to fetch models (tried both Anthropic and OpenAI format): {exc}")
+    except Exception as openai_exc:
+        raise HTTPException(
+            502,
+            f"Failed to fetch models (Anthropic: {exc}, OpenAI: {openai_exc})"
+        ) from None
```
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```python
except Exception as exc:
    if not base_url:
        raise HTTPException(502, f"Failed to fetch Anthropic models: {exc}") from None
    logger.info("Anthropic-style fetch failed, trying OpenAI-style: %s", exc)
    try:
        fallback_base = base_url.rstrip("/")
        if not fallback_base.endswith("/v1"):
            fallback_base += "/v1"
        return await _fetch_openai_models(api_key, fallback_base)
    except Exception as openai_exc:
        raise HTTPException(
            502,
            f"Failed to fetch models (Anthropic: {exc}, OpenAI: {openai_exc})"
        ) from None
```
🧰 Tools
🪛 Ruff (0.15.10)
- [warning] 161: Do not catch blind exception: `Exception` (BLE001)
- [warning] 163: Within an `except` clause, raise exceptions with `raise ... from err` or `raise ... from None` to distinguish them from errors in exception handling (B904)
- [warning] 170: Do not catch blind exception: `Exception` (BLE001)
- [warning] 171: Within an `except` clause, raise exceptions with `raise ... from err` or `raise ... from None` to distinguish them from errors in exception handling (B904)
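The B904 warnings hinge on Python's exception-chaining rules: inside an `except` clause, `raise ... from exc` records the original error as `__cause__`, while `raise ... from None` deliberately suppresses the implicit context. A minimal, self-contained illustration of the semantics (not the PR code):

```python
# Demonstrates exception chaining inside an except clause:
# "from exc" preserves the original as __cause__; "from None" suppresses it.

def chained():
    try:
        raise ValueError("original failure")
    except ValueError as exc:
        # Explicit chaining: the traceback keeps the root cause.
        raise RuntimeError("wrapped") from exc

def suppressed():
    try:
        raise ValueError("original failure")
    except ValueError:
        # Explicit suppression: the traceback shows only RuntimeError.
        raise RuntimeError("wrapped") from None

try:
    chained()
except RuntimeError as err:
    assert isinstance(err.__cause__, ValueError)

try:
    suppressed()
except RuntimeError as err:
    assert err.__cause__ is None and err.__suppress_context__
```

Either form satisfies B904; the difference is whether the original traceback stays visible, which is why the review asks for a combined message when using `from None`.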
fix: Add Anthropic-format support for third-party API relays
Adds the ability to call third-party relay endpoints using the Anthropic format.
But what is the point? Who actually calls models through a relay using the Anthropic format?
Anthropic provider
- generate()'s messages included "role": "system", which Anthropic does not accept, so the proxy returned 500; now only the user message is sent, and the system prompt is passed as the top-level parameter
- generate() did not pass base_url to the SDK, so a custom proxy had no effect on calls
- Extended Thinking compatibility: iterate response.content to find the TextBlock instead of hard-coding index 0
- base_url is normalized uniformly, stripping a trailing /v1 to avoid producing /v1/v1/messages
- Model fallback chain: config.model → settings.default_model → DEFAULT_MODEL
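The provider changes above can be sketched as small pure helpers. Names and the dict-shaped content blocks are illustrative, not the actual anthropic_provider.py code (the real SDK returns typed block objects):

```python
# Illustrative helpers mirroring the provider fixes described above.

def normalize_base_url(base_url):
    """Strip a trailing /v1 so the SDK does not build /v1/v1/messages."""
    if not base_url:
        return None
    base = base_url.rstrip("/")
    if base.endswith("/v1"):
        base = base[: -len("/v1")]
    return base

def resolve_model(config_model, settings_default, hard_default):
    """Fallback chain: config.model -> settings.default_model -> DEFAULT_MODEL."""
    return config_model or settings_default or hard_default

def build_payload(system_prompt, user_text):
    """Anthropic's Messages API rejects "role": "system" inside messages;
    the system prompt belongs in the top-level "system" field instead."""
    return {
        "system": system_prompt,
        "messages": [{"role": "user", "content": user_text}],
    }

def extract_text(content_blocks):
    """With Extended Thinking, a thinking block can precede the text block,
    so iterate over blocks instead of hard-coding index 0."""
    return "".join(b["text"] for b in content_blocks if b.get("type") == "text")
```

For example, `normalize_base_url("https://relay.example.com/v1/")` yields `"https://relay.example.com"`, so appending `/v1/messages` later produces a single `/v1` segment.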
Model list fetching
- Removed the hardcoded _ANTHROPIC_MODELS; the list is now fetched live from the /v1/models API
- If Anthropic-native auth (x-api-key) fails, automatically fall back to OpenAI Bearer auth
- On fallback, the /v1 path is appended correctly
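The fetch-with-fallback flow can be sketched as two request builders. Function names are hypothetical; only the URL and auth-header construction mirrors what the PR describes:

```python
# Hypothetical request builders for the two model-listing styles.

def anthropic_models_request(api_key, base_url=None):
    """Anthropic-native style: GET {base}/v1/models with x-api-key auth.
    A trailing /v1 on the configured base is stripped first."""
    base = (base_url or "https://api.anthropic.com").rstrip("/")
    if base.endswith("/v1"):
        base = base[: -len("/v1")]
    headers = {"x-api-key": api_key, "anthropic-version": "2023-06-01"}
    return f"{base}/v1/models", headers

def openai_models_request(api_key, base_url):
    """OpenAI-compatible fallback: GET {base}/v1/models with Bearer auth,
    appending /v1 when the configured base_url lacks it."""
    base = base_url.rstrip("/")
    if not base.endswith("/v1"):
        base += "/v1"
    return f"{base}/models", {"Authorization": f"Bearer {api_key}"}
```

Both builders end up at a `/v1/models` URL regardless of whether the configured relay base already carries a `/v1` suffix, which is the normalization the review diff exercises.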