fix: respect supportsTemperature flag in OpenAI-compatible providers #12164

Draft
roomote-v0[bot] wants to merge 1 commit into main from fix/openai-compatible-temperature-deprecation

Conversation

Contributor

roomote-v0 (bot) commented Apr 21, 2026

Related GitHub Issue

Closes: #12162

Description

This PR attempts to address Issue #12162. When models like Claude Opus 4.7 are proxied through OpenAI-compatible gateways (e.g. LiteLLM/Bedrock), sending the temperature parameter causes a 400 BadRequestError because the model has deprecated it.

Changes:

  • src/api/transform/model-params.ts: Uses the existing supportsTemperature field on ModelInfo to omit temperature when supportsTemperature === false in both the openai and openrouter format branches, replacing the previous TODO comments.
  • src/api/providers/openai.ts (OpenAiHandler): Guards the temperature parameter in createMessage() streaming path with supportsTemperature !== false.
  • src/api/providers/base-openai-compatible-provider.ts (BaseOpenAiCompatibleProvider): Guards temperature in createStream() with the same check.
  • src/api/providers/openai-compatible.ts (OpenAICompatibleHandler): Guards temperature in both createMessage() and completePrompt().
  • webview-ui/src/components/settings/providers/OpenAICompatible.tsx: Adds a "Supports Temperature" checkbox so users can explicitly mark models that do not support temperature.
  • webview-ui/src/i18n/locales/en/settings.json: Adds translation keys for the new checkbox.
  • src/api/transform/__tests__/model-params.spec.ts: Adds 6 new tests covering the supportsTemperature behavior for the openai and openrouter formats, plus backwards compatibility with the existing hardcoded o1/o3-mini checks.
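
The core guard described above can be sketched as follows. This is a minimal illustration under assumptions, not the actual model-params.ts source: `ModelInfo` and `supportsTemperature` are named in this PR, but `getModelParams` and the parameter shape here are stand-ins.

```typescript
// Sketch of the supportsTemperature guard. `ModelInfo.supportsTemperature`
// comes from the PR description; `getModelParams` and `ApiParams` are
// illustrative names, not the real implementation.

interface ModelInfo {
	supportsTemperature?: boolean
}

interface ApiParams {
	maxTokens: number
	temperature?: number
}

function getModelParams(info: ModelInfo, requestedTemperature: number): ApiParams {
	const params: ApiParams = { maxTokens: 4096 }
	// Only attach temperature when the model does not explicitly opt out.
	// An undefined flag is treated as "supported" for backwards compatibility,
	// so existing provider configs keep their current behavior.
	if (info.supportsTemperature !== false) {
		params.temperature = requestedTemperature
	}
	return params
}

// A model that deprecated temperature gets a body without the key at all,
// so an OpenAI-compatible gateway never sees it:
console.log(JSON.stringify(getModelParams({ supportsTemperature: false }, 0.7)))
// {"maxTokens":4096}
console.log(JSON.stringify(getModelParams({}, 0.7)))
// {"maxTokens":4096,"temperature":0.7}
```

Leaving the key out entirely (rather than sending `temperature: null` or `undefined`) matters because some gateways reject the parameter's mere presence with a 400.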

Test Procedure

  • All 63 tests in model-params.spec.ts pass (including 6 new ones)
  • All 13 tests in base-openai-compatible-provider.spec.ts pass
  • All 10 tests in OpenAICompatible.spec.tsx pass
  • Full lint and type-check pass across the monorepo

To manually verify:

  1. Set up OpenAI Compatible provider with a model like claude-opus-4-7
  2. Uncheck the new "Supports Temperature" checkbox in model capabilities
  3. Send a prompt -- the request body should no longer include temperature
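
If the checkbox behaves as described, the provider-side request construction should look roughly like this. A hedged sketch with assumed names (the model ID, message, and `buildRequestBody` are placeholders, not the provider code), showing why a conditional spread keeps the key out of the serialized JSON:

```typescript
// Illustrative only: conditionally including temperature in a streaming
// request body. `supportsTemperature` would come from the configured
// ModelInfo; everything else is a stand-in.

function buildRequestBody(supportsTemperature: boolean | undefined, temperature: number) {
	return {
		model: "claude-opus-4-7",
		stream: true,
		messages: [{ role: "user", content: "Hello" }],
		// Spreading an empty object adds no key, so JSON.stringify omits
		// temperature entirely instead of sending temperature: undefined.
		...(supportsTemperature !== false ? { temperature } : {}),
	}
}

console.log("temperature" in buildRequestBody(false, 0.7)) // false
console.log("temperature" in buildRequestBody(undefined, 0.7)) // true
```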

Pre-Submission Checklist

  • Issue Linked: This PR is linked to an approved GitHub Issue (see "Related GitHub Issue" above).
  • Scope: My changes are focused on the linked issue (one major feature/fix per PR).
  • Self-Review: I have performed a thorough self-review of my code.
  • Testing: New and/or updated tests have been added to cover my changes.
  • Documentation Impact: No documentation updates needed -- the UI is self-explanatory.
  • Contribution Guidelines: I have read and agree to the Contributor Guidelines.

Feedback and guidance are welcome.

When models like Claude Opus 4.7 are proxied through OpenAI-compatible
gateways, sending the temperature parameter causes a 400 error because
the model has deprecated it.

This change:
- Uses the existing supportsTemperature ModelInfo field in model-params.ts
  to omit temperature when supportsTemperature === false
- Adds the same guard in OpenAiHandler, BaseOpenAiCompatibleProvider,
  and OpenAICompatibleHandler
- Adds a "Supports Temperature" checkbox in the OpenAI Compatible
  settings UI so users can turn it off for models that do not support it
- Adds test coverage for the new behavior

Closes #12162

Development

Successfully merging this pull request may close these issues.

[BUG] OpenAI Compatible - Claude Opus 4.7 has temperature deprecated
