
fix(providers): use nullish coalescing for temperature with provider-scoped env support#8167

Merged
mldangelo merged 4 commits into main from fix/temperature-nullish-coalescing-extended
Mar 15, 2026

Conversation

@mldangelo
Member

Closes #8161

Summary

  • Replace || with ?? for the temperature config in the Anthropic Messages and LocalAI providers (builds on fix(providers): use nullish coalescing for temperature in Anthropic Messages and LocalAI #8163)
  • Add provider-scoped env override support for temperature — env.ANTHROPIC_TEMPERATURE / env.LOCALAI_TEMPERATURE set per-provider in YAML config now correctly participate in the fallback chain
  • The || operator treats 0 as falsy, so temperature: 0 was silently ignored and fell back to the environment variable or hardcoded default (0.7 for LocalAI)
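The operator difference is easy to see in isolation; a minimal sketch in plain TypeScript (not the provider code itself):

```typescript
// `||` falls through on every falsy value, including an intentional 0;
// `??` falls through only on null or undefined.
const configTemperature: number = 0; // user explicitly asked for deterministic sampling
const fallback = 0.7;

const withOr = configTemperature || fallback; // 0 is falsy → yields 0.7 (the bug)
const withNullish = configTemperature ?? fallback; // 0 is not nullish → yields 0 (the fix)

console.log(withOr, withNullish); // → 0.7 0
```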

Temperature resolution order (both providers)

config.temperature → provider-scoped env → global env (process.env / cliState) → default
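For context, a provider-scoped env override is set per-provider under `env:` in the YAML config. A hypothetical fragment (provider IDs and values are illustrative, not taken from this PR):

```yaml
providers:
  - id: anthropic:messages:claude-sonnet-4-6
    env:
      ANTHROPIC_TEMPERATURE: "0.42"   # provider-scoped; wins over global env
  - id: localai:chat:test-model
    config:
      temperature: 0                  # explicit config; wins over any env value
```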

What was missing in #8163

The original fix correctly handled config.temperature: 0, but provider-scoped env overrides (set via env: in YAML per-provider) were still bypassed because getEnvFloat() only reads process.env and cliState.config.env, not options.env. This PR adds the missing middle layer.
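A sketch of the full chain, assuming `parseEnvFloat` behaves as described in this PR; the function below is illustrative, not the actual promptfoo source (in the real code the global lookup goes through `getEnvFloat`, which reads `process.env` and `cliState.config.env`):

```typescript
// Convert an env string to a number, returning undefined on missing/invalid input
// so it composes cleanly with `??`.
function parseEnvFloat(value?: string): number | undefined {
  if (value === undefined) return undefined;
  const parsed = Number.parseFloat(value);
  return Number.isNaN(parsed) ? undefined : parsed;
}

// Four-level fallback described above: config → provider env → global env → default.
function resolveTemperature(
  configTemperature: number | undefined,
  providerEnv: Record<string, string> | undefined,
  globalEnv: Record<string, string | undefined>,
  defaultValue = 0.7,
): number {
  return (
    configTemperature ??                               // 1. explicit config (0 is respected)
    parseEnvFloat(providerEnv?.LOCALAI_TEMPERATURE) ?? // 2. provider-scoped env
    parseEnvFloat(globalEnv.LOCALAI_TEMPERATURE) ??    // 3. global env
    defaultValue                                       // 4. hardcoded default
  );
}

console.log(resolveTemperature(0, { LOCALAI_TEMPERATURE: '0.9' }, {})); // → 0
console.log(resolveTemperature(undefined, { LOCALAI_TEMPERATURE: '0.42' }, {})); // → 0.42
console.log(resolveTemperature(undefined, undefined, {})); // → 0.7
```

Note that `temperature: 0` wins at step 1 precisely because `??` only falls through on nullish values.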

Test plan

  • Added tests for Anthropic Messages: temperature: 0 in config, provider-scoped env 0.42, provider-scoped env 0, and config-over-env precedence
  • Added tests for LocalAI Chat/Completion: provider-scoped env temperature, provider-scoped env 0, config-over-env precedence, and default 0.7 fallback
  • All 52 tests pass (npx vitest run test/providers/anthropic/messages.test.ts test/providers/localai.test.ts)
  • npm run tsc passes
  • End-to-end verified with real Anthropic API: explicit temperature: 0, provider-scoped env.ANTHROPIC_TEMPERATURE: "0.42", and default fallback all send correct values

karesansui-u and others added 4 commits March 15, 2026 21:17
…essages and LocalAI

Replace || with ?? for temperature config in Anthropic Messages provider
and LocalAI providers. The || operator treats 0 as falsy, so
temperature: 0 was silently ignored and fell back to environment
variable or hardcoded default (0.7 for LocalAI).

The fix already exists in anthropic/completion.ts (using ??) and was
applied to OpenAI/Azure in PR #7323, but was not propagated to these
providers.

Closes #8161

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
The original nullish coalescing fix (|| → ??) correctly preserved
temperature: 0 from config, but provider-scoped env overrides
(e.g. env.ANTHROPIC_TEMPERATURE set per-provider in YAML) were
still bypassed because getEnvFloat() only reads process.env and
cliState, not options.env.

Add provider-scoped env to the temperature fallback chain:
config.temperature → provider env → global env → default

- Store env on LocalAiGenericProvider instance
- Add parseEnvFloat helper to convert string env values to numbers
- Update model IDs in tests to claude-sonnet-4-6
@use-tusk
Contributor

use-tusk bot commented Mar 15, 2026

No test execution environment matched


  • Commit: 9722cac — Status: No test execution environment matched — Created (UTC): Mar 15, 2026 1:24 PM

@mldangelo mldangelo marked this pull request as ready for review March 15, 2026 13:25
@mldangelo mldangelo requested a review from MrFlounder as a code owner March 15, 2026 13:25
@github-actions
Contributor

Security Review ✅

No critical issues found. The changes correctly add provider-scoped env variable support for temperature, with proper precedence: config.temperature → env.PROVIDER_TEMPERATURE → global env → default.

🟡 Minor Observations (1 item)
  • src/providers/anthropic/messages.ts:32, src/providers/localai.ts:8 - parseEnvFloat is duplicated in both files. Consider extracting to a shared utility (e.g., src/providers/shared.ts) to reduce duplication. Non-blocking.

Last updated: 2026-03-15 | Reviewing: 9722cac

Contributor

@promptfoo-scanner promptfoo-scanner bot left a comment


👍 All Clear

I reviewed the changes to Anthropic and LocalAI providers that adjust how the numeric temperature parameter is sourced (adding provider-scoped env overrides and nullish coalescing fixes). I traced inputs and outputs through the providers and confirmed no modifications to prompts, tools/capabilities, or output execution. Based on this, there are no new LLM security risks introduced by this PR.

Minimum severity threshold: 🟡 Medium | To re-scan after changes, comment @promptfoo-scanner



@coderabbitai
Contributor

coderabbitai bot commented Mar 15, 2026

📝 Walkthrough

This pull request fixes a bug where explicitly setting temperature: 0 was silently ignored in Anthropic Messages and LocalAI providers due to the use of the logical OR operator (||), which treats 0 as falsy. A new parseEnvFloat helper function safely parses environment variable strings to numbers, returning undefined on invalid input. The temperature resolution logic is updated to use the nullish coalescing operator (??) instead of ||, ensuring that temperature: 0 is properly respected. An optional env field is added to LocalAiGenericProvider to support provider-scoped environment overrides. Comprehensive tests are added to validate temperature precedence: provider-scoped environment temperature is used when config temperature is not set, and config temperature takes precedence over environment values.

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~12 minutes

Suggested reviewers

  • MrFlounder
🚥 Pre-merge checks | ✅ 4 | ❌ 1

❌ Failed checks (1 warning)

  • Docstring Coverage — ⚠️ Warning: Docstring coverage is 0.00%, which is insufficient (required threshold: 80.00%). Resolution: write docstrings for the functions missing them to satisfy the coverage threshold.
✅ Passed checks (4 passed)
  • Title check — ✅ Passed: The title clearly and concisely summarizes the main change: replacing || with ?? for temperature handling and adding provider-scoped env support.
  • Description check — ✅ Passed: The description is well related to the changeset, clearly explaining the problem, solution, implementation details, and test coverage.
  • Linked Issues check — ✅ Passed: The PR fully addresses issue #8161 by fixing the || vs ?? issue in the Anthropic Messages and LocalAI providers, including provider-scoped env support and comprehensive tests.
  • Out of Scope Changes check — ✅ Passed: All changes are scoped to temperature handling in the Anthropic Messages and LocalAI providers with supporting tests; no unrelated modifications detected.



Contributor

@coderabbitai coderabbitai bot left a comment


🧹 Nitpick comments (3)
src/providers/anthropic/messages.ts (1)

32-38: Consider centralizing parseEnvFloat to avoid drift across providers.

This helper now exists in multiple provider files with identical logic. Extracting to a shared util would reduce divergence risk in future changes.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/providers/anthropic/messages.ts` around lines 32 - 38, The parseEnvFloat
helper in this file (function parseEnvFloat) is duplicated across providers;
extract it into a shared utility (e.g., create a new exported function
parseEnvFloat in a common utils module) and replace the local definition with an
import from that module in this file and other provider files that currently
define the same function, ensuring signatures and behavior remain identical and
updating imports where parseEnvFloat is referenced.
test/providers/localai.test.ts (1)

49-101: Add matching env-precedence tests for LocalAiCompletionProvider.

Nice chat coverage here, but the same provider-scoped env logic was added to completion too; mirroring these cases there would close the regression gap.

Suggested additions
+  it('should use provider-scoped env temperature when config temperature is not set (completion)', async () => {
+    vi.mocked(fetchWithCache).mockResolvedValue({
+      data: { choices: [{ text: 'Test output' }] },
+    } as any);
+
+    const provider = new LocalAiCompletionProvider('test-model', {
+      config: {},
+      env: { LOCALAI_TEMPERATURE: '0.42' },
+    });
+
+    await provider.callApi('Test prompt');
+
+    const callBody = JSON.parse(
+      (vi.mocked(fetchWithCache).mock.calls[0][1] as RequestInit).body as string,
+    );
+    expect(callBody.temperature).toBe(0.42);
+  });
As per coding guidelines, "`test/**/*.{ts,tsx}`: Test both success and error cases for all functionality."
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@test/providers/localai.test.ts` around lines 49 - 101, Add three unit tests
mirroring the LocalAiChatProvider cases but for LocalAiCompletionProvider: (1)
when config.temperature is undefined and env { LOCALAI_TEMPERATURE: '0.42' }
assert the request body temperature is 0.42, (2) when env { LOCALAI_TEMPERATURE:
'0' } assert temperature is 0, and (3) when config.temperature is set (e.g.,
0.1) and env { LOCALAI_TEMPERATURE: '0.9' } assert the config value (0.1) wins.
In each test mock fetchWithCache to resolve with a completion-shaped response,
instantiate LocalAiCompletionProvider with the same options pattern used in the
chat tests, call provider.callApi('Test prompt'), parse the fetchWithCache mock
call body (JSON.parse((vi.mocked(fetchWithCache).mock.calls[0][1] as
RequestInit).body as string)) and assert callBody.temperature accordingly.
src/providers/localai.ts (1)

63-67: Extract temperature resolution into a shared base helper.

The same precedence chain is duplicated in chat and completion methods; a single helper on LocalAiGenericProvider would keep behavior in sync.

Refactor sketch
 class LocalAiGenericProvider implements ApiProvider {
+  protected resolveTemperature(defaultValue = 0.7): number {
+    return (
+      this.config.temperature ??
+      parseEnvFloat(this.env?.LOCALAI_TEMPERATURE) ??
+      getEnvFloat('LOCALAI_TEMPERATURE') ??
+      defaultValue
+    );
+  }
 }

 // chat
-      temperature:
-        this.config.temperature ??
-        parseEnvFloat(this.env?.LOCALAI_TEMPERATURE) ??
-        getEnvFloat('LOCALAI_TEMPERATURE') ??
-        0.7,
+      temperature: this.resolveTemperature(),

 // completion
-      temperature:
-        this.config.temperature ??
-        parseEnvFloat(this.env?.LOCALAI_TEMPERATURE) ??
-        getEnvFloat('LOCALAI_TEMPERATURE') ??
-        0.7,
+      temperature: this.resolveTemperature(),

Also applies to: 147-151

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/providers/localai.ts` around lines 63 - 67, Summary: Extract the
duplicated temperature precedence chain into a shared helper on
LocalAiGenericProvider. Create a private method (e.g., resolveTemperature or
getTemperature) on LocalAiGenericProvider that returns this.config.temperature
?? parseEnvFloat(this.env?.LOCALAI_TEMPERATURE) ??
getEnvFloat('LOCALAI_TEMPERATURE') ?? 0.7, then replace the inline chains in the
chat and completion code paths with calls to that method; ensure both locations
(the current blocks around temperature at lines referenced and the similar block
at 147-151) use the new helper so behavior stays consistent.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: b427ba0c-2854-49f4-a1d0-a6f56469c9a7

📥 Commits

Reviewing files that changed from the base of the PR and between deaf339 and 9722cac.

📒 Files selected for processing (4)
  • src/providers/anthropic/messages.ts
  • src/providers/localai.ts
  • test/providers/anthropic/messages.test.ts
  • test/providers/localai.test.ts

Contributor

@promptfoo-scanner promptfoo-scanner bot left a comment


👍 All Clear

I reviewed the changes adding provider-scoped env handling for temperature across Anthropic Messages and LocalAI providers. The modifications only affect how a numeric temperature value is resolved from config or env and do not change prompt construction, tool access, or output execution. Tracing through provider instantiation and env override flows confirmed these values come from configuration, not untrusted runtime inputs. No LLM-security issues were identified related to prompt injection, data exfiltration, secrets/PII exposure, insecure output handling, excessive agency, or jailbreak risks.

Minimum severity threshold: 🟡 Medium | To re-scan after changes, comment @promptfoo-scanner



@mldangelo mldangelo merged commit 3b20e2b into main Mar 15, 2026
44 checks passed
@mldangelo mldangelo deleted the fix/temperature-nullish-coalescing-extended branch March 15, 2026 15:06


Development

Successfully merging this pull request may close these issues.

bug: temperature: 0 silently falls back to default in Anthropic Messages and LocalAI providers

2 participants