
Authenticate interception server via INTERCEPTION_SECRET #1180

Merged
rasdani merged 2 commits into main from worktree-interception-auth
Apr 20, 2026

Conversation

@rasdani (Contributor) commented Apr 18, 2026

Summary

  • InterceptionServer accepts an optional secret parameter; when set, validates the Authorization: Bearer <secret> header on every request using hmac.compare_digest — auth check runs before rollout lookup to prevent ID enumeration
  • CliAgentEnv reads INTERCEPTION_SECRET from the host environment and injects it as OPENAI_API_KEY into sandbox env vars, overriding any dummy value (e.g. "intercepted")
  • OPENAI_API_KEY added to PROTECTED_ENV_VARS
  • Empty string INTERCEPTION_SECRET treated as unset in both server and env (consistent behaviour)
  • CliAgentEnv is backward compatible: if INTERCEPTION_SECRET is unset the server accepts all requests as before

Test plan

  • Unit-tested: no-secret (backward compat), no-auth → 401, wrong-secret → 401, correct-secret → accepted
  • Set INTERCEPTION_SECRET=<secret> and run a rollout end-to-end

🤖 Generated with Claude Code


Note

Medium Risk
Adds authentication gating to the interception HTTP server and propagates the shared secret into sandbox env vars; misconfiguration could cause rollouts to fail with 401s or inadvertently run unauthenticated if the secret is unset/empty.

Overview
Adds optional request authentication to InterceptionServer: when INTERCEPTION_SECRET is set, every intercepted request must include Authorization: Bearer <secret> (constant-time compared) and unauthorized requests return 401 before rollout lookup.

Updates CliAgentEnv to read INTERCEPTION_SECRET, pass it into InterceptionServer, and (when non-empty) inject it as OPENAI_API_KEY into sandbox env vars while protecting OPENAI_API_KEY from being overridden by subclasses.
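A minimal sketch of that injection logic, using the `build_env_vars` name from the PR but with a simplified signature (the real CliAgentEnv method takes rollout state and applies more defaults):

```python
import os

def build_env_vars(base: dict[str, str]) -> dict[str, str]:
    """Build sandbox env vars, injecting the interception secret."""
    env = dict(base)
    env.setdefault("HTTPX_TIMEOUT", "3600")
    secret = os.environ.get("INTERCEPTION_SECRET")
    if secret:  # empty string behaves like unset
        # Overrides any dummy placeholder such as "intercepted"
        env["OPENAI_API_KEY"] = secret
    return env
```

Because `OPENAI_API_KEY` is in `PROTECTED_ENV_VARS`, subclasses overriding env construction cannot clobber the injected secret.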

Reviewed by Cursor Bugbot for commit f96ff0e. Bugbot is set up for automated code reviews on this repo.

@rasdani rasdani force-pushed the worktree-interception-auth branch from 76a352c to 630515f on April 18, 2026 at 23:11
Comment thread verifiers/utils/interception_utils.py Outdated
Comment thread verifiers/envs/experimental/cli_agent_env.py
Comment thread verifiers/envs/experimental/cli_agent_env.py
Comment thread verifiers/envs/experimental/cli_agent_env.py
@mikasenghaas (Member) left a comment

couple nits, once resolved, can merge from my side

Comment thread docs/environments.md
Newer and more experimental environment classes include:

- **`GymEnv`** — universal runner for Gym-compatible environments (OpenAI Gym / Gymnasium API)
- **`CliAgentEnv`** — runs custom agent code inside sandboxes, intercepting API requests. Accepts sandbox configuration parameters including `docker_image`, `cpu_cores`, `memory_gb`, `disk_size_gb`, `gpu_count`, `gpu_type`, `timeout_minutes`, `environment_vars`, and `labels` for sandbox categorization. Also accepts retry tuning (like `max_retries`) and connection pooling (like `sandbox_client_max_workers`) parameters via `SandboxMixin`. Subclasses can override `get_sandbox_resources(state)` for per-instance resource allocation and `build_env_vars(state)` for custom environment variables (`PROTECTED_ENV_VARS` cannot be overridden). VMs are auto-enabled when `gpu_count > 0`
Member

I don't think this needs to be in public docs.

Member

Actually, it already seems to have more detail than an overview should.

env_vars.setdefault("HTTPX_TIMEOUT", "3600")
secret = os.environ.get("INTERCEPTION_SECRET")
if secret:
env_vars["OPENAI_API_KEY"] = secret
Member

Question that just popped into my head: the way we hijack OPENAI_BASE_URL and OPENAI_API_KEY means the agent can never run a request against the actual OpenAI API, right? E.g. if I wanted my agent to run vf-eval against the OpenAI API directly.

This was already not working before this PR, I believe, so it's not a concern, but is this correct?

@rasdani rasdani force-pushed the worktree-interception-auth branch from d498de9 to 5f726f0 on April 20, 2026 at 21:52
rasdani and others added 2 commits April 20, 2026 21:54
- InterceptionServer accepts optional `secret` parameter; validates
  Authorization header with hmac.compare_digest when set
- CliAgentEnv reads INTERCEPTION_SECRET at startup and injects it
  as OPENAI_API_KEY into sandbox env vars, overriding any dummy value

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
- Check Authorization header before rollout lookup to prevent ID
  enumeration (401 is now indistinguishable from 404 to unauthenticated callers)
- Treat empty string INTERCEPTION_SECRET as no secret (consistent
  between server and build_env_vars)
- Document INTERCEPTION_SECRET in docs/environments.md

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
@rasdani rasdani force-pushed the worktree-interception-auth branch from 5f726f0 to f96ff0e on April 20, 2026 at 21:54
@cursor cursor Bot left a comment

Cursor Bugbot has reviewed your changes and found 1 potential issue.


env_vars.setdefault("HTTPX_TIMEOUT", "3600")
secret = os.environ.get("INTERCEPTION_SECRET")
if secret:
env_vars["OPENAI_API_KEY"] = secret

RLM harness overrides injected secret causing 401 failures

Medium Severity

When INTERCEPTION_SECRET is set, build_env_vars correctly injects it as OPENAI_API_KEY into sandbox environment variables. However, the existing RLM harness in verifiers/envs/experimental/composable/harnesses/rlm.py hardcodes export OPENAI_API_KEY=intercepted in its shell run command, overriding the framework-injected secret at runtime. The agent then sends Bearer intercepted instead of Bearer <actual_secret>, and the interception server rejects every request with a 401.

Additional Locations (1)

@rasdani rasdani merged commit 8e41365 into main Apr 20, 2026
6 checks passed
rasdani added a commit that referenced this pull request Apr 20, 2026
The RLM harness hardcoded `export OPENAI_API_KEY=intercepted` in its
shell run command, a placeholder that made sense when the interception
proxy did no auth. After #1180, build_env_vars injects the real
INTERCEPTION_SECRET as OPENAI_API_KEY; the hardcoded export clobbered
it, causing every agent request to hit the proxy with `Bearer
intercepted` and get rejected with 401.

Use `${OPENAI_API_KEY:-intercepted}` shell default-expansion: the
framework-injected secret wins when present, and the literal
"intercepted" placeholder preserves the pre-auth path when
build_env_vars declines to set the key (INTERCEPTION_SECRET unset or
empty).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
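The `${OPENAI_API_KEY:-intercepted}` expansion this fix relies on can be checked in any POSIX shell; this is a standalone illustration, not repo code:

```shell
#!/bin/sh
# ${VAR:-default} substitutes the default when VAR is unset OR empty,
# which matches the "empty secret means no secret" convention above:
# the framework-injected secret wins when present, and the literal
# "intercepted" placeholder preserves the pre-auth path otherwise.

unset OPENAI_API_KEY
echo "${OPENAI_API_KEY:-intercepted}"    # prints: intercepted

OPENAI_API_KEY=""
echo "${OPENAI_API_KEY:-intercepted}"    # prints: intercepted

OPENAI_API_KEY="real-secret"
echo "${OPENAI_API_KEY:-intercepted}"    # prints: real-secret
```

Had the harness used `${OPENAI_API_KEY:=intercepted}` or a plain `export`, the injected secret would still have been clobbered; `:-` is the variant that defers to the environment.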
rasdani added a commit that referenced this pull request Apr 20, 2026
…rnesses (#1213)

Both harnesses previously hardcoded the literal "intercepted" as the
agent's OpenAI API key — a placeholder from when the interception
proxy did no auth. After #1180, build_env_vars injects the real
INTERCEPTION_SECRET as OPENAI_API_KEY; the hardcoded values clobbered
it and every agent request hit the proxy with `Bearer intercepted` →
401.

Use bash-style default expansion `${OPENAI_API_KEY:-intercepted}` in
both the RLM run command and the opencode.json config. Preserves the
pre-auth path when no secret is configured.

Co-authored-by: Claude Opus 4.7 (1M context) <noreply@anthropic.com>