Codex app-server sends /v1/responses request without Authorization header #87

@yfrbl

Summary

Requests from the local Codex setup to https://api.openai.com/v1/responses fail with:

401 Unauthorized: Missing bearer or basic authentication in header

The failure is reproducible with a running codex app-server that has no OPENAI_API_KEY in its process environment. The local ~/.codex/auth.json does contain ChatGPT OAuth tokens, but the app-server/proxy path still appears to send a request to the OpenAI Responses API without any Authorization header.

Environment

  • Host user: umbrel
  • Date of investigation: 2026-04-28
  • Codex app-server command:
/home/umbrel/.litter/bin/codex app-server --listen ws://127.0.0.1:8390
  • Installed app-server binary:
/home/umbrel/.litter/codex/rust-v0.125.0/codex
  • Symlink:
/home/umbrel/.litter/bin/codex -> /home/umbrel/.litter/codex/rust-v0.125.0/codex

Observed Processes

Initial app-server process:

/home/umbrel/.litter/bin/codex app-server --listen ws://127.0.0.1:8390

Listening socket:

127.0.0.1:8390 users:(("codex",pid=<redacted>,fd=<redacted>))

After SIGTERM, the app-server was automatically restarted:

/home/umbrel/.litter/bin/codex app-server --listen ws://127.0.0.1:8390

Listening socket after restart:

127.0.0.1:8390 users:(("codex",pid=<redacted>,fd=<redacted>))

Auth State

The app-server process environment was checked without printing secrets.

Before restart:

OPENAI_API_KEY=missing
OPENAI_BASE_URL=missing
OPENAI_API_BASE=missing
OPENAI_ORG=missing
OPENAI_PROJECT=missing
CODEX_HOME=missing
HOME=/home/umbrel
USER=umbrel
PATH=/usr/local/bin:/usr/bin:/bin:/usr/games

After restart:

OPENAI_API_KEY=missing
OPENAI_BASE_URL=missing
OPENAI_API_BASE=missing
OPENAI_ORG=missing
OPENAI_PROJECT=missing
CODEX_HOME=missing
HOME=/home/umbrel
USER=umbrel
PATH=/usr/local/bin:/usr/bin:/bin:/usr/games

The interactive Codex CLI processes also had no OpenAI API key:

interactive `node /usr/local/bin/codex` process
OPENAI_API_KEY=missing
OPENAI_BASE_URL=missing
OPENAI_API_BASE=missing
OPENAI_ORG=missing
OPENAI_PROJECT=missing
interactive native `codex` process
OPENAI_API_KEY=missing
OPENAI_BASE_URL=missing
OPENAI_API_BASE=missing
OPENAI_ORG=missing
OPENAI_PROJECT=missing

/home/umbrel/.codex/auth.json does contain ChatGPT OAuth tokens, but no API key:

auth_mode=chatgpt
has_api_key=false
has_tokens=true
has_access=true
has_refresh=true
last_refresh=<redacted; after initial app-server start>

Important timing detail:

  • app-server initial start happened before auth.json refresh
  • app-server restart happened after auth.json refresh

Restarting the app-server did not add OPENAI_API_KEY to the process environment.

Log Evidence

Session logs contain repeated upstream 401s:

unexpected status 401 Unauthorized: Missing bearer or basic authentication in header,
url: https://api.openai.com/v1/responses

Examples were found under /home/umbrel/.codex/sessions/<date>/; session file IDs omitted.

The TUI log also shows repeated websocket connection lifecycle events:

model_client.stream_responses_websocket{model=gpt-5.5 wire_api=responses transport="responses_websocket" api.path="responses" ...}
model_client.websocket_connection{provider=OpenAI wire_api=responses transport="responses_websocket" api.path="responses" ...}: codex_core::client: new
model_client.websocket_connection{provider=OpenAI wire_api=responses transport="responses_websocket" api.path="responses" ...}: codex_core::client: close

These websocket connect/close events look like local reconnect/retry behavior, but the user-visible failures are upstream 401s from api.openai.com, not local connection refusals or timeouts.

Binary Clues

No local Rust source tree was present; only a compiled binary was available. Running `strings` against the binary surfaced relevant source paths and auth-related strings:

/home/runner/work/codex/codex/codex-rs/codex-api/src/sse/responses.rs
/home/runner/work/codex/codex/codex-rs/codex-api/src/endpoint/responses_websocket.rs
/home/runner/work/codex/codex/codex-rs/codex-api/src/endpoint/responses.rs
responses-api-proxy/src/read_api_key.rs
Bearer
/responses
bearer_token_env_var
uses unsupported `bearer_token`; set `bearer_token_env_var`.
failed to encode bearer auth as header value
falling back to stored bearer token authentication
retrying after auth recovery

This suggests the relevant Rust areas are:

  • codex-api/src/endpoint/responses.rs
  • codex-api/src/endpoint/responses_websocket.rs
  • codex-api/src/sse/responses.rs
  • responses-api-proxy/src/read_api_key.rs
  • any shared auth layer that converts auth.json / OAuth / env config into request headers

Classification

Local connection/reconnect issue

Unlikely as the primary cause.

Evidence:

  • app-server is listening on 127.0.0.1:8390
  • reconnect/new/close events are present, but the concrete failure is an upstream OpenAI 401
  • no observed ECONNREFUSED, DNS, TLS, or timeout failure in the relevant log lines

Missing env configuration

Confirmed.

Evidence:

  • app-server process has OPENAI_API_KEY=missing
  • interactive Codex CLI processes also have OPENAI_API_KEY=missing
  • auth.json has "OPENAI_API_KEY": null

Faulty header propagation in local backend/proxy

Likely, if this app-server path is expected to support ChatGPT OAuth.

Evidence:

  • auth.json has usable-looking ChatGPT OAuth token fields
  • requests still reach https://api.openai.com/v1/responses without bearer/basic auth
  • restart after OAuth refresh did not change the app-server env
  • binary strings show a Responses API proxy path that appears to read API keys / bearer token env vars

Hypothesis

The local app-server / responses-api-proxy path only attaches an Authorization header when it can derive a bearer from OPENAI_API_KEY or a configured bearer_token_env_var. In auth_mode=chatgpt, the local auth.json OAuth token is present, but this code path either:

  1. does not read ChatGPT OAuth tokens at all,
  2. reads them only in the TUI path but not the app-server/proxy path,
  3. reads auth only at process start and did not recover correctly,
  4. treats missing OPENAI_API_KEY as non-fatal and sends an unauthenticated upstream request instead of failing locally.

The fourth behavior is especially problematic because it produces a confusing upstream 401 instead of a local, actionable error.

Proposed Rust Change

The Rust fix should be in the auth/header construction path used by Responses API and Responses websocket requests.

Suggested behavior:

  1. Before sending any request to https://api.openai.com/v1/responses, resolve auth in one place.
  2. If OPENAI_API_KEY or a configured bearer_token_env_var exists, attach:
Authorization: Bearer <redacted>
  3. If auth_mode=chatgpt and ~/.codex/auth.json contains a valid access token, attach the OAuth bearer token to the Responses request, provided that auth mode is supported for this endpoint.
  4. If OAuth is not supported for this app-server/proxy path, fail locally before the upstream request with a clear error such as:
Missing OpenAI authorization: app-server Responses API proxy requires OPENAI_API_KEY or bearer_token_env_var; ChatGPT OAuth auth is not supported in this path.
  5. Add a redacted debug log around request construction:
responses auth: source=env|oauth|none authorization_header=present|missing

Do not log token values.
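A minimal sketch of such a redacted log line, using a hypothetical AuthSource enum and helper (none of these names come from the Codex source): only the credential source and a presence flag are emitted, never the token itself.

```rust
// Hypothetical sketch: report which auth source was used without
// ever touching the token value.
#[derive(Debug)]
enum AuthSource {
    Env,   // bearer derived from OPENAI_API_KEY / bearer_token_env_var
    Oauth, // bearer derived from the auth.json access token
    None,  // no usable credential found
}

fn auth_debug_line(source: &AuthSource) -> String {
    let (src, hdr) = match source {
        AuthSource::Env => ("env", "present"),
        AuthSource::Oauth => ("oauth", "present"),
        AuthSource::None => ("none", "missing"),
    };
    // Only the source label and a presence flag are logged.
    format!("responses auth: source={src} authorization_header={hdr}")
}

fn main() {
    eprintln!("{}", auth_debug_line(&AuthSource::Oauth));
    eprintln!("{}", auth_debug_line(&AuthSource::None));
}
```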

Candidate files from binary strings:

responses-api-proxy/src/read_api_key.rs
codex-rs/codex-api/src/endpoint/responses.rs
codex-rs/codex-api/src/endpoint/responses_websocket.rs
codex-rs/codex-api/src/sse/responses.rs

The minimal safe Rust change is to prevent unauthenticated upstream calls:

// Pseudocode
let auth = resolve_openai_auth(config, auth_store, env)?;

let Some(header) = auth.authorization_header() else {
    return Err(anyhow!(
        "Missing OpenAI authorization: set OPENAI_API_KEY or configure a supported bearer token source"
    ));
};

request.headers_mut().insert(AUTHORIZATION, header);

If OAuth is intended to work here, resolve_openai_auth should include the ChatGPT OAuth token source used by the normal Codex model client.
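That fallback order can be sketched as a self-contained helper (all names here are hypothetical, not taken from the Codex tree): prefer the env-provided API key, fall back to the stored OAuth access token, and return None rather than ever building an empty or missing Authorization header.

```rust
/// Hypothetical helper: pick a bearer credential in a fixed order.
/// Returns the full Authorization header value, or None when no usable
/// credential exists, in which case the caller must fail locally
/// instead of sending an unauthenticated upstream request.
fn resolve_authorization(
    env_api_key: Option<&str>,        // e.g. OPENAI_API_KEY
    oauth_access_token: Option<&str>, // e.g. tokens.access_token from auth.json
) -> Option<String> {
    env_api_key
        .filter(|k| !k.is_empty())
        .or(oauth_access_token.filter(|t| !t.is_empty()))
        .map(|tok| format!("Bearer {tok}"))
}

fn main() {
    // Env key wins when both credentials are present.
    assert_eq!(
        resolve_authorization(Some("sk-env"), Some("oauth-tok")).as_deref(),
        Some("Bearer sk-env")
    );
    // auth_mode=chatgpt: no env key, so the OAuth token is used.
    assert_eq!(
        resolve_authorization(None, Some("oauth-tok")).as_deref(),
        Some("Bearer oauth-tok")
    );
    // Nothing usable: the caller must error out locally.
    assert_eq!(resolve_authorization(None, Some("")), None);
}
```

Under this sketch, the reported bug corresponds to the None branch being silently ignored rather than turned into a local error.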

Reproduction / Verification Commands

Find the app-server:

ps -eo pid,ppid,user,stat,lstart,cmd | grep -Ei 'codex app-server|responses-api-proxy' | grep -v grep
ss -ltnp | grep 8390

Check app-server environment without printing secrets:

PID="$(ss -ltnp | awk '/127.0.0.1:8390/ { match($0, /pid=[0-9]+/); print substr($0, RSTART+4, RLENGTH-4) }')"
perl -0777 -e '$s=<>; for $k (qw(OPENAI_API_KEY OPENAI_BASE_URL OPENAI_API_BASE OPENAI_ORG OPENAI_PROJECT CODEX_HOME HOME USER PATH)) { if ($s =~ /(?:^|\0)\Q$k\E=([^\0]*)/) { $v=$1; if ($k =~ /KEY|TOKEN|SECRET|AUTH/i) { print "$k=".(length($v)?"present len=".length($v)." plausible=".($v =~ /^sk-/ ? "yes" : "no"):"empty")."\n"; } else { print "$k=".(length($v)?$v:"empty")."\n"; } } else { print "$k=missing\n"; } }' "/proc/$PID/environ"

Check stored Codex auth state without printing tokens:

jq '{auth_mode, has_api_key:(.OPENAI_API_KEY != null and .OPENAI_API_KEY != ""), has_tokens:(.tokens != null), has_access:(.tokens.access_token != null and .tokens.access_token != ""), has_refresh:(.tokens.refresh_token != null and .tokens.refresh_token != ""), last_refresh}' ~/.codex/auth.json

Find 401s and reconnect context:

grep -RIn --exclude='*.sqlite*' -E '401 Unauthorized|Missing bearer|https://api.openai.com/v1/responses|websocket_connection|retry|reconnect' ~/.codex/log ~/.codex/sessions

Restart app-server:

kill -TERM "$PID"
sleep 1
ps -eo pid,ppid,user,stat,lstart,cmd | grep -Ei 'codex app-server|responses-api-proxy' | grep -v grep
ss -ltnp | grep 8390

Inspect binary for relevant Rust paths:

strings /home/umbrel/.litter/codex/rust-v0.125.0/codex \
  | grep -Ei 'v1/responses|api.openai.com|authorization|OPENAI_API_KEY|OPENAI_BASE_URL|OPENAI_API_BASE|reconnect|retry|bearer|auth.json|responses'

Expected Behavior

If the user is logged in via ChatGPT OAuth and that auth mode is supported, Codex should attach a valid bearer token to Responses API requests.

If only API-key auth is supported for this app-server/proxy path, Codex should fail locally with a clear diagnostic before sending an unauthenticated request upstream.

Actual Behavior

Codex sends or attempts a Responses API request that reaches:

https://api.openai.com/v1/responses

OpenAI returns:

401 Unauthorized: Missing bearer or basic authentication in header

This implies the upstream request had no usable Authorization header.
