
Linux/WSL2: Cloudflare 403 blocks all chatgpt.com API requests — rustls TLS fingerprint detected as bot while macOS native-tls works fine on same network #17860

@xiaodream551-a11y

Description


Codex CLI on Linux (WSL2) is completely unusable with ChatGPT login — every request to chatgpt.com/backend-api/ returns HTTP 403 with a Cloudflare JavaScript challenge page. The same account, same proxy node, same network works perfectly on macOS.

Root Cause Analysis

After binary analysis, the difference is the TLS implementation:

|                   | macOS binary                  | Linux binary               |
|-------------------|-------------------------------|----------------------------|
| TLS library       | native-tls (SecureTransport)  | rustls 0.23.36             |
| JA3 fingerprint   | Browser-like (Safari-similar) | Automated client signature |
| Cloudflare result | ✅ Pass                       | ❌ 403 challenge           |

The Linux binary (x86_64-unknown-linux-musl) is statically compiled with rustls + rama-tls-rustls, producing a TLS Client Hello that Cloudflare's bot detection flags as non-browser traffic. The macOS binary uses the system TLS stack (SecureTransport), whose fingerprint resembles Safari and passes Cloudflare without challenge.
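To see why a static Client Hello is so easy to block: JA3 reduces the hello to an MD5 hash over five comma-separated fields, so a client that always sends the same cipher/extension layout always hashes to the same blockable value. A minimal sketch of the computation (the numeric values below are illustrative placeholders, not the actual Codex or Safari fingerprints):

```python
import hashlib

def ja3_digest(version, ciphers, extensions, curves, point_formats):
    """JA3: MD5 over 'version,ciphers,extensions,curves,point_formats',
    where each list of decimal Client Hello values is joined by '-'."""
    fields = [
        str(version),
        "-".join(map(str, ciphers)),
        "-".join(map(str, extensions)),
        "-".join(map(str, curves)),
        "-".join(map(str, point_formats)),
    ]
    return hashlib.md5(",".join(fields).encode()).hexdigest()

# Illustrative values only: the same ciphers offered in a different order
# already produce a different JA3 hash, which is why a fixed rustls hello
# is trivially distinguishable from Safari's.
a = ja3_digest(771, [4865, 4866, 4867], [0, 23, 65281], [29, 23], [0])
b = ja3_digest(771, [4867, 4866, 4865], [0, 23, 65281], [29, 23], [0])
assert a != b
```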

Evidence from binary strings:

```text
# 97 references to rustls vs 3 to native-tls in Linux binary
tokio-rustls-0.26.4
rama-tls-rustls-0.3.0-alpha.4
rustls-0.23.36
utils/rustls-provider/src/lib.rs
```

Impact

This affects all chatgpt.com endpoints on Linux, not just login:

  • ❌ chatgpt.com/backend-api/codex/responses — can't send or receive messages
  • ❌ chatgpt.com/backend-api/plugins/featured — startup hangs while loading plugins
  • ❌ Plugin marketplace — can't refresh the plugin list
  • ❌ OAuth token refresh — sessions can't renew

Codex launches, MCP servers start, but the agent never responds to any input.

Environment

  • Codex version: 0.120.0
  • OS: Ubuntu 24.04 on WSL2 (Windows 11, kernel 6.6.87.2)
  • Install: npm global (@openai/codex)
  • Proxy: Clash Verge mixed mode on port 7890 (same proxy used on macOS where it works)
  • Exit IP: Zenlayer US (Los Angeles) — same node on both platforms

Reproduction

```shell
# On Linux/WSL2:
curl -I --proxy http://127.0.0.1:7890 https://chatgpt.com/backend-api/plugins/featured
# → HTTP 403, cf-mitigated: challenge

# On macOS (same proxy, same node):
# → Works fine
```
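Client-side, the block is identifiable without parsing the challenge HTML: Cloudflare marks the response with HTTP 403 plus a `cf-mitigated: challenge` header, as the curl output above shows. A hypothetical helper for surfacing a clearer error (the function name and header handling are assumptions, not existing Codex code):

```python
def is_cloudflare_challenge(status: int, headers: dict) -> bool:
    """Return True when a response looks like a Cloudflare JS challenge:
    HTTP 403 plus the cf-mitigated header seen in the repro above."""
    normalized = {k.lower(): v for k, v in headers.items()}
    return status == 403 and normalized.get("cf-mitigated") == "challenge"
```

A client could use this to print "blocked by Cloudflare bot detection" instead of the generic "Request failed with status 403 Forbidden" seen in the logs below.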

Logs

```text
WARN codex_core::plugins::manager: failed to warm featured plugin ids cache
  error=remote plugin sync request to https://chatgpt.com/backend-api/plugins/featured
  failed with status 403 Forbidden: <html>...<span id="challenge-error-text">
  Enable JavaScript and cookies to continue</span>...

WARN codex_tui::chatwidget: failed to load full apps list;
  falling back to installed apps snapshot: Failed to load apps:
  Request failed with status 403 Forbidden
```

Suggested Fix

  1. Short-term: allowlist the Codex CLI's User-Agent / request pattern in the Cloudflare WAF rules for chatgpt.com.
  2. Long-term: use native-tls (OpenSSL) on Linux instead of rustls, or implement JA3 fingerprint randomization via rama-tls-rustls, which is already bundled but apparently not configured for fingerprint emulation.
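The randomization idea in (2) can be illustrated in JA3 terms: permuting the extension order per connection makes the hash non-stable, so no single fingerprint can be blocklisted. This is a conceptual sketch only, not the rama-tls-rustls API (which would need to do the shuffling inside the actual Client Hello encoder):

```python
import hashlib
import random

def randomized_ja3(version, ciphers, extensions, curves, point_formats):
    # Shuffle a copy of the extension list per connection; everything else
    # follows the standard JA3 string format (fields joined by commas,
    # list values joined by '-').
    ext = extensions[:]
    random.shuffle(ext)
    parts = [
        str(version),
        "-".join(map(str, ciphers)),
        "-".join(map(str, ext)),
        "-".join(map(str, curves)),
        "-".join(map(str, point_formats)),
    ]
    return hashlib.md5(",".join(parts).encode()).hexdigest()

# Across repeated connections the fingerprint varies instead of being a
# single static value (illustrative numbers, not real extension IDs).
hashes = {randomized_ja3(771, [4865, 4866], list(range(10)), [29], [0])
          for _ in range(50)}
```

Note that extension shuffling is only one axis; a blanket WAF rule keyed on any stable Client Hello property would need the same treatment.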

Workaround

Using OPENAI_API_KEY instead of ChatGPT login bypasses chatgpt.com entirely (routes through api.openai.com which is not affected).
