oopsyz/codex_client


Claude Brainstorm with Codex

中文说明 (Chinese documentation)

License: MIT

This repository contains codex_ws_client.py, a lightweight client for codex app-server over WebSocket.

The script lives at skills/codex-ws-client/scripts/codex_ws_client.py.

The primary use case is running it inside Claude Code so Claude models can collaborate with Codex through a live codex app-server connection.

Demo

brainstorm.mp4

If inline playback is unavailable, open brainstorm.mp4 directly.

The client is intended for agents or scripts that need to:

  • send a prompt to a running Codex app-server
  • reuse a persisted thread with --thread-id
  • stream or buffer assistant output
  • get machine-readable JSON output
  • use REPL mode for repeated prompts on one connection
  • inspect richer server behavior through stderr logs or NDJSON traces

Install As A Skill

This repo already packages the client as a skill at skills/codex-ws-client/.

To install project-locally (skill available only in this project):

Copy-Item -Recurse -Force skills/codex-ws-client .codex/skills/codex-ws-client

To install globally (skill available across all projects):

Copy-Item -Recurse -Force skills/codex-ws-client $HOME/.codex/skills/codex-ws-client

After a project-local install, run the client from that path:

python .codex/skills/codex-ws-client/scripts/codex_ws_client.py --json "Summarize this repo"

After a global install, use $HOME/.codex/skills/codex-ws-client/scripts/codex_ws_client.py instead.

Claude CLI sibling client:

Copy-Item -Recurse -Force skills/claude-cli-client .codex/skills/claude-cli-client
python .codex/skills/claude-cli-client/scripts/claude_cli_client.py --json "Summarize this repo"

When To Use It

Use this script when:

  • you want Claude Code to delegate work to Codex or continue a shared Codex thread
  • a long-lived codex app-server is already running
  • you want lower overhead than spawning codex exec for every turn
  • you want direct control over thread ids, timeouts, JSON output, and logging

Do not use it if:

  • you need stdio transport instead of WebSocket
  • you need full job/session orchestration like a larger wrapper tool
  • you need robust interactive approvals outside REPL mode

Transport

This client talks only to:

  • codex app-server --listen ws://HOST:PORT

Default URI:

ws://127.0.0.1:8765

Core Behavior

The client uses this protocol flow:

  1. connect to the WebSocket
  2. send initialize
  3. send initialized
  4. create or resume a thread
  5. send turn/start
  6. consume streamed notifications until the turn finishes

It handles:

  • item/agentMessage/delta
  • turn/completed
  • turn/failed
  • approval/file-change/permissions server requests
  • selected thread/tool/command/file-change notifications
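The handshake above can be sketched as plain JSON-RPC message construction. The framing follows standard JSON-RPC 2.0; the exact method names beyond those listed (e.g. a `thread/start` request) and the parameter shapes are assumptions for illustration, not a verified schema.

```python
import json

def make_request(req_id, method, params=None):
    """Build a JSON-RPC 2.0 request ready to send over the WebSocket."""
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        msg["params"] = params
    return msg

# Hypothetical first messages of the documented flow.
handshake = [
    make_request(1, "initialize", {"clientInfo": {"name": "codex_ws_client"}}),
    {"jsonrpc": "2.0", "method": "initialized"},  # notification: no id field
    make_request(2, "thread/start"),
    make_request(3, "turn/start", {"prompt": "Summarize this repo"}),
]

# Each message goes over the wire as one JSON text frame.
wire = [json.dumps(m) for m in handshake]
```

After `turn/start`, the client loops on incoming frames until it sees `turn/completed` or `turn/failed`.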

Thread Model

Fresh thread:

  • if --thread-id is omitted, the client creates a new thread

Resumed thread:

  • if --thread-id is provided, the client calls thread/resume
  • resumed turns use --resume-timeout

Persistence:

  • threads are persisted by default
  • --ephemeral disables persistence
  • --thread-id only makes sense for non-ephemeral threads

Important:

  • --ephemeral threads cannot be resumed across connections
  • if a resumed thread cannot be loaded, one-shot mode fails fast
  • in REPL mode, some stale-thread cases may fall back to a new thread
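The thread-selection rules above reduce to a small decision function. This is a sketch: `thread/resume` is named in the docs, while `thread/start` and the parameter keys are assumptions, and the default timeout values here are arbitrary placeholders.

```python
def plan_thread(thread_id=None, ephemeral=False,
                timeout=60.0, resume_timeout=300.0):
    """Pick the thread request and the turn timeout from the CLI flags."""
    if thread_id and ephemeral:
        # --thread-id only makes sense for persisted threads.
        raise ValueError("--thread-id cannot be combined with --ephemeral")
    if thread_id:
        # Resumed turns use the (typically longer) --resume-timeout.
        return ("thread/resume", {"threadId": thread_id}, resume_timeout)
    # Fresh thread; --ephemeral disables persistence.
    return ("thread/start", {"persist": not ephemeral}, timeout)
```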

Output Modes

Plain text:

  • default mode streams deltas to stdout

Buffered text:

  • --no-stream prints the final assistant text once at end of turn

JSON:

  • --json prints a structured JSON object to stdout
  • this is the best mode for another LLM or tool to consume

Current JSON shape includes:

  • thread_id
  • turn_id
  • status
  • text
  • optional error
  • optional notifications
  • optional metrics

metrics currently includes:

  • latency_ms
  • input_tokens
  • output_tokens
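A consumer can parse that shape directly. The sample payload below is illustrative, not real client output; the key names follow the list above, and the optional keys are read defensively since they may be absent.

```python
import json

# Illustrative --json output matching the documented shape.
sample = json.dumps({
    "thread_id": "thr_123",
    "turn_id": "turn_456",
    "status": "completed",
    "text": "Hello from Codex.",
    "metrics": {"latency_ms": 850, "input_tokens": 42, "output_tokens": 12},
})

result = json.loads(sample)
answer = result["text"]
# error, notifications, and metrics are optional: use .get() with fallbacks.
latency = result.get("metrics", {}).get("latency_ms")
```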

Useful Commands

One-shot prompt:

python skills/codex-ws-client/scripts/codex_ws_client.py "Summarize this repo"

JSON output for tool use:

python skills/codex-ws-client/scripts/codex_ws_client.py --json "List the main entrypoints"

Reuse a persisted thread:

python skills/codex-ws-client/scripts/codex_ws_client.py --thread-id THREAD_ID "Continue the previous conversation"

Interactive REPL:

python skills/codex-ws-client/scripts/codex_ws_client.py --repl --print-thread-id

REPL with interactive approvals:

python skills/codex-ws-client/scripts/codex_ws_client.py --repl --interactive-approvals

Prompt from file:

python skills/codex-ws-client/scripts/codex_ws_client.py --prompt-file prompt.txt

Structured output with trace:

python skills/codex-ws-client/scripts/codex_ws_client.py --json --ndjson-file trace.jsonl "Return metadata"

REPL Commands

Available in REPL mode:

  • /thread prints the current thread id
  • /new creates a new thread
  • /exit or /quit exits the REPL

Logging And Debugging

Verbosity:

  • -v prints lifecycle and selected notification summaries to stderr
  • -vv prints raw JSON-RPC traffic to stderr

Trace file:

  • --ndjson-file FILE appends JSON-RPC traffic as JSON lines

Summary:

  • --summary prints token usage and latency to stderr

Save final message:

  • --out FILE writes the final assistant text to a file
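An `--ndjson-file` trace is one JSON-RPC message per line, so it can be replayed with a line-by-line parser. The sample lines here are illustrative, not captured traffic.

```python
import io
import json

# Stand-in for open("trace.jsonl"): two hypothetical trace lines.
trace = io.StringIO(
    '{"jsonrpc": "2.0", "method": "item/agentMessage/delta", "params": {"delta": "Hi"}}\n'
    '{"jsonrpc": "2.0", "method": "turn/completed", "params": {}}\n'
)

# Collect the method of every message to see the turn's lifecycle.
methods = [json.loads(line)["method"] for line in trace if line.strip()]
```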

Approval Handling

Default behavior:

  • command approvals are auto-declined
  • file-change approvals are auto-declined
  • permission requests are denied

REPL override:

  • --interactive-approvals enables prompt-based handling for:
    • command approvals
    • file-change approvals
    • permission requests

Still unsupported:

  • dynamic tool execution requested by server
  • tool user input requests outside the simple approval prompts
  • ChatGPT auth token refresh requests

Unsupported server requests are answered explicitly instead of being ignored.
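The policy above can be sketched as a small responder. The request identifiers and decision strings are assumptions for illustration; only the behavior (decline/deny by default, prompt under `--interactive-approvals`, answer unsupported requests explicitly) comes from the docs.

```python
# Hypothetical request names mapped to the documented default decisions.
DEFAULT_DECISIONS = {
    "command-approval": "declined",
    "file-change-approval": "declined",
    "permission-request": "denied",
}

def answer_request(method, interactive=False, ask=None):
    """Return a decision for a server request instead of ignoring it."""
    if method not in DEFAULT_DECISIONS:
        return "unsupported"  # answered explicitly, never silently dropped
    if interactive and ask is not None:
        return ask(method)  # --interactive-approvals: defer to the user prompt
    return DEFAULT_DECISIONS[method]
```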

Timeouts

--timeout

  • normal WebSocket message wait timeout

--connect-timeout

  • initial connection timeout

--resume-timeout

  • timeout for turns sent on resumed threads

Set any of them to 0 for no timeout.
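The "0 means no timeout" convention maps naturally onto wait primitives that treat `None` as no deadline (e.g. `asyncio.wait_for`). A one-line sketch of the conversion:

```python
def effective_timeout(value: float):
    """Translate the CLI convention (0 = wait forever) to a wait argument."""
    return None if value == 0 else value
```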

Exit Codes

  • 0: success
  • 1: turn failure
  • 2: bad arguments
  • 3: connection failure
  • 4: timeout
  • 5: JSON/schema parse error
  • 130: interrupted
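When wrapping the CLI, these codes can be turned into readable diagnostics. The mapping below restates the table above; only the helper function is new.

```python
# Documented exit codes of codex_ws_client.py.
EXIT_CODES = {
    0: "success",
    1: "turn failure",
    2: "bad arguments",
    3: "connection failure",
    4: "timeout",
    5: "JSON/schema parse error",
    130: "interrupted",
}

def describe_exit(code: int) -> str:
    """Human-readable meaning of an exit code, for logs and error messages."""
    return EXIT_CODES.get(code, f"unknown exit code {code}")
```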

Best Practices For Another LLM

Prefer:

  • --json for machine consumption
  • --no-stream if you only need the final answer text
  • --thread-id only for known persisted threads
  • --ndjson-file when debugging protocol behavior

Avoid:

  • using --thread-id with threads created via --ephemeral
  • relying on REPL-only features from one-shot mode
  • expecting full protocol coverage for every server request type

Recommended one-shot pattern:

python skills/codex-ws-client/scripts/codex_ws_client.py --json --connect-timeout 10 --timeout 120 "YOUR PROMPT"

Recommended resumed-thread pattern:

python skills/codex-ws-client/scripts/codex_ws_client.py --json --thread-id THREAD_ID --resume-timeout 300 "YOUR PROMPT"
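The two recommended patterns can be assembled programmatically. This sketch only builds the argv list (using the documented flags); actually running it requires a live codex app-server, so the execution step is left as a comment.

```python
SCRIPT = "skills/codex-ws-client/scripts/codex_ws_client.py"

def build_argv(prompt, thread_id=None, connect_timeout=10,
               timeout=120, resume_timeout=300):
    """Assemble the recommended --json invocation as an argv list."""
    argv = ["python", SCRIPT, "--json",
            "--connect-timeout", str(connect_timeout)]
    if thread_id:
        # Resumed-thread pattern: reuse a persisted thread.
        argv += ["--thread-id", thread_id,
                 "--resume-timeout", str(resume_timeout)]
    else:
        # One-shot pattern.
        argv += ["--timeout", str(timeout)]
    argv.append(prompt)
    return argv

# To actually run it (needs a running codex app-server):
#   import json, subprocess
#   out = subprocess.run(build_argv("YOUR PROMPT"), capture_output=True, text=True)
#   result = json.loads(out.stdout)
```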

Known Limits

  • WebSocket only, no stdio mode
  • single-process CLI design, not a reusable library
  • not a full protocol framework
  • graceful interruption of an in-flight turn on Windows is still limited
  • richer server-request families are partially handled, not comprehensively implemented

Relationship To app-server

This script is a client.

It does not start the server automatically.

You must already have something like:

codex app-server --listen ws://127.0.0.1:8765

running before using it.

Issues And Contributions

If you hit a bug, open an issue with the command you ran, the expected behavior, the actual behavior, and any relevant stderr or NDJSON trace output.

Contributions are welcome. Keep changes focused, update documentation when behavior changes, and include validation steps or reproduction notes in the pull request.
