This adds the standalone exec-server stdio JSON-RPC crate and its smoke tests without wiring it into the CLI or unified-exec yet.

Co-authored-by: Codex <noreply@openai.com>
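The PR does not show the crate's wire format, but a stdio JSON-RPC transport like the one described is commonly newline-delimited. A minimal sketch, assuming line-delimited JSON-RPC 2.0 and an illustrative `process/start` method name (both assumptions, not the crate's actual API):

```rust
// Hypothetical framing sketch: one JSON-RPC 2.0 request object per line,
// written to the server's stdin and read back line by line from stdout.
// Method names and field layout are assumptions for illustration only.
fn encode_request(id: u64, method: &str, params_json: &str) -> String {
    format!(
        "{{\"jsonrpc\":\"2.0\",\"id\":{id},\"method\":\"{method}\",\"params\":{params_json}}}\n"
    )
}

fn main() {
    let line = encode_request(1, "process/start", r#"{"cmd":["echo","hi"]}"#);
    // Newline-delimited framing: exactly one request per line.
    assert!(line.ends_with('\n'));
    assert!(line.contains("\"method\":\"process/start\""));
    print!("{line}");
}
```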
Document the standalone exec-server crate, its stdio JSON-RPC transport, and the current request/response and notification payloads.
Separate the transport-neutral JSON-RPC connection and server processor from local process spawning, add websocket support, and document the new API shape.
Fix read pagination when max_bytes truncates a response, add a chunking regression covering stdout/stderr retention, warn on retained-output eviction, and note init auth as a pre-trust-boundary TODO.
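The pagination invariant behind that fix can be sketched as follows. This is a minimal model, not the crate's implementation; `read_chunk`, `offset`, and `max_bytes` are assumed names. The regression it illustrates: when `max_bytes` truncates a response, repeated reads from the returned continuation offset must reassemble the retained output without losing bytes.

```rust
// Hypothetical sketch of max_bytes pagination over a retained output buffer.
// Returns the chunk plus the next offset, or None when the buffer is drained.
fn read_chunk(buf: &[u8], offset: usize, max_bytes: usize) -> (&[u8], Option<usize>) {
    let end = offset.saturating_add(max_bytes).min(buf.len());
    let next = if end < buf.len() { Some(end) } else { None };
    (&buf[offset..end], next)
}

fn main() {
    let out = b"0123456789";
    let mut offset = 0;
    let mut reassembled = Vec::new();
    // Keep reading until no continuation offset is returned.
    loop {
        let (chunk, next) = read_chunk(out, offset, 4);
        reassembled.extend_from_slice(chunk);
        match next {
            Some(n) => offset = n,
            None => break,
        }
    }
    // Truncation must not drop bytes: chunks reassemble to the full output.
    assert_eq!(reassembled, out);
}
```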
Add a typed optional sandbox field to process/start so callers can omit sandboxing for the existing direct-spawn path while reserving a host-default mode for future remote materialization. Reject hostDefault for now instead of silently running unsandboxed, and cover both omitted and explicit sandbox payloads in tests.
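The policy described above can be modeled as a small sketch, with type and function names that are assumptions rather than the crate's API: an omitted sandbox keeps the existing direct-spawn path, while the reserved hostDefault mode is rejected rather than silently running unsandboxed.

```rust
// Illustrative model of the optional sandbox field on process/start.
// SandboxMode and resolve_sandbox are hypothetical names.
#[derive(Debug, Clone, Copy, PartialEq)]
enum SandboxMode {
    HostDefault, // reserved for future remote materialization
}

fn resolve_sandbox(requested: Option<SandboxMode>) -> Result<(), String> {
    match requested {
        // Omitted sandbox: the existing direct-spawn path, unchanged.
        None => Ok(()),
        // Reserved mode: reject loudly instead of running unsandboxed.
        Some(SandboxMode::HostDefault) => {
            Err("sandbox mode hostDefault is not supported yet".to_string())
        }
    }
}

fn main() {
    assert!(resolve_sandbox(None).is_ok());
    assert!(resolve_sandbox(Some(SandboxMode::HostDefault)).is_err());
}
```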
Add exec-server filesystem RPCs and a core-side remote filesystem client, then route unified-exec and filesystem-backed tools through that backend when enabled by config. Also add Docker-backed remote exec integration coverage for the local codex-exec CLI.
- Fall back to local when sandboxed exec cannot be modeled remotely.
- Use server-issued process ids for remote session continuations.
- Retain symlink fidelity across fs/readDirectory plumbing.
- Clean up exited exec-server processes after retention.
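The first bullet's backend selection can be sketched as a simple decision: prefer the remote backend when config enables it, but fall back to local spawning whenever the requested sandbox cannot be modeled remotely. The names here are illustrative assumptions, not the actual routing code.

```rust
// Hypothetical backend-selection sketch for the remote-exec fallback rule.
#[derive(Debug, PartialEq)]
enum Backend {
    Local,
    Remote,
}

fn choose_backend(remote_enabled: bool, sandbox_modeled_remotely: bool) -> Backend {
    if remote_enabled && sandbox_modeled_remotely {
        Backend::Remote
    } else {
        // Safe fallback: never weaken sandboxing just to stay remote.
        Backend::Local
    }
}

fn main() {
    assert_eq!(choose_backend(true, true), Backend::Remote);
    assert_eq!(choose_backend(true, false), Backend::Local);
    assert_eq!(choose_backend(false, true), Backend::Local);
}
```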
Summary
Review shape
This branch is the aggregate of the stacked draft PRs that build the exec-server integration incrementally:
Notes