
Add the exec-server end-to-end stack #15069

Closed
starr-openai wants to merge 21 commits into main from starr/pr-15067-exec-server-full-stack

Conversation

@starr-openai
Contributor

Summary

  • add the exec-server crate and protocol surface
  • route unified exec and executor-backed filesystem tools through exec-server
  • add remote workspace remap and executor-aware path handling
  • load project docs and repo skills through the executor-backed environment
  • route app-server filesystem and command/exec through the same executor seam
  • add remote-path, app-server, and exec-server coverage across the stack

Review shape

This branch aggregates the stacked draft PRs that build the exec-server integration incrementally.

Notes

  • base: main
  • this is the single end-to-end review branch for the full exec-server stack

starr-openai and others added 21 commits March 18, 2026 00:39
This adds the standalone exec-server stdio JSON-RPC crate and its
smoke tests without wiring it into the CLI or unified-exec yet.

Co-authored-by: Codex <noreply@openai.com>
Document the standalone exec-server crate, its stdio JSON-RPC
transport, and the current request/response and notification
payloads.

Co-authored-by: Codex <noreply@openai.com>
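The stdio JSON-RPC transport above can be sketched as follows. This is an illustrative client-side encoder only: newline-delimited JSON framing and the exact parameter shape are assumptions, not the crate's documented wire format (the `process/start` method name does appear later in this stack).

```python
import json

def encode_request(req_id, method, params):
    """Serialize a JSON-RPC 2.0 request as one newline-terminated line.

    Assumes newline-delimited JSON framing for the stdio transport; the
    real exec-server framing may differ.
    """
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": method,
        "params": params,
    }) + "\n"

line = encode_request(1, "process/start", {"command": ["echo", "hi"]})
msg = json.loads(line)
```

A server-side processor would read one line at a time from stdin, decode it, and dispatch on `method`, writing responses and notifications back on stdout in the same framing.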
Separate the transport-neutral JSON-RPC connection and server processor from
local process spawning, add websocket support, and document the new API
shape.

Co-authored-by: Codex <noreply@openai.com>
Fix read pagination when max_bytes truncates a response, add a chunking regression test covering stdout/stderr retention, warn on retained-output eviction, and note init auth as a pre-trust-boundary TODO.

Co-authored-by: Codex <noreply@openai.com>
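The pagination fix above amounts to resuming a truncated read rather than dropping the remainder. A minimal sketch of that contract, with entirely hypothetical field names (`next_offset`, `truncated`), is:

```python
def paginate(buffer: bytes, offset: int, max_bytes: int) -> dict:
    """Return at most max_bytes starting at offset, plus resume metadata."""
    chunk = buffer[offset:offset + max_bytes]
    next_offset = offset + len(chunk)
    return {
        "data": chunk,
        "next_offset": next_offset,
        "truncated": next_offset < len(buffer),  # more bytes remain
    }

def read_all(buffer: bytes, max_bytes: int) -> bytes:
    """Client loop: keep reading from next_offset until no longer truncated."""
    out, offset = b"", 0
    while True:
        page = paginate(buffer, offset, max_bytes)
        out += page["data"]
        offset = page["next_offset"]
        if not page["truncated"]:
            return out
```

The regression the commit describes is exactly the loop case: every truncated page must report an offset from which the rest of the retained output is recoverable.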
Add a typed optional sandbox field to process/start so callers can omit sandboxing for the existing direct-spawn path while reserving a host-default mode for future remote materialization. Reject hostDefault for now instead of silently running unsandboxed, and cover both omitted and explicit sandbox payloads in tests.

Co-authored-by: Codex <noreply@openai.com>
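The sandbox-field behavior described above can be sketched as a small validator. The `process/start` method and the `hostDefault` mode come from the commit message; the field layout and error handling here are assumptions for illustration only.

```python
def validate_sandbox(params: dict) -> str:
    """Resolve the sandbox mode for a process/start request.

    Omitted sandbox keeps the existing direct-spawn path; the reserved
    "hostDefault" mode is rejected rather than silently running
    unsandboxed. (Hypothetical sketch, not the real protocol types.)
    """
    sandbox = params.get("sandbox")
    if sandbox is None:
        return "direct-spawn"  # existing local path, no sandbox payload
    if sandbox.get("mode") == "hostDefault":
        raise ValueError("hostDefault sandbox mode is not supported yet")
    return sandbox["mode"]
```

This mirrors the test coverage the commit mentions: one case with the field omitted, one with an explicit sandbox payload.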
Add exec-server filesystem RPCs and a core-side remote filesystem client,
then route unified-exec and filesystem-backed tools through that backend
when enabled by config. Also add Docker-backed remote exec integration
coverage for the local codex-exec CLI.

Co-authored-by: Codex <noreply@openai.com>
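The config-gated backend seam described above can be illustrated with a minimal sketch. The class names, the `use_exec_server` flag, and the `fs/read` method are all hypothetical stand-ins for the real configuration and RPC surface.

```python
class LocalFs:
    """Direct local filesystem backend (the default path)."""
    def read(self, path: str) -> bytes:
        with open(path, "rb") as f:
            return f.read()

class RemoteFs:
    """Backend that routes filesystem reads through an exec-server RPC."""
    def __init__(self, rpc):
        self.rpc = rpc  # callable: (method, params) -> result
    def read(self, path: str) -> bytes:
        return self.rpc("fs/read", {"path": path})

def select_backend(config: dict, rpc=None):
    """Pick the remote backend only when enabled by config and connected."""
    if config.get("use_exec_server") and rpc is not None:
        return RemoteFs(rpc)
    return LocalFs()
```

Routing both unified-exec and the filesystem-backed tools through one seam like this is what lets the Docker-backed integration coverage exercise the remote path without changing the tools themselves.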
- fall back to local when sandboxed exec cannot be modeled remotely
- use server-issued process ids for remote session continuations
- retain symlink fidelity across fs/readDirectory plumbing
- clean up exited exec-server processes after retention

Co-authored-by: Codex <noreply@openai.com>
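The first bullet above, falling back to local execution when a sandboxed exec cannot be modeled remotely, reduces to a small routing predicate. This sketch uses invented names; only the fallback rule itself comes from the commit.

```python
def choose_exec_path(sandbox_policy, remote_supports) -> str:
    """Route an exec to "remote" unless its sandbox policy cannot be
    modeled by the exec-server, in which case fall back to "local"
    instead of failing. (Illustrative only.)
    """
    if sandbox_policy is not None and not remote_supports(sandbox_policy):
        return "local"
    return "remote"
```

Pairing this with server-issued process ids (the second bullet) means a continuation of a remote session always addresses the process by the id the server minted, never a locally guessed one.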
@github-actions
Contributor


Thank you for your submission; we really appreciate it. Like many open-source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution. You can sign the CLA by posting a pull request comment in the same format as below.


I have read the CLA Document and I hereby sign the CLA


You can retrigger this bot by commenting "recheck" in this pull request. Posted by the CLA Assistant Lite bot.

