
Conversation


@maxj-oai maxj-oai commented Feb 11, 2026

This stack layer makes app-server thread event delivery connection-aware, so resumed/attached threads emit notifications and approval prompts only to subscribed connections.

  • Added per-thread subscription tracking in ThreadState (subscribed_connections) and mapped subscription ids to (thread_id, connection_id).
  • Updated listener lifecycle so removing a subscription or closing a connection only removes that connection from the thread’s subscriber set; listener shutdown now happens when the last subscriber is gone.
  • Added connection_closed(connection_id) plumbing (lib.rs -> message_processor.rs -> codex_message_processor.rs) so disconnect cleanup happens immediately.
  • Scoped bespoke event handling outputs through TargetedOutgoing to send requests/notifications only to subscribed connections.
  • Kept existing threadresume behavior while aligning with the latest split-loop transport structure.
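The bookkeeping described in the first three bullets can be sketched roughly as follows. This is a simplified stand-in, not the real app-server code: `ThreadId`, `ConnectionId`, `SubscriptionId`, and `Registry` are hypothetical placeholder types chosen for illustration.

```rust
use std::collections::{HashMap, HashSet};

// Hypothetical, simplified ids; the real types live in app-server.
type ThreadId = u64;
type ConnectionId = u64;
type SubscriptionId = u64;

#[derive(Default)]
struct ThreadState {
    // Per-thread subscriber set: only these connections receive events.
    subscribed_connections: HashSet<ConnectionId>,
}

#[derive(Default)]
struct Registry {
    threads: HashMap<ThreadId, ThreadState>,
    // Subscription ids map to (thread_id, connection_id).
    subscriptions: HashMap<SubscriptionId, (ThreadId, ConnectionId)>,
}

impl Registry {
    fn subscribe(&mut self, sub: SubscriptionId, thread: ThreadId, conn: ConnectionId) {
        self.subscriptions.insert(sub, (thread, conn));
        self.threads
            .entry(thread)
            .or_default()
            .subscribed_connections
            .insert(conn);
    }

    /// Removes one subscription; returns true when the thread lost its last
    /// subscriber, i.e. the listener should be shut down.
    fn unsubscribe(&mut self, sub: SubscriptionId) -> bool {
        let Some((thread, conn)) = self.subscriptions.remove(&sub) else {
            return false;
        };
        let Some(state) = self.threads.get_mut(&thread) else {
            return false;
        };
        // Only drop the connection if no other subscription on this thread uses it.
        let still_used = self
            .subscriptions
            .values()
            .any(|&(t, c)| t == thread && c == conn);
        if !still_used {
            state.subscribed_connections.remove(&conn);
        }
        state.subscribed_connections.is_empty()
    }

    /// Disconnect cleanup: remove this connection from every thread's
    /// subscriber set; returns the threads whose last subscriber is gone.
    fn connection_closed(&mut self, conn: ConnectionId) -> Vec<ThreadId> {
        self.subscriptions.retain(|_, &mut (_, c)| c != conn);
        let mut orphaned = Vec::new();
        for (&thread, state) in self.threads.iter_mut() {
            if state.subscribed_connections.remove(&conn)
                && state.subscribed_connections.is_empty()
            {
                orphaned.push(thread);
            }
        }
        orphaned
    }
}
```

The key property this sketch mirrors is that listener shutdown is driven by the subscriber count, not by individual unsubscribes: only when the last subscriber disappears (via `unsubscribe` or `connection_closed`) does the caller tear the listener down.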

@maxj-oai maxj-oai requested a review from owenlin0 February 11, 2026 18:35
Base automatically changed from maxj/threadstate to main February 11, 2026 20:20
@maxj-oai maxj-oai force-pushed the maxj/threadresume1 branch 2 times, most recently from ad28e92 to 40a29cb Compare February 11, 2026 20:25
@etraut-openai etraut-openai added the oai PRs contributed by OpenAI employees label Feb 11, 2026
@maxj-oai maxj-oai marked this pull request as ready for review February 11, 2026 22:18
@chatgpt-codex-connector chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 1ca45ea974

ℹ️ About Codex in GitHub

Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".

```rust
.await
{
    send_error = Some(err);
    break;
```
Collaborator

is it a good idea to short-circuit or should we continue sending to the rest of the connections?

Contributor Author

@maxj-oai maxj-oai Feb 12, 2026

hmm I think stopping is right - this is a single shared queue, not per connection, so if it's erroring there's no reason to continue
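The pattern under discussion can be sketched like this. The `mpsc` channel here is only an illustrative stand-in for the shared outgoing queue, and `fan_out` is a hypothetical helper, not the actual function in the diff:

```rust
use std::sync::mpsc;

// Fan one event out to all subscribed connections through a single shared
// queue. Because the queue is shared, the first send error means the queue
// itself is gone, so iterating the remaining connections cannot succeed.
fn fan_out(
    tx: &mpsc::Sender<(u64, String)>,
    conns: &[u64],
    event: &str,
) -> Option<mpsc::SendError<(u64, String)>> {
    let mut send_error = None;
    for &conn in conns {
        if let Err(err) = tx.send((conn, event.to_string())) {
            send_error = Some(err);
            break; // shared queue failed; stop rather than retry per connection
        }
    }
    send_error
}
```

If delivery were instead per connection (one queue each), continuing past a failed send would be the right call; the short-circuit is correct precisely because the failure mode is shared.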

@maxj-oai maxj-oai merged commit c0ecc2e into main Feb 12, 2026
32 checks passed
@maxj-oai maxj-oai deleted the maxj/threadresume1 branch February 12, 2026 00:21
@github-actions github-actions bot locked and limited conversation to collaborators Feb 12, 2026