@ccy-oai (Collaborator) commented on Jan 28, 2026

Problem

Users get generic 429s with no guidance when a model is at capacity.

Solution

Detect model-cap headers, surface a clear “try a different model” message, and keep behavior non‑intrusive (no auto‑switch).
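For illustration, a minimal Rust sketch of the detection side. The header names (x-model-at-capacity, x-capacity-reset-after) and the ModelCapInfo struct are placeholders for this write-up, not the names the PR actually uses:

use std::collections::HashMap;

// Hypothetical capacity info pulled off a 429 response.
#[derive(Debug)]
struct ModelCapInfo {
    model: String,
    reset_after_seconds: Option<u64>,
}

// Return capacity info only when the 429 carries the model-cap headers;
// plain 429s keep the existing generic handling.
fn detect_model_cap(status: u16, headers: &HashMap<String, String>) -> Option<ModelCapInfo> {
    if status != 429 {
        return None;
    }
    let model = headers.get("x-model-at-capacity")?.clone();
    let reset_after_seconds = headers
        .get("x-capacity-reset-after")
        .and_then(|v| v.parse::<u64>().ok());
    Some(ModelCapInfo { model, reset_after_seconds })
}

fn main() {
    let mut headers = HashMap::new();
    headers.insert("x-model-at-capacity".to_string(), "some-model".to_string());
    headers.insert("x-capacity-reset-after".to_string(), "120".to_string());
    println!("{:?}", detect_model_cap(429, &headers));
}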

Scope

CLI/TUI only; protocol + error mapping updated to carry model‑cap info.
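As a rough illustration of what carrying model-cap info through the error mapping could look like; the enum, variant, and field names below are assumptions, not the PR's actual protocol types:

// Sketch of a protocol-level error that carries model-cap info to the TUI.
#[derive(Debug, Clone)]
enum ApiError {
    // Plain 429 with no extra context (existing behavior).
    RateLimited,
    // 429 tagged with model-capacity info; the TUI renders a
    // "try a different model" hint and never auto-switches.
    ModelAtCapacity {
        model: String,
        reset_after_seconds: Option<u64>,
    },
}

// Error mapping: upgrade a generic 429 to the richer variant when
// capacity info was detected on the response.
fn map_429(cap: Option<(String, Option<u64>)>) -> ApiError {
    match cap {
        Some((model, reset_after_seconds)) => ApiError::ModelAtCapacity {
            model,
            reset_after_seconds,
        },
        None => ApiError::RateLimited,
    }
}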

Tests
  - just fmt
  - cargo test -p codex-tui
  - cargo test -p codex-core --lib shell_snapshot::tests::try_new_creates_and_deletes_snapshot_file -- --nocapture (ran in isolated env)
  - validate local build with backend

@ccy-oai force-pushed the ccy/model-capacity-client-ux branch from fe58818 to 725c91b on January 29, 2026 00:11
@ccy-oai force-pushed the ccy/model-capacity-client-ux branch from 8e673f7 to 2ff31b4 on January 29, 2026 21:43

@chatgpt-codex-connector (bot, Contributor) left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 2ff31b4f10

Comment on lines +1187 to +1191
fn on_model_cap_error(&mut self, model: String, reset_after_seconds: Option<u64>) {
    self.finalize_turn();

    let mut message = format!("Model {model} is at capacity. Please try a different model.");
    if let Some(seconds) = reset_after_seconds {

P2: Preserve ErrorEvent context when handling model-cap errors

The model‑cap path rebuilds its own message and drops ErrorEvent.message, which can carry important prefixes (e.g., core/src/compact_remote.rs sends Error running remote compact task: …). When a model-cap error happens during those prefixed operations, the TUI will now show a generic capacity warning without context, making it look like the user’s main turn failed. Consider passing the original message into on_model_cap_error (or prepending it) so contextual prefixes aren’t lost for model-cap errors.

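For context, a minimal sketch of what the suggested change could look like; the helper name and signature here are hypothetical, not code from this PR:

// Keep the original ErrorEvent text as a prefix on the capacity warning
// instead of dropping it (the reviewer's suggestion).
fn model_cap_message(
    original_message: Option<&str>,
    model: &str,
    reset_after_seconds: Option<u64>,
) -> String {
    let mut message = format!("Model {model} is at capacity. Please try a different model.");
    if let Some(seconds) = reset_after_seconds {
        message.push_str(&format!(" Try again in about {seconds} seconds."));
    }
    // Prepend contextual prefixes such as "Error running remote compact task: …"
    // so the user can still tell which operation hit the capacity limit.
    match original_message {
        Some(prefix) if !prefix.is_empty() => format!("{prefix} {message}"),
        _ => message,
    }
}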

@ccy-oai (Collaborator, Author) replied:

We're intentionally not preserving the prefixed message for model-cap errors. For this case we want a clear, consistent capacity warning (model + retry hint) rather than context prefixes, which can be redundant or noisy.

@ccy-oai merged commit b79bf69 into main on Jan 29, 2026
42 of 45 checks passed
@ccy-oai deleted the ccy/model-capacity-client-ux branch on January 29, 2026 22:59
@github-actions bot locked and limited conversation to collaborators on Jan 29, 2026