feat: queue pending prompts when session is busy (#16102) #19156

Closed

beenotung wants to merge 4 commits into anomalyco:dev from beenotung:feat/queue-pending-prompts

Conversation

@beenotung

Issue for this PR

Closes #16102

Type of change

  • New feature

What does this PR do?

When the session is busy (agent mid-task), incoming prompts are now queued instead of being rejected with a BusyError. Queued prompts are injected as user messages at the start of each loop iteration, before message history is loaded, allowing mid-task steering without interrupting the current task.

This implements the core mechanism requested in #16102: draining queued messages at the start of each loop() iteration and injecting them as context before the next LLM call.
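The drain-at-loop-start mechanism can be sketched roughly as follows. This is a minimal illustration only; `SessionQueue`, `enqueue`, and `drain` are hypothetical names, not opencode's actual API:

```typescript
// Hypothetical sketch of queuing prompts while a session is busy.
// The resolve callback lets the sync endpoint wait until the prompt
// is actually picked up by the loop.
type QueuedPrompt = { text: string; resolve: () => void }

class SessionQueue {
  private pending: QueuedPrompt[] = []

  // Called by a prompt endpoint when the session is mid-task,
  // instead of throwing a BusyError.
  enqueue(text: string): Promise<void> {
    return new Promise<void>((resolve) => this.pending.push({ text, resolve }))
  }

  // Called at the start of each loop() iteration, before loading
  // message history: empties the queue and returns the prompt texts
  // to inject as user messages.
  drain(): string[] {
    const drained = this.pending.splice(0)
    for (const p of drained) p.resolve()
    return drained.map((p) => p.text)
  }
}
```

The promise returned by `enqueue` is what would let a synchronous endpoint block until the session becomes available, while an asynchronous endpoint can simply ignore it.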

Note: This PR was primarily generated by an AI coding assistant (opencode). Additional review for edge cases and consistency with existing patterns would be valuable.

How did you verify your code works?

The changes handle both sync and async prompt endpoints:

  • Sync endpoint: queues the request and resolves when the session becomes available
  • Async endpoint: logs the queued status
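The split between the two endpoints could look roughly like this. Handler names and the queue interface are assumptions for illustration, not the actual opencode endpoint code:

```typescript
// Minimal queue interface assumed by both handlers (hypothetical).
interface PromptQueue {
  enqueue(text: string): Promise<void>
}

// Sync endpoint: block until the session drains the queued prompt,
// then respond to the caller.
async function promptSync(queue: PromptQueue, text: string) {
  await queue.enqueue(text)
  return { status: "processed" }
}

// Async endpoint: fire-and-forget; just report that the prompt
// was queued.
function promptAsync(queue: PromptQueue, text: string) {
  void queue.enqueue(text)
  return { status: "queued" }
}
```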

Screenshots / recordings

N/A (backend change)

Checklist

  • I have tested my changes locally
  • I have not included unrelated changes in this PR

@github-actions github-actions bot added the needs:compliance This means the issue will auto-close after 2 hours. label Mar 25, 2026
@github-actions
Contributor

This PR doesn't fully meet our contributing guidelines and PR template.

What needs to be fixed:

  • Not all checklist items are checked. Please confirm you have tested locally and have not included unrelated changes.

Please edit this PR description to address the above within 2 hours, or it will be automatically closed.

If you believe this was flagged incorrectly, please let a maintainer know.

@github-actions
Contributor

The following comment was made by an LLM; it may be inaccurate:

Based on my search results, I found several related PRs about queuing and message handling, but they appear to address different aspects or older features. The most relevant ones are:

However, none of these appear to be duplicates of PR #19156. Your PR (#19156) implements a new backend mechanism to queue pending prompts when the session is busy (instead of rejecting with BusyError), which is different from the earlier client-side queuing and UI control features.

No duplicate PRs found for the specific feature being implemented in PR #19156.

- Fixed extra parenthesis in catch block (line 834)
- Added .opencode/memory/ to gitignore to keep agent notes local
- Add priority field to queue data structure (urgent/normal/background)
- Implement batched draining with configurable limit per loop iteration
- Sort queued messages by priority then by queue time (FIFO within same priority)
- Add context tagging for queued messages with <queued-message> wrapper
- Update PromptInput schema to accept priority parameter
- Update both sync and async prompt endpoints to support priority

This enhancement allows users to send high-priority messages that get
processed first, while lower-priority messages are batched efficiently.
- Update prompt_async error handling test to accept both .catch() and try/catch
- Add comprehensive tests for queue pending prompts feature:
  - Queuing behavior when session is busy
  - Priority levels (urgent/normal/background)
  - Smart draining with priority ordering
  - Batch limiting per iteration
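The priority ordering and batch limiting described in these commits could be sketched roughly as below. The field names, rank mapping, and the `<queued-message>` attribute are assumptions drawn from the commit summaries, not the actual implementation:

```typescript
// Three priority levels, lower rank drains first (assumption based
// on the commit messages above).
type Priority = "urgent" | "normal" | "background"
const rank: Record<Priority, number> = { urgent: 0, normal: 1, background: 2 }

interface Queued {
  text: string
  priority: Priority
  queuedAt: number // monotonically increasing enqueue timestamp
}

// Sort by priority, then FIFO within the same priority, and drain
// at most `limit` messages per loop iteration.
function drainBatch(queue: Queued[], limit: number): Queued[] {
  queue.sort(
    (a, b) => rank[a.priority] - rank[b.priority] || a.queuedAt - b.queuedAt,
  )
  return queue.splice(0, limit)
}

// Tag each drained message so the model can tell steering input
// apart from the original conversation.
const wrap = (m: Queued) =>
  `<queued-message priority="${m.priority}">${m.text}</queued-message>`
```

Sorting on an explicit `queuedAt` field (rather than relying on sort stability) keeps the FIFO-within-priority guarantee explicit, and `splice(0, limit)` leaves the remainder queued for the next iteration.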
@github-actions
Contributor

This pull request has been automatically closed because it was not updated to meet our contributing guidelines within the 2-hour window.

Feel free to open a new pull request that follows our guidelines.

@github-actions github-actions bot removed the needs:compliance This means the issue will auto-close after 2 hours. label Mar 25, 2026
@github-actions github-actions bot closed this Mar 25, 2026
@beenotung
Author

I'm adding additional changes to the PR; I'll test them and re-open it when it's ready.

Others are welcome to join in as well.


Labels

None yet

Projects

None yet

Development

Successfully merging this pull request may close these issues.

[FEATURE]: inject queued messages into the next LLM call instead of interrupting the current task

1 participant