
Simplify chat.autoReply: skip questions instead of LLM-answering them #300926

Merged
meganrogge merged 2 commits into main from digitarald/simplify-auto-reply on Mar 12, 2026

Conversation

@digitarald
Contributor

Summary

When chat.autoReply is enabled, the questions tool now returns the same "user is not available, use your best judgment" response used in autopilot mode — instead of sending questions to a separate LLM call for answer resolution.

Removes ~450 lines of complexity (prompt engineering, JSON parsing with retry, fuzzy option matching, fallback answer generation, opt-in dialog, and storage management).

Motivation

The previous auto-reply system sent each question carousel to a separate LLM call to try to pick answers. This has a few problems:

  1. Lower quality answers — The auto-reply model sees far less context than the agent that asked the question. The agent already has the full conversation, tool results, and user intent; the auto-reply model just gets the carousel questions and a snippet of the original request. Returning to the agent with "use your best judgment" lets it leverage all that context.

  2. Extra cost and latency — Each carousel triggered an additional model request (with retry on parse failure), adding wait time and token cost for an answer that was often worse than what the agent would choose on its own.

  3. Complexity — ~450 lines of prompt engineering, JSON parsing, fuzzy option matching, fallback generation, opt-in dialog management, and storage state — all to approximate what the agent can do natively.

The simpler approach: when auto-reply is on, the tool immediately returns the autopilot response ("The user is not available to respond and will review your work later. Work autonomously and make good decisions.") and appends a completed carousel in the UI so users can see what was skipped.

Changes

| File | Change |
| --- | --- |
| askQuestionsTool.ts | Merged autopilot + auto-reply into a single check — when either is active, return the autopilot response directly at the tool level |
| chatQuestionCarouselAutoReply.ts | Deleted — the entire 454-line LLM-powered auto-reply class |
| chatListRenderer.ts | Removed all auto-reply orchestration: import, fields, instantiation, maybeAutoReplyToQuestionCarousel(), getRequestMessageText(), _isAutopilotForContext(), and unused imports |
| chat.contribution.ts | Updated setting description to reflect the new behavior |
| askQuestionsTool.test.ts | Updated test to pass IConfigurationService to the new constructor |
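The core change in askQuestionsTool.ts can be sketched roughly as follows. This is a minimal, self-contained illustration, not the actual VS Code source: the names `ConfigReader` and `resolveQuestionResponse` are hypothetical stand-ins for the real `IConfigurationService` plumbing.

```typescript
// Hypothetical sketch of the merged tool-level check. Names are
// illustrative, not the actual VS Code internals.

interface ConfigReader {
  getValue(key: string): unknown;
}

const AUTOPILOT_RESPONSE =
  'The user is not available to respond and will review your work later. ' +
  'Work autonomously and make good decisions.';

// Single check replacing the old two-path logic: if either autopilot or
// chat.autoReply is active, skip the interactive carousel and return the
// canned autopilot response immediately.
function resolveQuestionResponse(
  config: ConfigReader,
  isAutopilot: boolean
): { skipped: boolean; text?: string } {
  const autoReply = config.getValue('chat.autoReply') === true;
  if (isAutopilot || autoReply) {
    return { skipped: true, text: AUTOPILOT_RESPONSE };
  }
  return { skipped: false }; // fall through to the normal interactive flow
}
```

The point of doing this at the tool level is that the agent that asked the question handles the "use your best judgment" response with its full conversation context, rather than a second model guessing from a snippet.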

Testing

  • TypeScript compilation passes (only pre-existing unrelated errors remain)
  • All 10 askQuestionsTool unit tests pass
  • The auto-reply behavior now follows the same well-tested autopilot codepath

When `chat.autoReply` is enabled, the questions tool now returns the
same 'user is not available, use your best judgment' response used in
autopilot mode — instead of sending questions to a separate LLM call
for answer resolution.

This removes ~450 lines of LLM prompt engineering, JSON parsing with
retry, fuzzy option matching, fallback answer generation, and opt-in
dialog management.
Copilot AI review requested due to automatic review settings March 11, 2026 21:34
@digitarald digitarald requested a review from meganrogge March 11, 2026 21:36
@vs-code-engineering vs-code-engineering bot added this to the 1.112.0 milestone Mar 11, 2026

Copilot AI left a comment


Pull request overview

This PR simplifies the chat.autoReply feature by removing the separate LLM-based auto-answering flow for question carousels and instead returning the existing “user not available, use best judgment” autopilot-style response directly from the askQuestions tool when auto-reply is enabled.

Changes:

  • Update AskQuestionsTool to treat chat.autoReply similarly to Autopilot: immediately append a completed carousel and return the autopilot response without waiting for user input.
  • Remove UI-side auto-reply orchestration from ChatListItemRenderer (and delete the dedicated auto-reply implementation).
  • Update configuration description and adjust unit tests for the new AskQuestionsTool constructor.

Reviewed changes

Copilot reviewed 5 out of 5 changed files in this pull request and generated 1 comment.

| File | Description |
| --- | --- |
| src/vs/workbench/contrib/chat/common/tools/builtinTools/askQuestionsTool.ts | Adds IConfigurationService and auto-responds when chat.autoReply is enabled. |
| src/vs/workbench/contrib/chat/browser/widget/chatQuestionCarouselAutoReply.ts | Deletes the previous LLM-based auto-reply implementation. |
| src/vs/workbench/contrib/chat/browser/widget/chatListRenderer.ts | Removes auto-reply orchestration logic from the renderer. |
| src/vs/workbench/contrib/chat/browser/chat.contribution.ts | Updates chat.autoReply setting description to reflect "skip" behavior. |
| src/vs/workbench/contrib/chat/test/common/tools/builtinTools/askQuestionsTool.test.ts | Updates test construction to pass a configuration service. |

…id-session

Listen for chat.autoReply config changes in chatListRenderer and skip
all pending question carousels when the setting becomes enabled. This
handles the edge case where a carousel is already awaiting user input
and the user enables auto-reply or switches to autopilot afterward.
@digitarald
Contributor Author

Good catch! Addressed in a76b7f7 — added a configService.onDidChangeConfiguration listener in chatListRenderer that skips all pending question carousels across sessions when chat.autoReply flips to true. This follows the same pendingQuestionCarousels + carousel.skip() pattern already used for auto-skip on chat submission.

@meganrogge meganrogge merged commit 121ed5f into main Mar 12, 2026
20 checks passed
@meganrogge meganrogge deleted the digitarald/simplify-auto-reply branch March 12, 2026 00:43