
fix: insert separator between Responses API phase commentary and final text#312173

Merged
ulugbekna merged 3 commits into main from ulugbekna/agents/responses-api-phase-commentary-fix
Apr 24, 2026

Conversation

@ulugbekna
Contributor

Fixes #312163

Problem

When the Responses API streams multiple message output items — e.g. a commentary phase followed by a final_answer phase — the text from both items gets directly concatenated in the textAccumulator without any separator. This produces fused text like:

Responding directly in commentary as requested. My name is GitHub Copilot.My name is GitHub Copilot.

instead of properly separated paragraphs.

Fix

Track the output_index of the last response.output_text.delta event in OpenAIResponsesProcessor. When text arrives from a different output item, emit a \n\n paragraph separator first.

This only affects response.output_text.delta events — reasoning events (response.reasoning_summary_text.delta) use a completely separate code path and are unaffected.
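The separator logic can be sketched as follows. This is a hedged, minimal sketch, not the actual `OpenAIResponsesProcessor` implementation: the `OutputTextDelta` shape and the `TextAccumulator` class here are illustrative stand-ins, assuming only that each `response.output_text.delta` event carries an `output_index` and a `delta` string.

```typescript
// Hypothetical minimal sketch of the fix: track the output_index of the
// last text delta and emit a paragraph break when it changes.
interface OutputTextDelta {
	output_index: number;
	delta: string;
}

class TextAccumulator {
	private text = '';
	private lastOutputIndex: number | undefined;

	push(event: OutputTextDelta): void {
		// A delta from a different output item than the previous one means
		// a new phase (e.g. commentary -> final_answer); separate it so the
		// two phases don't fuse into one run of text.
		if (this.lastOutputIndex !== undefined && event.output_index !== this.lastOutputIndex) {
			this.text += '\n\n';
		}
		this.lastOutputIndex = event.output_index;
		this.text += event.delta;
	}

	get value(): string {
		return this.text;
	}
}
```

With this shape, `push({ output_index: 0, delta: 'Commentary text.' })` followed by `push({ output_index: 1, delta: 'Final text.' })` yields `'Commentary text.\n\nFinal text.'` rather than the fused `'Commentary text.Final text.'`.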

Test

Added a test based on a real-world stream dump that replays the full SSE event sequence for a commentary → final_answer two-phase response and asserts the accumulated text is properly separated.
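The shape of that test can be sketched as below. This is an assumption-laden abbreviation, not the real spec file: the real test replays a full SSE dump through `processResponseFromChatEndpoint`, whereas here the event list is trimmed to the text deltas and the fold inlines the separator logic.

```typescript
// Hedged sketch of the regression test's shape: replay a two-phase delta
// sequence and check that the accumulated text separates the phases.
type StreamEvent =
	| { type: 'response.output_text.delta'; output_index: number; delta: string }
	| { type: 'response.completed' };

const events: StreamEvent[] = [
	{ type: 'response.output_text.delta', output_index: 0, delta: 'Responding directly in commentary as requested. ' },
	{ type: 'response.output_text.delta', output_index: 0, delta: 'My name is GitHub Copilot.' },
	{ type: 'response.output_text.delta', output_index: 1, delta: 'My name is GitHub Copilot.' },
	{ type: 'response.completed' },
];

// Fold the deltas, inserting the paragraph break on output_index change.
let accumulated = '';
let lastIndex: number | undefined;
for (const e of events) {
	if (e.type !== 'response.output_text.delta') { continue; }
	if (lastIndex !== undefined && e.output_index !== lastIndex) {
		accumulated += '\n\n';
	}
	lastIndex = e.output_index;
	accumulated += e.delta;
}
```

The assertion then checks that `accumulated` contains `\n\n` between the commentary and final_answer text instead of the fused output reported in the issue.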

ulugbekna and others added 2 commits April 23, 2026 18:38
… and final text

When the Responses API streams a commentary message item followed by a
final message item, the text from both items gets directly concatenated
in the textAccumulator without any separator, producing e.g.
'Commentary text.Final text.' instead of properly separated text.

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
When the Responses API streams multiple message output items (e.g.
commentary followed by final), their text deltas were all accumulated
into a single textAccumulator without any separator. This caused text
like 'Commentary text.Final text.' instead of properly separated
paragraphs.

Track the output_index of the last text delta and emit a paragraph
break (\n\n) when text arrives from a different output item.

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Contributor

Copilot AI left a comment


Pull request overview

Fixes a Copilot Responses API streaming formatting issue where text deltas from separate output items (e.g., commentary then final_answer) were concatenated without separation, producing fused text in the accumulator.

Changes:

  • Track the last response.output_text.delta output_index in OpenAIResponsesProcessor and insert a \n\n separator when the stream switches to a new output item.
  • Add a unit test that replays a real-world SSE sequence for a commentary → final_answer two-phase response and asserts the accumulated text includes the separator.
File summary:

  • extensions/copilot/src/platform/endpoint/node/responsesApi.ts — Inserts a paragraph separator when output_text.delta events move to a different output_index to prevent phase text fusion.
  • extensions/copilot/src/platform/endpoint/node/test/responsesApi.spec.ts — Adds a regression test that replays a two-phase SSE stream and verifies the accumulator includes \n\n between phases.

Copilot's findings

  • Files reviewed: 2/2 changed files
  • Comments generated: 1

Comment on lines +1281 to +1283
// The accumulated text must separate commentary and final_answer text
const lastTextBeforeCompletion = accumulatedTexts[accumulatedTexts.length - 2];
expect(lastTextBeforeCompletion).toBe(

Copilot AI Apr 23, 2026


This assertion depends on implementation details of how many progress callbacks fire near completion (using accumulatedTexts.length - 2). That’s brittle if processResponseFromChatEndpoint changes to emit fewer/more callbacks on response.completed. Consider asserting against the final accumulated text (e.g., last entry) or locating the last distinct/non-empty accumulated value before completion instead of indexing from the end.

Suggested change
// The accumulated text must separate commentary and final_answer text
const lastTextBeforeCompletion = accumulatedTexts[accumulatedTexts.length - 2];
expect(lastTextBeforeCompletion).toBe(
// The accumulated text must separate commentary and final_answer text.
// Find the last meaningful accumulated value rather than depending on
// how many callbacks fire during stream completion.
const lastAccumulatedText = [...accumulatedTexts].reverse().find(text => text.length > 0);
expect(lastAccumulatedText).toBe(

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
@ulugbekna ulugbekna merged commit 2993fbd into main Apr 24, 2026
26 checks passed
@ulugbekna ulugbekna deleted the ulugbekna/agents/responses-api-phase-commentary-fix branch April 24, 2026 08:49
@vs-code-engineering vs-code-engineering Bot added this to the 1.118.0 milestone Apr 24, 2026


Development

Successfully merging this pull request may close these issues.

Commentary and final phases are concatenated together without newline.

3 participants