Conversation
… and final text. When the Responses API streams a commentary message item followed by a final message item, the text from both items gets directly concatenated in the textAccumulator without any separator, producing e.g. 'Commentary text.Final text.' instead of properly separated text. Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
When the Responses API streams multiple message output items (e.g. commentary followed by final), their text deltas were all accumulated into a single textAccumulator without any separator. This caused text like 'Commentary text.Final text.' instead of properly separated paragraphs. Track the output_index of the last text delta and emit a paragraph break (\n\n) when text arrives from a different output item. Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Contributor
Pull request overview
Fixes a Copilot Responses API streaming formatting issue where text deltas from separate output items (e.g., commentary then final_answer) were concatenated without separation, producing fused text in the accumulator.
Changes:
- Track the last `response.output_text.delta` `output_index` in `OpenAIResponsesProcessor` and insert a `\n\n` separator when the stream switches to a new output item.
- Add a unit test that replays a real-world SSE sequence for a commentary → final_answer two-phase response and asserts the accumulated text includes the separator.
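In spirit, the separator logic above can be sketched as a small standalone accumulator. This is a hedged sketch only: `TextAccumulator`, `OutputTextDelta`, and `handleTextDelta` are illustrative names, not the real `OpenAIResponsesProcessor` internals.

```typescript
// Illustrative sketch of tracking output_index across output_text.delta events.
// All names here are hypothetical stand-ins for the actual implementation.

interface OutputTextDelta {
	output_index: number;
	delta: string;
}

class TextAccumulator {
	private text = '';
	// output_index of the last delta we appended, or undefined before the first one
	private lastOutputIndex: number | undefined;

	handleTextDelta(event: OutputTextDelta): void {
		if (this.lastOutputIndex !== undefined && event.output_index !== this.lastOutputIndex) {
			// The stream switched to a new output item (e.g. commentary -> final_answer):
			// emit a paragraph break so the phases do not fuse together.
			this.text += '\n\n';
		}
		this.lastOutputIndex = event.output_index;
		this.text += event.delta;
	}

	get value(): string {
		return this.text;
	}
}

// Example: two output items streamed as deltas
const acc = new TextAccumulator();
acc.handleTextDelta({ output_index: 0, delta: 'Commentary ' });
acc.handleTextDelta({ output_index: 0, delta: 'text.' });
acc.handleTextDelta({ output_index: 1, delta: 'Final ' });
acc.handleTextDelta({ output_index: 1, delta: 'text.' });
console.log(JSON.stringify(acc.value)); // "Commentary text.\n\nFinal text."
```

Comparing the numeric `output_index` rather than inspecting item content keeps the check O(1) per delta and independent of what the items actually contain.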
| File | Description |
|---|---|
| extensions/copilot/src/platform/endpoint/node/responsesApi.ts | Inserts a paragraph separator when output_text.delta events move to a different output_index to prevent phase text fusion. |
| extensions/copilot/src/platform/endpoint/node/test/responsesApi.spec.ts | Adds a regression test that replays a two-phase SSE stream and verifies the accumulator includes \n\n between phases. |
Copilot's findings
- Files reviewed: 2/2 changed files
- Comments generated: 1
Comment on lines +1281 to +1283:

```ts
// The accumulated text must separate commentary and final_answer text
const lastTextBeforeCompletion = accumulatedTexts[accumulatedTexts.length - 2];
expect(lastTextBeforeCompletion).toBe(
```
This assertion depends on implementation details of how many progress callbacks fire near completion (using `accumulatedTexts.length - 2`). That is brittle if `processResponseFromChatEndpoint` changes to emit fewer or more callbacks on `response.completed`. Consider asserting against the final accumulated text (e.g., the last entry), or locating the last distinct/non-empty accumulated value before completion, instead of indexing from the end.
Suggested change:

```diff
-// The accumulated text must separate commentary and final_answer text
-const lastTextBeforeCompletion = accumulatedTexts[accumulatedTexts.length - 2];
-expect(lastTextBeforeCompletion).toBe(
+// The accumulated text must separate commentary and final_answer text.
+// Find the last meaningful accumulated value rather than depending on
+// how many callbacks fire during stream completion.
+const lastAccumulatedText = [...accumulatedTexts].reverse().find(text => text.length > 0);
+expect(lastAccumulatedText).toBe(
```
Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
dileepyavan
approved these changes
Apr 24, 2026
Fixes #312163
Problem
When the Responses API streams multiple message output items — e.g. a
`commentary` phase followed by a `final_answer` phase — the text from both items gets directly concatenated in the `textAccumulator` without any separator. This produces fused text like 'Commentary text.Final text.' instead of properly separated paragraphs.
Fix
Track the `output_index` of the last `response.output_text.delta` event in `OpenAIResponsesProcessor`. When text arrives from a different output item, emit a `\n\n` paragraph separator first.

This only affects `response.output_text.delta` events — reasoning events (`response.reasoning_summary_text.delta`) use a completely separate code path and are unaffected.

Test
Added a test based on a real-world stream dump that replays the full SSE event sequence for a commentary → final_answer two-phase response and asserts the accumulated text is properly separated.
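The shape of that test can be approximated with a standalone sketch. This is not the real test: the actual one drives `processResponseFromChatEndpoint` with a full SSE dump, while `accumulate` here is a hypothetical stand-in that mimics one progress callback per delta. It also demonstrates the robust assertion style suggested in review (last non-empty value instead of indexing from the end).

```typescript
// Hypothetical regression-test sketch; the real test replays a recorded SSE
// stream through the production processor, which is not reproduced here.

type DeltaEvent = { type: 'response.output_text.delta'; output_index: number; delta: string };

// Minimal accumulator mirroring the fix: paragraph break on output_index change,
// recording one snapshot per delta (one "progress callback" each).
function accumulate(events: DeltaEvent[]): string[] {
	const snapshots: string[] = [];
	let text = '';
	let lastIndex: number | undefined;
	for (const e of events) {
		if (lastIndex !== undefined && e.output_index !== lastIndex) {
			text += '\n\n';
		}
		lastIndex = e.output_index;
		text += e.delta;
		snapshots.push(text);
	}
	return snapshots;
}

// Two-phase stream: commentary (item 0) then final_answer (item 1)
const snapshots = accumulate([
	{ type: 'response.output_text.delta', output_index: 0, delta: 'Commentary text.' },
	{ type: 'response.output_text.delta', output_index: 1, delta: 'Final text.' },
]);

// Robust assertion per the review comment: take the last non-empty accumulated
// value rather than depending on how many callbacks fire near completion.
const last = [...snapshots].reverse().find(t => t.length > 0);
console.assert(last === 'Commentary text.\n\nFinal text.');
```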