stream: preserve toReadableSync batch after backpressure #63276
Open
trivikr wants to merge 1 commit into
Conversation
Keep the current batch and index across _read() calls so chunks that remain after push() returns false are emitted on later reads.

Fixes: nodejs#63275
Signed-off-by: Kamat, Trivikram <16024985+trivikr@users.noreply.github.com>
Assisted-by: openai:gpt-5.5
Codecov Report
✅ All modified and coverable lines are covered by tests.
Additional details and impacted files:
@@ Coverage Diff @@
## main #63276 +/- ##
========================================
Coverage 90.04% 90.05%
========================================
Files 713 714 +1
Lines 225003 225256 +253
Branches 42536 42573 +37
========================================
+ Hits 202606 202849 +243
- Misses 14177 14181 +4
- Partials 8220 8226 +6
jasnell approved these changes on May 13, 2026
toReadableSync() could drop chunks from a batch when classic stream backpressure was applied.

When _read() pulled a batch with multiple chunks and push() returned false, the method returned immediately without saving the remaining batch items. A later _read() call advanced to the next iterator result, so the unpushed chunks were lost.

This stores the current batch and index across _read() calls, allowing toReadableSync() to resume the same batch after backpressure clears.

Fixes: #63275
Assisted-by: openai:gpt-5.5