Fix NodeJsMessageStream to handle messages spanning multiple chunks #16
The `NodeJsMessageStream` class had a critical bug where it would drop messages that were longer than a single data chunk. The original implementation would split incoming chunks by newlines and immediately attempt to parse each part as complete JSON, causing parse errors when messages spanned multiple chunks.

## Problem
When a JSON-RPC message was longer than the TCP chunk size (typically 64KB), it would be split across multiple `data` events. The original code would try to parse incomplete JSON fragments, resulting in parse errors and lost messages.
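For illustration, a minimal sketch of the per-chunk parsing pattern described above; this is not the project's actual code, and `attachNaiveReader` and its parameters are hypothetical names:

```typescript
import { Readable } from "node:stream";

// Illustrative failure mode: each chunk is split on newlines and parsed
// immediately, so a message cut mid-chunk yields a fragment that
// JSON.parse rejects and the message is lost.
function attachNaiveReader(stream: Readable, onMessage: (msg: unknown) => void): void {
  stream.on("data", (chunk: Buffer) => {
    for (const part of chunk.toString("utf8").split("\n")) {
      if (part.trim().length > 0) {
        onMessage(JSON.parse(part)); // throws when `part` is a partial message
      }
    }
  });
}
```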
## Solution

Added proper buffering to accumulate incoming chunks and only process complete messages:
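A minimal sketch of that buffering approach, assuming newline-delimited JSON messages; the class and callback names here are illustrative, not the actual `NodeJsMessageStream` API:

```typescript
import { Readable } from "node:stream";

// Sketch of a buffered reader: data is accumulated until at least one
// complete newline-terminated message is available, so messages that
// span several chunks are reassembled before parsing.
class BufferedMessageReader {
  private buffer = "";

  constructor(stream: Readable, private onMessage: (msg: unknown) => void) {
    stream.on("data", (chunk: Buffer) => this.handleData(chunk));
  }

  private handleData(chunk: Buffer): void {
    this.buffer += chunk.toString("utf8");

    // Parse every complete line; whatever follows the last newline stays
    // in the buffer until the next `data` event completes it.
    let newlineIndex: number;
    while ((newlineIndex = this.buffer.indexOf("\n")) !== -1) {
      const line = this.buffer.slice(0, newlineIndex).trim();
      this.buffer = this.buffer.slice(newlineIndex + 1);
      if (line.length > 0) {
        this.onMessage(JSON.parse(line));
      }
    }
  }
}
```

Keeping the leftover tail in `buffer` rather than discarding it is what lets a message larger than a single chunk survive across `data` events.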
## Changes Made
- Added a `buffer` property to accumulate incoming data chunks

## Impact
Fixes #15.