
Conversation


Copilot AI commented Sep 6, 2025

The NodeJsMessageStream class had a critical bug where it would drop messages that were longer than a single data chunk. The original implementation would split incoming chunks by newlines and immediately attempt to parse each part as complete JSON, causing parse errors when messages spanned multiple chunks.

Problem

readStream.on("data", (chunk: any) => {
	const str = chunk.toString("utf8");
	const parts = str.trim().split("\n"); // This breaks for long messages
	for (const p of parts) {
		const obj = JSON.parse(p) as Message; // Fails when p is incomplete
		this.onMessage(obj);
	}
});

When a JSON-RPC message was larger than a single stream chunk (chunk size depends on the stream's highWaterMark, typically tens of kilobytes), it was delivered across multiple data events. The original code tried to parse each incomplete fragment as complete JSON, resulting in parse errors and lost messages.
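The failure mode can be reproduced in isolation (the message content here is illustrative): split a newline-terminated JSON message at an arbitrary byte offset and parse each fragment independently, as the original handler did.

```typescript
// Hypothetical example: one JSON-RPC message arriving as two chunks.
const message =
	JSON.stringify({ jsonrpc: "2.0", method: "call", params: { data: "x".repeat(100) } }) + "\n";
const chunk1 = message.slice(0, 50); // incomplete JSON fragment
const chunk2 = message.slice(50);

// The original handler parsed each chunk on its own:
let failed = false;
try {
	JSON.parse(chunk1.trim()); // throws SyntaxError on the fragment
} catch {
	failed = true;
}
console.log(failed, chunk2.length > 0);
```

The first fragment is not valid JSON on its own, so the parse throws and the message is lost.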

Solution

Added proper buffering to accumulate incoming chunks and only process complete messages:

private buffer: string = "";

readStream.on("data", (chunk: any) => {
	const str = chunk.toString("utf8");
	this.buffer += str;
	
	// Process complete messages (terminated by newlines)
	let newlineIndex: number;
	while ((newlineIndex = this.buffer.indexOf("\n")) !== -1) {
		const messageStr = this.buffer.substring(0, newlineIndex).trim();
		this.buffer = this.buffer.substring(newlineIndex + 1);
		
		if (messageStr.length > 0) {
			try {
				const obj = JSON.parse(messageStr) as Message;
				this.onMessage(obj);
			} catch (error) {
				console.error(`Failed to parse JSON message: ${messageStr}`, error);
			}
		}
	}
});
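The same buffering logic can be exercised outside the class with a small standalone sketch (function and variable names here are illustrative, not part of the actual `NodeJsMessageStream` API):

```typescript
// Minimal standalone sketch of the buffering approach.
function createLineParser(onMessage: (obj: unknown) => void): (chunk: string) => void {
	let buffer = "";
	return (chunk: string) => {
		buffer += chunk;
		let newlineIndex: number;
		while ((newlineIndex = buffer.indexOf("\n")) !== -1) {
			const messageStr = buffer.substring(0, newlineIndex).trim();
			buffer = buffer.substring(newlineIndex + 1);
			if (messageStr.length > 0) {
				onMessage(JSON.parse(messageStr));
			}
		}
	};
}

// A message split across three chunks is still delivered exactly once:
const received: unknown[] = [];
const feed = createLineParser((m) => received.push(m));
const msg = JSON.stringify({ jsonrpc: "2.0", id: 1, result: "ok" }) + "\n";
feed(msg.slice(0, 10));
feed(msg.slice(10, 20));
feed(msg.slice(20));
console.log(received.length); // 1
```

Until the terminating newline arrives, partial data simply accumulates in the buffer; nothing is parsed and nothing is lost.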

Changes Made

  • Added a buffer property to accumulate incoming data chunks
  • Modified the data handler to properly detect message boundaries using newline delimiters
  • Added error handling to prevent crashes on malformed JSON
  • Ensured incomplete messages remain in the buffer for subsequent chunks

Impact

  • Fixed: Messages longer than chunk size are now correctly parsed
  • Maintained: Multiple messages arriving in a single chunk are still handled correctly
  • Robust: Handles edge cases like empty chunks and messages split at boundaries
  • Safe: Added error handling prevents crashes on malformed data
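The edge cases above can be checked with a self-contained sketch of the same parsing loop (names are illustrative): two messages in one chunk, an empty chunk, a message split at a chunk boundary, and a blank line.

```typescript
// Self-contained check of the edge cases listed above.
function parseChunks(chunks: string[]): unknown[] {
	const out: unknown[] = [];
	let buffer = "";
	for (const chunk of chunks) {
		buffer += chunk;
		let i: number;
		while ((i = buffer.indexOf("\n")) !== -1) {
			const line = buffer.substring(0, i).trim();
			buffer = buffer.substring(i + 1);
			if (line.length > 0) out.push(JSON.parse(line));
		}
	}
	return out;
}

const result = parseChunks([
	'{"id":1}\n{"id":2}\n', // two messages in one chunk
	"",                     // empty chunk
	'{"id"',                // message split at a chunk boundary...
	':3}\n\n',              // ...completed here, followed by a blank line
]);
console.log(result.length); // 3
```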

Fixes #15.




Co-authored-by: hediet <2931520+hediet@users.noreply.github.com>
Copilot AI changed the title [WIP] (json-rpc-streams) NodeJsMessageStream drops longer messages and potentially some sent before and after Fix NodeJsMessageStream to handle messages spanning multiple chunks Sep 6, 2025
Copilot finished work on behalf of hediet September 6, 2025 11:58
Copilot AI requested a review from hediet September 6, 2025 11:58
@hediet hediet marked this pull request as ready for review September 7, 2025 13:24
@hediet hediet merged commit 160e4a2 into master Sep 7, 2025
