Socket.bytesWritten broken when data is buffered for slow client #5128

Closed
bradisbell opened this Issue Mar 24, 2013 · 9 comments

@bradisbell

Sometimes, when I access Socket.bytesWritten on a server, I get a crash:

net.js:671
    bytes += Buffer.byteLength(el[0], el[1]);
                                 ^
TypeError: Cannot read property '0' of undefined
    at net.js:671:34
    at Array.forEach (native)
    at Socket.self [as bytesWritten] (net.js:669:16)
    at Timer.<anonymous> (bytesWrittenBroken.js:9:81)
    at Timer.timer.ontimeout (timers.js:242:14)

I have narrowed this problem down to situations where data is being sent to slower clients (such as media players that control the TCP window size, or clients on slow network connections). I had a different problem under the same conditions with Node v0.8, which you can read about here: #4736

You can reproduce this problem with a standard Node.js HTTP or Socket server that sends data, and a client that accepts data slowly.

Server

var http = require('http');
http.createServer(function (req, res) {
    res.writeHead(200, {'Content-Type': 'text/plain'});
    res.write(new Buffer(1048576));
    res.write(new Buffer(1048576));
    res.write(new Buffer(1048576));
    setInterval(function () {
        res.write(new Buffer(1048576));
        console.log('Read: ' + req.socket.bytesRead + '  Written: ' + req.socket.bytesWritten);
    }, 1000);
}).listen(1337, '127.0.0.1');
console.log('Server running at http://127.0.0.1:1337/');

Client

var net = require('net');
var rateLimit = 32768,
    host = 'localhost',
    startTime;

function checkRateLimit (stream) {
    if ((Math.ceil((new Date() - startTime)/1000) * rateLimit) < stream.bytesRead) {
        if (!stream.paused) {
            console.log('Pausing stream');
            stream.pause();
            stream.paused = true;
        }
    } else {
        if (stream.paused) {
            console.log('Resuming stream');
            stream.resume();
            stream.paused = false;
        }
    }
}
net.connect(1337, host, function () {
    var stream = this;
    stream.paused = false;
    startTime = new Date();
    stream.write('GET / HTTP/1.1\r\n\r\n');
    setInterval(function () {
        checkRateLimit(stream);
    }, 100);
}).on('data', function (data) {
    console.log('Received ' + data.length + ' bytes');
    checkRateLimit(this);
});

Node.js v0.10.1 on Windows x64

@bradisbell

I haven't fully tested this patch, but things seem to work if you replace this:

  state.buffer.forEach(function(el) {
    el = el[0];
    bytes += Buffer.byteLength(el[0], el[1]);
  });

With this:

    state.buffer.forEach(function(el) {
        bytes += Buffer.byteLength(el.chunk.toString(el.encoding), el.encoding);
    });

This is at line 669 of net.js.
The whole idea of needing a string to figure out a Buffer's length seems very wrong and error-prone to me. I read a note on another issue that said this was a necessary evil. Are the buffers in Socket._writableState.buffer not the lengths they were written as? Why not bytes += el.chunk.length instead?
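To illustrate the question, here is a standalone sketch (chunkBytes is a made-up name, not net.js code): a Buffer chunk already knows its own byte length, and only string chunks need Buffer.byteLength together with an encoding, so round-tripping a Buffer through toString is unnecessary work.

```javascript
// Standalone sketch: byte length of a queued write chunk.
// A Buffer's .length is already its size in bytes; only string
// chunks need Buffer.byteLength() together with their encoding.
function chunkBytes(chunk, encoding) {
  if (Buffer.isBuffer(chunk)) {
    return chunk.length; // no string round-trip needed
  }
  return Buffer.byteLength(chunk, encoding);
}

console.log(chunkBytes(new Buffer(1048576), 'binary')); // 1048576
console.log(chunkBytes('héllo', 'utf8'));               // 6 (é is two bytes in UTF-8)
```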

@bradisbell

More testing has shown that this is a problem on Linux x64 as well with Node.js v0.10.1. This is not just a Windows issue.

@bnoordhuis
Member

/cc @isaacs - seems like it's fallout from the streams2-ification of things.

I haven't been able to reproduce it but I wager it's because lib/_stream_writable.js pushes WriteReqs into state.buffer, not array-like objects.
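To make the suspected shape mismatch concrete, a standalone illustration (my own sketch, assuming the pre-streams2 queue entries were array-shaped, as the indexing in net.js implies):

```javascript
// Standalone illustration of the suspected mismatch. The bytesWritten
// accounting indexes each queue entry as el[0]/el[1], which fits an
// array-shaped entry, but streams2 queues plain objects (WriteReqs)
// with named properties instead.
var arrayEntry = [['some data', 'utf8'], function cb() {}]; // shape the indexing assumes
var writeReq = { chunk: 'some data', encoding: 'utf8', callback: function () {} };

// Against the array shape, the indexing works:
var el = arrayEntry[0]; // ['some data', 'utf8']
console.log(Buffer.byteLength(el[0], el[1])); // 9

// Against a WriteReq, [0] is undefined, and indexing into it throws
// the TypeError seen in the report above:
try {
  var u = writeReq[0]; // undefined
  Buffer.byteLength(u[0], u[1]);
} catch (e) {
  console.log(e instanceof TypeError); // true
}
```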

@akovac
akovac commented Mar 29, 2013

I have also encountered this problem on 0.10.2. I cannot reliably reproduce it, but I have added a debugging try/catch around state.buffer.forEach, so maybe some context could help:

this._writableState.buffer looks like this:

  _writableState:
   { 
//... cut
     buffer:
      [ { chunk: ,
          encoding: 'binary',
          callback:
           { [Function]
             [length]: 0,
             [name]: '',
             [arguments]: null,
             [caller]: null,
             [prototype]: { [constructor]: [Circular] } } },
        [length]: 1 ] 
// cut...

so in the forEach, el is { chunk: ... }, and el = el[0] leaves el undefined; the subsequent Buffer.byteLength(el[0], el[1]) then throws the TypeError.

I can send full dump if needed.

@andyburke

+1 on this, unfortunately not much to add other than that I'm seeing it on Ubuntu x64 on AWS.

@andyburke

Actually, I just made a change to enable gzip compression with express.js, maybe that's making this more likely? I had not seen this before that change.

@indutny added a commit to indutny/node that referenced this issue Apr 7, 2013:
crypto: zero is not an error if writing 0 bytes (fix #5128, commit 8910184)
@bradisbell

@indutny I think you have referenced the wrong ticket in your commits. If not, can you briefly explain how your change fixes this issue?

@indutny
Member
indutny commented Apr 12, 2013

Yeah, I believe so. :)

Anyway, this issue should be resolved too in eb39c98 and c665b8e.

Can you please try the latest Node.js version and let me know if that works for you?

@bradisbell

Thanks! Yes, this problem is resolved in v0.10.4.

@bradisbell bradisbell closed this Apr 12, 2013