
http2: end is emitted before error when destroying client stream #39400

Closed
szmarczak opened this issue Jul 15, 2021 · 15 comments
Labels
confirmed-bug Issues with confirmed bugs. http2 Issues or PRs related to the http2 subsystem. stream Issues and PRs related to the stream subsystem.

Comments

@szmarczak
Member

Version

v16.4.2

Platform

Linux solus 5.13.1-187.current #1 SMP PREEMPT Wed Jul 7 19:52:26 UTC 2021 x86_64 GNU/Linux

Subsystem

http2

What steps will reproduce the bug?

import http2 from 'http2';
import stream from 'stream';

const session = http2.connect('https://petstore.swagger.io/v2/pet/findByStatus?status=available');
const request = session.request({
	':path': '/findByStatus?status=available',
	'x-trace-id': 'foo',
	'user-agent': 'foo-bar/baz, bash'
});

request.on('data', () => {
	request.destroy(new Error('error'));
});

request.on('end', () => {
	console.log('end');
});

request.end();

await stream.promises.pipeline(
	request,
	new stream.PassThrough()
);

console.log('success');

session.close();

How often does it reproduce? Is there a required condition?

Always.

What is the expected behavior?

Error: error
    at ClientHttp2Stream.<anonymous> (file:///home/szm/Desktop/got/demo.js:12:18)
    at ClientHttp2Stream.emit (node:events:406:35)
    at addChunk (node:internal/streams/readable:312:12)
    at readableAddChunk (node:internal/streams/readable:287:9)
    at ClientHttp2Stream.Readable.push (node:internal/streams/readable:226:10)
    at Http2Stream.onStreamRead (node:internal/stream_base_commons:190:23)

What do you see instead?

end
success

Additional information

Possibly related with #29929 or #35209

@asanoic

asanoic commented Jul 15, 2021

I think this might be expected behavior. Consider your code written like this:

import http2 from 'http2';
import stream from 'stream';

const session = http2.connect('https://petstore.swagger.io/v2/pet/findByStatus?status=available');
const request = session.request({
	':path': '/findByStatus?status=available',
	'x-trace-id': 'foo',
	'user-agent': 'foo-bar/baz, bash'
});

request.on('data', () => {
	request.destroy(new Error('error'));
});

request.on('end', () => {
	console.log('end');
});

request.end();

stream.promises.pipeline(
    request,
    new stream.PassThrough()
).then(x => {
    console.log('success');
    session.close();
});

The pipeline method just connects an empty stream because the request has already ended...

@szmarczak
Member Author

szmarczak commented Jul 15, 2021

That's not possible: the stream cannot be connected in the same tick the connection is initiated.

@Ayase-252 Ayase-252 added the http2 Issues or PRs related to the http2 subsystem. label Jul 16, 2021
@lpinca
Member

lpinca commented Jul 27, 2021

It seems the 'end' event is emitted even if the TCP connection is not established:

import http2 from 'http2';

const session = http2.connect('https://0273ea441bca0690c5454e2f3380fe0e/');
const request = session.request();

request.on('data', () => {
  request.destroy(new Error('error'));
});

request.on('end', () => {
  console.log('end');
});

request.end();

@lpinca
Member

lpinca commented Jul 28, 2021

I think the issue here is that stream.pipeline swallows the error:

$ cat test.mjs
import stream from 'stream';

const duplex = new stream.Duplex({
  read() {},
  write(chunk, encoding, callback) {
    callback();
  }
});

duplex.on('data', function () {
  this.destroy(new Error('error'));
});

duplex.on('end', function () {
  console.log('end');
});

// duplex.on('error', console.error);

duplex.push('foo');
duplex.push(null);

stream.pipeline(duplex, new stream.PassThrough(), (err) => {
  if (err) throw err;

  console.log('success');
});
$ node test.mjs
end
success

@lpinca lpinca added the stream Issues and PRs related to the stream subsystem. label Jul 28, 2021
@lpinca
Member

lpinca commented Jul 28, 2021

cc: @nodejs/streams

@mcollina
Member

This is a very interesting edge case. Consider the following:

import stream from 'stream';

const duplex = new stream.Duplex({
  read() {},
  write(chunk, encoding, callback) {
    callback();
  }
});

let tick = false
duplex.on('data', function () {
  tick = true
  process.nextTick(() => {
    tick = false
  })
  this.destroy(new Error('error'));
});

duplex.on('end', function () {
  console.log('end', tick);
});

duplex.on('error', console.error);

duplex.push('foo');
duplex.push(null);

The problem is that 'error' is emitted after 'end', while the logical order we would expect is 'error' before 'end'. The reason is that 'end' is emitted on the same tick as the last 'data' event instead of on a subsequent tick.

pipeline sees the stream as correctly ended; that's why it swallows the error. Its behavior is correct.

This looks like a bug we should fix.

@ronag wdyt?
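
For illustration, here is a minimal sketch of the ordering one would expect (this is an assumption about the desired behavior, using a bare EventEmitter rather than real stream internals): deferring the 'end' notification by one tick lets an error raised during the last 'data' event preempt it.

```javascript
import { EventEmitter } from 'events';

const events = [];
const source = new EventEmitter();
let errored = null;

source.on('data', () => {
  errored = new Error('error'); // what destroy(err) records synchronously
  process.nextTick(() => source.emit('error', errored));
});
source.on('error', () => events.push('error'));
source.on('end', () => events.push('end'));

source.emit('data', 'foo');

// Defer the 'end' decision by one tick so a pending error can suppress it.
process.nextTick(() => {
  if (!errored) source.emit('end');
});

setImmediate(() => console.log(events)); // logs [ 'error' ]
```

With the same-tick emission the thread describes, 'end' would fire before the error check ever runs; deferring the check reverses the outcome.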

@mcollina mcollina added the confirmed-bug Issues with confirmed bugs. label Jul 28, 2021
@lpinca
Member

lpinca commented Jul 28, 2021

pipeline sees the stream as correctly ended; that's why it swallows the error. Its behavior is correct.

It's a duplex that emitted only 'end'; it is still writable, and the error is emitted before 'close'. Is this the expected behavior for pipeline? Shouldn't it wait for 'error' or 'close' on a duplex?

Edit: I guess pipeline does not care about the writable side, which makes sense, as there is no more data to read/pipe to the next stream.
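
That can be checked directly with a small sketch (an illustration mirroring the repro above, not Node internals): pipeline treats the first stream purely as a readable source, so its callback fires even though the duplex's writable side was never ended.

```javascript
import { Duplex, PassThrough, pipeline } from 'stream';

const duplex = new Duplex({
  read() {},
  write(chunk, encoding, callback) {
    callback();
  },
});

duplex.push('foo');
duplex.push(null); // ends only the readable side

let result;
pipeline(duplex, new PassThrough(), (err) => {
  result = { err, writableEnded: duplex.writableEnded };
  // prints "pipeline done; writable side ended: false"
  console.log('pipeline done; writable side ended:', duplex.writableEnded);
});
```

The callback runs with a falsy error while `duplex.writableEnded` is still false, confirming that only the readable side of the source matters to pipeline.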

@szmarczak
Member Author

@mcollina In the issue, the writable side has ended but the readable side has not.

@mcollina
Member

Edit: I guess pipeline does not care about the writable side which makes sense as there is no more data to read/pipe to the next stream.

@szmarczak from @lpinca words ^.

@szmarczak
Member Author

Looks like

if (!state.errorEmitted && !state.closeEmitted &&
should be state.errored instead of state.errorEmitted
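
For context, here is a hedged sketch of the guard being discussed (modeled on `endReadableNT` in `lib/internal/streams/readable.js`; the helper functions and the state object are illustrative, not Node's actual code). `state.errored` is set synchronously by `destroy(err)`, while `state.errorEmitted` only becomes true once 'error' has actually fired, so the proposed change suppresses 'end' in the window between the two.

```javascript
// Current guard: only blocks 'end' once 'error' has already been emitted.
function shouldEmitEndOld(state) {
  return !state.errorEmitted && !state.closeEmitted &&
         !state.endEmitted && state.length === 0;
}

// Proposed guard: blocks 'end' as soon as an error has been recorded.
function shouldEmitEndFixed(state) {
  return !state.errored && !state.closeEmitted &&
         !state.endEmitted && state.length === 0;
}

// State right after destroy(err) is called inside the last 'data' handler:
// the error is recorded, but 'error' has not yet been emitted.
const state = {
  errored: new Error('error'),
  errorEmitted: false,
  closeEmitted: false,
  endEmitted: false,
  length: 0,
};

console.log(shouldEmitEndOld(state));   // true:  'end' sneaks out first
console.log(shouldEmitEndFixed(state)); // false: 'end' is suppressed
```
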

@szmarczak
Member Author

@mcollina I believe in that case 'end' should never be emitted. https://nodejs.org/api/stream.html#stream_event_end_1

@szmarczak
Member Author

szmarczak commented Jul 30, 2021

http2 failed tests.txt: the tests that fail with state.errored in place of state.errorEmitted. All of the failing tests depend on the 'end' event; if they used 'close' instead, I think they would pass.

@szmarczak
Member Author

szmarczak commented Jul 30, 2021

Is the change semver-major or semver-minor?

@mcollina
Member

I would go with a patch, as it's a bad bug. I'd just wait a bit before backporting to the LTS lines (or possibly not backport at all).

@mcollina
Member

Looks like

if (!state.errorEmitted && !state.closeEmitted &&
should be state.errored instead of state.errorEmitted

I think so.


5 participants