
stream: error multiple iterators #29064

Closed

Conversation

@ronag (Member) commented Aug 9, 2019

Throw if creating multiple async iterators from the same stream.

Refs: #28997

Checklist
  • make -j4 test (UNIX), or vcbuild test (Windows) passes
  • tests and/or benchmarks are included
  • documentation is changed or added
  • commit message follows commit guidelines
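
Based on the description above and the ERR_STREAM_ITERATOR_EXISTS assertion in the tests reviewed below, a rough sketch of the intended behaviour (the stream and values are illustrative):

const { Readable } = require('stream');

(async () => {
  const r = Readable.from(['a', 'b', 'c']);

  // First iterator: consumes the stream as usual.
  for await (const chunk of r) {
    console.log(chunk); // 'a', 'b', 'c'
  }

  // Second iterator: with this change it errors instead of silently
  // completing (or competing for data on a live stream).
  try {
    for await (const chunk of r) {
      console.log(chunk); // never reached
    }
  } catch (err) {
    console.log(err.code); // 'ERR_STREAM_ITERATOR_EXISTS'
  }
})();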

@nodejs-github-bot added the labels errors (Issues and PRs related to JavaScript errors originated in Node.js core) and fs (Issues and PRs related to the fs subsystem / file system) on Aug 9, 2019
@ronag force-pushed the stream-error-multiple-iterators branch from ce96ebc to ac81e03 on August 9, 2019 10:06
doc/api/errors.md (review comment: outdated, resolved)
@ronag force-pushed the stream-error-multiple-iterators branch 3 times, most recently from 5f44ded to 8e36d58, on August 9, 2019 13:50
@trivikr changed the title from "Stream error multiple iterators" to "stream: error multiple iterators" on Aug 9, 2019
@ronag (Member, Author) commented Aug 12, 2019

@mcollina @lpinca There might be a valid case for multiple iterators when doing nested for loops. What do you think?

for await (const x of stream) {
  for await (const y of stream) {
  }
}

An alternative to this PR could be to always return the same iterator if one has already been created?

@mcollina added the semver-major label (PRs that contain breaking changes and should be released in the next major version) on Aug 12, 2019
@mcollina (Member) commented:

> @mcollina @lpinca There might be a valid case for multiple iterators when doing nested for loops. What do you think?

The internal loop is going to exhaust the stream anyway.

@mcollina (Member) left a review:

Good work! I've left a few things to change.
I would like to know what @benjamingr and @devsnek think about this one, as it's a tricky case.

  for await (const b of r) {
  }
} catch (err) {
  assert.strictEqual(err.code, 'ERR_STREAM_ITERATOR_EXISTS');
Inline review comment (Member):

This test would have to be split in two. The previous code verified that iterating on an ended stream completes; this would have to be tested without the first for await. Then, we should add a new test for the ERR_STREAM_ITERATOR_EXISTS error.

test/parallel/test-readline-async-iterators.js (review comment: outdated, resolved)
@benjamingr (Member) left a review:

I am not convinced we should do this. There are cases where creating multiple iterators might be valid, and there are cases where I'd want to create an iterator, consume part of the stream, throw the iterator away, and then create another iterator to consume the rest of the stream.

@benjamingr (Member) commented:

const cursor = database.query(...).cursor();
for await (const item of cursor) {
  if (item === 'special-item') {
    // found first special item
    report(item);
  }
}

for await (const item of cursor) {
  // do something with all future items
}

In this case the user might think they have a single iterable but they actually have two.

@mcollina (Member) commented:

> I am not convinced we should do this. There are cases where creating multiple iterators might be valid, and there are cases where I'd want to create an iterator, consume part of the stream, throw the iterator away, and then create another iterator to consume the rest of the stream.

This is not currently possible. Adding a break within a for await will call .destroy() on the stream to prevent leaks, and I think that's correct.
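
A minimal sketch of the behaviour described here, using Readable.from with illustrative values: breaking out of a for await...of loop destroys the stream, so the remaining data cannot be consumed afterwards.

const { Readable } = require('stream');

(async () => {
  const r = Readable.from(['a', 'b', 'c']);

  for await (const chunk of r) {
    if (chunk === 'a') break; // leaving the loop calls the iterator's return()
  }

  console.log(r.destroyed); // true - 'b' and 'c' can no longer be consumed
})();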

@benjamingr (Member) commented:

> This is not currently possible. Adding a break within a for await will call .destroy() on the stream to prevent leaks, and I think that's correct.

Oh yeah, I remember that :D

On second glance, that is correct behavior: a break triggers the iterator's return(), which finishes the stream.

@benjamingr dismissed their stale review on August 12, 2019 09:01:

not sure about this anymore.

@mcollina (Member) commented:

> not sure about this anymore.

Unfortunately this is a complex topic. Think about it a bit more, and let us know!

@benjamingr (Member) commented:

@mcollina do we actually have a choice though? We don't have any concept of multiple readables for a resource in core (do we?), and if someone wants to build a package that tees the stream to support this, then sure.

Is there actually an alternative other than throwing? (or erroring later)

@ronag (Member, Author) commented Aug 12, 2019

I'll sort out the PR comments once we have more of a consensus.

@mcollina (Member) commented:

> Is there actually an alternative other than throwing? (or erroring later)

I think returning the same iterator might be considered an option.
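
For illustration, a hypothetical sketch of that alternative: cache the iterator on the stream so every later request reuses it instead of throwing. The kIterator symbol and sharedAsyncIterator helper are invented names; this is not what the PR implements.

const { Readable } = require('stream');

// Invented names for this sketch only.
const kIterator = Symbol('kIterator');

function sharedAsyncIterator(stream) {
  if (stream[kIterator] === undefined) {
    // Create the built-in async iterator once and reuse it afterwards.
    stream[kIterator] = stream[Symbol.asyncIterator]();
  }
  return stream[kIterator];
}

(async () => {
  const r = Readable.from(['a', 'b', 'c']);
  const it1 = sharedAsyncIterator(r);
  const it2 = sharedAsyncIterator(r);

  console.log(it1 === it2);              // true
  console.log((await it1.next()).value); // 'a'
  console.log((await it2.next()).value); // 'b' - continues where it1 left off
})();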

@benjamingr (Member) commented:

I guess that makes sense, since it aligns with the behaviour people expect when reusing the same Readable for the same data source.

@ronag (Member, Author) commented Aug 12, 2019

> The internal loop is going to exhaust the stream anyway.

You could have a break... oh no... that will destroy the stream. So in this case we would need to reference count the number of createIterator vs. destroy-iterator requests before we destroy the stream?
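
Purely for illustration, one way such refcounting could look: share a single underlying iterator and only forward return() (which destroys the stream) once the last consumer has finished. The kState symbol and refCountedIterator helper are hypothetical; nothing like this was implemented.

// Hypothetical names for this sketch only.
const kState = Symbol('kRefCountState');

function refCountedIterator(stream) {
  if (stream[kState] === undefined) {
    stream[kState] = { iterator: stream[Symbol.asyncIterator](), refs: 0 };
  }
  const state = stream[kState];
  state.refs++;
  let released = false;

  return {
    next(...args) {
      return state.iterator.next(...args);
    },
    return(value) {
      if (!released) {
        released = true;
        state.refs--;
      }
      // Only forward return() (which destroys the stream) once the last
      // consumer is gone; earlier breaks just detach their own loop.
      if (state.refs === 0) return state.iterator.return(value);
      return Promise.resolve({ value, done: true });
    },
    [Symbol.asyncIterator]() {
      return this;
    }
  };
}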

@benjamingr (Member) commented:

> You could have a break... oh no... that will destroy the stream. So in this case we would need to reference count the number of createIterator vs. destroy-iterator requests before we destroy the stream?

Doing refcounting would just be confusing IMO. I am fine with either this behavior (throw) or Matteo's suggestion (same iterator).

@ronag (Member, Author) commented Aug 12, 2019

> Doing refcounting would just be confusing IMO.

It's just as confusing without refcounting :D. I totally got the nested behaviour wrong.

@devsnek (Member) left a review:

Generally I like this (we do the same thing in web streams), but we should have a way to release the lock somehow.
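
For reference, a minimal sketch of the web-streams locking behaviour referred to here, using the WHATWG ReadableStream API (not Node's stream module): getReader() takes an exclusive lock, a second getReader() throws, and releaseLock() releases the lock.

const stream = new ReadableStream();

const reader = stream.getReader();
console.log(stream.locked); // true

try {
  stream.getReader(); // a second reader on a locked stream throws
} catch (err) {
  console.log(err instanceof TypeError); // true
}

reader.releaseLock();       // explicitly release the lock...
console.log(stream.locked); // false
stream.getReader();         // ...and the stream can be acquired again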

lib/internal/streams/async_iterator.js (review comment: outdated, resolved)
@devsnek (Member) commented Aug 12, 2019

Also for the above point about wanting multiple consumers, the general approach from other languages and from web streams is to tee the origin stream with a buffered intermediary (https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream/tee)
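
A short sketch of that tee() approach, using the WHATWG ReadableStream API from the linked MDN page (not part of Node's stream module at the time of this PR): each branch buffers independently, so two consumers no longer compete for the same chunks.

const source = new ReadableStream({
  start(controller) {
    controller.enqueue('a');
    controller.enqueue('b');
    controller.close();
  }
});

// tee() returns two branches that each receive every chunk.
const [branch1, branch2] = source.tee();

async function drain(name, branch) {
  const reader = branch.getReader();
  let result;
  while (!(result = await reader.read()).done) {
    console.log(name, result.value); // each branch sees both 'a' and 'b'
  }
}

drain('first', branch1);
drain('second', branch2);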
