emit "jasmineDone" event after tests are complete #71

Merged: 1 commit merged into sindresorhus:master on Jun 18, 2016

Conversation

@jbblanchet (Contributor)

No description provided.

Review comment on:

  cb();
  t.end();

@jbblanchet (Contributor, Author)
xo error correction

@jbblanchet (Contributor, Author)

Solves #69

Review comment on:

  cb(error);
  } else {
  self.emit('end');
  cb();

@sindresorhus (Owner)
I don't get it. cb() should make end happen.

@jbblanchet (Contributor, Author)

This is what I thought too, but apparently I was wrong. Every time I think I've understood node streams, I learn that I really don't...

Apparently in gulp the end event is emitted by draining the stream with a writable stream (see rvagg/through2#31 and gulpjs/gulp#1637). I thought that transform streams were both readable AND writable, but I still haven't read the definitive book on streams (which I'm very interested in buying whenever one comes out), so I must be missing something.

@phated

I really think this is a bad pattern :( Maybe this should be a writable stream

@jbblanchet (Contributor, Author)

@phated Thanks for taking the time.

Ok, I've been reading to try to better understand the issue here, but I'm still confused. My understanding is that we're creating a transform stream, through the use of through2 and readable-stream, which is supposed to be a re-implementation of streams3. From the documentation, a transform stream is supposed to be both a readable AND a writable stream. So my understanding is that we already return a writable stream.

After reading your comment and re-reading the docs for the nth time, I thought that maybe the problem was that the data never starts flowing because we're missing a "data" handler or because we don't pipe to a writable stream (ref). But if that were the case, the flush method would never be called, and it is, since that's where I emit the end event.

I'm so sorry if I'm a little dense. I really, really want to understand streams.

@phated

I am still pretty bad with streams but getting better, I hope. A Writable stream is completely different from a Transform stream. There are 4 types of streams: Readable (must start a pipeline), Writable (must end a pipeline), Duplex (Readable + Writable, but not necessarily coupled), and Transform (a Readable & Writable stream where the two sides are coupled). I think the events (for a Transform stream) are best summed up by the Transform stream docs:

> Events: 'finish' and 'end'
> The 'finish' and 'end' events are from the parent Writable and Readable classes respectively. The 'finish' event is fired after stream.end() is called and all chunks have been processed by stream._transform(), 'end' is fired after all data has been output which is after the callback in stream._flush() has been called.

A transform stream can't end the pipeline, but since its inception gulp has recommended this pattern to users (either directly, or indirectly through plugins, due to miscommunication or an incorrect understanding of streams). This was handled by adding a "drain" (or "resume") step to the task execution, which happens if you return the stream, in both gulp 3.x and 4.0. On top of that, vinyl-fs changed at some point to use a Transform stream for its .dest() method, so streams would stop processing at 16 files unless returned and drained; we've recently gone through the paces of making that Transform stream act sometimes as a Writable and sometimes as a Transform (depending on whether it was piped). I do not recommend doing this, because it was an absolute hassle and I'm not completely sure it works in every case.
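
To make the "return the stream" pattern concrete, a minimal sketch of a gulp task using it (standard gulp API; the task name and glob are only illustrative):

const gulp = require('gulp');
const jasmine = require('gulp-jasmine');

// Returning the stream lets gulp drain/resume its readable side when the
// task runs, which is what allows 'end' to eventually be emitted.
gulp.task('test', () =>
  gulp.src('spec/*.js')
    .pipe(jasmine())
);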

@phated

Not sure if this is possible with jasmine, but I think a better pattern might be to have the Transform stream process all the test files and attach metadata, then create a Writable stream that is the reporter so you can do something like:

gulp.src('**/*.js').pipe(jasmine()).pipe(jasmine.report())
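
A rough sketch of what that split might look like, assuming through2 and a plain Writable; the names gulpJasmine and report, and the file.jasmine property, are illustrative rather than part of the plugin:

const through = require('through2');
const {Writable} = require('stream');

// Transform: collect per-file results and attach them to the vinyl file.
const gulpJasmine = () => through.obj((file, enc, cb) => {
  file.jasmine = {/* per-file results would be attached here */};
  cb(null, file);
});

// Writable: consumes the files and does the reporting. Because it ends the
// pipeline, its 'finish' event is a reliable "everything is done" hook.
const report = () => new Writable({
  objectMode: true,
  write(file, enc, cb) {
    console.log('reporting on', file.relative, file.jasmine);
    cb();
  }
});

// usage: gulp.src('**/*.js').pipe(gulpJasmine()).pipe(report());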

@jbblanchet (Contributor, Author)

I got it! I've finally got it! Took me long enough, but I understand what's happening. I'm sick today (which might have contributed to my epiphany), but I'll prepare something for tomorrow.

@jbblanchet (Contributor, Author)

Ok, I'm back after a long week-end, so here's my understanding of the nature of the problem.

The stream

We are implementing a transform stream, which is both a readable AND writable stream. However, it will only behave like a readable stream (and thus emit the end event) if we enter flowing mode by piping it to something else, calling the .resume method or attaching a data handler (source). This is the reason why the end event is emitted when we return the stream, but not when we simply attach an end handler.
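
A minimal sketch of that behaviour with a plain PassThrough stream (nothing plugin-specific):

const {PassThrough} = require('stream');

const stream = new PassThrough();
stream.on('end', () => console.log('end fired'));

stream.end('some data');

// Without this, nothing consumes the readable side, the stream never enters
// flowing mode, and 'end' is never emitted.
stream.resume();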

The end event

The end event is emitted by a readable stream when it has nothing more to push to the stream (source). Thus it is emitted AFTER the flush callback, because the flush might emit something in the stream. But it is not emitted if nothing is "listening" for data (see above).

The finish event

The finish event is emitted by a writable stream when it is done reading from upstream, but the documentation is misleading. It is emitted AFTER the flush method has been called, but BEFORE the flush callback has been called (source). So in our case, it is emitted after all the spec files have been passed to jasmine, but before jasmine is complete, and is not usable as a hook to check if jasmine is done.
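
A sketch of the ordering being described, as observed in this thread at the time; later Node versions changed when finish fires relative to _flush (see the nodejs/node issue mentioned further down), so the output may differ today:

const {Transform} = require('stream');

const stream = new Transform({
  objectMode: true,
  transform(file, enc, cb) {
    cb(null, file); // spec files are only collected here, not run
  },
  flush(cb) {
    // stand-in for running jasmine asynchronously inside _flush
    setTimeout(() => {
      console.log('jasmine would be done here; calling the flush callback');
      cb();
    }, 50);
  }
});

stream.on('finish', () => console.log('finish')); // fired too early to be a "done" hook at the time
stream.on('end', () => console.log('end'));       // needs a consumer of the readable side

stream.end({path: 'a.spec.js'});
stream.resume();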

Conclusion

Using finish is not a good option. Using end does not work reliably, because you need to have something listening downstream. I do believe that forcing the end event, while an imperfect solution, is still the best option here, but I'm willing to listen to other options.

@jbblanchet (Contributor, Author)

@sindresorhus I've been rereading @phated's solution, and if you think it's the way to go I might take a stab at implementing it too. It would be a major version change. Upstream specs would be converted to downstream reporter events (something like { event: 'specDone', payload: { result: { /* ... */ } } }) and 'applied' to all registered reporters. Reporter options would be passed to the writeReports method instead. Someone waiting for jasmine to finish would listen for the finish event after the writeReports call instead of end:

gulp.src('*.spec.js')
  .pipe(jasmine(/* jasmine options */))
  .pipe(jasmine.writeReports(/* reporters or default reporter options */))
  .on('finish', () => {
     // do whatever
     // finish is ok here, because it's not emitted by the _flush of the transform stream
     // but by the _end of the writable stream
  });
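
For illustration, a sketch of what such a writeReports writable might look like under this proposal (hypothetical, not an existing API; the event-object shape follows the description above):

const {Writable} = require('stream');

// Consumes the {event, payload} objects pushed downstream by the transform
// and applies them to the registered reporters. Since this writable ends the
// pipeline, 'finish' fires only after every reporter event has been handled.
const writeReports = (reporters = []) => new Writable({
  objectMode: true,
  write(chunk, enc, cb) {
    for (const reporter of reporters) {
      if (typeof reporter[chunk.event] === 'function') {
        reporter[chunk.event](chunk.payload);
      }
    }
    cb();
  }
});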

Personally what I don't like about this solution is that it kills the stream and you can't do anything else with the specs after they've passed through jasmine, but I can't find a use case where that would be useful.

If that's the way we want to go, I'd drop support for Node 0.10 and 0.12 at the same time.

I could have something ready for next week. Let me know if that's the way you want to go.

@sindresorhus (Owner)

Sure, sounds good to me.

> If that's the way we want to go, I'd drop support for Node 0.10 and 0.12 at the same time.

👍

> .pipe(jasmine.writeReports(/* reporters or default reporter options */))

I would name it .reporter() as that's what most of the similar plugins are using.

@jbblanchet (Contributor, Author)

Since we're overhauling the whole thing, should we also remove the default reporter at the same time? This would simplify the package and the API, but force you to import and initialize the reporter manually.

@phated (commented May 16, 2016)

@jbblanchet one clarification: instead of emitting events, just attach the results to the vinyl object they were generated from and pass it downstream to the reporter writable stream.

@sindresorhus (Owner)

@jbblanchet (Contributor, Author)

I'll check the metadata I have on the reporter calls, but we're running jasmine in the _flush method of the stream, so I'm not sure I'll be able to link every reporter call to a specific file. If doable I'll try to.

But yeah, I didn't mean "events" in the nodejs sense, I meant it in the abstract way. Something like:

cb(null, { event: 'specStarted', payload: result });

not

this.emit('specStarted', result);

@phated (commented May 16, 2016)

@jbblanchet using the callback just emits it as an event, and you break anything in between that expected vinyl files.

@jbblanchet (Contributor, Author)

What I meant is that it emits a 'data' event, not a 'specEvent'.

If I can only pass files downstream, I don't think this solution will work, because there isn't a strong link between specs and files like there is in jshint. We're running all the specs at once, not one spec file at a time.

Take the jasmineStarted and jasmineDone events. They're global to all the spec files. I can't attach them to a specific file.

@jbblanchet (Contributor, Author)

The other solution I could see is keeping the reporter callbacks in memory and emptying them all at once, but we would lose the "real-time" reporting that we currently have, and I'm not sure that's worth it.

It seems to me to be a complex solution for what might only be a stopgap, since there's talk of changing the problematic behavior of _flush (nodejs/node#2314), but I'm open to your ideas and willing to implement them.

@jbblanchet (Contributor, Author)

How about I simply replace the end event emitted in my pull request with a jasmineDone event? It's unorthodox, but it's both a valid solution and very simple, and it doesn't mess with the end and finish events. So someone wanting to do something after jasmine has completed would simply do:

gulp.src('*.spec.js')
  .pipe(jasmine())
  .on('jasmineDone', () => /* do whatever */);

What do you think?
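
For reference, a rough sketch of what that would mean inside the plugin's flush handler (simplified, not the actual diff; runJasmine is a stand-in, not a real API):

const through = require('through2');

module.exports = options => through.obj(
  (file, enc, cb) => cb(null, file), // collect the spec files
  function (cb) {
    // run jasmine here, then signal completion without touching the
    // regular 'end'/'finish' stream events
    runJasmine(options, () => {
      this.emit('jasmineDone');
      cb();
    });
  }
);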

jbblanchet changed the title from 'force "end" event after tests are complete' to 'emit "jasmineDone" event after tests are complete' on Jun 8, 2016
@jbblanchet (Contributor, Author)

I really do wish there was a :crickets: emoji in GitHub! 😄

Faced with the incredible popularity of my solutions, I've decided to pick the one that seemed to me both simple and unlikely to mess with what users of gulp-jasmine were expecting.

@sindresorhus, I'm leaving the decision whether to merge it or not to you.

@phated (commented Jun 13, 2016)

I really don't like custom events on streams.

@phated (commented Jun 13, 2016)

@jbblanchet why can't there be a writable stream that adds the reporter in flush? jasmine.report() -> no-ops the data handler but has the reporter logic in its flush.

@jbblanchet (Contributor, Author)

My problem with that solution is not where to put the logic, but how to pass the events from jasmine to this reporter. In the jshint example you provided, events are attached to their vinyl file, because jshint processes files, so the association is natural. But jasmine does not process files. It loads a bunch of files, then runs all the suites and specs in those files, and emits events for those suites.

While I might be able to link suites and specs back to specific files, I can't link the global jasmineStarted and jasmineDone events to any specific file.

There are solutions. I could arbitrarily link these events to the first and last files that have been globbed. I could keep the events in memory. I could push a fake file in the flush step that contains the jasmine events. But I'm not sure any of those hacks is better than simply emitting the jasmineDone event. It's not a perfect solution, but it's really simple, does not require our users to change their gulp tasks, and reflects what they really want to do, which is to wait for jasmine to be done by waiting for the jasmineDone event with on('jasmineDone', () => /* ... */ ).
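
Purely for illustration, the "push a fake file from flush" option mentioned above might look roughly like this (hypothetical; the PR does not do this, and collectedEvents is an assumed in-memory array of reporter events):

const path = require('path');
const through = require('through2');
const Vinyl = require('vinyl');

const collectedEvents = []; // filled by a custom jasmine reporter

module.exports = () => through.obj(
  (file, enc, cb) => cb(null, file),
  function (cb) {
    // push one synthetic file carrying all the jasmine events downstream
    this.push(new Vinyl({
      path: path.join(process.cwd(), 'jasmine-results.json'),
      contents: Buffer.from(JSON.stringify(collectedEvents))
    }));
    cb();
  }
);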

@sindresorhus (Owner)

I don't like this either, but I also don't have time to shepherd through a "correct" solution, so this is the most pragmatic option.

sindresorhus merged commit 26cb78a into sindresorhus:master on Jun 18, 2016
@sindresorhus (Owner)

Thanks @jbblanchet :)

@hebrerillo

Hi everyone!

The jasmineDone event is only triggered if all tests have passed. If any test fails, the event is not triggered. Am I right?

I need to know when all the tests have ended, regardless of failed tests. How can I know that information?

@jbblanchet (Contributor, Author)

@hebrerillo Can you open a separate issue? I think it makes sense to have the jasmineDone event sent even on error, but it might be a breaking change. I'll have a look next week.

jbblanchet deleted the emit-end branch on January 15, 2017 at 14:43