
Report not working when running parallel browsers #21

Closed
louiscbotha opened this issue Sep 8, 2015 · 7 comments
@louiscbotha (Contributor)

When running nightwatch tests with multiple browsers simultaneously, the exported report doesn't combine the output from all browsers into one. Currently only one of the reports is rendered as HTML.

nightwatch-html-reporter should automatically collect the reports from each of these browsers and combine them into a single report.

@louiscbotha (Contributor, Author)

Hmm, it seems I was wrong.

@Zeboch commented Apr 13, 2016

How do you manage to get multiple reports? I'm at the same dead end: I'm running tests in parallel browsers and only one report is generated.

@jls (Owner) commented Apr 13, 2016

Hello,
Are you running the reporter from the command line or using the reporter option in nightwatch/global.js?

One other question: how are you starting the parallel tests? From the nightwatch docs, it looks like if you run tests in parallel using the same browser, the reports get overwritten since the browser name is the same.
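To illustrate the collision described above (a hypothetical sketch, not the reporter's actual naming code): if the output path is derived only from the browser/environment name, two parallel workers that use the same browser compute the same path, and the later write clobbers the earlier one.

```javascript
// Hypothetical sketch: a report path derived solely from the browser
// name collides when two parallel workers run the same browser.
function reportPath(browserName) {
    return 'reports/' + browserName + '.html';
}

// Two workers, same browser — identical paths, so one report
// overwrites the other.
var worker1 = reportPath('chrome');
var worker2 = reportPath('chrome');
console.log(worker1 === worker2); // true
```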

@Zeboch commented Apr 14, 2016

I'm using the reporter option. But I found a workaround: this is my html-reporter.js file:

```js
var HtmlReporter = require('nightwatch-html-reporter');

/* Same options as when using the built in nightwatch reporter option */
var reporter = new HtmlReporter({
    openBrowser: false,
    reportsDirectory: 'src/e2e/javascript/reports/',
    reportFilename: 'report' + '_' + process.env.__NIGHTWATCH_ENV + '.html',
    themeName: 'outlook'
});

module.exports = {
    write: function(results, options, done) {
        reporter.fn(results, done);
    }
};
```

Meaning it's fine as long as reportFilename is different for every report. And if you run tests in parallel using the same browser, I suggest adding a timestamp ;)
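As a sketch of the timestamp suggestion (the helper name is hypothetical, and the filename shape mirrors the example above): a timestamp combined with the environment name keeps filenames unique even when several workers share the same browser.

```javascript
// Sketch of the timestamp suggestion above. uniqueReportFilename is a
// hypothetical helper; it builds a per-worker filename from the
// environment name (Nightwatch sets __NIGHTWATCH_ENV in parallel mode)
// plus a timestamp, so same-browser workers don't collide.
function uniqueReportFilename(env, date) {
    // Replace ':' and '.' so the timestamp is filesystem-safe.
    var stamp = date.toISOString().replace(/[:.]/g, '-');
    return 'report_' + (env || 'default') + '_' + stamp + '.html';
}

console.log(uniqueReportFilename(process.env.__NIGHTWATCH_ENV, new Date()));
```

This filename could then be passed as the reportFilename option in the html-reporter.js workaround above.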

@jls (Owner) commented Apr 14, 2016

Perfect, thank you! If you don't mind, I'll add a section to the readme that outlines your solution in case others try to do the same thing.

@Zeboch commented Apr 14, 2016

I think it's not a perfect solution but if it helps, no problemo ;)

jls pushed a commit that referenced this issue May 27, 2016
For issue #21 and issue #31 this allows parallel tests to run without overwriting generated reports.
@nestoru commented Feb 19, 2017

@jls Thanks for this package.

@louiscbotha I think you closed the issue prematurely. If multiple tests are run in parallel in a multicore machine the issue is evident.

Given that Nightwatch does create individual JUnit XML report files per browser and test file name, I believe it makes sense to support generating a single HTML file.

I created https://github.com/nestoru/nightwatch-lean-e2e, partly to demonstrate the need for keeping the results in just one file. I think that having a single report.html containing the results of all tests is a real need.

For now, something like the below can be used to capture the results in ANSI and HTML, as explained in the nightwatch-lean-e2e project. Note the use of the environment variable $APP_VERSION, which makes it possible to keep a history of each e2e run per deployed app version:

```sh
RPT_DIR=reports/$APP_VERSION
mkdir -p "$RPT_DIR"
nightwatch --config nightwatch.conf.js | tee "$RPT_DIR/console-report.ansi"
cat "$RPT_DIR/console-report.ansi" | ansi-to-html -n > "$RPT_DIR/console-report.html"
```
