CodeCoverage always appends data #105

Closed
milo opened this Issue · 5 comments

2 participants

@milo
Collaborator

CodeCoverage always appends data to the coverage.dat file. But this file may remain from a previous run, so its data may be outdated. As a result, the coverage report generates nonsense.

Maybe related to #103.

@milo
Collaborator

Proposal
The Runner already sets the env var NETTE_TESTER_RUNNER=1.
This would be changed to the Runner's PID: NETTE_TESTER_RUNNER=$pid.
CodeCoverage::save() then checks this env var. If it does not exist or does not match, the whole coverage.dat is replaced.

Note
This will not work when the Runner is instantiated and used inside a test (e.g. Tester's own tests). The NETTE_TESTER_RUNNER env var would therefore have to be inherited.
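The proposed check could look roughly like this. This is only a sketch with illustrative names (saveCoverage, the 'pid'/'data' layout of coverage.dat), not the actual Tester API: old data is kept only when the PID stored in the file matches the current NETTE_TESTER_RUNNER value, otherwise the file is replaced.

```php
<?php

// Hypothetical sketch of the proposal: the Runner exports its PID in
// NETTE_TESTER_RUNNER, and save() replaces coverage.dat whenever the
// stored PID does not match (i.e. the data comes from an older run).
function saveCoverage(string $file, array $newData): void
{
	$runnerPid = getenv('NETTE_TESTER_RUNNER');

	$coverage = [];
	if (is_file($file)) {
		$stored = unserialize(file_get_contents($file));
		// Keep previously collected data only if it belongs to the same Runner process.
		if ($runnerPid !== false && ($stored['pid'] ?? null) === $runnerPid) {
			$coverage = $stored['data'];
		}
	}

	// Merge the new coverage lines into whatever survived the PID check.
	foreach ($newData as $fileName => $lines) {
		$coverage[$fileName] = ($coverage[$fileName] ?? []) + $lines;
	}

	file_put_contents($file, serialize(['pid' => $runnerPid, 'data' => $coverage]));
}
```

With this, two save() calls within one Runner run accumulate data, while a run under a different PID starts the file from scratch.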

@milo milo added the bug label
@dg
Owner
dg commented

What about storing and checking filemtime in Collector::save()?
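A minimal sketch of that idea (the function name isCoverageStale and the stored-mtime map are illustrative, not the real Collector API): record each source file's filemtime when coverage is saved, so a later consumer can detect that a file changed after the data was collected.

```php
<?php

// Hypothetical staleness check: $storedMtimes maps covered file paths
// to the mtime recorded at save() time. Coverage is stale if any file
// has been modified or removed since then.
function isCoverageStale(array $storedMtimes): bool
{
	foreach ($storedMtimes as $file => $mtime) {
		if (!is_file($file) || filemtime($file) !== $mtime) {
			return true;
		}
	}
	return false;
}
```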

@milo
Collaborator

It may lie when a test changes, e.g. some assertions are removed but the code stays marked as covered.

Or this situation: first run tester Nette, then run tester Nette\Database. The first run fills coverage.dat; the second adds only the Database files, and the coverage report shows the whole framework as covered even though the last run tested Database only.

This appending can also be desired. Make it configurable via Collector::save($file, $append = FALSE)?
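The suggested signature could behave like this. Again a sketch under the proposed (not yet existing) API, with saveCollected standing in for Collector::save: by default the file is replaced, and merging with existing data happens only when $append is TRUE.

```php
<?php

// Sketch of the configurable-append proposal, mirroring the suggested
// Collector::save($file, $append = FALSE) signature: replace by default,
// merge into the existing data only on explicit request.
function saveCollected(string $file, array $data, bool $append = false): void
{
	if ($append && is_file($file)) {
		$existing = unserialize(file_get_contents($file));
		// Merge per-file line coverage, preferring freshly collected lines.
		foreach ($data as $name => $lines) {
			$data[$name] = $lines + ($existing[$name] ?? []);
		}
		$data += $existing;   // keep files covered only by earlier runs
	}
	file_put_contents($file, serialize($data));
}
```

This keeps the current behavior reachable (pass TRUE) while making the safer replace mode the default.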

@dg
Owner
dg commented

Yeah, I understand. Checking filemtime may be useful for the ReportGenerator to ensure that coverage.dat corresponds to the source files.

And about appending/replacing: it might be interesting to add an option for generating code coverage directly to the Tester. The path to coverage.dat would be passed via an environment variable; collection would be started automatically in Environment::setup, and in that case it would switch to append mode.

After that Tester can generate HTML output.

@dg
Owner
dg commented

Closing, it is a feature ;-) #106 (comment)

@dg dg closed this