
Conversation

@danchr (Contributor) commented Mar 15, 2019

This PR adjusts expected failures to be reported as skips, includes captured output in successful tests, and makes the verbose output more consistent with unittest.

danchr added 4 commits March 14, 2019 17:19
Generally speaking, expected occurrences are lowercase whereas
anything exceptional and worthy of notice is uppercase. Examples of
the former include tests passing, skips and expected failures;
examples of the latter include failures, errors and unexpected
successes.
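For illustration (a sample of my own, not taken from the commit), verbose output following this convention looks roughly like:

    test_one (tests.TestFoo) ... ok
    test_two (tests.TestFoo) ... skipped 'not implemented'
    test_three (tests.TestFoo) ... expected failure
    test_four (tests.TestFoo) ... FAIL
    test_five (tests.TestFoo) ... ERROR
    test_six (tests.TestFoo) ... unexpected success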
@coveralls commented Mar 15, 2019

Coverage decreased (-0.3%) to 99.213% when pulling 7c4d6a2 on danchr:expected into ed3fb0b on xmlrunner:master.

testinfo = self.infoclass(self, test, self.infoclass.ERROR, err)  # before: expected failures reported as errors
testinfo.test_exception_name = 'ExpectedFailure'
testinfo.test_exception_message = 'EXPECTED FAILURE: {}'.format(testinfo.test_exception_message)
testinfo = self.infoclass(self, test, self.infoclass.SKIP, err)  # after (this PR): reported as skips
Member

https://docs.python.org/3/library/unittest.html#unittest.expectedFailure

> Mark the test as an expected failure. If the test fails it will be considered a success. If the test passes, it will be considered a failure.

Can you please elaborate why it should be marked as SKIP instead of ERROR or FAILURE?
To me a failure is a failure, expected or not; so I'm not sure I understand the change.

Member

Can you actually split the changes and address this semantic change separately? I'm okay merging the rest.

Contributor

I'll say that for Django's test suite, a recent change in unittest-xml-reporting is making Jenkins mark our builds as "Unstable" rather than "Success", due to tests marked with @unittest.expectedFailure failing (as expected).

Member

@timgraham, this could be unrelated to this PR but related to some of my changes. Please file a ticket with more info... I don't see django/django on Travis, so I'm not sure which CI is used.

Contributor

What I meant is that I think this change would fix that issue. Possibly the regression is due to cc05679. Our CI is at https://djangoci.com/.

Contributor Author

> https://docs.python.org/3/library/unittest.html#unittest.expectedFailure
>
> > Mark the test as an expected failure. If the test fails it will be considered a success. If the test passes, it will be considered a failure.
>
> Can you please elaborate why it should be marked as SKIP instead of ERROR or FAILURE?
> To me a failure is a failure, expected or not; so I'm not sure I understand the change.

An expected failure is a way of marking a test as “incorrect” — a good example might be a test that is itself correct, but which exercises functionality that isn't currently working. The entire point of an expected failure is to keep the test suite passing while semantically marking a test as “wrong” or “broken”.

Compared to the obvious alternative of skipping the test, the main feature of expectedFailure is that the test run actually fails if the test suddenly starts working, so that change can't go unnoticed. From the CPython 3.7 sources:

    def wasSuccessful(self):
        """Tells whether or not this result was a success."""
[snip]
        return ((len(self.failures) == len(self.errors) == 0) and
                (not hasattr(self, 'unexpectedSuccesses') or
                 len(self.unexpectedSuccesses) == 0))
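To make that concrete, here is a minimal example of my own (not from the PR or CPython) showing both behaviours:

    import unittest

    class Example(unittest.TestCase):
        @unittest.expectedFailure
        def test_known_bug(self):
            # Fails, as expected: recorded as an expected failure,
            # and the run as a whole still counts as successful.
            self.assertEqual(1 + 1, 3)

        @unittest.expectedFailure
        def test_fixed_bug(self):
            # Passes: recorded as an unexpected success, so
            # wasSuccessful() returns False and the run fails.
            self.assertEqual(1 + 1, 2)

    if __name__ == '__main__':
        unittest.main(verbosity=2)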

Given that JUnit XML does not support this feature, a “skip” is a reasonably close approximation, although one might also mark it as a success. The latter feels a bit wrong to me. An unexpected success, however, is a clear failure.
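For what it's worth, a sketch of how such a test case might then appear in the JUnit XML (element and attribute names are illustrative; the exact output depends on the reporter):

    <testcase classname="tests.TestFoo" name="test_known_bug" time="0.001">
      <skipped message="EXPECTED FAILURE: ..."/>
    </testcase>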

As with Django, the motivation for me was that our test suite suddenly started becoming “unstable” on Jenkins — I suspect that wasn't the case with the code I originally submitted, but you never know…

'message',
test_result.test_exception_message
)
if test_result.stdout:
Member

nit: PEP 8: we shouldn't use `if x` when we mean `if x is not None`
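Applied to the line above, the suggestion would read something like:

    if test_result.stdout is not None: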

@codecov-io
Codecov Report

Merging #186 into master will decrease coverage by 0.62%.
The diff coverage is 87.67%.


@@            Coverage Diff             @@
##           master     #186      +/-   ##
==========================================
- Coverage   99.49%   98.87%   -0.63%     
==========================================
  Files          17       17              
  Lines        1393     1595     +202     
==========================================
+ Hits         1386     1577     +191     
- Misses          7       18      +11
    Impacted Files        Coverage          Δ
    xmlrunner/result.py   98.49% <100%>     (+0.16%) ⬆️
    tests/testsuite.py    98.25% <74.28%>   (-1.75%) ⬇️

Continue to review full report at Codecov.

Legend:
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update ed3fb0b...7c4d6a2. Read the comment docs.

self.assertIn(b'<error', outdir.getvalue())
self.assertNotIn(b'<skip', outdir.getvalue())

def test_xmlrunner_safe_xml_encoding_name(self):
Member

duplicate test, issue with coverage

@dnozay merged commit 7c4d6a2 into xmlrunner:master Mar 17, 2019