
Don't expect so many failures #161

Open
vbabiy opened this Issue Mar 15, 2011 · 13 comments

Comments

@vbabiy
Contributor

vbabiy commented Mar 15, 2011

I can only guess that test_basic.py has so many expect_failure=True's out of a desire to be resilient when the testing environment doesn't have tools like SVN installed. Really, though, if there's a failure in [the SVN command](http://bitbucket.org/ianb/pip/src/tip/tests/test_basic.py#cl-50), it's unlikely that any of the following lines will pass. As it is, the instruction to skip looks like obfuscation at best and, at worst, something that could let a real problem slip by undetected. The right answer is probably to use a mechanism to skip tests whose dependencies aren't installed (and to have a mode where everything is run anyway, so we can find out what's needed).
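(For illustration only: one way to do the kind of dependency-based skipping suggested above, assuming a pytest-style suite; the `need_svn` marker, the test name, and the `script` fixture are hypothetical here, not pip's actual code.)

```python
import shutil

import pytest

# Skip the test when svn is not on PATH, instead of letting it fail and
# papering over the failure with expect_failure=True.
need_svn = pytest.mark.skipif(
    shutil.which("svn") is None,
    reason="svn is not installed",
)

@need_svn
def test_install_editable_from_svn(script):
    # body elided; 'script' stands in for a scripttest-based fixture
    ...
```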


@vbabiy
Contributor

vbabiy commented Mar 15, 2011

I think I misinterpreted what's going on here. In most of these cases we
really should expect an error. In fact, it's stronger than that: we want
an error. The problem is that scripttest has no way to express that: it only
has a way of ignoring errors, but no way of insisting on them (i.e. asserting
if there's no error). As a result it just masks the cases where we expect an
error but in reality don't get one, and there's nothing checking to make
sure we haven't just liberally sprinkled expect_error=True even where it
really doesn't belong.


Original Comment By: Dave Abrahams
@vbabiy
Contributor

vbabiy commented Mar 15, 2011

Dave: was this cleaned up to your satisfaction in your branch, now merged? Or
is there reason to leave this open? Resolving unless you say otherwise.


Original Comment By: Carl Meyer
@vbabiy
Contributor

vbabiy commented Mar 15, 2011

No, it wasn't cleaned up systematically. What should happen is that when
expect_error=True, the test framework raises an AssertionError unless an
error actually did occur. That would immediately reveal all the places where
it's been added erroneously.


Original Comment By: Dave Abrahams
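(A minimal sketch of what's being described, assuming scripttest's TestFileEnvironment.run() and its result's returncode attribute; the run_expecting_error helper is hypothetical, not something pip or scripttest provides.)

```python
from scripttest import TestFileEnvironment

def run_expecting_error(env, *args, **kwargs):
    """Run a command and insist that it actually fails."""
    # expect_error=True only stops scripttest from raising on a non-zero
    # exit code; the assertion below adds the missing, stronger check.
    result = env.run(*args, expect_error=True, **kwargs)
    assert result.returncode != 0, (
        "expected %r to fail, but it exited successfully" % (args,)
    )
    return result

# Hypothetical usage:
#   env = TestFileEnvironment("./scratch")
#   run_expecting_error(env, "pip", "install", "does-not-exist")
```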
@vbabiy
Contributor

vbabiy commented Mar 15, 2011

  • Changed status from resolved to open.

Original Comment By: Dave Abrahams
@vbabiy
Contributor

vbabiy commented Mar 15, 2011

  • Changed responsible from nobody to carljm.

Original Comment By: Dave Abrahams
@dstufft
Member

dstufft commented Jan 30, 2014

Pretty sure this is long since no longer applicable.

@dstufft dstufft closed this Jan 30, 2014

@qwcode
Contributor

qwcode commented Jan 30, 2014

We still have many cases of expect_failure=True. I'm guessing a number of those aren't needed, but I'd have to look at them more closely.

@dstufft
Member

dstufft commented Jan 30, 2014

Oh, whoops. I totally misread this ticket: I thought it was talking about expected failures of the test suite, not of scripttest. That's what I get for not reading closely enough.

@dstufft dstufft reopened this Jan 30, 2014

@dstufft
Member

dstufft commented Mar 22, 2017

Just to be clear, the flag is called expect_error. I do believe we still have many instances of it, and most of them are invalid and should be removed.

@luojiebin
Contributor

luojiebin commented Apr 5, 2017

Should we remove all tests that use expect_error? Is there a standard for identifying which uses are invalid?

@xavfernandez
Contributor

xavfernandez commented Apr 5, 2017

The easy fix I had in mind was to check whether the tests were in fact passing, but I don't think we have any tests unexpectedly passing. So never mind the easy fix ^^

@pradyunsg
Member

pradyunsg commented May 16, 2017

What's the appropriate action to take to resolve this then?

@dstufft
Member

dstufft commented May 19, 2017

All over the place there are tests that call script.pip (or similar) with an expect_error=True or expect_stderr=True they don't need. The way to resolve this ticket is for someone to go through and remove any of those flags that aren't needed for the tests to pass.
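(Illustrative only, with a made-up package name: the kind of mechanical change being asked for. If the command actually succeeds and writes nothing to stderr, the flag is dead weight and hides regressions.)

```python
# Before: the flag silently tolerates a failure that should never happen.
result = script.pip('install', 'example-package', expect_error=True)

# After: any non-zero exit code fails the test immediately.
result = script.pip('install', 'example-package')
```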

@pradyunsg pradyunsg added the T: DevOps label Jun 26, 2017

@pradyunsg pradyunsg assigned pradyunsg and unassigned pradyunsg Aug 17, 2017

@pradyunsg pradyunsg self-assigned this Jun 6, 2018

@pradyunsg pradyunsg added this to the Internal Cleansing milestone Jun 6, 2018
