TST: Fix locally failing tests. #1549

Merged
merged 6 commits into statsmodels:master from jseabold:test-fails on Apr 4, 2014


2 participants

@jseabold
Member
jseabold commented Apr 3, 2014

This partially addresses #1521. I still don't see the grouputils failures. Is that pandas 0.11.0? If so, we can close that because we have 0.12.0 as the minimum now.

@jseabold
Member
jseabold commented Apr 3, 2014

Should the GMM tests marked with np.testing.decorators.knownfailureif give an error? They do for me locally. It looks like the decorator works, but instead of being reported as a known failure, the test is reported as an error for me.

@josef-pkt
Member

When I run nosetests xxx on the command line, I always get the known failures reported as errors, IIRC. But they don't show up as errors in the python shell.

I haven't checked in a while, but it was because of the way numpy.testing and nose interact.
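For context, a minimal sketch of the decorator pattern under discussion (the test body is hypothetical, not the actual GMM test): `knownfailureif` works by raising a `KnownFailureTest` exception, which only the nose plugin shipped with numpy.testing knows how to report as a known failure. A bare `nosetests` run does not load that plugin, so the exception surfaces as an ordinary error, whereas running the suite through numpy's own test runner from the python shell does load it, which matches the behavior described above.

```python
import numpy as np
from numpy.testing import decorators

@decorators.knownfailureif(True, "hypothetical known GMM discrepancy")
def test_gmm_known_issue():
    # stand-in assertion; the real tests compare GMM results
    np.testing.assert_allclose(1.0, 2.0)
```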

@jseabold
Member
jseabold commented Apr 3, 2014

Yeah, that's what I see too. Not ideal for local testing.

I removed the generator in test_multi. All of the tests still run, but verbose mode no longer lists the whole shebang.
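As a hedged illustration of what removing the generator changes (the names here are hypothetical, not the actual test_multi code): a nose generator test yields one case per parameter set, and verbose mode lists each yielded case as its own test; unrolling it into a plain loop runs the same checks under a single test name.

```python
# nose generator style: verbose mode lists one entry per yielded case
def test_all_results_generator():
    for res in all_results:        # hypothetical iterable of result objects
        yield check_result, res    # nose calls check_result(res) for each

# unrolled style: the same checks run, but only one test name is listed
def test_all_results():
    for res in all_results:
        check_result(res)
```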

@jseabold
Member
jseabold commented Apr 3, 2014

I'm punting on that. We should probably do conditional skips. I could've sworn we had known failures before that didn't raise an error from the command line.

@josef-pkt
Member

I think we only use skipif, to avoid knownfailure.

@jseabold
Member
jseabold commented Apr 4, 2014

Changed knownfailure to skipif.
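For reference, a minimal sketch of the swap (the condition and message are illustrative, not the actual code): `skipif` raises nose's `SkipTest`, which nose itself understands, so the test is reported as a skip both from the command line and from the python shell, avoiding the plugin dependency that made `knownfailureif` show up as an error.

```python
from numpy.testing import decorators

@decorators.skipif(True, "hypothetical skip condition replacing knownfailureif")
def test_gmm_known_issue():
    ...
```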

@jseabold jseabold merged commit 726bd90 into statsmodels:master Apr 4, 2014

1 check passed

continuous-integration/travis-ci The Travis CI build passed
@jseabold jseabold deleted the jseabold:test-fails branch Apr 4, 2014
@josef-pkt josef-pkt added the PR label Apr 14, 2014