TestRegressionNM.test_ci_beta2 i386 AssertionError #1831

Closed
yarikoptic opened this Issue Jul 16, 2014 · 5 comments


@yarikoptic
Contributor

There are quite a few issues about test_ci_beta2, but I don't think this one has been reported specifically. It fails only on 32-bit (OK on amd64):

======================================================================
FAIL: statsmodels.emplike.tests.test_regression.TestRegressionNM.test_ci_beta2
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python2.7/dist-packages/nose/case.py", line 197, in runTest
    self.test(*self.arg)
  File "/tmp/buildd/statsmodels/statsmodels/emplike/tests/test_regression.py", line 150, in test_ci_beta2
    assert_almost_equal(beta2ci, self.res2.test_ci_beta2, 6)
  File "/usr/lib/python2.7/dist-packages/numpy/testing/utils.py", line 454, in assert_almost_equal
    return assert_array_almost_equal(actual, desired, decimal, err_msg)
  File "/usr/lib/python2.7/dist-packages/numpy/testing/utils.py", line 811, in assert_array_almost_equal
    header=('Arrays are not almost equal to %d decimals' % decimal))
  File "/usr/lib/python2.7/dist-packages/numpy/testing/utils.py", line 644, in assert_array_compare
    raise AssertionError(msg)
AssertionError:
Arrays are not almost equal to 6 decimals

(mismatch 50.0%)
 x: array([ 0.60120459,  2.18481462])
 y: array([ 0.60120459,  2.18470794])

----------------------------------------------------------------------
Ran 16 tests in 171.578s
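
The mismatch above is a floating-point precision issue: the upper confidence bound differs between the i386 and amd64 builds at about the 4th decimal, so a fixed 6-decimal comparison fails. A minimal sketch (not the actual statsmodels fix), using the two arrays from the traceback, showing how a relative-tolerance check such as `numpy.testing.assert_allclose` would be robust to this kind of platform-dependent rounding:

```python
# Sketch reproducing the failing comparison with the arrays from the
# traceback above. The fixed 6-decimal check fails, while a relative
# tolerance of 1e-4 (an illustrative value, not taken from the issue)
# accepts both results.
import numpy as np
from numpy.testing import assert_almost_equal, assert_allclose

x = np.array([0.60120459, 2.18481462])  # result reported on i386
y = np.array([0.60120459, 2.18470794])  # expected value (amd64)

try:
    assert_almost_equal(x, y, decimal=6)
    strict_passes = True
except AssertionError:
    strict_passes = False

assert not strict_passes          # fails at 6 decimals, as in the report
assert_allclose(x, y, rtol=1e-4)  # passes with a relative tolerance
```

Whether to loosen the tolerance or fix the underlying convergence is a judgment call; the discussion below suggests the root cause was non-convergence in the optimizer rather than the comparison itself.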
@josef-pkt josef-pkt added this to the 0.5.1 milestone Jul 16, 2014
@josef-pkt
Member

I will try one more time to fix the test. If I don't manage, then we will skip this test for now.

@josef-pkt
Member

The last merge, #1847, fixed the same failure on Ubuntu i386.

I never managed to replicate this failure, but I found some other cases with non-convergence that are fixed now.

@yarikoptic Can you run the test for current master on the failing machine, to see if it doesn't fail anymore?

The last test run for this commit at http://nipy.bic.berkeley.edu/waterfall?category=statsmodels seems to have unrelated problems, so I cannot tell whether it passes there as well. It should.

@yarikoptic
Contributor

On Mon, 28 Jul 2014, Josef Perktold wrote:

> @yarikoptic Can you run the test for current master on the failing
> machine, to see if it doesn't fail anymore?

To make it more productive, maybe I will just do another snapshot
package from the current master and see where it builds/fails.


@jseabold jseabold modified the milestone: 0.6, 0.5.1 Sep 20, 2014
@jseabold
Member

Is this still relevant, or has it been fixed?

@josef-pkt
Member

I think it's fixed, and the problem doesn't show up on any of our test machines anymore. But I haven't gotten confirmation from @yarikoptic that it works on the two exotic Debian machines where it failed in 0.5.

Reopen if necessary.

@josef-pkt josef-pkt closed this Sep 20, 2014