test failures with the current master #3330

Closed
ev-br opened this Issue · 15 comments

4 participants

@ev-br
Collaborator
>>> from scipy.sparse import test
>>> test() 

gives a bunch of errors of this sort:

======================================================================
ERROR: test_base.TestLIL.test_sum(<type 'numpy.complex128'>, 4)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/br/virtualenvs/scipy-dev/local/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
    self.test(*self.arg)
  File "/home/br/virtualenvs/scipy-dev/local/lib/python2.7/site-packages/scipy/sparse/tests/test_base.py", line 654, in check
    assert_array_almost_equal(dat.sum(axis=-2), datsp.sum(axis=-2))
  File "/home/br/virtualenvs/scipy-dev/local/lib/python2.7/site-packages/numpy/matrixlib/defmatrix.py", line 435, in sum
    return N.ndarray.sum(self, axis, dtype, out)._align(axis)
  File "/home/br/virtualenvs/scipy-dev/local/lib/python2.7/site-packages/numpy/matrixlib/defmatrix.py", line 376, in _align
    raise ValueError, "unsupported axis"
ValueError: unsupported axis

----------------------------------------------------------------------
Ran 8645 tests in 80.713s

FAILED (KNOWNFAIL=251, SKIP=997, errors=246)
<nose.result.TextTestResult run=8645 errors=246 failures=0>
>>> import scipy
>>> scipy.__version__
'0.14.0.dev-9812e4c'
>>> import numpy
>>> numpy.__version__
'1.6.2'

This all is with python 2.7.3 on 64 bit Ubuntu Precise with stock atlas.

@ev-br ev-br added the scipy.sparse label
@rgommers rgommers added the defect label
@rgommers rgommers added this to the 0.14.0 milestone
@pv
Owner
pv commented

Numpy bug, which seems to have been (accidentally) fixed by a43d255dbc in v1.7.0.
This should be guarded with a Numpy version check in the Scipy tests.

@ev-br
Collaborator

Indeed, with numpy 1.8 (different machine, similar setup) sparse tests all pass. There are several pages of DeprecationWarnings though:

/home/br/installs/scipy/build/testenv/lib/python2.7/site-packages/scipy/sparse/tests/test_base.py:1619: DeprecationWarning: Implicitly casting between incompatible kinds. In a future numpy release, this will raise an error. Use casting="unsafe" if this is intentional.
  y += b
@pv
Owner
pv commented

The DeprecationWarnings should go away with Numpy 1.9, where += ends up calling __numpy_ufunc__.

@rgommers
Owner

There are more issues in sparse with numpy 1.5.1:

FAILED (KNOWNFAIL=179, SKIP=961, errors=194, failures=355)

A lot of these:

======================================================================
FAIL: Failure: AssertionError (
Not equal to tolerance rtol=1e-07, atol=0

(mismatch 100.0%)
 x: array([[ 1.,  0.,  0.,  2.],
       [ 3.,  0.,  1.,  0.],
       [ 0.,  2.,  0.,  0.]])
 y: array([[ 0.,  0.,  0.,  4.],
       [ 0.,  0.,  2.,  0.],
       [ 0., -4.,  0.,  0.]]))
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/rgommers/.local/lib/python2.7/site-packages/nose/loader.py", line 518, in makeTest
    return self._makeTest(obj, parent)
  File "/home/rgommers/.local/lib/python2.7/site-packages/nose/loader.py", line 577, in _makeTest
    return MethodTestCase(obj)
  File "/home/rgommers/.local/lib/python2.7/site-packages/nose/case.py", line 345, in __init__
    self.inst = self.cls()
  File "/home/rgommers/Code/scipy/scipy/sparse/tests/test_base.py", line 144, in __init__
    self.datsp = self.spmatrix(self.dat)
  File "/home/rgommers/Code/scipy/scipy/sparse/tests/test_base.py", line 3554, in spmatrix
    assert_allclose(M.A, NC.A)
  File "/home/rgommers/.local/lib/python2.7/site-packages/numpy/testing/utils.py", line 1130, in assert_allclose
    verbose=verbose, header=header)
  File "/home/rgommers/.local/lib/python2.7/site-packages/numpy/testing/utils.py", line 618, in assert_array_compare
    raise AssertionError(msg)
AssertionError: 
Not equal to tolerance rtol=1e-07, atol=0

(mismatch 100.0%)
 x: array([[ 1.,  0.,  0.,  2.],
       [ 3.,  0.,  1.,  0.],
       [ 0.,  2.,  0.,  0.]])
 y: array([[ 0.,  0.,  0.,  4.],
       [ 0.,  0.,  2.,  0.],
       [ 0., -4.,  0.,  0.]])

And some of each of these three types of errors:

======================================================================
FAIL: test_base.Test64Bit.test_resiliency_limit_10(<class 'test_base.TestCSR'>, 'test_set_slice')
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/rgommers/.local/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
    self.test(*self.arg)
  File "/home/rgommers/.local/lib/python2.7/site-packages/numpy/testing/decorators.py", line 146, in skipper_func
    return f(*args, **kwargs)
  File "<string>", line 2, in check
  File "/home/rgommers/Code/scipy/scipy/sparse/tests/test_base.py", line 93, in deco
    return func(*a, **kw)
  File "/home/rgommers/Code/scipy/scipy/sparse/tests/test_base.py", line 3741, in check
    getattr(instance, method_name)()
  File "/home/rgommers/Code/scipy/scipy/sparse/tests/test_base.py", line 2065, in test_set_slice
    assert_array_equal(A.todense(), B, repr(a))
  File "/home/rgommers/.local/lib/python2.7/site-packages/numpy/testing/utils.py", line 686, in assert_array_equal
    verbose=verbose, header='Arrays are not equal')
  File "/home/rgommers/.local/lib/python2.7/site-packages/numpy/testing/utils.py", line 618, in assert_array_compare
    raise AssertionError(msg)
AssertionError: 
Arrays are not equal
slice(3, None, None)
(mismatch 100.0%)
 x: matrix([[ 0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.],
        [ 1.,  1.,  1.,  1.,  1.,  1.,  1.,  1.,  1.,  1.],
        [ 0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.],...
 y: matrix([[ 0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.],
        [ 1.,  1.,  1.,  1.,  1.,  1.,  1.,  1.,  1.,  1.],
        [ 0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.],...

======================================================================
ERROR: test_base.Test64Bit.test_resiliency_limit_10(<class 'test_base.TestCSR'>, 'test_getnnz_axis')
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/rgommers/.local/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
    self.test(*self.arg)
  File "/home/rgommers/.local/lib/python2.7/site-packages/numpy/testing/decorators.py", line 146, in skipper_func
    return f(*args, **kwargs)
  File "<string>", line 2, in check
  File "/home/rgommers/Code/scipy/scipy/sparse/tests/test_base.py", line 93, in deco
    return func(*a, **kw)
  File "/home/rgommers/Code/scipy/scipy/sparse/tests/test_base.py", line 3741, in check
    getattr(instance, method_name)()
  File "/home/rgommers/Code/scipy/scipy/sparse/tests/test_base.py", line 2694, in test_getnnz_axis
    assert_array_equal(bool_dat.sum(axis=0), datsp.getnnz(axis=0))
  File "/home/rgommers/Code/scipy/scipy/sparse/compressed.py", line 105, in getnnz
    return np.bincount(self.indices, minlength=N)
TypeError: 'minlength' is an invalid keyword argument for this function

======================================================================
ERROR: test_base.Test64Bit.test_resiliency_limit_10(<class 'test_base.TestBSR'>, 'test_minmax_axis')
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/rgommers/.local/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
    self.test(*self.arg)
  File "/home/rgommers/.local/lib/python2.7/site-packages/numpy/testing/decorators.py", line 146, in skipper_func
    return f(*args, **kwargs)
  File "<string>", line 2, in check
  File "/home/rgommers/Code/scipy/scipy/sparse/tests/test_base.py", line 93, in deco
    return func(*a, **kw)
  File "/home/rgommers/Code/scipy/scipy/sparse/tests/test_base.py", line 3741, in check
    getattr(instance, method_name)()
  File "/home/rgommers/Code/scipy/scipy/sparse/tests/test_base.py", line 2647, in test_minmax_axis
    assert_array_equal(X.max(axis=axis).A, D.max(axis=axis).A)
  File "/home/rgommers/Code/scipy/scipy/sparse/data.py", line 161, in max
    return self._min_or_max(axis, np.maximum)
  File "/home/rgommers/Code/scipy/scipy/sparse/data.py", line 147, in _min_or_max
    return self._min_or_max_axis(axis, min_or_max)
  File "/home/rgommers/Code/scipy/scipy/sparse/data.py", line 115, in _min_or_max_axis
    major_index, value = mat._minor_reduce(min_or_max)
  File "/home/rgommers/Code/scipy/scipy/sparse/compressed.py", line 591, in _minor_reduce
    value = ufunc.reduceat(self.data, self.indptr[major_index])
TypeError: array cannot be safely cast to required type
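The third error comes from ufunc.reduceat rejecting an index array that cannot be safely cast to the platform's default integer type. A sketch of the failure mode and a cast-based workaround (illustrative only, not the actual scipy fix):

```python
import numpy as np

# On some older numpy / 32-bit combinations, passing int64 indices to
# ufunc.reduceat raised "array cannot be safely cast to required type".
# Casting the indices to np.intp (the platform index type) sidesteps that.
data = np.array([1.0, 2.0, 3.0, 4.0])
indptr = np.array([0, 2], dtype=np.int64)
result = np.maximum.reduceat(data, indptr.astype(np.intp))
# result holds the per-segment maximum: [2.0, 4.0]
```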
@rgommers
Owner

numpy < 1.7.0 doesn't support negative values for the axis keyword of the sum method. I'll send a fix for that in a bit.
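A minimal sketch of the translation (hypothetical helper, not the actual patch): a negative axis on an ndim-dimensional array maps to axis + ndim, so tests can pass the non-negative equivalent on old numpy.

```python
import numpy as np

def normalize_axis(axis, ndim):
    # Map a negative axis to its non-negative equivalent; numpy < 1.7.0
    # only accepted non-negative axis values in np.matrix.sum.
    if axis is not None and axis < 0:
        axis += ndim
    return axis

# For a 2-D matrix, axis=-2 is axis=0 and axis=-1 is axis=1:
m = np.matrix([[1, 2], [3, 4]])
assert (m.sum(axis=normalize_axis(-2, m.ndim)) == m.sum(axis=0)).all()
```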

@pv
Owner
pv commented

_csparsetools is also broken; apparently fused types in Cython don't work as you'd expect.

@rgommers rgommers referenced this issue from a commit in rgommers/scipy
@rgommers rgommers TST: fix sparse test errors due to axis=-1,-2 usage in np.matrix.sum().
Negative axis arguments are not supported by numpy < 1.7.0.
Fixes part of gh-3330.
48d994f
@rgommers
Owner

The test_set_slice failures look like a real bug:

import numpy as np
from scipy.sparse import csc_matrix

A = csc_matrix((5, 10))
A[slice(3, None, None)] = 9
print(A.todense())
B = np.matrix(np.zeros((5, 10), float))
B[slice(3, None, None)] = 9
print(B)

gives:

[[ 0.  0.  0.  0.  0.  0.  0.  0.  0.  0.]
 [ 0.  0.  0.  0.  0.  0.  0.  0.  0.  0.]
 [ 0.  0.  0.  0.  0.  0.  0.  0.  0.  0.]
 [ 9.  9.  9.  9.  0.  9.  9.  9.  9.  9.]     # note the spurious 0 in this row
 [ 9.  9.  9.  9.  9.  9.  9.  9.  9.  9.]]
[[ 0.  0.  0.  0.  0.  0.  0.  0.  0.  0.]
 [ 0.  0.  0.  0.  0.  0.  0.  0.  0.  0.]
 [ 0.  0.  0.  0.  0.  0.  0.  0.  0.  0.]
 [ 9.  9.  9.  9.  9.  9.  9.  9.  9.  9.]
 [ 9.  9.  9.  9.  9.  9.  9.  9.  9.  9.]]

Shows up with numpy 1.5.1 and 1.6.0, not for 1.8.0. On 32-bit linux.

@pv
Owner
pv commented

There is also a behavior change in numpy that breaks _cs_matrix._insert_many (pure Python).

@pv
Owner
pv commented

This is the difference:

>>> x = np.array([0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 3, 3, 3, 3, 3, 3, 3, 4, 4, 4, 4, 4, 4, 4, 5, 5, 5, 5, 5, 5, 5, 6, 6, 6, 6, 6, 6, 6, 7, 7, 7, 7, 7, 7, 7, 8, 8, 8, 8, 8, 8, 8, 9, 9, 9, 9, 9, 9, 9], dtype=np.int32)

Numpy 1.5.1:

>>> np.unique(x, return_index=True)[1]
array([ 0, 13, 19, 21, 32, 40, 42, 55, 62, 68])

Numpy 1.7.1:

>>> np.unique(x, return_index=True)[1]
array([ 0,  7, 14, 21, 28, 35, 42, 49, 56, 63])

This seems to be a Numpy bug that has since been fixed: numpy/numpy#2655
The fix is quite recent, so the issue also exists in Numpy 1.6.
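A version-independent way to get the correct first-occurrence indices for a sorted array, without relying on the buggy return_index (a sketch, not the actual scipy workaround):

```python
import numpy as np

def first_occurrence_indices(x):
    # For a sorted array, the first occurrence of each distinct value is
    # wherever the value differs from its predecessor (plus position 0).
    x = np.asarray(x)
    if x.size == 0:
        return np.array([], dtype=np.intp)
    change = np.concatenate(([True], x[1:] != x[:-1]))
    return np.flatnonzero(change)

# Same data as above: ten values, each repeated seven times.
x = np.repeat(np.arange(10), 7).astype(np.int32)
# Matches the correct numpy 1.7.1 output: [0, 7, 14, ..., 63]
```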

@rgommers
Owner

This one (full traceback in my first comment above), or more precisely 351x this one, is still present after all current PRs are merged:

FAIL: Failure: AssertionError (
Not equal to tolerance rtol=1e-07, atol=0

(mismatch 100.0%)
 x: array([[ 1.,  0.,  0.,  2.],
       [ 3.,  0.,  1.,  0.],
       [ 0.,  2.,  0.,  0.]])
 y: array([[ 0.,  0.,  0.,  4.],
       [ 0.,  0.,  2.,  0.],
       [ 0., -4.,  0.,  0.]]))

I haven't investigated yet.

@rgommers
Owner

OK, the remaining issues with 1.5.1 are the failure in the comment above plus the use of the minlength keyword for bincount. 1.6.0 and 1.8.0 are OK.
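The minlength keyword was added to np.bincount in numpy 1.6, which is why 1.5.1 fails on it. A zero-padding fallback is straightforward (a sketch with a hypothetical helper name, not the actual fix):

```python
import numpy as np

def bincount_minlength(x, minlength=0):
    # Emulate np.bincount(..., minlength=...) for numpy < 1.6:
    # compute the plain bincount, then zero-pad up to minlength.
    counts = np.bincount(x)
    if counts.size < minlength:
        pad = np.zeros(minlength - counts.size, dtype=counts.dtype)
        counts = np.concatenate([counts, pad])
    return counts
```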

@rgommers
Owner

All fixed. Thanks @pv.

@rgommers rgommers closed this
@jnothman

This one (full traceback in my first comment above), or more precisely 351x this one, is still present after all current PRs are merged:

Either this is a problem in toarray with duplicate indices, or in test_base._same_sum_duplicate. The former is basically all SWIGed. The latter includes arr.repeat(...) and arr[1::2] *= -1 but doesn't get much more complicated. Maybe I've missed something.
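For reference, the duplicate-index summing that toarray is expected to perform can be mimicked in plain numpy with np.add.at (a sketch of the expected semantics, not scipy's implementation):

```python
import numpy as np

# Duplicate (row, col) pairs in COO-style data should be summed when
# densifying; np.add.at accumulates over repeated indices.
row = np.array([0, 0, 1])
col = np.array([2, 2, 0])
data = np.array([1.0, 1.0, 3.0])
dense = np.zeros((2, 3))
np.add.at(dense, (row, col), data)
# dense[0, 2] == 2.0 (the two duplicates summed), dense[1, 0] == 3.0
```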

@rgommers
Owner

@jnothman it's fixed now. I think 0ef20f9 did it.

@jnothman