test_softmax_sums_to_one[CupyOps] and test_softmax_works_inplace[CupyOps] failing #83

Closed
ccoulombe opened this issue Nov 13, 2018 · 1 comment

@ccoulombe (Contributor)

The tests test_softmax_sums_to_one[CupyOps] and test_softmax_works_inplace[CupyOps] are both failing with:

E   TypeError: Unsupported type <class 'numpy.ndarray'>

cupy/core/elementwise.pxi:68: TypeError

in a Python 3.6 virtual environment on Linux. The GPU is a Tesla K80 and the CUDA version is 9.0.176.
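For reference, here is a minimal sketch of what I believe the tests boil down to, outside the test suite (the shape and fill value mirror the falsifying example below, and the call path matches the traceback; this is an illustration, not the test code itself):

    import numpy
    from thinc.neural.ops import CupyOps

    ops = CupyOps()
    # A plain host-side numpy array, never moved to the GPU
    X = numpy.full((7, 74), -100.0, dtype="float32")
    # Raises TypeError: Unsupported type <class 'numpy.ndarray'>
    y = ops.softmax(X)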

(.thinc) ~ $ pip install -U --force -r requirements.txt cupy==4.5.0 thinc_gpu_ops==0.0.3 thinc==6.12.0
(.thinc) ~ $ pip list
Package        Version
-------------- -------
atomicwrites   1.2.1  
attrs          18.2.0 
cupy           4.5.0  
cymem          2.0.2  
Cython         0.29   
cytoolz        0.9.0.1
dill           0.2.8.2
fastrlock      0.4    
hypothesis     2.0.0  
mock           2.0.0  
more-itertools 4.3.0  
msgpack        0.5.6  
msgpack-numpy  0.4.3.2
msgpack-python 0.5.6  
murmurhash     1.0.1  
numpy          1.15.2 
pbr            5.1.1  
pip            18.1   
plac           0.9.6  
pluggy         0.8.0  
preshed        2.0.1  
py             1.7.0  
pytest         3.10.1 
setuptools     40.6.2 
six            1.11.0 
thinc          6.12.0 
thinc-gpu-ops  0.0.3  
toolz          0.9.0  
tqdm           4.28.1 
wheel          0.32.2 
wrapt          1.10.11
(.thinc) ~ $ pytest .thinc/lib/python3.6/site-packages/thinc
....
________________________________________________________________________________________ test_softmax_sums_to_one[CupyOps] _________________________________________________________________________________________

ops = <thinc.neural.ops.CupyOps object at 0x7f93a9da4518>

    @settings(max_examples=MAX_EXAMPLES)
>   @given(X=strategies.arrays_BI())
    def test_softmax_sums_to_one(ops, X):

.thinc/lib/python3.6/site-packages/thinc/tests/unit/test_ops.py:148: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
.thinc/lib/python3.6/site-packages/hypothesis/core.py:713: in wrapped_test
    print_example=True, is_final=True
.thinc/lib/python3.6/site-packages/hypothesis/executors/executors.py:25: in default_executor
    return function()
.thinc/lib/python3.6/site-packages/hypothesis/core.py:376: in run
    return test(*args, **kwargs)
.thinc/lib/python3.6/site-packages/thinc/tests/unit/test_ops.py:150: in test_softmax_sums_to_one
    y = ops.softmax(X)
ops.pyx:215: in thinc.neural.ops.Ops.softmax
    ???
.thinc/lib/python3.6/site-packages/cupy/core/fusion.py:871: in __call__
    return self._cupy_op(*args, **kwargs)
cupy/core/elementwise.pxi:753: in cupy.core.core.ufunc.__call__
    ???
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

>   ???
E   TypeError: Unsupported type <class 'numpy.ndarray'>

cupy/core/elementwise.pxi:68: TypeError
---------------------------------------------------------------------------------------------------- Hypothesis ----------------------------------------------------------------------------------------------------
Falsifying example: test_softmax_sums_to_one(ops=<thinc.neural.ops.CupyOps object at 0x7f93a9da4518>, X=array([[-100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100.],
       [-100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100.],
       [-100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100.],
       [-100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100.],
       [-100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100.],
       [-100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100.],
       [-100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100.]], dtype=float32))
_______________________________________________________________________________________ test_softmax_works_inplace[CupyOps] ________________________________________________________________________________________

ops = <thinc.neural.ops.CupyOps object at 0x7f93a995e390>

    @settings(max_examples=MAX_EXAMPLES)
>   @given(X=strategies.arrays_BI())
    def test_softmax_works_inplace(ops, X):

.thinc/lib/python3.6/site-packages/thinc/tests/unit/test_ops.py:167: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
.thinc/lib/python3.6/site-packages/hypothesis/core.py:713: in wrapped_test
    print_example=True, is_final=True
.thinc/lib/python3.6/site-packages/hypothesis/executors/executors.py:25: in default_executor
    return function()
.thinc/lib/python3.6/site-packages/hypothesis/core.py:376: in run
    return test(*args, **kwargs)
.thinc/lib/python3.6/site-packages/thinc/tests/unit/test_ops.py:169: in test_softmax_works_inplace
    ops.softmax(X, inplace=True)
ops.pyx:215: in thinc.neural.ops.Ops.softmax
    ???
.thinc/lib/python3.6/site-packages/cupy/core/fusion.py:871: in __call__
    return self._cupy_op(*args, **kwargs)
cupy/core/elementwise.pxi:753: in cupy.core.core.ufunc.__call__
    ???
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

>   ???
E   TypeError: Unsupported type <class 'numpy.ndarray'>

cupy/core/elementwise.pxi:68: TypeError
---------------------------------------------------------------------------------------------------- Hypothesis ----------------------------------------------------------------------------------------------------
Falsifying example: test_softmax_works_inplace(ops=<thinc.neural.ops.CupyOps object at 0x7f93a995e390>, X=array([[-100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100.],
       [-100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100.],
       [-100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100.],
       [-100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100.],
       [-100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100., -100., -100., -100., -100., -100., -100.,
        -100., -100., -100.]], dtype=float32))
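Presumably the failure comes from feeding a host-side numpy array straight to the CuPy kernels. As a rough, untested sketch of a workaround (assuming cupy.asarray behaves as usual in cupy 4.5.0), moving the data to the device first should avoid the unsupported-type path:

    import cupy

    X_gpu = cupy.asarray(X)       # copy the numpy array onto the GPU
    y = ops.softmax(X_gpu)        # the kernel now receives a cupy.ndarray
    # Each row of the softmax output should sum to ~1
    assert float(y.sum(axis=-1).min()) > 0.999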

Let me know if I can provide more information.

Thanks!

@honnibal (Member)

Thanks, fixed!
