einsum broadcast regression (with optimize=True) #10343
Closed
Description
In numpy 1.13.3, it was possible to execute the following snippet without errors, while in 1.14.0 this happens:
```python
In [1]: import numpy as np

In [2]: a = np.ones((10,2))

In [3]: b = np.ones((1,2))

In [4]: np.einsum('t...i,ti->t...', a, b)
Traceback (most recent call last):
  File "<ipython-input-4-fa62d1d882f9>", line 1, in <module>
    np.einsum('t...i,ti->t...', a, b)
  File "/usr/local/lib/python2.7/dist-packages/numpy/core/einsumfunc.py", line 1087, in einsum
    einsum_call=True)
  File "/usr/local/lib/python2.7/dist-packages/numpy/core/einsumfunc.py", line 710, in einsum_path
    "not match previous terms.", char, tnum)
ValueError: ("Size of label '%s' for operand %d does not match previous terms.", 't', 1)
```

However, passing `optimize=False` solves the problem:

```python
In [5]: np.einsum('t...i,ti->t...', a, b, optimize=False)
Out[5]: array([2., 2., 2., 2., 2., 2., 2., 2., 2., 2.])
```

Is this the intended behavior, so that the user is now responsible for explicitly disabling optimization, or is this a bug?