einsum broadcast regression (with optimize=True) #10343

@d70-t

Description

In NumPy 1.13.3 the following snippet executed without errors, but in 1.14.0 it raises:

In [1]: import numpy as np
In [2]: a = np.ones((10,2))
In [3]: b = np.ones((1,2))
In [4]: np.einsum('t...i,ti->t...', a, b)
Traceback (most recent call last):
  File "<ipython-input-4-fa62d1d882f9>", line 1, in <module>
    np.einsum('t...i,ti->t...', a, b)
  File "/usr/local/lib/python2.7/dist-packages/numpy/core/einsumfunc.py", line 1087, in einsum
    einsum_call=True)
  File "/usr/local/lib/python2.7/dist-packages/numpy/core/einsumfunc.py", line 710, in einsum_path
    "not match previous terms.", char, tnum)
ValueError: ("Size of label '%s' for operand %d does not match previous terms.", 't', 1)

However, passing optimize=False avoids the error:

In [5]: np.einsum('t...i,ti->t...', a, b, optimize=False)
Out[5]: array([2., 2., 2., 2., 2., 2., 2., 2., 2., 2.])

Is this intended behavior, making the user responsible for explicitly disabling optimization, or is this a bug?
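For what it's worth, one possible workaround (a sketch, not an official fix) is to broadcast the second operand explicitly so the sizes of the 't' label agree before the optimized contraction path is computed:

```python
import numpy as np

# Same arrays as in the report above.
a = np.ones((10, 2))
b = np.ones((1, 2))

# Expanding b to a's shape up front means every label has a
# consistent size, so path computation no longer fails.
b_full = np.broadcast_to(b, a.shape)
result = np.einsum('t...i,ti->t...', a, b_full, optimize=True)
```

np.broadcast_to creates a read-only view, so this should not copy the data; whether einsum's path optimizer is supposed to handle the size-1 broadcast itself is exactly the question here.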
