Labels
triaged: This issue has been looked at by a team member, and triaged and prioritized into an appropriate module
Description
🐛 Bug
The documentation states that for torch.gather(input, dim, index), the index tensor must have the same size as input in all dimensions except dimension dim. This check is enforced in PyTorch 1.5.1, but not on master.
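For reference, with dim=1, gather reads out[i][j] = input[i][index[i][j]], so under the documented rule index must match input in dimension 0. A valid call looks like this:
>>> t = torch.tensor([[1, 2], [3, 4]])
>>> index = torch.tensor([[0, 0], [1, 0]])   # same size as t in dim 0
>>> torch.gather(t, 1, index)                # out[i][j] = t[i][index[i][j]]
tensor([[1, 1],
        [4, 3]])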
To Reproduce
On 1.5.1:
>>> t = torch.tensor([[1,2],[3,4]])
>>> index = torch.tensor([[0]])
>>> torch.gather(t, 1, index)
RuntimeError: Size does not match at dimension 0 get 2 vs 1
On master:
>>> t = torch.tensor([[1,2],[3,4]])
>>> index = torch.tensor([[0]])
>>> torch.gather(t, 1, index)
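No error is raised on master. Assuming the standard gather indexing above (out[i][j] = t[i][index[i][j]] for dim=1), the call appears to simply return the smaller result:
>>> torch.gather(t, 1, index)   # index is smaller than t in dim 0, but the call succeeds
tensor([[1]])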
Expected behavior
I'm not sure. Either we should update the documentation or add an error check.
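If the "add an error check" option is taken, a minimal user-side sketch of the documented rule might look like the following (check_gather_index is a hypothetical helper, not part of the PyTorch API):

import torch

def check_gather_index(input, dim, index):
    # Hypothetical helper: enforce the documented rule that index must
    # match input's size in every dimension except `dim`.
    if index.dim() != input.dim():
        raise RuntimeError("index must have the same number of dimensions as input")
    for d in range(input.dim()):
        if d != dim and index.size(d) != input.size(d):
            raise RuntimeError(
                f"Size does not match at dimension {d}: "
                f"got {input.size(d)} vs {index.size(d)}"
            )

t = torch.tensor([[1, 2], [3, 4]])
index = torch.tensor([[0]])
check_gather_index(t, 1, index)  # raises, mirroring the 1.5.1 behaviour
torch.gather(t, 1, index)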