In torch/autograd/_functions/tensor.py:20:

def backward(self, grad_output):
    # TODO: this won't have to be zeroed
    grad_input = grad_output.new(self.input_size).zero_()
    grad_input.index(self.index).copy_(grad_output)
    return grad_input
I think the index-copy statement doesn't work as intended. When self.index is a LongTensor, grad_input.index(self.index) returns a new tensor (a copy) rather than a view into grad_input, so copy_ fills that temporary and grad_input stays zeroed:
In [15]: a = torch.zeros((3, 5))
In [16]: a
Out[16]:
0 0 0 0 0
0 0 0 0 0
0 0 0 0 0
[torch.FloatTensor of size 3x5]
In [17]: b = torch.ones((2, 5))
In [18]: a.index(torch.LongTensor([0, 2])).copy_(b)
Out[18]:
1 1 1 1 1
1 1 1 1 1
[torch.FloatTensor of size 2x5]
In [19]: a
Out[19]:
0 0 0 0 0
0 0 0 0 0
0 0 0 0 0
[torch.FloatTensor of size 3x5]
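For contrast, an op that returns a view behaves the way the backward code expects. A small check with the same a and b as above (narrow aliases a's storage instead of copying):

a.narrow(0, 0, 2).copy_(b)  # narrow returns a view, so copy_ writes into a itself
# a is now:
#  1  1  1  1  1
#  1  1  1  1  1
#  0  0  0  0  0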
I guess the statement should be something like the following:
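A minimal sketch, assuming self.index is a LongTensor indexing along dimension 0, using Tensor.index_copy_ as the in-place counterpart:

def backward(self, grad_output):
    # TODO: this won't have to be zeroed
    grad_input = grad_output.new(self.input_size).zero_()
    # index_copy_ copies grad_output into the rows of grad_input selected
    # by self.index, mutating grad_input itself rather than a temporary copy
    grad_input.index_copy_(0, self.index, grad_output)
    return grad_input

With the same a and b as above, a.index_copy_(0, torch.LongTensor([0, 2]), b) does modify a in place.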