
backward() in Autograd Index Function is broken when indexing with LongTensor #828

Closed
BarclayII opened this issue Feb 22, 2017 · 2 comments
@BarclayII

In torch/autograd/_functions/tensor.py:20:

    def backward(self, grad_output):
        # TODO: this won't have to be zeroed
        grad_input = grad_output.new(self.input_size).zero_()
        grad_input.index(self.index).copy_(grad_output)
        return grad_input

I think the index-copy statement doesn't work as intended: indexing with a LongTensor returns a new tensor rather than a view, so copy_() fills a temporary and grad_input is left as all zeros.

In [15]: a = torch.zeros((3, 5))

In [16]: a
Out[16]: 

 0  0  0  0  0
 0  0  0  0  0
 0  0  0  0  0
[torch.FloatTensor of size 3x5]

In [17]: b = torch.ones((2, 5))

In [18]: a.index(torch.LongTensor([0, 2])).copy_(b)
Out[18]: 

 1  1  1  1  1
 1  1  1  1  1
[torch.FloatTensor of size 2x5]

In [19]: a
Out[19]: 

 0  0  0  0  0
 0  0  0  0  0
 0  0  0  0  0
[torch.FloatTensor of size 3x5]

I guess the statement should be something like

        grad_input.index_copy_(0, self.index, grad_output)
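For comparison, the same copy-versus-view pitfall can be sketched with NumPy, whose integer-array (fancy) indexing also returns a copy rather than a view (a hypothetical analogy, not the PyTorch code itself):

```python
import numpy as np

a = np.zeros((3, 5))
b = np.ones((2, 5))

# a[[0, 2]] is a *copy*, so writing into it does not touch `a` --
# the same failure mode as grad_input.index(self.index).copy_(...).
a[[0, 2]][:] = b
print(a.sum())   # 0.0 -- `a` is unchanged

# Assigning *through* the index writes into `a` itself, which is
# what index_copy_(0, index, src) does in torch.
a[[0, 2]] = b
print(a.sum())   # 10.0 -- rows 0 and 2 are now ones
```

The difference is whether the indexed expression appears as a standalone value (a copy) or as the target of the assignment (an in-place write along dimension 0).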
@soumith
Member

soumith commented Feb 22, 2017

you're right. looks like a bug. will fix.

@soumith soumith changed the title backward() in Autograd Index Function is likely broken backward() in Autograd Index Function is broken when indexing with LongTensor Feb 23, 2017
@apaszke apaszke self-assigned this Feb 25, 2017
@apaszke apaszke mentioned this issue Feb 25, 2017
@apaszke
Contributor

apaszke commented Feb 26, 2017

Fixed in #852.
