Snippet to reproduce:
import torch
from torch.autograd import Variable
x = Variable(torch.Tensor([1, 2, 3]), requires_grad=True)
y = x[0]
y.backward() # error
x[1] = 5 # error
The problem is in here, here and here, and is due to the fact that torch.Tensor
returns a number when indexing a 1D tensor (which I think is the desired behaviour, since we don't have broadcasting yet?).
Maybe we should check whether the input is 1D (and the index selects a single element) and use narrow
instead of select
in this case?
Autograd Variables
already return a 1D tensor if we index them with a 1D tensor in forward, but this solution would also avoid the sync point with CUDA tensors in Index
(since we wouldn't return a number).
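To make the narrow-vs-select distinction concrete, here is a minimal sketch; the API shapes below reflect current PyTorch rather than the version this issue was filed against, so treat it as an illustration of the proposed behaviour, not of the code paths linked above:

```python
import torch

x = torch.tensor([1.0, 2.0, 3.0])

# select() drops the indexed dimension entirely.
selected = x.select(0, 0)
print(selected.dim())    # 0

# narrow() keeps the dimension, yielding a 1-element 1D tensor,
# so no number (and no device-to-host sync) is produced.
narrowed = x.narrow(0, 0, 1)
print(narrowed.shape)    # torch.Size([1])
```

Because narrow returns a view with the dimension intact, indexing would stay on-device and backward through the result would be well-defined.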
Or, another solution would be to have something like 0-dimensional tensors which behave almost like numbers? This seems to be what NumPy does.
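For reference, a quick sketch of the NumPy behaviour alluded to here: a 0-d array carries dtype and array semantics but converts and compares like a plain number.

```python
import numpy as np

# An explicit 0-dimensional array: array-like, but number-like in use.
b = np.array(5.0)

print(b.ndim)      # 0
print(b + 1.0)     # 6.0  (arithmetic works as with a scalar)
print(float(b))    # 5.0  (converts cleanly to a Python float)
```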