Constructing a Tensor from a Storage sometimes uses that storage as the tensor's backing storage:
a = torch.IntTensor(torch.IntStorage([1,2,3]))
1
2
3
[torch.IntTensor of size 3]
But not if it's a LongStorage:
a = torch.LongTensor(torch.LongStorage([1,2,3]))
(0,.,.) =
1.4059e+14 1.4059e+14 1.4059e+14
0.0000e+00 0.0000e+00 1.4059e+14
[torch.LongTensor of size 1x2x3]
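What seems to be happening (a sketch, assuming the legacy constructors) is that the LongStorage's contents are read as a size specification, so the call above amounts to asking for an uninitialized 1x2x3 tensor:
import torch
a = torch.LongTensor(torch.LongStorage([1, 2, 3]))
b = torch.LongTensor(1, 2, 3)
assert a.size() == b.size()  # both are 1x2x3; the values are uninitialized memory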
This is because we want to allow constructions like:
a = torch.IntTensor([1,2,3])
b = torch.FloatTensor(a.size())
But we also want to allow things like:
a = torch.IntTensor([1,2,3])
b = torch.IntTensor(a.storage())
We should resolve this ambiguity, probably using keyword arguments. We would then need to require something like:
a = torch.XTensor(size=b.size())
a = torch.XTensor(storage=b.storage())
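A minimal sketch of the keyword-only idea (the long_tensor helper and its use of Tensor.set_ are illustrative assumptions, not an existing API), written against the legacy torch.LongTensor / torch.LongStorage constructors:
import torch

def long_tensor(*, size=None, storage=None):
    # Hypothetical helper: the caller must name the interpretation, so a
    # LongStorage argument can never be mistaken for a size specification.
    if (size is None) == (storage is None):
        raise TypeError("pass exactly one of size= or storage=")
    if storage is not None:
        # Use the given storage as the tensor's backing memory.
        return torch.LongTensor().set_(storage)
    # Allocate a new, uninitialized tensor with the requested shape.
    return torch.LongTensor(*size)

a = long_tensor(storage=torch.LongStorage([1, 2, 3]))  # tensor backed by the storage: 1, 2, 3
b = long_tensor(size=a.size())                         # uninitialized tensor of size 3
With explicit keywords the size and storage paths can never collide, whatever the storage's element type.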