[pytorch] randperm lacks CUDA implementation #6874
Comments
It sounds like it's not implemented for CUDA. For now you can do the following:
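The code for the suggested workaround was not captured in this copy of the thread. A plausible sketch, consistent with the fix shown later in the thread, is to generate the permutation on the CPU and move it to the GPU afterwards (the exact placement below is an assumption, not the original snippet):

```python
import torch

n = 10
# Build the permutation with the CPU generator, since randperm has no
# CUDA implementation; then transfer the result to the GPU if one exists.
perm = torch.randperm(n, device='cpu')
if torch.cuda.is_available():
    perm = perm.cuda()
```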
How about the case when `randperm` is used in
It is generally a bad idea to set the default tensor type to a CUDA tensor. But still,
I’ll take a stab at this.
Ok, I solved the issue by changing two lines in the `RandomSampler` class:

```python
class RandomSampler(Sampler):
    r"""Samples elements randomly, without replacement.

    Arguments:
        data_source (Dataset): dataset to sample from
    """

    def __init__(self, data_source):
        self.data_source = data_source

    def __iter__(self):
        # Generate the permutation on the CPU even when the default
        # tensor type is a CUDA tensor, since randperm has no CUDA kernel.
        cpu = torch.device('cpu')
        return iter(torch.randperm(len(self.data_source), device=cpu).tolist())

    def __len__(self):
        return len(self.data_source)
```

If it is okay, I can open a PR.
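For readers without torch at hand, the sampler contract the fix relies on (`__iter__` yields a random permutation of dataset indices, `__len__` returns the dataset size) can be sketched with the standard library alone; the class and names below are illustrative and not part of torch:

```python
import random

class CPURandomSampler:
    """Illustrative stand-in: yields dataset indices in random order."""

    def __init__(self, data_source, seed=None):
        self.data_source = data_source
        self._rng = random.Random(seed)

    def __iter__(self):
        indices = list(range(len(self.data_source)))
        self._rng.shuffle(indices)  # CPU-side permutation, like randperm on 'cpu'
        return iter(indices)

    def __len__(self):
        return len(self.data_source)

sampler = CPURandomSampler(['a', 'b', 'c', 'd'], seed=0)
order = list(sampler)
```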
If you have a question or would like help and support, please ask at our forums.
If you are submitting a feature request, please preface the title with [feature request].
If you are submitting a bug report, please fill in the following details.
Issue description

When `torch.cuda.FloatTensor` is the default tensor type, `torch.randperm` shows an error.

Code example
System Info