Fix torch.randperm #5014
Conversation
Codecov Report
@@ Coverage Diff @@
## master #5014 +/- ##
==========================================
- Coverage 65.65% 65.49% -0.16%
==========================================
Files 257 257
Lines 20080 20080
Branches 3419 3419
==========================================
- Hits 13183 13152 -31
- Misses 6185 6211 +26
- Partials 712 717 +5
Hi @ypwhs,
@@ -51,7 +51,7 @@ def random_choice(self, gallery, num):
        else:
            device = 'cpu'
        gallery = torch.tensor(gallery, dtype=torch.long, device=device)
-       perm = torch.randperm(gallery.numel(), device=gallery.device)[:num]
+       perm = torch.randperm(gallery.numel()).to(device=gallery.device)[:num]
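For context, the patched sampling helper can be sketched as a self-contained function. The surrounding class and the exact mmdet signature are assumptions reconstructed from the diff above:

```python
import torch


def random_choice(gallery, num):
    """Randomly select ``num`` items from ``gallery`` without replacement.

    Sketch based on the diff above; the exact surrounding mmdet code is
    an assumption.
    """
    assert len(gallery) >= num
    if torch.cuda.is_available():
        device = torch.cuda.current_device()
    else:
        device = 'cpu'
    gallery = torch.tensor(gallery, dtype=torch.long, device=device)
    # Workaround: generate the permutation on CPU, then move it to the
    # gallery's device, avoiding the CUDA randperm bug
    # (pytorch/pytorch#30569).
    perm = torch.randperm(gallery.numel()).to(device=gallery.device)[:num]
    return gallery[perm]
```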
Thank you for your contribution. Could you add some comments, such as the reason for the modification and a link to the corresponding issue?
This is just a simple fix. The underlying problem is likely in PyTorch itself and has not been fixed upstream yet, so there is no link to provide. I tried this workaround and it works. Thanks for your review.
I understand, but it is best to record the reason for the modification in the code, to make the code easier to read in the future. Thank you!
We can simply add a line of comments referring to this PR and the corresponding PyTorch issue, and indicate that this is a workaround. That way, users can easily understand the code, and once PyTorch fixes the issue we can clean up the code.
Added comments and optimized the speed of copying.
Temporary fix for torch.randperm, from: open-mmlab#5014
Environment
Error Log
Reason
torch.randperm on a CUDA device can return huge out-of-range values, like:
Solution
Generate the random permutation on the CPU, then copy it to the GPU.
Or wait for PyTorch to fix it.
For users: you could downgrade PyTorch to 1.7.1.
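The CPU-then-copy workaround can be wrapped in a small helper. The name `safe_randperm` is ours for illustration, not part of mmdet or PyTorch:

```python
import torch


def safe_randperm(n, device):
    """Return a random permutation of 0..n-1 on ``device``.

    Workaround sketch: build the permutation on CPU and copy it over,
    instead of calling torch.randperm(n, device=device) directly, which
    can return out-of-range values on CUDA in affected PyTorch versions.
    """
    return torch.randperm(n).to(device=device)
```

On a fixed PyTorch version this is equivalent to `torch.randperm(n, device=device)`, at the cost of one host-to-device copy.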
Related Issue
#4734
#4824
pytorch/pytorch#30569