torch.cuda.memory_allocated to return {} if not initialized #51179
Conversation
💊 CI failures summary and remediations

As of commit 776ed5c (more details on the Dr. CI page):

🕵️ 1 new failure recognized by patterns

The following CI failures do not appear to be due to upstream breakages:
- pytorch_linux_xenial_py3_6_gcc5_4_test (1/1), Step: "Run tests" (full log | diagnosis details | 🔁 rerun)
@malfet has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Thanks for the fix!
Weirdly, it works if …
Codecov Report

```
@@           Coverage Diff           @@
##           master   #51179   +/-   ##
=======================================
  Coverage   80.88%   80.88%
=======================================
  Files        1931     1931
  Lines      210560   210562      +2
=======================================
+ Hits       170311   170315      +4
+ Misses      40249    40247      -2
```
Fixed, although according to type annotations, this function is not supposed to be called with …
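For reference, a minimal sketch of the kind of guard being discussed, assuming helper names from torch/cuda/memory.py (is_initialized, _get_device_index, torch._C._cuda_memoryStats); treat this as an illustration of the approach, not the PR's verbatim diff:

```python
from typing import Any, Dict, Union

import torch
from torch.cuda import is_initialized
from torch.cuda._utils import _get_device_index


def memory_stats_as_nested_dict(
    device: Union[torch.device, int, None] = None,
) -> Dict[str, Any]:
    """Return CUDA memory statistics as a nested dict.

    If the CUDA context has not been initialized, there are no statistics
    to report yet, so return an empty dict instead of asserting.
    """
    if not is_initialized():
        return {}  # no CUDA context yet, hence nothing allocated
    device = _get_device_index(device, optional=True)
    return torch._C._cuda_memoryStats(device)
```

Since torch.cuda.memory_allocated() reads "allocated_bytes.all.current" out of the flattened stats mapping with a 0 fallback, the empty dict surfaces to callers as a plain 0 rather than an error.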
Awesome, can you please add tests?
@malfet has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Fixes #49952
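A quick check of the behavior this enables (hypothetical session; exact numbers depend on your setup):

```python
import torch

# Before the fix, querying allocation stats on an uninitialized CUDA
# context could trip an internal assertion; now it just reports zero.
print(torch.cuda.memory_allocated())  # 0 in a fresh process

if torch.cuda.is_available():
    x = torch.ones(1024, device="cuda")  # first CUDA op initializes the context
    print(torch.cuda.memory_allocated() > 0)  # True
```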