memory reporting issue #579

Closed
denizyuret opened this issue Nov 27, 2020 · 1 comment · Fixed by #582

Comments

@denizyuret
Contributor

CUDA.memory_status() now reports a discrepancy. The cause seems to be CUDA.used_memory(), which no longer reports the number of bytes in use.
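
For reference, a minimal sketch of how the problem shows up from the REPL; the allocation size is illustrative and not from the original report:

```julia
using CUDA

# Allocate a GPU array so the pool has something to track, then print the
# memory report. With used_memory() misbehaving, the pool and allocator
# totals in the report no longer agree.
a = CUDA.rand(Float32, 1024, 1024)   # roughly 4 MiB on the device
CUDA.memory_status()                  # prints usage and, here, a discrepancy warning
```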

@maleadt
Member

maleadt commented Nov 27, 2020

There was a report by @marius311 on Slack about this as well:

Marius Millea 22:23
Is the last line in this error I'm getting:

Out of GPU memory trying to allocate 8.016 MiB
Effective GPU memory usage: 91.54% (14.446 GiB/15.782 GiB)
CUDA allocator usage: 13.984 GiB (capped at 14.000 GiB)
binned usage: 15.141 KiB (15.141 KiB allocated, 0 bytes cached)
Discrepancy of 13.984 GiB between memory pool and allocator!

a sign that something has gone wrong? (and could potentially be fixed?)
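
As a rough consistency check of the numbers in that report (an illustration only, not CUDA.jl's actual reporting code), the "effective usage" percentage and the discrepancy appear to follow directly from the quoted figures:

```julia
# "Effective GPU memory usage": used / total at the driver level.
used_gib, total_gib = 14.446, 15.782
println(round(100 * used_gib / total_gib, digits=1), "%")   # ≈ 91.5%, matching the report up to rounding

# The discrepancy appears to be roughly the allocator usage (13.984 GiB)
# minus the binned pool usage (15.141 KiB): essentially all of the
# allocator's memory is unaccounted for by the pool.
allocator_bytes = 13.984 * 2^30
pool_bytes      = 15.141 * 2^10
println(round((allocator_bytes - pool_bytes) / 2^30, digits=3), " GiB")   # ≈ 13.984 GiB
```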
