
GPU memory usage estimation #33

Open
basselin7u opened this issue Jun 25, 2018 · 1 comment

Comments

@basselin7u

AMGX currently provides an estimate of GPU memory usage. This is computed as the memory used by all processes on the device, via the cudaMemGetInfo function.

It would be more useful to report only the memory used by the AMGX process itself.
To do this, you could store the memory-usage estimate at launch and subtract it from each "allocated = total - free" value computed in the updateMaxMemoryUsage function.

@marsaev
Collaborator

marsaev commented Jul 5, 2018

It is possible that another application will change its memory usage during runtime, so saving the memory usage at AMGX startup wouldn't be the best solution. However, since AMGX uses a caching allocator, it is possible to obtain the total memory usage by a reduction over the blocks of memory allocated from the allocator.
