This repository has been archived by the owner on Oct 31, 2023. It is now read-only.

Memory is not released #40

Closed
liujiqiang999 opened this issue Mar 17, 2019 · 3 comments

Comments

@liujiqiang999

Hi, when the program ends, the memory on GPU 0 is released, but the memory on the other GPUs is not. Why is that?

@glample
Contributor

glample commented Mar 17, 2019

It has to be released. If you check the process IDs with nvidia-smi, you will see the PID of the job that is still using memory, and it is probably some other process using GPU 0.
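
For reference, a minimal sketch of checking this from Python (just a wrapper around nvidia-smi --query-compute-apps; the CSV parsing below is an assumption, not code from this repo):

```python
# List every compute process still holding GPU memory, per GPU.
# Assumes nvidia-smi is on PATH; the fields are standard
# --query-compute-apps fields.
import subprocess

out = subprocess.check_output(
    ["nvidia-smi",
     "--query-compute-apps=gpu_uuid,pid,process_name,used_memory",
     "--format=csv,noheader"],
    text=True,
)

for line in out.strip().splitlines():
    gpu_uuid, pid, name, used = [f.strip() for f in line.split(",")]
    print(f"GPU {gpu_uuid}: pid={pid} ({name}) holds {used}")
```

If a PID still shows up here after your training job has exited, that memory belongs to the leftover process, not to the job that just finished.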

@StillKeepTry

In fact, I have run into this problem too. I just kill these processes manually.
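
A minimal sketch of doing that from Python (equivalent to kill <pid> in a shell; the PIDs below are placeholders, so verify them with nvidia-smi first to avoid stopping someone else's job):

```python
# Terminate leftover workers that still hold GPU memory.
import os
import signal

stale_pids = [12345, 12346]  # hypothetical PIDs taken from nvidia-smi output

for pid in stale_pids:
    try:
        os.kill(pid, signal.SIGTERM)  # ask the process to exit cleanly
    except ProcessLookupError:
        pass  # already gone
```

If a process ignores SIGTERM, a second pass with signal.SIGKILL will force it to exit.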

@liujiqiang999
Author

Thanks. That is an easy way to solve this problem.

glample closed this as completed Mar 21, 2019
