
Anyone have problems with a 6 GB GPU? #84

Closed
wingdi opened this issue Nov 8, 2020 · 4 comments

wingdi commented Nov 8, 2020

Environment: PyTorch 1.7.0, CUDA 11.0, RTX 2060, Windows 10.

I get this error:
RuntimeError: CUDA out of memory. Tried to allocate 256.00 MiB (GPU 0; 6.00 GiB total capacity; 3.87 GiB already allocated; 187.62 MiB free; 4.01 GiB reserved in total by PyTorch)

Is there any way to solve this problem?

AnMa12 commented Nov 9, 2020

Hello!

I have the same problem with an NVIDIA GeForce RTX 2060 with 6 GB of GDDR6 VRAM.

[MY SOLUTION]
My solution was to run it on the CPU (my machine has 16 GB of DDR4-2400 SDRAM).
To do that, in apps/recon.py I replaced:

  • this line: cuda = torch.device('cuda:%d' % opt.gpu_id if torch.cuda.is_available() else 'cpu')
  • with this line: cuda = torch.device('cpu')

It takes a while to process the image, but at least it works!

Hope it will work for you too!
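If you want to keep the CUDA path on machines that do have enough VRAM, the edit can be made switchable instead of hard-coded. A minimal sketch (the helper name `pick_device_str` and the `force_cpu` flag are my own, not from the repo):

```python
# Sketch of a switchable device choice, not code from the repo.
# pick_device_str and force_cpu are hypothetical names; apps/recon.py
# builds the same kind of string and passes it to torch.device(...).

def pick_device_str(gpu_id: int = 0,
                    cuda_available: bool = False,
                    force_cpu: bool = False) -> str:
    """Return 'cuda:<id>' when CUDA is usable, else 'cpu'.

    force_cpu lets you opt out on GPUs with too little VRAM,
    like the 6 GB RTX 2060 discussed in this thread.
    """
    if force_cpu or not cuda_available:
        return 'cpu'
    return 'cuda:%d' % gpu_id

# In apps/recon.py this would become something like:
#   cuda = torch.device(pick_device_str(opt.gpu_id,
#                                       torch.cuda.is_available(),
#                                       force_cpu=True))
```

This way the CPU fallback is a one-flag change rather than a line you have to revert later.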

wingdi (Author) commented Nov 9, 2020


It works for me too; my machine has 16 GB of RAM.
It helps a lot, thanks very much!

IhabBendidi commented

Working for me too! It seems a minimum GPU VRAM of 7 GB should be specified in the readme, since that is the minimum needed for it to run. Maybe the issue should be closed? @wingdi

@wingdi wingdi closed this as completed Nov 26, 2020
wingdi (Author) commented Nov 26, 2020

ok ~
