
3D example runs out of memory on Google Colab #6

Closed

fjug opened this issue Apr 12, 2019 · 3 comments

Comments
@fjug
Member

fjug commented Apr 12, 2019

Is there a way to reduce the memory requirements so that this example can run through on Google Colab?
That would be nice, but it is likely not the most pressing issue... ;)

@tibuch
Collaborator

tibuch commented Apr 12, 2019

Hi,

Thanks for reporting!

Do you know what the memory limits of Google Colab are?

One hot-fix would be to reduce the batch size until training fits into memory. This will result in less stable convergence and might take longer to train.
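A minimal sketch of that hot-fix, assuming the notebook builds its training configuration with n2v's `N2VConfig` (the variable name `X` for the training patches is my assumption, not taken from the notebook):

```python
# Hypothetical sketch: shrink the training batch size in the n2v config.
# `X` stands for the notebook's array of training patches (assumed name);
# all other parameters stay at whatever the notebook already sets.
from n2v.models import N2VConfig

config = N2VConfig(X, train_batch_size=2)  # try 2; halve again if it still runs out of memory
```

Halving the batch size roughly halves the activation memory per training step, at the cost of noisier gradient estimates.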

@fjug
Member Author

fjug commented Apr 12, 2019 via email

@tibuch
Collaborator

tibuch commented Jun 3, 2019

Using n2v_patch_shape = (32, 32, 32) in the config seems to work on Google Colab (CPU only), but then training takes a very long time. Using n2v_patch_shape = (16, 32, 32) worked for the GPU setting on Google Colab, and training is reasonably fast.

I am somewhat hesitant to change the current n2v_patch_shape = (32, 64, 64) because it gives very nice results.
Maybe it would be enough to mention this as a note in the notebook?
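To put the suggested shapes in perspective, a quick back-of-the-envelope comparison of voxels per patch (plain arithmetic, not n2v code); per-patch memory scales linearly with this count for a fixed dtype:

```python
# Voxels per patch for the shapes discussed above. Input and activation
# memory per patch grow linearly with this number (assuming a fixed dtype).
def voxels(shape):
    n = 1
    for d in shape:
        n *= d
    return n

current = voxels((32, 64, 64))   # shape currently used in the notebook
cpu_ok  = voxels((32, 32, 32))   # reportedly works on Colab CPU, but slow
gpu_ok  = voxels((16, 32, 32))   # reportedly works on Colab GPU

print(current // cpu_ok, current // gpu_ok)  # -> 4 8
```

So the CPU-friendly shape is 4x smaller per patch than the current one, and the GPU-friendly shape is 8x smaller, which matches the observed memory relief.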

@fjug fjug closed this as completed Jun 13, 2019

2 participants