Memory requirement spike for gandlf_preprocess
#780
Comments
Is the corresponding BraTS data publicly available? Can you provide it also, please?

You should be able to download the data here: https://www.synapse.org/brats This should also be replicable even on the unit testing data [ref]. Do you think you can include the report from #806 in your fix as well (since both are related to memory consumption)?
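Since the discussion centers on memory consumption when many subjects are processed, here is a minimal stdlib sketch of streaming subject rows from the CSV one at a time instead of materializing all ~800 rows up front. This is an illustrative assumption, not GaNDLF's actual loader, and the column names (`SubjectID`, `Channel_0`) are hypothetical:

```python
import csv
import io

def iter_subjects(csv_file):
    # Yield one subject row at a time so peak memory stays roughly
    # per-subject, rather than proportional to the whole CSV.
    for row in csv.DictReader(csv_file):
        yield row

# Usage with an in-memory CSV (hypothetical column names):
sample = io.StringIO("SubjectID,Channel_0\nsub-001,/path/t1.nii.gz\n")
for row in iter_subjects(sample):
    print(row["SubjectID"])  # -> sub-001
```

Whether this helps depends on where the spike actually occurs (CSV loading vs. the per-subject preprocessing itself), which the profiling below should clarify.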
Stale issue message
This is still under investigation. |
Describe the bug
When `gandlf_preprocess` is run (with `normalize` and `crop_external_zero_plane` as preprocessing parameters), the process runs fine for a validation CSV with ~180 subjects, but fails with an out-of-memory (OOM) error for a training CSV with ~800 subjects using the BraTS data.

To Reproduce
Run the `gandlf_preprocess` script for both these cases.

Expected behavior
It should run for both.
Screenshots
N.A.
GaNDLF Version
0.0.18-dev
Desktop (please complete the following information):
N.A.
Additional context
Memory profiler (thanks @hasan7n): https://pypi.org/project/memory-profiler/
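The memory-profiler package above gives line-by-line figures, but as a dependency-free sketch, Python's stdlib `tracemalloc` can bracket a preprocessing loop to capture peak allocation. The `preprocess_subject` function here is a hypothetical stand-in for the real per-subject work:

```python
import tracemalloc

def preprocess_subject(n_voxels):
    # Hypothetical stand-in for per-subject preprocessing:
    # allocate a buffer proportional to the image size.
    return [0.0] * n_voxels

tracemalloc.start()
for _ in range(10):
    data = preprocess_subject(100_000)
    del data  # release per-subject buffers promptly
current, peak = tracemalloc.get_traced_memory()
tracemalloc.stop()
print(f"peak traced memory: {peak / 1e6:.1f} MB")
```

If peak memory grows with the number of subjects rather than staying flat per iteration, that would point at buffers being retained across subjects, which is consistent with the validation CSV (~180 subjects) succeeding while the training CSV (~800 subjects) hits OOM.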