Large memory usage during segmentation #6982
Comments
Note: closing the scene leaves Slicer consuming 38 GB of memory.
There are several potential sources of significant memory usage.
None of these are leaks, and while possible, I would not expect to find any major leaks. It is more likely that some working buffers are not immediately released (e.g., we may keep the resampled source volume until another source volume is selected). We generally recommend having at least 10x more memory than the input image size, which seems to cover all the transient high memory usage, but I can understand that for a very large image this may not always be feasible.

We can investigate whether the memory usage is abnormal if you can provide a reproducible workflow (preferably using Slicer sample data sets). Best of all would be a short Python script that reproduces the issue (all the examples you need should be available in the script repository).

I would also recommend discussing this topic on the Slicer Forum, for example here: https://discourse.slicer.org/t/large-data-support-in-3d-slicer/15560 - describe what kind of data you work with, what your high-level requirements are, and what issues or limitations you encounter in Slicer; then see how others address them and what fixes or improvements should be implemented.
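To make a reproduction report more useful, it helps to log the process's peak memory high-water mark around each step. A minimal, Unix-only sketch using Python's standard-library `resource` module (the allocation below is just a stand-in workload, not anything Slicer-specific):

```python
import resource

def peak_rss_mb() -> float:
    """Return this process's peak resident set size in MiB.
    Note: ru_maxrss is reported in kilobytes on Linux, bytes on macOS."""
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1024.0

# Call before and after a memory-heavy step to see the high-water mark.
before = peak_rss_mb()
data = bytearray(50 * 1024 * 1024)  # allocate ~50 MiB as a stand-in workload
after = peak_rss_mb()
print(f"peak RSS grew from {before:.0f} MiB to {after:.0f} MiB")
```

Because peak RSS only ever grows, sampling it after each editing operation (paint, undo, effect apply) shows which step drives the maximum, which is exactly the data point needed to tell transient working buffers apart from a real leak.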
@lassoan Thank you very much for the response. These tips are very helpful. You mentioned changing the undo levels; could you elaborate on how to do this?
You can reduce the number of stored undo states by typing this into the Python console: `slicer.modules.SegmentEditorWidget.editor.maximumNumberOfUndoStates = 3`. The default is 10. You can set it to 0 to disable undo/redo completely.
Changing the undo levels made a significant difference: max memory usage is now around 80 GB (5x the dataset size). This still seems excessive, but it is now usable. The recommendation to have 10x more memory than the dataset also seems very excessive. It would make sense for Slicer to recognize high-memory volumes and make adjustments automatically, such as reducing the undo level.

If you are correct that it will use up to 10x the memory of the dataset, then you are also correct that this is not a memory leak. Our system has 128 GB of RAM; I would need double that to test whether there is an actual memory leak or just high usage. I will test some more on the Slicer example datasets to see if I can exceed 10x the dataset's size in memory. I'll also create/comment on the Slicer Forum describing this case in more detail and some of the features that would be useful for this situation. Thank you for your time!
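A back-of-the-envelope estimate shows how a 16 GB volume can plausibly reach this range even without a leak. All the numbers below are illustrative assumptions about buffer sizes, not measurements of Slicer's internals:

```python
# Rough estimate of transient memory during segmentation of a large volume.
# Every figure here is an assumption for illustration, not a Slicer internal.

source_gb = 16.0                 # input volume loaded in memory
resampled_copy_gb = source_gb    # working copy resampled to segmentation geometry
labelmap_gb = source_gb / 2      # labelmap: e.g. 1-byte labels vs. 2-byte source voxels
undo_states = 10                 # Segment Editor default
undo_gb = undo_states * labelmap_gb  # worst case: each undo state snapshots the labelmap

total_gb = source_gb + resampled_copy_gb + labelmap_gb + undo_gb
print(f"estimated peak: {total_gb:.0f} GB ({total_gb / source_gb:.1f}x the source)")
```

Under these assumptions the undo stack alone accounts for 80 GB, and the total lands around 120 GB (7.5x the source), which is consistent with both the ~100 GB observation at 10 undo states and the drop to ~80 GB after reducing them.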
Summary
When segmenting a volume, memory usage can reach extremely high levels compared to the actual volume size. I created only 3 labels on a 16 GB dataset. After drawing just a few strokes for each label (using the sphere brush), Slicer's memory usage was over 100 GB.
Steps to reproduce
Screenshots and sample data are unavailable due to CUI restrictions.
Expected behavior
Even if each label doubled the memory usage (a voxel-for-voxel copy), it should not reach 100 GB from only 3 labels. Let me know of any more logs/info I can provide.
Environment