
Large memory usage during segmentation #6982

Closed
MattTheCuber opened this issue May 24, 2023 · 5 comments
Labels
Type: Bug Something isn't working correctly

Comments

@MattTheCuber
Contributor

MattTheCuber commented May 24, 2023

Summary

When segmenting a volume, memory usage can become extremely high compared to the actual volume size. I created only 3 labels on a 16GB dataset. After drawing just a few strokes for each label (using the sphere brush), Slicer's memory usage was over 100GB.

Steps to reproduce

  1. Install Slicer
  2. Open a DICOM image series (size used: 1832x1832x2399 - ~16GB)
  3. Open the Segment Editor module
  4. Create some labels
  5. Use the brush tool (sphere brush) and start drawing
  6. Watch the memory usage skyrocket

Screenshots and sample data unavailable due to CUI

Expected behavior

Even if each label doubled the memory usage (a voxel-for-voxel copy), it should not reach 100GB from only 3 labels. Let me know what additional logs/info I can provide.

Environment

  • Slicer version: Slicer-5.2.2-2023-04-27 - r31382 / fb46bd1
  • Operating system: Red Hat Enterprise Linux 8.6
@MattTheCuber MattTheCuber added the Type: Bug Something isn't working correctly label May 24, 2023
@MattTheCuber
Contributor Author

Note: closing the scene leaves Slicer consuming 38GB of memory.

@lassoan
Contributor

lassoan commented May 24, 2023

There are several potential sources of significant memory usage:

  • We create a copy of all modified segments after each modification operation for undo/redo. You can reduce the number of undo levels or disable undo if this is an issue.
  • If the segmentation's geometry is different from the source volume, we also make a copy of the source volume resampled to the segmentation's geometry.
  • Some more complex Segment Editor effects may also run filters that consume a significant amount of memory.

None of these are leaks, and while leaks are possible, I would not expect to find any major ones. It is more likely that some working buffers are not released immediately (e.g., we may keep the resampled source volume until another source volume is selected). We generally recommend having at least 10x more memory than the input image size, which seems to cover all the transient memory peaks, but I understand that for a very large image this may not always be feasible.

We can investigate whether the memory usage is abnormal if you can provide a reproducible workflow (preferably using Slicer sample data sets). The best would be a short Python script that reproduces the issue (all the examples you need should be available in the script repository).
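
A minimal sketch of such a reproduction script, which could be pasted into the Slicer Python console (this is only an illustration: it substitutes the bundled MRHead sample for the CUI dataset and the scripted Threshold effect for interactive sphere-brush painting, and the Segment Editor method names follow current script repository examples, so they may differ slightly between Slicer versions):

import resource
import SampleData
import slicer

def peak_rss_gb():
    # ru_maxrss is reported in kilobytes on Linux
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / (1024 ** 2)

# A bundled sample volume stands in for the (unavailable) CUI dataset
volumeNode = SampleData.downloadSample("MRHead")

segmentationNode = slicer.mrmlScene.AddNewNodeByClass("vtkMRMLSegmentationNode")
segmentationNode.CreateDefaultDisplayNodes()
segmentationNode.SetReferenceImageGeometryParameterFromVolumeNode(volumeNode)

segmentEditorWidget = slicer.qMRMLSegmentEditorWidget()
segmentEditorWidget.setMRMLScene(slicer.mrmlScene)
segmentEditorNode = slicer.mrmlScene.AddNewNodeByClass("vtkMRMLSegmentEditorNode")
segmentEditorWidget.setMRMLSegmentEditorNode(segmentEditorNode)
segmentEditorWidget.setSegmentationNode(segmentationNode)
segmentEditorWidget.setSourceVolumeNode(volumeNode)  # setMasterVolumeNode in older versions

# Create a few segments and modify each one so that undo states accumulate,
# then report the peak resident memory after every modification.
for i in range(3):
    segmentId = segmentationNode.GetSegmentation().AddEmptySegment("Segment_%d" % (i + 1))
    segmentEditorWidget.setCurrentSegmentID(segmentId)
    segmentEditorWidget.setActiveEffectByName("Threshold")
    effect = segmentEditorWidget.activeEffect()
    effect.setParameter("MinimumThreshold", str(20 + 10 * i))
    effect.setParameter("MaximumThreshold", "300")
    effect.self().onApply()
    print("Peak RSS after segment %d: %.1f GB" % (i + 1, peak_rss_gb()))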

I would also recommend discussing this topic on the Slicer Forum, for example here: https://discourse.slicer.org/t/large-data-support-in-3d-slicer/15560 - describe what kind of data you work on, what your high-level requirements are, and what issues or limitations you encounter in Slicer, then see how others address them and what fixes or improvements should be implemented.

@lassoan lassoan changed the title Large memory leak Large memory usage during segmentation May 24, 2023
@MattTheCuber
Contributor Author

@lassoan Thank you very much for the response. These tips are very helpful.

You mentioned changing the undo levels, could you elaborate on how to do this?

@lassoan
Contributor

lassoan commented May 24, 2023

You can reduce the number of stored undo states by typing this into the Python console:

slicer.modules.SegmentEditorWidget.editor.maximumNumberOfUndoStates = 3

The default is 10. You can set it to 0 to completely disable undo/redo.
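
For example (an illustrative console session; the default value and the 0-to-disable behavior are as described above):

editor = slicer.modules.SegmentEditorWidget.editor
print(editor.maximumNumberOfUndoStates)   # default: 10
editor.maximumNumberOfUndoStates = 0      # 0 disables undo/redo entirely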

@MattTheCuber
Contributor Author

Changing the undo levels made a significant difference; max memory usage is now around 80GB (5x the dataset size). This still seems excessive, but it is now usable. The recommended 10x the dataset size in memory also seems very excessive. It would make sense for Slicer to recognize high-memory volumes and make corrections, such as adjusting the undo level. If you are correct that it will use up to 10x the memory of the dataset, then you are also correct that this is not a memory leak. Our system has 128GB of RAM; I would need double that to test whether there is an actual memory leak or just high usage. I will test some more on the Slicer example datasets to see if I can exceed 10x the dataset's size in memory. I'll also create/comment on the Slicer Forum describing this case in more detail and some of the features that would be useful for this situation. Thank you for your time!
