
A few problems encountered while running the Docker container #1

Closed
denbonte opened this issue Aug 2, 2023 · 4 comments

denbonte commented Aug 2, 2023

Dear developers,

Thank you for sharing this resource, and congrats on the publication in Radiology!


I just tested the Docker container you folks built and shared, and I wanted to ask if you could fix a couple of things that would make the tool much easier to use!

The input NIfTI file(s) get deleted after the run

First of all, this was the output of the run:

--- STEP 1 - Prepare Data ---

Number of CT images: 1
Convert to std orientation (image.nii.gz):
/app/tools/c3d /Input/NIFTI/image.nii.gz -swapdim RPI -o /Output/Temp/convert2std/image.std.c3d.nii.gz
ln -sf /Output/Temp/convert2std/image.std.c3d.nii.gz /Output/Temp/ct_std/image.nii.gz

--- STEP 2 - Preprocessing ---

Get lung mask

100%|██████████████████████████████████████████████████████████████████████| 1/1 [00:20<00:00, 20.33s/it]

Vertebral level prediction

Generate record (#CPU=12): 100%|███████████████████████████████████████████| 1/1 [00:00<00:00, 10.52it/s]
Extract lung region (3D) (#CPU=12): 100%|████████████████████████████████| 1/1 [00:00<00:00, 4165.15it/s]
Intensity normalization (#CPU=12): 100%|█████████████████████████████████| 1/1 [00:00<00:00, 4865.78it/s]
Resample (#CPU=12): 100%|████████████████████████████████████████████████| 1/1 [00:00<00:00, 4266.84it/s]
Get lung volume (#CPU=12): 100%|█████████████████████████████████████████| 1/1 [00:00<00:00, 4655.17it/s]
Generate body mask (#CPU=12): 100%|██████████████████████████████████████| 1/1 [00:00<00:00, 7489.83it/s]
Generate model input data (#CPU=12): 100%|███████████████████████████████| 1/1 [00:00<00:00, 1446.31it/s]
Vertebral level prediction: 100%|██████████████████████████████████████████| 1/1 [00:01<00:00,  1.65s/it]
Generate combined png files for QA (#CPU=12): 100%|██████████████████████| 1/1 [00:00<00:00, 4169.29it/s]

Get field-of-view mask

Generate field-of-view mask (#CPU=12): 100%|█████████████████████████████| 1/1 [00:00<00:00, 5540.69it/s]

--- STEP 3 - BC assessment ---

Run BC inference: 100%|████████████████████████████████████████████████████| 1/1 [00:02<00:00,  2.62s/it]
Generate combined png file for QA (#CPU=1): 100%|██████████████████████████| 1/1 [00:01<00:00,  1.62s/it]

--- STEP 4 - Generate result ---

Get BC measurements:   0%|                                                         | 0/1 [00:00<?, ?it/s][W pthreadpool-cpp.cc:90] Warning: Leaking Caffe2 thread-pool after fork. (function pthreadpool)
Get BC measurements: 100%|█████████████████████████████████████████████████| 1/1 [00:00<00:00,  4.30it/s]
Generate pdf report (#CPU=12): 100%|█████████████████████████████████████| 1/1 [00:00<00:00, 6898.53it/s]
Generate png report (#CPU=12): 100%|█████████████████████████████████████| 1/1 [00:00<00:00, 4899.89it/s]
Delete temporary nii.gz files.
find /Output -name "*.nii.gz" -type f -delete

As you can imagine from the last line:

Delete temporary nii.gz files.
find /Output -name "*.nii.gz" -type f -delete

The NIfTI data I ran the container on was wiped from my system. This means that if I wanted to run the analysis for 10,000 patients, I would have to redo the DICOM conversion each time, since I could use each set of NIfTI files only once!

I tried to fix this by mounting the input directory/volume as read-only, using -v $INPUT_DIR:/Input:ro as one normally does in Docker, but this makes the run crash:

--- STEP 4 - Generate result ---

Get BC measurements:   0%|                                                         | 0/1 [00:00<?, ?it/s][W pthreadpool-cpp.cc:90] Warning: Leaking Caffe2 thread-pool after fork. (function pthreadpool)
Get BC measurements: 100%|█████████████████████████████████████████████████| 1/1 [00:00<00:00,  5.04it/s]
Generate pdf report (#CPU=12): 100%|█████████████████████████████████████| 1/1 [00:00<00:00, 3214.03it/s]
Traceback (most recent call last):
  File "/app/src/run_combined.py", line 38, in <module>
    steps.generate_results(config_preprocess, config_inference)
  File "/app/src/steps.py", line 113, in generate_results
    output_dir=os.path.join(project_out_root, 'Report_pdf'))
  File "/app/src/Utils/result_measurement.py", line 272, in generate_report
    desc=f'Generate pdf report (#CPU={self.n_jobs})'))
  File "/app/env/lib/python3.7/site-packages/joblib/parallel.py", line 1056, in __call__
    self.retrieve()
  File "/app/env/lib/python3.7/site-packages/joblib/parallel.py", line 935, in retrieve
    self._output.extend(job.get(timeout=self.timeout))
  File "/app/env/lib/python3.7/multiprocessing/pool.py", line 657, in get
    raise self._value
  File "/app/env/lib/python3.7/multiprocessing/pool.py", line 121, in worker
    result = (True, func(*args, **kwds))
  File "/app/env/lib/python3.7/site-packages/joblib/_parallel_backends.py", line 595, in __call__
    return self.func(*args, **kwargs)
  File "/app/env/lib/python3.7/site-packages/joblib/parallel.py", line 263, in __call__
    for func, args, kwargs in self.items]
  File "/app/env/lib/python3.7/site-packages/joblib/parallel.py", line 263, in <listcomp>
    for func, args, kwargs in self.items]
  File "/app/src/Utils/result_measurement.py", line 264, in process_single_case
    report_generator.draw_report(out_pdf)
  File "/app/src/Utils/pdf_report.py", line 486, in draw_report
    height=config_snapshot['fov_seg_height']
  File "/app/env/lib/python3.7/site-packages/reportlab/platypus/flowables.py", line 421, in __init__
    if not fp and os.path.splitext(filename)[1] in ['.jpg', '.JPG', '.jpeg', '.JPEG']:
  File "/app/env/lib/python3.7/posixpath.py", line 122, in splitext
    p = os.fspath(p)
TypeError: expected str, bytes or os.PathLike object, not NoneType

(Note that I ran the pipeline on the same data as before, so I highly doubt this is caused by something else!)
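In the meantime I'm working around it by staging disposable copies of the inputs, so the cleanup step can only ever touch the copies. A minimal sketch of the idea (paths are hypothetical and `s-efov-bc:latest` is a placeholder image tag; the docker line is commented out and the cleanup is simulated with the same `find` command from the log):

```shell
# Hypothetical paths; "s-efov-bc:latest" is a placeholder image tag.
ORIG_DIR=$(mktemp -d)     # stands in for the real DICOM-derived NIfTI folder
SCRATCH_DIR=$(mktemp -d)  # disposable staging area, mounted as /Input
touch "$ORIG_DIR/image.nii.gz"            # stand-in input file
cp "$ORIG_DIR"/*.nii.gz "$SCRATCH_DIR"/   # the container only ever sees copies

# docker run --rm -v "$SCRATCH_DIR":/Input -v "$OUT_DIR":/Output s-efov-bc:latest

# Simulate the container's cleanup step against the staging area only:
find "$SCRATCH_DIR" -name "*.nii.gz" -type f -delete

ls "$ORIG_DIR"   # the original image.nii.gz survives
```

This avoids re-running the DICOM conversion, but it still doubles the I/O per run, so a fix in the container itself would be much nicer.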

Only users with root privileges can handle the pipeline output

When running the pipeline on a Linux system, all of the output files are saved as root, which makes handling the results afterwards a bit cumbersome. This is a well-documented issue with Docker on Linux (to avoid it, the container needs to be run with the host user's ID and group ID). Alternatively, you could change the file permissions inside the container (the Python equivalent of chmod 777 on whatever the pipeline exports).

Temp folder

Last but not least, I couldn't help but notice that a fairly large Temp folder is exported with every run: is this intended to be exported, or...?


I would really like to use and cite this tool in some of my research projects, so it would be amazing if you could look into this 🙃

Thank you for your time,
Dennis.


kwxu (Collaborator) commented Aug 28, 2023

Hi, thanks so much for the suggestions! We released a new version of the Docker image (v1.0.3). It includes multiple updates that should hopefully resolve the issues you mentioned.

Files under Temp are all intermediate files. You won't need them if the goal is only to get the body composition values; they are useful for debugging when something goes wrong, and some of them (the generated lung, body, and FOV masks, etc.) can be reused for other purposes. The updated Docker routine now removes the entire Temp folder after the run by default; users can pass --keep_temp to keep those files. This update should also help avoid deleting the input NIfTIs when /Input and /Output happen to share the same host location. Check the updated instructions for more details.

If the cohort is huge (like NLST), good practice would be to run in batches, e.g., with each batch under 200 scans, so the temp files can be cleaned up in a timely manner.
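A rough sketch of that batching, using stand-in data (all paths are hypothetical, and the commented docker line with the `s-efov-bc:v1.0.3` tag is a placeholder for the real invocation described in the instructions):

```shell
# Rough sketch with stand-in data; paths and image tag are placeholders.
COHORT_DIR=$(mktemp -d)               # stands in for the full cohort
for i in 1 2 3 4 5; do touch "$COHORT_DIR/scan_$i.nii.gz"; done

BATCH_SIZE=2                          # use ~200 for a real cohort
LIST_DIR=$(mktemp -d)
ls "$COHORT_DIR"/*.nii.gz | split -l "$BATCH_SIZE" - "$LIST_DIR/batch_"

for list in "$LIST_DIR"/batch_*; do
  BATCH_DIR=$(mktemp -d)              # mounted as /Input for this batch
  xargs -I{} cp {} "$BATCH_DIR"/ < "$list"
  # docker run --rm -v "$BATCH_DIR":/Input -v "$OUT_DIR":/Output s-efov-bc:v1.0.3
  echo "$list: $(ls "$BATCH_DIR" | wc -l) scan(s)"
done
```

Each batch runs to completion (and its Temp folder is cleaned up) before the next one starts.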

The file permission problem can be solved by passing -u "$(id -u):$(id -g)" when launching the Docker container. This is mentioned in this blog. An example is provided in the bash script that comes with the updated TCIA demo dataset package.
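For reference, a minimal invocation along those lines would look like the following (host paths and the `s-efov-bc:v1.0.3` tag are placeholders; the command is echoed rather than executed here):

```shell
# Placeholder host paths and image tag; the key piece is the -u flag,
# which makes files written to /Output owned by the invoking host user
# rather than root. Echoed for illustration; drop the echo to run it.
RUN_AS="$(id -u):$(id -g)"
CMD="docker run --rm -u $RUN_AS -v /data/input:/Input -v /data/output:/Output s-efov-bc:v1.0.3"
echo "$CMD"
```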

Hope this helps!

@denbonte (Author)

Thank you, Kaiwen, for getting back to me!

I will definitely give this a try :-)

@JYI-CSHS

Hi, I tried to reproduce the inference results using the provided examples from TCIA on a Windows machine. I encountered some issues when executing extract_lung_region_3D(...). The error seems to occur when c3d is executed via the command line, but I'm not sure of the root cause.

Command line for executing c3d: D:\S-EFOV_refineSPECT_BC\tools\c3d D:\TCIA_demo\Output\Temp\ct_std\COVID-19-AR-16406490.nii.gz -region 87x130x10vox 336x221x522vox -o D:\TCIA_demo\Output\Temp\ct_crop3D\COVID-19-AR-16406490.nii.gz

My questions are:

  • Is there a way to get around the error?
  • What does c3d do in the extract_lung_region_3D(...) function? Does it only crop the input D:\TCIA_demo\Output\Temp\ct_std\COVID-19-AR-16406490.nii.gz to get the 3D lung region?
  • What's the output? Is it the 3D lung region or the CT scan without the lung region?
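My tentative reading, in case it helps (this is just me unpacking the logged command, not the pipeline's code, so corrections welcome): c3d's -region flag takes a voxel origin and a size and extracts that sub-volume, so the output should be the CT cropped to the lung bounding box, not the scan with the lungs removed. I also suspect the "is not recognized" error simply means cmd.exe cannot execute the bundled c3d (presumably a Linux binary), so running inside the Docker container or WSL might avoid it.

```shell
# Unpacking "-region 87x130x10vox 336x221x522vox" from the log:
# origin (first triple) plus size (second triple) gives, per axis,
# the inclusive voxel range the crop keeps.
ORIGIN="87x130x10"
SIZE="336x221x522"
for axis in 1 2 3; do
  o=$(echo "$ORIGIN" | cut -dx -f"$axis")
  s=$(echo "$SIZE" | cut -dx -f"$axis")
  echo "axis $axis: keeps voxels $o..$((o + s - 1))"
done
```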

Thanks

-----------------------------------------log info---------------------------------------------------

Vertebral level prediction

Generate record (#CPU=1): 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 4/4 [00:03<00:00, 1.10it/s]
Extract lung region (3D) (#CPU=1): 0%| | 0/4 [00:00<?, ?it/s]cmd_str: D:\S-EFOV_refineSPECT_BC\tools\c3d D:\TCIA_demo\Output\Temp\ct_std\COVID-19-AR-16406490.nii.gz -region 87x130x10vox 336x221x522vox -o D:\TCIA_demo\Output\Temp\ct_crop3D\COVID-19-AR-16406490.nii.gz
'D:\S-EFOV_refineSPECT_BC\tools\c3d' is not recognized as an internal or external command,
operable program or batch file.
Extract lung region (3D) (#CPU=1): 25%|██████████████████████████████████████ | 1/4 [00:00<00:02, 1.06it/s]cmd_str: D:\S-EFOV_refineSPECT_BC\tools\c3d D:\TCIA_demo\Output\Temp\ct_std\COVID-19-AR-16406503.nii.gz -region 64x115x131vox 361x253x495vox -o D:\TCIA_demo\Output\Temp\ct_crop3D\COVID-19-AR-16406503.nii.gz
'D:\S-EFOV_refineSPECT_BC\tools\c3d' is not recognized as an internal or external command,
operable program or batch file.
Extract lung region (3D) (#CPU=1): 50%|████████████████████████████████████████████████████████████████████████████ | 2/4 [00:02<00:02, 1.02s/it]cmd_str: D:\S-EFOV_refineSPECT_BC\tools\c3d D:\TCIA_demo\Output\Temp\ct_std\COVID-19-AR-16424081.nii.gz -region 90x128x1vox 366x219x328vox -o D:\TCIA_demo\Output\Temp\ct_crop3D\COVID-19-AR-16424081.nii.gz
'D:\S-EFOV_refineSPECT_BC\tools\c3d' is not recognized as an internal or external command,
operable program or batch file.
Extract lung region (3D) (#CPU=1): 75%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████ | 3/4 [00:02<00:00, 1.22it/s]cmd_str: D:\S-EFOV_refineSPECT_BC\tools\c3d D:\TCIA_demo\Output\Temp\ct_std\COVID-19-AR-16445138.nii.gz -region 81x171x143vox 353x228x489vox -o D:\TCIA_demo\Output\Temp\ct_crop3D\COVID-19-AR-16445138.nii.gz
'D:\S-EFOV_refineSPECT_BC\tools\c3d' is not recognized as an internal or external command,
operable program or batch file.
Extract lung region (3D) (#CPU=1): 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 4/4 [00:03<00:00, 1.08it/s]
Intensity normalization (#CPU=1): 0%| | 0/4 [00:00<?, ?it/s]Something wrong with COVID-19-AR-16406490.nii.gz
Something wrong with COVID-19-AR-16406503.nii.gz

kwxu mentioned this issue Mar 25, 2024
@BennettLandman

I am closing the older bug reports as these were missed. We are now better tracking reports across the organization. Please re-open if this continues to be a blocker.
