IPLC

👉 Requirements

Non-exhaustive list:

  • Python 3.9+
  • PyTorch 1.10.1
  • nibabel
  • SciPy
  • NumPy
  • scikit-image
  • PyYAML
  • tqdm
  • pandas
  • SimpleITK
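
These can be installed with pip; a minimal sketch (the CUDA-matched wheel for PyTorch 1.10.1 may instead require the install command from pytorch.org):

pip install torch==1.10.1 nibabel scipy numpy scikit-image pyyaml tqdm pandas SimpleITK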

👉 Usage

  1. Download the M&MS dataset and organize its directory structure as follows:
your/data_root/
    train/
        img/
            A/
                A0S9V9_0.nii.gz
                ...
            B/
            C/
            ...
        lab/
            A/
                A0S9V9_0_gt.nii.gz
                ...
            B/
            C/
            ...
    valid/
        img/
        lab/
    test/
        img/
        lab/

The network takes NIfTI (.nii.gz) files as input. The lab folders contain gray-scale ground-truth label images (*_gt.nii.gz), where each voxel's gray level is its class index (0, 1, ..., K).
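
As a quick sanity check of the layout and label encoding, a minimal sketch using nibabel and NumPy (the paths and case name are placeholders from the tree above, not repository code):

import os

import nibabel as nib
import numpy as np

data_root = "your/data_root"  # placeholder: your dataset root
img_path = os.path.join(data_root, "train/img/A/A0S9V9_0.nii.gz")
lab_path = os.path.join(data_root, "train/lab/A/A0S9V9_0_gt.nii.gz")

img = nib.load(img_path).get_fdata()  # image volume
lab = nib.load(lab_path).get_fdata()  # label volume; gray level = class index

assert img.shape == lab.shape, "image and label shapes must match"
print("classes present:", np.unique(lab).astype(int))  # expect 0, 1, ..., K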

  2. Download the SAM-Med2D model and place it in the "your_root/pretrain_model" directory of your project.

  3. Train the source model on the source domain. For instance, train on domain A of the M&MS dataset:

python train_source.py --config "./config/train2d_source.cfg"
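
This README does not show how train_source.py reads its configuration; as a rough sketch of the --config pattern the commands imply, assuming an INI-style .cfg parsed with configparser (the repository's actual parser and keys may differ):

import argparse
import configparser

parser = argparse.ArgumentParser()
parser.add_argument("--config", type=str, required=True, help="path to a .cfg file")
args = parser.parse_args()

cfg = configparser.ConfigParser()
cfg.read(args.config)  # parse INI-style sections from the .cfg file
for section in cfg.sections():  # show what was loaded
    print(section, dict(cfg[section]))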
  4. Adapt the source model to the target domain. For instance, adapt the source model to domain B of the M&MS dataset:

python adapt_mian.py --config "./config/adapt.cfg"
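
After adaptation, segmentation quality is typically reported as per-class Dice between the predictions and the *_gt.nii.gz labels. A minimal sketch (the file paths are placeholders and this helper is not part of the repository; M&MS labels three foreground classes, LV/MYO/RV):

import nibabel as nib
import numpy as np

def dice_per_class(pred, gt, num_classes):
    """Per-class Dice between two integer label volumes."""
    scores = []
    for c in range(1, num_classes + 1):  # class 0 is background
        p, g = pred == c, gt == c
        denom = p.sum() + g.sum()
        scores.append(2.0 * np.logical_and(p, g).sum() / denom if denom else 1.0)
    return scores

pred = nib.load("pred/B/case_0.nii.gz").get_fdata().astype(int)  # placeholder path
gt = nib.load("your/data_root/test/lab/B/case_0_gt.nii.gz").get_fdata().astype(int)  # placeholder path
print(dice_per_class(pred, gt, num_classes=3))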

🤝 Acknowledgement
