
SwIPE: Efficient and Robust Medical Image Segmentation with Implicit Patch Embeddings

SwIPE: Efficient and Robust Medical Image Segmentation with Implicit Patch Embeddings, MICCAI 2023. [MICCAI] [arXiv] [PDF]
Charley Y. Zhang, Pengfei Gu, Nishchal Sapkota, Danny Z. Chen
@inproceedings{zhang2023swipe,
  title        = {{SwIPE}: Efficient and Robust Medical Image Segmentation with Implicit Patch Embeddings},
  author       = {Zhang, Charley Y. and Gu, Pengfei and Sapkota, Nishchal and Chen, Danny Z.},
  booktitle    = {International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI)},
  pages        = {315--326},
  year         = {2023},
  organization = {Springer}
}
Key Ideas & Main Findings

SwIPE (Segmentation with Implicit Patch Embeddings) is a medical image segmentation approach that uses implicit neural representations (INRs) to learn continuous shape representations, rather than the discrete raster representations commonly adopted by modern methods (e.g., CNNs, Transformers, or combinations of both).

  1. Patch-based Implicit Neural Representations (INRs): SwIPE is the first approach to leverage patch-based INRs for medical image segmentation. This methodology enables both accurate local boundary delineation and global shape coherence while moving away from discrete raster representations.
  2. Efficiency and Robustness: Through extensive evaluations, SwIPE outperforms state-of-the-art methods on both 2D polyp segmentation and 3D abdominal organ segmentation tasks. Notably, SwIPE achieves these results with over 10x fewer parameters, showcasing exceptional model efficiency. Additionally, SwIPE exhibits superior robustness to data shifts across image resolutions and datasets.
  3. Augmented Contextual Understanding with Multi-stage Embedding Attention (MEA) and Stochastic Patch Overreach (SPO): MEA dynamically attends over multi-stage features to improve contextual understanding during encoding, while SPO addresses boundary continuity across neighboring patches during occupancy decoding, leading to more accurate and coherent segmentation results.
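The occupancy-decoding idea above can be sketched in a few lines: a decoder MLP maps a continuous query coordinate, conditioned on the embedding of its enclosing patch, to an occupancy probability. The module below is a minimal, hypothetical illustration (names, dimensions, and architecture are illustrative assumptions, not the repository's actual modules):

```python
import torch
import torch.nn as nn

class PatchOccupancyDecoder(nn.Module):
    """Minimal sketch of a patch-based implicit decoder: maps a continuous
    2D coordinate plus its patch embedding to an occupancy probability.
    Hypothetical example; not the actual SwIPE decoder."""
    def __init__(self, embed_dim=128, hidden=256):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(2 + embed_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, coords, patch_embed):
        # coords: (N, 2) continuous (x, y) positions relative to the patch
        # patch_embed: (N, embed_dim) embedding of each point's patch
        x = torch.cat([coords, patch_embed], dim=-1)
        return torch.sigmoid(self.mlp(x))  # (N, 1) occupancy in (0, 1)

decoder = PatchOccupancyDecoder()
coords = torch.rand(16, 2) * 2 - 1   # 16 query points in [-1, 1]^2
embeds = torch.randn(16, 128)        # their patch embeddings
occ = decoder(coords, embeds)        # (16, 1) occupancy probabilities
```

Because the decoder takes continuous coordinates, the segmentation can be queried at arbitrary resolution, which is what decouples the model from a fixed raster grid.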

View the full poster (PDF)

Training and Testing

Environment Setup

We used Python 3.9 for our virtual environment. We recommend creating a conda environment and installing the necessary packages with that environment's pip.

conda create --name swipe python=3.9 
conda activate swipe
pip install click torch torchvision torchsummary einops albumentations monai dmt SimpleITK psutil matplotlib pandas jupyter

Next, clone this repository.

git clone https://github.com/charzharr/miccai23-swipe-implicit-segmentation.git
cd miccai23-swipe-implicit-segmentation/src

Finally, download the model weights and point data used for training & inference from this Google Drive location. The swipe.zip file is simply the compressed swipe folder. After uncompressing, move the 'artifacts' and 'data' folders into src/experiments/swipe (i.e., to src/experiments/swipe/artifacts and src/experiments/swipe/data). You can also do this from the command line:

# Ensure you're in the src directory
pip install gdown
gdown --fuzzy "https://drive.google.com/file/d/12bii96Z2J8YlgQVnRiuLwcKwSUE76AJS/view?usp=drive_link"
unzip swipe.zip

mv swipe/artifacts experiments/swipe/
mv swipe/data experiments/swipe/
rm -r swipe swipe.zip

Training

To train SwIPE, simply navigate to the src directory and run:

# Ensure you're in the src directory
python train.py --config swipe_sessile.yaml

Inference

A notebook for inference and prediction visualizations can be found in src/test.ipynb. Ensure that the artifacts folder is correctly placed in src/experiments/swipe and run all the cells in order. This notebook will then infer on the test set of the 2D sessile data and visualize 2 items: 1) the original image, local patch prediction, ground truth, and prediction errors (red indicates FP pixels and blue shows FN pixels), and 2) the variance map of each predicted pixel (by default the variance is computed from the predictions of the target point and the 8 neighboring points).
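The per-pixel variance map described above (variance over the target point and its 8 neighbors) can be computed along these lines; this is a hedged numpy sketch assuming a dense 2D array of per-point predictions, not the notebook's exact implementation:

```python
import numpy as np

def variance_map(pred, k=1):
    """For each pixel, variance of predictions over the (2k+1)x(2k+1)
    neighborhood -- with k=1, the target point plus its 8 neighbors.
    Illustrative sketch; the test notebook may differ in details."""
    h, w = pred.shape
    padded = np.pad(pred, k, mode="edge")  # replicate borders at the edges
    # Collect the 9 shifted views of the prediction grid (for k=1),
    # then take the per-pixel variance across them.
    views = [padded[dy:dy + h, dx:dx + w]
             for dy in range(2 * k + 1) for dx in range(2 * k + 1)]
    return np.var(np.stack(views, axis=0), axis=0)

pred = np.random.rand(64, 64)   # stand-in for per-point sigmoid outputs
vmap = variance_map(pred)       # (64, 64) uncertainty map
```

High values in the map flag locations where nearby continuous queries disagree, which typically concentrates along predicted boundaries.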

Custom Data Preparation

Notebooks for creating the 2D and 3D points for custom data ('create_points2d' and 'create_points3d') can be found in the data directory (download 'data' from the swipe folder on Google Drive).
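At its core, creating point supervision from a segmentation mask amounts to sampling continuous coordinates and reading off binary occupancy labels from the mask. The helper below is a minimal, hypothetical 2D version of that step (function name and sampling scheme are assumptions; the actual notebooks may differ, e.g., by oversampling near boundaries):

```python
import numpy as np

def sample_points_2d(mask, n_points=2048, seed=None):
    """Sample continuous (y, x) coordinates in [0, 1)^2 with binary
    occupancy labels taken from the nearest mask pixel.
    Illustrative sketch, not the repository's create_points2d notebook."""
    rng = np.random.default_rng(seed)
    h, w = mask.shape
    coords = rng.random((n_points, 2))               # continuous in [0, 1)
    ys = np.clip((coords[:, 0] * h).astype(int), 0, h - 1)
    xs = np.clip((coords[:, 1] * w).astype(int), 0, w - 1)
    labels = mask[ys, xs].astype(np.float32)         # occupancy per point
    return coords, labels

mask = np.zeros((32, 32), dtype=np.uint8)
mask[8:24, 8:24] = 1                                 # square foreground region
coords, labels = sample_points_2d(mask, n_points=512, seed=0)
```

These (coordinate, occupancy) pairs are what an implicit decoder trains against, so point files can be precomputed once per dataset.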

Acknowledgements, License & Usage


© This code is made available under the Commons Clause License for non-commercial academic purposes.