MIDeepSeg: Minimally Interactive Segmentation of Unseen Objects from Medical Images Using Deep Learning [MedIA or Arxiv] and [Demo]
This repository provides a 2D interactive segmentation method for medical image segmentation and annotation.
-
This project was originally developed for our previous work MIDeepSeg. If you find it useful for your research, please consider citing the following:
@article{luo2021mideepseg,
  title={MIDeepSeg: Minimally interactive segmentation of unseen objects from medical images using deep learning},
  author={Luo, Xiangde and Wang, Guotai and Song, Tao and Zhang, Jingyang and Aertsen, Michael and Deprest, Jan and Ourselin, Sebastien and Vercauteren, Tom and Zhang, Shaoting},
  journal={Medical Image Analysis},
  volume={72},
  pages={102102},
  year={2021},
  publisher={Elsevier}
}
A visualization comparison of different distance transform methods, following GeodisTK.
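To make the comparison concrete, the sketch below computes a 2D geodesic distance transform with Dijkstra's algorithm on the pixel grid. This is an illustrative pure-Python stand-in, not the native GeodisTK implementation (which uses fast raster-scan and fast-marching schemes); the function name and the edge-cost weighting `lamb` are our own choices here. With `lamb=0` it reduces to a purely spatial (city-block) distance, and with `lamb=1` the distance is driven entirely by intensity differences.

```python
import heapq
import numpy as np

def geodesic_distance_2d(image, seeds, lamb=1.0):
    """Dijkstra-based 2D geodesic distance from a set of seed pixels.

    The cost of stepping between 4-neighbors p and q mixes a unit
    spatial term with the intensity difference:
        cost = (1 - lamb) * 1 + lamb * |I(p) - I(q)|
    (illustrative sketch; GeodisTK computes this natively and faster).
    """
    h, w = image.shape
    dist = np.full((h, w), np.inf)
    heap = []
    for (r, c) in seeds:
        dist[r, c] = 0.0
        heapq.heappush(heap, (0.0, r, c))
    while heap:
        d, r, c = heapq.heappop(heap)
        if d > dist[r, c]:
            continue  # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w:
                step = (1 - lamb) + lamb * abs(float(image[nr, nc]) - float(image[r, c]))
                nd = d + step
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    heapq.heappush(heap, (nd, nr, nc))
    return dist
```

Because the intensity term penalizes crossing strong edges, pixels on the same side of a boundary as the seed stay close in geodesic distance even when they are spatially far away, which is what makes this transform useful for encoding user clicks.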
Before you can use this package for image segmentation, you need:
- PyTorch (version >= 1.0.1)
- Some common Python packages, such as NumPy, Pandas, SimpleITK, OpenCV, PyQt5, and SciPy
- GeodisTK, for geodesic distance transformation
- SimpleCRF, for interactive refinement
1. Install the required libraries:
pip install -r requirements.txt
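If the GUI fails to start, a quick way to find which dependency is missing is to probe the import machinery. The sketch below is a hypothetical helper, not part of this repository; the listed module names are assumptions about how the requirements above import (e.g. `opencv-python` imports as `cv2`, and the SimpleCRF package is assumed to expose a `maxflow` module).

```python
import importlib.util

def check_deps(names):
    """Return the subset of module names that cannot be imported."""
    return [n for n in names if importlib.util.find_spec(n) is None]

# Import names assumed from the requirements list; the actual
# requirements.txt may spell some of these differently.
required = ["numpy", "pandas", "SimpleITK", "cv2", "PyQt5", "scipy",
            "GeodisTK", "maxflow"]
missing = check_deps(required)
if missing:
    print("Missing packages:", ", ".join(missing))
```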
2. Launch the GUI:
cd mideepseg
python main.py
-
Load an image for segmentation. First, click a few edge points with the left mouse button to provide the initial interactions, then click the Segmentation button to obtain an initial segmentation. To refine it, use the left mouse button to add clicks in under-segmented regions and the right mouse button to add clicks in over-segmented regions, then click the Refinement button; the segmentation will be updated according to these interactions.
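Under the hood, such clicks are typically turned into extra input channels for the network. The sketch below shows a simplified Euclidean variant of this encoding; the actual MIDeepSeg method uses an exponentialized geodesic distance (EGD) that also accounts for image intensities, so the function name and shapes here are illustrative assumptions only.

```python
import numpy as np

def click_distance_map(shape, clicks):
    """Euclidean distance from each pixel to its nearest click.

    Simplified stand-in for the exponentialized geodesic distance
    cue maps used in MIDeepSeg (the real method replaces the
    Euclidean term with a geodesic one).
    """
    h, w = shape
    rr, cc = np.mgrid[0:h, 0:w]          # pixel coordinate grids
    pts = np.asarray(clicks, dtype=float)  # (N, 2) as (row, col)
    d = np.sqrt((rr[..., None] - pts[:, 0]) ** 2 +
                (cc[..., None] - pts[:, 1]) ** 2)
    return d.min(axis=-1)                # nearest click per pixel

# Exponentialize so clicked pixels have value 1, decaying with distance;
# this map can be stacked with the image as an extra network channel.
cue = np.exp(-click_distance_map((64, 64), [(10, 10), (40, 50)]))
```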
-
Note that the pretrained model was trained only on placenta MR-T2 data.
- We thank the authors of Deep_Extreme_Cut, DeepIGeoS and BIFSeg for their elegant and efficient code bases!
- This project was designed for academic research, not for clinical or commercial use, as it is protected by a patent. For commercial use, please contact Prof. Guotai Wang.