
ColonFormer: An Efficient Transformer based Method for Colon Polyp Segmentation

This repository contains the official PyTorch implementation of the training & evaluation code for ColonFormer.

Environment

  • Create a virtual environment in the terminal: conda create -n ColonFormer
  • Install CUDA 11.1 and PyTorch 1.7.1
  • Install the other requirements: pip install -r requirements.txt (a consolidated sketch of these steps is shown below)
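
The steps above could be combined roughly as follows. This is only a minimal sketch: the Python version and the exact pytorch/cudatoolkit pairing are assumptions, so check pytorch.org for the official build that matches your CUDA 11.1 installation.

conda create -n ColonFormer python=3.8   # Python version is an assumption
conda activate ColonFormer
# PyTorch 1.7.1; the cudatoolkit build here is an assumption -- pick the
# official build that matches your CUDA 11.1 driver
conda install pytorch==1.7.1 torchvision==0.8.2 cudatoolkit=11.0 -c pytorch
pip install -r requirements.txt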

Dataset

Download the necessary data:

  1. For Experiment 1 in our paper:
  2. For Experiment 2 and Experiment 3:

Training

Download the MiT weights pretrained on ImageNet-1K ( google drive | onedrive ) and put them in the folder pretrained/. Configure the hyper-parameters and run train.py to start training. For example:

python train.py --backbone b3 --train_path ./data/TrainDataset --train_save ColonFormerB3

Here is an example in Google Colab
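
As a hedged variant of the command above, assuming train.py accepts the other MiT backbone sizes (e.g. b2) through the same --backbone flag and that the matching pretrained weights sit in pretrained/, a smaller model could be trained like this:

mkdir -p pretrained   # put the downloaded MiT ImageNet-1K weights here
python train.py --backbone b2 --train_path ./data/TrainDataset --train_save ColonFormerB2

Judging from the evaluation example below, --train_save appears to name the checkpoint folder created under ./snapshots/.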

Evaluation

For evaluation, specify your backbone version, the weight path, and the dataset, then run test.py. For example:

python test.py --backbone b3 --weight ./snapshots/ColonFormerB3/last.pth --test_path ./data/TestDataset

We provide some pretrained weights in case you need them.
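
A hedged variant of the command above, evaluating the checkpoint produced by the b2 training example (the backbone and weight path are placeholders for whatever you trained or downloaded):

python test.py --backbone b2 --weight ./snapshots/ColonFormerB2/last.pth --test_path ./data/TestDataset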

Citation

If you find this code useful in your research, please consider citing:

@article{duc2022colonformer,
  title={Colonformer: An efficient transformer based method for colon polyp segmentation},
  author={Duc, Nguyen Thanh and Oanh, Nguyen Thi and Thuy, Nguyen Thi and Triet, Tran Minh and Dinh, Viet Sang},
  journal={IEEE Access},
  volume={10},
  pages={80575--80586},
  year={2022},
  publisher={IEEE}
}
@inproceedings{ngoc2021neounet,
  title={NeoUNet: Towards accurate colon polyp segmentation and neoplasm detection},
  author={Ngoc Lan, Phan and An, Nguyen Sy and Hang, Dao Viet and Long, Dao Van and Trung, Tran Quang and Thuy, Nguyen Thi and Sang, Dinh Viet},
  booktitle={Advances in Visual Computing: 16th International Symposium, ISVC 2021, Virtual Event, October 4-6, 2021, Proceedings, Part II},
  pages={15--28},
  year={2021},
  organization={Springer}
}
@article{thuan2023rabit,
  title={RaBiT: An Efficient Transformer using Bidirectional Feature Pyramid Network with Reverse Attention for Colon Polyp Segmentation},
  author={Thuan, Nguyen Hoang and Oanh, Nguyen Thi and Thuy, Nguyen Thi and Perry, Stuart and Sang, Dinh Viet},
  journal={arXiv preprint arXiv:2307.06420},
  year={2023}
}
