This repository provides the official MATLAB demo for the following papers:
Fine-tuning Convolutional Neural Networks for Biomedical Image Analysis: Actively and Incrementally
Zongwei Zhou<sup>1</sup>, Jae Shin<sup>1</sup>, Lei Zhang<sup>1</sup>, Suryakanth Gurudu<sup>2</sup>, Michael B. Gotway<sup>2</sup>, and Jianming Liang<sup>1</sup>
<sup>1</sup>Arizona State University, <sup>2</sup>Mayo Clinic
The IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2017
paper | code | slides | poster
Active, Continual Fine Tuning of Convolutional Neural Networks for Reducing Annotation Efforts
Zongwei Zhou<sup>1</sup>, Jae Shin<sup>1</sup>, Suryakanth Gurudu<sup>2</sup>, Michael B. Gotway<sup>2</sup>, and Jianming Liang<sup>1</sup>
<sup>1</sup>Arizona State University, <sup>2</sup>Mayo Clinic
Medical Image Analysis (MedIA)
paper | code | slides
If you use this code for your research, please cite our papers:
@inproceedings{zhou2017fine,
  title={Fine-tuning convolutional neural networks for biomedical image analysis: actively and incrementally},
  author={Zhou, Zongwei and Shin, Jae and Zhang, Lei and Gurudu, Suryakanth and Gotway, Michael and Liang, Jianming},
  booktitle={Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition},
  pages={7340--7351},
  year={2017},
  url={http://openaccess.thecvf.com/content_cvpr_2017/papers/Zhou_Fine-Tuning_Convolutional_Neural_CVPR_2017_paper.pdf}
}

@article{zhou2018active,
  title={Active, Continual Fine Tuning of Convolutional Neural Networks for Reducing Annotation Efforts},
  author={Zhou, Zongwei and Shin, Jae Y and Gurudu, Suryakanth R and Gotway, Michael B and Liang, Jianming},
  journal={arXiv preprint arXiv:1802.00912},
  year={2018}
}

@phdthesis{zhou2021towards,
  title={Towards Annotation-Efficient Deep Learning for Computer-Aided Diagnosis},
  author={Zhou, Zongwei},
  year={2021},
  school={Arizona State University}
}
This research has been supported in part by ASU and Mayo Clinic through a Seed Grant and an Innovation Grant, and in part by the NIH under Award Number R01HL128785. The content is solely the responsibility of the authors and does not necessarily represent the official views of the NIH. This is a patent-pending technology.