An Approach for Medical Image Arbitrary Scale Super-Resolution
In this work, we present an approach for medical image arbitrary-scale super-resolution (MIASSR), which couples meta-learning with generative adversarial networks (GANs) to super-resolve medical images at any magnification scale in (1, 4]. Compared with state-of-the-art (SOTA) single-image super-resolution (SISR) algorithms on single-modal magnetic resonance (MR) brain images (OASIS-brains) and multi-modal MR brain images (BraTS), MIASSR achieves comparable fidelity and the best perceptual quality with the smallest model size. We also employ transfer learning to enable MIASSR to tackle SR tasks on new medical modalities, such as cardiac MR images (ACDC) and chest computed tomography images (COVID-CT). The source code of our work is publicly available. Thus, MIASSR has the potential to become a new foundational pre-/post-processing step in clinical image analysis tasks such as reconstruction, image quality enhancement, and segmentation.
Figure: Framework of MIASSR.
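The key component enabling arbitrary scales is a meta-upscale module in the spirit of Meta-SR: instead of a fixed sub-pixel layer per integer scale, a small weight-prediction network maps each high-resolution pixel's fractional offset and the scale factor to convolution weights applied to the low-resolution feature maps, so a single model serves every scale in (1, 4]. The sketch below is a minimal, simplified illustration of this idea under our own assumptions (layer sizes, names, and shapes are hypothetical), not the exact MIASSR module:

# Minimal Meta-SR-style meta-upscale sketch (hypothetical, simplified;
# not the exact MIASSR implementation).
import torch
import torch.nn as nn

class MetaUpscale(nn.Module):
    def __init__(self, feat_dim=64, out_channels=1, ksize=3):
        super().__init__()
        self.feat_dim, self.out_channels, self.ksize = feat_dim, out_channels, ksize
        # Weight-prediction net: (dy, dx, 1/scale) -> conv weights per HR pixel.
        self.weight_net = nn.Sequential(
            nn.Linear(3, 256), nn.ReLU(inplace=True),
            nn.Linear(256, feat_dim * out_channels * ksize * ksize),
        )

    def forward(self, feats, scale):
        # feats: (B, C, h, w) LR feature maps with C == feat_dim; scale in (1, 4].
        b, c, h, w = feats.shape
        H, W = int(h * scale), int(w * scale)
        # Map every HR pixel back to an LR pixel plus a fractional offset.
        src_y = torch.arange(H, device=feats.device).float() / scale
        src_x = torch.arange(W, device=feats.device).float() / scale
        iy, ix = src_y.long(), src_x.long()
        dy, dx = src_y - iy.float(), src_x - ix.float()
        # Predict a kernel for each HR pixel from its offsets and the scale.
        rel = torch.stack([
            dy[:, None].expand(H, W),
            dx[None, :].expand(H, W),
            torch.full((H, W), 1.0 / scale, device=feats.device),
        ], dim=-1)                                   # (H, W, 3)
        weights = self.weight_net(rel.view(-1, 3))   # (H*W, C*out*k*k)
        # Gather the k x k LR patch around each mapped position.
        pad = self.ksize // 2
        patches = nn.functional.unfold(
            nn.functional.pad(feats, (pad,) * 4), self.ksize
        ).view(b, c * self.ksize ** 2, h, w)
        patches = patches[:, :, iy][:, :, :, ix]     # (B, C*k*k, H, W)
        patches = patches.permute(0, 2, 3, 1).reshape(b, H * W, -1)
        weights = weights.view(H * W, self.out_channels, -1)
        # Apply each predicted kernel at its HR pixel.
        out = torch.einsum('bnd,nod->bno', patches, weights)
        return out.permute(0, 2, 1).reshape(b, self.out_channels, H, W)

# Example: sr = MetaUpscale()(torch.randn(1, 64, 32, 32), 2.5)  # -> (1, 1, 80, 80)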
To set up:
git clone https://github.com/GinZhu/MIASSR.git
cd MIASSR
pip install -r requirements.txt
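After installation, a quick sanity check (our suggestion, not part of the repository) confirms that PyTorch is importable and a GPU is visible:
python -c "import torch; print(torch.__version__, 'CUDA available:', torch.cuda.is_available())"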
To train:
python -W ignore train.py --model-type meta_sr --config-file config_files/meta_sr_example.ini
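The .ini config files drive the training hyper-parameters. To inspect what an example config defines before launching a run, a short standard-library snippet (our own helper, not part of the repository) can dump its sections and keys:

# Dump the sections/keys of a MIASSR config file (inspection helper only).
import configparser

cfg = configparser.ConfigParser()
cfg.read('config_files/meta_sr_example.ini')
for section in cfg.sections():
    print(f'[{section}]')
    for key, value in cfg.items(section):
        print(f'  {key} = {value}')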
To test:
python -W ignore test.py --sr-scales multi --config-file config_files/testing_meta_sr_example.ini
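Since MIASSR reports fidelity (PSNR/SSIM) at multiple scales, a sketch like the following can score saved SR outputs against ground truth; the file layout and scale list are assumptions, not the repository's evaluation code:

# Score SR outputs against ground-truth slices at several scales
# (illustrative sketch; result paths are hypothetical).
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

def score(gt, sr):
    rng = gt.max() - gt.min()
    psnr = peak_signal_noise_ratio(gt, sr, data_range=rng)
    ssim = structural_similarity(gt, sr, data_range=rng)
    return psnr, ssim

for scale in (1.5, 2.0, 2.5, 3.0, 3.5, 4.0):
    gt = np.load(f'results/gt_x{scale}.npy')   # hypothetical paths
    sr = np.load(f'results/sr_x{scale}.npy')
    psnr, ssim = score(gt, sr)
    print(f'x{scale}: PSNR={psnr:.2f} dB  SSIM={ssim:.4f}')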
For comparison with SOTA SR methods such as EDSR, we also provide:
python -W ignore train.py --model-type sota_sr --config-file config_files/sota_sr_example.ini
To test:
python -W ignore test.py --sr-scales multi --config-file config_files/testing_sota_sr_example.ini
Note that multiple pre-trained models (one per scale) are required for scale-specific methods.
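The practical difference is that a scale-specific model such as EDSR needs one checkpoint per magnification factor, while the meta-upscale model covers every scale in (1, 4] with a single set of weights. A small sketch of that contrast (all checkpoint paths are hypothetical):

# Scale-specific SR: one checkpoint per magnification factor.
import torch

scales = (2, 3, 4)
edsr_ckpts = {s: torch.load(f'checkpoints/edsr_x{s}.pt', map_location='cpu')
              for s in scales}          # three separate models

# Meta-SR-style model: one checkpoint serves every scale in (1, 4].
miassr_ckpt = torch.load('checkpoints/miassr_meta.pt', map_location='cpu')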
We provide pre-trained models (trained on the OASIS dataset) for download:
This work has been published in the International Journal of Neural Systems (IJNS); please cite it as:
@article{zhu2021miassr,
title={Arbitrary scale super-resolution for medical images},
author={Zhu, Jin and Tan, Chuan and Yang, Junwei and Yang, Guang and Li{\`o}, Pietro},
journal={International Journal of Neural Systems},
volume={31},
number={10},
pages={2150037},
year={2021},
publisher={World Scientific}
}
We refer readers to the following previous works for a better understanding of this project: