This repository is the official PyTorch implementation of "GLIMPSE: Generalized Local Imaging with MLPs".
(This code is tested with PyTorch 1.12.1, Python 3.8.3, CUDA 11.6 and cuDNN 7.)
- numpy
- scipy
- matplotlib
- imageio
- torch==1.12.1
- torchvision==0.13.1
Run the following command to create the conda environment from 'environment.yml':
conda env create -f environment.yml
All datasets are hosted on SwitchDrive. The complete LoDoPaB-CT dataset can be downloaded with the first command below. We also provide a smaller subset of LoDoPaB-CT, comprising approximately 1000 training and 100 test samples. In addition, to evaluate model generalization, we include a set of 18 out-of-distribution (OOD) brain images. These datasets can be downloaded using the following commands:
Complete LoDoPaB-CT:
curl -O -J https://drive.switch.ch/index.php/s/XzMbtHQFrQsLgxC/download
Small LoDoPaB-CT training subset:
curl -O -J https://drive.switch.ch/index.php/s/qMlALcE7AZzUPBh/download
Small LoDoPaB-CT test subset:
curl -O -J https://drive.switch.ch/index.php/s/fWBUmtZjozwpN9W/download
Out-of-distribution brain images:
curl -O -J https://drive.switch.ch/index.php/s/BQ8Yb8ofjutsEjV/download
After downloading the datasets, please specify the training, test and OOD directories in the 'config.py' script.
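As a rough illustration, the directory settings in 'config.py' might look like the sketch below. The variable names and paths here are hypothetical; match them to the actual argument names defined in the repository's 'config.py'.

```python
# Hypothetical excerpt of config.py -- variable names and paths are
# illustrative only; adapt them to the names used in the actual script.
train_path = './datasets/lodopab_train'  # directory with LoDoPaB-CT training samples
test_path = './datasets/lodopab_test'    # directory with LoDoPaB-CT test samples
ood_path = './datasets/brain_ood'        # directory with OOD brain images
```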
All arguments for training are explained in 'config.py'. After specifying your arguments, you can run the following command to train the model:
python3 train.py
If you find the code useful in your research, please consider citing the paper.
@article{khorashadizadeh2024glimpse,
title={GLIMPSE: Generalized Local Imaging with MLPs},
author={Khorashadizadeh, AmirEhsan and Debarnot, Valentin and Liu, Tianlin and Dokmani{\'c}, Ivan},
journal={arXiv preprint arXiv:2401.00816},
year={2024}
}