This repository provides the code for the paper:
Zeyu Fu, Jianbo Jiao, Robail Yasrab, Lior Drukker, Aris T. Papageorghiou and J. Alison Noble. Anatomy-Aware Contrastive Representation Learning for Fetal Ultrasound. In: ECCV-MCV (2022).
To install the dependencies into a new conda environment, simply run:
conda env create -f environment.yml
source activate awcl
Alternatively, you can install them manually:
conda create --name awcl
source activate awcl
conda install pytorch=1.12 torchvision cudatoolkit=11.6 -c pytorch
conda install opencv=3.4 -c conda-forge
conda install scipy scikit-learn scikit-image
pip install tensorboardX==2.1
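As an optional sanity check of the environment (the module names below simply mirror the dependencies installed above), you could run:
python -c "import torch, torchvision, cv2, sklearn, skimage, scipy, tensorboardX; print(torch.__version__, torch.cuda.is_available())"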
The clinical dataset PULSE used in this work cannot be released to the public, so it is not included here. However, you are welcome to use your own data by modifying the data loader defined in data.py.
Specify the path to the dataset with the environment variable LOCAL_DATA_DIR, for example as sketched below.
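export LOCAL_DATA_DIR=/path/to/your/dataset

A minimal PyTorch Dataset along the following lines could then replace the loader in data.py. This is a hypothetical sketch only: the class name MyUltrasoundDataset, the folder-per-anatomy-category layout, and the .png extension are assumptions, and the actual interface expected by data.py may differ.

import os
from pathlib import Path
from PIL import Image
from torch.utils.data import Dataset

class MyUltrasoundDataset(Dataset):
    # Hypothetical replacement for the loader defined in data.py.
    # Assumes images live under LOCAL_DATA_DIR with one subfolder per
    # anatomy category (the folder name is used as the label).
    def __init__(self, transform=None):
        root = Path(os.environ["LOCAL_DATA_DIR"])
        self.transform = transform
        self.samples = []
        classes = sorted(d.name for d in root.iterdir() if d.is_dir())
        for label, cls in enumerate(classes):
            for img_path in sorted((root / cls).glob("*.png")):
                self.samples.append((img_path, label))

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, idx):
        img_path, label = self.samples[idx]
        image = Image.open(img_path).convert("RGB")
        if self.transform is not None:
            image = self.transform(image)
        return image, label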
To train the model, simply run:
python main.py
By default, this script trains the model on the PULSE data with the proposed anatomy-aware contrastive learning approach.
The training data and models are saved in the results folder.
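For reference, the anatomy-aware approach builds on supervised contrastive learning (cf. SupContrast, acknowledged below), where images sharing the same anatomy category are treated as positives. The loss sketch below is only an illustration under that assumption: the function name, temperature value, and masking details are ours, and the actual loss used in main.py may differ.

import torch
import torch.nn.functional as F

def anatomy_aware_contrastive_loss(features, anatomy_labels, temperature=0.1):
    # features: (N, D) embeddings, anatomy_labels: (N,) integer anatomy categories.
    # Pairs with the same anatomy label are treated as positives (SupContrast-style).
    features = F.normalize(features, dim=1)
    sim = features @ features.T / temperature                  # pairwise cosine similarities
    sim = sim - sim.max(dim=1, keepdim=True).values.detach()   # numerical stability

    labels = anatomy_labels.view(-1, 1)
    self_mask = torch.eye(len(features), device=features.device)
    pos_mask = torch.eq(labels, labels.T).float() - self_mask  # same anatomy, excluding self-pairs

    exp_sim = torch.exp(sim) * (1 - self_mask)                  # drop self-pairs from the denominator
    log_prob = sim - torch.log(exp_sim.sum(dim=1, keepdim=True) + 1e-12)

    pos_count = pos_mask.sum(dim=1)
    mean_log_prob_pos = (pos_mask * log_prob).sum(dim=1) / pos_count.clamp(min=1)
    return -(mean_log_prob_pos[pos_count > 0]).mean()           # average over anchors that have positives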
If you find AWCL useful, please cite the following BibTeX entry:
@inproceedings{awcl,
  title={Anatomy-Aware Contrastive Representation Learning for Fetal Ultrasound},
  author={Fu, Zeyu and Jiao, Jianbo and Yasrab, Robail and Drukker, Lior and Papageorghiou, Aris T. and Noble, J. Alison},
  booktitle={European Conference on Computer Vision Workshop},
  year={2022},
}
Part of our code is adapted from SupContrast; we thank the authors for their contributions.