- Create a conda environment and activate it:

  ```shell
  conda create --name GCN_MULTI_env python=3.11
  conda activate GCN_MULTI_env
  ```
- Install the requirements:

  ```shell
  pip install -r requirements.txt
  ```
- Download the CAMUS dataset folder from https://humanheart-project.creatis.insa-lyon.fr/database/#collection/6373703d73e9f0047faa1bc8/folder/63fde55f73e9f004868fb7ac
- Extract the downloaded archive and place the `database_nifti` folder in `data/local_data`.
- Run the preprocessing script:

  ```shell
  PYTHONPATH=./ python tools/preprocess_CAMUS_displacement.py
  ```
- Download the trained model from https://huggingface.co/gillesvdv/GCN_with_displacement_camus_cv1
- Place the downloaded `.pth` file in `experiments/logs/CAMUS_displacement_cv_1/GCN_multi_displacement_small/mobilenet2/trained/`.
- Run the evaluation:

  ```shell
  python eval.py
  ```
- The results will be saved in `experiments/logs/CAMUS_displacement_cv_1/GCN_multi_displacement_small/mobilenet2/trained/weights_CAMUS_displacement_cv_1_GCN_multi_displacement_small_best_loss_eval_on_CAMUS_displacement_cv_1/`. The `plots` subfolder contains the resulting plots, and `predictions.pkl` contains the predictions for each sample.
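`predictions.pkl` is a standard Python pickle, so it can be loaded with the `pickle` module. A minimal sketch (the demonstration below round-trips a dummy dict because the internal structure of the real file is not documented here; point `path` at the actual `predictions.pkl` and inspect the loaded object):

```python
import os
import pickle
import tempfile

def load_predictions(path):
    """Load a pickle file of per-sample predictions."""
    with open(path, "rb") as f:
        return pickle.load(f)

# Demonstration with a dummy file; in practice, set `path` to the real
# predictions.pkl inside the eval output folder. The sample id and metric
# below are placeholders, not the actual file contents.
dummy = {"patient0001_2CH": {"dice": 0.91}}
with tempfile.NamedTemporaryFile(suffix=".pkl", delete=False) as tmp:
    pickle.dump(dummy, tmp)
    path = tmp.name

preds = load_predictions(path)
print(sorted(preds))  # list the sample ids
os.remove(path)
```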
- Run the training:

  ```shell
  python train.py
  ```

  This will take a long time, as the default configuration trains for 5000 epochs.
- Change the `WEIGHTS` parameter in `files/configs/Eval_CAMUS_displacement.yaml` to the path of the trained model's checkpoint, `experiments/logs/your_dataset/mobilenetv2/your_run_id/your_weights.pth`, where `your_dataset` is the name of the dataset you trained on, `your_run_id` is the automatically generated id of the run you want to use, and `your_weights` is the name of the checkpoint you want to use.
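The edited entry in `files/configs/Eval_CAMUS_displacement.yaml` would then look something like the fragment below (the path components are the placeholders from the step above, not real values; the rest of the config file is unchanged):

```yaml
# Checkpoint produced by train.py (replace the placeholder path components
# with your actual dataset name, run id, and checkpoint file name).
WEIGHTS: experiments/logs/your_dataset/mobilenetv2/your_run_id/your_weights.pth
```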