- Python 3.9
- PyTorch (http://pytorch.org/)
- TensorFlow 2.10.0
- munch 2.5.0
- opencv-python 4.4.0.46
- ffmpeg-python 0.2.0
- Download the pre-trained models from BaiduNetdisk (password: zfxa).
- Create the folder expr, containing the subfolders checkpoints, results, and samples.
- Copy the pre-trained files to expr/checkpoints/BraTS.
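The folder-setup steps above can be sketched as a short Python snippet. The expr paths come from this README; the `pretrained/` download location is an assumption for illustration:

```python
import os
import shutil

# Recreate the expr folder layout described above.
for sub in ("checkpoints/BraTS", "results", "samples"):
    os.makedirs(os.path.join("expr", sub), exist_ok=True)

# Copy the downloaded pre-trained files into place.
# "pretrained/" is an assumed download folder, not named in the README.
src = "pretrained"
if os.path.isdir(src):
    for name in os.listdir(src):
        shutil.copy(os.path.join(src, name),
                    os.path.join("expr", "checkpoints", "BraTS", name))
```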
- To train MD-GAN, run the following command:
#BraTS2018
python main.py --mode train --num_domains 2 --w_hpf 0 \
--lambda_reg 1 --lambda_rec 0.01 --lambda_class 0.02 --lambda_l1 100 \
--train_img_dir data/BraTS/train \
--val_img_dir data/BraTS/val
- To test MD-GAN, run the following command:
#BraTS2018
python main.py --mode sample --num_domains 2 --resume_iter 0 --w_hpf 0 \
--checkpoint_dir expr/checkpoints/BraTS \
--result_dir expr/results/BraTS \
--src_dir assets/BraTS/src \
--ref_dir assets/BraTS/ref
- The implementation of the proposed MD-GAN model is based on StarGAN V2 (https://github.com/clovaai/stargan-v2) and ADGAN (https://github.com/LEI-YRO/ADGAN).
- To facilitate processing, some image data derived from the BraTS2018 dataset have been uploaded.
- To train on a custom dataset, prepare the files in the same way as BraTS.
- For smooth training of the network, it is recommended that image filenames do not contain any modality nouns.
- Modify the name of the weight file you want to test in the solver file.
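As a rough illustration of the naming recommendation above, a small helper could strip modality nouns from image filenames before training. The keyword list (the standard BraTS modalities) and the helper itself are assumptions for illustration, not part of this repository:

```python
import os
import re

# Standard BraTS modality nouns; filenames should not contain them.
MODALITIES = ("t1ce", "t1", "t2", "flair")

def strip_modality(filename: str) -> str:
    """Remove modality nouns and leftover separators from an image filename."""
    stem, ext = os.path.splitext(filename)
    for m in MODALITIES:
        stem = re.sub(m, "", stem, flags=re.IGNORECASE)
    # Collapse doubled separators left behind and trim the edges.
    stem = re.sub(r"[_\-]{2,}", "_", stem).strip("_-")
    return stem + ext

print(strip_modality("Brats18_2013_2_1_t1.png"))  # Brats18_2013_2_1.png
```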