This is the official PyTorch implementation of our CVPR 2024 paper "Think Twice Before Selection: Federated Evidential Active Learning for Medical Image Analysis with Domain Shifts".
Please install the required packages listed in requirements.txt:
$ pip install --upgrade pip
$ pip install -r requirements.txt
- Classification: Fed-ISIC, Fed-Camelyon
- Segmentation: Fed-Polyp, Fed-Prostate, Fed-Fundus
After downloading the datasets, please navigate to the FEAL/data/ directory and execute prepare_dataset.py for data preprocessing.
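A minimal invocation would look like the following (assuming prepare_dataset.py requires no additional arguments; check the script for dataset-specific options):
$ cd FEAL/data
$ python prepare_dataset.py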
The folder structure within Dataset/ should be organized as follows.
├── Dataset
    ├── FedISIC_npy
        ├── ISIC_0012653_downsampled.npy, ISIC_0012654_downsampled.npy, ...
    ├── FedCamelyon
        ├── patches
            ├── patient_004_node_4, patient_009_node_1, ...
    ├── FedPolyp_npy
        ├── client1
            ├── sample1.npy, sample2.npy, ...
        ├── client2
        ├── ...
    ├── FedProstate_npy
        ├── client1
            ├── Case00
                ├── slice_012.npy, slice_013.npy, ...
            ├── ...
        ├── client2
        ├── ...
    ├── FedFundus_npy
        ├── client1
            ├── sample1.npy, sample2.npy, ...
        ├── client2
        ├── ...
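For reference, below is a minimal sketch of how the preprocessed .npy files under a single client folder could be loaded. The ClientNpyDataset class and its assumption that each file holds one preprocessed sample are illustrative only; the dataset classes shipped with this repository may differ.

import glob
import os

import numpy as np
from torch.utils.data import Dataset


class ClientNpyDataset(Dataset):
    """Load every .npy sample under one client folder, e.g. Dataset/FedPolyp_npy/client1."""

    def __init__(self, client_dir, transform=None):
        # Collect .npy files recursively so nested case folders
        # (e.g. FedProstate_npy/client1/Case00) are included.
        self.paths = sorted(glob.glob(os.path.join(client_dir, "**", "*.npy"), recursive=True))
        self.transform = transform

    def __len__(self):
        return len(self.paths)

    def __getitem__(self, idx):
        # Each file is assumed to store one preprocessed sample as a NumPy array.
        sample = np.load(self.paths[idx], allow_pickle=True)
        if self.transform is not None:
            sample = self.transform(sample)
        return sample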
The data splits of Fed-ISIC and Fed-Camelyon follow Flamby and HarmoFL, respectively. For Fed-Polyp, Fed-Prostate, and Fed-Fundus, please navigate to the FEAL/data directory and execute train_test_split.py to split the data.
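For example (assuming train_test_split.py requires no additional arguments):
$ cd FEAL/data
$ python train_test_split.py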
For skin lesion classification on the Fed-ISIC dataset, run the following command:
CUDA_VISIBLE_DEVICES=1 python main_cls_al.py --dataset FedISIC --al_method FEAL --query_model both --query_ratio 0 --budget 500 --al_round 5 --max_round 100 --batch_size 32 --base_lr 5e-4 --kl_weight 1e-2 --display_freq 20
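The same script can presumably be pointed at the other classification dataset by changing the --dataset flag. The hypothetical Fed-Camelyon example below simply reuses the Fed-ISIC hyperparameters, which may differ from the settings used in the paper:
CUDA_VISIBLE_DEVICES=0 python main_cls_al.py --dataset FedCamelyon --al_method FEAL --query_model both --query_ratio 0 --budget 500 --al_round 5 --max_round 100 --batch_size 32 --base_lr 5e-4 --kl_weight 1e-2 --display_freq 20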
If you find this work helpful for your research, please consider citing:
@inproceedings{chen2024think,
  title={Think Twice Before Selection: Federated Evidential Active Learning for Medical Image Analysis with Domain Shifts},
  author={Chen, Jiayi and Ma, Benteng and Cui, Hengfei and Xia, Yong},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  pages={11439--11449},
  year={2024}
}
The codebase is adapted from FedDG, FedLC, and EDL. We sincerely appreciate their insightful work and contributions.