ubc-tea/FCRO-Fair-Classification-Orthogonal-Representation

This repository contains the PyTorch implementation of our IPMI 2023 paper "On Fairness of Medical Image Classification with Multiple Sensitive Attributes via Learning Orthogonal Representations".

Wenlong Deng*, Yuan Zhong*, Qi Dou, Xiaoxiao Li

[Paper]

Usage

Setup

Datasets

Please download the original CheXpert dataset here, and the supplementary demographic data here.

In our paper, we use an augmented version of CheXpert. Please download the metadata of the augmented dataset here, and put it under the ./metadata/ directory.

Pretrained Models

Please download our pretrained models (trained with 5-fold cross-validation) here, and put them under the ./checkpoint/ directory.
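
For reference, after the downloads the repository might be organized as follows (the contents of metadata/ and checkpoint/ depend on the released files; this layout is only a sketch):

FCRO-Fair-Classification-Orthogonal-Representation/
├── checkpoint/   # pretrained sensitive/target models for the 5 folds
├── metadata/     # metadata of the augmented CheXpert dataset
└── train.py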

Run a single experiment

python train.py --image_path [image_path] --exp_path [exp_path] --metadata [metadata] --lr [lr] --weight_decay [weight_decay] --epoch [epoch] --batch_size [batch_size] -a [sensitive_attributes] --dim_rep [dim_rep] -wc [wc] -wr [wr] --subspace_thre [subspace_thre] -f [fold] --cond --moving_base --from_sketch

For more information about each option, run python train.py -h.

Here is an example of how to run an experiment on fold 0 from scratch (using the --from_sketch flag):

# Train from scratch: first train the sensitive head, then the target head.
python train.py --image_path XXX -f 0 --cond --from_sketch

Here is another example of how to train the target model using a pretrained sensitive model:

python train.py --image_path XXX -f 0 --cond

By default, the pretrained sensitive model under the ./checkpoint/ directory will be used. If you want to customize it, please use the --pretrained_path option.
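
For example (the checkpoint file name below is illustrative, not the actual released name):

# Train the target model with a sensitive model loaded from a custom path (path is illustrative)
python train.py --image_path XXX -f 0 --cond --pretrained_path ./checkpoint/sensitive_model_fold0.pth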

To calculate the column orthogonal loss with the accumulative space construction variant, please use the --moving_base option (as listed in the command template above).
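
For example, combining it with the flags from the command template above:

# Train the target model with accumulative (moving) space construction
python train.py --image_path XXX -f 0 --cond --moving_base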

Test

After downloading our pretrained models and the metadata, you can reproduce the 5-fold cross-validation results reported in our paper by running:

# Run the test with the fold-0 model. Run all 5 folds to reproduce our results.
python train.py --test --image_path XXX -f 0

You may customize the --pretrained_path and --sensitive_attributes options to use other pretrained models or to test on other combinations of sensitive attributes.
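
For example, following the placeholder style of the command template above:

# Test fold 0 with a custom pretrained model and another combination of sensitive attributes
python train.py --test --image_path XXX -f 0 -a [sensitive_attributes] --pretrained_path [pretrained_path]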

Citation

If you find this work helpful, feel free to cite our paper as follows:

@inproceedings{deng2023fairness,
  title={On fairness of medical image classification with multiple sensitive attributes via learning orthogonal representations},
  author={Deng, Wenlong and Zhong, Yuan and Dou, Qi and Li, Xiaoxiao},
  booktitle={International Conference on Information Processing in Medical Imaging},
  pages={158--169},
  year={2023},
  organization={Springer}
}
