
A self-training method for transferring fairness under distribution shifts.




transfer-fairness

This is the code for our NeurIPS 2022 paper Transferring Fairness under Distribution Shifts via Fair Consistency Regularization. The code was contributed by Bang An and Zora Che. In this paper, we study how to transfer fairness under distribution shifts. We propose a self-training method whose key idea is to minimize and balance the consistency loss across groups via fair consistency regularization.
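As a rough illustration of the idea (not the exact loss from the paper), a FixMatch-style consistency loss can be computed per group and then balanced by penalizing the gap between the best- and worst-off groups. The function below is a minimal NumPy sketch under assumed choices: a confidence threshold tau for pseudo-labels and a max-min gap penalty weighted by lam.

```python
import numpy as np

def fair_consistency_loss(p_weak, p_strong, groups, tau=0.95, lam=1.0):
    """Sketch of a consistency loss minimized and balanced across groups.

    p_weak, p_strong: (N, C) class probabilities for weakly / strongly
    augmented views of the same samples; groups: (N,) group ids.
    """
    pseudo = p_weak.argmax(axis=1)            # pseudo-labels from the weak view
    mask = p_weak.max(axis=1) >= tau          # keep only confident pseudo-labels
    ce = -np.log(p_strong[np.arange(len(pseudo)), pseudo] + 1e-12)

    group_losses = []
    for g in np.unique(groups):
        sel = (groups == g) & mask
        if sel.any():
            group_losses.append(ce[sel].mean())
    if not group_losses:                      # no confident samples at all
        return 0.0
    group_losses = np.array(group_losses)

    mean_loss = group_losses.mean()
    gap = group_losses.max() - group_losses.min()  # fairness: balance groups
    return float(mean_loss + lam * gap)
```

When all groups incur the same consistency loss, the gap term vanishes and the objective reduces to the ordinary mean consistency loss.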

Datasets

  • Our synthetic dataset is based on the 3dshapes dataset. We save each image as a .jpg file and use its index as the file name.
  • The first experiment on real datasets uses the UTKFace and FairFace datasets. For UTKFace, we use the "aligned & cropped faces" version; for FairFace, we use the Padding=0.25 version. The example indices used in our paper are under data/UTKFace and data/fairface.
  • The second experiment on real datasets uses the NewAdult dataset via the Folktables package. We use the ACSIncome prediction task in our paper.
  • Download the data into the data folder. The file structure should be:
.
├── data
    ├── shapes
    │   └── images     
    ├── newadult
    │   └── 2018
    ├── fairface
    └── UTKFace
        └── UTKFace
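For the NewAdult data, a download sketch using the Folktables package might look like the following. The helper path and the root_dir placement are assumptions based on the tree above (Folktables stores files under root_dir/year/horizon), and folktables must be installed first.

```python
from pathlib import Path

def newadult_root(root: str = "data") -> Path:
    """Assumed download root matching the layout above: data/newadult."""
    return Path(root) / "newadult"

if __name__ == "__main__":
    # Requires `pip install folktables`; downloads ACS PUMS data on first run.
    from folktables import ACSDataSource, ACSIncome

    source = ACSDataSource(survey_year="2018", horizon="1-Year",
                           survey="person", root_dir=str(newadult_root()))
    acs = source.get_data(download=True)  # optionally pass states=["CA"]
    features, labels, groups = ACSIncome.df_to_numpy(acs)
    print(features.shape, labels.shape, groups.shape)
```

With these arguments the files land in data/newadult/2018, matching the tree above.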

How to run

The following are example run scripts for the UTKFace-FairFace experiment:

  1. Base
python laftr.py --dataset utk-fairface --model vgg16  --adv-hidden-dim 1024  --lr 0.001  --batch-size 100  \
--test-batch-size 256  --fair-weight 0  --val-epoch 5  --epoch 200   --train-iteration 50  --save-name face_base 
  2. Laftr
python laftr.py --dataset utk-fairface --model vgg16  --adv-hidden-dim 1024  --lr 0.001  --batch-size 100  \
--test-batch-size 256  --fair-weight 1  --val-epoch 5  --epoch 200   --train-iteration 50  --save-name face_laftr 
  3. CFair
python cfair.py --dataset utk-fairface --model vgg16  --adv-hidden-dim 1024  --lr 0.001  --batch-size 100  \
--test-batch-size 256  --fair-weight 1  --val-epoch 5  --epoch 200   --train-iteration 50  --save-name face_cfair 
  4. Laftr+DANN
python laftr+dann.py --dataset utk-fairface --model vgg16  --adv-hidden-dim 1024  --lr 0.001  --batch-size 100  \
--test-batch-size 256  --fair-weight 1  --da-weight 1 --val-epoch 5  --epoch 200   --train-iteration 50  --save-name face_laftr_dann 
  5. Laftr+FixMatch
python laftr+consis.py --dataset utk-fairface --model vgg16  --adv-hidden-dim 1024  --lr 0.001  --batch-size 100  \
--test-batch-size 256  --fair-weight 1  --consis-weight-source 1 --consis-weight-target 1 --val-epoch 5  --epoch 200   \
--train-iteration 50  --save-name face_laftr_fixmatch 
  6. Ours (w/ Laftr)
python laftr+consis.py --dataset utk-fairface --model vgg16  --adv-hidden-dim 1024  --lr 0.001  --batch-size 100  \
--test-batch-size 256  --fair-weight 1  --fair-consis --consis-weight-source 1 --consis-weight-target 1 --val-epoch 5  --epoch 200   \
--train-iteration 50  --save-name face_laftr_fairfixmatch 

Citation

To cite our paper, please use the following BibTeX:

@inproceedings{an2022transferring,
  title={Transferring Fairness under Distribution Shifts via Fair Consistency Regularization},
  author={Bang An and Zora Che and Mucong Ding and Furong Huang},
  booktitle={Advances in Neural Information Processing Systems},
  editor={Alice H. Oh and Alekh Agarwal and Danielle Belgrave and Kyunghyun Cho},
  year={2022},
  url={https://openreview.net/forum?id=zp_Cp38qJE0}
}

Contact

Please contact bangan@umd.edu for any questions about the code.

Reference

Parts of our code are based on or inspired by the following repositories. We sincerely thank their contributors.

https://github.com/thuml/Transfer-Learning-Library

https://github.com/zykls/folktables

https://github.com/kekmodel/FixMatch-pytorch
