
NeuFair: Neural Network Fairness Repair with Dropout

This repository contains the source code for "NeuFair: Neural Network Fairness Repair with Dropout" accepted at ACM ISSTA 2024.

Setup

  1. Create the conda environment from the environment.yml file.
  2. Follow the instructions here to download the MEPS 16 dataset.

General Instructions

The saved_models folder contains the pretrained DNNs used in our experiments for all datasets and seeds. The data folder contains the datasets used in our experiments.

The simulated annealing and random walk repair strategies are implemented in sa.py. The utils.py script contains utility functions to preprocess the raw datasets and compute the fairness score of the DNN predictions. The model.py script defines the DNN architectures for all datasets.
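As a rough illustration only (not the repository's actual implementation in sa.py), simulated annealing over dropout masks can be sketched as follows. The fitness function, neuron count, and cooling schedule below are placeholders; in practice the score would be an unfairness measure of the DNN with the masked neurons dropped:

```python
import math
import random

def simulated_annealing(score, n_neurons, steps=500, t0=1.0, cooling=0.99, seed=0):
    """Search for a dropout mask (subset of neurons to disable) that
    minimizes `score`, e.g. an unfairness measure of the masked DNN."""
    rng = random.Random(seed)
    mask = [False] * n_neurons          # start with no neurons dropped
    best = current = score(mask)
    best_mask = list(mask)
    t = t0
    for _ in range(steps):
        candidate = list(mask)
        i = rng.randrange(n_neurons)    # flip one neuron's dropped/kept state
        candidate[i] = not candidate[i]
        cand_score = score(candidate)
        delta = cand_score - current
        # accept improvements always; accept worse moves with Boltzmann probability
        if delta < 0 or rng.random() < math.exp(-delta / t):
            mask, current = candidate, cand_score
            if current < best:
                best, best_mask = current, list(mask)
        t *= cooling                    # geometric cooling schedule
    return best_mask, best

# toy usage: pretend unfairness is minimized by dropping exactly neurons 2 and 5
target = {2, 5}
def toy_score(mask):
    dropped = {i for i, m in enumerate(mask) if m}
    return len(target ^ dropped)        # symmetric difference to the ideal mask

mask, score_val = simulated_annealing(toy_score, n_neurons=8)
print(mask, score_val)
```

A random walk variant is the same loop with every candidate accepted unconditionally (no temperature).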

The sa_experiments and random_experiments folders contain scripts to run the SA and RW strategies, respectively, for the following datasets and sensitive attributes:

  1. adult_race: Adult Census Income dataset with Race
  2. adult: Adult Census Income dataset with Sex
  3. bank: Bank Marketing dataset with Age
  4. compas: Compas Software with Race
  5. compas_sex: Compas Software with Sex
  6. default: Default Credit with Sex
  7. meps: Medical Expenditure with Race

To retrain your own models, please use the training scripts provided in train_models. Each training script trains models for a given dataset across 10 seeds. The training scripts contain the dataset-specific hyperparameters found via hyperparameter tuning.
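Training one model per seed typically follows a skeleton like the one below. The helper and loop are hypothetical; the actual loops and hyperparameters live in the train_models scripts:

```python
import random

import numpy as np

def set_seed(seed):
    """Seed the RNGs so each per-seed run is reproducible."""
    random.seed(seed)
    np.random.seed(seed)
    # with PyTorch, additionally: torch.manual_seed(seed)

for seed in range(10):      # one model per seed, as in train_models
    set_seed(seed)
    # train_model(dataset, hyperparams, seed)  -- placeholder for the real training call
```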

Citation

```bibtex
@misc{dasu2024neufairneuralnetworkfairness,
      title={NeuFair: Neural Network Fairness Repair with Dropout},
      author={Vishnu Asutosh Dasu and Ashish Kumar and Saeid Tizpaz-Niari and Gang Tan},
      year={2024},
      eprint={2407.04268},
      archivePrefix={arXiv},
      primaryClass={cs.LG},
      url={https://arxiv.org/abs/2407.04268},
}
```
