The repository provides the code and resources for measuring and addressing societal biases in retrieval results, as discussed in the paper:
Societal Biases in Retrieved Contents: Measurement Framework and Adversarial Mitigation of BERT Rankers.
Navid Rekabsaz, Simone Kopeinik, Markus Schedl.
In Proceedings of the 44th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR 2021).
Please find more information about each of the contributions in its corresponding folder:
dataset: The dataset of fairness-sensitive queries.
measurement: The code for measuring the FaiRR and NFaiRR metrics, calculated on ranking results or on ranker-agnostic document sets.
adversarial_mitigation: The code for adversarial training of BERT rankers to mitigate gender bias (AdvBERT).
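As a rough illustration of what the measurement code computes, the sketch below derives a simplified document neutrality score from gendered-term counts, aggregates it into a position-discounted FaiRR-style score over the top-k results, and normalizes by the ideal ordering to obtain an NFaiRR-style value in [0, 1]. The term lists, the neutrality formula, and the log2 discount here are illustrative assumptions; the repository's implementation and the paper's exact definitions (e.g. computing the ideal FaiRR over a background document set rather than the ranked list itself) differ in detail.

```python
import math

# Hypothetical, tiny gendered-term lists for illustration only;
# the repository ships its own, much larger term lists.
FEMALE_TERMS = {"she", "her", "woman"}
MALE_TERMS = {"he", "his", "man"}

def neutrality(doc_tokens):
    """Simplified neutrality score: 1.0 when a document contains no
    gendered terms, decreasing with the fraction of gendered tokens
    (an assumed formula, not the paper's exact definition)."""
    gendered = sum(t in FEMALE_TERMS or t in MALE_TERMS for t in doc_tokens)
    if not doc_tokens:
        return 1.0
    return 1.0 - gendered / len(doc_tokens)

def fairr(ranked_docs, k=10):
    """Position-discounted sum of document neutrality over the top-k,
    using a DCG-style log2 discount (assumed for illustration)."""
    return sum(neutrality(d) / math.log2(i + 2)
               for i, d in enumerate(ranked_docs[:k]))

def nfairr(ranked_docs, k=10):
    """FaiRR normalized by the ideal FaiRR, i.e. the same documents
    re-ranked by descending neutrality, yielding a score in [0, 1]."""
    ideal = sorted(ranked_docs, key=neutrality, reverse=True)
    denom = fairr(ideal, k)
    return fairr(ranked_docs, k) / denom if denom > 0 else 0.0
```

Under this sketch, a ranking that places gender-neutral documents at the top scores 1.0, while pushing biased documents up front lowers the score.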
@inproceedings{rekabsaz2021fairnessir,
title={Societal Biases in Retrieved Contents: Measurement Framework and Adversarial Mitigation of BERT Rankers},
author={Rekabsaz, Navid and Kopeinik, Simone and Schedl, Markus},
booktitle={Proceedings of the 44th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR '21), July 11–15, 2021, Virtual Event, Canada},
doi={10.1145/3404835.3462949},
year={2021},
publisher={ACM}
}
Please contact Navid for any questions.