Gender Bias in Coreference Resolution: Evaluation and Debiasing Methods
Jieyu Zhao, Tianlu Wang, Mark Yatskar, Vicente Ordonez, Kai-Wei Chang. NAACL-2018 short paper.
- Please refer to Higher-order Coreference Resolution with Coarse-to-fine Inference for the coreference resolution code and brief instructions here.
- To use our dataset, change the `evaluation_path` and `conll_eval_path` in `experiment.conf` to point at the WinoBias dataset.
- If you want to try the WinoBias dataset using allennlp, remember to add `pos_tag != '-'` at line 274 here.
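For example, the relevant entries in `experiment.conf` would look roughly like this (the file names below are illustrative placeholders, not the actual names in the data release):

```
# In experiment.conf: point evaluation at the WinoBias files
# (replace the paths with the actual WinoBias file names)
evaluation_path = "path/to/winobias.jsonlines"
conll_eval_path = "path/to/winobias.conll"
```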
We analyze different coreference resolution systems to understand the gender bias present in them. When we give a system the same sentence but change only the gender of the pronoun, the system's performance changes. To demonstrate this gender bias issue, we created the WinoBias dataset.
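The pronoun-swap probe described above can be sketched as follows; the `PRONOUN_SWAP` mapping and `swap_gender` helper are illustrative, not part of the released code:

```python
# Sketch: build a gender-swapped copy of a sentence so a coreference
# system can be evaluated on both variants of the same context.
# Note: "her" is ambiguous (possessive "his" vs. objective "him");
# this toy mapping picks one reading and is not a full solution.
PRONOUN_SWAP = {
    "he": "she", "she": "he",
    "his": "her", "her": "his",
    "him": "her",
}

def swap_gender(sentence: str) -> str:
    """Return the sentence with gendered pronouns swapped."""
    tokens = sentence.split()
    swapped = [PRONOUN_SWAP.get(t.lower(), t) for t in tokens]
    return " ".join(swapped)
```

A system whose accuracy differs noticeably between the original and swapped variants is relying on gender cues rather than on the sentence's syntax or semantics.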
The dataset was generated by the five authors of this paper. We use professions from the Labor Force Statistics that exhibit gender stereotypes:
Professions and their percentages of women:

| Male biased | Female biased |
|---|---|