
Measuring Gender Bias in Contextualized Embeddings

This repository contains the code needed to reproduce the experiments for measuring gender bias in contextualized embeddings, carried out as part of this Master's thesis project in collaboration with KTH and Peltarion.

This work was also presented at the "Artificial Intelligence with Biased or Scarce Data" (AIBSD) workshop, held in conjunction with the 36th AAAI Conference on Artificial Intelligence (2022); the proceedings have been published by MDPI. Please find the proceedings paper here.

Methods:

  1. Measuring gender bias through the downstream task of semantic textual similarity (STS).
    Languages used: English and Swedish.
    Models used: T5 and mT5.
  2. Measuring gender bias in contextualized embeddings using the gender polarity metric.
    Language used: English.
    Model used: T5.

Datasets:

We create new datasets from the Swedish STS benchmark and the English STS benchmark (see dataloader/create_bias_dataset.py).
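
For intuition, here is a minimal sketch of how gendered sentence triples could be assembled from occupation templates. The actual logic lives in dataloader/create_bias_dataset.py and may differ; the occupation list, sentence template, and output filename below are illustrative assumptions, not the exact ones used in the thesis.

# Hypothetical sketch: build (male, female, neutral) sentence triples per occupation,
# whose similarity scores can later be compared. Not the repository's exact code.
import csv

OCCUPATIONS = ["technician", "nurse", "engineer"]  # illustrative subset of the 50 occupations
TEMPLATE = "A {subject} is a {occupation}."

def build_bias_rows(occupations):
    """Return one row per occupation with a male, a female, and a neutral sentence."""
    rows = []
    for occ in occupations:
        rows.append({
            "occupation": occ,
            "male_sentence": TEMPLATE.format(subject="man", occupation=occ),
            "female_sentence": TEMPLATE.format(subject="woman", occupation=occ),
            "neutral_sentence": TEMPLATE.format(subject="person", occupation=occ),
        })
    return rows

with open("bias_pairs.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["occupation", "male_sentence", "female_sentence", "neutral_sentence"])
    writer.writeheader()
    writer.writerows(build_bias_rows(OCCUPATIONS))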

Reproducibility

1. Clone repository

git clone git@github.com:Stellakats/Master-thesis-gender-bias.git
cd Master-thesis-gender-bias

2. Create new virtual environment

conda create --name <env_name> python=3.7
conda activate <env_name> 
pip install -r requirements.txt

3. Reproduce first method

To reproduce the results using the first method, simply run the following script:

python run_bias_experiments.py

Once the experiments are complete, the results will be available in results/bias_experiments, which will include:

  • graphs for all 50 occupations, along with the corresponding .csv files, for the small, base, and large versions of T5
  • graphs for all 50 occupations, along with the corresponding .csv files, for the small and base versions of mT5, in both Swedish and English
  • one graph comparing all T5 sizes on a specific occupation of choice.
    The default occupation is technician.
    To plot another occupation, run python run_bias_experiments.py --occupation "<occupation>" instead.
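
As a rough illustration of what the script measures, the hedged sketch below scores a male-subject and a female-subject sentence against the same occupation sentence using T5's text-to-text STS-B task (via the Hugging Face transformers library); the gap between the two scores is the kind of per-occupation signal plotted in the graphs. The model size, prompt sentences, and scoring details are assumptions for illustration, not the exact implementation in run_bias_experiments.py.

# Hedged sketch of the idea behind the first method, not the script's exact logic.
import torch
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")
model.eval()

def sts_score(sentence1: str, sentence2: str) -> float:
    """Ask T5 for an STS-B style similarity score (a number between 0 and 5)."""
    prompt = f"stsb sentence1: {sentence1} sentence2: {sentence2}"
    inputs = tokenizer(prompt, return_tensors="pt")
    with torch.no_grad():
        output_ids = model.generate(**inputs, max_new_tokens=8)
    text = tokenizer.decode(output_ids[0], skip_special_tokens=True)
    try:
        return float(text)
    except ValueError:
        return float("nan")  # the model did not return a parseable number

occupation_sentence = "A technician is fixing the machine."  # illustrative example
male_score = sts_score("A man is fixing the machine.", occupation_sentence)
female_score = sts_score("A woman is fixing the machine.", occupation_sentence)
print(f"male: {male_score:.2f}, female: {female_score:.2f}, gap: {male_score - female_score:.2f}")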

4. Reproduce second method

To reproduce the second method and visualize the results, please run the notebook gender_polarity_t5_embeds.ipynb.
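
For reference, a minimal sketch of the gender-polarity idea, i.e. projecting a contextualized embedding onto a she-he direction, is given below; the notebook contains the actual computation, and the model size, probe words, and mean-pooling choice here are assumptions for illustration.

# Hedged sketch of a gender-polarity style measurement on T5 encoder embeddings.
import torch
from transformers import T5Tokenizer, T5EncoderModel

tokenizer = T5Tokenizer.from_pretrained("t5-small")
encoder = T5EncoderModel.from_pretrained("t5-small")
encoder.eval()

def embed(text: str) -> torch.Tensor:
    """Mean-pool the encoder's last hidden state into a single sentence vector."""
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        hidden = encoder(**inputs).last_hidden_state  # shape: (1, seq_len, d_model)
    return hidden.mean(dim=1).squeeze(0)

# Gender direction defined by the difference of two gendered probe embeddings.
direction = embed("she") - embed("he")
direction = direction / direction.norm()

def gender_polarity(text: str) -> float:
    """Signed projection onto the she-he direction: > 0 leans female, < 0 leans male."""
    vector = embed(text)
    return torch.dot(vector / vector.norm(), direction).item()

print(gender_polarity("The nurse prepared the medication."))
print(gender_polarity("The technician repaired the engine."))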
