skipdividedd/bert_probing

Repository for probing research

Our pipeline

  1. Collected training data.
  2. 'Spoiled' the data so that adjective gender agreement is broken.
  3. Trained two Russian BERT models.
  4. Annotated the RuSentEval data with Stanza, either per sentence or per word.
  5. Conducted three types of probing experiments (by CLS token, by mean sentence embedding, and per token), largely relying on NeuroX.
  6. Compared the results.
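The three probing setups in step 5 differ only in which representation is fed to the probe. A minimal sketch, assuming hidden states of shape (seq_len, hidden_dim) as a BERT encoder would produce (the array here is random stand-in data, not real model output):

```python
import numpy as np

# Random stand-in for one sentence's BERT hidden states:
# 8 tokens, 16-dimensional vectors (real BERT uses 768).
rng = np.random.default_rng(0)
seq_len, hidden_dim = 8, 16
hidden_states = rng.normal(size=(seq_len, hidden_dim))

# 1) CLS-token probing: the first token's vector stands in for the sentence.
cls_repr = hidden_states[0]

# 2) Mean sentence embedding: average over all token vectors.
mean_repr = hidden_states.mean(axis=0)

# 3) Per-token probing: keep one feature vector per token.
token_reprs = hidden_states

print(cls_repr.shape, mean_repr.shape, token_reprs.shape)
```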

Probably useful

The repository contains source files for probing and an example notebook.
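A probing classifier itself is typically just a small linear model trained on frozen embeddings. Below is a toy logistic-regression probe on synthetic data, purely illustrative: the repository's own probes rely on NeuroX, and all names and numbers here are made up for the sketch.

```python
import numpy as np

# Synthetic "embeddings" and a linearly encoded binary property to probe for.
rng = np.random.default_rng(1)
n, d = 200, 16
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = (X @ w_true > 0).astype(float)

# Train a logistic-regression probe with plain gradient descent.
w = np.zeros(d)
b = 0.0
lr = 0.5
for _ in range(300):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid predictions
    grad_w = X.T @ (p - y) / n              # cross-entropy gradient w.r.t. w
    grad_b = (p - y).mean()                 # ... and w.r.t. b
    w -= lr * grad_w
    b -= lr * grad_b

# High accuracy suggests the property is linearly decodable from X.
acc = (((X @ w + b) > 0) == (y > 0.5)).mean()
print(f"probe accuracy: {acc:.2f}")
```

The probe's accuracy on held-out data is then compared across models (here, clean vs. spoiled BERT) to see which representations still encode the property.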
