
PE-Study

Code for the paper:

What Do Position Embeddings Learn? An Empirical Study of Pre-Trained Language Model Positional Encoding (EMNLP 2020)

Absolute & Relative Position Regression

Run code

python3 absolute.py
python3 relative.py
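
As a rough illustration of what these scripts measure (a hedged sketch, not the repository's actual absolute.py), the position index can be regressed directly from a pre-trained model's position embedding matrix. The snippet assumes the transformers package and the bert-base-uncased checkpoint, neither of which this section's scripts necessarily use:

    # Hedged sketch of the absolute-position regression idea, not the exact
    # absolute.py: fit a linear map from each pre-trained position embedding
    # back to its index and report the fit quality (R^2).
    # Assumes transformers and the bert-base-uncased checkpoint are available.
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from transformers import BertModel

    model = BertModel.from_pretrained("bert-base-uncased")
    # BERT's learned absolute position embeddings: shape (512, 768)
    X = model.embeddings.position_embeddings.weight.detach().numpy()
    y = np.arange(X.shape[0])  # target: the position index itself

    reg = LinearRegression().fit(X, y)
    print("R^2 of absolute position regression:", reg.score(X, y))

A high R^2 means absolute order is linearly recoverable from the embeddings, which is the kind of question this experiment probes.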

Text Classification

Requirements:

torch
sklearn
python-box
tqdm

Run code

  1. cd classification

  2. Download dataset: link

  3. Configure data_path and task in config.yaml (a sketch of how these fields are read follows the steps)

  4. Run python3 main.py
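
Since python-box appears in the requirements, main.py presumably reads config.yaml roughly as below. This is a hedged sketch rather than the repository's exact code; data_path and task are the two fields step 3 asks you to set:

    # Hedged sketch of how config.yaml is likely loaded, given the python-box
    # requirement; not necessarily the repository's exact code.
    from box import Box  # python-box; YAML support may require PyYAML

    config = Box.from_yaml(filename="config.yaml")
    print(config.data_path)  # location of the downloaded dataset (step 2)
    print(config.task)       # which classification task to run (step 3)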

Language Modeling

Requirements:

torch
sklearn
transformers

Run code

  1. cd lm

  2. Download dataset: link

  3. Configure TRAIN_FILE, TEST_FILE and OUTPUT in wikitext2.sh and wikitext103.sh

  4. Run (a sketch of the underlying perplexity evaluation follows the commands):

bash wikitext2.sh
bash wikitext103.sh
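
For orientation, the kind of score these scripts produce can be reproduced by hand with the transformers requirement. The snippet below is a hedged sketch of causal-LM perplexity on a text file; the gpt2 checkpoint and the file path are placeholders, not necessarily what the scripts configure:

    # Hedged sketch of a language-modeling evaluation: perplexity of a causal
    # LM on a slice of a text file. Model name and path are placeholders.
    import math
    import torch
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2").eval()

    text = open("wikitext-2/wiki.valid.tokens").read()[:2000]  # placeholder path
    ids = tokenizer(text, return_tensors="pt").input_ids
    with torch.no_grad():
        loss = model(ids, labels=ids).loss  # mean token-level cross-entropy
    print("perplexity:", math.exp(loss.item()))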

Machine Translation

Requirements:

torch
sklearn
fairseq==0.9.0

Run code

  1. cd nmt
  2. Prepare the dataset:
bash prepare-multi30k.sh
  3. Train the models:
bash train_multi30k.sh
  4. Generate translations and evaluate:
bash generate_multi30k.sh

Reference

Main paper to be cited

@inproceedings{wang2020position,
  title={What Do Position Embeddings Learn? An Empirical Study of Pre-Trained Language Model Positional Encoding},
  author={Wang, Yu-An and Chen, Yun-Nung},
  booktitle={EMNLP 2020},
  year={2020}
}
