macaron-net

This repo contains the code and pretrained models for our paper:

Understanding and Improving Transformer From a Multi-Particle Dynamic System Point of View
Yiping Lu*, Zhuohan Li*, Di He, Zhiqing Sun, Bin Dong, Tao Qin, Liwei Wang, Tie-Yan Liu

The two sub-directories include reproducible code, pre-trained models, and instructions for the machine translation and unsupervised pretraining (BERT) tasks. Please see the READMEs in the sub-directories for detailed reproduction instructions.

Both implementations are based on the open-source fairseq (v0.6.0). The code for the unsupervised pretraining task is based on StackingBERT. Note that the code in the bert subdirectory currently cannot be used to train translation models. We are working on merging the two code bases and plan to release a unified version in the near future.
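For readers who want the core idea without digging into the fairseq fork: the paper replaces each Transformer layer with a "Macaron" layer, in which the attention sub-layer is sandwiched between two half-step feed-forward sub-layers (each FFN output scaled by 1/2), motivated by the Strang-Marchuk splitting of an ODE solver. The sketch below is a minimal, single-head numpy illustration of that structure; layer normalization, dropout, multi-head projections, and all weight/parameter names are omitted or invented for illustration and do not match the actual implementation in this repo.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def ffn(x, W1, W2):
    # Position-wise feed-forward network with ReLU activation
    return np.maximum(x @ W1, 0.0) @ W2

def self_attention(x, Wq, Wk, Wv):
    # Single-head scaled dot-product self-attention
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = softmax(q @ k.T / np.sqrt(k.shape[-1]))
    return scores @ v

def macaron_layer(x, params):
    # Half-step FFN before attention: residual update scaled by 1/2
    x = x + 0.5 * ffn(x, params["W1a"], params["W2a"])
    # Full-step self-attention sub-layer
    x = x + self_attention(x, params["Wq"], params["Wk"], params["Wv"])
    # Half-step FFN after attention: second half of the split step
    x = x + 0.5 * ffn(x, params["W1b"], params["W2b"])
    return x
```

A standard Transformer layer corresponds to one FFN step after attention; the half-FFN / attention / half-FFN ordering here mirrors how Strang splitting interleaves half-steps of one operator around a full step of the other.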

Citation

@article{lu2019understanding,
  title={Understanding and Improving Transformer From a Multi-Particle Dynamic System Point of View},
  author={Lu, Yiping and Li, Zhuohan and He, Di and Sun, Zhiqing and Dong, Bin and Qin, Tao and Wang, Liwei and Liu, Tie-Yan},
  journal={arXiv preprint arXiv:1906.02762},
  year={2019}
}