# Natural Language Processing Transformer

| No. | Model Name | Title | Links | Pub. | Organization | Release Time |
| --- | --- | --- | --- | --- | --- | --- |
| 1 | BERT | BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding | paper, code | NAACL 2019 | Google | Oct 2018 |
| 2 | GPT-3 | Language Models are Few-Shot Learners | paper | NeurIPS 2020 | OpenAI | May 2020 |
| 3 | GPT-2 | Language Models are Unsupervised Multitask Learners | paper, code | arXiv | OpenAI | Feb 2019 |
| 4 | RoBERTa | RoBERTa: A Robustly Optimized BERT Pretraining Approach | paper | arXiv | Facebook AI | Jul 2019 |
| 5 | XLNet | XLNet: Generalized Autoregressive Pretraining for Language Understanding | paper, code | NeurIPS 2019 | Google | Jun 2019 |
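
All of the models listed above have pretrained checkpoints available through the Hugging Face `transformers` library. As a minimal sketch (not part of the original list, and assuming the `transformers` and `torch` packages are installed), here is how one of them, BERT, can be loaded and used to produce contextual embeddings; the other entries work the same way with their respective checkpoint names (e.g. `gpt2`, `roberta-base`, `xlnet-base-cased`):

```python
# Minimal sketch: load a pretrained BERT checkpoint and encode a sentence.
# Assumes the Hugging Face `transformers` library and PyTorch are installed.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# Tokenize a sentence and run it through the encoder.
inputs = tokenizer("Transformers changed NLP.", return_tensors="pt")
outputs = model(**inputs)

# One contextual embedding per input token: (batch, seq_len, hidden_size).
print(outputs.last_hidden_state.shape)
```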