---
title: "What is the BART Transformer in NLP?"
date: "2021-02-15"
categories:
  - "buffer"
  - "deep-learning"
tags:
  - "bart"
  - "bert"
  - "nlp"
  - "transformers"
---

The Bidirectional and Auto-Regressive Transformer, or BART, is a Transformer that combines a bidirectional encoder (i.e., BERT-like) with an autoregressive decoder (i.e., GPT-like) into one Seq2Seq model. In other words, it returns to the original Transformer architecture proposed by Vaswani et al., albeit with a few changes.
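The key architectural difference between the two halves can be illustrated with their attention masks: the encoder lets every token attend to every other token (bidirectional), while the decoder only lets a token attend to itself and earlier positions (autoregressive). The following is a minimal NumPy sketch of that idea, not BART's actual implementation:

```python
import numpy as np

def encoder_attention_mask(n: int) -> np.ndarray:
    # Bidirectional (BERT-like) encoder: every position may attend
    # to every other position, so the mask is all ones.
    return np.ones((n, n), dtype=bool)

def decoder_attention_mask(n: int) -> np.ndarray:
    # Autoregressive (GPT-like) decoder: position i may attend only
    # to positions 0..i, giving a lower-triangular (causal) mask.
    return np.tril(np.ones((n, n), dtype=bool))

# For a sequence of 3 tokens:
print(encoder_attention_mask(3).astype(int))
# [[1 1 1]
#  [1 1 1]
#  [1 1 1]]
print(decoder_attention_mask(3).astype(int))
# [[1 0 0]
#  [1 1 0]
#  [1 1 1]]
```

In BART, the encoder's full mask lets it build context-aware representations of a (corrupted) input, while the decoder's causal mask forces it to reconstruct the output one token at a time.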