| title | date | categories | tags |
|---|---|---|---|
| What is the BART Transformer in NLP? | 2021-02-15 | | |
The Bidirectional and Auto-Regressive Transformer, or BART, is a Transformer that combines a bidirectional encoder (i.e., BERT-like) with an autoregressive decoder (i.e., GPT-like) into one Seq2Seq model. In other words, it returns to the original encoder-decoder Transformer architecture proposed by Vaswani et al., albeit with a few changes.
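To make the encoder-decoder split concrete, here is a minimal sketch using the Hugging Face `transformers` library (an assumption on my part, not something the original text prescribes). It loads a pretrained BART checkpoint, lets the bidirectional encoder read the full input, and has the autoregressive decoder generate an output sequence token by token.

```python
# Minimal sketch, assuming Hugging Face Transformers is installed (pip install transformers).
from transformers import BartTokenizer, BartForConditionalGeneration

# "facebook/bart-large-cnn" is a publicly available BART checkpoint fine-tuned for summarization.
tokenizer = BartTokenizer.from_pretrained("facebook/bart-large-cnn")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-large-cnn")

text = (
    "BART combines a bidirectional encoder with an autoregressive decoder "
    "into a single sequence-to-sequence model."
)

# The encoder sees the whole input at once, attending in both directions (BERT-like).
inputs = tokenizer(text, return_tensors="pt")

# The decoder then generates the output left to right, one token at a time (GPT-like).
summary_ids = model.generate(inputs["input_ids"], max_length=30, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```

The point of the sketch is the shape of the model, not the particular checkpoint: any Seq2Seq task (summarization, translation, generative question answering) follows the same encode-then-generate pattern.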