Attention

I first became familiar with attention through the paper "Attention is all you need" [1]. That paper, however, did not introduce attention itself; rather, it built a pure-attention architecture, the Transformer, to replace the earlier approach of mixing recurrent and convolutional components. I will discuss the Transformer later; this note focuses on vanilla attention, with a minimal sketch of the core computation below.
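As a reference point for what "vanilla attention" computes, here is a minimal sketch: score each key against a query, normalize the scores with a softmax, and return the weighted sum of the values. The dot-product scoring function, the function name, and the shapes are illustrative assumptions on my part (the original Bahdanau-style formulation scores with a small feed-forward network instead), not something specified in this note.

```python
import numpy as np

def attention(query, keys, values):
    """Vanilla attention sketch.

    query:  (d,)      a single query vector
    keys:   (n, d)    one key per source position
    values: (n, d_v)  one value per source position
    returns (d_v,)    context vector: softmax-weighted sum of values
    """
    scores = keys @ query                    # (n,) alignment scores (dot-product assumed)
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    return weights @ values                  # weighted sum over source positions

# Toy usage: 4 source positions, 8-dim keys and values.
rng = np.random.default_rng(0)
q = rng.standard_normal(8)
K = rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8))
print(attention(q, K, V).shape)  # (8,)
```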

References

  1. Vaswani, Ashish, et al. "Attention is all you need." Advances in Neural Information Processing Systems 30 (2017): 5998-6008.
  2. Weng, Lilian. "Attention? Attention!" Lil'Log (2018).