RNN

The fall of RNN / LSTM https://towardsdatascience.com/the-fall-of-rnn-lstm-2d1594c74ce0

Attention? Attention! https://lilianweng.github.io/lil-log/2018/06/24/attention-attention.html

Sequence To Sequence Attention Models In DyNet https://talbaumel.github.io/blog/attention/

Recent Attention Trends in the Deep Learning (NLP) Community https://www.slideshare.net/yutakikuchi927/deep-learning-nlp-attention

Attention-Augmented Recurrent Neural Networks https://deepage.net/deep_learning/2017/03/03/attention-augmented-recurrent-neural-networks.html

The Unbearable Shortness of Attention in Natural Language Processing https://qiita.com/icoxfog417/items/f170666d81f773e4b1a7

Beyond Seq2Seq + Attention https://qiita.com/ymym3412/items/c84e6254de89c9952c55

Attention Is All You Need: Paper Review https://github.com/YBIGTA/DeepNLP-Study/wiki/Attention-Is-All-You-Need-%EB%85%BC%EB%AC%B8%EB%A6%AC%EB%B7%B0

Paper Explanation: Attention Is All You Need (Transformer) http://deeplearning.hatenablog.com/entry/transformer

[Part 1] Deep Learning for Natural Language Processing: Theory behind Neural Machine Translation http://deeplearning.hatenablog.com/entry/neural_machine_translation_theory

[DL Reading Group] Pervasive Attention: 2D Convolutional Neural Networks for Sequence-to-Sequence Prediction https://www.slideshare.net/DeepLearningJP2016/dlpervasive-attention-2d-convolutional-neural-networks-for-sequencetosequence-prediction

An Empirical Evaluation of Generic Convolutional and Recurrent Networks for Sequence Modeling https://arxiv.org/pdf/1803.01271.pdf

Memory, attention, sequences https://towardsdatascience.com/memory-attention-sequences-37456d271992

Graph Attention Networks http://petar-v.com/GAT/
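
Most of the links above center on the attention mechanism that displaced recurrent models for sequence tasks. As a quick reference, here is a minimal NumPy sketch of scaled dot-product attention as described in "Attention Is All You Need"; the function and variable names are illustrative only and are not taken from any of the linked posts.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for 2-D query/key/value matrices."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                       # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)        # row-wise softmax
    return weights @ V                                    # weighted sum of values

# Toy example: 3 query positions attending over 4 key/value positions, d_k = 8
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
print(scaled_dot_product_attention(Q, K, V).shape)        # -> (3, 8)
```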
