AttentioNN

All about attention in neural networks, described as Colab notebooks: soft attention, attention maps, local and global attention, and multi-head attention.

Notebooks

| Name | Description |
| --- | --- |
| Attention maps | How a CNN attends to image objects |
| Attention in NMT | Attention mechanism in neural machine translation |
| Attention in image captioning | Attention in image captioning using soft attention and double stochastic regularization |
| Transformer I | Positional encoding, multi-head attention and point-wise feed-forward neural networks |
| Transformer II | Masked multi-head attention with layer normalization |
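The sketches below illustrate some of these building blocks in plain NumPy; they are illustrative code, not taken from the notebooks. For the attention-maps notebook, one simple way to visualize where a CNN responds is to collapse a convolutional activation volume into a single spatial map. The function name and the (H, W, C) layout below are assumptions for the example, not the notebook's actual code.

```python
import numpy as np

def attention_map(feature_maps):
    """Collapse a CNN activation volume (H, W, C) into one spatial map.

    A hypothetical post-hoc visualization: average the channel activations
    at each spatial location and normalize to [0, 1]. The map can then be
    upsampled to the input resolution and overlaid on the image to show
    which regions the network responds to most strongly.
    """
    amap = feature_maps.mean(axis=-1)         # (H, W): mean over channels
    amap -= amap.min()
    amap /= amap.max() + 1e-8                 # normalize to [0, 1]
    return amap

# Random activations stand in for a real conv layer's output.
fake_activations = np.random.rand(7, 7, 512)
print(attention_map(fake_activations).shape)  # (7, 7)
```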
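The NMT and image-captioning notebooks both build on soft attention: the decoder scores a set of annotation vectors (encoder states in NMT, image feature vectors in captioning), turns the scores into weights with a softmax, and takes the weighted sum as a context vector. A minimal sketch with dot-product scoring; the names `query` and `annotations` are illustrative, and real models would use learned score functions.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def soft_attention(query, annotations):
    """Dot-product soft attention over a set of annotation vectors.

    query:       (d,)   current decoder state
    annotations: (n, d) encoder states (NMT) or image features (captioning)

    Returns the context vector (weighted sum) and the attention weights,
    which sum to 1 over the n annotations.
    """
    scores = annotations @ query               # (n,) alignment scores
    weights = softmax(scores)                  # (n,) soft attention weights
    context = weights @ annotations            # (d,) expected annotation
    return context, weights

decoder_state = np.random.randn(8)
encoder_states = np.random.randn(5, 8)
context, weights = soft_attention(decoder_state, encoder_states)
print(weights.sum())                           # ~1.0
```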
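For Transformer I, a compact sketch of sinusoidal positional encoding and multi-head scaled dot-product self-attention. The projection matrices here are random placeholders standing in for learned parameters, and the helper names are assumptions for the example.

```python
import numpy as np

def positional_encoding(length, d_model):
    """Sinusoidal positional encoding, as in "Attention Is All You Need"."""
    pos = np.arange(length)[:, None]                        # (length, 1)
    i = np.arange(d_model)[None, :]                         # (1, d_model)
    angles = pos / np.power(10000, (2 * (i // 2)) / d_model)
    enc = np.zeros((length, d_model))
    enc[:, 0::2] = np.sin(angles[:, 0::2])                  # even indices
    enc[:, 1::2] = np.cos(angles[:, 1::2])                  # odd indices
    return enc

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, num_heads, rng):
    """Self-attention with num_heads independent scaled dot-product heads.

    x: (seq_len, d_model). Projections are random placeholders; in the
    notebooks they would be learned parameters.
    """
    seq_len, d_model = x.shape
    d_head = d_model // num_heads
    wq, wk, wv, wo = (rng.standard_normal((d_model, d_model)) for _ in range(4))

    # Project, then split the model dimension into heads: (heads, seq, d_head)
    def split(t):
        return t.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    q, k, v = split(x @ wq), split(x @ wk), split(x @ wv)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)     # (heads, seq, seq)
    heads = softmax(scores) @ v                              # (heads, seq, d_head)

    # Concatenate heads back to (seq_len, d_model) and apply the output projection.
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ wo

rng = np.random.default_rng(0)
tokens = rng.standard_normal((10, 64)) + positional_encoding(10, 64)
print(multi_head_attention(tokens, num_heads=8, rng=rng).shape)  # (10, 64)
```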
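For Transformer II, the decoder's self-attention applies a causal (look-ahead) mask so a position cannot attend to later positions, and the sublayer output goes through a residual connection and layer normalization. A single-head sketch for brevity; the notebook itself covers the multi-head case.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def causal_mask(seq_len):
    """Lower-triangular mask: position i may only attend to positions <= i."""
    return np.tril(np.ones((seq_len, seq_len), dtype=bool))

def masked_attention(q, k, v):
    """Scaled dot-product attention with a causal (look-ahead) mask."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                           # (seq, seq)
    scores = np.where(causal_mask(len(q)), scores, -1e9)    # block future positions
    return softmax(scores) @ v

def layer_norm(x, eps=1e-6):
    """Normalize each position's features to zero mean and unit variance."""
    mean = x.mean(axis=-1, keepdims=True)
    std = x.std(axis=-1, keepdims=True)
    return (x - mean) / (std + eps)

x = np.random.randn(6, 16)
# Residual connection followed by layer normalization, as in a decoder block.
out = layer_norm(x + masked_attention(x, x, x))
print(out.shape)                                            # (6, 16)
```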
