Action recognition using soft attention based deep recurrent neural networks
An implementation of the Show, Attend and Tell paper in TensorFlow, for the OpenAI Im2LaTeX suggested problem
Learning to Auto-Complete using RNN Language Models
TensorFlow implementation of "Pointer Networks"
Neural Machine Translation with Attention (Dynet)
Hierarchical Attention Network for Document Classification
Models for identifying question pairs that have the same intent, implemented in TensorFlow.
TensorFlow implementation of Im2Latex
An attention mechanism implementation in Keras with Theano as the backend.
Multimodal Deep Attention Recurrent Q-Network for perceivable social human-robot interaction.
Keras implementation of https://arxiv.org/pdf/1602.03609.pdf
My bachelor's degree thesis (with code and experiments) on sentiment classification of Russian texts using Bi-RNN with attention mechanism.
Unofficial implementation of Show, Ask, Attend, and Answer VQA model in Keras
Multigrid Neural Architecture
Chinese Poetry Generation
Transformer from "Attention Is All You Need" (Vaswani et al. 2017), implemented in Chainer.
Sequence-to-sequence model using TensorFlow
Deep-learning model presented in "DataStories at SemEval-2017 Task 6: Siamese LSTM with Attention for Humorous Text Comparison".
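The repositories above all build on the same core idea of soft attention. As a minimal illustrative sketch (the function name and NumPy formulation are my own, not taken from any listed repository), the scaled dot-product attention of "Attention Is All You Need" can be written as:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Soft attention: weight the values V by query-key similarity."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # (n_queries, n_keys) similarity logits
    # numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Toy example: 2 queries attending over 3 key/value pairs
rng = np.random.default_rng(0)
Q = rng.standard_normal((2, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out, w = scaled_dot_product_attention(Q, K, V)
```

Each output row is a convex combination of the value rows, so the attention weights for every query sum to one; the listed implementations differ mainly in how the queries, keys, and values are produced (RNN states, CNN features, embeddings).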