seq2seq model enhanced with attention mechanism
Experiments in meta-learning visual attention in convolutional neural networks.
NLP intro project
Project 3 of Term 1 in the Udacity Self Driving Car Nanodegree
Relational Attention in PyTorch
Environmental Sound Classification Using an Attention-Based Residual Neural Network
Implementation of Deep Learning based Language Models from scratch in PyTorch
Attention-based Adaptive filter designing for keyword classification
NER for Chinese electronic medical records, using doc2vec, self_attention, and multi_attention.
Simple from-scratch implementations of transformer-based models that match the state of the art.
Using a 3D Nearby Self-Attention Transformer to leverage the spatiotemporal nature of video for representation learning.
A model for extracting numeral relations in FinNum2
Implementation of "Attention Is All You Need" (2017)
A repository of Transformer ("Attention Is All You Need") implementations in PyTorch.
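The core operation behind the "Attention Is All You Need" implementations listed above is scaled dot-product attention. A minimal NumPy sketch is shown below; the function name, shapes, and random inputs are illustrative assumptions, not code from any of these repositories:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K: (seq_len, d_k); V: (seq_len, d_v). Returns (seq_len, d_v).

    Illustrative sketch of Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V
    from Vaswani et al. (2017); no masking or multi-head projection.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                        # pairwise similarities
    scores -= scores.max(axis=-1, keepdims=True)           # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)         # row-wise softmax
    return weights @ V                                     # weighted sum of values

# Hypothetical toy inputs just to show the shapes involved.
rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (4, 8)
```

Multi-head attention, as used in the full Transformer, runs several such attentions in parallel over learned linear projections of Q, K, and V and concatenates the results.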