Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in PyTorch
All kinds of text classification models, and more, with deep learning
A TensorFlow Implementation of the Transformer: Attention Is All You Need
RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable), combining the best of RNNs and transformers: great performance, fast inference, low VRAM usage, fast training, "infinite" ctx_len, and free sentence embeddings.
A collection of important graph embedding, classification and representation learning papers with implementations.
Show, Attend, and Tell | A PyTorch Tutorial to Image Captioning
PyTorch implementation of the Graph Attention Network model by Veličković et al. (2017, https://arxiv.org/abs/1710.10903)
Keras Attention Layer (Luong and Bahdanau scores).
Implementation / replication of DALL-E, OpenAI's text-to-image transformer, in PyTorch
Graph Attention Networks (https://arxiv.org/abs/1710.10903)
Text classifier for Hierarchical Attention Networks for Document Classification
A simple but complete full-attention transformer with a set of promising experimental features from various papers
Sequence-to-sequence framework with a focus on Neural Machine Translation based on PyTorch
Reformer, the efficient Transformer, in PyTorch
To eventually become an unofficial PyTorch implementation / replication of AlphaFold2, as details of the architecture get released
PyTorch implementation of "Get To The Point: Summarization with Pointer-Generator Networks"
Visualizing RNNs using the attention mechanism
A deep learning library for EEG signal classification tasks, based on TensorFlow.
Implementation of LambdaNetworks, a new approach to image recognition that reaches SOTA with less compute
MORAN: A Multi-Object Rectified Attention Network for Scene Text Recognition
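Nearly all of the repositories above build on the same core operation: scaled dot-product attention from "Attention Is All You Need" (https://arxiv.org/abs/1706.03762). A minimal NumPy sketch of that operation is below; the function name and toy shapes are illustrative, not taken from any of the listed libraries.

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Compute softmax(Q K^T / sqrt(d)) V for 2-D query/key/value arrays."""
    d = q.shape[-1]
    # Similarity scores between every query and every key, scaled by sqrt(d)
    scores = q @ k.swapaxes(-2, -1) / np.sqrt(d)
    # Numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Weighted sum of values; also return the attention weights themselves
    return weights @ v, weights

# Toy self-attention example: 3 tokens, each a 4-dimensional vector
rng = np.random.default_rng(0)
x = rng.standard_normal((3, 4))
out, w = scaled_dot_product_attention(x, x, x)
print(out.shape, w.shape)  # (3, 4) (3, 3)
```

Each row of `w` is a probability distribution over the keys, so the output is a convex combination of the value vectors; the transformer, graph-attention, and pointer-network projects above all specialize this pattern with learned projections for Q, K, and V.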