A Jax-based library for designing and training transformer models from scratch.
The official code of "CSTA: CNN-based Spatiotemporal Attention for Video Summarization"
Orchestrate swarms of agents from any framework, such as OpenAI and LangChain, for business operation automation. Join our community: https://discord.gg/DbjBMJTSWD
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in PyTorch
Self-configuring and adapting vision transformer for segmentation of 3D images
Gradient Frequency Attention: Tell Neural Networks where speaker information is.
X-Llama🦙 is an extensible, advanced language model framework inspired by the original Llama model.
Sequence-to-sequence framework with a focus on Neural Machine Translation based on PyTorch
Simplified Implementations of SOTA Deep Learning Papers in PyTorch
A transformer-based model for learning datasets of multiband light curves.
A novel implementation fusing ViT with Mamba into a fast, agile, and high-performance multi-modal model. Powered by Zeta, the simplest AI framework ever.
PyTorch implementation for the "Convolutional Hierarchical Attention Network for Query-Focused Video Summarization" paper, accepted at AAAI 2020.
Official implementation of "Attention Spiking Neural Networks" (IEEE T-PAMI 2023)
LSTM-ARIMA with Attention and Multiplicative Decomposition for Sophisticated Stock Forecasting.
Implementation of Liquid Nets in PyTorch
Open source implementation of "Vision Transformers Need Registers"
Implementation of the model "AudioFlamingo" from the paper: "Audio Flamingo: A Novel Audio Language Model with Few-Shot Learning and Dialogue Abilities"
PyTorch implementation of the sparse attention from the paper: "Generating Long Sequences with Sparse Transformers"
Text Summarization Modeling with three different Attention Types
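
All of the transformer projects listed above build on the same core primitive. As a generic point of reference, here is a minimal sketch of scaled dot-product attention in PyTorch; it illustrates the mechanism this topic covers and is not code from any specific repository listed here.

```python
# Minimal, generic sketch of scaled dot-product attention (illustration
# only; not drawn from any repository on this page).
import math
import torch

def scaled_dot_product_attention(q, k, v, mask=None):
    """q, k, v: (batch, heads, seq_len, head_dim) tensors."""
    d_k = q.size(-1)
    # Query-key similarity scores, scaled by sqrt(d_k) to keep the
    # softmax gradients well-behaved.
    scores = torch.matmul(q, k.transpose(-2, -1)) / math.sqrt(d_k)
    if mask is not None:
        # Positions where mask == 0 are excluded from attention.
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = torch.softmax(scores, dim=-1)  # attention distribution
    return torch.matmul(weights, v)          # weighted sum of values

# Usage: 2 sequences, 4 heads, 8 tokens, 16-dim heads.
q = k = v = torch.randn(2, 4, 8, 16)
out = scaled_dot_product_attention(q, k, v)
print(out.shape)  # torch.Size([2, 4, 8, 16])
```

The listed repositories differ mainly in what they wrap around this operation: patch embeddings for vision (ViT), sparsified score matrices for long sequences, spiking or frequency-domain variants, and so on.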