Official implementation of the paper "FedLSF: Federated Local Graph Learning via Specformers"
Transformer model based on the research paper: "Attention Is All You Need"
Flexible Python library providing building blocks (layers) for reproducible Transformers research (TensorFlow ✅, PyTorch 🔜, and JAX 🔜)
GPT model that can take a text file from anywhere on the internet and imitate the linguistic style of the text
3D Printing Extrusion Detection using Multi-Head Attention Model
PyTorch implementation of the Transformer architecture from the paper "Attention Is All You Need". Includes an implementation of the attention mechanism.
A from-scratch implementation of the Transformer as presented in the paper "Attention Is All You Need".
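For orientation, here is a minimal PyTorch sketch of the multi-head scaled dot-product attention that the entries above implement. The class name and default sizes (d_model=512, 8 heads, as in the paper) are illustrative assumptions, not code from any repository listed here.

```python
# Minimal sketch of multi-head scaled dot-product attention from
# "Attention Is All You Need"; names and defaults are illustrative.
import math
import torch
import torch.nn as nn

class MultiHeadAttention(nn.Module):
    def __init__(self, d_model: int = 512, num_heads: int = 8):
        super().__init__()
        assert d_model % num_heads == 0
        self.d_k = d_model // num_heads
        self.num_heads = num_heads
        # One projection each for queries, keys, values, plus the output projection.
        self.w_q = nn.Linear(d_model, d_model)
        self.w_k = nn.Linear(d_model, d_model)
        self.w_v = nn.Linear(d_model, d_model)
        self.w_o = nn.Linear(d_model, d_model)

    def forward(self, q, k, v, mask=None):
        batch, seq_len, _ = q.shape

        # Project and split into heads: (batch, heads, seq, d_k).
        def split(x, proj):
            return proj(x).view(batch, -1, self.num_heads, self.d_k).transpose(1, 2)

        q, k, v = split(q, self.w_q), split(k, self.w_k), split(v, self.w_v)
        # Scaled dot-product attention: softmax(QK^T / sqrt(d_k)) V.
        scores = q @ k.transpose(-2, -1) / math.sqrt(self.d_k)
        if mask is not None:
            scores = scores.masked_fill(mask == 0, float("-inf"))
        out = torch.softmax(scores, dim=-1) @ v
        # Recombine heads and apply the output projection.
        out = out.transpose(1, 2).contiguous().view(batch, seq_len, -1)
        return self.w_o(out)
```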
A repository of attention mechanism implementations in PyTorch.
A PyTorch Implementation of PGL-SUM from "Combining Global and Local Attention with Positional Encoding for Video Summarization", Proc. IEEE ISM 2021
Implementation of the "Attention Is All You Need" paper
Synthesizer self-attention is a recent alternative to causal self-attention with potential benefits from removing the query-key dot product.
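As a rough illustration of that idea, below is a hedged sketch of the dense variant of Synthesizer attention (Tay et al., 2020), in which an MLP predicts each token's attention logits directly, so no query-key dot product is ever computed. The class name, the max_len cap, and the defaults are assumptions for illustration, not the repository's own API.

```python
# Sketch of Dense Synthesizer self-attention: attention logits come from
# an MLP over each token alone, replacing the pairwise Q.K^T term.
# max_len and all names are illustrative assumptions.
import torch
import torch.nn as nn

class DenseSynthesizerAttention(nn.Module):
    def __init__(self, d_model: int = 512, max_len: int = 256):
        super().__init__()
        self.max_len = max_len
        # MLP that maps each token embedding straight to a row of
        # attention logits over all positions -- no dot product.
        self.synthesize = nn.Sequential(
            nn.Linear(d_model, d_model),
            nn.ReLU(),
            nn.Linear(d_model, max_len),
        )
        self.w_v = nn.Linear(d_model, d_model)

    def forward(self, x, causal: bool = True):
        # x: (batch, seq, d_model); sequences must fit within max_len.
        batch, seq_len, _ = x.shape
        # (batch, seq, seq): the logits for row i depend only on token i.
        logits = self.synthesize(x)[..., :seq_len]
        if causal:
            # Mask out future positions, as in causal self-attention.
            mask = torch.triu(torch.ones(seq_len, seq_len, dtype=torch.bool,
                                         device=x.device), diagonal=1)
            logits = logits.masked_fill(mask, float("-inf"))
        attn = torch.softmax(logits, dim=-1)
        return attn @ self.w_v(x)
```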
A TensorFlow 2/Keras implementation of Graph Attention Network embeddings, with a trainable layer for multi-head graph attention.
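To show what such a layer computes, here is a minimal sketch of GAT-style multi-head graph attention (Veličković et al., 2018) over a dense adjacency matrix. It is written in PyTorch for consistency with the other sketches (the package above is TensorFlow 2/Keras), and all names and defaults are illustrative.

```python
# Sketch of multi-head graph attention over a dense adjacency matrix.
# Assumes adj is boolean and includes self-loops; names are illustrative.
import torch
import torch.nn as nn

class MultiHeadGraphAttention(nn.Module):
    def __init__(self, in_dim: int, out_dim: int, num_heads: int = 4):
        super().__init__()
        self.h, self.d = num_heads, out_dim
        self.w = nn.Linear(in_dim, num_heads * out_dim, bias=False)
        # Attention vector a = [a_src || a_dst], one pair per head.
        self.a_src = nn.Parameter(torch.randn(num_heads, out_dim) * 0.1)
        self.a_dst = nn.Parameter(torch.randn(num_heads, out_dim) * 0.1)

    def forward(self, x, adj):
        # x: (nodes, in_dim); adj: (nodes, nodes) boolean adjacency.
        n = x.size(0)
        z = self.w(x).view(n, self.h, self.d)            # (n, heads, d)
        # e[i, j, h] = LeakyReLU(a_src . z_i + a_dst . z_j), per head.
        e = ((z * self.a_src).sum(-1).unsqueeze(1)       # (n, 1, heads)
             + (z * self.a_dst).sum(-1).unsqueeze(0))    # (1, n, heads)
        e = nn.functional.leaky_relu(e, 0.2)
        # Restrict attention to graph edges, then normalize over neighbors j.
        e = e.masked_fill(~adj.unsqueeze(-1), float("-inf"))
        alpha = torch.softmax(e, dim=1)
        # Aggregate neighbor features per head, then concatenate heads.
        out = torch.einsum("ijh,jhd->ihd", alpha, z)
        return out.reshape(n, self.h * self.d)
```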
A list of efficient attention modules.
A Transformer encoder whose embedding size can be down-sized.
An experimental project for autonomous-vehicle driving perception: steering-angle prediction and semantic segmentation using a combination of UNet, attention, and Transformers.
Joint text classification on multiple levels with multiple labels, using a multi-head attention mechanism to wire two prediction tasks together.
A chatbot built with TensorFlow (the model is a Transformer).