A minimalist 45-minute implementation of the transformer backbone (encoder and decoder)
Practice implementations of computer vision models based on PyTorch and einops.
Replication of "Attention Is All You Need" (Vaswani et al., 2017)
Selects sentence pairs, from lines of sentences, that would plausibly read to a human as natural real-world chit-chat.
Improves the Attentive State-Space Model with a transformer.
GPT-2-style architecture for training language generators for specific tasks. [Production Ready]
PyTorch implementation of MPI-based distributed dot-product attention
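For reference, the scaled dot-product attention that these repositories build on can be sketched in a few lines of NumPy. This is a minimal single-process illustration of the standard formula softmax(QK^T / sqrt(d_k))V, not the MPI-distributed version from the repository above; all names and shapes here are illustrative.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-2, -1) / np.sqrt(d_k)  # (queries, keys)
    weights = softmax(scores, axis=-1)               # rows sum to 1
    return weights @ V, weights

# Toy example: 4 query positions, 6 key/value positions, d_k = 8.
rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))
K = rng.standard_normal((6, 8))
V = rng.standard_normal((6, 8))
out, w = scaled_dot_product_attention(Q, K, V)
```

The output has one row per query position, and each query's attention weights over the keys form a probability distribution.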
Talk with your transformer buddy about counseling
Conversational Agents Collection
Implements fast decoding for the transformer
Decoder model for language modelling
Automated News Aggregation Tool
My own implementation of ICML 2019 paper: Set Transformer: A Framework for Attention-based Permutation-Invariant Neural Networks
Implementation of the transformer using pytorch_lightning
Implementation of Power Law Graph Transformer for Machine Translation and Representation Learning.
A Malware Detection Project during UC Davis Summer Research Program