Yet another tensorflow implementation of "Attention is all you need" (a.k.a. Transformer)
Updated Dec 28, 2017 - Python
pytorch Transformer model with byte-pair encoding
Adversarial Machine Translation with pytorch
Implementation of Attention is all you need Transformer in Pytorch
TensorFlow implementation of 'Attention Is All You Need (2017. 6)'
Transformer-based implementation of "An Efficient Framework For Learning Sentence Representations", Logeswaran et al.
An open-source neural machine translation system developed by Natural Language Processing Group, Nanjing University.
Google AI BERT 2018 pytorch implementation
Google AI 2018 BERT pytorch implementation
A simple Neural Network for sentiment analysis, embedding sentences using a Transformer network.
Image classification with NVIDIA TensorRT from TensorFlow models.
Code and Configuration for Bringing self-attention architectures into real world scenarios
Simple Tensorflow Implementation of Transformer introduced by "Attention is All You Need" (NIPS 2017)
Baseline for a new Kaggle competition: a BERT fine-tuning approach plus a tensor2tensor-based Transformer Encoder approach
Fine-grained Sentiment Analysis of User Reviews --- AI CHALLENGER 2018
A Pytorch implementation of Transformer and Weighted Transformer
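Most of the repositories above implement the same core operation from "Attention Is All You Need": scaled dot-product attention. As a rough illustration (a minimal numpy sketch, not taken from any listed repository; names and shapes are illustrative):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # query-key similarities, scaled
    # numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # attention-weighted sum of values

# toy example: 3 queries attending over 4 key/value pairs, d_k = 8
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 8)
```

The full Transformer wraps this in multi-head attention, residual connections, and position-wise feed-forward layers; the repositories above differ mainly in framework (TensorFlow vs. PyTorch) and training setup.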