Latest commit cd97437 Apr 21, 2019

README.md

Attention is all you need (Keras) [WIP]

Implementation of the Transformer architecture described by Vaswani et al. in "Attention Is All You Need" using the Keras Utility & Layer Collection (kulc).

This repository contains the code to create the model, train it, and evaluate it. It also contains utility code to load an English-to-German translation dataset (en2de) for running the experiments.
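At the core of the Transformer is scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. The sketch below illustrates it in plain NumPy; it is a minimal illustration, not the repository's kulc-based implementation, and all names in it are hypothetical.

```python
import numpy as np

def scaled_dot_product_attention(q, k, v, mask=None):
    # q, k, v: arrays of shape (batch, seq_len, d_k)
    d_k = q.shape[-1]
    # similarity scores between queries and keys, scaled by sqrt(d_k)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_k)
    if mask is not None:
        # masked positions get a large negative score -> ~0 weight after softmax
        scores = np.where(mask, scores, -1e9)
    # numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # each output position is a weighted average of the value vectors
    return weights @ v

# toy self-attention example: batch of 1, 3 positions, d_k = 4
x = np.random.rand(1, 3, 4)
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (1, 3, 4)
```

In the full model this operation is applied in parallel over several heads (multi-head attention), each with its own learned projections of Q, K, and V.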
