
An-Implementation-of-Transformer

This repository is an implementation of the Transformer architecture in PyTorch.

All components needed to build a Transformer are contained in transformer_package.py.
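
As a rough illustration of the kind of components such a file provides, here is a minimal sketch of scaled dot-product attention and multi-head attention. The class names and signatures below are illustrative assumptions and may differ from the actual identifiers in transformer_package.py.

```python
# Illustrative sketch only; not the repo's actual implementation.
import math
import torch
import torch.nn as nn

def scaled_dot_product_attention(q, k, v, mask=None):
    # q, k, v: (batch, heads, seq_len, d_k)
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = scores.softmax(dim=-1)
    return weights @ v, weights

class MultiHeadAttention(nn.Module):
    def __init__(self, d_model, num_heads):
        super().__init__()
        assert d_model % num_heads == 0
        self.d_k = d_model // num_heads
        self.num_heads = num_heads
        self.w_q = nn.Linear(d_model, d_model)
        self.w_k = nn.Linear(d_model, d_model)
        self.w_v = nn.Linear(d_model, d_model)
        self.w_o = nn.Linear(d_model, d_model)

    def forward(self, q, k, v, mask=None):
        batch = q.size(0)
        # Project, then split into heads: (batch, heads, seq_len, d_k)
        q, k, v = [
            w(x).view(batch, -1, self.num_heads, self.d_k).transpose(1, 2)
            for w, x in zip((self.w_q, self.w_k, self.w_v), (q, k, v))
        ]
        out, _ = scaled_dot_product_attention(q, k, v, mask)
        # Merge heads back: (batch, seq_len, d_model)
        out = out.transpose(1, 2).contiguous().view(batch, -1, self.num_heads * self.d_k)
        return self.w_o(out)
```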

copy-task.ipynb (a simple copy task) and Multi30k German-English Translation task.ipynb are two examples of how to use the Transformer, covering data preparation, training, and evaluation.
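
To give a sense of what the copy task involves, here is a self-contained training-loop sketch. It uses PyTorch's built-in torch.nn.Transformer instead of the components in transformer_package.py, and all hyperparameters and variable names are placeholders rather than the notebooks' actual code.

```python
# Copy-task sketch: the model learns to reproduce its input sequence.
import torch
import torch.nn as nn

torch.manual_seed(0)
vocab_size, d_model, seq_len, batch_size = 11, 64, 9, 32

embed = nn.Embedding(vocab_size, d_model)
pos = nn.Embedding(seq_len, d_model)          # learned positional embedding
core = nn.Transformer(d_model=d_model, nhead=4, num_encoder_layers=2,
                      num_decoder_layers=2, dim_feedforward=128,
                      batch_first=True)
generator = nn.Linear(d_model, vocab_size)

params = (list(embed.parameters()) + list(pos.parameters())
          + list(core.parameters()) + list(generator.parameters()))
optimizer = torch.optim.Adam(params, lr=1e-3)
criterion = nn.CrossEntropyLoss()

def embed_with_pos(tokens):
    # token embedding + positional embedding
    return embed(tokens) + pos(torch.arange(tokens.size(1)))

for step in range(200):
    # The target of the copy task is the source sequence itself.
    src = torch.randint(1, vocab_size, (batch_size, seq_len))
    tgt_in, tgt_out = src[:, :-1], src[:, 1:]        # teacher forcing: shift by one
    causal_mask = torch.triu(
        torch.full((tgt_in.size(1), tgt_in.size(1)), float("-inf")), diagonal=1
    )
    out = core(embed_with_pos(src), embed_with_pos(tgt_in), tgt_mask=causal_mask)
    loss = criterion(generator(out).reshape(-1, vocab_size), tgt_out.reshape(-1))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```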

gpt.py is a reimplementation of Andrej Karpathy's nanoGPT, stripped of distracting parts such as GPU usage and distributed training.
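
For a rough idea of what a GPU-free, single-process GPT looks like, below is a minimal CPU-only sketch built from standard PyTorch modules. The class name, hyperparameters, and training step are assumptions for illustration; gpt.py's actual code may differ.

```python
# Minimal decoder-only language model sketch, entirely on CPU.
import torch
import torch.nn as nn

class TinyGPT(nn.Module):
    def __init__(self, vocab_size, d_model=64, nhead=4, num_layers=2, block_size=32):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        self.pos_emb = nn.Embedding(block_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, dim_feedforward=4 * d_model,
                                           batch_first=True, norm_first=True)
        self.blocks = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(d_model, vocab_size)

    def forward(self, idx):
        # idx: (batch, seq_len) token ids; a causal mask makes the stack decoder-only.
        seq_len = idx.size(1)
        x = self.tok_emb(idx) + self.pos_emb(torch.arange(seq_len))
        causal = torch.triu(torch.full((seq_len, seq_len), float("-inf")), diagonal=1)
        x = self.blocks(x, mask=causal)
        return self.head(x)

# One next-token-prediction step on random token ids.
model = TinyGPT(vocab_size=65)
tokens = torch.randint(0, 65, (8, 33))
logits = model(tokens[:, :-1])                       # predict token t+1 from tokens <= t
loss = nn.functional.cross_entropy(logits.reshape(-1, 65), tokens[:, 1:].reshape(-1))
loss.backward()
```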

I have also written a blog post introducing Attention, the Transformer, and GPT.

References

http://nlp.seas.harvard.edu/annotated-transformer/

https://github.com/jadore801120/attention-is-all-you-need-pytorch

https://github.com/karpathy/nanoGPT
