
mini seq2seq

Minimal Seq2Seq model with attention for neural machine translation in PyTorch.

This implementation focuses on the following features:

  • Modular structure that can be reused in other projects
  • Minimal code for readability
  • Full utilization of batches and the GPU
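The core of a seq2seq model with attention is the step where the decoder scores each encoder output and forms a weighted context vector. The repo does not say which attention variant it uses, so the sketch below assumes Luong-style dot-product attention, written to operate on whole batches as the feature list describes:

```python
import torch
import torch.nn.functional as F

def dot_attention(decoder_state, encoder_outputs, mask=None):
    """Batched dot-product attention.

    decoder_state:   (batch, hidden)          current decoder hidden state
    encoder_outputs: (batch, src_len, hidden) all encoder states
    mask:            (batch, src_len)         1 for real tokens, 0 for padding
    """
    # Score every source position against the decoder state: (batch, src_len)
    scores = torch.bmm(encoder_outputs, decoder_state.unsqueeze(2)).squeeze(2)
    if mask is not None:
        # Padding positions get -inf so softmax assigns them zero weight
        scores = scores.masked_fill(mask == 0, float('-inf'))
    # Normalize scores into attention weights over source positions
    weights = F.softmax(scores, dim=1)
    # Weighted sum of encoder outputs: (batch, hidden)
    context = torch.bmm(weights.unsqueeze(1), encoder_outputs).squeeze(1)
    return context, weights

batch, src_len, hidden = 4, 7, 16
dec = torch.randn(batch, hidden)
enc = torch.randn(batch, src_len, hidden)
context, weights = dot_attention(dec, enc)
```

The context vector is then typically concatenated with the decoder state before predicting the next target token; because everything above is expressed with batched matrix products, it runs one decoding step for the whole batch at once on the GPU.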

This implementation relies on torchtext to minimize dataset management and preprocessing code.

Model description


Requirements

  • GPU & CUDA
  • Python3
  • PyTorch
  • torchtext
  • spaCy
  • numpy
  • Visdom (optional)

Download the spaCy tokenizers as follows:

sudo python3 -m spacy download de
sudo python3 -m spacy download en


Based on the following implementations
