A Seq2Seq model with attention for neural machine translation in PyTorch.
This implementation focuses on the following features:
- Modular structure to be used in other projects
- Minimal code for readability
- Full utilization of batching and the GPU
This implementation also uses BLEU, NIST, TER, and METEOR scores to evaluate the model.
- Encoder: Bidirectional GRU
- Decoder: GRU with Attention Mechanism
- Attention: Neural Machine Translation by Jointly Learning to Align and Translate
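The additive attention from "Neural Machine Translation by Jointly Learning to Align and Translate" (Bahdanau et al.) can be sketched as below. The module layout and dimension names are illustrative assumptions, not this project's actual code:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BahdanauAttention(nn.Module):
    """Additive attention: score(s, h) = v^T tanh(W_s s + W_h h)."""

    def __init__(self, dec_dim, enc_dim, attn_dim):
        super().__init__()
        self.W_s = nn.Linear(dec_dim, attn_dim, bias=False)   # projects decoder state
        self.W_h = nn.Linear(enc_dim, attn_dim, bias=False)   # projects encoder outputs
        self.v = nn.Linear(attn_dim, 1, bias=False)           # scores each source position

    def forward(self, dec_state, enc_outputs):
        # dec_state: (batch, dec_dim); enc_outputs: (batch, src_len, enc_dim)
        scores = self.v(torch.tanh(
            self.W_s(dec_state).unsqueeze(1) + self.W_h(enc_outputs)))  # (batch, src_len, 1)
        weights = F.softmax(scores.squeeze(2), dim=1)                   # (batch, src_len)
        # context vector: attention-weighted sum of encoder outputs
        context = torch.bmm(weights.unsqueeze(1), enc_outputs)          # (batch, 1, enc_dim)
        return context.squeeze(1), weights
```

For a bidirectional GRU encoder, `enc_dim` would be twice the encoder hidden size, since the forward and backward outputs are concatenated.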
- BLEU
- NIST
- TER
- METEOR
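As a reference for how BLEU combines modified n-gram precisions with a brevity penalty, here is a minimal pure-Python sentence-level BLEU (no smoothing; scoring scripts such as mteval-v13a.pl compute the corpus-level variant):

```python
import math
from collections import Counter

def ngrams(tokens, n):
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(reference, hypothesis, max_n=4):
    """Sentence-level BLEU, uniform weights, no smoothing.
    Assumes a non-empty hypothesis; returns 0.0 if any n-gram precision is 0."""
    precisions = []
    for n in range(1, max_n + 1):
        hyp_counts = Counter(ngrams(hypothesis, n))
        ref_counts = Counter(ngrams(reference, n))
        # clipped counts: each hypothesis n-gram credited at most as often as it appears in the reference
        overlap = sum(min(c, ref_counts[g]) for g, c in hyp_counts.items())
        total = max(sum(hyp_counts.values()), 1)
        if overlap == 0:
            return 0.0
        precisions.append(overlap / total)
    # brevity penalty punishes hypotheses shorter than the reference
    bp = 1.0 if len(hypothesis) > len(reference) else math.exp(1 - len(reference) / len(hypothesis))
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)
```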
perl mteval-v13a.pl -r example/ref.xml -s example/src.xml -t example/tst.xml
Use xml_transform.py to transform plain-text sentence files into the XML format required by mteval.
reference: https://blog.csdn.net/angus_monroe/article/details/82943162
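A sketch of the kind of wrapping such a conversion performs: placing one sentence per `<seg>` inside the NIST mteval XML structure. The tag and attribute names here (`tstset`, `setid`, `sysid`, `docid`) follow common mteval conventions but are assumptions; verify them against what mteval-v13a.pl actually accepts:

```python
import xml.etree.ElementTree as ET

def sentences_to_mteval_xml(sentences, set_tag="tstset", setid="example",
                            srclang="src", trglang="tgt", sysid="system"):
    """Wrap plain sentences in an mteval-style XML document (one doc, one seg per sentence)."""
    root = ET.Element("mteval")
    tset = ET.SubElement(root, set_tag, setid=setid, srclang=srclang,
                         trglang=trglang, sysid=sysid)
    doc = ET.SubElement(tset, "doc", docid="doc1", genre="text")
    for i, sent in enumerate(sentences, start=1):
        seg = ET.SubElement(doc, "seg", id=str(i))
        seg.text = sent
    return ET.tostring(root, encoding="unicode")
```

The same structure is reused for the reference (`refset`) and source (`srcset`) files passed to the `-r` and `-s` options above.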
- GPU & CUDA
- Python3
- PyTorch
- torchtext
- Spacy
- numpy
- Visdom (optional)
Based on the following implementations