This repository contains several demo Seq2Seq (sequence-to-sequence) models.
Note: this project follows https://github.com/bentrevett/pytorch-seq2seq
Datasets:
- dataset1: news-commentary-v14.de-en
Models:
- model1: Sequence to Sequence Learning with Neural Networks
- model2: Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation
- model3: Neural Machine Translation by Jointly Learning to Align and Translate
- model4:
- model5: Convolutional Sequence to Sequence Learning
- model6: Attention is All You Need
- For data processing
PYTHONPATH=. python dataprocess/process.py
- For loader
# loader1
PYTHONPATH=. python loaders/loader1.py
# loader2
PYTHONPATH=. python loaders/loader2.py
- For module
# module1
PYTHONPATH=. python modules/module1.py
# module2
PYTHONPATH=. python modules/module2.py
# module3
PYTHONPATH=. python modules/module3.py
# module4
PYTHONPATH=. python modules/module4.py
# module5
PYTHONPATH=. python modules/module5.py
# module6
PYTHONPATH=. python modules/module6.py
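Each loader and module file can presumably be run standalone like this because it ends with a small self-test under `if __name__ == "__main__":`. Below is a minimal sketch of that pattern; the class name, hyperparameters, and tensor shapes are illustrative assumptions, not the repo's actual code.

```python
# Illustrative standalone smoke test for an encoder module
# (hypothetical class and shapes; each modules/moduleN.py presumably
# ends with a similar __main__ block).
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, vocab_size=1000, emb_dim=256, hid_dim=512, n_layers=2, dropout=0.5):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.dropout = nn.Dropout(dropout)
        self.rnn = nn.LSTM(emb_dim, hid_dim, n_layers, dropout=dropout)

    def forward(self, src):  # src: [src_len, batch_size]
        embedded = self.dropout(self.embedding(src))
        outputs, (hidden, cell) = self.rnn(embedded)
        return hidden, cell

if __name__ == "__main__":
    src = torch.randint(0, 1000, (30, 16))   # fake batch of token ids
    hidden, cell = Encoder()(src)
    print(hidden.shape, cell.shape)          # torch.Size([2, 16, 512]) each
```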
- For training
python main.py
You can change the config either on the command line or in the file utils/parser.py.
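The flag names used in the example commands below suggest what utils/parser.py exposes; the following is only a sketch of such a parser, with illustrative defaults that may differ from the repo's.

```python
# Sketch of a config parser exposing the flags used in the examples below
# (defaults are illustrative assumptions, not necessarily the repo's).
import argparse

def get_parser():
    parser = argparse.ArgumentParser(description="Seq2Seq demo config")
    parser.add_argument("--module", type=int, default=1, choices=range(1, 7))
    parser.add_argument("--grad_clip", type=float, default=1.0)
    parser.add_argument("--rnn_type", type=str, default="lstm", choices=["lstm", "gru"])
    for side in ("enc", "dec"):
        parser.add_argument(f"--{side}_emb_dim", type=int, default=256)
        parser.add_argument(f"--{side}_hid_dim", type=int, default=512)
        parser.add_argument(f"--{side}_n_layers", type=int, default=2)
        parser.add_argument(f"--{side}_n_directions", type=int, default=1)
        parser.add_argument(f"--{side}_dropout", type=float, default=0.5)
    return parser

if __name__ == "__main__":
    print(vars(get_parser().parse_args()))
```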
Here are example commands for each module:
# module1
python main.py \
--module 1 \
--grad_clip 1 \
--rnn_type lstm \
--enc_emb_dim 256 \
--dec_emb_dim 256 \
--enc_hid_dim 512 \
--dec_hid_dim 512 \
--enc_n_layers 2 \
--dec_n_layers 2 \
--enc_n_directions 1 \
--dec_n_directions 1 \
--enc_dropout 0.5 \
--dec_dropout 0.5
# module2
python main.py \
--module 2 \
--grad_clip 1 \
--rnn_type gru \
--enc_emb_dim 256 \
--dec_emb_dim 256 \
--enc_hid_dim 512 \
--dec_hid_dim 512 \
--enc_n_layers 1 \
--dec_n_layers 1 \
--enc_n_directions 1 \
--dec_n_directions 1 \
--enc_dropout 0.5 \
--dec_dropout 0.5
# module3
python main.py \
--module 3 \
--grad_clip 1 \
--rnn_type gru \
--enc_emb_dim 256 \
--dec_emb_dim 256 \
--enc_hid_dim 512 \
--dec_hid_dim 512 \
--enc_n_layers 1 \
--dec_n_layers 1 \
--enc_n_directions 2 \
--dec_n_directions 1 \
--enc_dropout 0.5 \
--dec_dropout 0.5
# module4
python main.py \
--module 4 \
--grad_clip 1 \
--rnn_type gru \
--enc_emb_dim 256 \
--dec_emb_dim 256 \
--enc_hid_dim 512 \
--dec_hid_dim 512 \
--enc_n_layers 1 \
--dec_n_layers 1 \
--enc_n_directions 2 \
--dec_n_directions 1 \
--enc_dropout 0.5 \
--dec_dropout 0.5
# module5
python main.py \
--module 5 \
--grad_clip 0.1 \
--enc_emb_dim 256 \
--dec_emb_dim 256 \
--enc_hid_dim 512 \
--dec_hid_dim 512 \
--enc_filter_layers 10 \
--dec_filter_layers 10 \
--enc_kernel_size 3 \
--dec_kernel_size 3 \
--enc_dropout 0.25 \
--dec_dropout 0.25
# module6
python main.py \
--module 6 \
--grad_clip 1 \
--enc_hid_dim 256 \
--dec_hid_dim 256 \
--enc_transformer_layers 3 \
--dec_transformer_layers 3 \
--enc_attention_heads 8 \
--dec_attention_heads 8 \
--enc_mid_dim 512 \
--dec_mid_dim 512 \
--enc_dropout 0.1 \
--dec_dropout 0.1
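For orientation, the module6 flags correspond roughly to the pieces of a standard Transformer encoder, as sketched below with torch.nn's built-in layers; the repository most likely implements these layers itself, so this is only an illustration of what each flag controls.

```python
# How the module6 encoder flags would map onto torch.nn building blocks
# (illustrative only; not necessarily how modules/module6.py is built).
import torch.nn as nn

enc_layer = nn.TransformerEncoderLayer(
    d_model=256,          # --enc_hid_dim
    nhead=8,              # --enc_attention_heads
    dim_feedforward=512,  # --enc_mid_dim
    dropout=0.1,          # --enc_dropout
)
encoder = nn.TransformerEncoder(enc_layer, num_layers=3)  # --enc_transformer_layers
```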