- [X] dataset
- [X] metrics
- [X] losses
- [X] optimizers
- [ ] conf
- [X] models (8/8)
    - [X] Encoder
    - [X] Decoder
    - [X] MultiheadAttention
    - [X] PositionwiseFeedForward
    - [X] ScaledDotProdAttention
    - [X] PositionalEncoding
    - [X] Embedding
    - [X] shared projection weight
- [X] task
    - [X] data
    - [X] Hydra conf
    - [X] Optuna settings
    - [ ] MLflow settings
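The core of the model components listed above is scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. A minimal sketch of the computation, written here in NumPy for brevity (function and argument names are illustrative, not the repo's actual `ScaledDotProdAttention` code):

```python
import numpy as np

def scaled_dot_product_attention(q, k, v, mask=None):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.

    q, k, v: arrays of shape (..., seq_len, d_k).
    mask: optional array broadcastable to the score matrix;
          positions where mask == 0 are suppressed.
    """
    d_k = q.shape[-1]
    scores = q @ k.swapaxes(-2, -1) / np.sqrt(d_k)
    if mask is not None:
        scores = np.where(mask == 0, -1e9, scores)
    # numerically stable softmax over the key dimension
    scores = scores - scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v, weights

# toy example: one batch, seq_len 3, d_k 3
q = k = v = np.eye(3)[None]
out, w = scaled_dot_product_attention(q, k, v)
```

Each row of `w` is a probability distribution over key positions; the decoder additionally passes a causal mask so a position cannot attend to later ones.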
Data format (whitespace-tokenized, one sentence per line; the `.en` and `.ja` files are aligned line-by-line):

- `train/dev/test.en`

  ```
  this is an example . hello !
  ```

- `train/dev.ja` (Japanese side of the parallel corpus)

  ```
  これ は 例 です 。 こんにち は 。
  ```
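Since the files are pre-tokenized and aligned line-by-line, loading sentence pairs reduces to zipping the two files and splitting on whitespace. A minimal loader sketch (function name and file layout are assumptions, not the repo's actual dataset code):

```python
def load_parallel(src_path, tgt_path):
    """Read aligned, whitespace-tokenized sentence pairs.

    Returns a list of (source_tokens, target_tokens) tuples,
    one per corpus line.
    """
    with open(src_path, encoding="utf-8") as fs, \
         open(tgt_path, encoding="utf-8") as ft:
        # zip pairs the files line-by-line; split() tokenizes
        # on whitespace, matching the pre-tokenized format above
        return [(s.split(), t.split()) for s, t in zip(fs, ft)]
```

Example: `load_parallel("train.en", "train.ja")` would yield pairs like `(["this", "is", "an", "example", ".", ...], ["これ", "は", "例", ...])`.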
Run the unit tests:

```
python -m unittest discover
```
Train (single run):

```
python task.py --help
python task.py [overrides to update parameters]
```

ref. Hydra (Facebook)
Hyperparameter search (multirun):

```
python task.py --help
python task.py [comma-separated parameter overrides] -m
```

ref. Optuna (PFN)
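The `-m` flag switches Hydra into multirun mode, where comma-separated values (e.g. `optimizer.lr=0.001,0.0001`) enumerate the sweep. If the repo wires Optuna in via the hydra-optuna-sweeper plugin, the sweeper is configured roughly like the following config fragment (a sketch; the parameter names and ranges are hypothetical):

```yaml
defaults:
  - override hydra/sweeper: optuna

hydra:
  sweeper:
    direction: minimize   # optimize the value returned by the task
    n_trials: 20
    params:
      optimizer.lr: interval(1e-5, 1e-2)
```

Each trial then runs the task once with the sampled overrides, and Optuna picks the next point based on the returned objective value.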
```
python gen_conf.py [arguments to override parameters] > optuna_args.yaml
# (edit some lines in tuning.py to set your tuning parameters)
python tuning.py
```
ref. https://github.com/jadore801120/attention-is-all-you-need-pytorch