Back-2-Back Translation

Dataset

The dataset used is WMT-14 en-de.
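For reference, the language pairs can be pulled with the Hugging Face `datasets` library; this is only a minimal sketch and assumes that loader rather than the repository's own data pipeline.

```python
# Minimal sketch: loading WMT-14 En-De via the Hugging Face `datasets` library.
# The repository's own data pipeline may differ; this is only illustrative.
from datasets import load_dataset

wmt14 = load_dataset("wmt14", "de-en")        # English-German pairs live under the "de-en" config

for example in wmt14["train"].select(range(3)):
    pair = example["translation"]             # {"de": "...", "en": "..."}
    print(pair["en"], "->", pair["de"])
```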

Model

Back2Back
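A toy, self-contained PyTorch sketch of what a back-to-back loop can look like: a forward translator predicts target-token distributions, those distributions are turned into probability-weighted embeddings instead of hard tokens, and a backward translator reconstructs the source, so the round trip stays differentiable end to end. The `TinyTranslator` module and all dimensions below are placeholders, not the repository's actual model; restricting the mixture to the nucleus is sketched under About below.

```python
# Toy sketch of a differentiable back-translation round trip (not the repo's code).
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, DIM, MAX_LEN = 1000, 64, 16

class TinyTranslator(nn.Module):
    """Stand-in for a real seq2seq model; maps input embeddings to per-step logits."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, DIM)
        self.rnn = nn.GRU(DIM, DIM, batch_first=True)
        self.out = nn.Linear(DIM, VOCAB)

    def forward(self, inputs_embeds):
        hidden, _ = self.rnn(inputs_embeds)
        return self.out(hidden)               # (batch, len, VOCAB) logits

forward_model, backward_model = TinyTranslator(), TinyTranslator()

src_ids = torch.randint(0, VOCAB, (2, MAX_LEN))            # monolingual source batch
src_embeds = forward_model.embed(src_ids)

# Forward pass: predicted distribution over target-language tokens.
tgt_logits = forward_model(src_embeds)
tgt_probs = F.softmax(tgt_logits, dim=-1)

# Soft translation: probability-weighted mixture of target embeddings instead of
# a hard argmax/sample, so gradients flow through the translation step.
soft_tgt_embeds = tgt_probs @ backward_model.embed.weight  # (batch, len, DIM)

# Backward pass: reconstruct the original source from the soft translation.
rec_logits = backward_model(soft_tgt_embeds)
loss = F.cross_entropy(rec_logits.view(-1, VOCAB), src_ids.view(-1))
loss.backward()                                             # gradients reach both models
```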

Training

See the Colab notebook.

Results

Learning from Explanations with Neural Execution Tree

About

An attempt to make Back-Translation differentiable, using probability-weighted embeddings of the predicted translations within the nucleus of the predicted distribution over target-language sentences.
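A rough PyTorch sketch of the probability-weighted-nucleus idea described above: keep only the nucleus (top-p set) of the predicted distribution, renormalise it, and return the expected embedding under that truncated distribution so the translation step remains differentiable. The function name and the `top_p` default are illustrative, not the repository's API.

```python
# Sketch of a probability-weighted embedding restricted to the nucleus (top-p set).
import torch
import torch.nn.functional as F

def nucleus_soft_embedding(logits, embedding, top_p=0.9):
    """Expected embedding over the nucleus of a predicted token distribution,
    keeping the operation differentiable.

    logits:    (batch, vocab) unnormalised next-token scores
    embedding: (vocab, dim) target-side embedding matrix
    """
    probs = F.softmax(logits, dim=-1)
    sorted_probs, sorted_idx = torch.sort(probs, dim=-1, descending=True)
    cumulative = torch.cumsum(sorted_probs, dim=-1)

    # Smallest set of tokens whose cumulative probability reaches top_p
    # (the most likely token is always kept).
    keep = (cumulative - sorted_probs) < top_p
    keep[..., 0] = True

    # Zero out everything outside the nucleus, then renormalise.
    mask = torch.zeros_like(probs).scatter(-1, sorted_idx, keep.float())
    nucleus_probs = probs * mask
    nucleus_probs = nucleus_probs / nucleus_probs.sum(dim=-1, keepdim=True)

    # Probability-weighted mixture of embeddings: (batch, dim).
    return nucleus_probs @ embedding

# Example: vocabulary of 5 tokens, embedding dimension of 3.
logits = torch.randn(2, 5, requires_grad=True)
emb = torch.randn(5, 3)
soft = nucleus_soft_embedding(logits, emb)   # (2, 3), differentiable w.r.t. logits
```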
