Models from our WMT21 Similar Languages Translation Shared Task paper, *Improving Similar Language Translation With Transfer Learning*, are available on Hugging Face. Use the following links to retrieve the models.
- Bambara - French.
- French - Bambara.
- Spanish - Portuguese.
- Portuguese - Spanish.
- Spanish - Catalan.
- Catalan - Spanish.
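Once you have a model's repository name from the links above, it can be loaded with the Hugging Face Transformers library through the standard sequence-to-sequence interface. A minimal sketch follows; `"REPO_ID"` is a placeholder, not an actual repository name, so substitute the identifier from the link for the language pair you want.

```python
# Sketch of translating with one of the released models via Transformers.
# "REPO_ID" is a placeholder -- replace it with the repository name taken
# from the links above (e.g. the Catalan - Spanish model).
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer


def translate(text: str, repo_id: str) -> str:
    """Translate a single sentence with the model hosted at repo_id."""
    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForSeq2SeqLM.from_pretrained(repo_id)
    batch = tokenizer(text, return_tensors="pt")
    generated = model.generate(**batch)
    return tokenizer.decode(generated[0], skip_special_tokens=True)


if __name__ == "__main__":
    # Downloads the model on first use; requires network access.
    print(translate("Hola, com estàs?", "REPO_ID"))
```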
If you use any of these models, please cite the paper as follows:
```bibtex
@inproceedings{adebara-abdul-mageed-2021-improving,
    title = "Improving Similar Language Translation With Transfer Learning",
    author = "Adebara, Ife and
      Abdul-Mageed, Muhammad",
    booktitle = "Proceedings of the Sixth Conference on Machine Translation",
    month = nov,
    year = "2021",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2021.wmt-1.27",
    pages = "273--278",
    abstract = "We investigate transfer learning based on pre-trained neural machine translation models to translate between (low-resource) similar languages. This work is part of our contribution to the WMT 2021 Similar Languages Translation Shared Task where we submitted models for different language pairs, including French-Bambara, Spanish-Catalan, and Spanish-Portuguese in both directions. Our models for Catalan-Spanish (82.79 BLEU) and Portuguese-Spanish (87.11 BLEU) rank top 1 in the official shared task evaluation, and we are the only team to submit models for the French-Bambara pairs.",
}
```