To evaluate machine translation, several methods are used, some of which we have fully implemented.
We at the Telecommunication Research Center decided to test and evaluate the Tergoman machine translation system. The evaluation uses six algorithms.
An interactive tool for evaluating English-to-Bengali machine translations using the NLLB (No Language Left Behind) model and METEOR evaluation metrics. The tool uses Facebook's NLLB-200-1.3B model from Hugging Face to translate English text to Bengali.
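The METEOR metric used by such tools combines unigram precision and recall (with recall weighted more heavily) and applies a fragmentation penalty for out-of-order matches. A minimal sketch of the core formula is below; it uses exact surface matches only, whereas full METEOR (e.g. `nltk.translate.meteor_score`) also matches on stems and WordNet synonyms, so the numbers here are illustrative, not reference values.

```python
def meteor_like(reference: str, hypothesis: str) -> float:
    """Simplified METEOR score: exact unigram matching only."""
    ref = reference.split()
    hyp = hypothesis.split()

    # Greedy alignment: match each hypothesis token to the first
    # unused reference token with the same surface form.
    used = [False] * len(ref)
    alignment = []  # (hyp_index, ref_index) pairs, in hypothesis order
    for i, tok in enumerate(hyp):
        for j, rtok in enumerate(ref):
            if not used[j] and tok == rtok:
                used[j] = True
                alignment.append((i, j))
                break

    m = len(alignment)
    if m == 0:
        return 0.0

    precision = m / len(hyp)
    recall = m / len(ref)
    # Harmonic mean with recall weighted 9x over precision.
    fmean = 10 * precision * recall / (recall + 9 * precision)

    # Chunks: maximal runs of matches contiguous in both sentences.
    chunks = 1
    for (i1, j1), (i2, j2) in zip(alignment, alignment[1:]):
        if i2 != i1 + 1 or j2 != j1 + 1:
            chunks += 1
    penalty = 0.5 * (chunks / m) ** 3

    return fmean * (1 - penalty)
```

An identical hypothesis and reference form a single chunk, so the penalty is small and the score approaches 1; a hypothesis sharing no tokens with the reference scores 0.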