New features
Include BART and MBART in the list of supported Fairseq architectures
Add the Fairseq converter option --no_default_special_tokens, which requires all special tokens, including the decoder start tokens, to be set by the user during inference (for example, MBART-25 requires this to properly set the language tokens)
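As a sketch, the new option can be passed to the Fairseq converter CLI; the paths below are placeholders, and the other arguments shown are the usual ct2-fairseq-converter options:

```shell
# Convert an MBART-25 checkpoint with default special tokens disabled.
# Paths are placeholders for illustration only.
ct2-fairseq-converter \
    --model_path mbart25/model.pt \
    --data_dir mbart25/ \
    --output_dir mbart25_ct2 \
    --no_default_special_tokens
```

With this option set, the decoder start token (e.g. the MBART language token) must then be supplied explicitly as a target prefix at inference time.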
Fixes and improvements
Fix conversion of Post-Norm Transformers trained with OpenNMT-tf
Fix scoring with Fairseq models, which previously used an incorrect decoder start token (Fairseq uses </s>, not <s>, as the decoder start token)
Fix the scoring result to include the end-of-sentence token
Ignore the OpenNMT-py options --alignment_layer and --alignment_heads for models that were not trained with alignments
Enable batch encoding in the return_alternatives translation mode (decoding still runs sequentially)
Make the enumerations ctranslate2.specs.Activation and ctranslate2.specs.EmbeddingsMerge public, as they can be used to configure the Transformer specification
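As a sketch of how the now-public enumerations could be used, the snippet below configures a Transformer specification by hand; the exact TransformerSpec constructor arguments shown here are assumptions and may differ between CTranslate2 versions:

```python
import ctranslate2

# Build a Transformer model specification manually, selecting the activation
# via the public Activation enumeration. The num_layers/num_heads/activation
# arguments are illustrative assumptions about the TransformerSpec signature.
spec = ctranslate2.specs.TransformerSpec(
    num_layers=6,
    num_heads=8,
    activation=ctranslate2.specs.Activation.GELU,
)
# ctranslate2.specs.EmbeddingsMerge can similarly be used where the
# specification merges source feature embeddings (e.g. concat vs. add).
```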