This repository has been archived by the owner on Oct 31, 2023. It is now read-only.
Thanks for your work. I have a question: when training on languages with large differences, such as Chinese-English or English-Kazakh, is it a good choice to share all parameters? I notice that XLM usually shares all parameters.
Hi, yes, I don't think this is a problem. More and more studies have shown that a single model can handle multiple languages even when they are very different. Of course, if there are very few anchor points, it is better to have parallel data to facilitate the alignment.
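For what it's worth, here is a minimal sketch (not the XLM codebase itself, and all names are illustrative assumptions) of what "sharing all parameters" amounts to in practice: one embedding table over a joint subword vocabulary and one Transformer encoder are reused for every language, with only a language embedding distinguishing the inputs.

```python
# Illustrative sketch of a fully shared multilingual encoder (assumed names,
# not the actual XLM implementation).
import torch
import torch.nn as nn

class SharedMultilingualEncoder(nn.Module):
    def __init__(self, vocab_size, n_langs, dim=512, n_layers=6, n_heads=8):
        super().__init__()
        # One token embedding over the joint (e.g. BPE) vocabulary of all languages.
        self.token_emb = nn.Embedding(vocab_size, dim)
        # A language embedding added to every position, in the spirit of XLM-style models.
        self.lang_emb = nn.Embedding(n_langs, dim)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=n_heads, batch_first=True)
        # A single encoder stack shared by all languages.
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)

    def forward(self, token_ids, lang_ids):
        # token_ids: (batch, seq_len) indices into the joint vocabulary
        # lang_ids:  (batch,) language index per example
        x = self.token_emb(token_ids) + self.lang_emb(lang_ids).unsqueeze(1)
        return self.encoder(x)

# The exact same parameters are used whether the batch is English, Chinese, or Kazakh:
model = SharedMultilingualEncoder(vocab_size=60000, n_langs=3)
en_batch = torch.randint(0, 60000, (4, 32))          # token ids from the joint vocabulary
zh_batch = torch.randint(0, 60000, (4, 32))
en_out = model(en_batch, torch.tensor([0, 0, 0, 0]))  # lang id 0 = English (illustrative)
zh_out = model(zh_batch, torch.tensor([1, 1, 1, 1]))  # lang id 1 = Chinese (illustrative)
```

The anchor points mentioned above are the subwords (digits, URLs, named entities, shared script fragments) that appear in more than one language's corpus and therefore map to the same rows of `token_emb`; when a pair like Chinese-English shares few of them, parallel data gives the model a more direct signal to align the two languages.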
Thanks for your instant reply. I will think about this question seriously. Neural networks are so amazing that some results are difficult to understand, such as how sharing all parameters between languages with few anchor points can still achieve good results.