Hi, can you add more documentation on usage? I'm a bit confused about how to use the model for training through to the final result. Thank you.
Hey @muhammadfhadli1453, thanks for filing this issue, and apologies for the belated reply.
I'll certainly try to add more documentation on model usage. For the time being, I'll try to explain more on this thread in the hopes of answering your question.
You've probably already seen this from the README:
You can consider gMLPForLanguageModeling a BERT-like encoder model. Since it is an encoder, you would train it via masked language modeling (MLM): some of the input tokens are replaced with [MASK], and the model has to predict the original token at each masked position.
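To make the MLM setup concrete, here is a minimal sketch of the masking step that prepares inputs and labels before they are fed to the model. Note the assumptions: `MASK_ID` and `IGNORE_INDEX` are placeholders that depend on your tokenizer and loss function (most frameworks' cross-entropy ignores label `-100` by convention), and this sketch always replaces selected tokens with the mask token, whereas BERT-style pretraining additionally leaves 10% unchanged and swaps 10% for random tokens.

```python
import random

MASK_ID = 103        # hypothetical [MASK] token id; depends on your tokenizer
IGNORE_INDEX = -100  # label value ignored by cross-entropy in most frameworks

def mask_tokens(token_ids, mask_prob=0.15, rng=None):
    """Replace a fraction of tokens with MASK_ID and build MLM labels.

    Labels hold the original id at masked positions and IGNORE_INDEX
    elsewhere, so the loss is computed only on masked tokens.
    """
    rng = rng or random.Random(42)
    inputs, labels = [], []
    for tok in token_ids:
        if rng.random() < mask_prob:
            inputs.append(MASK_ID)
            labels.append(tok)            # model must recover the original token
        else:
            inputs.append(tok)
            labels.append(IGNORE_INDEX)   # position excluded from the loss
    return inputs, labels

# Example: mask a toy sequence of 20 token ids
inputs, labels = mask_tokens(list(range(1000, 1020)))
```

During training you would feed `inputs` through the encoder, project the hidden states to vocabulary logits, and compute cross-entropy against `labels`; at inference you drop the masking entirely and use the encoder's representations (or fine-tune with a task head).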