Hi,

The commit changing the AdamW implementation used from `transformers` to `torch` introduces an error in the example pretraining script make_multilingual.py. The `transformers` implementation has an argument called `correct_bias`, but this argument doesn't exist in the `torch` library's implementation: https://pytorch.org/docs/stable/generated/torch.optim.AdamW.html

This can be fixed by simply removing `correct_bias` from the dict passed to optimizer_params: https://github.com/UKPLab/sentence-transformers/blob/master/examples/training/multilingual/make_multilingual.py#L220
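A minimal sketch of the mismatch (the model and hyperparameters here are placeholders, not the script's actual values): `torch.optim.AdamW` rejects `correct_bias` with a `TypeError`, while the same constructor call succeeds once that key is dropped.

```python
import torch

# Placeholder model standing in for the SentenceTransformer parameters.
model = torch.nn.Linear(4, 2)

# Passing the transformers-style `correct_bias` key to torch's AdamW fails,
# because torch.optim.AdamW has no such argument:
try:
    torch.optim.AdamW(model.parameters(), lr=2e-5, eps=1e-6, correct_bias=True)
    print("unexpectedly accepted correct_bias")
except TypeError as e:
    print("TypeError:", e)

# Removing `correct_bias` from the optimizer params fixes it:
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5, eps=1e-6)
print(type(optimizer).__name__)
```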