Great. I think the fix did it. I managed to test pretrain_lm on Arabic wiki tokens. The resulting model (trained for only 1 cycle) was loaded successfully by fastai's language_model_learner (I reimplemented the IMDB sample lesson, using machine translation to convert the texts to Arabic). I will work on train_clas next.
As for drop_mult, maybe it could be passed as a hyperparameter (the 0.1 default suits large token counts).
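For illustration, a minimal sketch of what exposing drop_mult could look like in fastai v1 terms; the flag name, data path, and CSV name here are assumptions for the example, not taken from the actual pretrain_lm script:

```python
import argparse
from fastai.text import TextLMDataBunch, language_model_learner, AWD_LSTM

parser = argparse.ArgumentParser()
# Hypothetical CLI flag; keeps the current 0.1 as the default.
parser.add_argument('--drop_mult', type=float, default=0.1,
                    help='multiplier applied to every dropout probability in the LM')
args = parser.parse_args()

# Placeholder path and CSV name for the tokenized Arabic wiki texts.
data_lm = TextLMDataBunch.from_csv('data/arwiki', 'texts.csv')
learn = language_model_learner(data_lm, AWD_LSTM,
                               drop_mult=args.drop_mult, pretrained=False)
learn.fit_one_cycle(1)  # the single cycle mentioned above
```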
While running the train_clas.py script, I get an error in convert_weights_with_prefix. I also tried the refactored version of the same function.
If run with fine_tune turned off, there is another error:
Error(s) in loading state_dict for SequentialRNN:
Missing key(s) in state_dict: "0.fwd_lm.encoder.weight", "0.fwd_lm.encoder_dp.emb.weight", ...
Unexpected key(s) in state_dict: "fwd_lm.0.encoder.weight", "fwd_lm.0.encoder_dp.emb.weight", ...
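Looking at those messages, the missing and unexpected keys differ only in prefix order ("0.fwd_lm." expected vs. "fwd_lm.0." found), so one possible workaround is to remap the checkpoint keys before calling load_state_dict. A minimal sketch, assuming every mismatched key follows that name.index vs. index.name pattern; remap_keys is a hypothetical helper, not part of the scripts:

```python
import re
import torch

def remap_keys(state_dict):
    # Turn 'fwd_lm.0.encoder.weight' style keys into '0.fwd_lm.encoder.weight'
    # so they match the SequentialRNN layout reported as missing.
    remapped = {}
    for key, value in state_dict.items():
        match = re.match(r'([A-Za-z_]\w*)\.(\d+)\.(.+)', key)
        if match:
            name, idx, rest = match.groups()
            remapped[f'{idx}.{name}.{rest}'] = value
        else:
            remapped[key] = value  # keys already in the expected form pass through
    return remapped

# Quick check with a dummy entry shaped like the error message:
dummy = {'fwd_lm.0.encoder.weight': torch.zeros(1)}
print(list(remap_keys(dummy)))  # ['0.fwd_lm.encoder.weight']
```

With something like this, the load would become learn.model.load_state_dict(remap_keys(torch.load(weights_path, map_location='cpu'))). Whether convert_weights_with_prefix is supposed to perform exactly this swap is something I cannot tell from the error alone.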