Thanks for this amazing work!
I tried running the predict_amrs_from_plaintext.py script but came across a runtime error. It occurred when loading the state_dict of the checkpoint you released for AMR 3.0 into AMRBartForConditionalGeneration. I saw that you suggested using a transformers version < 3. I experimented with version 2.11.0 as suggested in your requirements.txt file, and also with version 2.8.0, but the problem persisted. The tokenizers package is at version 2.7.0. Do you have any idea what the reason might be, and how I can fix the problem? Many thanks!
I've attached the full error message below:
RuntimeError: Error(s) in loading state_dict for AMRBartForConditionalGeneration:
size mismatch for final_logits_bias: copying a param with shape torch.Size([1, 53587]) from checkpoint, the shape in current model is torch.Size([1, 53075]).
size mismatch for model.shared.weight: copying a param with shape torch.Size([53587, 1024]) from checkpoint, the shape in current model is torch.Size([53075, 1024]).
size mismatch for model.encoder.embed_tokens.weight: copying a param with shape torch.Size([53587, 1024]) from checkpoint, the shape in current model is torch.Size([53075, 1024]).
size mismatch for model.decoder.embed_tokens.weight: copying a param with shape torch.Size([53587, 1024]) from checkpoint, the shape in current model is torch.Size([53075, 1024]).
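One observation that may help narrow this down: every mismatched tensor differs by exactly 512 rows (53587 vs 53075), which usually means the tokenizer built locally has 512 fewer added special tokens than the one the checkpoint was trained with (e.g. a different additional-tokens file or library version). Below is a minimal diagnostic sketch, using plain tuples instead of real tensors so it runs without torch; the helper `vocab_size_from_shapes` is hypothetical, not part of the released code:

```python
# Hypothetical diagnostic helper: compare the vocabulary size implied by a
# checkpoint's embedding shapes against the current model's, before calling
# load_state_dict. Shapes are plain tuples so the sketch needs no torch.

EMBED_KEYS = (
    "final_logits_bias",                   # shape (1, vocab_size)
    "model.shared.weight",                 # shape (vocab_size, d_model)
    "model.encoder.embed_tokens.weight",   # shape (vocab_size, d_model)
    "model.decoder.embed_tokens.weight",   # shape (vocab_size, d_model)
)

def vocab_size_from_shapes(shapes):
    """Return the single vocab size implied by the token-embedding tensors."""
    sizes = set()
    for key in EMBED_KEYS:
        if key in shapes:
            shape = shapes[key]
            # final_logits_bias is (1, V); the embedding matrices are (V, d_model)
            sizes.add(shape[1] if key == "final_logits_bias" else shape[0])
    assert len(sizes) == 1, f"inconsistent vocab sizes: {sizes}"
    return sizes.pop()

# Shapes copied verbatim from the traceback above.
checkpoint_shapes = {
    "final_logits_bias": (1, 53587),
    "model.shared.weight": (53587, 1024),
    "model.encoder.embed_tokens.weight": (53587, 1024),
    "model.decoder.embed_tokens.weight": (53587, 1024),
}
current_shapes = {
    "final_logits_bias": (1, 53075),
    "model.shared.weight": (53075, 1024),
    "model.encoder.embed_tokens.weight": (53075, 1024),
    "model.decoder.embed_tokens.weight": (53075, 1024),
}

extra = vocab_size_from_shapes(checkpoint_shapes) - vocab_size_from_shapes(current_shapes)
print(extra)  # 512: the checkpoint's tokenizer has 512 more added tokens
```

If this is the cause, calling `model.resize_token_embeddings(53587)` (a standard transformers API) before loading would make the shapes agree, but the safer fix is usually to rebuild the tokenizer from the exact same additional-tokens files the checkpoint was trained with, so that token ids line up and not just the tensor sizes.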