
RuntimeError: Error(s) in loading state_dict for E2E when testing a model from training #8

Closed
icang1694 opened this issue Feb 13, 2023 · 2 comments


@icang1694

Here's my config for training; I use the FUNSD dataset:
%run 'main.py' --add-geom --add-embs --add-hist --add-visual --add-eweights --src-data 'FUNSD' --gpu 0 --edge-type 'fully' --node-granularity 'gt' --model 'e2e' --weights *.pt

Then I run the best model using this:
%run 'main.py' -addG -addT -addE -addV --gpu 0 --test --weights e2e-20230213-0530.pt

Then I get this error:
File "/usr/local/lib/python3.8/dist-packages/torch/nn/modules/module.py", line 1497, in load_state_dict
raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for E2E:
Unexpected key(s) in state_dict: "projector.modalities.3.0.weight", "projector.modalities.3.0.bias", "projector.modalities.3.1.weight", "projector.modalities.3.1.bias".
size mismatch for projector.modalities.1.0.weight: copying a param with shape torch.Size([300, 4]) from checkpoint, the shape in current model is torch.Size([300, 300]).
size mismatch for projector.modalities.2.0.weight: copying a param with shape torch.Size([300, 300]) from checkpoint, the shape in current model is torch.Size([300, 1448]).
size mismatch for message_passing.linear.weight: copying a param with shape torch.Size([1200, 2400]) from checkpoint, the shape in current model is torch.Size([900, 1800]).
size mismatch for message_passing.linear.bias: copying a param with shape torch.Size([1200]) from checkpoint, the shape in current model is torch.Size([900]).
size mismatch for message_passing.lynorm.weight: copying a param with shape torch.Size([1200]) from checkpoint, the shape in current model is torch.Size([900]).
size mismatch for message_passing.lynorm.bias: copying a param with shape torch.Size([1200]) from checkpoint, the shape in current model is torch.Size([900]).
size mismatch for edge_pred.W1.weight: copying a param with shape torch.Size([300, 2414]) from checkpoint, the shape in current model is torch.Size([300, 1814]).
size mismatch for node_pred.0.weight: copying a param with shape torch.Size([4, 1200]) from checkpoint, the shape in current model is torch.Size([4, 900]).
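
For reference, a quick way to pin down exactly which keys and shapes differ is to compare the checkpoint against the freshly built model. This is a minimal diagnostic sketch, not from the repo: it assumes `model` is the already-instantiated E2E network and reuses the checkpoint name above.

```python
import torch

# Load the checkpoint on CPU; some checkpoints wrap the weights in a
# 'state_dict' entry, so unwrap it if present.
ckpt = torch.load('e2e-20230213-0530.pt', map_location='cpu')
state = ckpt.get('state_dict', ckpt) if isinstance(ckpt, dict) else ckpt

# `model` is assumed to be the instantiated E2E network (hypothetical here).
model_state = model.state_dict()
model_keys, ckpt_keys = set(model_state), set(state)

# Keys present on one side only point at modalities enabled in one run
# but not the other.
print('Missing from checkpoint:', sorted(model_keys - ckpt_keys))
print('Unexpected in checkpoint:', sorted(ckpt_keys - model_keys))

# Shared keys with different shapes point at input dimensions that
# changed, e.g. because a feature flag was dropped at test time.
for k in sorted(model_keys & ckpt_keys):
    if model_state[k].shape != state[k].shape:
        print(f'Size mismatch for {k}: model {tuple(model_state[k].shape)} '
              f'vs checkpoint {tuple(state[k].shape)}')
```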

@andreagemelli (Owner) commented Feb 19, 2023

Hi @icang1694,
at test time it seems like you are not using the histogram textual embeddings (--add-hist).
Try again and, if it does not work, post the two models here (from training and from testing time) so we can take a look at them.
A.
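
(A note on the fix, an assumption rather than something confirmed in this thread: if the parser accepts the long-form flags alongside the short aliases, adding --add-hist to the test command should make the model shapes match the checkpoint:)

%run 'main.py' -addG -addT -addE -addV --add-hist --gpu 0 --test --weights e2e-20230213-0530.pt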

@icang1694 (Author)

Thank you @andreagemelli, I missed that option.
