
Error when loading checkpoint. #1

Closed · mrdrozdov opened this issue Feb 12, 2021 · 3 comments

@mrdrozdov

RuntimeError: Error(s) in loading state_dict for AMRBartForConditionalGeneration:
Unexpected key(s) in state_dict: "model.encoder.embed_backreferences.weight", "model.encoder.embed_backreferences.transform.weight", "model.encoder.embed_backreferences.transform.bias", "model.decoder.embed_backreferences.weight", "model.decoder.embed_backreferences.transform.weight", "model.decoder.embed_backreferences.transform.bias".

When running:

python bin/predict_amrs.py \
    --datasets <AMR-ROOT>/data/amrs/split/test/*.txt \
    --gold-path data/tmp/amr2.0/gold.amr.txt \
    --pred-path data/tmp/amr2.0/pred.amr.txt \
    --checkpoint runs/<checkpoint>.pt \
    --beam-size 5 \
    --batch-size 500 \
    --device cuda \
    --penman-linearization --use-pointer-tokens

Using the AMR2.amr-lin3.pt checkpoint from http://nlp.uniroma1.it/AMR/AMR2.parsing-1.0.tar.bz2.

Can those keys simply be ignored when loading the checkpoint?
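
One possible workaround (a sketch only, assuming the checkpoint is an ordinary torch.save dict stored either flat or under a 'model' entry; not verified against this repo's own loading code) would be to drop those keys and load with strict=False:

import torch

def load_patched_state_dict(model, checkpoint_path):
    # Load the checkpoint on CPU; the weights may be stored flat or nested
    # under a 'model' entry (an assumption, not confirmed against the repo).
    ckpt = torch.load(checkpoint_path, map_location="cpu")
    state_dict = ckpt["model"] if isinstance(ckpt, dict) and "model" in ckpt else ckpt

    # Drop the unused backreference-embedding parameters reported in the error.
    state_dict = {k: v for k, v in state_dict.items() if "embed_backreferences" not in k}

    # strict=False tolerates any remaining mismatches; the returned named tuple
    # lists the missing and unexpected keys so they can be inspected.
    return model.load_state_dict(state_dict, strict=False)

Here model would be the AMRBartForConditionalGeneration instance that the script constructs before loading the weights.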

@mbevila (Collaborator) commented Feb 15, 2021

Thank you for pointing out the issue! It seems I uploaded unpatched checkpoints.

Yeah, those parameters are not used and can be completely ignored.

I have added a simple script to the repo (bin/patch_legacy_checkpoint.py) that patches checkpoints so they work with the current code. Meanwhile, I'll upload patched files to the server.
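
For reference, a patch of this kind typically amounts to loading the .pt file, deleting the unused parameters, and saving a patched copy. A minimal sketch (not the actual contents of bin/patch_legacy_checkpoint.py, and assuming the state dict is stored flat or under a 'model' entry):

import torch

def patch_checkpoint(in_path, out_path):
    # Hypothetical stand-in for a legacy-checkpoint patch, not the repo's script.
    ckpt = torch.load(in_path, map_location="cpu")
    # The state dict may sit under a 'model' entry or at the top level (assumption).
    state_dict = ckpt["model"] if isinstance(ckpt, dict) and "model" in ckpt else ckpt
    # Delete the unused backreference-embedding parameters named in the error.
    removed = [k for k in list(state_dict) if "embed_backreferences" in k]
    for k in removed:
        del state_dict[k]
    # The deletion happened in place, so saving ckpt keeps everything else intact.
    torch.save(ckpt, out_path)
    return removed

For example, patch_checkpoint("runs/AMR2.amr-lin3.pt", "runs/AMR2.amr-lin3.patched.pt") would write a copy that loads without the unexpected-key error (example paths only).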

Sorry for the inconvenience!

(Btw, I really liked DIORA/S-DIORA)

@mbevila (Collaborator) commented Feb 16, 2021

I have uploaded new checkpoints. They don't need patching.

mbevila closed this as completed Feb 16, 2021
@mrdrozdov (Author)

Thanks for the help!
