Cannot load model parameters from checkpoint #15

Closed
qiyea opened this issue Mar 2, 2022 · 5 comments

qiyea commented Mar 2, 2022

When running the script ./command/finetune/finetune.sh, the following error occurred:

Traceback (most recent call last):
  File "train.py", line 14, in <module>
    cli_main()
  File "/data/binVul/trex-main/fairseq_cli/train.py", line 496, in cli_main
    distributed_utils.call_main(cfg, main)
  File "/data/binVul/trex-main/fairseq/distributed/utils.py", line 369, in call_main
    main(cfg, **kwargs)
  File "/data/binVul/trex-main/fairseq_cli/train.py", line 149, in main
    extra_state, epoch_itr = checkpoint_utils.load_checkpoint(
  File "/data/binVul/trex-main/fairseq/checkpoint_utils.py", line 213, in load_checkpoint
    extra_state = trainer.load_checkpoint(
  File "/data/binVul/trex-main/fairseq/trainer.py", line 472, in load_checkpoint
    raise Exception(
Exception: Cannot load model parameters from checkpoint checkpoints/similarity/checkpoint_best.pt; please ensure that the architectures match.

How can I solve it? Thanks!
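(For context: fairseq raises this exception when the parameter names or shapes in the checkpoint do not match the model built from the fine-tuning config. A minimal diagnostic sketch, not part of the repo, for dumping what architecture the checkpoint was saved with so it can be compared against the --arch used by finetune.sh; the key layout, 'args' vs 'cfg', depends on the fairseq version that wrote the checkpoint:)

```python
# Minimal diagnostic sketch (not from the repo): print the architecture stored
# in the checkpoint to compare against the fine-tuning configuration.
import torch

ckpt = torch.load("checkpoints/similarity/checkpoint_best.pt", map_location="cpu")
print(ckpt.keys())  # typically: model, cfg or args, extra_state, optimizer_history, ...

if ckpt.get("cfg") is not None:       # newer fairseq: dataclass/OmegaConf config
    print(ckpt["cfg"]["model"])
elif ckpt.get("args") is not None:    # older fairseq: argparse Namespace
    print(ckpt["args"].arch)

# Parameter shapes can also reveal vocab- or embedding-size mismatches:
for name, tensor in list(ckpt["model"].items())[:5]:
    print(name, tuple(tensor.shape))
```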

qiyea commented Mar 3, 2022

My environment:

  • 3090 GPU
  • pytorch 1.10.2 or 1.8.0
  • cuda 11.x

@peikexin9 (Member)

Hi @qiyea, thanks for your interest. I just updated the pretrained weights (checkpoint_best.pt) in the Gdrive. It should work this time.

qiyea commented Mar 3, 2022

@peikexin9 Thanks! It works now.

  • pytorch 1.8.0
  • cuda 11.1
  • the checkpoint_best.pt that you updated in the Gdrive

How can I run inference with the trained model? Are there any scripts for inference? Thanks.

@peikexin9 (Member)

Sure :-) You might want to check command/inference/get_embedding.py.
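(For reference, a rough sketch of what loading a fairseq checkpoint for inference can look like via the hub interface; the exact model class, data path, and tokenization for trex are defined in command/inference/get_embedding.py, so the paths and the input string below are only illustrative assumptions:)

```python
# Illustrative only -- command/inference/get_embedding.py is the authoritative script.
# Assumes a RoBERTa-style fairseq checkpoint; paths and the input are placeholders.
import torch
from fairseq.models.roberta import RobertaModel

model = RobertaModel.from_pretrained(
    "checkpoints/similarity",                  # directory holding the checkpoint
    checkpoint_file="checkpoint_best.pt",
    data_name_or_path="data-bin/similarity",   # binarized data dir (placeholder)
)
model.eval()

tokens = model.encode("mov eax , ebx")         # placeholder; trex's real inputs are multi-field
with torch.no_grad():
    features = model.extract_features(tokens)  # last-layer token representations
print(features.shape)
```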

qiyea commented Mar 4, 2022

OK, thanks!

qiyea closed this as completed Mar 7, 2022