Warning during save_hyperparameter() gives misleading advice? #13615

Attributes that are not saved as hparams need to be passed in explicitly. Since you are using the `load_from_checkpoint` API, you can call `model = MyModule.load_from_checkpoint(ckpt_path, model=model)`.

If you include the model in the hparams, your checkpoints will be unnecessarily large, which can cause problems with big models.
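
A minimal sketch of the pattern described above, assuming a `LightningModule` that receives an externally built `nn.Module`; the checkpoint path and the `nn.Linear` backbone are placeholders for illustration:

```python
import torch
from torch import nn
import pytorch_lightning as pl


class MyModule(pl.LightningModule):
    def __init__(self, model: nn.Module, lr: float = 1e-3):
        super().__init__()
        # Keep `model` out of hparams: its weights end up in the
        # checkpoint's state_dict anyway, so storing it again as an
        # hparam would only bloat the file.
        self.save_hyperparameters(ignore=["model"])
        self.model = model

    def forward(self, x):
        return self.model(x)


# When restoring, rebuild the same architecture and pass it in
# explicitly, since it was not saved as an hparam.
backbone = nn.Linear(10, 2)  # stand-in for your real model
model = MyModule.load_from_checkpoint("path/to.ckpt", model=backbone)
```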

By

> is already saved during checkpointing.

it means the model weights are already saved in the checkpoint and are loaded through the PyTorch API, not as hparams.
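
You can see this by inspecting the checkpoint directly; a short sketch, with `"path/to.ckpt"` as a placeholder:

```python
import torch

# In a Lightning checkpoint, the weights live under "state_dict" and
# are restored via plain PyTorch loading, while saved hyperparameters
# (if any) live under "hyper_parameters".
ckpt = torch.load("path/to.ckpt", map_location="cpu")
print(list(ckpt["state_dict"]))      # e.g. ["model.weight", "model.bias"]
print(ckpt.get("hyper_parameters"))  # excludes anything passed via ignore=
```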

Answer selected by hogru
