# Checkpoints cannot be loaded in non-pl env #2653

## Comments
You can use …

I am using …

Can you check: when you load that checkpoint manually in a PL env, what keys does that file have?
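For anyone following along, a minimal way to inspect those keys (a sketch; `example.ckpt` is a placeholder for the actual checkpoint path):

```python
import torch

# Load the checkpoint on CPU and list its top-level keys.
# "example.ckpt" stands in for the actual checkpoint file.
ckpt = torch.load("example.ckpt", map_location="cpu")
print(ckpt.keys())
```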
Error in non-pl env: …

Keys in pl env: …

@rohitgr7 sorry about the late reply, completely forgot about this issue.

Edit: …

Edit 2: …
I got around to testing and can load checkpoints now in non-PL envs. The only change needed was to cast `hyper_parameters` to `dict` in … Thoughts?
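For reference, a minimal sketch of the equivalent post-hoc fix (assumptions: the checkpoint lives at a hypothetical `example.ckpt`, and its `hyper_parameters` entry is the PL mapping type that plain-PyTorch environments cannot unpickle):

```python
import torch

# Load the PL checkpoint in an environment where pytorch_lightning is
# installed, cast the hyper_parameters mapping to a built-in dict, and
# re-save. torch.load() then no longer needs PL to unpickle the file.
ckpt = torch.load("example.ckpt", map_location="cpu")
if "hyper_parameters" in ckpt:
    ckpt["hyper_parameters"] = dict(ckpt["hyper_parameters"])
torch.save(ckpt, "example_portable.ckpt")
```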
Yeah, this looks good for avoiding such errors, since …
@s-rog, I tried on master and got `dict_keys(['epoch', 'global_step', 'pytorch-lightning_version', 'state_dict'])`.
@rohitgr7 Did the model have …? If you look at …, hparams logging is only controlled by …
Ok, yeah my bad :)
## 🚀 Feature
Add an option to save only `state_dict` for `ModelCheckpoint` callbacks.

## 🐛 Bug
PL checkpoints cannot be loaded in non-PL envs.
## Motivation
To be able to move trained models and weights into PyTorch-only environments.
## Additional context
Currently, when you do `torch.load()` on a PL-generated checkpoint in an environment without PL installed, there is a pickling error. For my current use case I have to load the checkpoints in my training environment and save them again with only `state_dict` for the weights. See the reply below for more info.
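A sketch of that workaround (`pl_checkpoint.ckpt`, `weights_only.pt`, and `MyModel` are placeholders, not names from this issue):

```python
import torch

# Step 1, in the training env (where pytorch_lightning is installed):
# load the full PL checkpoint and re-save only the raw weights.
ckpt = torch.load("pl_checkpoint.ckpt", map_location="cpu")
torch.save(ckpt["state_dict"], "weights_only.pt")

# Step 2, in the PyTorch-only env: load the weights into a plain module.
# Note: PL may prefix state_dict keys with the attribute name the module
# had inside the LightningModule (e.g. "model."), so keys can need renaming.
state_dict = torch.load("weights_only.pt", map_location="cpu")
# model = MyModel()                  # hypothetical plain nn.Module
# model.load_state_dict(state_dict)
```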