
Checkpoint loading patch fails when using a pre-trained featurizer within a Lightning module. #12922

@jdhorwood

Description


🐛 Bug

I am getting the error ModuleNotFoundError: No module named 'pytorch_lightning.utilities.argparse_utils', raised from migration.py, when loading checkpoints for models that leverage pre-trained models also trained with PyTorch Lightning.

Essentially, the context manager is entered recursively: the model relies on features generated by a pre-trained model, and loading that model also requires a call to load_from_checkpoint(). As a result, the context manager's exit function gets called twice in a row, with the error occurring on the second call.
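For illustration, the double-exit failure can be reproduced with a minimal, self-contained analog of a non-reentrant sys.modules patch (hypothetical names; this is a sketch of the failure mode, not Lightning's actual migration.py code):

```python
import sys


class LegacyPatch:
    """Analog of a non-reentrant legacy patch: __enter__ registers a stub
    module in sys.modules, __exit__ unconditionally removes it."""

    def __enter__(self):
        sys.modules["fake_legacy_module"] = object()
        return self

    def __exit__(self, exc_type, exc, tb):
        # When the patch is nested, the inner __exit__ already deleted
        # the key, so the outer __exit__ fails here.
        del sys.modules["fake_legacy_module"]


def load_from_checkpoint(nested):
    # Stand-in for a load that, when the module contains a pre-trained
    # featurizer, triggers a second (nested) load under the same patch.
    with LegacyPatch():
        if nested:
            load_from_checkpoint(False)


try:
    load_from_checkpoint(True)
except KeyError as e:
    print("outer exit failed:", e)
```

In the real code the symptom surfaces as a ModuleNotFoundError instead of a KeyError, but the mechanism is the same: the exit path assumes it runs exactly once.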

Expected behavior

I would expect checkpoint loading to not raise an error because of a pre-trained featurizer model.

Environment

pytorch-lightning == 1.6.1

cc @awaelchli @ananthsub @ninginthecloud @rohitgr7

Labels

bug (Something isn't working), checkpointing (Related to checkpointing)
