Fix Mixing hparams and arguments in LightningModule #1505

Merged: 9 commits, Apr 19, 2020
2 changes: 2 additions & 0 deletions CHANGELOG.md
@@ -39,6 +39,8 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).

- Fixed loggers - flushing last logged metrics even before continue, e.g. `trainer.test()` results ([#1459](https://github.com/PyTorchLightning/pytorch-lightning/pull/1459))

- Fixed LightningModule - Mixing hparams and arguments in `LightningModule.__init__()` crashes load_from_checkpoint() ([#1505](https://github.com/PyTorchLightning/pytorch-lightning/pull/1505))

- Added a missing call to the `on_before_zero_grad` model hook ([#1493](https://github.com/PyTorchLightning/pytorch-lightning/pull/1493)).

-
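As background (not part of the diff), here is a minimal sketch of the constructor pattern the new changelog entry refers to: an `hparams` Namespace mixed with ordinary constructor arguments. The class and argument names are hypothetical, and it assumes the 0.7-era API where `hparams` is assigned directly on the module.

```python
from argparse import Namespace

import torch
import pytorch_lightning as pl


class MixedArgsModule(pl.LightningModule):
    """Hypothetical module mixing an ``hparams`` Namespace with plain arguments."""

    def __init__(self, hparams: Namespace, pretrained_model=None, layers: int = 2):
        super().__init__()
        self.hparams = hparams                    # saved into the checkpoint by Lightning
        self.pretrained_model = pretrained_model  # extra argument, not stored in hparams
        self.layers = layers
        self.net = torch.nn.Linear(hparams.in_features, hparams.out_features)

    def forward(self, x):
        return self.net(x)
```

Before this fix, calling `load_from_checkpoint()` on such a module could crash, because `_load_model_state()` passed either the restored `hparams` or the extra arguments to the constructor, never both.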
8 changes: 3 additions & 5 deletions pytorch_lightning/core/lightning.py
@@ -1434,6 +1434,7 @@ def load_from_checkpoint(
it stores the hyperparameters in the checkpoint if you initialized your :class:`LightningModule`
with an argument called ``hparams`` which is a :class:`~argparse.Namespace`
(output of :meth:`~argparse.ArgumentParser.parse_args` when parsing command line arguments).
Any other arguments specified through \*args and \*\*kwargs will be passed to the model.

Example:
.. code-block:: python
@@ -1493,7 +1494,7 @@ def __init__(self, hparams):
# or load passing whatever args the model takes to load
MyLightningModule.load_from_checkpoint(
'path/to/checkpoint.ckpt',
-learning_rate=0.1,
+learning_rate=0.1,  # These arguments will be passed to the model using **kwargs
layers=2,
pretrained_model=some_model
)
@@ -1544,10 +1545,7 @@ def _load_model_state(cls, checkpoint: Dict[str, Any], *args, **kwargs) -> 'LightningModule':

# load the state_dict on the model automatically
model_args = [hparams] if hparams else []
-if len(model_args) > 0:
-    model = cls(*model_args)
-else:
-    model = cls(*args, **kwargs)
+model = cls(*model_args, *args, **kwargs)
model.load_state_dict(checkpoint['state_dict'])

# give model a chance to load something
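With the one-line change above, `_load_model_state()` forwards the restored `hparams` together with any extra positional and keyword arguments, roughly `cls(hparams, *args, **kwargs)`. A hedged usage sketch, reusing the hypothetical `MixedArgsModule` from the note after the CHANGELOG hunk (the checkpoint path and `some_model` are placeholders):

```python
import torch

some_model = torch.nn.Linear(8, 8)  # placeholder for a real pretrained model

# hparams are read back from the checkpoint file; the keyword arguments below
# are forwarded to MixedArgsModule.__init__ via **kwargs instead of being dropped.
model = MixedArgsModule.load_from_checkpoint(
    'path/to/checkpoint.ckpt',
    pretrained_model=some_model,
    layers=4,
)
```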