Fix Mixing hparams and arguments in LightningModule (#1505)
* Attempt to fix #1468

* Remove the if statement, it doesn't actually make any difference

* Update docs

* Correct warnings I caused in the last commit

* Add to changelog

* Actually add to changelog

* Clarify documentation and examples

* Update CHANGELOG.md

Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
HenryJia and Borda committed Apr 19, 2020
1 parent e021469 commit 3c6f856
Showing 2 changed files with 5 additions and 5 deletions.
2 changes: 2 additions & 0 deletions CHANGELOG.md
@@ -39,6 +39,8 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 
 - Fixed loggers - flushing last logged metrics even before continue, e.g. `trainer.test()` results ([#1459](https://github.com/PyTorchLightning/pytorch-lightning/pull/1459))
 
+- Fixed LightningModule - Mixing hparams and arguments in `LightningModule.__init__()` crashes load_from_checkpoint() ([#1505](https://github.com/PyTorchLightning/pytorch-lightning/pull/1505))
+
 - Added a missing call to the `on_before_zero_grad` model hook ([#1493](https://github.com/PyTorchLightning/pytorch-lightning/pull/1493)).
 
 -
8 changes: 3 additions & 5 deletions pytorch_lightning/core/lightning.py
@@ -1434,6 +1434,7 @@ def load_from_checkpoint(
         it stores the hyperparameters in the checkpoint if you initialized your :class:`LightningModule`
         with an argument called ``hparams`` which is a :class:`~argparse.Namespace`
         (output of :meth:`~argparse.ArgumentParser.parse_args` when parsing command line arguments).
+        Any other arguments specified through \*args and \*\*kwargs will be passed to the model.
 
         Example:
             .. code-block:: python
@@ -1493,7 +1494,7 @@ def __init__(self, hparams):
                 # or load passing whatever args the model takes to load
                 MyLightningModule.load_from_checkpoint(
                     'path/to/checkpoint.ckpt',
-                    learning_rate=0.1,
+                    learning_rate=0.1, # These arguments will be passed to the model using **kwargs
                     layers=2,
                     pretrained_model=some_model
                 )
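
For reference, a minimal sketch (not part of this commit) of a module whose ``__init__`` matches the example above: it accepts an ``hparams`` Namespace positionally, while ``learning_rate``, ``layers``, and ``pretrained_model`` arrive through ``**kwargs`` at load time. All names besides ``hparams`` are illustrative.

```python
import pytorch_lightning as pl

class MyLightningModule(pl.LightningModule):
    """Hypothetical module mirroring the docstring example above."""

    def __init__(self, hparams, learning_rate=0.01, layers=1, pretrained_model=None):
        super().__init__()
        self.hparams = hparams              # Namespace restored from the checkpoint
        self.learning_rate = learning_rate  # forwarded from load_from_checkpoint(**kwargs)
        self.layers = layers
        self.pretrained_model = pretrained_model
```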
@@ -1544,10 +1545,7 @@ def _load_model_state(cls, checkpoint: Dict[str, Any], *args, **kwargs) -> 'LightningModule':

         # load the state_dict on the model automatically
         model_args = [hparams] if hparams else []
-        if len(model_args) > 0:
-            model = cls(*model_args)
-        else:
-            model = cls(*args, **kwargs)
+        model = cls(*model_args, *args, **kwargs)
         model.load_state_dict(checkpoint['state_dict'])
 
         # give model a chance to load something
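
The behavioural difference is easiest to see in isolation. Below is a minimal sketch of the construction path in ``_load_model_state``; the ``Model`` class and the values are illustrative stand-ins, not code from this patch:

```python
from argparse import Namespace

# Stand-in for a user LightningModule that mixes hparams with other arguments.
class Model:
    def __init__(self, hparams, layers=1):
        self.hparams = hparams
        self.layers = layers

hparams = Namespace(learning_rate=0.1)     # recovered from the checkpoint
model_args = [hparams] if hparams else []
args, kwargs = (), {'layers': 2}           # extras given to load_from_checkpoint

# Before the fix: cls(*model_args) was called whenever hparams existed,
# silently dropping *args and **kwargs.
# After the fix: both sources are forwarded in a single call.
model = Model(*model_args, *args, **kwargs)
assert model.layers == 2 and model.hparams.learning_rate == 0.1
```

Collapsing the branch into a single ``cls(*model_args, *args, **kwargs)`` call means a stored ``hparams`` no longer shadows extra constructor arguments, which is what made mixing the two fail before (#1468).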
