Update Checkpointing Docs to include details on initializing model with other hyperparameters #17767

Open · wants to merge 1 commit into base: master
4 changes: 3 additions & 1 deletion docs/source-pytorch/common/checkpointing_basic.rst
@@ -108,7 +108,7 @@ The LightningModule also has access to the Hyperparameters

Initialize with other parameters
================================
If you used the *self.save_hyperparameters()* method in the init of the LightningModule, you can initialize the model with different hyperparameters.
If you used the *self.save_hyperparameters()* method in the init of the LightningModule, you can initialize the model with different hyperparameters. Named arguments passed to ``load_from_checkpoint`` are forwarded to the model's ``__init__`` method, with the exception of ``strict``, ``map_location``, and ``hparams_file``, which are consumed by ``load_from_checkpoint`` itself. Positional arguments are **not** forwarded to the model; pass overrides as keyword arguments.
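The merging behaviour described above can be illustrated with a minimal pure-Python sketch. This is not Lightning's actual implementation; the helper name ``load_from_checkpoint_sketch`` and the merge logic are illustrative assumptions, showing only how keyword overrides take precedence over saved hyperparameters while reserved names are consumed by the loader.

```python
# Illustrative sketch (NOT Lightning's implementation) of how keyword
# arguments passed to load_from_checkpoint override saved hyperparameters.

# Names consumed by the loader itself, never forwarded to the model.
RESERVED = {"strict", "map_location", "hparams_file"}


class Model:
    def __init__(self, learning_rate=1e-3, hidden_dim=128):
        self.learning_rate = learning_rate
        self.hidden_dim = hidden_dim


def load_from_checkpoint_sketch(saved_hparams, **kwargs):
    # Drop reserved names so they are not passed to the model's __init__.
    overrides = {k: v for k, v in kwargs.items() if k not in RESERVED}
    # Keyword overrides win over the hyperparameters saved in the checkpoint.
    init_args = {**saved_hparams, **overrides}
    return Model(**init_args)


saved = {"learning_rate": 1e-3, "hidden_dim": 128}
model = load_from_checkpoint_sketch(saved, learning_rate=1e-2, strict=True)
print(model.learning_rate)  # 0.01 -- overridden by the keyword argument
print(model.hidden_dim)     # 128  -- taken from the saved hyperparameters
```

The same idea applies to the real API: only the named arguments you pass explicitly replace the values stored via ``self.save_hyperparameters()``; everything else is restored from the checkpoint.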
@awaelchli (Member) commented on Jun 8, 2023:
Overall this looks like a good addition, thanks! Two minor comments:

  • The last sentence is a bit misleading: "will not be forwarded" sounds like we silently ignore positional arguments, but in fact they are not supported and will raise an error. Enforcing this best practice avoids collisions and user error.
  • Maybe the comment could be moved to the bottom of the section, since it is better to introduce the concept and the code examples first before pointing out additional details of optional usage patterns.

What do you think?


.. code-block:: python

@@ -124,6 +124,8 @@ If you used the *self.save_hyperparameters()* method in the init of the Lightnin

----



*************************
nn.Module from checkpoint
*************************