
Misleading exception raised during batch scaling #1973

Merged · 3 commits merged into Lightning-AI:master on Jun 17, 2020

Conversation

@tejasvi (Contributor) commented May 27, 2020

Use batch_size from `model.hparams.batch_size` instead of `model.batch_size`
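For context, the misleading exception came from the batch scaler reading `model.batch_size` while many users store the value on the hparams namespace instead. A minimal stand-alone sketch of the mismatch (the model object here is illustrative, not Lightning's actual API):

```python
from types import SimpleNamespace

# Illustrative stand-in for a LightningModule whose batch_size lives
# only on the hparams namespace, not directly on the model.
model = SimpleNamespace(hparams=SimpleNamespace(batch_size=32))

# Reading model.batch_size raises AttributeError even though the value
# exists, which is what made the original error message misleading.
try:
    model.batch_size
    found_on_model = True
except AttributeError:
    found_on_model = False

assert not found_on_model
assert model.hparams.batch_size == 32  # the value was there all along
```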
@mergify mergify bot requested a review from a team May 27, 2020 17:09
@justusschock (Member)

Hi @tejasvi. Thanks for contributing, but I probably wouldn't merge this, since we recently moved away from hparams in #1896

@Borda Borda added feature Is an improvement or enhancement waiting on author Waiting on user action, correction, or update labels May 28, 2020
@Borda (Member) commented May 28, 2020

@tejasvi mind updating it to reflect the new structure of arguments?

@pep8speaks commented May 28, 2020

Hello @tejasvi! Thanks for updating this PR.

Line 259:58: W291 trailing whitespace

Comment last updated at 2020-06-08 16:17:27 UTC

@justusschock (Member)

What about something like:

# batch_arg_name may be a dotted path, e.g. 'batch_size' or 'hparams.batch_size'
split_batch_arg_name = batch_arg_name.split('.')
contains_arg = model
for name in split_batch_arg_name[:-1]:
    contains_arg = getattr(contains_arg, name)

batch_size = getattr(contains_arg, split_batch_arg_name[-1])

# optionally overwrite the attribute with a new value
if value is not None:
    setattr(contains_arg, split_batch_arg_name[-1], value)

This way one could specify the name as `'batch_size'` (which would correspond to `model.batch_size`) or as `'hparams.batch_size'` (which would correspond to `model.hparams.batch_size`).

The issue is that arguments are not always attached to the model directly, and hparams can also live under other names (like 'hyperparameters'). These issues would all be resolved with something like this, I guess :)
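That suggestion can be made runnable as a pair of dotted-path helpers; this is a sketch of the idea in the comment above, not the code that was eventually merged, and the function names are hypothetical:

```python
from types import SimpleNamespace

def nested_getattr(obj, dotted_name):
    """Resolve a dotted name like 'hparams.batch_size' one attribute at a time."""
    for part in dotted_name.split('.'):
        obj = getattr(obj, part)
    return obj

def nested_setattr(obj, dotted_name, value):
    """Set the final attribute of a dotted path, walking the parents first."""
    *parents, last = dotted_name.split('.')
    for part in parents:
        obj = getattr(obj, part)
    setattr(obj, last, value)

# Illustrative model with the attribute in both possible locations
model = SimpleNamespace(batch_size=16, hparams=SimpleNamespace(batch_size=32))

assert nested_getattr(model, 'batch_size') == 16
assert nested_getattr(model, 'hparams.batch_size') == 32

nested_setattr(model, 'hparams.batch_size', 64)
assert model.hparams.batch_size == 64
```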

@SkafteNicki (Member)

We had a similar problem when the learning rate finder was implemented, namely that some users like to store their learning rate in something like model.hparams.optimizers.lr instead of just model.hparams.lr. We solved it by introducing these two utility functions:
https://github.com/PyTorchLightning/pytorch-lightning/blob/c3cf33d1de7de3444e3a4a7d45bc613d8e01a7c0/pytorch_lightning/trainer/lr_finder.py#L485-L501
These can be repurposed for this PR (move to utilities.parsing?). A similar nested_getattr needs to be implemented.
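For comparison, a minimal sketch of the probing strategy behind those lr_finder helpers, checking a couple of candidate locations in order; the function name and fallback order here are assumptions, not the actual Lightning implementation:

```python
from types import SimpleNamespace

def probe_getattr(model, name):
    """Return `name` from the model if present, else from model.hparams.

    Sketch only; the real utilities linked above handle more cases.
    """
    if hasattr(model, name):
        return getattr(model, name)
    hparams = getattr(model, 'hparams', None)
    if hparams is not None and hasattr(hparams, name):
        return getattr(hparams, name)
    raise AttributeError(f'{name!r} found neither on model nor model.hparams')

model = SimpleNamespace(hparams=SimpleNamespace(batch_size=32))
assert probe_getattr(model, 'batch_size') == 32  # falls back to hparams

model.batch_size = 16  # a direct attribute wins over hparams
assert probe_getattr(model, 'batch_size') == 16
```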

@SkafteNicki SkafteNicki mentioned this pull request May 29, 2020
5 tasks
pytorch_lightning/trainer/training_tricks.py (two outdated review threads, resolved)
Comment on lines +250 to +251
if hasattr(model, batch_arg_name):
batch_size = getattr(model, batch_arg_name)
Member


we are not registering in the model anymore...

@mergify mergify bot requested a review from a team June 8, 2020 13:08
Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
@williamFalcon williamFalcon merged commit f8103f9 into Lightning-AI:master Jun 17, 2020
williamFalcon added a commit that referenced this pull request Jun 17, 2020
williamFalcon added a commit that referenced this pull request Jun 17, 2020
@williamFalcon (Contributor)

sorry, merged by accident. mind re-opening?
(merge has been reversed)

@Borda (Member) commented Jun 17, 2020

> sorry, merged by accident. mind re-opening?
> (merge has been reversed)

a merged PR cannot be reopened, it is already in master
and reverting with an immediate re-merge does not really solve the issue
@williamFalcon pls ping me next time...

@Borda (Member) commented Jun 17, 2020

@tejasvi mind open a new PR?

Labels: feature (Is an improvement or enhancement), waiting on author (Waiting on user action, correction, or update)
6 participants