fix setting batch_size attribute in batch_size finder (finishing PR #2523) #3043

Merged
merged 10 commits into master from bugfix/batch-size-scaler-attr on Aug 19, 2020

Conversation

@awaelchli (Member) commented Aug 19, 2020

What does this PR do?

Follow-up to #2523 (an unfinished, stale PR).

Added a test and made sure it fails on master.

Fixes #2484

  • Set model.batch_size from model.hparams.batch_size when it is not defined on the model (see the sketch below)
  • Changed init_val for batch_size to 0 so that scaling starts from the user-defined batch_size instead of always starting at 2
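
For context, here is a minimal sketch (not Lightning's actual implementation) of the attribute lookup this fix amounts to: batch_size is read from, and written back to, the model directly if that attribute exists, otherwise the scaler falls back to model.hparams. The helper names below are hypothetical, and the sketch assumes hparams supports attribute access (e.g. a Namespace or Lightning's AttributeDict).

```python
def get_batch_size(model):
    # Prefer a batch_size attribute defined directly on the model.
    if hasattr(model, "batch_size"):
        return model.batch_size
    # Otherwise fall back to the value stored in hparams.
    if hasattr(model, "hparams") and hasattr(model.hparams, "batch_size"):
        return model.hparams.batch_size
    raise AttributeError(
        "Neither model.batch_size nor model.hparams.batch_size is defined."
    )


def set_batch_size(model, value):
    # Write the new value back to wherever batch_size was originally defined.
    if hasattr(model, "batch_size"):
        model.batch_size = value
    elif hasattr(model, "hparams") and hasattr(model.hparams, "batch_size"):
        model.hparams.batch_size = value
    else:
        raise AttributeError(
            "Neither model.batch_size nor model.hparams.batch_size is defined."
        )
```

Per the second bullet above, the init_val default of 0 then presumably acts as a sentinel: when no explicit starting value is given, the scaler starts from whatever this lookup returns instead of always resetting to 2.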

Before submitting

  • Was this discussed/approved via a GitHub issue? (not needed for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together? Otherwise, we ask you to create a separate PR for every change.
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests?
  • Did you verify new and existing tests pass locally with your changes?
  • If you made a notable change (that affects users), did you update the CHANGELOG?

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

@awaelchli added the bug (Something isn't working) label Aug 19, 2020
@awaelchli (Member, Author) commented:

@SkafteNicki I finished this old PR and added tests as best I could.
I'm not sure why the default argument needs to be changed.

@codecov (bot) commented Aug 19, 2020

Codecov Report

Merging #3043 into master will increase coverage by 29%.
The diff coverage is 86%.

@@           Coverage Diff            @@
##           master   #3043     +/-   ##
========================================
+ Coverage      57%     86%    +29%     
========================================
  Files          81      81             
  Lines        7585    7669     +84     
========================================
+ Hits         4358    6610   +2252     
+ Misses       3227    1059   -2168     

@awaelchli marked this pull request as ready for review August 19, 2020 01:53
@mergify (bot) requested a review from a team August 19, 2020 01:53
@mergify (bot) requested a review from a team August 19, 2020 07:02
@mergify (bot) commented Aug 19, 2020

This pull request is now in conflict... :(

@williamFalcon merged commit 7b917de into master Aug 19, 2020
@Borda deleted the bugfix/batch-size-scaler-attr branch August 20, 2020 21:45
@Borda added this to the 0.9.0 milestone Aug 20, 2020
Labels
bug (Something isn't working)
Projects
None yet
Development

Successfully merging this pull request may close these issues:

Trainer.scale_batch_size requires model.batch_size instead of model.hparams.batch_size
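
For illustration, a hedged sketch of the scenario from this linked issue: a model that stores batch_size only in hparams. The Trainer.scale_batch_size call is taken from the issue title; the model definition and data below are made up, and the exact API may differ between Lightning versions.

```python
from argparse import Namespace

import torch
from torch.utils.data import DataLoader, TensorDataset

import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    def __init__(self, hparams):
        super().__init__()
        self.hparams = hparams  # hparams.batch_size exists; self.batch_size does not
        self.layer = torch.nn.Linear(32, 2)

    def forward(self, x):
        return self.layer(x)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return {"loss": torch.nn.functional.cross_entropy(self(x), y)}

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)

    def train_dataloader(self):
        # The dataloader reads the batch size from hparams, so the finder's
        # updates to hparams.batch_size take effect on each retry.
        ds = TensorDataset(torch.randn(512, 32), torch.randint(0, 2, (512,)))
        return DataLoader(ds, batch_size=self.hparams.batch_size)


model = LitModel(Namespace(batch_size=32))
trainer = pl.Trainer()
# Previously this required model.batch_size to exist; with the fix it falls
# back to model.hparams.batch_size.
new_batch_size = trainer.scale_batch_size(model)
```
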
4 participants