
Fix bug comparing max_steps to global step which inits at 0 #4278

Merged — 10 commits merged into master from bug/4193-accum on Oct 22, 2020

Conversation

SeanNaren (Contributor)

What does this PR do?

Fixes #4193

I'm just going to write a test before marking this ready for review.
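
For context on the failure mode in #4193, here is a minimal sketch (illustrative only; the variable names mirror the snippet reviewed below but this is not the actual Lightning training loop) of how gradient accumulation interacts with a max_steps check against a 0-initialized global step:

    # Illustrative sketch, not the actual Lightning training loop.
    accumulate_grad_batches = 4
    max_steps = 5
    global_step = 0  # Lightning's global step also initializes at 0

    for batch_idx in range(100):
        num_accumulated_batches_reached = (batch_idx + 1) % accumulate_grad_batches == 0
        if num_accumulated_batches_reached:
            global_step += 1  # optimizer.step() runs here; only now does the step advance
        # Comparing the 0-initialized counter at the wrong point (e.g. before the
        # increment, or against the raw batch index) makes max_steps stop late or
        # never fire under accumulation -- the failure reported in #4193.
        if global_step >= max_steps:
            break

    assert global_step == max_steps  # exactly max_steps optimizer steps were taken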

PR review

  • Is this pull request ready for review? (if not, please submit in draft mode)

Anyone in the community is free to review the PR once the tests have passed.
If your PR wasn't discussed in GitHub issues first, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

SeanNaren added the bug (Something isn't working) label on Oct 22, 2020
codecov bot commented on Oct 22, 2020

Codecov Report

Merging #4278 into master will not change coverage (it stays at 93%).
The diff coverage is 100%.

@@          Coverage Diff           @@
##           master   #4278   +/-   ##
======================================
  Coverage      93%     93%           
======================================
  Files         111     111           
  Lines        8040    8003   -37     
======================================
+ Hits         7440    7441    +1     
+ Misses        600     562   -38     

SeanNaren added the ready (PRs ready to be merged) label on Oct 22, 2020
tchaton (Contributor) left a comment:

Great PR! Nice cleanup.

SeanNaren merged commit 065cc94 into master on Oct 22, 2020
SeanNaren deleted the bug/4193-accum branch on Oct 22, 2020

Review thread on pytorch_lightning/trainer/training_loop.py:

            # progress global step according to grads progress
            if num_accumulated_batches_reached or num_training_batches_reached:
                self.trainer.global_step += 1

        def accumulated_batches_reached(self):
A Member left a comment:

Can we have it as protected instead, as it is not meant to be used by the user?

SeanNaren (Contributor, Author) replied:

Will make a follow-up PR.
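
For reference, a hypothetical sketch of what that follow-up might look like (illustrative only; the actual follow-up PR may differ): prefixing the helper with an underscore marks it as internal rather than user-facing API.

    # Hypothetical sketch of the suggested rename, not the merged change.
    class TrainLoop:
        def __init__(self, accumulate_grad_batches: int):
            self.accumulate_grad_batches = accumulate_grad_batches
            self.batch_idx = 0

        def _accumulated_batches_reached(self) -> bool:
            # the protected name signals this is not meant to be called by users
            return (self.batch_idx + 1) % self.accumulate_grad_batches == 0

    loop = TrainLoop(accumulate_grad_batches=4)
    loop.batch_idx = 3  # the 4th batch completes an accumulation cycle
    assert loop._accumulated_batches_reached()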

Labels
bug (Something isn't working), ready (PRs ready to be merged)

Projects
None yet

Development
Successfully merging this pull request may close these issues:

  • max_steps has no effect in combination with gradient accumulation (#4193)

5 participants