every_n_val_epochs -> every_n_epochs (#8383)

Conversation
Hello @carmocca! Thanks for updating this PR.
Comment last updated at 2021-07-12 16:04:10 UTC
Codecov Report

@@           Coverage Diff            @@
##           master    #8383   +/- ##
=====================================
- Coverage      92%      92%     -0%
=====================================
  Files         216      216
  Lines       14097    14099      +2
=====================================
- Hits        13012    12998     -14
- Misses       1085     1101     +16
…_model_checkpoint.py
@carmocca why is every_n_epochs along with a boolean flag to save on train epoch end preferable to supporting two flags like every_n_val_epochs and every_n_train_epochs? The latter would let us adjust the checkpoint frequency separately for training and validation. The mutual exclusion check is something we could handle in the callback implementation.
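The mutual exclusion check mentioned above could live in the callback constructor. A minimal sketch, assuming hypothetical two-flag names from the comment (this is illustrative, not Lightning's actual implementation):

```python
# Hypothetical sketch of validating two mutually exclusive checkpoint
# cadence flags in a callback constructor. Class and argument names are
# illustrative, not the actual Lightning API.

class CheckpointCallback:
    def __init__(self, every_n_train_epochs=None, every_n_val_epochs=None):
        # Reject the ambiguous case where both cadences are requested.
        if every_n_train_epochs is not None and every_n_val_epochs is not None:
            raise ValueError(
                "`every_n_train_epochs` and `every_n_val_epochs` are "
                "mutually exclusive; set at most one of them."
            )
        self.every_n_train_epochs = every_n_train_epochs
        self.every_n_val_epochs = every_n_val_epochs
```

Setting only one flag works; setting both raises at construction time, so the conflict surfaces before training starts.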
@ananthsub Is your question in the context that the user can set both flags at once? If there is a need to specify both, I think two flags would make sense. Also, this seems better to avoid user confusion about the difference between the flags.
@carmocca it's more that users now need to know about the default behavior of the boolean flag.
You could also make a similar argument that the user would need to know whether the two flags are mutually exclusive.

Are you saying we should support setting both flags at once?
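The single-flag design under discussion can be sketched as one cadence value plus a boolean choosing where the save happens (train epoch end vs. validation end). A simplified illustration with hypothetical class and method signatures, not Lightning's actual code:

```python
# Illustrative sketch of the single-flag design: one `every_n_epochs`
# cadence, with `save_on_train_epoch_end` selecting WHICH hook performs
# the save. Names and signatures are simplified stand-ins.

class Checkpointer:
    def __init__(self, every_n_epochs=1, save_on_train_epoch_end=False):
        self.every_n_epochs = every_n_epochs
        self.save_on_train_epoch_end = save_on_train_epoch_end

    def _should_save(self, epoch):
        # Save on every N-th epoch (epochs are 0-indexed).
        return self.every_n_epochs > 0 and (epoch + 1) % self.every_n_epochs == 0

    def on_train_epoch_end(self, epoch):
        return self.save_on_train_epoch_end and self._should_save(epoch)

    def on_validation_end(self, epoch):
        return (not self.save_on_train_epoch_end) and self._should_save(epoch)
```

Under this sketch the two hooks can never both fire for the same epoch, which is the mutual exclusion the earlier comment wanted, enforced by construction rather than by a runtime check.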
What does this PR do?
Deprecate every_n_val_epochs in favor of every_n_epochs.

The flag is used in the on_validation_end hook, but we will also want to use it in the on_train_epoch_end hook.

Part of #7724
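The deprecation path described above can be sketched as follows: accept the old argument, emit a DeprecationWarning, and forward its value to the new one. This is a simplified stand-in class, not Lightning's actual ModelCheckpoint code:

```python
import warnings

# Simplified sketch of the deprecation shim this PR describes: the old
# `every_n_val_epochs` argument is still accepted, but it warns and
# forwards its value to `every_n_epochs`. Stand-in class, not the real
# Lightning implementation.

class ModelCheckpointStub:
    def __init__(self, every_n_epochs=None, every_n_val_epochs=None):
        if every_n_val_epochs is not None:
            warnings.warn(
                "`every_n_val_epochs` is deprecated; use `every_n_epochs` instead.",
                DeprecationWarning,
            )
            # Only forward when the new flag was not set explicitly.
            if every_n_epochs is None:
                every_n_epochs = every_n_val_epochs
        # Fall back to a default cadence of every epoch.
        self.every_n_epochs = every_n_epochs if every_n_epochs is not None else 1
```

Existing code that passes every_n_val_epochs keeps working for the deprecation window, while new code uses every_n_epochs directly.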
Does your PR introduce any breaking changes? If yes, please list them.
None