Add auto loss logging #27
Merged
wli51 merged 12 commits into WayScience:main on Apr 30, 2026
Conversation
…entially useful optional parameter, epoch, that would help future addition of loss weight scheduling functionality; and 2) adding get_config functions for auto loss config logging
…poch as a parameter to the modified loss group (currently unused) that will be useful in the future; and 2) add a loss group logging property for auto loss logging
… to a global fixture, and modified the test_abstract_trainer tests to use fixtures rather than relying on a helper class import. This benefits future test additions by exposing more useful fixtures (a minimal illustration follows)
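A minimal sketch of the fixture promotion this commit describes; the fixture and test names below are hypothetical illustrations, not the ones actually added in this PR:

```python
# conftest.py (hypothetical): helper logic moves into a shared fixture that
# any trainer test can request by name, instead of importing a helper class.
import pytest
import torch


@pytest.fixture
def dummy_batch():
    """Shared input/target pair available to all trainer tests."""
    return torch.rand(2, 1, 32, 32), torch.rand(2, 1, 32, 32)


# test_abstract_trainer.py (hypothetical): tests depend on the fixture
# rather than constructing helpers locally.
def test_shapes_match(dummy_batch):
    inputs, targets = dummy_batch
    assert inputs.shape == targets.shape
```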
MattsonCam (Member) approved these changes on Apr 27, 2026 and left a comment:
This looks clean and reasonable to me, good job @wli51!
…y added `Progress` object in place of an epoch number.
…avior against one type of model across two trainers. Replace with newly added separate test files for the UNet, ConvNeXtUNet, and WGAN models, with the UNet tests covering both loss logging and model logging.
…potentially useful enhancement fitting the theme of this PR
…gging enhancements in CHANGELOG.md
wli51 (Collaborator, Author)
Thanks @MattsonCam for reviewing! Merging now!
This pull request adds automated logging of each `LossItem`'s name, weight, and other configuration. This new feature should make experimenting with different loss combinations and relative loss weightings more reproducible and convenient, since the user never has to manually describe or specify tags: the trainer and logger communicate automatically to get this information logged.

Adds:

- `get_config()` methods to both the `LossItem` and `LossGroup` classes in `loss_group.py` (sketched below)
- `loss_groups` properties to both `LoggingTrainer` and `LoggingWGANTrainer` to standardize access to all loss groups and, specifically, to the information we wish to log
- Logging logic in the `MlflowLogger` class that is now invoked automatically during the `on_train_start` life cycle of the trainer; this logs all loss names and weights as tags and also saves a config file under artifacts recording other `LossItem` configurations such as `enabled` and `compute_at_val` (sketched below)
- An optional `epoch` argument on the `__call__` methods of `LossItem` and `LossGroup`; it is currently unused and has no effect, but in the future it should help support epoch-dependent loss weighting

Subsequent enhancement PRs will probably focus on similar auto-logging enhancements for the optimizer configurations.
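A minimal sketch of how the `get_config()` hooks and the optional `epoch` argument could fit together. The class shapes and field names beyond `name`, `weight`, `enabled`, and `compute_at_val` (the ones named above) are assumptions for illustration, not the PR's actual implementation:

```python
from dataclasses import dataclass, field
from typing import Callable, Optional

import torch


@dataclass
class LossItem:
    name: str
    fn: Callable[[torch.Tensor, torch.Tensor], torch.Tensor]
    weight: float = 1.0
    enabled: bool = True
    compute_at_val: bool = True

    def get_config(self) -> dict:
        # Everything needed to describe this loss term, minus the callable itself.
        return {
            "name": self.name,
            "weight": self.weight,
            "enabled": self.enabled,
            "compute_at_val": self.compute_at_val,
        }

    def __call__(
        self, pred: torch.Tensor, target: torch.Tensor, epoch: Optional[int] = None
    ) -> torch.Tensor:
        # `epoch` is accepted but deliberately unused for now; it reserves
        # room for epoch-dependent weight scheduling later.
        return self.weight * self.fn(pred, target)


@dataclass
class LossGroup:
    name: str
    items: list[LossItem] = field(default_factory=list)

    def get_config(self) -> dict:
        return {"name": self.name, "items": [i.get_config() for i in self.items]}

    def __call__(
        self, pred: torch.Tensor, target: torch.Tensor, epoch: Optional[int] = None
    ) -> torch.Tensor:
        # Forward the (unused) epoch so future scheduling needs no API change.
        return sum(i(pred, target, epoch=epoch) for i in self.items if i.enabled)
```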
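And a hedged sketch of the `on_train_start` auto-logging, assuming the trainer exposes the `loss_groups` property described above. `mlflow.set_tags` and `mlflow.log_dict` are standard MLflow fluent APIs; the function name and tag layout here are illustrative:

```python
import mlflow


def log_loss_config(loss_groups) -> None:
    """Illustrative version of what the logger might do at on_train_start."""
    tags = {}
    config = {}
    for group in loss_groups:
        # Full per-item configs (enabled, compute_at_val, ...) go to an
        # artifact file; names and weights become searchable run tags.
        config[group.name] = group.get_config()
        for item in group.items:
            tags[f"loss/{group.name}/{item.name}"] = str(item.weight)
    mlflow.set_tags(tags)
    mlflow.log_dict(config, "loss_config.json")
```

Tags keep loss names and weights queryable in the MLflow UI, while the artifact preserves the full configuration for exact reproduction of a run.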