
Add auto loss logging#27

Merged
wli51 merged 12 commits into WayScience:main from wli51:dev-auto-loss-log
Apr 30, 2026

Conversation

@wli51 (Collaborator) commented Apr 25, 2026

This pull request adds automated logging of loss names, weights, and the other configs held by LossItem. This new feature should make experimenting with different loss combinations and relative loss weightings more reproducible and convenient, as the user never has to manually describe losses or specify tags: the trainer and logger communicate automatically to get this information logged.

Adds:

  1. Added get_config() methods to both the LossItem and LossGroup classes in loss_group.py
  2. Added a loss_groups property to both LoggingTrainer and LoggingWGANTrainer to standardize access to all loss groups and, specifically, the information we wish to log
  3. Added a new logging helper to the MlflowLogger class, which is now called automatically during the trainer's on_train_start life cycle. It logs all loss names and weights as tags and saves a config file under artifacts recording other LossItem settings such as enabled and compute_at_val
  4. Updated the signature of the __call__ methods in LossItem and LossGroup to accept an optional epoch argument. It is currently unused and has no effect, but in the future it should help support epoch-dependent loss weighting
  5. Some tests
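For illustration, items 1 and 4 above might look roughly like the following. This is a minimal sketch, not the project's actual loss_group.py: the field names beyond those mentioned in this PR (name, weight, enabled, compute_at_val) and the exact call signature are assumptions.

```python
# Hypothetical sketch of the LossItem changes described in this PR.
# The real class in loss_group.py has more behavior; only the fields
# named in the PR description are modeled here.
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class LossItem:
    name: str
    fn: Callable[..., float]
    weight: float = 1.0
    enabled: bool = True
    compute_at_val: bool = False

    def get_config(self) -> dict:
        # Serializable summary consumed by the logger; the loss
        # callable itself is intentionally not part of the config.
        return {
            "name": self.name,
            "weight": self.weight,
            "enabled": self.enabled,
            "compute_at_val": self.compute_at_val,
        }

    def __call__(self, pred, target, epoch: Optional[int] = None) -> float:
        # `epoch` is accepted but unused for now; it reserves room for
        # the epoch-dependent weight scheduling mentioned in item 4.
        return self.weight * self.fn(pred, target)
```

A LossGroup.get_config() would then presumably aggregate the configs of its items, giving the logger one serializable structure per group.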

Subsequent enhancement PRs will probably focus on similar auto-logging enhancements for the optimizer configurations.
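The auto-logging step in item 3 could be sketched as below. This is an illustrative helper, not the actual MlflowLogger API: the function name, the `{group_name: [item_config, ...]}` input shape, and the tag naming scheme are all assumptions made for the example.

```python
# Hypothetical sketch of the on_train_start auto-logging described in
# item 3: collect every LossItem config exposed by the trainer's
# loss_groups property, turn name/weight pairs into tags, and serialize
# the full configs into one JSON artifact.
import json
from typing import Any, Dict, List, Tuple


def build_loss_logging_payload(
    loss_groups: Dict[str, List[Dict[str, Any]]]
) -> Tuple[Dict[str, str], str]:
    """Return (tags, config_json) from {group_name: [item_config, ...]}."""
    tags: Dict[str, str] = {}
    for group, items in loss_groups.items():
        for cfg in items:
            # e.g. tag "loss.generator.l1_weight" -> "0.5"
            tags[f"loss.{group}.{cfg['name']}_weight"] = str(cfg["weight"])
    # The remaining settings (enabled, compute_at_val, ...) all go into
    # a single artifact rather than individual tags.
    config_json = json.dumps(loss_groups, indent=2, sort_keys=True)
    return tags, config_json
```

In an actual MLflow-backed logger, the returned tags could be passed to mlflow.set_tags() and the JSON string written as an artifact via mlflow.log_text(), but how this PR wires that up is not shown here.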

wli51 added 5 commits April 25, 2026 12:09
…entially useful optional parameter, epoch, that would help future addition of loss weight scheduling functionality; and 2) adding get_config functions for auto loss config logging
…poch as a parameter to the modified loss group (currently not used for anything) that will be useful in the future; and 2) add a loss group logging property for loss auto logging
… to global fixture, and modified the test_abstract_trainer tests to use fixtures as opposed to relying on a helper class import. This benefits future test additions by exposing more useful fixtures
@wli51 wli51 marked this pull request as ready for review April 27, 2026 18:34
@MattsonCam MattsonCam self-requested a review April 27, 2026 19:02
@MattsonCam (Member) left a comment


This looks clean and reasonable to me, good job @wli51 !

Comment thread src/virtual_stain_flow/engine/loss_group.py
Comment thread src/virtual_stain_flow/engine/loss_group.py Outdated
Comment thread src/virtual_stain_flow/engine/loss_group.py
Comment thread src/virtual_stain_flow/engine/loss_group.py
Comment thread src/virtual_stain_flow/vsf_logging/MlflowLogger.py
Comment thread src/virtual_stain_flow/vsf_logging/MlflowLogger.py
Comment thread tests/vsf_logging/test_mlflow_logger_loss_config.py Outdated
Comment thread tests/vsf_logging/test_mlflow_logger_loss_config.py Outdated
Comment thread src/virtual_stain_flow/trainers/logging_trainer.py Outdated
Comment thread tests/conftest.py
wli51 added 6 commits April 30, 2026 11:42
…y added `Progress` object in place of an epoch number.
…avior against one type of model across two trainers. Replace with newly added test files, separate for UNet, ConvNeXtUNet, and WGAN-with-UNet models, testing both loss logging and model logging.
…potentially useful enhancement fitting the theme of this PR
@wli51 (Collaborator, Author) commented Apr 30, 2026

Thanks @MattsonCam for reviewing! Merging now!

@wli51 wli51 merged commit ed6c22f into WayScience:main Apr 30, 2026
1 check passed
2 participants