🚀 Feature
The proposal is to add a callback hook that runs in the training step before `.backward`. The current hooks run either after `.backward` (`on_after_backward`) or after `optimizer.step` (`on_train_batch_end`, `on_batch_end`). The proposed hook would complement the existing ones.
Motivation
Adding this hook would allow callbacks that operate on the loss (or on the batch used to compute it) before the backward pass, e.g. inspecting or logging the loss tensor. The existing hooks cannot be used for this because, with automatic optimization, the backward pass and optimizer step have already run by the time they fire.
Adding the proposed hook would increase the flexibility and usability of PyTorch Lightning without modifying existing functionality.
Pitch
Concretely, it would suffice to add one additional callback hook, named for example `on_before_backward`, that takes `trainer` and `pl_module` as arguments and is invoked just before `.backward` is called on the loss.
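To make the intended hook placement concrete, here is a minimal sketch in plain Python (no Lightning dependency; the `Trainer`, `Callback`, and module classes below are simplified stand-ins, not the real Lightning API) showing where the proposed hook would fire relative to the existing `on_after_backward`:

```python
class Callback:
    """Simplified mirror of the Lightning Callback interface."""
    def on_before_backward(self, trainer, pl_module):  # proposed hook
        pass
    def on_after_backward(self, trainer, pl_module):   # existing hook
        pass

class HookOrderLogger(Callback):
    """Example user callback: records the order in which hooks fire."""
    def __init__(self):
        self.events = []
    def on_before_backward(self, trainer, pl_module):
        self.events.append("on_before_backward")
    def on_after_backward(self, trainer, pl_module):
        self.events.append("on_after_backward")

class Trainer:
    """Toy trainer illustrating the proposed hook placement."""
    def __init__(self, callbacks):
        self.callbacks = callbacks

    def _call_hook(self, name, pl_module):
        for cb in self.callbacks:
            getattr(cb, name)(self, pl_module)

    def run_training_step(self, pl_module, batch):
        loss = pl_module.training_step(batch)             # compute the loss
        self._call_hook("on_before_backward", pl_module)  # proposed: loss still untouched
        # loss.backward() would run here in real Lightning
        self._call_hook("on_after_backward", pl_module)   # existing hook
        return loss

class ToyModule:
    def training_step(self, batch):
        return sum(batch)  # stand-in for a real loss tensor

logger = HookOrderLogger()
trainer = Trainer(callbacks=[logger])
trainer.run_training_step(ToyModule(), [1.0, 2.0])
print(logger.events)  # ['on_before_backward', 'on_after_backward']
```

The key point is only the ordering: the new hook fires after the loss is computed but before the backward pass, so a callback can still act on the loss.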
Alternatives
Without the proposed callback, the end user has to modify their own code to add such a hook to every LightningModule they use.
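The workaround looks roughly like the following sketch (the helper name `pre_backward_inspection` and the toy module are illustrative, not part of any Lightning API): every module must call the user's pre-backward logic itself, duplicating the call across all modules.

```python
def pre_backward_inspection(loss, batch):
    """User-defined logic that would otherwise live in an
    on_before_backward callback."""
    return {"loss_value": float(loss), "batch_size": len(batch)}

class MyModule:
    """Stand-in for one of the user's LightningModules."""
    def __init__(self):
        self.last_inspection = None

    def training_step(self, batch):
        loss = sum(batch)  # toy loss computation
        # Every module must duplicate this call -- the maintenance
        # burden the proposed callback hook would remove.
        self.last_inspection = pre_backward_inspection(loss, batch)
        return loss

module = MyModule()
module.training_step([1.0, 2.0, 3.0])
print(module.last_inspection)  # {'loss_value': 6.0, 'batch_size': 3}
```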