
Add training step callback hook before .backward #7789

@GR4HAM

Description


🚀 Feature

The proposal is to add a callback hook for the training step before .backward. Current hooks run either after .backward (on_after_backward) or after optimizer.step (on_train_batch_end, on_batch_end). The proposed callback hook would complement the existing hooks.
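To make the ordering concrete, here is a minimal plain-Python sketch of the automatic-optimization batch loop (not the actual Lightning trainer code; hook names are taken from the issue, everything else is illustrative) showing where the proposed hook would slot in relative to the existing ones:

```python
# Illustrative sketch only -- not the real Lightning trainer loop.
calls = []

def training_step():
    calls.append("training_step")        # the loss is computed here
    return "loss"

def backward(loss):
    calls.append("loss.backward")        # gradients are computed here

def optimizer_step():
    calls.append("optimizer.step")       # weights are updated here

def run_batch():
    loss = training_step()
    calls.append("on_before_backward")   # proposed hook: loss exists, no gradients yet
    backward(loss)
    calls.append("on_after_backward")    # existing hook: gradients available
    optimizer_step()
    calls.append("on_train_batch_end")   # existing hook: after the optimizer update

run_batch()
```

The sketch shows the gap the proposal fills: the only point where the loss has been computed but `.backward()` has not yet run.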

Motivation

Adding this hook would allow callbacks that operate on the loss, or on batch data involved in the loss calculation, before gradients are computed. The existing hooks cannot be used for this because, under automatic optimization, .backward (and, for the batch-end hooks, optimizer.step) has already run by the time they fire.

Adding the proposed hook would increase the flexibility and usability of PyTorch Lightning without modifying existing functionality.

Pitch

Concretely, it would suffice to add one additional callback hook, named for example "on_before_backward", that takes trainer and pl_module as arguments and is invoked just before .backward.
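A hedged sketch of what such a callback could look like from the user's side, using stand-in classes rather than the real pytorch_lightning base classes (the `LossInspectionCallback` name and the logging behavior are hypothetical; only the hook name and its `(trainer, pl_module)` signature come from the proposal):

```python
# Stand-in for pytorch_lightning.callbacks.Callback -- illustrative only.
class Callback:
    def on_before_backward(self, trainer, pl_module):
        """Proposed hook: called after the loss is computed, before .backward()."""
        pass

class LossInspectionCallback(Callback):
    """Hypothetical user callback exercising the proposed hook."""
    def on_before_backward(self, trainer, pl_module):
        # At this point the loss exists but no gradients do, so a callback
        # could inspect or log loss-related state here.
        pl_module.log.append("on_before_backward fired")

class DummyModule:
    """Minimal stand-in for a LightningModule."""
    def __init__(self):
        self.log = []

module = DummyModule()
cb = LossInspectionCallback()
cb.on_before_backward(trainer=None, pl_module=module)
```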

Alternatives

Without the proposed callback, the end user's only alternative is to hand-roll an equivalent hook themselves, duplicated across every LightningModule they use.
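A sketch of that workaround, again with plain-Python stand-ins rather than real Lightning classes (the class names and the placeholder loss are hypothetical): each module must define and manually wire in its own pre-backward hook.

```python
# Sketch of the workaround: every module the user writes must
# hand-wire a custom hook inside training_step.
class MyModuleBase:
    """Hypothetical base class the user maintains themselves."""
    def on_before_backward(self):
        pass  # user-defined hook, duplicated across projects

class MyModule(MyModuleBase):
    def __init__(self):
        self.events = []

    def compute_loss(self, batch):
        return sum(batch)  # placeholder loss for illustration

    def training_step(self, batch):
        loss = self.compute_loss(batch)
        self.on_before_backward()  # must be called manually in every module
        return loss

    def on_before_backward(self):
        self.events.append("custom pre-backward logic")

m = MyModule()
loss = m.training_step([1, 2, 3])
```

The duplication is the pain point: a built-in hook would let the same logic live once, in a Callback, instead of in every module.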

Labels: feature (Is an improvement or enhancement), help wanted (Open to be worked on)
