A simple PyTorch implementation of the Gradient Difference Loss (GDL) function, together with a hybrid formulation combining it with the MSE loss, for training convolutional neural networks.
First proposed in [1].
Expression of the Mean Squared Error (already implemented in PyTorch):
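As a reminder, the standard definition (notation is ours: $Y$ the target, $\hat{Y}$ the prediction, $N$ the number of pixels):

$$\mathcal{L}_{MSE}(Y, \hat{Y}) = \frac{1}{N} \sum_{i=1}^{N} \left( Y_i - \hat{Y}_i \right)^2$$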
Expression of the Gradient Difference Loss (from [1]):
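In the form given by Mathieu et al. [1], for a 2D target $Y$ and prediction $\hat{Y}$, with $\alpha \geq 1$ an exponent hyperparameter:

$$\mathcal{L}_{gdl}(Y, \hat{Y}) = \sum_{i,j} \left| |Y_{i,j} - Y_{i-1,j}| - |\hat{Y}_{i,j} - \hat{Y}_{i-1,j}| \right|^{\alpha} + \left| |Y_{i,j-1} - Y_{i,j}| - |\hat{Y}_{i,j-1} - \hat{Y}_{i,j}| \right|^{\alpha}$$

It penalizes differences between the spatial gradients of the prediction and of the target, which sharpens edges that MSE alone tends to blur.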
Hybrid Loss function combining GDL and MSE.
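With the notation above, the combined loss reads:

$$\mathcal{L} = \lambda_{GDL}\,\mathcal{L}_{gdl}(Y, \hat{Y}) + \lambda_{MSE}\,\mathcal{L}_{MSE}(Y, \hat{Y})$$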
The lambdas are scalar weighting coefficients that balance the contributions of the GDL and MSE terms. For a given 2D prediction, the GDL loss is typically much larger than the MSE loss, so lambdaGDL is set smaller than lambdaMSE. Their values can be tuned by testing on the available data (from [2]).
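A minimal sketch of the hybrid loss in PyTorch (function names, the NCHW tensor layout, and the default lambda values are our assumptions, not prescribed by the repo):

```python
import torch
import torch.nn.functional as F

def gdl_loss(pred, target, alpha=1):
    """Gradient Difference Loss for (N, C, H, W) tensors, after Mathieu et al. [1].

    Compares absolute spatial gradients of prediction and target along
    both image axes, raised to the power alpha, averaged over all pixels.
    """
    # Absolute finite differences along the height (rows) and width (columns) axes
    pred_dy = torch.abs(pred[:, :, 1:, :] - pred[:, :, :-1, :])
    pred_dx = torch.abs(pred[:, :, :, 1:] - pred[:, :, :, :-1])
    target_dy = torch.abs(target[:, :, 1:, :] - target[:, :, :-1, :])
    target_dx = torch.abs(target[:, :, :, 1:] - target[:, :, :, :-1])

    # Penalize the mismatch between predicted and target gradient magnitudes
    grad_diff_y = torch.abs(target_dy - pred_dy)
    grad_diff_x = torch.abs(target_dx - pred_dx)
    return (grad_diff_y ** alpha).mean() + (grad_diff_x ** alpha).mean()

def hybrid_loss(pred, target, lambda_mse=1.0, lambda_gdl=0.1):
    """Weighted sum of MSE and GDL; lambda_gdl < lambda_mse since GDL is usually larger."""
    return lambda_mse * F.mse_loss(pred, target) + lambda_gdl * gdl_loss(pred, target)
```

A perfect prediction yields a loss of zero, and any gradient mismatch increases the GDL term; the lambdas can then be adjusted on validation data as discussed above.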
[1] Mathieu et al. (2015). Deep multi-scale video prediction beyond mean square error. https://arxiv.org/abs/1511.05440
[2] Alguacil et al. (2021). Predicting the Propagation of Acoustic Waves using Deep Convolutional Neural Networks. https://arc.aiaa.org/doi/10.2514/6.2020-2513