mmany/pytorch-GDL

Gradient Difference Loss (GDL) in PyTorch

A simple implementation of the Gradient Difference Loss (GDL) function in PyTorch, and a hybrid formulation combining it with the MSE loss, for training Convolutional Neural Networks.

First proposed in [1].

Expression of the Mean Squared Error (already implemented in PyTorch as `torch.nn.MSELoss`):

$$\mathrm{MSE}(\hat{Y}, Y) = \frac{1}{N} \sum_{i=1}^{N} \left( \hat{Y}_i - Y_i \right)^2$$

Expression of the Gradient Difference Loss (from [1]), which penalizes differences between the spatial gradients of the prediction and of the target:

$$\mathcal{L}_{gdl}(\hat{Y}, Y) = \sum_{i,j} \left| \, |Y_{i,j} - Y_{i-1,j}| - |\hat{Y}_{i,j} - \hat{Y}_{i-1,j}| \, \right|^{\alpha} + \left| \, |Y_{i,j-1} - Y_{i,j}| - |\hat{Y}_{i,j-1} - \hat{Y}_{i,j}| \, \right|^{\alpha}$$
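A minimal PyTorch sketch of the GDL term, assuming `(N, C, H, W)` tensors; the function name and the choice of summing (rather than averaging) over pixels are assumptions, not necessarily this repository's exact implementation:

```python
import torch

def gradient_difference_loss(pred, target, alpha=1):
    """GDL as in Mathieu et al. [1]: compare absolute neighbor differences
    of prediction and target along both spatial axes."""
    # Gradients along the width (horizontal neighbor differences)
    grad_pred_h = torch.abs(pred[..., :, 1:] - pred[..., :, :-1])
    grad_target_h = torch.abs(target[..., :, 1:] - target[..., :, :-1])
    # Gradients along the height (vertical neighbor differences)
    grad_pred_v = torch.abs(pred[..., 1:, :] - pred[..., :-1, :])
    grad_target_v = torch.abs(target[..., 1:, :] - target[..., :-1, :])
    # Sum the alpha-powered differences of the gradient magnitudes
    gdl = (torch.abs(grad_target_h - grad_pred_h) ** alpha).sum() \
        + (torch.abs(grad_target_v - grad_pred_v) ** alpha).sum()
    return gdl
```

With `alpha=1` the term reduces to an L1 distance between gradient magnitudes; [1] also uses `alpha=2`.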

Hybrid loss function combining GDL and MSE:

$$\mathcal{L} = \lambda_{MSE} \, \mathrm{MSE}(\hat{Y}, Y) + \lambda_{GDL} \, \mathcal{L}_{gdl}(\hat{Y}, Y)$$

The lambdas are scalar weighting coefficients that balance the contributions of the GDL and MSE terms. For a given 2D prediction, the GDL term is typically much larger than the MSE term, so lambdaGDL is set smaller than lambdaMSE. Suitable values can be tuned on the available data (from [2]).
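A sketch of the weighted combination, assuming `(N, C, H, W)` tensors; the function name, default lambda values, and the use of mean reduction for both terms are illustrative assumptions:

```python
import torch
import torch.nn.functional as F

def combined_loss(pred, target, lambda_mse=1.0, lambda_gdl=0.1, alpha=1):
    """Weighted sum of MSE and GDL; lambda values are placeholders to be
    tuned on the available data, as suggested in [2]."""
    mse = F.mse_loss(pred, target)
    # GDL term: absolute neighbor differences along width and height
    gp_h = torch.abs(pred[..., :, 1:] - pred[..., :, :-1])
    gt_h = torch.abs(target[..., :, 1:] - target[..., :, :-1])
    gp_v = torch.abs(pred[..., 1:, :] - pred[..., :-1, :])
    gt_v = torch.abs(target[..., 1:, :] - target[..., :-1, :])
    gdl = (torch.abs(gt_h - gp_h) ** alpha).mean() \
        + (torch.abs(gt_v - gp_v) ** alpha).mean()
    return lambda_mse * mse + lambda_gdl * gdl
```

Setting `lambda_gdl=0` recovers plain MSE training, which gives a simple sanity check when tuning the weights.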

References

[1] Mathieu et al. (2015). Deep multi-scale video prediction beyond mean square error. https://arxiv.org/abs/1511.05440

[2] Alguacil et al. (2021). Predicting the Propagation of Acoustic Waves using Deep Convolutional Neural Networks. https://arc.aiaa.org/doi/10.2514/6.2020-2513
