
Relativistic Gradient Descent (RGD)

RGD is a simple optimization method based on simulating a relativistic particle moving under the influence of a potential (the objective function) and friction. A symplectic integrator is used to simulate this physical system.
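The following is a minimal, illustrative sketch of this idea (not the exact scheme implemented in this repository): a momentum variable is damped by friction and kicked by the gradient, and the position update is rescaled by a relativistic factor that bounds the effective step, in analogy with the speed-of-light limit. The function name, parameter names, and default values are hypothetical.

```python
import numpy as np

def rgd_step(x, p, grad_f, step=1e-3, mu=0.9, delta=1e-3):
    """One illustrative RGD-style update (hypothetical parameters).

    x      : current position (parameters being optimized)
    p      : momentum vector
    grad_f : callable returning the gradient of the objective at x
    step   : step size
    mu     : friction/damping factor (as in classical momentum)
    delta  : controls the strength of the relativistic rescaling
    """
    p = mu * p - step * grad_f(x)                        # friction + gradient kick
    gamma = 1.0 / np.sqrt(1.0 + delta * np.dot(p, p))    # relativistic factor bounding the step
    x = x + step * gamma * p                             # rescaled position update
    return x, p

# Example: minimize f(x) = ||x||^2 / 2 from a random start.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=10)
    p = np.zeros_like(x)
    for _ in range(2000):
        x, p = rgd_step(x, p, grad_f=lambda z: z)
    print(np.linalg.norm(x))  # should be close to 0
```

Note that as delta goes to zero the relativistic factor tends to 1 and the update reduces to a classical momentum step, which is one way to see that RGD contains CM as a limiting case.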

Gradient descent (GD) is probably the most well-known optimization method. The classical momentum method (CM), also known as Polyak's heavy ball, and Nesterov's accelerated gradient method (NAG) are accelerated variants of GD that are extensively used in machine learning. RGD generalizes both CM and NAG and usually has superior performance. For instance, its convergence on a matrix completion problem (which is nonconvex) is illustrated in the figure below.
