WWW'24, Mirror Gradient (MG) helps multimodal recommendation models reach flat local minima more easily than models with standard training.
Gradient method, Newton method, Euler and Heun methods, the discrete Fourier transform, and Monte Carlo simulation in C++.
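The first two items can be contrasted in a few lines. A minimal sketch (in Python for brevity, not the repository's C++; the test function f(x) = x⁴/4 − x and all step sizes are illustrative assumptions):

```python
def f_prime(x):
    # derivative of f(x) = x**4 / 4 - x; the unique minimiser is x = 1
    return x ** 3 - 1

def f_second(x):
    # second derivative, used by Newton's method
    return 3 * x ** 2

def gradient_descent(x, lr=0.1, steps=200):
    # fixed-step gradient method: x <- x - lr * f'(x)
    for _ in range(steps):
        x -= lr * f_prime(x)
    return x

def newton(x, steps=20):
    # Newton's method for minimisation: x <- x - f'(x) / f''(x)
    for _ in range(steps):
        x -= f_prime(x) / f_second(x)
    return x
```

Newton's method uses curvature information and converges quadratically near the minimiser, while the fixed-step gradient method only converges linearly.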
Optimisation and algorithm project. I) L1, L2, and L2^2 regularisers in optimal trajectory synthesis; II) Logistic data classification; III) Gradient methods.
Implementation of unconstrained and constrained convex optimization algorithms in Python, focusing on solving data science problems such as semi-supervised learning and Support Vector Machines.
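A common building block for the constrained convex case is projected gradient descent: take a gradient step, then project back onto the feasible set. A minimal sketch, assuming a toy objective ||x − c||² over the box [0, 1]ⁿ (the objective, constraint, and step size are illustrative, not the repository's code):

```python
def project_box(x, lo=0.0, hi=1.0):
    # Euclidean projection onto the box [lo, hi]^n is coordinate-wise clipping
    return [min(max(xi, lo), hi) for xi in x]

def projected_gradient(c, lr=0.4, steps=100):
    # minimise f(x) = ||x - c||^2 subject to x in [0, 1]^n
    x = [0.0] * len(c)
    for _ in range(steps):
        grad = [2 * (xi - ci) for xi, ci in zip(x, c)]  # gradient of f
        x = project_box([xi - lr * gi for xi, gi in zip(x, grad)])
    return x
```

For this particular objective the exact solution is just the projection of c onto the box, which makes the sketch easy to check by hand.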
AutoSGM
A simple implementation of stochastic and batch gradient descent, with a comparison against the standard gradient descent method.
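The comparison described above can be sketched on a tiny least-squares fit of y ≈ w·x + b; the data, learning rates, and epoch counts below are illustrative assumptions, not the repository's code:

```python
import random

def batch_gd(data, lr=0.05, epochs=500):
    # full-batch gradient descent: one update per pass, using the mean gradient
    w, b = 0.0, 0.0
    n = len(data)
    for _ in range(epochs):
        gw = sum(2 * (w * x + b - y) * x for x, y in data) / n
        gb = sum(2 * (w * x + b - y) for x, y in data) / n
        w, b = w - lr * gw, b - lr * gb
    return w, b

def sgd(data, lr=0.02, epochs=200, seed=0):
    # stochastic gradient descent: one update per sample, in shuffled order
    rng = random.Random(seed)
    w, b = 0.0, 0.0
    samples = list(data)
    for _ in range(epochs):
        rng.shuffle(samples)
        for x, y in samples:
            err = w * x + b - y
            w, b = w - lr * 2 * err * x, b - lr * 2 * err
    return w, b
```

Batch descent takes smooth, deterministic steps along the mean gradient, while SGD makes many cheap, noisy updates per pass; on consistent data both recover the same line.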