
A PyTorch implementation of various Online & Stochastic optimization algorithms for deep learning


pursueorigin/PyTorch_OLoptim


PyTorch_OLoptim

A PyTorch implementation of various online / stochastic optimization algorithms

Descriptions

FTRL: Follow the Regularized Leader

  • intro: a classic algorithm in online learning
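FTRL chooses each iterate as the minimizer of the regularized cumulative loss seen so far. For linear losses with a fixed quadratic regularizer this minimizer has a closed form, which the 1-D sketch below illustrates (function name and the λ parameter are assumed for illustration; this is not the repo's PyTorch implementation):

```python
def ftrl(grads, lam=1.0):
    """Minimal 1-D FTRL sketch with quadratic regularizer (lam/2) * w**2.

    For linear losses with gradients g_1..g_t, the regularized leader is
    w_{t+1} = argmin_w  (sum of g_s) * w + (lam/2) * w**2 = -(sum of g_s) / lam.
    """
    z = 0.0                          # running sum of loss gradients
    iterates = []
    for g in grads:
        z += g
        iterates.append(-z / lam)    # closed-form regularized leader
    return iterates
```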

FTML: [ICML 2017] Follow the Moving Leader in Deep Learning

SGDOL: [NeurIPS 2019] Surrogate Losses for Online Learning of Stepsizes in Stochastic Non-Convex Optimization

STORM: [NeurIPS 2019] Momentum-Based Variance Reduction in Non-Convex SGD
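STORM's key idea is a corrected momentum estimator that evaluates the stochastic gradient at both the current and previous iterate with the *same* sample, which cancels variance without large batches. A 1-D toy sketch of that estimator (names, step size, and momentum parameter are assumptions for illustration, not the repo's implementation):

```python
import random

def storm_minimize(grad, x0, lr=0.1, a=0.3, steps=100):
    """Toy 1-D sketch of the STORM gradient estimator:

        d_t = grad(x_t; xi_t) + (1 - a) * (d_{t-1} - grad(x_{t-1}; xi_t)),

    where the SAME noise sample xi_t is used at x_t and x_{t-1}.
    `grad(x, xi)` returns a stochastic gradient of the objective at x.
    """
    x, x_prev, d = x0, None, None
    for _ in range(steps):
        xi = random.gauss(0.0, 0.1)                    # shared sample xi_t
        g_new = grad(x, xi)
        if d is None:
            d = g_new                                  # first step: plain SGD
        else:
            d = g_new + (1 - a) * (d - grad(x_prev, xi))
        x_prev, x = x, x - lr * d
    return x
```

On the quadratic f(x) = x² with noisy gradient 2x + ξ, the iterate is driven close to the minimizer at 0.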

EXP3: Exponential-weight algorithm for Exploration and Exploitation
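EXP3 handles adversarial bandits by sampling arms from an exponentially weighted distribution and correcting each observed reward by its sampling probability. A minimal sketch (the `reward` callback and γ value are assumptions for illustration):

```python
import math
import random

def exp3(reward, n_arms, T, gamma=0.1):
    """Minimal EXP3 sketch; `reward(t, arm)` returns a reward in [0, 1]."""
    w = [1.0] * n_arms
    for t in range(T):
        total = sum(w)
        # Mix the exponential weights with uniform exploration.
        p = [(1 - gamma) * wi / total + gamma / n_arms for wi in w]
        arm = random.choices(range(n_arms), weights=p)[0]
        r = reward(t, arm)
        xhat = r / p[arm]                    # importance-weighted estimate
        w[arm] *= math.exp(gamma * xhat / n_arms)
    return w
```

With one clearly better arm, its weight dominates after a few hundred rounds.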

UCB: Upper Confidence Bound algorithm
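UCB plays the arm with the highest empirical mean plus a confidence bonus that shrinks as an arm is pulled more often, which balances exploration and exploitation without randomness. A sketch of the classic UCB1 rule (the `pull` callback is an assumption for illustration):

```python
import math

def ucb1(pull, n_arms, T):
    """Minimal UCB1 sketch; `pull(arm)` returns a reward in [0, 1]."""
    counts = [0] * n_arms
    sums = [0.0] * n_arms
    for t in range(T):
        if t < n_arms:
            arm = t                           # play every arm once first
        else:
            arm = max(
                range(n_arms),
                key=lambda a: sums[a] / counts[a]
                + math.sqrt(2 * math.log(t + 1) / counts[a]),
            )
        sums[arm] += pull(arm)
        counts[arm] += 1
    return counts
```

On a two-armed instance with means 0.9 and 0.1, the better arm receives the vast majority of pulls.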

SGDPF

  • intro: a toy example that uses gradient descent to tune the learning rate automatically. The name comes from 'SGD + parameter free'
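One common way to tune the learning rate by gradient descent is the hypergradient trick: since x_t = x_{t-1} - η·g_{t-1}, the derivative of the loss with respect to η is approximately -g_t·g_{t-1}, so η can itself be updated by gradient descent. The sketch below uses this rule as an illustration; the exact update in the repo's SGDPF may differ, and all names and constants here are assumptions:

```python
def sgd_hypergrad(grad, x0, eta0=0.01, beta=1e-4, steps=100):
    """Toy 1-D sketch: tune the step size eta by gradient descent on itself.

    dLoss/deta ~ -g_t * g_{t-1}, so eta grows while successive gradients
    point the same way and shrinks when they disagree (hypergradient-style
    rule, assumed for illustration).
    """
    x, eta, g_prev = x0, eta0, 0.0
    for _ in range(steps):
        g = grad(x)
        eta += beta * g * g_prev     # gradient step on the step size
        x -= eta * g                 # ordinary SGD step with tuned eta
        g_prev = g
    return x, eta
```

On f(x) = x² starting from x = 5, the step size grows from its small initial value and the iterate converges toward 0.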
