SAGA

A reproduction of the article "SAGA: A Fast Incremental Gradient Method With Support for Non-Strongly Convex Composite Objectives" by Aaron Defazio, Francis Bach, and Simon Lacoste-Julien, carried out during a course given by Francis Bach at Université Paris-Sud. The article is available on arXiv: https://arxiv.org/abs/1407.0202

The article introduces a (then new) stochastic optimization algorithm named SAGA. It can be seen as an extension of SAG (stochastic average gradient) and SVRG (stochastic variance reduced gradient). SAGA is shown to have better theoretical convergence guarantees than both of those algorithms, and it supports non-differentiable regularizers through the use of the proximal operator. SAGA automatically adapts to any inherent strong convexity of the problem and also applies directly to non-strongly convex objectives.
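As a rough illustration of the update rule described above, here is a minimal SAGA sketch in Python for the lasso problem min_x (1/2n)||Ax - b||^2 + lam*||x||_1. This is not the code from this repository; all names (saga, A, b, lam, step) are illustrative.

```python
import numpy as np

def saga(A, b, step, lam, n_epochs=50, seed=0):
    """Minimal SAGA sketch (illustrative, not this repo's code) for
        min_x (1/(2n)) * ||A x - b||^2 + lam * ||x||_1,
    following the update rule of Defazio, Bach, and Lacoste-Julien (2014)."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    # Table of stored per-sample gradients f_i'(phi_i) and their running average.
    grad_table = np.zeros((n, d))
    grad_avg = np.zeros(d)

    def soft_threshold(v, t):
        # Proximal operator of t * ||.||_1 (handles the non-smooth L1 term).
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    for _ in range(n_epochs * n):
        j = rng.integers(n)
        # Fresh gradient of f_j(x) = (1/2) * (a_j^T x - b_j)^2.
        g_new = (A[j] @ x - b[j]) * A[j]
        # Variance-reduced, unbiased gradient estimate, then a prox step.
        w = x - step * (g_new - grad_table[j] + grad_avg)
        x = soft_threshold(w, step * lam)
        # Replace the stored gradient and refresh its average in O(d).
        grad_avg += (g_new - grad_table[j]) / n
        grad_table[j] = g_new
    return x
```

The paper recommends the step size 1/(3L) in the general convex case, where L bounds the smoothness constants of the f_i; for least squares L_i = ||a_i||^2, so one could call, for example, `saga(A, b, step=1.0 / (3 * np.max(np.sum(A**2, axis=1))), lam=0.1)`.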

About

A class project I did for the course "Statistics and Optimization" given by Francis Bach.
