Spring term 2021

  1. Some applications of convex optimization (ru)

  2. Intro to numerical optimization methods. Gradient descent (ru) (a minimal sketch follows the list)

  3. How to accelerate gradient descent: the conjugate gradient method, the heavy-ball method, and the fast gradient method (vol 1, ru; vol 2, ru)

  4. Second-order methods: Newton's method. Quasi-Newton methods as a trade-off between convergence speed and the cost of one iteration (ru)

  5. Non-smooth optimization problems: subgradient methods and intro to proximal methods (en)

  6. Smoothing: smooth minimization of non-smooth functions (original paper) (ru)

  7. Simple constrained optimization problems: the projected gradient method and the Frank-Wolfe method (ru)

  8. General purpose solvers: interior point methods (ru)

  9. How to parallelize optimization methods: penalty method, augmented Lagrangian method and ADMM (ru)

  10. Coordinate-wise methods
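
To give a flavour of the material in lecture 2, here is a minimal gradient descent sketch on a strongly convex quadratic. It is only an illustration under stated assumptions (the `gradient_descent` function, the random test problem, and the constant 1/L step size are not taken from the lecture notes):

```python
# Minimal gradient descent sketch for a strongly convex quadratic
# f(x) = 0.5 * x^T A x - b^T x, with constant step size 1/L,
# where L = lambda_max(A) is the Lipschitz constant of the gradient.
import numpy as np

def gradient_descent(A, b, x0, n_iters=100):
    L = np.linalg.eigvalsh(A).max()   # Lipschitz constant of grad f
    x = x0.copy()
    for _ in range(n_iters):
        grad = A @ x - b              # gradient of the quadratic objective
        x = x - grad / L              # constant step 1/L
    return x

# Usage: random positive definite A, compared against the exact solution A^{-1} b
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + np.eye(5)
b = rng.standard_normal(5)
x_star = np.linalg.solve(A, b)
x_gd = gradient_descent(A, b, np.zeros(5), n_iters=500)
print(np.linalg.norm(x_gd - x_star))  # should be close to zero
```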