1. Overview of the term topics and some applications of convex optimization (ru)
2. Intro to numerical optimization methods. Gradient descent (ru)
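
A minimal sketch of fixed-step gradient descent on a toy quadratic; the objective, step size, and iteration count below are illustrative assumptions, not taken from the lecture:

```python
import numpy as np

def gradient_descent(grad, x0, step=0.1, n_iters=100):
    """Fixed-step gradient descent: x_{k+1} = x_k - step * grad(x_k)."""
    x = x0.copy()
    for _ in range(n_iters):
        x = x - step * grad(x)
    return x

# Toy quadratic f(x) = 0.5 x'Ax - b'x, so grad f(x) = Ax - b and the
# minimizer solves Ax = b; the step must stay below 2/L, L = lambda_max(A).
A = np.array([[3.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, -1.0])
x_gd = gradient_descent(lambda x: A @ x - b, np.zeros(2))
print(x_gd, np.linalg.solve(A, b))  # the two should nearly coincide
```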
3. How to accelerate gradient descent
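
One standard acceleration is Nesterov's momentum: take the gradient step at an extrapolated point. Below is a sketch using the common (k - 1)/(k + 2) extrapolation schedule on the same kind of toy quadratic; the schedule and step size are illustrative assumptions:

```python
import numpy as np

def nesterov(grad, x0, step, n_iters=100):
    """Accelerated gradient: evaluate the gradient at an extrapolated
    point y instead of at the current iterate x."""
    x, x_prev = x0.copy(), x0.copy()
    for k in range(1, n_iters + 1):
        y = x + (k - 1) / (k + 2) * (x - x_prev)  # momentum extrapolation
        x_prev, x = x, y - step * grad(y)
    return x

A = np.array([[3.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, -1.0])
# step chosen just below 1/L, where L = lambda_max(A) is about 3.12 here
print(nesterov(lambda x: A @ x - b, np.zeros(2), step=1 / 3.2))
```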
4. Second-order methods: Newton's method. Quasi-Newton methods as a trade-off between convergence speed and the cost of one iteration (ru)
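
A sketch of the pure Newton step (solve a linear system with the Hessian at every iteration) on a toy smooth strongly convex function, with a BFGS quasi-Newton run via scipy for comparison; the objective is an illustrative assumption:

```python
import numpy as np
from scipy.optimize import minimize

def newton(grad, hess, x0, n_iters=20):
    """Newton's method: solve H(x) d = -grad f(x), then step to x + d."""
    x = x0.copy()
    for _ in range(n_iters):
        x = x + np.linalg.solve(hess(x), -grad(x))
    return x

# Toy separable objective: f(x) = sum_i exp(x_i) + x_i^2.
f = lambda x: np.sum(np.exp(x) + x**2)
grad = lambda x: np.exp(x) + 2 * x
hess = lambda x: np.diag(np.exp(x) + 2)

print(newton(grad, hess, np.ones(3)))
# BFGS builds a Hessian approximation from gradients only: each iteration
# is cheaper than Newton's, at the price of more iterations.
print(minimize(f, np.ones(3), jac=grad, method="BFGS").x)
```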
5. Non-smooth optimization problems: subgradient methods and intro to proximal methods (en)
    5*. Smoothing: smooth minimization of non-smooth functions (original paper)
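
For the proximal part, a minimal proximal-gradient (ISTA) sketch on a lasso problem, where the prox of the l1 norm is coordinatewise soft-thresholding; the problem data is synthetic and the regularization weight is an arbitrary illustrative choice:

```python
import numpy as np

def soft_threshold(v, t):
    """Prox of t * ||.||_1: shrink each coordinate toward zero by t."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, n_iters=500):
    """ISTA for min 0.5 ||Ax - b||^2 + lam ||x||_1: gradient step on the
    smooth part, then the prox of the non-smooth part."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L with L = ||A||_2^2
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        x = soft_threshold(x - step * A.T @ (A @ x - b), step * lam)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
x_true = np.zeros(10)
x_true[:3] = [2.0, -1.0, 0.5]
print(proximal_gradient(A, A @ x_true, lam=0.1))  # recovers a sparse x
```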
6. Frank-Wolfe method (ru)
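
A Frank-Wolfe sketch over the probability simplex, where the linear minimization oracle reduces to picking the vertex with the smallest gradient entry; the objective and the 2/(k + 2) step rule are standard textbook choices used here as an illustration:

```python
import numpy as np

def frank_wolfe_simplex(grad, n, n_iters=200):
    """Frank-Wolfe over the probability simplex: stay feasible by moving
    toward a vertex returned by the linear minimization oracle."""
    x = np.ones(n) / n
    for k in range(n_iters):
        s = np.zeros(n)
        s[np.argmin(grad(x))] = 1.0   # LMO: argmin_{s in simplex} <grad, s>
        x += 2.0 / (k + 2) * (s - x)  # standard step size 2/(k+2)
    return x

# Toy objective: min ||x - c||^2 over the simplex, i.e. project c onto it.
c = np.array([0.1, 0.6, 0.2, 0.4])
print(frank_wolfe_simplex(lambda x: 2 * (x - c), len(c)))
```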
7. General purpose solvers
    - interior point methods (a minimal barrier-method sketch follows this item)
    - SQP as a generalization of interior point methods to non-convex problems
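
A minimal log-barrier interior-point sketch for a small inequality-constrained QP; the path parameters (t0, mu) and the fixed Newton iteration counts are illustrative assumptions, and the line search only enforces strict feasibility rather than a full decrease condition:

```python
import numpy as np

def barrier_qp(P, q, A, b, x0, t0=1.0, mu=10.0, outer=8, inner=25):
    """Log-barrier method for min 0.5 x'Px + q'x  s.t.  Ax <= b.
    The outer loop increases t; the inner loop takes Newton steps on
    phi_t(x) = t (0.5 x'Px + q'x) - sum_i log(b_i - a_i'x)."""
    x, t = x0.copy(), t0
    for _ in range(outer):
        for _ in range(inner):
            r = b - A @ x                              # slacks, kept > 0
            g = t * (P @ x + q) + A.T @ (1.0 / r)      # grad of phi_t
            H = t * P + A.T @ np.diag(1.0 / r**2) @ A  # Hessian of phi_t
            dx = np.linalg.solve(H, -g)
            alpha = 1.0
            while np.any(b - A @ (x + alpha * dx) <= 0):
                alpha *= 0.5                           # stay strictly feasible
            x = x + alpha * dx
        t *= mu
    return x

# Toy QP: minimize ||x||^2 subject to x >= 1 elementwise; solution is all ones.
P, q = 2 * np.eye(2), np.zeros(2)
A_in, b_in = -np.eye(2), -np.ones(2)
print(barrier_qp(P, q, A_in, b_in, x0=np.full(2, 2.0)))
```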
8. How to parallelize optimization methods: penalty method, augmented Lagrangian method and ADMM (ru)
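
An ADMM sketch on the lasso, splitting the quadratic and the l1 term into separate subproblems coupled through a dual variable; this same splitting structure is what lets the method be parallelized across data blocks. The values of rho and lam are arbitrary illustrative choices:

```python
import numpy as np

def admm_lasso(A, b, lam, rho=1.0, n_iters=200):
    """ADMM for min 0.5 ||Ax - b||^2 + lam ||z||_1  s.t.  x = z."""
    n = A.shape[1]
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)
    M = A.T @ A + rho * np.eye(n)  # factor once, reused every iteration
    Atb = A.T @ b
    for _ in range(n_iters):
        x = np.linalg.solve(M, Atb + rho * (z - u))               # quadratic step
        z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0)  # prox of l1
        u = u + x - z                                             # dual update
    return z

rng = np.random.default_rng(1)
A = rng.standard_normal((40, 15))
x_true = np.zeros(15)
x_true[:4] = [1.5, -2.0, 0.7, 1.0]
print(admm_lasso(A, A @ x_true, lam=0.1))  # recovers a sparse x
```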
9. Stochastic gradient methods: non-convex, non-smooth, but structured objectives. Training neural networks as a basic example (ru)
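
A minibatch SGD sketch training a tiny one-hidden-layer ReLU network with manual backprop on synthetic regression data; the architecture, step size, batch size, and epoch count are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic regression data: a nonlinear target of a random linear feature.
X = rng.standard_normal((256, 4))
y = np.sin(X @ np.array([1.0, -0.5, 0.3, 0.8]))[:, None]

W1 = rng.standard_normal((4, 16)) * 0.5
b1 = np.zeros(16)
W2 = rng.standard_normal((16, 1)) * 0.5
b2 = np.zeros(1)
step, batch = 0.05, 32

for epoch in range(200):
    idx = rng.permutation(len(X))            # fresh shuffle each epoch
    for s in range(0, len(X), batch):
        xb, yb = X[idx[s:s + batch]], y[idx[s:s + batch]]
        h = np.maximum(xb @ W1 + b1, 0.0)    # ReLU hidden layer
        pred = h @ W2 + b2
        g = 2 * (pred - yb) / len(xb)        # grad of the batch MSE
        gW2, gb2 = h.T @ g, g.sum(0)         # backprop: output layer
        gh = (g @ W2.T) * (h > 0)            # backprop through ReLU
        gW1, gb1 = xb.T @ gh, gh.sum(0)      # backprop: hidden layer
        W1 -= step * gW1; b1 -= step * gb1   # SGD updates
        W2 -= step * gW2; b2 -= step * gb2

h = np.maximum(X @ W1 + b1, 0.0)
print("final MSE:", np.mean((h @ W2 + b2 - y) ** 2))
```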