Introductory course on non-smooth optimization
The course mainly covers non-smooth optimization and first-order proximal splitting methods. Topics include gradient-based methods (the (sub)gradient method, the proximal gradient method, and accelerated gradient methods), operator splitting methods (the augmented Lagrangian method, the alternating direction method of multipliers, monotone operators, and operator splitting schemes), and (possibly) interior-point algorithms. Non-convex optimization and stochastic optimization will also be introduced.
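As a concrete taste of the course material, here is a minimal NumPy sketch (not from the course slides) of the proximal gradient method, applied to the lasso problem min_x ½‖Ax − b‖² + λ‖x‖₁; the function names, step-size choice, and iteration count are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient_lasso(A, b, lam, n_iter=500):
    # ISTA iteration: x+ = prox_{s*lam*||.||_1}(x - s * A^T(Ax - b)),
    # with fixed step s = 1/L, where L = ||A||_2^2 is the Lipschitz
    # constant of the gradient of the smooth part.
    L = np.linalg.norm(A, 2) ** 2
    s = 1.0 / L
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)
        x = soft_threshold(x - s * grad, s * lam)
    return x
```

With this step size the iteration is a descent method, so the lasso objective decreases monotonically from the zero initialization.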
- Gradient method
- Proximal gradient method
- Krasnosel'skii--Mann iteration
- Backward--Backward splitting
- Douglas--Rachford splitting
- Primal--Dual splitting
- Other operator splitting methods
- Alternating direction method of multipliers
- Non-convex optimization
- Stochastic optimization
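To illustrate one of the operator splitting methods on the list, the following sketch (an assumption-laden illustration, not course code) applies Douglas--Rachford splitting to a two-set feasibility problem: finding a point in the intersection of a box and a hyperplane, using their closed-form projections.

```python
import numpy as np

def proj_box(z, lo=0.0, hi=1.0):
    # Projection onto the box [lo, hi]^n.
    return np.clip(z, lo, hi)

def proj_hyperplane(z, a, c):
    # Projection onto the hyperplane {x : a^T x = c}.
    return z - (a @ z - c) / (a @ a) * a

def douglas_rachford(z0, pA, pB, n_iter=200):
    # DR iteration: z+ = z + P_B(2 P_A(z) - z) - P_A(z).
    # The "shadow" sequence P_A(z) converges to a point in
    # A ∩ B when the intersection is nonempty.
    z = z0.copy()
    for _ in range(n_iter):
        x = pA(z)
        y = pB(2 * x - z)
        z = z + y - x
    return pA(z)

# Find a point in [0, 1]^2 intersected with {x : x_1 + x_2 = 1}.
a = np.array([1.0, 1.0])
x_star = douglas_rachford(
    np.array([3.0, -2.0]),
    lambda z: proj_box(z),
    lambda z: proj_hyperplane(z, a, 1.0),
)
```

The same two-operator template covers minimizing f + g once the projections are replaced by the proximal operators of f and g.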
Acknowledgement: some slides are based on the lecture slides of Prof. Stephen Boyd and Prof. Lieven Vandenberghe.
- S. Boyd and L. Vandenberghe. Convex optimization. Cambridge University Press, 2004.
- R. T. Rockafellar. Convex analysis. Princeton University Press, 1970.
- A. Beck. First-order methods in optimization. Vol. 25. SIAM, 2017.
- H. H. Bauschke and P. L. Combettes. Convex analysis and monotone operator theory in Hilbert spaces. Springer, 2011.
- B. Polyak. Introduction to optimization. Optimization Software, 1987.
- Y. Nesterov. Introductory lectures on convex optimization: A basic course. Vol. 87. Kluwer Academic Publishers, 2004.