Introductory course on non-smooth optimization

Description

The course mainly covers non-smooth optimization and first-order proximal splitting methods. Topics include gradient-based methods (the (sub)gradient method, the proximal gradient method, and accelerated gradient methods), operator splitting methods (the augmented Lagrangian method, the alternating direction method of multipliers, and monotone operators with their associated splitting schemes), and possibly interior-point algorithms. Non-convex optimization and stochastic optimization will also be introduced.
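As a small taste of the proximal splitting methods covered in the course, the sketch below applies the proximal gradient method to a lasso problem. This is an illustrative example, not taken from the course materials; the problem instance, step size, and iteration count are choices made here for demonstration.

```python
import numpy as np

# Proximal gradient sketch for the lasso:
#   minimize  0.5 * ||A x - b||^2 + lam * ||x||_1
# The smooth part is handled by a gradient step, the non-smooth
# l1 term by its proximal operator (soft-thresholding).

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, num_iters=500):
    # Fixed step 1/L, with L = ||A||_2^2 the Lipschitz constant
    # of the gradient of the smooth part.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(num_iters):
        grad = A.T @ (A @ x - b)               # gradient of 0.5*||Ax - b||^2
        x = soft_threshold(x - grad / L, lam / L)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
x_true = np.zeros(10)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true
x_hat = proximal_gradient(A, b, lam=0.1)
```

With a small regularization weight, the iterates approach the sparse vector used to generate the data; larger weights shrink more coefficients exactly to zero.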

Lecture slides

  1. Introduction
  2. Gradient method
  3. Proximal gradient method
  4. Krasnosel'skii–Mann iteration
  5. Backward–backward splitting
  6. Douglas–Rachford splitting
  7. Primal–dual splitting
  8. Other operator splitting methods
  9. Alternating direction method of multipliers
  10. Non-convex optimization
  11. Stochastic optimization
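To illustrate the operator splitting schemes in the list above, the sketch below runs the Douglas–Rachford iteration on a convex feasibility problem, where both proximal operators reduce to projections. The two sets (a unit ball and a halfspace) are a hypothetical example chosen here, not one from the slides.

```python
import numpy as np

# Douglas-Rachford splitting for  minimize  i_C(x) + i_D(x),
# where i_C, i_D are indicator functions of convex sets, so
# prox_{i_C} and prox_{i_D} are Euclidean projections.
# C: unit ball, D: halfspace {x : a.x >= 0.5}.

def project_ball(x):                       # prox of the indicator of C
    n = np.linalg.norm(x)
    return x / n if n > 1.0 else x

a = np.array([3.0, 4.0]) / 5.0             # unit normal of the halfspace

def project_halfspace(x):                  # prox of the indicator of D
    s = a @ x - 0.5
    return x if s >= 0.0 else x - s * a    # valid since a has unit norm

z = np.array([5.0, -5.0])                  # arbitrary starting point
for _ in range(200):
    x = project_ball(z)
    y = project_halfspace(2.0 * x - z)
    z = z + y - x                          # Douglas-Rachford update
x = project_ball(z)                        # lies in C once z has converged
```

At a fixed point of the update, the shadow sequence `x` belongs to both sets, which is the standard way Douglas–Rachford solves two-set feasibility problems.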

Acknowledgement: Some slides are based on lecture slides by Prof. Stephen Boyd and Prof. Lieven Vandenberghe.

Projects


References

  • S. Boyd and L. Vandenberghe. Convex optimization. Cambridge University Press, 2004.
  • R. T. Rockafellar. Convex analysis. Princeton University Press, 2015.
  • A. Beck. First-order methods in optimization. Vol. 25. SIAM, 2017.
  • H. H. Bauschke and P. L. Combettes. Convex analysis and monotone operator theory in Hilbert spaces. Vol. 408. New York: Springer, 2011.
  • B. Polyak. Introduction to optimization. Optimization Software, 1987.
  • Y. Nesterov. Introductory lectures on convex optimization: A basic course. Vol. 87. Springer Science & Business Media, 2013.