Advanced Topics in Machine Learning (COMP4680/COMP8650)
Undergraduate/Postgraduate level, Australian National University, 2022
Textbooks and papers:
- Boyd and Vandenberghe, “Convex Optimization”, Cambridge University Press, 2004.
- Goodfellow, Bengio and Courville, “Deep Learning”, MIT Press, 2016.
- Zhang, Lipton, Li and Smola, “Dive into Deep Learning”, 2021.
- Gould, Hartley and Campbell, “Deep Declarative Networks”, TPAMI 2021.
This course focuses on topics in convex optimisation, deep learning, and differentiable optimisation.
Learning outcomes:
- Distinguish definitions of key concepts in convex analysis, including convexity of sets and functions, subgradients, and the convex dual.
- Derive basic results about convex functions such as Jensen’s inequality.
- Deduce how Bregman divergences are constructed from convex functions and derive some of their properties.
- Produce a formal optimisation problem from a high-level description and determine whether the problem is convex.
- Recognise standard convex optimisation problems such as linear programs and quadratic programs.
- Derive the standard (dual) quadratic program for support vector machines and understand the extension to max-margin methods for structured prediction.
- Implement and analyse gradient descent algorithms such as stochastic gradient descent and mirror descent.
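For reference, the finite form of Jensen’s inequality mentioned in the outcomes above: for a convex function f and weights λ_i,

```latex
f\!\left(\sum_{i=1}^{n} \lambda_i x_i\right)
  \le \sum_{i=1}^{n} \lambda_i f(x_i),
\qquad \lambda_i \ge 0,\; \sum_{i=1}^{n} \lambda_i = 1,
```

or, in probabilistic form, f(E[X]) ≤ E[f(X)] for convex f.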
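The standard (dual) quadratic program for the soft-margin support vector machine referenced above, for training pairs (x_i, y_i) with y_i ∈ {−1, +1} and regularisation parameter C, takes the usual form

```latex
\max_{\alpha \in \mathbb{R}^{n}} \;
  \sum_{i=1}^{n} \alpha_i
  - \frac{1}{2} \sum_{i=1}^{n} \sum_{j=1}^{n}
      \alpha_i \alpha_j \, y_i y_j \, x_i^{\top} x_j
\quad \text{s.t.} \quad
  0 \le \alpha_i \le C, \qquad \sum_{i=1}^{n} \alpha_i y_i = 0 .
```

Replacing the inner products x_i^⊤ x_j with a kernel k(x_i, x_j) gives the kernelised variant, and generalising the constraints over structured outputs leads to the max-margin structured-prediction setting mentioned in the outcome.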
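The Bregman divergence construction referenced above can be sketched in a few lines. Below is a minimal illustration (the helper names `bregman`, `phi`, and `grad_phi` are this sketch’s own, not from the course materials): D_φ(x, y) = φ(x) − φ(y) − ⟨∇φ(y), x − y⟩, with two classical special cases.

```python
import numpy as np

def bregman(phi, grad_phi, x, y):
    """Bregman divergence D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>,
    for a convex generator `phi` with gradient `grad_phi`."""
    return phi(x) - phi(y) - np.dot(grad_phi(y), x - y)

# phi(x) = 0.5 ||x||^2 recovers half the squared Euclidean distance.
sq = lambda v: 0.5 * np.dot(v, v)
x, y = np.array([1.0, 2.0]), np.array([0.0, 1.0])
d = bregman(sq, lambda v: v, x, y)  # = 0.5 * ||x - y||^2

# phi(p) = sum_i p_i log p_i (negative entropy) recovers the KL divergence
# between probability vectors.
negent = lambda p: np.sum(p * np.log(p))
grad_negent = lambda p: np.log(p) + 1.0
p, q = np.array([0.3, 0.7]), np.array([0.5, 0.5])
kl = bregman(negent, grad_negent, p, q)  # = sum_i p_i log(p_i / q_i)
```

Both identities follow directly from the definition; for the KL case the constant-1 term in the gradient cancels because p and q each sum to one.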
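The mirror descent outcome above can be illustrated with the classic negative-entropy mirror map on the probability simplex (exponentiated gradient). This is a minimal sketch under that assumption, not code from the course; the names `mirror_descent_simplex`, `grad`, `steps`, and `eta` are this sketch’s own. Supplying a stochastic gradient estimate in place of `grad` turns the same loop into stochastic mirror descent.

```python
import numpy as np

def mirror_descent_simplex(grad, w0, steps, eta):
    """Mirror descent on the probability simplex with the negative-entropy
    mirror map: a multiplicative update followed by renormalisation."""
    w = w0.copy()
    for _ in range(steps):
        w = w * np.exp(-eta * grad(w))  # mirror (exponentiated-gradient) step
        w /= w.sum()                    # project back onto the simplex
    return w

# Toy objective f(w) = <c, w>: the minimiser over the simplex puts all
# mass on the coordinate with the smallest cost.
c = np.array([0.9, 0.1, 0.5])
w = mirror_descent_simplex(lambda w: c, np.ones(3) / 3, steps=200, eta=0.5)
```

For this linear objective the iterates concentrate on argmin c (index 1 here), which makes the convergence behaviour easy to inspect when analysing the algorithm.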
Outline:
- 1: Overview and Background
- 4: Convex Optimisation Problems
- 6: Applications (ML Focused)