Advanced Topics in Machine Learning (COMP4680/COMP8650)
Undergraduate/Postgraduate level, Australian National University, 2022
Textbooks and papers:
- Boyd and Vandenberghe, “Convex Optimization”, Cambridge Press, 2004.
- Goodfellow, Bengio and Courville, “Deep Learning”, MIT Press, 2016.
- Zhang, Lipton, Li and Smola, “Dive into Deep Learning”, 2021.
- Gould, Hartley and Campbell, “Deep Declarative Networks”, TPAMI 2021.
Overview
This course covers topics in convex optimisation, deep learning, and differentiable optimisation.
Learning Outcomes:
- State and distinguish key concepts in convex analysis, including convexity of sets and functions, subgradients, and the convex dual.
- Derive basic results about convex functions such as Jensen’s inequality.
- Explain how Bregman divergences are constructed from convex functions and derive some of their properties.
- Produce a formal optimization problem from a high-level description and determine whether the problem is convex.
- Recognize standard convex optimization problems such as linear programs and quadratic programs.
- Derive the standard (dual) quadratic program for support vector machines and understand the extension to max-margin methods for structured prediction.
- Implement and analyse gradient descent algorithms such as stochastic gradient descent and mirror descent.
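As a flavour of the last outcome, here is a minimal sketch (not course-provided code; the function name and data are illustrative) of stochastic gradient descent applied to a least-squares objective:

```python
import numpy as np

def sgd_least_squares(X, y, lr=0.01, epochs=100, seed=0):
    """Plain SGD on f(w) = (1/2n) ||Xw - y||^2: one randomly
    ordered pass over the samples per epoch."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):
            # Gradient of the single-sample loss (1/2)(x_i . w - y_i)^2
            grad = (X[i] @ w - y[i]) * X[i]
            w -= lr * grad
    return w

# Usage: recover known weights from noiseless synthetic data.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true
w_hat = sgd_least_squares(X, y)
```

Mirror descent generalises this update by taking the gradient step in a dual space induced by a convex function (a connection the course develops via Bregman divergences).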
Schedule:
| Week | Topics |
|---|---|
| 1 | Overview and Background |
| 2 | Convex Sets |
| 3 | Convex Functions |
| 4 | Convex Optimisation Problems |
| 5 | Duality |
| 6 | Applications (ML Focused) |
| 7 | Unconstrained Minimisation |
| 8 | Constrained Minimisation |
| 9 | Interior-point Methods |
| 10 | Deep Learning |
| 11 | Differentiable Optimisation |
| 12 | Review |