Advanced Topics in Machine Learning (COMP4680/COMP8650)

Undergraduate/Postgraduate level, Australian National University, 2022

Textbooks and papers:

  • Boyd and Vandenberghe, “Convex Optimization”, Cambridge Press, 2004.
  • Goodfellow, Bengio and Courville, “Deep Learning”, MIT Press, 2016.
  • Zhang, Lipton, Li and Smola, “Dive into Deep Learning”, 2021.
  • Gould, Hartley and Campbell, “Deep Declarative Networks”, TPAMI 2021.


This course focuses on topics in convex optimisation, deep learning, and differentiable optimisation.

Learning Outcomes:

  1. Distinguish definitions of key concepts in convex analysis, including convexity of sets and functions, subgradients, and the convex dual.
  2. Derive basic results about convex functions such as Jensen’s inequality.
  3. Deduce how Bregman divergences are constructed from convex functions and derive some of their properties.
  4. Produce a formal optimization problem from a high-level description and determine whether the problem is convex.
  5. Recognize standard convex optimization problems such as linear programs and quadratic programs.
  6. Derive the standard (dual) quadratic program for support vector machines and understand the extension to max-margin methods for structured prediction.
  7. Implement and analyse gradient descent algorithms such as stochastic gradient descent and mirror descent.
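For the algorithmic outcomes above (outcome 7 in particular), a minimal sketch of stochastic gradient descent on a least-squares objective may help make the goal concrete. This is an illustrative example, not course material: the problem data, step size, and iteration count are all arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic noiseless least-squares problem:
#   minimise f(x) = 0.5 * sum_i (a_i^T x - b_i)^2
A = rng.normal(size=(100, 5))
x_true = rng.normal(size=5)
b = A @ x_true

x = np.zeros(5)   # initial iterate
eta = 0.01        # fixed step size (illustrative choice)

for t in range(2000):
    i = rng.integers(100)            # sample one data point uniformly
    grad = (A[i] @ x - b[i]) * A[i]  # gradient of 0.5 * (a_i^T x - b_i)^2
    x -= eta * grad                  # stochastic gradient step

print(np.linalg.norm(x - x_true))    # distance to the minimiser shrinks toward zero
```

Replacing the Euclidean update with a mirror-descent update (a gradient step in a dual space induced by a Bregman divergence) is one of the extensions analysed in the course.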


Course Outline:

  1. Overview and Background
  2. Convex Sets
  3. Convex Functions
  4. Convex Optimisation Problems
  6. Applications (ML Focused)
  7. Unconstrained Minimisation
  8. Constrained Minimisation
  9. Interior-point Methods
  10. Deep Learning
  11. Differentiable Optimisation