Course dates: Michaelmas Term (MT), week beginning Monday 22 October 2018, for Year 1 students
Pawan Mudigonda

  • Motivating examples (binary classification using SVMs and logistic regression)
  • Optimal solutions and stationary points
  • Convex functions and convex sets
  • Continuous optimization
  • Projected (sub)gradient descent
  • Mirror descent
  • Stochastic optimization
  • Duality
  • Conditional gradient (a.k.a. Frank-Wolfe)

  • Newton's method*
  • Interior point algorithms*
  • Discrete optimization
  • Motivating examples (structured prediction)
  • Convex relaxations


* These topics will not be covered in detail.
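To connect the motivating example (binary classification with SVMs) to the projected (sub)gradient descent topic above, here is a minimal sketch: minimising the average hinge loss over a Euclidean ball by projected subgradient descent. This is an illustrative sketch, not course material; the function name, the ball constraint, the 1/sqrt(t) step size, and all parameter choices are assumptions.

```python
import numpy as np

def projected_subgradient_svm(X, y, radius=1.0, steps=500):
    """Illustrative sketch: minimise the average hinge loss
    (1/n) * sum_i max(0, 1 - y_i <w, x_i>), with labels y_i in {-1, +1},
    over the Euclidean ball ||w|| <= radius, by projected subgradient descent."""
    n, d = X.shape
    w = np.zeros(d)
    for t in range(1, steps + 1):
        margins = y * (X @ w)
        # A subgradient of the average hinge loss: -y_i x_i for every point
        # whose margin is below 1; points with margin >= 1 contribute zero.
        active = margins < 1
        g = -(y[active, None] * X[active]).sum(axis=0) / n
        # Diminishing step size 1/sqrt(t) (an assumed, standard choice).
        w -= g / np.sqrt(t)
        # Projection step: map w back onto the ball ||w|| <= radius.
        norm = np.linalg.norm(w)
        if norm > radius:
            w *= radius / norm
    return w

# Usage on toy separable data: labels given by the sign of x1 + x2.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = np.sign(X[:, 0] + X[:, 1])
w = projected_subgradient_svm(X, y)
```

The projection onto the Euclidean ball is a simple rescaling, which is what makes projected (sub)gradient descent cheap for this constraint set.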


Other Sources
  • Convex Optimization: Algorithms and Complexity by Sebastien Bubeck (https://arxiv.org/abs/1405.4980)
  • Convex Optimization by Stephen Boyd and Lieven Vandenberghe (http://stanford.edu/~boyd/cvxbook)
  • Smoothing and First Order Methods: A Unified Framework by Amir Beck and Marc Teboulle (https://epubs.siam.org/doi/abs/10.1137/100818327)
  • Proximal Algorithms by Neal Parikh and Stephen Boyd (http://web.stanford.edu/~boyd/papers/prox_algs.html)
  • Numerical Optimization by Jorge Nocedal and Stephen J. Wright
  • The Design of Approximation Algorithms by David P. Williamson and David B. Shmoys (http://www.designofapproxalgs.com)

Three 3-hour sessions (Tue, Thu, Fri)

  • Unconstrained continuous optimization for regression and classification
  • Constrained continuous optimization for regression and classification
  • Convex relaxations for dense CRFs (conditional random fields)
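For the first session topic, unconstrained continuous optimization for classification, a minimal sketch is plain gradient descent on the average logistic loss (the logistic regression motivating example). Again this is an assumed illustration, not course material; the function name, step size, and iteration count are arbitrary choices.

```python
import numpy as np

def logistic_regression_gd(X, y, lr=0.5, steps=500):
    """Illustrative sketch: fit w by plain gradient descent on the average
    logistic loss (1/n) * sum_i log(1 + exp(-y_i <w, x_i>)),
    with labels y_i in {-1, +1}. The loss is smooth and convex, so no
    projection step is needed: this is unconstrained optimization."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(steps):
        margins = y * (X @ w)
        # Gradient of the average logistic loss:
        # (1/n) * sum_i -y_i x_i * sigmoid(-margin_i).
        coeff = -y / (1.0 + np.exp(margins))
        w -= lr * (X.T @ coeff) / n
    return w

# Usage on toy separable data with a known linear separator.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = np.sign(X @ np.array([1.0, -2.0, 0.5]))
w = logistic_regression_gd(X, y)
```

Compared with the hinge loss, the logistic loss is differentiable everywhere, so ordinary gradient descent (rather than subgradient descent) applies, with a constant step size.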