ORF523: Advanced Optimization

This course is a mathematical introduction to {convex, large-scale, stochastic}-optimization. Topics covered include the ellipsoid method, the analysis of various first-order methods such as subgradient descent, Nesterov's accelerated gradient descent, FISTA, and mirror descent, as well as a discussion of complexity lower bounds à la Nemirovski. These methods will be compared to the conic programming approach, and applications to high-dimensional statistics and machine learning will also be discussed.
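To give a flavor of the first-order methods listed above, here is a minimal sketch (not part of the course material) of the projected-free subgradient method applied to a simple nonsmooth objective; the step-size schedule, iteration count, and the l1 test problem are illustrative assumptions only.

```python
import numpy as np

def subgradient_method(f, subgrad, x0, step=lambda t: 1.0 / np.sqrt(t + 1), n_iter=1000):
    """Subgradient method: x_{t+1} = x_t - eta_t * g_t with g_t a subgradient of f at x_t."""
    x = np.asarray(x0, dtype=float)
    best_x, best_val = x.copy(), f(x)
    for t in range(n_iter):
        g = subgrad(x)
        x = x - step(t) * g
        val = f(x)
        if val < best_val:   # keep the best iterate: f(x_t) need not decrease monotonically
            best_x, best_val = x.copy(), val
    return best_x, best_val

# Illustrative example: minimize the nonsmooth function f(x) = ||x - c||_1
c = np.array([1.0, -2.0, 3.0])
f = lambda x: np.abs(x - c).sum()
subgrad = lambda x: np.sign(x - c)   # a valid subgradient of the l1 distance
x_star, val = subgradient_method(f, subgrad, x0=np.zeros(3))
print(x_star, val)
```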

Lectures

Spring 2013, Tuesday and Thursday, 11:00am-12:20pm, Room TBD.

Office hours on Friday, 2:00-4:00pm, Sherrerd 225.

Recommended textbooks

The course will be based primarily on the following two references:

I will also take some examples from the following book:

Lecture notes

The lecture notes will be posted on my blog. You are encouraged to post your questions (anonymously or not) directly on the blog so that other students can try to answer them.