Learning Outcomes:
On completion of this module, students should be able to:
1. Formulate standard techniques of continuous optimization, understand their convergence criteria, and implement these methods from scratch (a minimal illustrative sketch follows this list);
2. Implement the same methods using standard software packages, and understand when these methods will work well and when they won’t;
3. Understand the first-order necessary conditions for optimality in constrained optimization, and solve simple problems by hand;
4. Understand the need for global optimization, and implement a simulated-annealing algorithm (see the sketch at the end of this section);
5. Apply optimization techniques, using Python, to problems in Machine Learning.
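
As a flavour of the from-scratch implementation referred to in outcome 1, the sketch below applies steepest descent with a fixed step size to a simple quadratic. The objective, step length, and tolerance here are illustrative assumptions, not part of the module specification.

    import numpy as np

    def steepest_descent(grad, x0, step=0.1, tol=1e-8, max_iter=10_000):
        """Minimise a smooth function by steepest descent with a fixed step.

        grad     : callable returning the gradient at a point
        x0       : starting point
        step     : fixed step length (illustrative; line-search strategies
                   are among the topics analysed in the module)
        tol      : stop once the gradient norm falls below this threshold
        max_iter : safeguard against non-convergence
        """
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            g = grad(x)
            if np.linalg.norm(g) < tol:   # first-order stationarity test
                break
            x = x - step * g              # step along the steepest-descent direction
        return x

    # Example: minimise f(x, y) = (x - 1)^2 + 10 * y^2
    grad_f = lambda x: np.array([2.0 * (x[0] - 1.0), 20.0 * x[1]])
    print(steepest_descent(grad_f, [0.0, 1.0], step=0.05))

In practice the fixed step would be replaced by a line search or a trust-region safeguard, both of which appear in the module content below.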
Indicative Module Content:
Topics covered: steepest-descent and Newton-type methods, including analysis of convergence; trust-region methods, including the construction of solutions of the constrained sub-problem; numerical implementations of standard optimization methods; necessary first-order optimality conditions; an introduction to global optimization, including a discussion of simulated annealing; and the application of optimization techniques through worked examples in Python. Examples may include: Linear Regression, Matrix Completion and Compressed Sensing, Support Vector Machines, and Neural Networks.
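
As an indication of the global-optimization material, the following is a minimal simulated-annealing sketch for a one-dimensional multimodal objective. The geometric cooling schedule, Gaussian proposal, and objective function are illustrative choices rather than the module's prescribed algorithm.

    import math
    import random

    def simulated_annealing(f, x0, temp=1.0, cooling=0.995, width=0.5, n_iter=10_000):
        """Minimise f by simulated annealing with geometric cooling.

        Uphill moves are accepted with probability exp(-delta / temp), which
        lets the search escape local minima early on; as temp decays the
        method behaves increasingly like a local descent.
        """
        x, fx = x0, f(x0)
        best, fbest = x, fx
        for _ in range(n_iter):
            cand = x + random.gauss(0.0, width)   # random neighbour proposal
            fc = f(cand)
            delta = fc - fx
            if delta < 0 or random.random() < math.exp(-delta / temp):
                x, fx = cand, fc                  # accept (always if downhill)
                if fx < fbest:
                    best, fbest = x, fx
            temp *= cooling                       # geometric cooling schedule
        return best, fbest

    # Example: a simple multimodal objective with several local minima
    f = lambda x: x**2 + 10.0 * math.sin(3.0 * x)
    print(simulated_annealing(f, x0=5.0))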