Learning Outcomes:
On completion of this module, students should be able to:
1. Formulate standard techniques in continuous optimization, understand their convergence criteria, and implement these methods from scratch (a minimal sketch follows this list);
2. Implement the same methods using standard software packages, and understand when these methods will work well and when they will not;
3. Understand the first-order necessary conditions for optimality in constrained optimization, and be able to solve simple problems by hand;
4. Prove the Karush-Kuhn-Tucker conditions;
5. Formulate the Dual Problem in constrained optimization;
6. Understand the need for global optimization, and implement a simulated-annealing algorithm.
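To make outcomes 1 and 2 concrete, the sketch below shows one way a "from scratch" implementation might look: steepest descent with Armijo backtracking line search, applied to the Rosenbrock test function. The choice of test function, the Armijo constants, and the stopping tolerance are illustrative assumptions, not part of the module specification.

```python
import numpy as np

def rosenbrock(x):
    """Classic test function f(x, y) = (1 - x)^2 + 100 (y - x^2)^2, minimiser (1, 1)."""
    return (1.0 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2

def rosenbrock_grad(x):
    """Analytic gradient of the Rosenbrock function."""
    return np.array([
        -2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0]**2),
        200.0 * (x[1] - x[0]**2),
    ])

def steepest_descent(f, grad, x0, tol=1e-6, max_iter=50_000):
    """Steepest descent with Armijo backtracking line search.

    Stops when the gradient norm falls below `tol`, the first-order
    convergence criterion referred to in the module content.
    """
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            return x, k
        # Backtracking: halve the step until the Armijo sufficient-decrease
        # condition f(x + t d) <= f(x) + 1e-4 * t * (grad . d) holds, d = -g.
        t, fx, gg = 1.0, f(x), g @ g
        while f(x - t * g) > fx - 1e-4 * t * gg:
            t *= 0.5
        x = x - t * g
    return x, max_iter

x_star, iters = steepest_descent(rosenbrock, rosenbrock_grad, [-1.2, 1.0])
print(f"x = {x_star}, iterations = {iters}, "
      f"|grad| = {np.linalg.norm(rosenbrock_grad(x_star)):.2e}")
```

Printing the final gradient norm, rather than asserting success, keeps the output honest whether or not the tolerance is met within the iteration budget.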
Indicative Module Content:
Topics covered: Steepest-Descent and Newton-type methods, including analysis of convergence; Trust-region methods, including the construction of solutions of the constrained sub-problem; numerical implementations of standard optimization methods; Constrained Optimization with equality and inequality constraints, with examples motivating the introduction of the Lagrange Multiplier Technique; necessary first-order optimality conditions, including a derivation of the Karush-Kuhn-Tucker conditions; Farkas's Lemma and the Separating Hyperplane Theorem; formulation of the Dual Problem in Constrained Optimization; and an introduction to Global Optimization, including a discussion of Simulated Annealing.
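As a companion to the final topic, here is a minimal simulated-annealing sketch for a continuous objective. The Himmelblau test function, the Gaussian random-walk proposal, the geometric cooling schedule, and all numerical parameters are illustrative choices, not the module's prescribed algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def himmelblau(x):
    # Multimodal test function with four global minima (illustrative choice)
    return (x[0]**2 + x[1] - 11.0)**2 + (x[0] + x[1]**2 - 7.0)**2

def simulated_annealing(f, x0, T0=10.0, cooling=0.999, n_steps=20_000, scale=0.5):
    """Minimise f by simulated annealing with a Gaussian random-walk proposal.

    A worse point is accepted with probability exp(-delta / T) (the
    Metropolis criterion); the temperature T decays geometrically.
    """
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    best_x, best_f = x.copy(), fx
    T = T0
    for _ in range(n_steps):
        candidate = x + rng.normal(scale=scale, size=x.shape)
        fc = f(candidate)
        # Always accept downhill moves; accept uphill moves with
        # Metropolis probability, which shrinks as T cools.
        if fc < fx or rng.random() < np.exp(-(fc - fx) / T):
            x, fx = candidate, fc
            if fx < best_f:
                best_x, best_f = x.copy(), fx
        T *= cooling  # geometric cooling schedule
    return best_x, best_f

x_best, f_best = simulated_annealing(himmelblau, [0.0, 0.0])
print(f"best point = {x_best}, f = {f_best:.4g}")
```

Tracking the best point seen, rather than only the current state, is a common safeguard: the random walk may drift away from a good region after visiting it.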