Curricular information is subject to change
On completion of this module, students should be able to:
1. Formulate standard optimization techniques in continuous optimization, understand the convergence criteria, and implement these methods from scratch (a minimal sketch of one such method follows this list);
2. Implement the same methods using standard software packages, understand when these methods will work well and when they won’t;
3. Understand the first-order necessary conditions for optimality in constrained optimization, and be able to solve simple problems by hand;
4. Sketch the proof of the Karush-Kuhn-Tucker conditions;
5. Understand the need for global optimization, and implement a simulated-annealing algorithm.
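As a taste of outcome 1, here is a minimal steepest-descent sketch in Python, using an Armijo backtracking line search and a gradient-norm stopping rule as the convergence criterion. The test function, gradient, and parameter values are illustrative choices only, not module material.

```python
import numpy as np

def steepest_descent(f, grad, x0, tol=1e-6, max_iter=10_000):
    """Steepest descent with Armijo backtracking; stops when ||grad f|| < tol."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # first-order convergence criterion
            break
        d = -g                        # steepest-descent direction
        alpha, beta, c = 1.0, 0.5, 1e-4
        # Backtrack until the Armijo sufficient-decrease condition holds.
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= beta
        x = x + alpha * d
    return x

# Illustrative test problem: the Rosenbrock function, whose narrow curved
# valley makes steepest descent converge slowly compared with Newton-type methods.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
print(steepest_descent(f, grad, [-1.2, 1.0]))  # approaches the minimizer (1, 1)
```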
Topics covered:
• Steepest-descent and Newton-type methods, including analysis of convergence
• Trust-region methods, including the construction of solutions of the constrained sub-problem
• Non-linear least squares, including the Levenberg-Marquardt method
• Numerical implementations of standard optimization methods
• Constrained optimization with equality and inequality constraints, with examples motivating the introduction of the Lagrange multiplier technique
• Necessary first-order optimality conditions, including a derivation of the Karush-Kuhn-Tucker conditions
• Farkas’s lemma and the Separating Hyperplane Theorem
• Introduction to global optimization, including a discussion of simulated annealing (a bare-bones sketch follows this list)
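To illustrate the global-optimization topic, here is a bare-bones simulated-annealing loop in Python. The uniform proposal distribution, geometric cooling schedule, parameter values, and the Himmelblau test function are all illustrative assumptions, not the specific variant taught in the module.

```python
import math
import random

def simulated_annealing(f, x0, step=0.5, t0=1.0, cooling=0.995, n_iter=20_000):
    """Minimise f: random neighbour proposals, Metropolis acceptance,
    geometric cooling schedule (all illustrative choices)."""
    x = list(x0)
    fx = f(x)
    best_x, best_fx = list(x), fx
    t = t0
    for _ in range(n_iter):
        # Propose a uniform random perturbation of the current point.
        y = [xi + random.uniform(-step, step) for xi in x]
        fy = f(y)
        # Always accept improvements; accept uphill moves with probability
        # exp(-Δf / t), which lets the search escape local minima early on.
        if fy < fx or random.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy
            if fx < best_fx:
                best_x, best_fx = list(x), fx
        t *= cooling  # cool down, making uphill moves ever less likely
    return best_x, best_fx

# Illustrative multimodal test function (Himmelblau): it has four global
# minima, so a purely local method can stall depending on its starting point.
himmelblau = lambda v: (v[0]**2 + v[1] - 11)**2 + (v[0] + v[1]**2 - 7)**2
print(simulated_annealing(himmelblau, [0.0, 0.0]))
```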
Student Effort Type | Hours
---|---
Lectures | 36
Specified Learning Activities | 24
Autonomous Student Learning | 40
Total | 100
Not applicable to this module.
Description | Timing | Component Scale | % of Final Grade
---|---|---|---
Not yet recorded. | | |
Resit In | Terminal Exam
---|---
Summer | Yes - 1 Hour
• Feedback individually to students, post-assessment
• Self-assessment activities
Not yet recorded.