Fundamentals of optimization:
existence of minima and maxima, gradient, Hessian, convexity, convergence
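As a minimal illustration of these fundamentals (not part of the official course material), the sketch below checks the first- and second-order optimality conditions numerically for the convex function f(x) = (x - 2)^2 + 1, whose minimizer is x* = 2; the function, step sizes, and finite-difference scheme are illustrative choices.

```python
def f(x):
    # Simple convex test function with minimum at x = 2, f(2) = 1
    return (x - 2.0) ** 2 + 1.0

def gradient(f, x, h=1e-6):
    """Central finite-difference approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

def hessian(f, x, h=1e-4):
    """Central finite-difference approximation of f''(x)."""
    return (f(x + h) - 2.0 * f(x) + f(x - h)) / h ** 2

x_star = 2.0
g = gradient(f, x_star)   # ~0: first-order necessary condition holds
H = hessian(f, x_star)    # ~2 > 0: second-order sufficient condition (convexity)
```

A vanishing gradient together with a positive (definite) Hessian at x* certifies a strict local minimum; for this convex f it is also the global one.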
Unconstrained static optimization:
optimality conditions, computer-aided optimization, line search methods, choice of the step length, principle of nested intervals, Armijo condition, Wolfe conditions, gradient method, Newton method, conjugate gradient method, quasi-Newton methods, Gauss-Newton method, trust region methods, Nelder-Mead method
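The gradient method with a backtracking line search enforcing the Armijo condition can be sketched as below; this is an illustrative example on a convex quadratic, not code from the course, and the parameters (c = 1e-4, backtracking factor 0.5) are common textbook choices.

```python
def f(x):
    # Convex quadratic test function, minimizer at the origin
    return x[0] ** 2 + 5.0 * x[1] ** 2

def grad(x):
    return [2.0 * x[0], 10.0 * x[1]]

def armijo_step(x, d, g, t=1.0, c=1e-4, beta=0.5):
    """Shrink t until f(x + t d) <= f(x) + c t <g, d> (Armijo condition)."""
    fx = f(x)
    slope = g[0] * d[0] + g[1] * d[1]   # directional derivative, negative for descent
    while f([x[0] + t * d[0], x[1] + t * d[1]]) > fx + c * t * slope:
        t *= beta
    return t

def gradient_method(x, tol=1e-8, max_iter=500):
    for _ in range(max_iter):
        g = grad(x)
        if g[0] ** 2 + g[1] ** 2 < tol ** 2:   # stop when the gradient vanishes
            break
        d = [-g[0], -g[1]]                     # steepest-descent direction
        t = armijo_step(x, d, g)
        x = [x[0] + t * d[0], x[1] + t * d[1]]
    return x

x_min = gradient_method([3.0, 1.0])   # converges to the minimizer [0, 0]
```

Newton or quasi-Newton methods replace the steepest-descent direction d with a (approximate) Hessian-scaled direction, typically yielding far fewer iterations on ill-conditioned problems.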
Static optimization with constraints:
equality and inequality constraints, sensitivity considerations, active set method, gradient projection method, reduced gradient method, penalty and barrier functions, sequential quadratic programming (SQP), local SQP, globalization of SQP
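To illustrate the penalty-function idea (a sketch under simplifying assumptions, not the course's implementation), the example below solves the one-dimensional problem "minimize x^2 subject to x >= 1", whose solution is x* = 1, with a quadratic penalty and an increasing penalty parameter; the schedule mu = 1, 10, ..., 10^4 is an arbitrary illustrative choice.

```python
def penalty_objective(x, mu):
    """f(x) = x^2 plus quadratic penalty mu * max(0, 1 - x)^2 on the violation of x >= 1."""
    violation = max(0.0, 1.0 - x)
    return x ** 2 + mu * violation ** 2

def minimize_1d(obj, lo=0.0, hi=2.0, iters=100):
    """Ternary search on a unimodal function (interval nesting)."""
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3.0
        m2 = hi - (hi - lo) / 3.0
        if obj(m1) < obj(m2):
            hi = m2
        else:
            lo = m1
    return 0.5 * (lo + hi)

x = 0.0
mu = 1.0
for _ in range(5):   # increase the penalty: mu = 1, 10, 100, 1000, 10000
    x = minimize_1d(lambda z: penalty_objective(z, mu))
    mu *= 10.0
# the unconstrained minimizer mu / (1 + mu) approaches the constrained solution x* = 1
```

Barrier functions take the complementary approach: they keep iterates strictly feasible and blow up at the constraint boundary, whereas penalty methods approach the solution from the infeasible side.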
Dynamic optimization:
basics of the calculus of variations, optimality conditions, Euler-Lagrange equations, Weierstrass-Erdmann conditions, design of optimal control solutions, Pontryagin's minimum principle, energy-optimal, resource-optimal, and time-optimal control, bang-bang control, singular arcs
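A classical example of bang-bang control is the time-optimal double integrator (state x, velocity v, dynamics x'' = u with |u| <= 1, target at the origin), where Pontryagin's minimum principle yields a switching-curve feedback u = -sign(x + v|v|/2). The simulation below is an illustrative sketch (forward Euler, parameter values chosen freely), not material prescribed by the course.

```python
import math

def bang_bang_u(x, v):
    """Time-optimal feedback for the double integrator via the switching curve."""
    sigma = x + 0.5 * v * abs(v)       # switching function
    if sigma > 0.0:
        return -1.0                    # full braking
    if sigma < 0.0:
        return 1.0                     # full acceleration
    return -math.copysign(1.0, v) if v != 0.0 else 0.0

def simulate(x, v, dt=1e-4, t_max=5.0):
    """Forward-Euler simulation until the state is near the origin."""
    t = 0.0
    while t < t_max and (abs(x) > 1e-3 or abs(v) > 1e-3):
        u = bang_bang_u(x, v)
        x += dt * v
        v += dt * u
        t += dt
    return x, v, t

x_f, v_f, t_f = simulate(1.0, 0.0)   # optimal transfer time from (1, 0) is 2 seconds
```

The control takes only the extreme values -1 and +1 with a single switch, which is characteristic of bang-bang solutions; singular arcs arise when the switching function vanishes over an interval and the minimum principle alone does not determine u.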
The performance is evaluated in an oral exam, which can take place at any time Monday to Friday between 6:00 and 20:00. To arrange an examination time, send an e-mail with your desired dates, times, or time slots, your name, student ID number, and study code to steinboeck@acin.tuwien.ac.at.