3 Summary optimality conditions
Conditions for a local minimum of the unconstrained problem:
- First-order necessary condition: ∇f(x*) = 0
- Second-order sufficient condition: ∇f(x*) = 0 and H(x*) positive definite
For convex f on a convex feasible domain, condition for a global minimum:
- Sufficient condition: ∇f(x*) = 0
4 Stationary point nature summary
Definiteness of H        Nature of x*
Positive definite        Minimum
Positive semi-definite   Valley
Indefinite               Saddle point
Negative semi-definite   Ridge
Negative definite        Maximum
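As an illustration, here is a minimal sketch (assuming NumPy; the helper name classify_stationary_point is hypothetical) that maps the eigenvalues of the Hessian at a stationary point to the categories in the table above:

```python
import numpy as np

def classify_stationary_point(H, tol=1e-10):
    """Classify a stationary point x* from the eigenvalues of the symmetric Hessian H."""
    lam = np.linalg.eigvalsh(H)      # real eigenvalues (H is symmetric)
    if np.all(lam > tol):
        return "minimum"             # positive definite
    if np.all(lam < -tol):
        return "maximum"             # negative definite
    if np.all(lam >= -tol):
        return "valley"              # positive semi-definite
    if np.all(lam <= tol):
        return "ridge"               # negative semi-definite
    return "saddle point"            # indefinite
```

For example, classify_stationary_point(np.array([[2.0, 0.0], [0.0, -1.0]])) returns "saddle point".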
5 Complex eigenvalues?
Question: what is the nature of a stationary point when H has complex eigenvalues?
Answer: this situation never occurs, because H is symmetric by definition, and symmetric matrices have only real eigenvalues (spectral theorem).
6 Nature of stationary points
The nature of the initial position depends on the load (buckling example).
[Figure: rigid bar of length l supported by springs k1 and k2, loaded by force F]
8 Unconstrained optimization algorithms
Single-variable methods:
- 0th order (involving only f)
- 1st order (involving f and f')
- 2nd order (involving f, f' and f'')
Multiple-variable methods
9 Why optimization algorithms?
The optimality conditions often cannot be used directly:
- The function is not explicitly known (e.g. computed by simulation)
- The conditions cannot be solved analytically
Example: a function whose stationarity condition f'(x) = 0 has no closed-form solution, so the stationary points cannot be computed analytically.
10 0th order methods: pro/con
Weaknesses:
- (Usually) less efficient than higher-order methods (many function evaluations)
Strengths:
- No derivatives needed
- Also work for discontinuous / non-differentiable functions
- Easy to program
- Robust
11 Minimization with one variable
Why?
- Simplest case: a good starting point
- Used within multi-variable methods during the line search
Setting: an iterative process in which the optimizer proposes a value of x and the model returns f(x).
[Diagram: optimizer and model loop, exchanging x and f]
12 Termination criteria
Stop the optimization iterations when:
- The solution is sufficiently accurate (check the optimality criteria)
- Progress becomes too slow
- The maximum resources have been spent
- The solution diverges
- Cycling occurs
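A minimal sketch of how the first three criteria could be combined in code; the function name should_stop and the tolerance values are illustrative assumptions, not part of the slides:

```python
def should_stop(x_prev, x, f_prev, f, n_evals, max_evals,
                x_tol=1e-8, f_tol=1e-12):
    """Illustrative termination test for a single-variable minimization loop."""
    if abs(x - x_prev) < x_tol:      # progress in x has become too slow
        return True
    if abs(f_prev - f) < f_tol:      # progress in f has become too slow
        return True
    if n_evals >= max_evals:         # maximum resources have been spent
        return True
    return False
```

Divergence and cycling checks would additionally track the history of iterates.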
14 Basic strategy of 0th order methods for the single-variable case
1. Find an interval [a0, b0] that contains the minimum (bracketing)
2. Iteratively reduce the size of the interval [ak, bk] (sectioning)
3. Approximate the minimum by the minimum of a simple interpolation function over the final interval [aN, bN]
Sectioning methods:
- Dichotomous search
- Fibonacci method
- Golden section method
15 Bracketing the minimum
Starting from x1, take steps of growing size until f increases again:
x2 = x1 + Δ,  x3 = x2 + γΔ,  x4 = x3 + γ²Δ,  …
The starting point x1, step size Δ and expansion parameter γ are user-defined. Once the function value increases, the last three points bracket a minimum in [a0, b0], as sketched below.
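A minimal sketch of this bracketing procedure, assuming a user-supplied f; the parameters step and gamma mirror Δ and γ above:

```python
def bracket_minimum(f, x1, step=0.1, gamma=2.0, max_steps=50):
    """Expand steps by a factor gamma until f increases; returns a bracket (a0, b0)."""
    x2 = x1 + step
    f1, f2 = f(x1), f(x2)
    if f2 > f1:                            # going uphill: search in the other direction
        x1, x2, f1, f2 = x2, x1, f2, f1
        step = -step
    for _ in range(max_steps):
        step *= gamma                      # step sequence: delta, gamma*delta, gamma^2*delta, ...
        x3 = x2 + step
        f3 = f(x3)
        if f3 > f2:                        # f increased again: minimum is bracketed
            return (min(x1, x3), max(x1, x3))
        x1, x2, f1, f2 = x2, x3, f2, f3
    raise RuntimeError("no bracket found within max_steps")
```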
16 Unimodality
Bracketing and sectioning methods work best for unimodal functions: "A unimodal function consists of exactly one monotonically decreasing part and one monotonically increasing part."
17 Dichotomous search
di·chot·o·mous (adjective): dividing into two parts
Conceptually simple idea: try to split the interval in half in each step. Two evaluations are placed a small distance δ apart around the midpoint of the interval [a0, b0] of length L0, with δ << L0.
18 Dichotomous search (2)
Interval size after 1 step (2 evaluations): L1 = L0/2 + δ/2
Interval size after m steps (2m evaluations): Lm = L0/2^m + δ(1 − 1/2^m)
Proper choice for δ: as small as possible, but large enough that f(x1) and f(x2) can still be distinguished (limited by round-off and function accuracy). A sketch of the method follows below.
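A minimal sketch, assuming a unimodal f on [a, b]; note that tol must be larger than delta for the loop to terminate:

```python
def dichotomous_search(f, a, b, delta=1e-6, tol=1e-4):
    """Dichotomous search: two evaluations delta apart around the interval midpoint."""
    while b - a > tol:
        mid = (a + b) / 2
        x1, x2 = mid - delta / 2, mid + delta / 2
        if f(x1) < f(x2):
            b = x2                  # minimum lies in [a, x2]
        else:
            a = x1                  # minimum lies in [x1, b]
    return (a + b) / 2
```

Each pass costs 2 evaluations and shrinks the interval from L to L/2 + δ/2, matching the formula above.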
19 Dichotomous search (3)
Example: m = 10.
[Plot: interval reduction versus number of steps m, compared with the ideal reduction 1/2^m]
20 Sectioning - Fibonacci
Situation: the minimum is bracketed between x1 and x3, with an interior point x2 already evaluated. Place a new point x4 in the larger subinterval, compare function values, and reduce the interval. Question: what is the optimal point placement?
21 Optimal sectioning
The Fibonacci method is the optimal sectioning method.
Given:
- Initial interval [a0, b0]
- Predefined total number of evaluations N, or:
- Desired final interval size ε
22 Fibonacci sectioning - basic idea
Start at the final interval and use symmetry and maximum interval reduction (the yellow point is the point added in the previous iteration):
I_{N-1} = 2 I_N
I_{N-2} = 3 I_N
I_{N-3} = 5 I_N
I_{N-4} = 8 I_N
I_{N-5} = 13 I_N
The factors 1, 2, 3, 5, 8, 13, … are the Fibonacci numbers. In the last step the two interior points nearly coincide, separated only by a small distance δ << I_N. A sketch of the resulting algorithm follows below.
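A minimal sketch of the Fibonacci method under these placement rules; in the very last step the two interior points coincide, and practical implementations offset one of them by the small distance δ:

```python
def fibonacci_search(f, a, b, n):
    """Fibonacci sectioning of [a, b] using n function evaluations (n >= 3)."""
    F = [1, 1]                                   # Fibonacci numbers F[0], F[1], ...
    for _ in range(2, n + 1):
        F.append(F[-1] + F[-2])
    # Initial interior points at the Fibonacci fractions of the interval
    x1 = a + F[n - 2] / F[n] * (b - a)
    x2 = a + F[n - 1] / F[n] * (b - a)
    f1, f2 = f(x1), f(x2)
    for k in range(n - 2):
        if f1 < f2:                              # minimum in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = a + F[n - k - 3] / F[n - k - 1] * (b - a)
            f1 = f(x1)
        else:                                    # minimum in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + F[n - k - 2] / F[n - k - 1] * (b - a)
            f2 = f(x2)
    return a, b                                  # final length: (b0 - a0) / F[n]
```

One point is reused in each iteration, so only one new evaluation is needed per interval reduction.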
23 Sectioning – Golden Section
For large N, the Fibonacci interval-reduction fraction converges to the golden section ratio φ = (√5 − 1)/2 ≈ 0.618.
The golden section method uses this constant interval reduction ratio φ in every iteration.
24 Sectioning - Golden Section
Origin of the golden section: each interval is the same fraction φ of the previous one:
I2 = φ I1,  I3 = φ I2 = φ² I1,  …
Final interval: I_N = φ^(N-1) I1. A sketch of the method follows below.
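A minimal sketch of the golden section method, assuming a unimodal f; phi is the reduction ratio ≈ 0.618 from the previous slide:

```python
import math

def golden_section(f, a, b, tol=1e-6):
    """Golden-section search: constant interval reduction by phi ~ 0.618 per step."""
    phi = (math.sqrt(5) - 1) / 2             # reduction ratio ~ 0.618
    x1 = b - phi * (b - a)
    x2 = a + phi * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:                          # minimum in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - phi * (b - a)
            f1 = f(x1)
        else:                                # minimum in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + phi * (b - a)
            f2 = f(x2)
    return (a + b) / 2
```

As in the Fibonacci method, one interior point is reused, so each reduction after the first two evaluations costs only one new function evaluation.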
25 Comparison sectioning methods
[Plot: interval reduction per evaluation for ideal dichotomous, Fibonacci, and golden section]
Example: reduction to 2% of the original interval:
Method           Evaluations N
Dichotomous      12
Golden section    9
Fibonacci         8
(Exhaustive      99)
Conclusion: golden section is simple and near-optimal.
26 Quadratic interpolation
Three points of the bracket [ai, bi] define an interpolating quadratic function. The new point x_new is placed at the minimum of this parabola; testing it against the existing points gives the reduced bracket [a_{i+1}, b_{i+1}].
For a minimum, the curvature coefficient must satisfy a > 0!
Shift x_new when it falls very close to an existing point. A sketch of the interpolation step follows below.
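A minimal sketch of the interpolation step, using the standard three-point formula for the vertex of the fitted parabola; checking that the result is a minimum (a > 0) and lies inside the bracket is left to the caller, as noted above:

```python
def quadratic_min(x1, x2, x3, f1, f2, f3):
    """Stationary point of the parabola through (x1,f1), (x2,f2), (x3,f3)."""
    num = (x2 - x1) ** 2 * (f2 - f3) - (x2 - x3) ** 2 * (f2 - f1)
    den = (x2 - x1) * (f2 - f3) - (x2 - x3) * (f2 - f1)
    if den == 0:
        raise ValueError("points are collinear: parabola has no stationary point")
    return x2 - 0.5 * num / den
```

For f(x) = x² with points (-1, 1), (0, 0), (2, 4), the formula returns the exact minimizer 0.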
27 Unconstrained optimization algorithms
Single-variable methods:
- 0th order (involving only f)
- 1st order (involving f and f')
- 2nd order (involving f, f' and f'')
Multiple-variable methods
28 Cubic interpolation
Similar to quadratic interpolation, but uses only 2 points [ai, bi] together with derivative information at both points.
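The slide does not give the formula, but one common closed form for the minimizer of the cubic matching f and f' at two points (as given in Nocedal & Wright) can be sketched as:

```python
import math

def cubic_min(x0, x1, f0, f1, g0, g1):
    """Minimizer of the cubic with values f0, f1 and slopes g0, g1 at x0, x1."""
    d1 = g0 + g1 - 3 * (f0 - f1) / (x0 - x1)
    disc = d1 ** 2 - g0 * g1                 # must be >= 0 for a real minimizer
    if disc < 0:
        raise ValueError("cubic has no real minimizer in this form")
    d2 = math.copysign(math.sqrt(disc), x1 - x0)
    return x1 - (x1 - x0) * (g1 + d2 - d1) / (g1 - g0 + 2 * d2)
```

For f(x) = x² with x0 = -1 (f = 1, f' = -2) and x1 = 2 (f = 4, f' = 4), this returns the exact minimizer 0.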
29 Bisection method
From the optimality conditions, the minimum lies at a stationary point, so the problem becomes root finding of f'.
Similar to the sectioning methods, but uses the derivative: the interval is halved in each iteration based on the sign of f'. Note: this is better than any of the direct (0th order) methods. A sketch follows below.
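A minimal sketch, assuming f'(a) < 0 < f'(b) so that a stationary point lies inside [a, b]:

```python
def bisection(fprime, a, b, tol=1e-8):
    """Halve the interval each iteration based on the sign of f' at the midpoint."""
    while b - a > tol:
        m = (a + b) / 2
        if fprime(m) < 0:
            a = m                  # f still decreasing: minimum lies to the right
        else:
            b = m                  # f increasing: minimum lies to the left
    return (a + b) / 2
```

Only one derivative evaluation is needed per halving, which is why this beats the 0th order sectioning methods.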
30 Secant method
Also based on root finding of f', but uses linear interpolation between the two most recent derivative values. The interval is possibly reduced by even more than half in each iteration: the best of the methods so far. A sketch follows below.
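A minimal sketch of the secant iteration on f'; the stopping rule is an illustrative assumption:

```python
def secant(fprime, x0, x1, tol=1e-10, max_iter=100):
    """Find a root of f' by linearly interpolating its two most recent values."""
    g0, g1 = fprime(x0), fprime(x1)
    for _ in range(max_iter):
        if g1 == g0:
            break                               # flat secant line: cannot proceed
        x2 = x1 - g1 * (x1 - x0) / (g1 - g0)    # zero of the secant through (x0,g0),(x1,g1)
        if abs(x2 - x1) < tol:
            return x2
        x0, g0 = x1, g1
        x1, g1 = x2, fprime(x2)
    return x1
```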
31 Unconstrained optimization algorithms
Single-variable methods:
- 0th order (involving only f)
- 1st order (involving f and f')
- 2nd order (involving f, f' and f'')
Multiple-variable methods
32 Newton’s method
Again, root finding of f'. Basis: Taylor approximation of f' around the current iterate x_k:
f'(x) ≈ f'(x_k) + f''(x_k)(x − x_k)   (linear approximation)
Setting this to zero gives the new guess:
x_{k+1} = x_k − f'(x_k) / f''(x_k)
33 Newton’s method (2)
Best convergence of all methods, unless it diverges. Note: the iterates x_k, x_{k+1}, x_{k+2}, … jump from point to point and are not contained in an interval. This is dangerous: the method may diverge. A safeguarded sketch follows below.
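A minimal sketch of the update x_{k+1} = x_k − f'(x_k)/f''(x_k), with an iteration budget as a crude guard against the divergence noted above:

```python
def newton_min(fprime, fsecond, x0, tol=1e-10, max_iter=50):
    """Newton's method on f': quadratic convergence near a minimum with f'' > 0."""
    x = x0
    for _ in range(max_iter):
        h = fsecond(x)
        if h == 0:
            raise ZeroDivisionError("f''(x) = 0: Newton step undefined")
        step = fprime(x) / h
        x -= step                   # x_{k+1} = x_k - f'(x_k) / f''(x_k)
        if abs(step) < tol:
            return x
    raise RuntimeError("no convergence: iterates may be diverging")
```

Note that if f''(x) < 0 the iteration converges toward a maximum instead; practical implementations combine Newton steps with an interval-based fallback.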
34 Summary single variable methods
0th order:
- Bracketing + dichotomous sectioning
- Fibonacci sectioning
- Golden ratio sectioning
- Quadratic interpolation
1st order:
- Cubic interpolation
- Bisection method
- Secant method
2nd order:
- Newton method
And many, many more!
In practice, additional "tricks" are needed to deal with:
- Multimodality
- Strong fluctuations
- Round-off errors
- Divergence