Engineering Optimization


Presentation on theme: "Engineering Optimization"— Presentation transcript:

1 Engineering Optimization
Concepts and Applications
Fred van Keulen, Matthijs Langelaar
CLA H21.1

2 Recap / overview
Optimization problem: definition, checking, negative null form, model
Special topics: linear / convex problems, sensitivity analysis, topology optimization
Solution methods:
- Unconstrained problems: optimality criteria, optimization algorithms
- Constrained problems: optimality criteria, optimization algorithms

3 Summary of optimality conditions
Conditions for a local minimum of an unconstrained problem:
- First-order necessary condition: ∇f(x*) = 0
- Second-order sufficient condition: the Hessian H(x*) is positive definite
For convex f on a convex feasible domain, the first-order condition ∇f(x*) = 0 is also sufficient for a global minimum.

4 Stationary point nature summary
Definiteness of H determines the nature of x*:
- Positive definite: minimum
- Positive semi-definite: valley
- Indefinite: saddle point
- Negative semi-definite: ridge
- Negative definite: maximum
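This classification can be illustrated for a two-variable problem; the sketch below (function name and the closed-form 2×2 eigenvalue formula are my own, not from the slides) classifies a stationary point from the eigenvalues of a symmetric 2×2 Hessian:

```python
import math

def classify_hessian_2x2(h11, h12, h22):
    """Classify a stationary point from the symmetric 2x2 Hessian
    [[h11, h12], [h12, h22]].  Its eigenvalues are real because the
    matrix is symmetric."""
    mean = (h11 + h22) / 2.0
    # Half-spread of the eigenvalue pair around the mean (always real).
    r = math.sqrt(((h11 - h22) / 2.0) ** 2 + h12 ** 2)
    lo, hi = mean - r, mean + r
    if lo > 0:
        return "minimum"        # positive definite
    if hi < 0:
        return "maximum"        # negative definite
    if lo < 0 < hi:
        return "saddle point"   # indefinite
    # One eigenvalue is zero: semi-definite cases.
    return "valley" if hi > 0 else ("ridge" if lo < 0 else "degenerate")
```

For example, the Hessian diag(2, 3) is positive definite and is classified as a minimum, while diag(1, −1) gives a saddle point.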

5 Complex eigenvalues? Question: what is the nature of a stationary point when H has complex eigenvalues? Answer: this situation never occurs, because H is symmetric by definition, and symmetric matrices have only real eigenvalues (spectral theorem).

6 Nature of stationary points
The nature of the initial position depends on the load (buckling). (Figure: spring system with stiffnesses k1 and k2, length l, under load F.)

7 Nature of stationary points (2)

8 Unconstrained optimization algorithms
Single-variable methods:
- 0th order (involving only f)
- 1st order (involving f and f')
- 2nd order (involving f, f' and f'')
Multiple-variable methods

9 Why optimization algorithms?
Optimality conditions often cannot be used:
- The function is not explicitly known (e.g. it is evaluated by simulation)
- The conditions cannot be solved analytically (e.g. the stationary-point equations have no closed-form solution)

10 0th order methods: pro/con
Weaknesses:
- (Usually) less efficient than higher-order methods (many function evaluations)
Strengths:
- No derivatives needed
- Also work for discontinuous / non-differentiable functions
- Easy to program
- Robust

11 Minimization with one variable
Why?
- Simplest case: a good starting point
- Used in multi-variable methods during line search
Setting: an iterative process in which the optimizer proposes x and the model returns f(x). (Figure: optimizer/model loop.)

12 Termination criteria
Stop the optimization iterations when:
- The solution is sufficiently accurate (check the optimality criteria)
- Progress becomes too slow
- The maximum resources have been spent
- The solution diverges
- Cycling occurs

13 Brute-force approach
Simple approach: exhaustive search. Evaluate f at n equally spaced interior points of the initial interval of length L0; the best point and its two neighbours bracket the minimum, giving a final interval of size Ln = 2 L0 / (n + 1).
Disadvantage: rather inefficient.
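A minimal Python sketch of this exhaustive search (function name and interface are illustrative, not from the slides):

```python
def exhaustive_search(f, a, b, n):
    """Evaluate f at n equally spaced interior points of [a, b] and
    return the subinterval around the best point.  With n interior
    points the returned interval has length 2 * (b - a) / (n + 1)."""
    h = (b - a) / (n + 1)
    xs = [a + i * h for i in range(n + 2)]          # includes endpoints
    fs = [f(x) for x in xs]
    k = min(range(1, n + 1), key=lambda i: fs[i])   # best interior point
    return xs[k - 1], xs[k + 1]
```

For f(x) = (x − 2)² on [0, 5] with n = 9 points, the best sample is x = 2 and the method returns the bracket (1.5, 2.5), of length 2·5/10 = 1.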

14 Basic strategy of 0th order methods for single-variable case
1. Find an interval [a0, b0] that contains the minimum (bracketing)
2. Iteratively reduce the size of the interval [ak, bk] (sectioning)
3. Approximate the minimum by the minimum of a simple interpolation function over the final interval [aN, bN]
Sectioning methods: dichotomous search, Fibonacci method, golden section method

15 Bracketing the minimum
Starting from a user-defined point x1, with step size D and expansion parameter g, take growing steps until f increases:
x2 = x1 + D, x3 = x2 + gD, x4 = x3 + g²D, …
The last three points then bracket the minimum in [a0, b0].
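A sketch of this expanding-step bracketing (function name, default parameters, and the direction-reversal handling are my own additions):

```python
def bracket_minimum(f, x1, delta=0.1, gamma=2.0, max_steps=50):
    """Expanding-step bracketing: x2 = x1 + delta, x3 = x2 + gamma*delta,
    ...  Returns an interval (a, b) containing a minimum of f."""
    step = delta
    x_prev, x_cur = x1, x1 + step
    f_prev, f_cur = f(x_prev), f(x_cur)
    if f_cur > f_prev:
        # f increases immediately: search in the opposite direction.
        step = -step
        x_prev, x_cur = x_cur, x_prev
        f_prev, f_cur = f_cur, f_prev
    for _ in range(max_steps):
        step *= gamma                      # expand the step
        x_next = x_cur + step
        f_next = f(x_next)
        if f_next > f_cur:                 # stepped past the minimum
            return (min(x_prev, x_next), max(x_prev, x_next))
        x_prev, x_cur, f_prev, f_cur = x_cur, x_next, f_cur, f_next
    raise RuntimeError("no bracket found")
```

The returned interval is a valid starting bracket for any of the sectioning methods that follow.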

16 Unimodality
Bracketing and sectioning methods work best for unimodal functions: a unimodal function consists of exactly one monotonically decreasing part followed by exactly one monotonically increasing part (for minimization), so it has a single minimum.

17 Dichotomous search
dichotomous (adjective, pronounced dī-ˈkä-tə-məs): dividing into two parts.
Conceptually simple idea: try to split the interval [a0, b0] of length L0 in half in each step, by evaluating f at two points a small distance d apart around the midpoint (d << L0).

18 Dichotomous search (2)
Interval size after 1 step (2 evaluations): L1 = L0/2 + d/2
Interval size after m steps (2m evaluations): Lm = L0/2^m + d(1 − 1/2^m)
Proper choice for d: as small as the required accuracy and round-off errors allow, since the final interval cannot become smaller than d.
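The scheme above can be sketched in a few lines (function name and defaults are illustrative):

```python
def dichotomous_search(f, a, b, delta=1e-9, steps=20):
    """Dichotomous search on a unimodal f.  Each step spends two
    evaluations, placed delta apart around the midpoint, and keeps
    the half-interval that must contain the minimum."""
    for _ in range(steps):
        mid = (a + b) / 2.0
        x1, x2 = mid - delta / 2.0, mid + delta / 2.0
        if f(x1) < f(x2):
            b = x2          # minimum lies in [a, x2]
        else:
            a = x1          # minimum lies in [x1, b]
    return a, b
```

After m steps the bracket length is L0/2^m + d(1 − 1/2^m), matching the formula above.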

19 Dichotomous search (3)
Example: m = 10 gives L10 ≈ L0/1024 + d. (Figure: interval reduction versus m, compared to the ideal interval reduction.)

20 Sectioning - Fibonacci
Situation: the minimum is bracketed between x1 and x3, with an interior point x2 already evaluated. Test a new point x4, compare, and reduce the interval. Question: what is the optimal placement of the points?

21 Optimal sectioning
The Fibonacci method is the optimal sectioning method. Given:
- Initial interval [a0, b0]
- A predefined total number of evaluations N, or a desired final interval size e

22 Fibonacci sectioning - basic idea
Start from the final interval and work backwards, using symmetry and maximum interval reduction (with d << IN; the highlighted point in each row is the one added in the previous iteration):
IN-1 = 2 IN
IN-2 = 3 IN
IN-3 = 5 IN
IN-4 = 8 IN
IN-5 = 13 IN
The factors 1, 2, 3, 5, 8, 13, … are the Fibonacci numbers.
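A sketch of the resulting forward algorithm (function name and interface are my own; interior points sit at Fibonacci-number fractions of the current interval, so each discarded part reuses one previous evaluation):

```python
def fibonacci_search(f, a, b, n):
    """Fibonacci sectioning of [a, b] using about n evaluations of a
    unimodal f.  Returns the final bracketing interval."""
    fib = [1, 1]
    while len(fib) < n + 2:
        fib.append(fib[-1] + fib[-2])
    k = n + 1
    x1 = a + fib[k - 2] / fib[k] * (b - a)
    x2 = a + fib[k - 1] / fib[k] * (b - a)
    f1, f2 = f(x1), f(x2)
    while k > 3:
        k -= 1
        if f1 < f2:                           # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1            # reuse x1 as the new x2
            x1 = a + fib[k - 2] / fib[k] * (b - a)
            f1 = f(x1)
        else:                                 # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2            # reuse x2 as the new x1
            x2 = a + fib[k - 1] / fib[k] * (b - a)
            f2 = f(x2)
    return a, b
```

Each step shrinks the interval by a ratio of consecutive Fibonacci numbers, which is exactly the backward construction shown above.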

23 Sectioning - Golden Section
For large N, the Fibonacci interval-reduction fraction converges to the golden section ratio f ≈ 0.618. The golden section method uses this constant interval reduction ratio f in every step.

24 Sectioning - Golden Section
Origin of the golden section: require the same reduction ratio in every step, I2 = f I1, I3 = f I2 = f² I1, while reusing the interior point. This demands f² = 1 − f, giving f = (√5 − 1)/2 ≈ 0.618.
Final interval: IN = f^(N−1) I1
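A minimal golden section search, directly analogous to the Fibonacci version but with the constant ratio f (function name and tolerance are my own choices):

```python
import math

def golden_section(f, a, b, tol=1e-8):
    """Golden-section search on a unimodal f: constant interval
    reduction ratio phi = (sqrt(5) - 1)/2 ~ 0.618 per evaluation
    (after the first two)."""
    phi = (math.sqrt(5.0) - 1.0) / 2.0
    x1 = b - phi * (b - a)     # = a + (1 - phi)*(b - a)
    x2 = a + phi * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:            # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - phi * (b - a)
            f1 = f(x1)
        else:                  # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + phi * (b - a)
            f2 = f(x2)
    return (a + b) / 2.0
```

Because f² = 1 − f, the surviving interior point always lands exactly at one of the two required fractions of the new interval, so only one new evaluation is needed per step.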

25 Comparison of sectioning methods
Example: reduction to 2% of the original interval requires:
- Exhaustive search: 99 evaluations
- Dichotomous search: 12 evaluations
- Golden section: 9 evaluations
- Fibonacci: 8 evaluations
(Figure: interval reduction versus number of evaluations N for the ideal dichotomous, Fibonacci, and golden section methods.)
Conclusion: the golden section method is simple and near-optimal.

26 Quadratic interpolation
Three points of the bracket [ai, bi] define an interpolating quadratic function. The new point xnew is evaluated at the minimum of this parabola, giving the reduced bracket [ai+1, bi+1].
For a minimum, the leading coefficient must satisfy a > 0. Shift xnew slightly when it falls very close to an existing point.
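The core of the method is the vertex of the parabola through three points; a sketch (function name is my own, formula is the standard three-point parabolic-interpolation step):

```python
def parabolic_step(x1, f1, x2, f2, x3, f3):
    """Minimum of the parabola through (x1,f1), (x2,f2), (x3,f3).
    Requires a nonzero denominator (points not collinear) and an
    upward-opening parabola for the result to be a minimum."""
    num = (x2 - x1) ** 2 * (f2 - f3) - (x2 - x3) ** 2 * (f2 - f1)
    den = (x2 - x1) * (f2 - f3) - (x2 - x3) * (f2 - f1)
    return x2 - 0.5 * num / den
```

For an exactly quadratic f, one step lands on the minimizer: with f(x) = (x − 2)² and points 0, 1, 4 the step returns x = 2.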

27 Unconstrained optimization algorithms
Single-variable methods:
- 0th order (involving only f)
- 1st order (involving f and f')
- 2nd order (involving f, f' and f'')
Multiple-variable methods

28 Cubic interpolation
Similar to quadratic interpolation, but uses only two points ai, bi together with the derivative values f'(ai), f'(bi) to define an interpolating cubic.
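One way to write the minimizer of this cubic is the closed form commonly used in line-search routines; the sketch below (function name is my own) assumes the two points do in fact bracket a minimum:

```python
import math

def cubic_step(a, fa, dfa, b, fb, dfb):
    """Minimizer of the cubic Hermite interpolant matching f and f'
    at the two points a and b."""
    d1 = dfa + dfb - 3.0 * (fa - fb) / (a - b)
    # d1*d1 - dfa*dfb must be nonnegative for a valid bracket.
    d2 = math.copysign(math.sqrt(d1 * d1 - dfa * dfb), b - a)
    return b - (b - a) * (dfb + d2 - d1) / (dfb - dfa + 2.0 * d2)
```

For an exactly cubic f the step is exact: with f(x) = x³ − 3x (minimum at x = 1), the data f(0) = 0, f'(0) = −3, f(2) = 2, f'(2) = 9 gives x = 1.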

29 Bisection method
Optimality conditions: the minimum is at a stationary point, so minimize f by root finding on f'. Similar to the sectioning methods, but uses the sign of the derivative: the interval is halved in each iteration, which is better than any of the direct (0th order) methods.
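A minimal bisection on f' (function name is my own; the sketch assumes the bracket satisfies f'(a) < 0 < f'(b), i.e. f decreases then increases):

```python
def bisection(df, a, b, tol=1e-8):
    """Find the stationary point of f by halving the interval on
    the sign of df = f'."""
    assert df(a) < 0 < df(b), "need df(a) < 0 < df(b)"
    while b - a > tol:
        mid = (a + b) / 2.0
        if df(mid) < 0:
            a = mid        # minimum lies to the right of mid
        else:
            b = mid        # minimum lies to the left of mid
    return (a + b) / 2.0
```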

30 Secant method
Also based on root finding of f', but uses linear interpolation between the two most recent derivative values. The interval can be reduced by more than half in each iteration, making it the fastest of the interval-based methods.
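A sketch of the secant update on f' (function name, tolerances, and the degenerate-slope guard are my own):

```python
def secant(df, x0, x1, tol=1e-10, max_iter=50):
    """Root finding of df = f' by linear interpolation between the
    two most recent derivative values."""
    d0, d1 = df(x0), df(x1)
    for _ in range(max_iter):
        if abs(d1 - d0) < 1e-300:
            break                        # secant slope degenerate
        x2 = x1 - d1 * (x1 - x0) / (d1 - d0)
        if abs(x2 - x1) < tol:
            return x2
        x0, d0, x1, d1 = x1, d1, x2, df(x2)
    return x1
```

For a quadratic f the derivative is linear, so the secant step is exact after a single iteration.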

31 Unconstrained optimization algorithms
Single-variable methods:
- 0th order (involving only f)
- 1st order (involving f and f')
- 2nd order (involving f, f' and f'')
Multiple-variable methods

32 Newton's method
Again, root finding of f'. Basis: a linear Taylor approximation of f' around the current iterate xk:
f'(x) ≈ f'(xk) + f''(xk)(x − xk)
Setting this linear approximation to zero gives the new guess:
xk+1 = xk − f'(xk) / f''(xk)

33 Newton's method (2)
Best convergence of all methods, unless it diverges: the iterates xk, xk+1, xk+2, … jump from point to point and are not contained in a bracketing interval, which makes the method fast but potentially dangerous.
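The update xk+1 = xk − f'(xk)/f''(xk) takes only a few lines (function name and the step-size stopping test are my own; note there is no safeguard against divergence, matching the caveat above):

```python
def newton(df, d2f, x0, tol=1e-10, max_iter=50):
    """Newton's method on f': x_{k+1} = x_k - f'(x_k)/f''(x_k).
    Converges quadratically near a minimum with f'' > 0, but may
    diverge from a poor starting point."""
    x = x0
    for _ in range(max_iter):
        step = df(x) / d2f(x)
        x -= step
        if abs(step) < tol:
            return x
    return x
```

For a quadratic f the Taylor model of f' is exact, so Newton's method converges in a single step from any starting point.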

34 Summary of single-variable methods
- 0th order: bracketing + dichotomous sectioning, Fibonacci sectioning, golden ratio sectioning, quadratic interpolation
- 1st order: cubic interpolation, bisection method, secant method
- 2nd order: Newton's method
And many, many more! In practice, additional "tricks" are needed to deal with: multimodality, strong fluctuations, round-off errors, and divergence.
