L21 Numerical Methods, part 1
Homework Review; Search Problem; Line Search Methods; Summary
Test 4 Wed
Problem
H20 cont’d
a. Increase cost by $0.16: f_new = $53,238, an $838 increase
b. Reduce mill A capacity to 200 logs/day: changes nothing
c. Reduce mill B capacity to 270 logs/day: increases cost by $750; new optimal solution is x1 = 0, x2 = 30, x3 = 200, x4 = 70
Sensitivity Analyses
How sensitive are:
a. the optimal value (i.e., f(x)), and
b. the optimal solution (i.e., x)
... to the parameters (i.e., the assumptions) in our model?
Model parameters
Consider your abc’s, i.e., A, b, and c
Simplex Lagrange Multipliers
Constraint type: ≤ / = / ≥
Variable: slack / either / surplus
c′ column: “regular” / artificial
Find the multipliers in the final tableau (right side)
Let’s minimize f even further
Increase/decrease the constraint limits e_i to reduce f(x)
Is there more to Optimization?
Simplex is great... but:
Many problems are nonlinear
Many of these cannot be “linearized”
We need other methods!
General Optimization Algorithms
Sub-problem A: Which direction to head next?
Sub-problem B: How far to go in that direction?
Magnitude and direction
Each step has the form x_new = x + αu:
u = a unit vector (length 1) parallel to the search direction (the direction)
α = the magnitude or step size (a scalar)
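The magnitude-and-direction step above can be sketched in code. This is a minimal illustration; the function name and example vectors are assumptions, not from the slides:

```python
import numpy as np

# One generic optimization step: x_new = x + alpha * u, where u is the
# unit vector parallel to the search direction d and alpha is the step size.
def take_step(x, d, alpha):
    u = d / np.linalg.norm(d)  # direction: unit vector parallel to d
    return x + alpha * u       # magnitude: scalar step size alpha

# Illustrative values (not from the slides):
x = np.array([2.0, 1.0])
d = np.array([-4.0, -2.0])     # e.g., a downhill direction at x
x_new = take_step(x, d, alpha=0.5)
print(x_new)                   # moves exactly 0.5 units from x along u
```

Because u has unit length, the distance actually moved equals α, which is what makes α a meaningful "step size."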
Figure 10.2 Conceptual diagram for iterative steps of an optimization method.
We are here. Which direction should we head?
Minimize f(x): Let’s go downhill!
Descent condition: ∇f(x) · d < 0 (a scalar)
Dot Product
∇f · d = ‖∇f‖ ‖d‖ cos θ
At what angle θ does the dot product become most negative? θ = 180°, i.e., d pointing opposite the gradient: maximum descent.
Desirable Direction
Choose d = −∇f(x): then ∇f · d = −‖∇f‖² < 0, so descent is guaranteed!
Ex: Using the “descent condition”
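A minimal sketch of checking the descent condition numerically, using a hypothetical test function f(x) = x1² + 2·x2² (an assumption for illustration, not necessarily the slides’ example):

```python
import numpy as np

# Hypothetical test function f(x) = x1^2 + 2*x2^2; its gradient is:
def grad_f(x):
    return np.array([2.0 * x[0], 4.0 * x[1]])

# Descent condition: d is a descent direction at x iff grad_f(x) . d < 0.
def is_descent(x, d):
    return float(grad_f(x) @ d) < 0.0

x = np.array([1.0, 1.0])                      # grad_f(x) = (2, 4)
print(is_descent(x, np.array([-1.0, -1.0])))  # True: dot product = -6
print(is_descent(x, np.array([1.0, 0.0])))    # False: dot product = +2
print(is_descent(x, -grad_f(x)))              # True: steepest descent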
Step Size?
How big should we make α? Can we step too “far,” i.e., can our step size be chosen so big that we step over the minimum?
Figure 10.5 Nonunimodal function f(α).
Nonunimodal functions: unimodal if we stay in the locale?
Monotonic Increasing Functions
Monotonic Decreasing Functions (continuous)
Figure 10.4 Unimodal function f(α).
Unimodal functions are either:
monotonic increasing then monotonic decreasing, or
monotonic decreasing then monotonic increasing
Some Step Size Methods
“Analytical”: search direction = (−) gradient (i.e., line search); form the line search function f(α); find f′(α) = 0
Region elimination (“interval reducing”): equal interval, alternate equal interval, golden section
Figure 10.3 Graph of f(α) versus α.
Analytical step size: slope of line search = f′(α) = ∇f(x + αd) · d
Analytical Step Size Example
Alternative Analytical Step Size
The new gradient must be orthogonal to d at the optimal α: ∇f(x + α*d) · d = 0
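Both analytical views can be checked on a small quadratic. This sketch assumes the hypothetical f(x) = x1² + 2·x2² = ½·xᵀAx (an illustration, not the slides’ example): the closed-form step from f′(α) = 0 and the orthogonality condition give the same α.

```python
import numpy as np

# Hypothetical quadratic f(x) = x1^2 + 2*x2^2 = 0.5 * x^T A x (illustrative):
A = np.array([[2.0, 0.0], [0.0, 4.0]])

def grad_f(x):
    return A @ x

x = np.array([1.0, 1.0])
d = -grad_f(x)                           # steepest-descent direction

# Analytical step size: for a quadratic, solving f'(alpha) = 0 gives
# alpha* = -(grad . d) / (d^T A d).
alpha = -(grad_f(x) @ d) / (d @ A @ d)   # = 5/18 for this example
x_new = x + alpha * d

# Alternative check: the new gradient is orthogonal to d at alpha*.
print(alpha)                  # 0.2777...
print(grad_f(x_new) @ d)      # ~0.0 (orthogonality)
```

The orthogonality check is a useful debugging tool: if ∇f(x_new) · d is far from zero, the step size was not the line-search minimizer.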
Figure 10.6 Equal-interval search process. (a) Phase I: initial bracketing of minimum. (b) Phase II: reducing the interval of uncertainty.
“Interval reducing” / region elimination: a “bounding phase,” then an “interval reduction phase”
The final interval of uncertainty is 2δ!
Successive Equal-Interval Algorithm
“Interval” of uncertainty
Successive Equal-Interval Search
Very robust; works for continuous and discrete functions
But lots of f(x) evaluations!!!
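The two-phase structure above (bounding, then interval reduction) can be sketched as follows; the shrink factor, tolerance, and test function are illustrative assumptions, not from the slides:

```python
# Successive equal-interval line search: a minimal sketch.
def equal_interval_search(f, delta=0.5, tol=1e-4):
    a = 0.0
    while delta > tol / 2.0:
        # Phase I (bounding): march forward in steps of delta until f rises.
        x = a
        while f(x + delta) < f(x):
            x += delta
        # Minimum lies in [x - delta, x + delta]: uncertainty = 2*delta.
        a = max(x - delta, 0.0)
        delta /= 10.0              # Phase II: repeat with a smaller step
    return a + delta

# Hypothetical unimodal test function (not from the slides):
alpha_star = equal_interval_search(lambda t: (t - 2.0) ** 2)
print(alpha_star)                  # close to the true minimizer, 2.0
```

Counting the calls to f here, versus the single closed-form solve in the analytical approach, makes the slides’ point concrete: robustness costs many function evaluations.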
Figure 10.7 Graphic of an alternate equal-interval solution process.
Alternate equal interval
Which region to reject?
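One common answer, sketched under the assumption that f is unimodal on the bracket: evaluate f at the two interior points that split the bracket into thirds, then reject the outer third next to the larger value. This is an illustrative implementation, not necessarily the slides’ exact procedure:

```python
# Alternate equal-interval search: a minimal sketch for unimodal f.
def alternate_equal_interval(f, lo, hi, tol=1e-5):
    while hi - lo > tol:
        third = (hi - lo) / 3.0
        a, b = lo + third, hi - third   # two interior trial points
        if f(a) < f(b):
            hi = b    # minimum cannot lie in (b, hi]: reject right third
        else:
            lo = a    # minimum cannot lie in [lo, a): reject left third
    return 0.5 * (lo + hi)

# Hypothetical unimodal test function (not from the slides):
alpha_star = alternate_equal_interval(lambda t: (t - 2.0) ** 2, 0.0, 5.0)
print(alpha_star)    # close to the true minimizer, 2.0
```

Each pass keeps 2/3 of the interval, so the bracket shrinks geometrically; the golden-section method listed earlier improves on this by reusing one of the two interior evaluations per pass.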
Summary
Sensitivity analyses add value to your solutions
Sensitivity is as simple as Abc’s
The constraint variation sensitivity theorem can answer simple resource-limit questions
General optimization algorithms have two sub-problems: search direction and step size
In a local neighborhood, assume unimodal!
The descent condition assures the correct direction
Step size methods: analytical, region elimination