Presentation on theme: "Exam 1 Oct 3, closed book, Place: ITE 119, Time: 12:30-1:45pm" — Presentation transcript:

1 Exam 1: Oct 3, closed book. Place: ITE 119, Time: 12:30-1:45pm
One double-sided cheat sheet (8.5in x 11in) allowed. Bring your calculator to the exam. Covers Chapters 1-11: Error Analysis, Taylor Series, Roots of Equations, Linear Systems.

2 Today’s class: Optimization; multi-dimensional unconstrained problems

3 Multi-dimensional unconstrained problems [figure]

4 Multi-dimensional unconstrained problems
Given a function f(x1, x2, x3, ..., xn), find the set of values that minimizes or maximizes it. Solution techniques fall into two families: direct (non-gradient) methods and gradient (descent/ascent) methods.

5 Random Search
Brute-force search using randomly selected inputs. Conduct a sufficient number of samples and the optimum will eventually be selected: as the number of samples grows, the method finds the global optimum with probability approaching 1. It is, however, very inefficient in terms of convergence.
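A minimal Python sketch of random search for a minimum (the objective, bounds, and sample count here are illustrative assumptions, not from the slides):

```python
import random

def random_search(f, bounds, n_samples=100_000):
    """Brute-force random search: sample points uniformly inside the
    bounds and keep the best (lowest) value seen so far."""
    best_x, best_f = None, float("inf")
    for _ in range(n_samples):
        x = [random.uniform(lo, hi) for lo, hi in bounds]
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# Illustrative objective with its minimum at (1, -2)
f = lambda x: (x[0] - 1)**2 + (x[1] + 2)**2
print(random_search(f, bounds=[(-5, 5), (-5, 5)]))
```

More samples shrink the expected error, which is exactly why the method is guaranteed in the limit yet inefficient in practice.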

6 Random Search [figure]

7 Univariate search
Try changing just one variable at a time: iteratively optimize along each dimension in turn until you arrive at an optimum, using one-dimensional searches to improve the approximation at each step. A sketch follows below.
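A Python sketch of univariate (cyclic coordinate) search, using scipy's scalar minimizer for the one-dimensional searches (the test function and sweep count are illustrative assumptions):

```python
import numpy as np
from scipy.optimize import minimize_scalar

def univariate_search(f, x0, n_sweeps=20):
    """Cyclic coordinate search: minimize along one coordinate axis
    at a time, holding all other coordinates fixed."""
    x = np.array(x0, dtype=float)
    for _ in range(n_sweeps):
        for i in range(len(x)):
            # one-dimensional search along coordinate i
            def along_axis(t, i=i):
                xt = x.copy()
                xt[i] = t
                return f(xt)
            x[i] = minimize_scalar(along_axis).x
    return x

# Illustrative objective with its minimum at (0, 0)
f = lambda x: x[0]**2 + x[1]**2 - x[0]*x[1]
print(univariate_search(f, [2.0, 1.0]))
```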

8 Univariate search
Searching along a narrow ridge this way is inefficient. Why not move directly from p1 to p3 or p5, or directly from p2 to p4 or p6?

9 Univariate search
Changing one variable at a time can be inefficient, especially along a narrow ridge leading toward the optimum. Pattern directions (conjugate vectors) can guide the search toward the optimum; the best-known method built on this idea is Powell’s method.

10 Powell’s method
The key observation: if p1 and p2 are obtained by one-dimensional searches in the same direction but from different starting points, then the direction determined by these two points heads toward the maximum. This new direction is called a conjugate direction. For a quadratic function, search along conjugate directions provably converges in a finite number of steps regardless of the starting point. Since nonlinear functions can often be approximated reasonably well by a quadratic, conjugate-direction methods are usually efficient and can reach a quadratic convergence rate.
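In standard notation (not spelled out on the slide): for a quadratic f(\mathbf{x}) = \tfrac{1}{2}\mathbf{x}^T A \mathbf{x} + \mathbf{b}^T \mathbf{x}, directions \mathbf{h}_i and \mathbf{h}_j are conjugate with respect to A when

$$\mathbf{h}_i^{T} A\, \mathbf{h}_j = 0, \qquad i \neq j,$$

and exact line searches along n mutually conjugate directions minimize an n-dimensional quadratic in n steps.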

11 Powell’s method
Each iteration maintains two directions and searches along them sequentially; this is how conjugate directions are found. Start from p0: search along h1 to reach p1, then along h2 to reach p2. Connecting p0 and p2 gives a new direction h3 (which may or may not yet be conjugate); searching along it gives p3. The next iteration keeps the two directions h2 and h3: search along h2 from p3 to reach p4, then along h3 from p4 to reach p5. Connecting p3 and p5 gives h4, which is conjugate to h3.
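A rough Python sketch of the iteration just described, with scipy's scalar minimizer standing in for the line searches (this follows the slide's description of one pattern step, not the full Powell algorithm with its reset and safeguard rules; the quadratic test function is an illustrative assumption):

```python
import numpy as np
from scipy.optimize import minimize_scalar

def line_search(f, p, h):
    """Minimize f along the ray p + gamma*h and return the new point."""
    gamma = minimize_scalar(lambda g: f(p + g * h)).x
    return p + gamma * h

def powell_step(f, p0, h1, h2):
    """One Powell-style iteration: search along h1, then h2, then along
    the pattern direction connecting the start and end points."""
    p1 = line_search(f, p0, h1)
    p2 = line_search(f, p1, h2)
    h3 = p2 - p0                  # new (pattern) direction
    p3 = line_search(f, p2, h3)
    return p3, h2, h3             # next iteration keeps h2 and h3

# Illustrative quadratic with its minimum at (1, -1)
f = lambda x: (x[0] - 1)**2 + 2*(x[1] + 1)**2 + (x[0] - 1)*(x[1] + 1)
p = np.array([5.5, 2.0])
ha, hb = np.array([1.0, 0.0]), np.array([0.0, 1.0])
for _ in range(3):
    p, ha, hb = powell_step(f, p, ha, hb)
print(p)   # close to (1, -1)
```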

12 Powell’s method
Example: find the minimum of [objective function lost in transcript]. Start from the initial point X0 = (5.5, 2) with initial vectors (1, 0) and (0, 1). Find the minimum along the (1, 0) vector: minimum at γ1 = π − 5.5 (point 1).

13 Powell’s method
Find the minimum along the (0, 1) vector: minimum at [γ value lost in transcript] (point 2).

14 Powell’s method
Now minimize along the P2 − P0 vector (the new direction h3): minimum at γ = 0.9817 (point 3).

15 Powell’s method
From point 3, find the minimum along the U1 = (0, 1) vector: minimum at γ1 = 0.0497 (point 4).

16 Powell’s method
Find the minimum along the U2 = (-2.3584, 2.7124) vector: minimum at γ2 = [value lost in transcript] (point 5). This yields the new direction h4.

17 Powell’s method
Now minimize along the U2 vector: minimum at γ = [value lost in transcript].

18 Gradient Methods
Given a starting point, use the gradient to decide which direction to proceed: the gradient points in the direction of steepest slope away from the current position.

19 Gradient Methods [figure]

20 Gradient Methods
The slope of f at a point (a, b) along a direction at angle θ to the x-axis is g(θ) = (∂f/∂x) cos θ + (∂f/∂y) sin θ. Maximizing this slope over θ gives the direction of the gradient; the derivation is written out below.
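Written out (this is the standard steepest-ascent derivation the slide summarizes):

$$g(\theta) = \frac{\partial f}{\partial x}\cos\theta + \frac{\partial f}{\partial y}\sin\theta, \qquad
g'(\theta) = -\frac{\partial f}{\partial x}\sin\theta + \frac{\partial f}{\partial y}\cos\theta = 0
\;\Rightarrow\; \tan\theta = \frac{\partial f/\partial y}{\partial f/\partial x},$$

so the slope is maximized along the gradient direction \nabla f = (\partial f/\partial x,\; \partial f/\partial y).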

21 Gradient Methods
Example: steepest ascent of f(x, y) = xy² at (2, 2); the computation is worked out below.
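Working it out from f(x, y) = xy²:

$$\nabla f = \begin{bmatrix} \partial f/\partial x \\ \partial f/\partial y \end{bmatrix}
= \begin{bmatrix} y^2 \\ 2xy \end{bmatrix}, \qquad
\nabla f(2, 2) = \begin{bmatrix} 4 \\ 8 \end{bmatrix},$$

so the steepest-ascent direction at (2, 2) is (4, 8), at angle θ = arctan(8/4) ≈ 1.107 rad (about 63.4°) from the x-axis.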

22 Gradient Methods [figure]

23 Gradient Methods
The second derivative tells you whether you have reached a minimum or a maximum, but in multiple dimensions the test is a little trickier.

24 Gradient Methods
Consider a saddle point. Compute the second partial derivative of f with respect to x: it is positive, so you conclude you have reached a minimum along the x dimension. Do the same for the y dimension: the second partial with respect to y is also positive, so the point looks like a minimum along y as well. You might conclude you are at a minimum, but along the direction defined by y = x the point is actually a maximum.

25 Gradient Methods
Compute the Hessian determinant |H| = f_xx f_yy − (f_xy)². If |H| > 0 and f_xx > 0, the point is a local minimum; if |H| > 0 and f_xx < 0, it is a local maximum; if |H| < 0, it is a saddle point.
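As a check, apply the test to the function maximized on the next slides, f(x, y) = 2xy + 2x − x² − 2y²:

$$f_{xx} = -2, \quad f_{yy} = -4, \quad f_{xy} = 2, \qquad
|H| = (-2)(-4) - 2^2 = 4 > 0, \; f_{xx} < 0 \;\Rightarrow\; \text{local maximum}.$$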

26 Gradient Methods [figure]

27 Gradient Methods
Example: maximize f(x, y) = 2xy + 2x − x² − 2y², with an initial solution of (−1, 1). Since f_x = 2y + 2 − 2x and f_y = 2x − 4y, the gradient there is ∇f(−1, 1) = (6, −6).

28 Gradient Methods
Find the maximum along the ray (−1, 1) + γ(6, −6). Maximizing f along this ray gives γ = 0.2, so the new point is (−1, 1) + 0.2 (6, −6) = (0.2, −0.2).
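A compact Python sketch of the full steepest-ascent loop on this example; the analytic gradient follows from f_x = 2y + 2 − 2x and f_y = 2x − 4y, and the step size γ is found by the same kind of scalar line search used earlier (the iteration cap and tolerance are illustrative assumptions):

```python
import numpy as np
from scipy.optimize import minimize_scalar

f = lambda x: 2*x[0]*x[1] + 2*x[0] - x[0]**2 - 2*x[1]**2
grad = lambda x: np.array([2*x[1] + 2 - 2*x[0], 2*x[0] - 4*x[1]])

x = np.array([-1.0, 1.0])                 # initial solution from the slide
for _ in range(20):
    g = grad(x)
    if np.linalg.norm(g) < 1e-8:          # gradient ~ 0: optimum reached
        break
    # maximize f along x + gamma*g by minimizing its negative
    gamma = minimize_scalar(lambda t: -f(x + t * g)).x
    x = x + gamma * g
print(x)   # approaches the true maximum at (2, 1)
```

The first pass reproduces the slide's numbers: g = (6, −6) at (−1, 1), γ = 0.2, next point (0.2, −0.2).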

29 Next class: Constrained Optimization. Read Chapter 15.

