Second Derivative Test
- Local min, local max, saddle point
- Gradient of f: the vector (∂f/∂x, ∂f/∂y, ∂f/∂z), the direction of fastest increase of f
- Global min/max vs. local min/max
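The gradient above can be estimated numerically by central differences; the sketch below uses a hypothetical test function f(x, y) = x² + 3y² (not from the slides) whose exact gradient (2x, 6y) is easy to check:

```python
import numpy as np

def numerical_gradient(f, p, h=1e-6):
    """Central-difference estimate of the gradient of f at point p."""
    p = np.asarray(p, dtype=float)
    grad = np.zeros_like(p)
    for i in range(p.size):
        e = np.zeros_like(p)
        e[i] = h
        grad[i] = (f(p + e) - f(p - e)) / (2 * h)
    return grad

# Hypothetical example: f(x, y) = x**2 + 3*y**2, exact gradient (2x, 6y)
f = lambda p: p[0]**2 + 3 * p[1]**2
g = numerical_gradient(f, [1.0, 2.0])   # approximately (2, 12)
```

Moving from (1, 2) in the direction of g increases f fastest; moving against it is one gradient-descent step.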
Gradient Descent Method Examples
- Minimize a function
Gradient Descent Method Examples
- Use function gd(alpha, x0)
  - Does gd.m converge to a local min?
  - Is there a difference if alpha > 0 vs. alpha < 0?
  - How many iterations does it take to converge to a local min?
  - How does the starting point x0 affect the number of iterations?
- Use function gd2(x0)
  - Does gd2.m converge to a local min?
  - How does the starting point x0 affect the number of iterations and the location of the local minimum?
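The objective inside gd.m is not shown here, so the following is only a sketch of the gd(alpha, x0) idea in Python, assuming a hypothetical 1-D objective f(x) = x² with derivative f'(x) = 2x:

```python
def gd(alpha, x0, max_iter=1000, tol=1e-8):
    """Gradient-descent sketch in the spirit of gd(alpha, x0).
    Assumes the hypothetical objective f(x) = x**2, so f'(x) = 2*x."""
    grad = lambda x: 2 * x          # derivative of the assumed objective
    x = x0
    for k in range(max_iter):
        step = alpha * grad(x)      # step downhill, scaled by alpha
        if abs(step) < tol:         # stop once steps become negligible
            break
        x = x - step
    return x, k + 1

x_min, iters = gd(0.1, 5.0)   # converges toward the minimizer x = 0
```

With alpha > 0 (and small enough) the iterates move downhill and converge; with alpha < 0 each step moves uphill, so the iteration diverges. A larger |x0| adds only a modest number of extra iterations because the error shrinks geometrically.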
How good are the optimization methods?
- Dependence on the starting point
- Convergence to a global min/max
- Classes of nice optimization problems
  - Example: f(x,y) = 0.5α(x² + y²), α > 0 — every local min is a global min
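For the convex example above, every local min is the global min, so gradient descent reaches the same minimizer from any starting point. A small sketch (with α = 1 assumed, gradient α·(x, y)):

```python
import numpy as np

def descend(x0, a=1.0, lr=0.1, iters=200):
    """Gradient descent on f(x, y) = 0.5*a*(x**2 + y**2), a > 0,
    whose gradient is a*(x, y); the unique minimizer is (0, 0)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - lr * a * x
    return x

# Different starting points all converge to the same global minimizer.
for start in ([3.0, -4.0], [-7.0, 2.0]):
    assert np.linalg.norm(descend(start)) < 1e-6
```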
Other optimization methods
- Non-smooth, non-differentiable surfaces: we cannot compute the gradient of f, so we cannot use the Gradient Method
- Nelder-Mead Method
- Others
Convex Hull
- A set C is convex if, for any two points x and y in C, every point on the line segment connecting x and y is also in C.
- The convex hull of a set of points X is the minimal convex set containing X.
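The convexity definition can be checked directly: points on the segment from x to y are (1−t)x + ty for t in [0, 1]. A small sketch using the unit disk (an illustrative convex set, not from the slides):

```python
import numpy as np

def on_segment(x, y, t):
    """Point at parameter t in [0, 1] on the segment from x to y."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return (1 - t) * x + t * y

# The unit disk C = {p : |p| <= 1} is convex: for x, y in C,
# every point on the segment between them stays inside C.
x, y = np.array([1.0, 0.0]), np.array([0.0, 1.0])
assert all(np.linalg.norm(on_segment(x, y, t)) <= 1.0
           for t in np.linspace(0, 1, 11))
```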
Simplex
- A simplex, or n-simplex, is the convex hull of a set of (n+1) affinely independent points.
- A simplex is an n-dimensional analogue of a triangle.
- Examples:
  - a 1-simplex is a line segment
  - a 2-simplex is a triangle
  - a 3-simplex is a tetrahedron
  - a 4-simplex is a pentatope
Nelder-Mead Method
- n = number of variables; n+1 points form a simplex (the convex hull of these points)
- Move in a direction away from the worst of these points: reflect, expand, contract, or shrink
- Example:
  - 2 variables: 3 points, the simplex is a triangle
  - 3 variables: 4 points, the simplex is a tetrahedron
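The reflect/expand/contract/shrink steps above can be sketched as follows; this is a simplified textbook variant (fixed coefficients, no convergence test), not the exact method from the course, and the quadratic test function is an assumption:

```python
import numpy as np

def nelder_mead(f, x0, step=1.0, iters=200):
    """Minimal Nelder-Mead sketch: n+1 points form a simplex and the
    worst vertex is reflected, expanded, or contracted; if all fail,
    the whole simplex shrinks toward the best vertex."""
    n = len(x0)
    # Initial simplex: x0 plus one offset point along each axis.
    simplex = [np.asarray(x0, float)]
    for i in range(n):
        v = np.asarray(x0, float)
        v[i] += step
        simplex.append(v)
    for _ in range(iters):
        simplex.sort(key=f)                       # best first, worst last
        centroid = np.mean(simplex[:-1], axis=0)  # centroid without worst
        worst = simplex[-1]
        refl = centroid + (centroid - worst)      # reflect worst point
        if f(refl) < f(simplex[0]):
            exp = centroid + 2 * (centroid - worst)   # try expanding
            simplex[-1] = exp if f(exp) < f(refl) else refl
        elif f(refl) < f(simplex[-2]):
            simplex[-1] = refl
        else:
            contr = centroid + 0.5 * (worst - centroid)  # contract
            if f(contr) < f(worst):
                simplex[-1] = contr
            else:                                  # shrink toward best
                simplex = [simplex[0]] + [simplex[0] + 0.5 * (v - simplex[0])
                                          for v in simplex[1:]]
    simplex.sort(key=f)
    return simplex[0]

# No gradient is needed, which is why Nelder-Mead works on
# non-differentiable surfaces; here a smooth test function for checking.
best = nelder_mead(lambda p: (p[0] - 1)**2 + (p[1] + 2)**2, [0.0, 0.0])
```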