Exam 1: Oct 3, closed book. Place: ITE 119, Time: 12:30-1:45pm

Presentation transcript:

Exam 1: Oct 3, closed book. Place: ITE 119, Time: 12:30-1:45pm. One double-sided cheat sheet (8.5in x 11in) allowed. Bring your calculator to the exam. Covers Chapters 1-11: Error Analysis, Taylor Series, Roots of Equations, Linear Systems. Numerical Methods, Lecture 8, Prof. Jinbo Bi, CSE, UConn

Today's class: Optimization, multi-dimensional unconstrained problems. Numerical Methods, Lecture 10, Prof. Jinbo Bi, CSE, UConn


Multi-dimensional unconstrained problems. Given a function f(x1, x2, x3, ..., xn), find the set of values that minimizes or maximizes the function. Solution techniques fall into two families: direct (non-gradient) methods, and gradient (descent/ascent) methods.

Random Search. A brute-force search using randomly selected inputs: given enough samples, the optimum will eventually be found, so the method is guaranteed to converge to the global minimum (in probability). However, it is very inefficient in terms of convergence.
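The idea above can be sketched in a few lines. This is a minimal illustration, not the lecture's code; the test function and bounds are made up for the example.

```python
import random

def random_search(f, bounds, n_samples=20000, seed=0):
    """Brute-force random search: sample points uniformly within the
    given bounds and keep the best (lowest) value seen so far."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(n_samples):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        fx = f(x)
        if fx < best_f:          # minimizing; negate f to maximize instead
            best_x, best_f = x, fx
    return best_x, best_f

# Illustrative function with its minimum at (1, -2):
f = lambda v: (v[0] - 1) ** 2 + (v[1] + 2) ** 2
x, fx = random_search(f, bounds=[(-5, 5), (-5, 5)])
```

With 20,000 samples the best point lands near the true minimum, but note how many evaluations that takes compared to the gradient methods later in the lecture.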


Univariate search. Change just one variable at a time: iteratively optimize each dimension in turn until you arrive at an optimum. Any one-dimensional search method can be used to carry out each line search and improve the approximation.
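The cycle-through-the-coordinates idea can be sketched as follows. The 1-D line search here uses a single parabolic-interpolation step, which is exact when f is quadratic; the coupled quadratic test function is illustrative, not from the lecture.

```python
def parabolic_argmin(g, t0, h=1.0):
    """One parabolic-interpolation step for a 1-D function g:
    fit a parabola through g(t0-h), g(t0), g(t0+h) and jump to its
    vertex. Exact when g is quadratic."""
    gm, g0, gp = g(t0 - h), g(t0), g(t0 + h)
    denom = gm - 2.0 * g0 + gp
    return t0 if denom == 0.0 else t0 + 0.5 * h * (gm - gp) / denom

def univariate_search(f, x0, n_cycles=25):
    """Cyclic coordinate search: minimize f along one coordinate axis
    at a time, holding the other coordinates fixed."""
    x = list(x0)
    for _ in range(n_cycles):
        for i in range(len(x)):
            def g(t, i=i):
                xt = x[:]          # vary only coordinate i
                xt[i] = t
                return f(xt)
            x[i] = parabolic_argmin(g, x[i])
    return x

# Illustrative coupled quadratic with its minimum at (0, 0):
f = lambda v: v[0] ** 2 + v[1] ** 2 + v[0] * v[1]
x = univariate_search(f, [2.0, 2.0])
```

Because the coordinates are coupled (the xy term), each cycle only shrinks the error by a constant factor; this is exactly the zig-zag inefficiency the next slides discuss.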

Univariate search. It is not efficient to search along a narrow ridge: each axis-aligned step makes little progress. Why not move directly from p1 to p3 or p5, or from p2 to p4 or p6?

Univariate search. Changing one variable at a time can be inefficient, especially along a narrow ridge toward the optimum. Pattern directions (conjugate vectors) can instead guide the search toward the optimum; the best-known such method is Powell's method.

Powell's method. The key observation: if p1 and p2 are obtained by one-dimensional searches in the same direction but from different starting points, then the direction defined by these two points is directed toward the maximum. This new direction is called a conjugate direction. For a quadratic function, searching along conjugate directions provably converges in a finite number of steps regardless of the starting point. Since nonlinear functions can often be reasonably approximated by a quadratic, conjugate-direction methods are usually efficient and can reach a quadratic convergence rate.

Powell's method. Each iteration maintains two directions and searches along them sequentially; this is how conjugate directions are found. Start from p0: search along h1 to get p1, then along h2 to get p2. Connect p0 and p2 to form a new direction (which may or may not be conjugate yet) and search along it to get p3. In the next iteration, keep the two directions h2 and h3: search along h2 from p3 to reach p4, then along h3 from p4 to reach p5. Connecting p3 and p5 gives h4, which is conjugate to h3.
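In practice Powell's direction-set method is available off the shelf, e.g. in SciPy. The quadratic below is an illustrative stand-in (the lecture's own example function is not reproduced in this transcript); the starting point (5.5, 2) echoes the example that follows.

```python
from scipy.optimize import minimize

# Illustrative quadratic; its minimum is at (0, 2):
# grad = (2(x-1) + y, 2(y-2) + x) = 0  =>  x = 0, y = 2.
def f(v):
    x, y = v
    return (x - 1.0) ** 2 + (y - 2.0) ** 2 + x * y

# method="Powell" is SciPy's derivative-free direction-set method:
# it builds conjugate directions from successive 1-D line searches.
res = minimize(f, x0=[5.5, 2.0], method="Powell")
```

Note that no gradients are supplied: Powell's method only ever evaluates f, which is why it suits functions whose derivatives are unavailable or expensive.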

Powell's method. Example: find the minimum of a given f(x, y). Start with the initial point X0 = (5.5, 2) and the initial vectors (1, 0) and (0, 1). Find the minimum along the (1, 0) vector: minimum at γ1 = π - 5.5 (point 1).

Powell's method. Find the minimum along the (0, 1) vector (point 2).

Powell's method. Now minimize along the P2 - P0 vector (h3): minimum at γ = 0.9817 (point 3).

Powell's method. Find the minimum along the U1 = (0, 1) vector: minimum at γ1 = 0.0497 (point 4).

Powell's method. Find the minimum along the U2 = (-2.3584, 2.7124) vector (h4): minimum at γ2 = 0.00788 (point 5).

Powell's method. Now minimize along the U2 vector: minimum at γ = 0.8035684.

Gradient Methods. Given a starting point, use the gradient to decide which direction to proceed: the gradient points in the direction of steepest ascent from the current position.


Gradient Methods. The slope at a point (a, b) along a unit direction h is the directional derivative ∇f(a, b) · h. Maximizing this slope over all directions h shows that the steepest-ascent direction is the gradient ∇f = (∂f/∂x, ∂f/∂y).

Gradient Methods. Example: steepest ascent of f(x, y) = xy² at (2, 2). The gradient is ∇f = (y², 2xy), so at (2, 2) the steepest-ascent direction is (4, 8).
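The gradient computation for this example can be checked directly (the helper name grad_f is just for illustration):

```python
def grad_f(x, y):
    # Analytic gradient of f(x, y) = x * y**2:
    # df/dx = y**2, df/dy = 2*x*y
    return (y ** 2, 2.0 * x * y)

g = grad_f(2.0, 2.0)   # steepest-ascent direction at (2, 2) is (4.0, 8.0)
```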


Gradient Methods. The second derivative tells you whether you have reached a minimum or a maximum, but in multiple dimensions it is a little trickier.

Gradient Methods. A saddle point: the second partial derivative of f with respect to x is positive, so along the x dimension the point looks like a minimum. The second partial with respect to y is also positive, so along the y dimension it also looks like a minimum. It is tempting to conclude the point is a minimum, but along the direction defined by y = x it is actually a maximum. Checking each dimension separately is not enough.

Gradient Methods. Compute the Hessian determinant |H| = fxx·fyy - fxy². If |H| > 0 and fxx > 0, the point is a local minimum; if |H| > 0 and fxx < 0, a local maximum; if |H| < 0, a saddle point.
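This classification rule is simple enough to encode directly. The function name is illustrative; the first check below uses the second partials of the upcoming example f(x, y) = 2xy + 2x - x² - 2y², whose Hessian entries are fxx = -2, fyy = -4, fxy = 2.

```python
def classify_stationary_point(fxx, fyy, fxy):
    """Classify a stationary point of f(x, y) from its second partials
    using the Hessian determinant |H| = fxx*fyy - fxy**2."""
    det = fxx * fyy - fxy ** 2
    if det < 0:
        return "saddle point"
    if det > 0:
        return "minimum" if fxx > 0 else "maximum"
    return "inconclusive"   # |H| = 0: the test gives no answer

# f = 2xy + 2x - x^2 - 2y^2: |H| = (-2)(-4) - 2^2 = 4 > 0, fxx < 0
kind = classify_stationary_point(-2.0, -4.0, 2.0)   # "maximum"
```

This confirms that the stationary point of the example function in the next slide is a maximum, so steepest ascent is the right tool there.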


Gradient Methods. Example: maximize f(x, y) = 2xy + 2x - x² - 2y², starting from the initial solution (-1, 1).

Gradient Methods. The gradient is ∇f = (2y + 2 - 2x, 2x - 4y), so ∇f(-1, 1) = (6, -6). Find the maximum along the line (-1, 1) + γ(6, -6): substituting gives g(γ) = f(-1 + 6γ, 1 - 6γ) = -180γ² + 72γ - 7. Setting g'(γ) = 72 - 360γ = 0 gives γ = 0.2, so the new point is (-1, 1) + 0.2·(6, -6) = (0.2, -0.2).
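Repeating this compute-gradient-then-line-search step gives the full steepest-ascent iteration. A minimal sketch for this example: the 1-D line search uses a parabolic fit, which is exact here because f is quadratic along any line, and the iteration count is illustrative.

```python
import math

def f(x, y):
    return 2*x*y + 2*x - x**2 - 2*y**2

def grad(x, y):
    return (2*y + 2 - 2*x, 2*x - 4*y)

def line_max(g1d, h=1.0):
    """Vertex of the parabola through g1d(-h), g1d(0), g1d(h);
    exact when g1d is quadratic, as it is for this f."""
    gm, g0, gp = g1d(-h), g1d(0.0), g1d(h)
    denom = gm - 2.0 * g0 + gp
    return 0.0 if denom == 0.0 else 0.5 * h * (gm - gp) / denom

def steepest_ascent(x, y, n_iters=40):
    for _ in range(n_iters):
        gx, gy = grad(x, y)
        norm = math.hypot(gx, gy)
        if norm < 1e-12:                      # gradient ~ 0: stationary point
            break
        ux, uy = gx / norm, gy / norm         # unit steepest-ascent direction
        t = line_max(lambda s: f(x + s*ux, y + s*uy))
        x, y = x + t*ux, y + t*uy             # exact line search step
    return x, y

x, y = steepest_ascent(-1.0, 1.0)   # heads toward the maximum at (2, 1)
```

The first step reproduces the hand calculation above (it lands on (0.2, -0.2)); the analytic maximum, from setting ∇f = 0, is (x, y) = (2, 1).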

Next class: Constrained Optimization. Read Chapter 15.