1 Chapter 6: General Strategy for Gradient Methods
(1) Calculate a search direction.
(2) Select a step length in that direction that reduces f(x).

Steepest Descent
Search direction: s^k = -∇f(x^k); the direction need not be normalized.
The method terminates at any stationary point, i.e., wherever ∇f(x) = 0. Why? Because the search direction itself vanishes there, so the iterates stop even at a saddle point or a local maximum. A sketch of the scheme follows below.
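
To make the two-step strategy concrete, here is a minimal Python sketch (NumPy assumed; the names steepest_descent and backtracking, the test function, and the tolerances are illustrative choices, not from the slides):

```python
import numpy as np

def backtracking(f, x, s, alpha=1.0, shrink=0.5, max_halvings=30):
    """Coarse step-length rule: halve alpha until f actually decreases."""
    for _ in range(max_halvings):
        if f(x + alpha * s) < f(x):
            break
        alpha *= shrink
    return alpha

def steepest_descent(f, grad, x0, tol=1e-6, max_iter=500):
    """(1) direction s^k = -grad f(x^k); (2) step length chosen to reduce f."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # stationary point: s^k vanishes, method stops
            break
        s = -g                        # steepest-descent direction (not normalized)
        x = x + backtracking(f, x, s) * s
    return x

# Example: f(x) = x1^2 + 4*x2^2, minimum at the origin
f = lambda x: x[0]**2 + 4 * x[1]**2
grad = lambda x: np.array([2 * x[0], 8 * x[1]])
print(steepest_descent(f, grad, [3.0, 2.0]))   # approaches [0, 0]
```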

4 Numerical Method for the Step Length α
Use a coarse search first: fixed α (α = 1) or variable α (α = 1, 2, ½, etc.).
Options for optimizing α:
(1) Interpolation (quadratic, cubic)
(2) Region elimination (golden section search)
(3) Newton, secant, quasi-Newton methods
(4) Random search
(5) Analytical optimization
Options (1), (3), and (5) are preferred. However, it may not be desirable to optimize α exactly; it is often better to spend the effort generating new search directions. A golden-section sketch is given below.
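
As an illustration of option (2), a minimal golden-section search over a bracketing interval (a sketch; the bracket [a, b] and tolerance are assumed inputs, and for clarity the code re-evaluates the objective rather than caching values):

```python
import math

def golden_section(phi, a, b, tol=1e-6):
    """Region elimination: shrink [a, b] around the minimizer of phi(alpha)."""
    invphi = (math.sqrt(5) - 1) / 2        # 1/golden ratio, about 0.618
    while (b - a) > tol:
        c = b - invphi * (b - a)           # interior points at the golden sections
        d = a + invphi * (b - a)
        if phi(c) < phi(d):
            b = d                          # minimizer lies in [a, d]
        else:
            a = c                          # minimizer lies in [c, b]
    return (a + b) / 2

# Example: phi(alpha) = f(x + alpha*s) collapsed to one dimension (illustrative)
phi = lambda alpha: (alpha - 0.3)**2 + 1.0
print(golden_section(phi, 0.0, 1.0))       # about 0.3
```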

7 Conjugate Search Directions
An improvement over the gradient method for general quadratic functions, and the basis for many NLP techniques.
Two search directions sⁱ and sʲ are conjugate relative to Q if (sⁱ)ᵀ Q sʲ = 0.
To minimize f(x), with x an n×1 vector, when the Hessian H is a constant matrix (= Q), you are guaranteed to reach the optimum in n conjugate-direction stages if you minimize exactly at each stage (one-dimensional search). A numerical check of both properties is sketched below.
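
A small numerical check of both claims (a sketch, assuming NumPy; the 2×2 matrix Q and vector b are illustrative): the second direction is built by Q-orthogonalizing a coordinate vector against the first, and two exact one-dimensional minimizations then land on the optimum.

```python
import numpy as np

# Quadratic objective f(x) = 0.5*x^T Q x - b^T x, so H = Q is constant
Q = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])
grad = lambda x: Q @ x - b

# Build a pair of Q-conjugate directions: (s0)^T Q s1 = 0
s0 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])
s1 = e2 - ((s0 @ Q @ e2) / (s0 @ Q @ s0)) * s0
print("conjugacy check (should be ~0):", s0 @ Q @ s1)

# Minimize exactly along each direction: n = 2 stages reach the optimum
x = np.zeros(2)
for s in (s0, s1):
    alpha = -(s @ grad(x)) / (s @ Q @ s)   # exact 1-D minimizer for a quadratic
    x = x + alpha * s
print("after 2 stages:", x)                 # matches np.linalg.solve(Q, b)
print("exact optimum :", np.linalg.solve(Q, b))
```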

9 Conjugate Gradient Method (Fletcher-Reeves)
Step 1. At x⁰ calculate f(x⁰). Let s⁰ = -∇f(x⁰).
Step 2. Compute x¹ = x⁰ + α⁰s⁰ by minimizing f(x) with respect to α in the s⁰ direction (i.e., carry out a unidimensional search for α⁰).
Step 3. Calculate ∇f(x¹). The new search direction is a linear combination of s⁰ and the new gradient:
  s¹ = -∇f(x¹) + s⁰ [∇f(x¹)ᵀ ∇f(x¹)] / [∇f(x⁰)ᵀ ∇f(x⁰)]
For the kth iteration the relation is
  s^(k+1) = -∇f(x^(k+1)) + s^k [∇f(x^(k+1))ᵀ ∇f(x^(k+1))] / [∇f(x^k)ᵀ ∇f(x^k)]    (6.6)
For a quadratic function it can be shown that these successive search directions are conjugate. After n iterations (k = n), the quadratic function is minimized. For a nonquadratic function, the procedure cycles again with x^(n+1) becoming x⁰.
Step 4. Test for convergence to the minimum of f(x). If convergence is not attained, return to step 3.
Step n. Terminate the algorithm when ‖∇f(x^k)‖ is less than some prescribed tolerance. A Python sketch of the cycle follows below.
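
A hedged sketch of the algorithm (NumPy and SciPy assumed; scipy.optimize.minimize_scalar stands in for the unidimensional search of step 2, and the n_restart parameter plays the role of n in the restart rule):

```python
import numpy as np
from scipy.optimize import minimize_scalar

def conjugate_gradient_fr(f, grad, x0, n_restart, tol=1e-6, max_cycles=50):
    """Fletcher-Reeves conjugate gradient using the direction update of Eq. (6.6)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_cycles):
        g = grad(x)                   # each cycle restarts from steepest descent,
        s = -g                        # with the current x playing the role of x^0
        for _ in range(n_restart):
            if np.linalg.norm(g) < tol:
                return x              # Step n: gradient below tolerance
            # Step 2: unidimensional search for alpha along s
            alpha = minimize_scalar(lambda a: f(x + a * s)).x
            x = x + alpha * s
            g_new = grad(x)
            beta = (g_new @ g_new) / (g @ g)   # ratio in Eq. (6.6)
            s = -g_new + beta * s              # Step 3: next search direction
            g = g_new
    return x

# Example: the quadratic from the conjugacy sketch; n = 2 iterations minimize it
Q = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
f = lambda x: 0.5 * (x @ Q @ x) - b @ x
grad = lambda x: Q @ x - b
print(conjugate_gradient_fr(f, grad, np.zeros(2), n_restart=2))  # ~ [0.0909, 0.6364]
```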