L25 Numerical Methods part 5: Project Questions, Homework Review, Tips and Tricks, Summary


H24 [homework figures not transcribed]

10.57 revisited


Conjugate Gradient: Proof

“Deflected” Steepest Descent

A comparison of the convergence of gradient descent with optimal step size (in green) and conjugate vector (in red) for minimizing a quadratic function associated with a given linear system. Conjugate gradient, assuming exact arithmetic, converges in at most n steps, where n is the size of the matrix of the system (here n = 2). (Figure: Wikipedia, “Conjugate gradient method”)
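The comparison described above can be reproduced with a short sketch (illustrative only; the matrix A, vector b, and starting point are made-up values for a small symmetric positive definite system):

```python
import numpy as np

def steepest_descent(A, b, x0, tol=1e-10, max_iter=1000):
    """Gradient descent with the optimal (exact line-search) step for
    f(x) = 1/2 x^T A x - b^T x, whose gradient is A x - b."""
    x = x0.astype(float).copy()
    for k in range(max_iter):
        r = b - A @ x                        # negative gradient (residual)
        if np.linalg.norm(r) < tol:
            return x, k
        alpha = (r @ r) / (r @ (A @ r))      # exact minimizing step length
        x = x + alpha * r
    return x, max_iter

def conjugate_gradient(A, b, x0, tol=1e-10, max_iter=1000):
    """Linear conjugate gradient: successive search directions are A-conjugate."""
    x = x0.astype(float).copy()
    r = b - A @ x
    d = r.copy()
    for k in range(max_iter):
        if np.linalg.norm(r) < tol:
            return x, k
        Ad = A @ d
        alpha = (r @ r) / (d @ Ad)
        x = x + alpha * d
        r_new = r - alpha * Ad
        beta = (r_new @ r_new) / (r @ r)     # Fletcher-Reeves update
        d = r_new + beta * d
        r = r_new
    return x, max_iter

A = np.array([[3.0, 2.0], [2.0, 6.0]])       # symmetric positive definite, n = 2
b = np.array([2.0, -8.0])
x0 = np.zeros(2)

x_sd, k_sd = steepest_descent(A, b, x0)      # zigzags: many iterations
x_cg, k_cg = conjugate_gradient(A, b, x0)    # at most n = 2 steps (exact arithmetic)
```

Both solvers reach the same minimizer, but the conjugate-gradient iteration count stays at n while steepest descent takes many more steps on this ill-conditioned quadratic.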

Higher Order Methods

Optimization Project
- Formulating
- Computer Modeling
- Solving/executing
- Evaluating (your “solution”)
- Analyzing the sensitivity of your solution

1. Tips: Formulating
- Functional requirements (HoQ)
- Engineering characteristics (i.e., quantifiable measures)
- Identify design variables: names, symbols, units, limits
- Develop the objective function
- Retrieve or develop analytical formulas/models
- Develop constraints (laws of nature, man & economics)
- “Principle of Optimum Sloppiness”: how many significant figures?

2. Tips: Computer Modeling
- Pre-test custom-written code
- Hand-check (w/ calculator): f(x), g(x), h(x) at some x(1)
- Eliminate ratios, if possible (to avoid divide by zero)
- Eliminate non-differentiable functions (such as abs(), max())
- Check analytical derivatives w/ FD derivatives
- Exploit available library routines
- Scale variables and/or constraints if difficulties arise
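The derivative-check tip can be sketched as follows (a minimal illustration; the objective f and its hand-derived gradient here are hypothetical stand-ins for your own model):

```python
import numpy as np

def fd_gradient(f, x, h=1e-6):
    """Central-difference approximation of the gradient of f at x."""
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

# Hypothetical objective with a hand-derived analytical gradient.
def f(x):
    return x[0]**2 + 3.0 * x[0] * x[1] + 2.0 * x[1]**2

def grad_f(x):
    return np.array([2.0 * x[0] + 3.0 * x[1],
                     3.0 * x[0] + 4.0 * x[1]])

x_test = np.array([1.5, -0.7])
err = np.linalg.norm(grad_f(x_test) - fd_gradient(f, x_test))
```

If `err` is not small (say, below about 1e-6 for a well-scaled problem), the analytical gradient likely has a sign or term error.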

3. Tips: Solving/Executing
- Test “optimizer” w/ known problems/solutions
- Solve from multiple starting points
- If algorithm fails, monitor each iteration
- Record statistics: constraint values, solutions
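The multiple-starting-points tip can be sketched on a made-up one-dimensional function with two local minima (the function, step size, and starting points are all illustrative):

```python
def f(x):
    return x**4 - 3.0 * x**2 + x          # two local minima, one global

def df(x):
    return 4.0 * x**3 - 6.0 * x + 1.0

def local_descent(x, step=0.01, iters=5000):
    """Crude fixed-step gradient descent to the nearest local minimum."""
    for _ in range(iters):
        x -= step * df(x)
    return x

starts = [-2.0, -1.0, 0.0, 1.0, 2.0]      # multiple starting points
xs = [local_descent(x0) for x0 in starts]
best_x = min(xs, key=f)                   # keep the best local solution found
best_f = f(best_x)
```

Starts on the right of the origin converge to the shallow local minimum near x ≈ 1.13; only the multistart sweep recovers the global minimum near x ≈ -1.30.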

4. Tips: Evaluating the Solution
- Hand-check (calculator): f(x*), g(x*), h(x*)
- Evaluate constraint activity:
  – Violated
  – Non-binding/inactive
  – Binding/active
- Do results make physical sense?
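Constraint activity can be classified mechanically. A minimal sketch, assuming the common g(x) ≤ 0 feasibility convention and a small tolerance (both assumptions, not from the slides):

```python
def classify_constraint(g_value, tol=1e-6):
    """Classify an inequality constraint g(x) <= 0 at a candidate point x*.
    Assumed convention: the point is feasible when g(x*) <= 0."""
    if g_value > tol:
        return "violated"          # constraint not satisfied
    if g_value >= -tol:
        return "binding/active"    # satisfied with equality (within tol)
    return "non-binding/inactive"  # satisfied with slack
```

Running each g(x*) through such a check makes the activity table for your report quick to build and hard to get wrong.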

5. Tips: Analyzing the Sensitivity
- Relax the R.H.S.; record Δf(x) for Δx
- Change cost coefficients in f(x)
- Vary parameters in g(x), h(x)
- Remember the A, b, and c’s
- Look for opportunity!

Test 5 on Wed
- T/F
- Region elimination methods
- Steepest descent algorithm
- Conjugate gradient algorithm
- Be prepared to do hand calculations.

- Alternate Equal Interval
- Golden Section
- Equal Interval, aka “Exhaustive”
- Fractional Reduction

Add these formulas to your notes for the next test!
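Golden-section search, one of the region elimination methods listed above, can be sketched as follows (illustrative only; the test function and bracket are made up, and each iteration shrinks the interval by 1/phi ≈ 0.618):

```python
import math

def golden_section(f, a, b, tol=1e-8):
    """Golden-section search for the minimum of a unimodal f on [a, b]."""
    inv_phi = (math.sqrt(5.0) - 1.0) / 2.0   # 1/phi ≈ 0.6180339887
    x1 = b - inv_phi * (b - a)               # interior points split the
    x2 = a + inv_phi * (b - a)               # bracket in the golden ratio
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:                          # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - inv_phi * (b - a)
            f1 = f(x1)
        else:                                # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + inv_phi * (b - a)
            f2 = f(x2)
    return 0.5 * (a + b)

# Made-up unimodal example: minimum at x = 2.
x_min = golden_section(lambda x: (x - 2.0)**2 + 1.0, 0.0, 5.0)
```

Only one new function evaluation is needed per iteration, because one interior point is always reused, which is the method's advantage over plain equal-interval search.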

Summary
- The steepest descent algorithm may stall
- Conjugate Gradient:
  – Converges in n iterations (n = # of design variables)
  – Still requires many function evaluations (in the line search)
  – May need a restart after n+1 iterations
- Use the “TIPS” to facilitate your project