L21 Numerical Methods part 1
Homework Review
Search problem
Line Search methods
Summary
Test 4 Wed.

Problem

H20, cont'd

a. Increase the cost by $0.16: fnew = $53,238, an increase of $838.
b. Reduce mill A capacity to 200 logs/day: changes nothing.
c. Reduce mill B capacity to 270 logs/day: increases cost by $750; the new optimal solution is x1 = 0, x2 = 30, x3 = 200, x4 = 70.

H20, cont'd

Sensitivity Analyses
How sensitive are (a) the optimal value (i.e., f(x*)) and (b) the optimal solution (i.e., x*) to the parameters (i.e., the assumptions) in our model?

Model parameters
Consider your Abc's, i.e., the constraint matrix A, the right-hand side b, and the cost coefficients c.

Simplex Lagrange Multipliers
Constraint type: ≤ / = / ≥
Variable added: slack / either / surplus
c′ column: "regular" / artificial
Find the multipliers in the final tableau (right side).

Let's minimize f even further
Increase or decrease the constraint limits e_i to reduce f(x).
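The equation behind this slide is only in the image; a hedged statement of the constraint variation sensitivity relation it appears to rely on (assuming constraints written as g_i(x) ≤ e_i and the sign convention used elsewhere in this course) is

$$\frac{\partial f^*}{\partial e_i} = -\lambda_i, \qquad \Delta f^* \approx -\sum_i \lambda_i\,\Delta e_i,$$

so relaxing a binding constraint (increasing e_i when its multiplier is positive) reduces the optimal cost.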

Is there more to optimization?
Simplex is great, but many problems are nonlinear, many of these cannot be "linearized," and so we need other methods!

General Optimization Algorithms
Subproblem A: which direction to head next?
Subproblem B: how far to go in that direction?
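A minimal Python sketch of this two-subproblem loop (the function names and the simple backtracking rule are my own illustration, not the course's prescribed method):

```python
import numpy as np

def search_direction(grad):
    # Subproblem A: pick a descent direction; here, steepest descent d = -grad f.
    return -grad

def step_size(f, x, d, alpha=1.0, shrink=0.5, max_tries=30):
    # Subproblem B: pick how far to move along d; shrink alpha until f decreases.
    fx = f(x)
    for _ in range(max_tries):
        if f(x + alpha * d) < fx:
            return alpha
        alpha *= shrink
    return alpha

def minimize(f, grad, x0, tol=1e-6, max_iter=200):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # stop when the gradient is (nearly) zero
            break
        d = search_direction(g)       # Subproblem A
        alpha = step_size(f, x, d)    # Subproblem B
        x = x + alpha * d             # take the step
    return x

# Example: minimize f(x) = x1^2 + x2^2 starting from (1, 1); converges toward (0, 0).
f = lambda x: x[0]**2 + x[1]**2
grad = lambda x: np.array([2.0 * x[0], 2.0 * x[1]])
print(minimize(f, grad, [1.0, 1.0]))
```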

Magnitude and direction
Let u be a unit vector (length 1) parallel to a.
α = magnitude, or step size (a scalar); the unit vector u = direction (a vector).
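In the notation used in the rest of these slides (x for the current design, d for the search direction), the update this slide describes can be written as

$$\mathbf{x}^{(k+1)} = \mathbf{x}^{(k)} + \alpha_k\,\mathbf{d}^{(k)},$$

where α_k is the step size chosen along d at iteration k.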

Figure 10.2: Conceptual diagram for iterative steps of an optimization method. (We are here; which direction should we head?)

Minimize f(x): let's go downhill!
Descent condition: the directional derivative of f along d (a scalar) must be negative.
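Written out (the slide shows this only in the image; c denotes the gradient of f at the current point):

$$\mathbf{c}\cdot\mathbf{d} = \nabla f(\mathbf{x}^{(k)})^{T}\mathbf{d}^{(k)} < 0.$$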

Dot Product
At what angle does the dot product become most negative? Maximum descent...
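Using the standard identity (spelled out here, since the slide only shows it graphically):

$$\mathbf{c}\cdot\mathbf{d} = \|\mathbf{c}\|\,\|\mathbf{d}\|\cos\theta,$$

which is most negative at θ = 180°, i.e., when d points exactly opposite the gradient: the steepest-descent direction d = -∇f(x).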

Desirable direction
Any d satisfying the descent condition c·d < 0 is a desirable direction: descent (for a small enough step) is guaranteed!

Example: using the "descent condition"
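The slide's own worked example lives in the image; a small illustrative check in the same spirit (my numbers, not the slide's):

$$f(\mathbf{x}) = x_1^2 + x_2^2,\quad \mathbf{x} = (1, 1),\quad \nabla f = (2, 2):\qquad \mathbf{d} = (-1, -2)\ \Rightarrow\ \nabla f\cdot\mathbf{d} = -6 < 0 \ (\text{descent});\qquad \mathbf{d} = (1, 0)\ \Rightarrow\ \nabla f\cdot\mathbf{d} = 2 > 0 \ (\text{not descent}).$$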

Step size?
How big should we make α? Can we step too "far," i.e., can our step size be chosen so big that we step over the minimum?

Figure 10.5: Nonunimodal function f(α) for 0 ≤ α ≤ ᾱ.
Nonunimodal functions: unimodal if we stay in the locale?

Monotonic Increasing Functions

Monotonic Decreasing Functions (continuous)

Figure 10.4: Unimodal function f(α).
Unimodal functions: monotonic increasing then monotonic decreasing, or monotonic decreasing then monotonic increasing.
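Stated formally for the minimization case (a standard definition, not verbatim from the slide): f is unimodal on [0, ᾱ] if there is an α* such that

$$f(\alpha_1) > f(\alpha_2)\ \ \text{for all}\ \alpha_1 < \alpha_2 \le \alpha^*, \qquad f(\alpha_1) < f(\alpha_2)\ \ \text{for all}\ \alpha^* \le \alpha_1 < \alpha_2.$$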

Some Step Size Methods
"Analytical": search direction = (−) gradient (i.e., a line search); form the line search function f(α); find f′(α) = 0.
Region elimination ("interval reducing"): equal interval, alternate equal interval, Golden Section.

Figure 10.3: Graph of f(α) versus α.
Analytical step size: the slope of the line search function is f′(α) = ∇f(x + αd)·d, and setting it to zero gives the optimal step.

Analytical Step Size Example
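The example itself is on the slide image; an illustrative one in the same spirit (my numbers, not the slide's): minimize f(x) = x1² + x2² from x = (1, 1) along d = -∇f = (-2, -2):

$$f(\alpha) = (1 - 2\alpha)^2 + (1 - 2\alpha)^2, \qquad f'(\alpha) = -8(1 - 2\alpha) = 0 \ \Rightarrow\ \alpha^* = \tfrac{1}{2},$$

which lands exactly on the minimum at x = (0, 0).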

Alternative analytical step size
The new gradient must be orthogonal to d for α to be optimal: ∇f(x + α*d)·d = 0, the same condition as f′(α*) = 0.

Figure 10.6: Equal-interval search process. (a) Phase I: initial bracketing of the minimum (the "bounding phase"). (b) Phase II: reducing the interval of uncertainty (the "interval reduction phase").
"Interval reducing" = region elimination.

2 delta! Once f starts to increase, the minimum is bracketed within an interval of width 2δ.

Successive Equal-Interval Algorithm
Each pass shrinks the "interval" of uncertainty that brackets the minimum.
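The algorithm itself appears only as a figure; a minimal Python sketch of the two-phase idea (bracket with step delta, then restart inside the bracket with a smaller delta), with my own naming and defaults:

```python
def equal_interval_search(f, delta=0.1, shrink=0.1, tol=1e-6, alpha_max=1e6):
    """Locate the minimizer of a unimodal f(alpha), alpha >= 0, to within ~tol."""
    lo = 0.0
    while delta > tol:
        # Phase I: step forward by delta until f starts to increase (bracketing).
        a_prev, a_curr = lo, lo + delta
        while f(a_curr) < f(a_prev) and a_curr < alpha_max:
            a_prev, a_curr = a_curr, a_curr + delta
        # The minimum now lies in an interval of width 2*delta: (a_prev - delta, a_curr).
        lo = max(0.0, a_prev - delta)
        # Phase II: repeat inside the bracket with a smaller step.
        delta *= shrink
    return lo + delta  # a point inside the final, tiny interval of uncertainty

# Example: a unimodal f(alpha) with its minimum at alpha = 1.5
f = lambda a: (a - 1.5) ** 2 + 2.0
print(equal_interval_search(f))   # ~1.5
```

Note that the sketch freely re-evaluates f, which echoes the next slide's point: equal-interval search costs many function evaluations.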

Successive Equal-Interval Search
Very robust; works for continuous and discrete functions; lots of f(x) evaluations!

Figure 10.7: Graphic of an alternate equal-interval solution process.
Alternate equal-interval search.

Which region to reject?
With two interior trial points α_a < α_b in the current interval: if f(α_a) < f(α_b), the minimum cannot lie to the right of α_b, so reject (α_b, upper bound); otherwise reject (lower bound, α_a).

Summary
Sensitivity analyses add value to your solutions.
Sensitivity is as simple as Abc's.
The constraint variation sensitivity theorem can answer simple resource-limit questions.
General optimization algorithms have two subproblems: search direction and step size.
In a local neighborhood, assume unimodal!
The descent condition assures a correct direction.
Step size methods: analytical and region elimination.