
Survey of gradient-based constrained optimization algorithms
Algorithms are selected based on their popularity. Additional details and additional algorithms are in Chapter 5 of Haftka and Gurdal's Elements of Structural Optimization.

Optimization with constraints
Standard formulation. Equality constraints are a challenge, but they are fortunately missing from most engineering design problems, so this lecture deals only with inequality constraints.

Derivative-based optimizers
All are predicated on the assumption that function evaluations are expensive and gradients can be calculated. The situation is similar to a person set down at night on a hill and asked to find the lowest point of an adjacent valley using a flashlight with a limited battery.
Basic strategy:
1. Flash the light to get the derivative and select a direction.
2. Go straight in that direction until you start going up or hit a constraint.
3. Repeat until converged.
Some methods move mostly along the constraint boundaries; others stay mostly in the interior (interior point algorithms).
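For the unconstrained case, the basic strategy above can be sketched in a few lines. This is an illustrative Python sketch (not part of the lecture, whose examples use MATLAB), with a backtracking line search standing in for "go straight until you start going up":

```python
import numpy as np

def steepest_descent(f, grad, x0, tol=1e-6, max_iter=500):
    # "Flashlight" strategy: evaluate the gradient, walk straight
    # downhill until sufficient decrease fails, repeat.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)                        # one "flash" of the light
        gnorm = np.linalg.norm(g)
        if gnorm < tol:                    # converged: ground is flat
            break
        d = -g / gnorm                     # unit downhill direction
        step = 1.0
        # Backtrack: halve the step until f decreases enough (Armijo-type test)
        while f(x + step * d) > f(x) - 0.5 * step * gnorm and step > 1e-12:
            step *= 0.5
        x = x + step * d
    return x

# Quadratic bowl with minimum at (1, 2)
f = lambda x: (x[0] - 1.0) ** 2 + 5.0 * (x[1] - 2.0) ** 2
grad = lambda x: np.array([2.0 * (x[0] - 1.0), 10.0 * (x[1] - 2.0)])
x_opt = steepest_descent(f, grad, [5.0, 5.0])
```

The zigzag path such a method takes on elongated contours is exactly why conjugate-gradient and quasi-Newton variants exist.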

Gradient projection and reduced gradient methods
Find a good direction tangent to the active constraints. Move a distance along it, then restore to the constraint boundaries. This is a typical active-set algorithm, used in Excel's Solver.

Penalty function methods
The quadratic penalty function adds to the objective a penalty parameter r times the squared constraint violations. Gradually raising r gives the sequential unconstrained minimization technique (SUMT). Why is the gradual rise important?
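As a sketch of the idea in Python (hypothetical example, with scipy's BFGS standing in for fminunc), using the standard quadratic exterior penalty phi(x, r) = f(x) + r * max(0, g(x))^2 for a single inequality constraint g(x) <= 0:

```python
import numpy as np
from scipy.optimize import minimize

def sumt(f, g, x0, rs=(1.0, 10.0, 100.0, 1000.0)):
    # Sequential Unconstrained Minimization Technique: minimize the
    # penalized function for a gradually rising penalty parameter r,
    # warm-starting each solve from the previous solution.
    x = np.asarray(x0, dtype=float)
    for r in rs:
        phi = lambda x, r=r: f(x) + r * max(0.0, g(x)) ** 2
        x = minimize(phi, x, method="BFGS").x
    return x

# min x1 + x2  subject to  x1^2 + x2^2 <= 2;  optimum at (-1, -1), f* = -2
f = lambda x: x[0] + x[1]
g = lambda x: x[0] ** 2 + x[1] ** 2 - 2.0
x_star = sumt(f, g, [2.0, 2.0])
```

Jumping straight to the largest r makes the penalized function severely ill-conditioned; the gradual rise with warm starts is what keeps each unconstrained solve easy.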

Example 5.7.1

Contours for r=1.

Contours for r=1000. For non-derivative methods, this ill-conditioning can be avoided by making the penalty proportional to the absolute value of the violation instead of its square!

Problem: penalty
With an extremely robust algorithm, we can find a very accurate solution with a penalty function approach by using a very high r. However, at some high value of r the algorithm will begin to falter, either taking a very large number of iterations or failing to reach the solution. Test fminunc and fminsearch on the example starting from x0=[2,2]. Start with r=1000 and increase.

5.9: Projected Lagrangian methods
Sequential quadratic programming:
1. Find the search direction by solving a quadratic programming subproblem.
2. Find the step length alpha by minimizing along that direction.
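For reference, the direction-finding subproblem in the standard SQP formulation is (the notation here is the generic textbook one, not necessarily the slide's):

```latex
\begin{aligned}
\min_{d}\quad & \nabla f(x_k)^T d \;+\; \tfrac{1}{2}\, d^T B_k d \\
\text{s.t.}\quad & g_j(x_k) + \nabla g_j(x_k)^T d \;\le\; 0, \qquad j = 1,\dots,m,
\end{aligned}
```

where $B_k$ is a quasi-Newton approximation to the Hessian of the Lagrangian. The step length $\alpha$ is then chosen by a line search on a merit (penalty) function along $d$.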

Matlab function fmincon
FMINCON attempts to solve problems of the form:
  min F(X) subject to:
    A*X <= B, Aeq*X = Beq     (linear constraints)
    C(X) <= 0, Ceq(X) = 0     (nonlinear constraints)
    LB <= X <= UB             (bounds)
[X,FVAL,EXITFLAG,OUTPUT,LAMBDA] = FMINCON(FUN,X0,A,B,Aeq,Beq,LB,UB,NONLCON)
The function NONLCON accepts X and returns the vectors C and Ceq, representing the nonlinear inequalities and equalities respectively. (Set LB=[] and/or UB=[] if no bounds exist.)
Possible values of EXITFLAG:
   1  First order optimality conditions satisfied.
   0  Too many function evaluations or iterations.
  -1  Stopped by output/plot function.
  -2  No feasible point found.

Quadratic function and constraint example

function f = quad2(x)
global a
f = x(1)^2 + a*x(2)^2;
end

function [c, ceq] = ring(x)
global ri ro
c(1) = ri^2 - x(1)^2 - x(2)^2;   % stay outside the inner circle (radius ri)
c(2) = x(1)^2 + x(2)^2 - ro^2;   % stay inside the outer circle (radius ro)
ceq = [];
end

% Script (the globals must also be declared here before being set):
global a ri ro
x0 = [1, 10]; a = 10; ri = 10.; ro = 20;
[x, fval] = fmincon(@quad2, x0, [], [], [], [], [], [], @ring)
x =
fval =
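For readers without MATLAB, the same ring problem can be reproduced with scipy's SLSQP (an SQP implementation). This is a sketch, not the lecture's code; note that SLSQP uses the opposite inequality convention to fmincon, expecting c(x) >= 0:

```python
import numpy as np
from scipy.optimize import minimize

a, ri, ro = 10.0, 10.0, 20.0
f = lambda x: x[0] ** 2 + a * x[1] ** 2

# SLSQP expects c(x) >= 0, the opposite sign convention to fmincon's c(x) <= 0
cons = [
    {"type": "ineq", "fun": lambda x: x[0] ** 2 + x[1] ** 2 - ri ** 2},  # outside inner circle
    {"type": "ineq", "fun": lambda x: ro ** 2 - x[0] ** 2 - x[1] ** 2},  # inside outer circle
]
res = minimize(f, [1.0, 10.0], method="SLSQP", constraints=cons)
# Optimum lies on the inner circle at x = (+-10, 0) with f = 100
```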

Fuller output
Optimization completed because the objective function is non-decreasing in feasible directions, to within the default value of the function tolerance, and constraints are satisfied to within the default value of the constraint tolerance.
x =
fval =
flag = 1
output =
      iterations: 6
       funcCount: 22
    lssteplength: 1
        stepsize: e-06
       algorithm: 'medium-scale: SQP, Quasi-Newton, line-search'
   firstorderopt: e-08
 constrviolation: e-11
lambda.ineqnonlin =

Making it harder for fmincon
a = 1.1;
Maximum number of function evaluations exceeded; increase OPTIONS.MaxFunEvals.
x =
fval =
flag = 0
      iterations: 14
       funcCount: 202
    lssteplength: e-04
        stepsize:
       algorithm: 'medium-scale: SQP, Quasi-Newton, line-search'
   firstorderopt:
 constrviolation:

Restart sometimes helps
x0 = x
x0 =
x =
fval =
flag = 1
      iterations: 15
       funcCount: 108
    lssteplength: 1
        stepsize: e-04
       algorithm: 'medium-scale: SQP, Quasi-Newton, line-search'
   firstorderopt: e-07
 constrviolation: e-07

Problem: fmincon
For the ring problem with a=10, ro=20, can you find a starting point within a circle of radius 30 around the origin that will prevent fmincon from finding the optimum? Solution in the notes page.