Engineering Optimization


Engineering Optimization: Concepts and Applications (WB 1440)
Fred van Keulen, Matthijs Langelaar
CLA H21.1, A.vanKeulen@tudelft.nl

Geometrical interpretation
For a single equality constraint there is a simple geometrical interpretation of the Lagrange optimality condition: $\nabla f = -\lambda \nabla h$, i.e. the gradients are parallel, hence the tangents are parallel and h is tangent to the isolines of f (figure: isolines of f and the curve h = 0 in the (x1, x2)-plane). For multiple equality constraints this picture no longer works, because the multipliers define a subspace. Since they can have any sign, there is no interpretation other than the fact that the gradient must lie in this subspace.

Summary
First-order optimality condition for the equality constrained problem $\min f(\mathbf{x})$ s.t. $\mathbf{h}(\mathbf{x}) = \mathbf{0}$: zero reduced gradient. Equivalent: stationary Lagrangian $L = f + \boldsymbol{\lambda}^{\mathrm{T}} \mathbf{h}$, i.e. $\nabla f + \boldsymbol{\lambda}^{\mathrm{T}} \nabla \mathbf{h} = \mathbf{0}$ together with $\mathbf{h} = \mathbf{0}$.
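To make the stationary-Lagrangian condition concrete, here is a minimal symbolic sketch; the objective and constraint are chosen purely for illustration:

```python
# Find stationary points of L = f + lam*h for
#   f(x1, x2) = x1**2 + x2**2,   h(x1, x2) = x1 + x2 - 2 = 0.
import sympy as sp

x1, x2, lam = sp.symbols("x1 x2 lam", real=True)
f = x1**2 + x2**2
h = x1 + x2 - 2
L = f + lam * h

# Stationarity: all partial derivatives of L vanish;
# dL/dlam = 0 simply recovers the constraint h = 0.
eqs = [sp.diff(L, v) for v in (x1, x2, lam)]
print(sp.solve(eqs, (x1, x2, lam), dict=True))
# [{x1: 1, x2: 1, lam: -2}]
```

At the solution the gradients are indeed parallel: $\nabla f = (2, 2)$, $\nabla h = (1, 1)$, with $\lambda = -2$.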

Contents
- Constrained Optimization: Optimality Criteria
  - Reduced gradient
  - Lagrangian
  - Sufficiency conditions
  - Inequality constraints
  - Karush-Kuhn-Tucker (KKT) conditions
  - Interpretation of Lagrange multipliers
- Constrained Optimization: Algorithms

Sufficiency?
Until now, only stationary points have been considered, which does not guarantee a minimum! The Lagrange condition is satisfied at maxima as well as at minima (figures: gradient configurations labelled maximum, minimum, minimum, and no extremum). Using a second-order Taylor approximation with the appropriate Hessian, we can formulate the condition for a minimum.

Constrained Hessian
Sufficiency conditions follow from the second-order Taylor approximation. Second-order information is required: the constrained Hessian $\nabla^2 L = \nabla^2 f + \sum_j \lambda_j \nabla^2 h_j$, obtained by differentiation of the constrained gradient; the second-order constraint perturbation contributes the $\lambda_j \nabla^2 h_j$ terms.

Sufficiency conditions
Via the second-order Taylor approximation it follows that at a minimum the constrained Hessian must be positive definite. The Lagrangian approach yields the same result: $\partial\mathbf{x}^{\mathrm{T}} \nabla^2 L \, \partial\mathbf{x} > 0$ for all perturbations $\partial\mathbf{x}$ with $\nabla\mathbf{h}^{\mathrm{T}} \partial\mathbf{x} = \mathbf{0}$, i.e. the constrained Hessian should be positive definite, with perturbations taken only in the tangent subspace of h!

Summary
Optimality conditions for the equality constrained problem:
1. Necessary condition: stationary point when $\nabla f + \boldsymbol{\lambda}^{\mathrm{T}} \nabla \mathbf{h} = \mathbf{0}$ and $\mathbf{h} = \mathbf{0}$.
2. Sufficient condition: minimum when (1) holds and $\partial\mathbf{x}^{\mathrm{T}} \nabla^2 L \, \partial\mathbf{x} > 0$ on the tangent subspace $\nabla\mathbf{h}^{\mathrm{T}} \partial\mathbf{x} = \mathbf{0}$.
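A minimal numerical sketch of this sufficiency check, assuming the Lagrangian Hessian and the constraint Jacobian at the candidate point are already available (the numbers reuse the illustrative problem above):

```python
# Check positive definiteness of the Lagrangian Hessian restricted to the
# tangent subspace {dx : grad(h)^T dx = 0}.
import numpy as np
from scipy.linalg import null_space

def is_constrained_minimum(hess_L, jac_h, tol=1e-10):
    Z = null_space(jac_h)          # basis of the tangent subspace
    if Z.size == 0:                # no tangent directions left
        return True
    reduced = Z.T @ hess_L @ Z     # reduced (projected) Hessian
    return bool(np.all(np.linalg.eigvalsh(reduced) > tol))

# f = x1^2 + x2^2, h = x1 + x2 - 2 (h is linear, so its curvature term vanishes)
hess_L = np.array([[2.0, 0.0], [0.0, 2.0]])
jac_h = np.array([[1.0, 1.0]])
print(is_constrained_minimum(hess_L, jac_h))   # True: (1, 1) is a minimum
```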

Example
(Figure: isolines of f and the constraint curve h = 0 in the (x1, x2)-plane.)
1. Necessary condition: stationary point when $\nabla f + \lambda \nabla h = \mathbf{0}$ and $h = 0$.

Contents
- Constrained Optimization: Optimality Criteria
  - Reduced gradient
  - Lagrangian
  - Sufficiency conditions
  - Inequality constraints
  - Karush-Kuhn-Tucker (KKT) conditions
  - Interpretation of Lagrange multipliers
- Constrained Optimization: Algorithms

Inequality constrained problems
Consider a problem with only inequality constraints: $\min f(\mathbf{x})$ s.t. $\mathbf{g}(\mathbf{x}) \le \mathbf{0}$. At the optimum, only the active constraints matter (those with $g_j(\mathbf{x}^*) = 0$). The optimality conditions are then similar to those of the equality constrained problem.

Inequality constraints
First-order optimality: consider a feasible local variation $\partial\mathbf{x}$ around a boundary optimum. Feasibility of the perturbation requires $\nabla g_j^{\mathrm{T}} \partial\mathbf{x} \le 0$ for each active constraint, and optimality requires $\nabla f^{\mathrm{T}} \partial\mathbf{x} \ge 0$ for every such feasible perturbation.

Optimality condition
The multipliers must be non-negative: $\mu_j \ge 0$. Interpretation (given in Haftka): the negative gradient $-\nabla f$, a descent direction, lies in the cone spanned by the gradients of the active constraints with positive multipliers (figure: cone of constraint gradients in the (x1, x2)-plane).

Optimality condition (2)
Equivalent interpretation (given in Belegundu): no descent direction ($\nabla f^{\mathrm{T}} \mathbf{s} < 0$) exists within the cone of feasible directions ($\nabla g_j^{\mathrm{T}} \mathbf{s} \le 0$ for the active constraints) (figure: feasible cone bounded by the gradients of g1 and g2). A numerical version of this cone test is sketched below.
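The cone condition can be tested with non-negative least squares: $-\nabla f$ lies in the cone exactly when some $\boldsymbol{\mu} \ge \mathbf{0}$ solves $\sum_j \mu_j \nabla g_j = -\nabla f$. A minimal sketch, with made-up gradient values for illustration:

```python
import numpy as np
from scipy.optimize import nnls

grad_f = np.array([-1.0, -2.0])   # hypothetical gradient of f at x*
grad_g = np.array([[1.0, 0.0],    # gradient of active constraint g1
                   [0.0, 1.0]])   # gradient of active constraint g2

# Solve min ||grad_g^T mu + grad_f|| subject to mu >= 0
mu, residual = nnls(grad_g.T, -grad_f)
print(mu, residual)   # mu = [1, 2], residual ~ 0: -grad f lies in the cone
```

A residual of (numerically) zero means the first-order condition holds; a nonzero residual means a feasible descent direction still exists.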

Examples f f -f f -f -f

Optimality condition (3)
Active constraints: $g_j = 0$, $\mu_j \ge 0$. Inactive constraints: $g_j < 0$, $\mu_j = 0$. Formulation including all inequality constraints: $\nabla f + \boldsymbol{\mu}^{\mathrm{T}} \nabla \mathbf{g} = \mathbf{0}$ with $\boldsymbol{\mu} \ge \mathbf{0}$ and $\mu_j g_j = 0$ (complementarity condition). For regular points, in the non-degenerate case, the multipliers of active constraints are strictly positive. Because all multipliers must be non-negative, and inactive constraints have negative g-values, the multipliers of the inactive constraints must be zero. Since the active constraints have $g_j = 0$, every term of the inner product $\boldsymbol{\mu}^{\mathrm{T}} \mathbf{g}$ vanishes, which is exactly the complementarity condition.

Example
(Figures: a mass m suspended at length L in the (x1, x2)-plane, shown once with the constraint inactive and once with it active.) The same multiplier construction is also used in finite element contact algorithms.

Mechanical application: contact
Lagrange multipliers are also used in:
- contact in multibody dynamics
- contact in finite elements

Contents
- Constrained Optimization: Optimality Criteria
  - Reduced gradient
  - Lagrangian
  - Sufficiency conditions
  - Inequality constraints
  - Karush-Kuhn-Tucker (KKT) conditions
  - Interpretation of Lagrange multipliers
- Constrained Optimization: Algorithms

Karush-Kuhn-Tucker conditions
Combining the Lagrange conditions for equality and inequality constraints yields the KKT conditions for the general problem $\min f(\mathbf{x})$ s.t. $\mathbf{h}(\mathbf{x}) = \mathbf{0}$, $\mathbf{g}(\mathbf{x}) \le \mathbf{0}$. Lagrangian: $L = f + \boldsymbol{\lambda}^{\mathrm{T}} \mathbf{h} + \boldsymbol{\mu}^{\mathrm{T}} \mathbf{g}$.
$\nabla f + \boldsymbol{\lambda}^{\mathrm{T}} \nabla \mathbf{h} + \boldsymbol{\mu}^{\mathrm{T}} \nabla \mathbf{g} = \mathbf{0}$ (optimality)
$\mathbf{h} = \mathbf{0}$, $\mathbf{g} \le \mathbf{0}$ (feasibility)
$\boldsymbol{\mu} \ge \mathbf{0}$, $\mu_j g_j = 0$ (complementarity)
Note: these conditions apply only to regular points, i.e. where the constraint gradients are linearly independent.
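A minimal numerical sketch using scipy's trust-constr method, which also reports Lagrange multiplier estimates; the problem is chosen for illustration, and note that scipy's inequality and sign conventions differ from the negative null form used in these slides:

```python
#   min  x1^2 + x2^2   s.t.  g(x) = 1 - x1 - x2 <= 0
import numpy as np
from scipy.optimize import minimize, NonlinearConstraint

f = lambda x: x[0]**2 + x[1]**2
# scipy expects lb <= c(x) <= ub; our g <= 0 becomes x1 + x2 >= 1
con = NonlinearConstraint(lambda x: x[0] + x[1], 1.0, np.inf)

res = minimize(f, np.array([2.0, 0.0]), method="trust-constr",
               constraints=[con])
print(res.x)   # approx [0.5, 0.5]
print(res.v)   # multiplier estimates (scipy's sign convention)
```

At the reported solution the conditions can be verified by hand: $\nabla f = (1, 1)$ and $\nabla g = (-1, -1)$, so $\nabla f + \mu \nabla g = \mathbf{0}$ with $\mu = 1 \ge 0$, and the constraint is active.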

Sufficiency
The KKT conditions are necessary conditions for local constrained minima. For sufficiency, consider the second-order conditions based on the active constraints: $\partial\mathbf{x}^{\mathrm{T}} \nabla^2 L \, \partial\mathbf{x} > 0$ on the tangent subspace of h and of the active g. Interpretation: objective and feasible domain are locally convex.

Additional remarks
Global optimality: is the objective function globally convex, and is the feasible domain convex? Then a KKT point gives the global optimum. Pitfall: the sign conventions for the Lagrange multipliers in the KKT conditions depend on the chosen standard form! The theory presented here is valid for the negative null form, $\mathbf{g}(\mathbf{x}) \le \mathbf{0}$.

Contents
- Constrained Optimization: Optimality Criteria
  - Reduced gradient
  - Lagrangian
  - Sufficiency conditions
  - Inequality constraints
  - Karush-Kuhn-Tucker (KKT) conditions
  - Interpretation of Lagrange multipliers
- Constrained Optimization: Algorithms

Significance of multipliers
Consider the case where the optimization problem depends on a parameter a: $\min_{\mathbf{x}} f(\mathbf{x}, a)$ s.t. $h(\mathbf{x}, a) = 0$. Lagrangian: $L = f + \lambda h$. KKT: $\nabla f + \lambda \nabla h = \mathbf{0}$, $h = 0$. We are looking for the sensitivity of the optimal objective value to the parameter: $\mathrm{d}f^*/\mathrm{d}a$.

Significance of multipliers (2)
Looking for $\mathrm{d}f^*/\mathrm{d}a$: by the chain rule, $\frac{\mathrm{d}f^*}{\mathrm{d}a} = \frac{\partial f}{\partial a} + \nabla f^{\mathrm{T}} \frac{\mathrm{d}\mathbf{x}^*}{\mathrm{d}a}$. The KKT condition gives $\nabla f = -\lambda \nabla h$, and differentiating the constraint $h(\mathbf{x}^*(a), a) = 0$ gives $\nabla h^{\mathrm{T}} \frac{\mathrm{d}\mathbf{x}^*}{\mathrm{d}a} = -\frac{\partial h}{\partial a}$. Combining these: $\frac{\mathrm{d}f^*}{\mathrm{d}a} = \frac{\partial f}{\partial a} + \lambda \frac{\partial h}{\partial a} = \frac{\partial L}{\partial a}$.

Significance of multipliers (3)
Lagrange multipliers describe the sensitivity of the objective to changes in the constraints: $\frac{\mathrm{d}f^*}{\mathrm{d}a} = \frac{\partial L}{\partial a}$; for a constraint of the form $h = \bar{h}(\mathbf{x}) - a = 0$ this reduces to $\frac{\mathrm{d}f^*}{\mathrm{d}a} = -\lambda$. Similar equations can be derived for multiple constraints and for inequalities. The multipliers give the "price of raising the constraint". Note: this makes it logical that at an optimum, the multipliers of inequality constraints must be positive!
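A minimal sketch of this sensitivity interpretation on a toy problem chosen for illustration: $\min x^2$ s.t. $g = a - x \le 0$. The constraint is active at the optimum, so $x^* = a$, $f^*(a) = a^2$, and stationarity $2x^* - \mu = 0$ gives $\mu = 2a$:

```python
def f_star(a):
    return a**2          # optimal value with the constraint x >= a active

a = 1.5
mu = 2 * a               # multiplier from stationarity: 2*x* - mu = 0

da = 1e-6                # central finite difference of the optimal value
fd = (f_star(a + da) - f_star(a - da)) / (2 * da)
print(fd, mu)            # both approx 3.0: df*/da equals the multiplier
```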

Example
A bar of length l and cross-sectional area A, loaded by an axial force N, with yield stress $s_y$ (figure: loaded bar). Minimize the mass (volume): $f = A\,l$. Stress constraint: $\sigma = N/A \le s_y$, i.e. $g = N/A - s_y \le 0$.

Example (2)
With the stress constraint active, $A^* = N/s_y$ and $f^* = N\,l/s_y$; stationarity of $L = A\,l + \mu\,(N/A - s_y)$ with respect to A gives $\mu = l\,N/s_y^2$. Constraint sensitivity: $\frac{\partial f^*}{\partial s_y} = \frac{\partial L}{\partial s_y} = -\mu = -\frac{N\,l}{s_y^2}$. Check: differentiating $f^* = N\,l/s_y$ directly gives the same result.
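The check can also be scripted; a minimal symbolic sketch of this example, with variable names mirroring the slide:

```python
import sympy as sp

A, l, N, s_y, mu = sp.symbols("A l N s_y mu", positive=True)
f = A * l                       # mass (volume) of the bar
g = N / A - s_y                 # stress constraint g <= 0, active at optimum
L = f + mu * g

A_star = sp.solve(g, A)[0]      # active constraint: A* = N/s_y
mu_star = sp.solve(sp.diff(L, A).subs(A, A_star), mu)[0]   # l*N/s_y**2

f_star = f.subs(A, A_star)      # f* = N*l/s_y
# Sensitivity check: df*/ds_y should equal dL/ds_y = -mu
print(sp.simplify(sp.diff(f_star, s_y) + mu_star))         # 0 -> check passes
```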