L8 Optimal Design concepts pt D


L8 Optimal Design concepts pt D: Homework Review; Inequality constraints; General Lagrange Function; Necessary Conditions for the general Lagrange Multiplier Method; Example; Summary

MV Optimization - Equality Constrained. For x* to be a local minimum of f(x) subject to hj(x) = 0: the gradient of f at x* must be a linear combination of the constraint gradients, i.e. ∇f(x*) + Σ vj ∇hj(x*) = 0, together with hj(x*) = 0 for all j.

Lagrange Function. Let x* be the point that minimizes f(x) in the feasible region. Every feasible x* satisfies the equality constraints (i.e., hj(x*) = 0). Note: when x is not feasible, hj(x) is not equal to 0, so penalizing hj in the Lagrange function pushes x toward feasibility!

Necessary Condition. Necessary condition for a stationary point: all partial derivatives of the Lagrange function must vanish, ∇L(x*, v*) = 0. Given f(x), one equality constraint, and n = 2, this yields three equations in the three unknowns x1, x2, and v.
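As a small illustration (the objective, constraint, and numbers below are hypothetical stand-ins, not the lecture's example), the three stationarity equations can be solved directly when they happen to be linear:

```python
import numpy as np

# Hypothetical problem (illustration only):
#   minimize f(x) = (x1 - 1)^2 + (x2 - 2)^2
#   subject to h(x) = x1 + x2 - 4 = 0
#
# Stationarity of L(x, v) = f(x) + v*h(x) gives three equations:
#   dL/dx1 = 2(x1 - 1) + v = 0
#   dL/dx2 = 2(x2 - 2) + v = 0
#   dL/dv  = x1 + x2 - 4   = 0
# These are linear here, so one linear solve finds the stationary point.
A = np.array([[2.0, 0.0, 1.0],
              [0.0, 2.0, 1.0],
              [1.0, 1.0, 0.0]])
b = np.array([2.0, 4.0, 4.0])
x1, x2, v = np.linalg.solve(A, b)
print(x1, x2, v)  # x* = (1.5, 2.5), v* = -1.0
```

In general the stationarity system is non-linear and must be solved numerically or case by case, as the later slides do.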

Example

Example cont’d

Lagrange Multiplier Method. 1. Both f(x) and all hj(x) must be differentiable. 2. x* must be a regular point: 2.1 x* is feasible; 2.2 the gradient vectors of the hj(x) are linearly independent. 3. Lagrange multipliers may be positive, negative, or zero.

MV Optimization Inequality Constrained

To Use the Lagrange Approach, Convert Inequalities to Equalities. Given an inequality gj(x) ≤ 0, add a slack variable sj to take up the slack, written as sj² so that it is automatically nonnegative: gj(x) + sj² = 0. This is no longer an inequality, so the Lagrange multiplier approach can now be used.
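A minimal sketch of the slack-variable conversion, using a hypothetical constraint g(x) = x - 2 ≤ 0 (not one from the slides):

```python
import math

# g(x) = x - 2 <= 0 becomes the equality g(x) + s^2 = 0,
# where s^2 = -g(x) measures how far g(x) is below zero.
def g(x):
    return x - 2.0

def slack(x):
    # Defined only when x is feasible, i.e. g(x) <= 0.
    return math.sqrt(-g(x))

print(g(0.5), slack(0.5))   # g = -1.5, s = sqrt(1.5): constraint inactive
print(g(2.0), slack(2.0))   # g =  0.0, s = 0.0:       constraint active
```

A positive slack means the inequality is inactive at x; s = 0 means it is active, which is exactly the distinction the switching conditions exploit later.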

MV Optimization Active or Inactive Inequalities?

KKT Necessary Conditions for Min. 1. Lagrange function (in standard form): L(x, v, u, s) = f(x) + Σ vj hj(x) + Σ ui (gi(x) + si²). 2. Gradient conditions: the partial derivatives of L with respect to x, v, and u must all vanish.

KKT Conditions Cont'd. 3. Feasibility check for inequalities: si² ≥ 0. 4. Switching conditions, e.g. ui si = 0 for each inequality. 5. Non-negative Lagrange multipliers for inequalities: ui ≥ 0. 6. Regularity check: gradients of the active inequality constraints are linearly independent.
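The checklist above can be turned into a numerical test at a candidate point. The problem data below (f, g, and the candidate points) are hypothetical, chosen so the arithmetic is easy to follow:

```python
import numpy as np

# Hypothetical problem:
#   minimize f(x) = (x1 - 3)^2 + (x2 - 3)^2
#   subject to g(x) = x1 + x2 - 4 <= 0
def grad_f(x):
    return np.array([2 * (x[0] - 3), 2 * (x[1] - 3)])

def g(x):
    return x[0] + x[1] - 4.0

def grad_g(x):
    return np.array([1.0, 1.0])

def is_kkt_point(x, u, tol=1e-8):
    stationary = np.allclose(grad_f(x) + u * grad_g(x), 0.0, atol=tol)  # gradient condition
    feasible = g(x) <= tol                                              # feasibility of inequality
    switching = abs(u * g(x)) <= tol                                    # u * g = 0
    nonneg = u >= -tol                                                  # u >= 0
    return stationary and feasible and switching and nonneg

print(is_kkt_point(np.array([2.0, 2.0]), u=2.0))  # True: KKT point
print(is_kkt_point(np.array([3.0, 3.0]), u=0.0))  # False: infeasible
```

This only verifies the necessary conditions at a given point; it does not find the point, which is what the switching-condition cases below are for.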

KKT Necessary Conditions for Min Regularity check - gradients of active inequality constraints are linearly independent

Ex 4.32 pg 150

Lagrange Function

Four equations and four unknowns: a non-linear system of equations (the products ui si and the si² terms make it non-linear).

Use Switching Conditions to Simplify
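The idea of case enumeration can be sketched in code. For a hypothetical problem with a single inequality (so only two cases instead of the four in the example), each choice of ui = 0 or si = 0 turns the non-linear system into one that is easy to solve:

```python
import numpy as np

# Hypothetical problem:
#   minimize f(x) = (x1 - 3)^2 + (x2 - 3)^2
#   subject to g(x) = x1 + x2 - 4 <= 0
# Switching condition u*s = 0 gives two cases:
#   Case A (u = 0): inequality assumed inactive -> unconstrained stationary point
#   Case B (s = 0): inequality assumed active   -> equality-constrained system

# Case A: grad f = 0 gives x = (3, 3); then check feasibility.
xA = np.array([3.0, 3.0])
gA = xA[0] + xA[1] - 4.0
print("Case A:", xA, "g =", gA)  # g = 2 > 0 -> infeasible, discard this case

# Case B: stationarity 2(x1 - 3) + u = 0, 2(x2 - 3) + u = 0, plus x1 + x2 = 4.
# With the constraint active, the system is linear.
A = np.array([[2.0, 0.0, 1.0],
              [0.0, 2.0, 1.0],
              [1.0, 1.0, 0.0]])
b = np.array([6.0, 6.0, 4.0])
x1, x2, u = np.linalg.solve(A, b)
print("Case B:", x1, x2, "u =", u)  # x* = (2, 2), u = 2 >= 0 -> KKT point
```

With two inequalities, the same enumeration produces the four cases worked through on the following slides.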

Case 1: solving the resulting non-linear system gives a point at which both inequalities are VIOLATED. Therefore x is INFEASIBLE and this case is discarded.

Case 2 System of 3 linear equations in 3 unknowns Rewrite

Gaussian Elimination

Gaussian Elimination cont’d
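The elimination procedure itself can be written in a few lines. The numbers below are illustrative only, not the actual system from Ex. 4.32:

```python
import numpy as np

# Solve A x = b by Gaussian elimination (no pivoting, for clarity).
A = np.array([[2.0, 1.0, 1.0],
              [4.0, 3.0, 3.0],
              [8.0, 7.0, 9.0]])
b = np.array([4.0, 10.0, 24.0])

# Forward elimination on the augmented matrix to upper-triangular form.
M = np.hstack([A, b.reshape(-1, 1)])
n = len(b)
for k in range(n - 1):
    for i in range(k + 1, n):
        factor = M[i, k] / M[k, k]
        M[i, k:] -= factor * M[k, k:]

# Back substitution.
x = np.zeros(n)
for i in range(n - 1, -1, -1):
    x[i] = (M[i, -1] - M[i, i + 1:n] @ x[i + 1:n]) / M[i, i]

print(x)  # matches np.linalg.solve(A, b)
```

In practice one would call np.linalg.solve directly (it uses LU factorization with pivoting), but the hand-rolled version mirrors the steps on the slides.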

Ex 4.32 cont'd. Check the last equation (for s1 feasibility). Nope! The point is not feasible.

Case 4: we can use Gaussian elimination again. Where are cases 1-4?

Check if x* is a regular point. Rank = order of the largest non-singular submatrix of A. Since det(A) ≠ 0, A is non-singular and therefore full rank. We can also see in the figure that the gradient vectors are not parallel.
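The regularity check amounts to a rank test on the matrix whose rows are the gradients of the active constraints. The gradient values below are hypothetical, for illustration only:

```python
import numpy as np

# Stack the gradients of the active constraints as rows of A and test that
# rank(A) equals the number of active constraints (here, 2).
A = np.array([[1.0, 2.0],    # gradient of first active constraint
              [3.0, 1.0]])   # gradient of second active constraint
rank = np.linalg.matrix_rank(A)
print(rank, np.linalg.det(A))  # rank 2, det = -5 -> linearly independent
```

A nonzero determinant (for a square A) or full rank (in general) confirms the gradients are linearly independent, so x* is a regular point and the KKT conditions apply.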

All Equations must be satisfied

Summary: General Lagrange Function L(x,v,u,s); Necessary Conditions for a minimum; Use switching conditions; Check results.