
1 CS5321 Numerical Optimization
12 Theory of Constrained Optimization

2 General form
min_x f(x), x ∈ R^n, subject to ci(x) = 0, i ∈ E; ci(x) ≥ 0, i ∈ I
E and I are the index sets of the equality and inequality constraints.
Feasible set Ω = {x | ci(x) = 0, i ∈ E; ci(x) ≥ 0, i ∈ I}
Outline:
Equality and inequality constraints
Lagrange multipliers
Linear independence constraint qualification
First/second order optimality conditions
Duality

3 A single equality constraint
An example: min_x f(x) = x1 + x2 subject to c1(x) = x1² + x2² − 2 = 0.
The optimal solution is at x* = (−1, −1).
Optimality condition: ∇f(x*) = λ1*∇c1(x*), with λ1* = −1/2 in the example.
Here ∇f(x) = (1, 1)^T and ∇c1(x) = (2x1, 2x2)^T.
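A quick numerical check of this optimality condition (a minimal sketch; x* = (−1, −1) and λ1* = −1/2 are the values from the example above):

    import numpy as np

    def grad_f(x):            # gradient of f(x) = x1 + x2
        return np.array([1.0, 1.0])

    def grad_c1(x):           # gradient of c1(x) = x1^2 + x2^2 - 2
        return 2.0 * x

    x_star, lam_star = np.array([-1.0, -1.0]), -0.5
    print(np.allclose(grad_f(x_star), lam_star * grad_c1(x_star)))  # stationarity: True
    print(np.isclose(x_star @ x_star - 2.0, 0.0))                   # feasibility: True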

4 A single inequality constraint
 f c1 s Example: subject to Case 1: the solution is inside c1. Unconstrained optimization Case 2: the solution is on the boundary of c1. Equality constraint Complementarity condition: *c1(x*)=0 Let *=0 in case 1. m i n x f ( ) = 1 + 2 c 1 : 2 x r f ( x ) = c 1 ( x ) = r f 5/13/2019

5 Lagrangian function
Define L(x, λ) = f(x) − λ c1(x).
∇λ L(x, λ) = −c1(x); ∇x L(x, λ) = ∇f(x) − λ∇c1(x).
The optimality condition of the inequality-constrained problem, λ1∇c1(x*) = ∇f(x*), can thus be written as ∇x L(x*, λ1) = 0.
L(x, λ) is called the Lagrangian function; λ is called the Lagrange multiplier.
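A small symbolic sketch of these gradients for the example problem (sympy is assumed to be available; solving the stationarity and feasibility equations recovers the candidate points and multipliers):

    import sympy as sp

    x1, x2, lam = sp.symbols('x1 x2 lam', real=True)
    f = x1 + x2
    c1 = x1**2 + x2**2 - 2
    L = f - lam * c1                              # Lagrangian L(x, lam) = f(x) - lam*c1(x)

    grad_x = [sp.diff(L, v) for v in (x1, x2)]    # gradient of L with respect to x
    grad_lam = sp.diff(L, lam)                    # equals -c1(x)

    # Stationary points of L: x = (-1,-1) with lam = -1/2, and x = (1,1) with lam = 1/2
    print(sp.solve(grad_x + [grad_lam], [x1, x2, lam], dict=True))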

6 Lagrange multiplier
At the optimal solution x*, ∇f(x*) = λ1∇c1(x*).
If c1(·) changes by ε units, f(·) changes by λ1ε units.
λ1 is the change of f per unit change of c1.
λ1 measures the sensitivity of f to c1.
λ1 is called the shadow price or the dual variable.
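A finite-difference check of this sensitivity interpretation on the equality-constrained example (a sketch; it uses the closed-form minimizer of x1 + x2 on the circle x1² + x2² = r, namely x1 = x2 = −sqrt(r/2)):

    import numpy as np

    def f_star(r):
        # minimum of x1 + x2 subject to x1^2 + x2^2 = r
        return -2.0 * np.sqrt(r / 2.0)

    eps = 1e-6
    print((f_star(2.0 + eps) - f_star(2.0)) / eps)   # about -0.5, i.e. lambda1* from the example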

7 Constraint qualification
Active set A(x) = E ∪ {i ∈ I | ci(x) = 0}
Linear Independence Constraint Qualification (LICQ):
the gradients of the constraints in the active set, {∇ci(x) | i ∈ A(x)}, are linearly independent.
A point is called regular if it satisfies the LICQ.
Other constraint qualifications may be used.
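A simple numerical test of the LICQ (a sketch; the helper name licq_holds is made up for illustration):

    import numpy as np

    def licq_holds(active_grads):
        # active_grads: one gradient of an active constraint per row
        A = np.asarray(active_grads, dtype=float)
        return np.linalg.matrix_rank(A) == A.shape[0]

    # Equality-constrained example at x* = (-1, -1): only c1 is active and its
    # gradient (-2, -2) is nonzero, so LICQ holds and x* is a regular point.
    print(licq_holds([[-2.0, -2.0]]))                # True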

8 First order conditions
A regular point that is a local minimizer of the constrained problem must satisfy the KKT conditions (Karush-Kuhn-Tucker):
∇x L(x*, λ*) = 0
ci(x*) = 0 for all i ∈ E
ci(x*) ≥ 0 for all i ∈ I
λi* ≥ 0 for all i ∈ I
λi* ci(x*) = 0 for all i ∈ E ∪ I
The last one is called the complementarity condition.
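A sketch of a KKT check for the single-inequality example (min x1 + x2 subject to c1(x) = 2 − x1² − x2² ≥ 0; the function kkt_satisfied below is illustrative, not from the lecture):

    import numpy as np

    def kkt_satisfied(x, lam, tol=1e-8):
        grad_f = np.array([1.0, 1.0])
        c = 2.0 - x @ x
        grad_c = -2.0 * x
        stationarity = np.allclose(grad_f - lam * grad_c, 0.0, atol=tol)
        primal_feasible = c >= -tol
        dual_feasible = lam >= -tol
        complementarity = abs(lam * c) <= tol
        return stationarity and primal_feasible and dual_feasible and complementarity

    print(kkt_satisfied(np.array([-1.0, -1.0]), 0.5))   # True: the KKT point
    print(kkt_satisfied(np.array([1.0, 1.0]), -0.5))    # False: stationary, but lambda < 0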

9 Second order conditions
Suppose x* is a solution to a constrained optimization problem and λ* satisfies the KKT conditions.
The critical cone C(x*, λ*) is the set of vectors w such that
∇ci(x*)^T w = 0 for all i ∈ E,
∇ci(x*)^T w = 0 for all i ∈ I ∩ A(x*) with λi* > 0,
∇ci(x*)^T w ≥ 0 for all i ∈ I ∩ A(x*) with λi* = 0.
(These are the first-order feasible directions w along which ∇f(x*)^T w = 0.)
Second order necessary condition: for all w ∈ C(x*, λ*), w^T ∇²xx L(x*, λ*) w ≥ 0.
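A concrete instance, worked from the single-inequality example (the Hessians here come from that example's data): at x* = (−1, −1) with λ1* = 1/2 > 0, the only active constraint is c1, and ∇c1(x*)^T w = 2w1 + 2w2 = 0 gives C(x*, λ*) = {w : w1 + w2 = 0}. Since ∇²xx L = ∇²f − λ1*∇²c1 = 0 − (1/2)(−2I) = I, we get w^T ∇²xx L w = w1² + w2² ≥ 0 for every w in the cone, so the second order condition holds.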

10 Projected Hessian
Let A(x*) = [∇ci(x*)^T]_{i ∈ A(x*)}, the matrix whose rows are the gradients of the constraints in the active set A(x*).
It can be shown that C(x*, λ*) = Null A(x*) (assuming strict complementarity, i.e. λi* > 0 for the active inequality constraints).
Null A(x*) can be computed via the QR decomposition A(x*)^T = QR = [Q1 Q2][R1; 0], so that span{Q2} = Null A(x*).
Define the projected Hessian H = Q2^T ∇²xx L(x*, λ*) Q2.
The second order optimality condition is that H is positive semidefinite.
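A numerical sketch of this computation for the single-inequality example (x* = (−1, −1), λ1* = 1/2, as above); numpy's QR factorization provides Q2:

    import numpy as np

    x_star, lam_star = np.array([-1.0, -1.0]), 0.5
    A = np.array([[-2.0 * x_star[0], -2.0 * x_star[1]]])    # row = grad c1(x*) = (2, 2)
    hess_L = 2.0 * lam_star * np.eye(2)                     # Hessian of L in x: 0 - lam*(-2I)

    Q, R = np.linalg.qr(A.T, mode='complete')               # full QR of A(x*)^T
    Q2 = Q[:, A.shape[0]:]                                   # columns spanning Null A(x*)

    H = Q2.T @ hess_L @ Q2                                   # projected Hessian
    print(np.all(np.linalg.eigvalsh(H) >= -1e-12))           # positive semidefinite: True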

11 Duality: An example
min_x f(x) = 3x1 + 4x2 subject to two linear constraints c1 and c2, with c1: x1 + x2 ≥ 3.
Find a lower bound for f(x) via the constraints.
Ex: f(x) ≥ 3c1 ⇒ 3x1 + 4x2 ≥ 3x1 + 3x2 ≥ 9.
Ex: f(x) ≥ 2c1 + 0.5c2 ⇒ 3x1 + 4x2 ≥ 3x1 + 1.25x2 ≥ 7.
What is the maximum lower bound for f(x)?
It is the optimal value of the dual linear program: max_y g(y) = 3y1 + 2y2 over y ≥ 0, subject to the combination y1c1 + y2c2 having coefficients no larger (componentwise) than those of f.
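A quick numeric illustration of this bounding idea with scipy's LP solver (a sketch; the coefficients of c2 used below, 2x1 + 0.5x2 ≥ 2, are purely hypothetical stand-ins, and x ≥ 0 is assumed so that the coefficientwise bounds are valid):

    import numpy as np
    from scipy.optimize import linprog

    c = np.array([3.0, 4.0])                 # objective 3*x1 + 4*x2
    A = np.array([[1.0, 1.0],                # c1: x1 + x2 >= 3
                  [2.0, 0.5]])               # c2 (hypothetical): 2*x1 + 0.5*x2 >= 2
    b = np.array([3.0, 2.0])

    primal = linprog(c, A_ub=-A, b_ub=-b)    # min c^T x  s.t.  A x >= b, x >= 0
    dual = linprog(-b, A_ub=A.T, b_ub=c)     # max b^T y  s.t.  A^T y <= c, y >= 0
    print(primal.fun, -dual.fun)             # equal: the maximum lower bound is attained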

12 Dual problem
Primal problem: min_x f(x) s.t. c(x) ≥ 0.
No equality constraints. The inequality constraints ci are concave (−ci are convex).
Let c(x) = [c1(x), …, cm(x)]^T. The Lagrangian is L(x, λ) = f(x) − λ^T c(x), λ ∈ R^m.
The dual objective function is q(λ) = inf_x L(x, λ).
If L(·, λ) is convex, inf_x L(x, λ) is attained where ∇x L(x, λ) = 0.
The dual problem is max_λ q(λ) s.t. λ ≥ 0.
If L(·, λ) is convex, the additional constraint ∇x L(x, λ) = 0 can be added.
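A small numeric sketch of the dual function for the single-inequality example (min x1 + x2 s.t. 2 − x1² − x2² ≥ 0); scipy is used to evaluate q(λ) = inf_x L(x, λ) and to maximize it over λ > 0:

    import numpy as np
    from scipy.optimize import minimize, minimize_scalar

    def L(x, lam):                       # Lagrangian of min x1+x2 s.t. 2 - x1^2 - x2^2 >= 0
        return x[0] + x[1] - lam * (2.0 - x[0]**2 - x[1]**2)

    def q(lam):                          # dual function q(lam) = inf_x L(x, lam)
        return minimize(L, x0=np.zeros(2), args=(lam,)).fun

    # Maximize q over lam (bounded away from 0, where q is -infinity);
    # the dual optimum matches the primal minimum f* = -2, attained near lam* = 1/2.
    best = minimize_scalar(lambda lam: -q(lam), bounds=(1e-3, 5.0), method='bounded')
    print(best.x, q(best.x))             # about 0.5 and -2.0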

