1 CS B553: ALGORITHMS FOR OPTIMIZATION AND LEARNING. Linear programming, quadratic programming, sequential quadratic programming

2 KEY IDEAS: Linear programming; Simplex method; Mixed-integer linear programming; Quadratic programming; Applications

3 RADIOSURGERY. [Image: CyberKnife (Accuray)]

4 [Illustration: tumor, normal tissue, radiologically sensitive tissue]

5 [Illustration: tumor]

6 [Illustration]

7 OPTIMIZATION FORMULATION

8 LINEAR PROGRAM. General form: min f^T x + g  s.t.  Ax ≤ b, Cx = d. [Figures: a convex polytope; a slice through the polytope]
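A minimal sketch of feeding this general form to scipy.optimize.linprog (the constant g only shifts the objective value, not the optimizer); the small instance below is hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical instance of the general form  min f^T x + g  s.t.  A x <= b,  C x = d.
f = np.array([1.0, 2.0]); g_const = 3.0
A = np.array([[-1.0, 0.0], [0.0, -1.0], [1.0, 1.0]]); b = np.array([0.0, 0.0, 4.0])
C = np.array([[1.0, -1.0]]);                          d = np.array([1.0])

# linprog minimizes f^T x; the constant g is added back when reporting the value.
res = linprog(f, A_ub=A, b_ub=b, A_eq=C, b_eq=d, bounds=[(None, None)] * 2)
print(res.x, res.fun + g_const)   # optimal vertex and objective value
```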

9 THREE CASES: infeasible; feasible and bounded (optimum x*); feasible and unbounded. [Figures: objective direction f and optimum x* for each case]

10 SIMPLEX ALGORITHM (DANTZIG)
- Start from a vertex of the feasible polytope.
- "Walk" along polytope edges, decreasing the objective on each step.
- Stop when the edge is unbounded or no improvement can be made.
- Implementation details: how to pick an edge (exiting and entering); solving for vertices in large systems; degeneracy (no progress made due to the objective vector being perpendicular to edges).
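The "walk along edges" idea can be written compactly for the special case min c^T x s.t. Ax ≤ b, x ≥ 0 with b ≥ 0, so that x = 0 is a feasible starting vertex. The tableau code below is a teaching sketch under those assumptions (Dantzig's entering rule, no anti-cycling safeguard for the degeneracy issue noted above), not production code.

```python
import numpy as np

def simplex(c, A, b):
    """Minimize c @ x subject to A @ x <= b, x >= 0, assuming b >= 0 so that
    x = 0 is a feasible starting vertex."""
    m, n = A.shape
    # Tableau with slack variables:  [A  I  b]  over  [c  0  0].
    T = np.zeros((m + 1, n + m + 1))
    T[:m, :n] = A
    T[:m, n:n + m] = np.eye(m)
    T[:m, -1] = b
    T[-1, :n] = c
    basis = list(range(n, n + m))      # the slack variables form the first basis
    while True:
        j = int(np.argmin(T[-1, :-1]))  # entering column: most negative reduced cost
        if T[-1, j] >= -1e-9:
            break                       # no improving edge: optimal vertex reached
        ratios = [T[i, -1] / T[i, j] if T[i, j] > 1e-9 else np.inf for i in range(m)]
        i = int(np.argmin(ratios))      # leaving row: minimum ratio test
        if ratios[i] == np.inf:
            raise ValueError("objective is unbounded along this edge")
        T[i, :] /= T[i, j]              # pivot: move to the adjacent vertex
        for r in range(m + 1):
            if r != i:
                T[r, :] -= T[r, j] * T[i, :]
        basis[i] = j
    x = np.zeros(n + m)
    x[basis] = T[:m, -1]
    return x[:n], -T[-1, -1]            # optimal point and objective value
```

For example, simplex(np.array([-1., -1.]), np.array([[1., 2.], [3., 1.]]), np.array([4., 6.])) walks from the origin to the vertex (1.6, 1.2) with objective value -2.8.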

11 COMPUTATIONAL COMPLEXITY
- Worst case: exponential.
- Average case: polynomial (perturbed analysis).
- In practice, usually tractable.
- Commercial software (e.g., CPLEX) can handle millions of variables/constraints!

12 SOFT CONSTRAINTS. [Figure: penalty as a function of dose for normal, sensitive, and tumor tissue]

13 SOFT CONSTRAINTS. Auxiliary variable z_ijk: penalty at each cell, with z_ijk ≥ c(D_ijk − D_normal) and z_ijk ≥ 0. [Figure: penalty z_ijk as a function of dose D_ijk]

14 SOFT CONSTRAINTS. Auxiliary variable z_ijk: penalty at each cell, with z_ijk ≥ c(D_ijk − D_normal) and z_ijk ≥ 0. Introduce a term f_ijk in the objective to minimize z_ijk. [Figure: penalty as a function of dose D_ijk]
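A sketch of how these soft-constraint rows might enter a linear program, assuming the dose is a linear function D = B w of the beam weights w; B_normal, B_tumor, D_normal, D_tumor, c_pen, and the unit penalty weights are all hypothetical stand-ins, not data from the slides.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: B maps nonnegative beam weights w to voxel doses D = B @ w.
rng = np.random.default_rng(0)
n_beams, n_normal, n_tumor = 5, 40, 10
B_normal = rng.uniform(0.0, 1.0, (n_normal, n_beams))   # dose to normal-tissue voxels
B_tumor  = rng.uniform(0.5, 1.5, (n_tumor, n_beams))    # dose to tumor voxels
D_normal, D_tumor, c_pen = 1.0, 5.0, 2.0                # thresholds and penalty slope

# Variables: beam weights w (n_beams) followed by per-voxel penalties z (n_normal).
# Objective: minimize the sum of penalties z (weights f_ijk taken to be 1 here).
cost = np.concatenate([np.zeros(n_beams), np.ones(n_normal)])

# Soft constraint  z >= c*(B_normal w - D_normal)  ->  c*B_normal w - z <= c*D_normal
A_ub = np.hstack([c_pen * B_normal, -np.eye(n_normal)])
b_ub = np.full(n_normal, c_pen * D_normal)

# Hard constraint: tumor voxels receive at least D_tumor  ->  -B_tumor w <= -D_tumor
A_ub = np.vstack([A_ub, np.hstack([-B_tumor, np.zeros((n_tumor, n_normal))])])
b_ub = np.concatenate([b_ub, np.full(n_tumor, -D_tumor)])

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n_beams + n_normal))
print(res.status, res.fun)   # status 0 means an optimal solution was found
```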

15 MINIMIZING AN ABSOLUTE VALUE. Objective with an absolute value: min_x |x_1|  s.t.  Ax ≤ b, Cx = d. Equivalent LP with auxiliary variable v: min_{v,x} v  s.t.  Ax ≤ b, Cx = d, x_1 ≤ v, −x_1 ≤ v. [Figures: the objective |x_1| and the constraints on v]

16 MINIMIZING AN L-1 OR L-INF NORM. L1 norm: min_x ||Fx − g||_1  s.t.  Ax ≤ b, Cx = d. L∞ norm: min_x ||Fx − g||_∞  s.t.  Ax ≤ b, Cx = d. [Figures: feasible polytope projected through F, with g and the optimum Fx*]

17 MINIMIZING AN L-1 OR L-INF NORM. L1 norm: min_x ||Fx − g||_1  s.t.  Ax ≤ b, Cx = d becomes the LP min_{e,x} 1^T e  s.t.  Fx + Ie ≥ g, Fx − Ie ≤ g, Ax ≤ b, Cx = d. [Figure: feasible polytope projected through F, with g, Fx*, and the elementwise errors e]
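A sketch of this construction with scipy.optimize.linprog, stacking the error variables e next to x; the helper name l1_min is mine, not from the slides.

```python
import numpy as np
from scipy.optimize import linprog

def l1_min(F, g, A=None, b=None, C=None, d=None):
    """Minimize ||F x - g||_1 subject to A x <= b and C x = d by introducing
    elementwise error variables e with -e <= F x - g <= e (a sketch)."""
    m, n = F.shape
    cost = np.concatenate([np.zeros(n), np.ones(m)])        # minimize 1^T e
    # |F x - g| <= e  as two blocks of rows:  F x - e <= g  and  -F x - e <= -g
    A_ub = np.vstack([np.hstack([F, -np.eye(m)]),
                      np.hstack([-F, -np.eye(m)])])
    b_ub = np.concatenate([g, -g])
    if A is not None:                                        # extra inequalities on x
        A_ub = np.vstack([A_ub, np.hstack([A, np.zeros((A.shape[0], m))])])
        b_ub = np.concatenate([b_ub, b])
    A_eq = b_eq = None
    if C is not None:                                        # equalities on x
        A_eq = np.hstack([C, np.zeros((C.shape[0], m))])
        b_eq = d
    bounds = [(None, None)] * n + [(0, None)] * m            # x free, e >= 0
    res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    return res.x[:n], res.fun
```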

18 MINIMIZING AN L-2 NORM. L2 norm: min_x ||Fx − g||_2  s.t.  Ax ≤ b, Cx = d. Not a linear program! [Figure: feasible polytope projected through F, with g and Fx*]

19 QUADRATIC PROGRAMMING. General form: min ½ x^T H x + g^T x + h  s.t.  Ax ≤ b, Cx = d. Objective: quadratic form; constraints: linear.
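A sketch of solving a small instance of this form with scipy.optimize.minimize; the matrices H, g, A, b, C, d below are hypothetical, and trust-constr is a general constrained solver rather than a specialized QP code.

```python
import numpy as np
from scipy.optimize import minimize, LinearConstraint

# Hypothetical data: H positive definite, inequality A x <= b, equality C x = d.
H = np.array([[4.0, 1.0], [1.0, 2.0]])
g = np.array([-8.0, -6.0])
A = np.array([[1.0, 1.0]]); b = np.array([3.0])
C = np.array([[1.0, -1.0]]); d = np.array([0.5])

obj  = lambda x: 0.5 * x @ H @ x + g @ x     # ½ x^T H x + g^T x (constant h dropped)
grad = lambda x: H @ x + g

cons = [LinearConstraint(A, -np.inf, b),     # A x <= b
        LinearConstraint(C, d, d)]           # C x  = d
res = minimize(obj, x0=np.zeros(2), jac=grad, hess=lambda x: H,
               method="trust-constr", constraints=cons)
print(res.x, res.fun)
```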

20 QUADRATIC PROGRAMS. [Figure: feasible polytope and elliptical level sets of the objective; H positive definite, with the unconstrained minimum labeled H^-1 g]

21 QUADRATIC PROGRAMS. Optimum can lie off of a vertex! [Figure: H positive definite, unconstrained minimum labeled H^-1 g]

22 QUADRATIC PROGRAMS. [Figure: feasible polytope, H negative definite]

23 QUADRATIC PROGRAMS. [Figure: feasible polytope, H positive semidefinite]

24 SIMPLEX ALGORITHM FOR QPs
- Start from a vertex of the feasible polytope.
- "Walk" along polytope facets, decreasing the objective on each step.
- Stop when the facet is unbounded or no improvement can be made.
- Facet: defined by m ≤ n constraints. m = n: vertex; m = n−1: line; m = 1: hyperplane; m = 0: entire space.

25 ACTIVE SET METHOD. Maintain a working set S of constraints treated as equalities and solve the resulting equality-constrained QP. If x violates a different constraint not in S, add it. If a multiplier λ_k < 0, then drop i_k from S.
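A sketch of this working-set loop for the inequality-only case min ½ x^T H x + g^T x s.t. Ax ≤ b, assuming H positive definite and a feasible starting point; the function and variable names are mine, and degenerate working sets are not handled.

```python
import numpy as np

def active_set_qp(H, g, A, b, x0, tol=1e-9, max_iter=100):
    """Primal active-set method for  min 0.5 x^T H x + g^T x  s.t.  A x <= b,
    with H positive definite and x0 feasible.  A teaching sketch, not a
    robust solver."""
    x = x0.astype(float)
    W = [i for i in range(len(b)) if abs(A[i] @ x - b[i]) < tol]  # initial working set
    for _ in range(max_iter):
        Aw = A[W] if W else np.zeros((0, len(x)))
        # Equality-constrained subproblem: the step p keeps working constraints active.
        KKT = np.block([[H, Aw.T], [Aw, np.zeros((len(W), len(W)))]])
        rhs = np.concatenate([-(H @ x + g), np.zeros(len(W))])
        sol = np.linalg.solve(KKT, rhs)
        p, lam = sol[:len(x)], sol[len(x):]
        if np.linalg.norm(p) < tol:
            if len(W) == 0 or lam.min() >= -tol:
                return x, dict(zip(W, lam))          # KKT conditions hold: optimal
            W.pop(int(np.argmin(lam)))               # drop a constraint with lambda_k < 0
        else:
            # Step length: stop at the first blocking constraint not in W.
            alpha, blocking = 1.0, None
            for i in range(len(b)):
                if i not in W and A[i] @ p > tol:
                    a = (b[i] - A[i] @ x) / (A[i] @ p)
                    if a < alpha:
                        alpha, blocking = a, i
            x = x + alpha * p
            if blocking is not None:
                W.append(blocking)                   # x hit a new constraint: add it
    return x, {}
```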

26 PROPERTIES OF ACTIVE SET METHODS FOR QPs
- Inherits properties of the simplex algorithm.
- Worst case: exponential number of facets.
- Positive definite H: polynomial time in the typical case.
- Indefinite or negative definite H: can take exponential time (NP-complete problems).

27 APPLYING QPs TO NONLINEAR PROGRAMS
- Recall: we could convert an equality-constrained optimization into an unconstrained one and use Newton's method.
- Each Newton step: fits a quadratic form to the objective; fits hyperplanes to each equality; solves for a search direction (Δx, Δλ) using the linear equality-constrained optimization.
- How about inequalities?

28 SEQUENTIAL QUADRATIC PROGRAMMING. Idea: fit half-space constraints to each inequality: g(x) ≤ 0 becomes g(x_t) + ∇g(x_t)^T (x − x_t) ≤ 0. [Figure: the region g(x) ≤ 0 and its linearization at x_t]

29 SEQUENTIAL QUADRATIC PROGRAMMING. Given the nonlinear minimization
  min_x f(x)
  s.t. g_i(x) ≤ 0, for i = 1, …, m
       h_j(x) = 0, for j = 1, …, p
at each step x_t, solve the QP
  min_Δx ½ Δx^T ∇²_x L(x_t, λ_t, μ_t) Δx + ∇_x L(x_t, λ_t, μ_t)^T Δx
  s.t. g_i(x_t) + ∇g_i(x_t)^T Δx ≤ 0, for i = 1, …, m
       h_j(x_t) + ∇h_j(x_t)^T Δx = 0, for j = 1, …, p
to derive the search direction Δx. The directions Δλ and Δμ are taken from the QP multipliers.
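scipy's SLSQP method is one SQP-type implementation; below is a sketch on a hypothetical toy problem (the particular f, g, h are made up), with g(x) ≤ 0 re-expressed in the "fun(x) ≥ 0" convention that SLSQP expects.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical toy problem:  min f(x) = (x0 - 2)^2 + (x1 - 1)^2
# s.t.  g(x) = x0^2 + x1^2 - 1 <= 0   (passed to SLSQP as -g(x) >= 0)
#       h(x) = x0 - 2*x1 = 0
f = lambda x: (x[0] - 2.0)**2 + (x[1] - 1.0)**2
constraints = [
    {"type": "ineq", "fun": lambda x: 1.0 - x[0]**2 - x[1]**2},  # g(x) <= 0
    {"type": "eq",   "fun": lambda x: x[0] - 2.0 * x[1]},        # h(x) = 0
]
res = minimize(f, x0=np.array([0.5, 0.5]), method="SLSQP", constraints=constraints)
print(res.x, res.fun)   # each iteration solves a QP built from local gradients
```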

30 ILLUSTRATION. [Figure: the region g(x) ≤ 0, its linearization g(x_t) + ∇g(x_t)^T (x − x_t) ≤ 0 at x_t, and the step Δx]

31 ILLUSTRATION. [Figure: the region g(x) ≤ 0, the linearization g(x_{t+1}) + ∇g(x_{t+1})^T (x − x_{t+1}) ≤ 0 at x_{t+1}, and the step Δx]

32 ILLUSTRATION. [Figure: the region g(x) ≤ 0, the linearization g(x_{t+2}) + ∇g(x_{t+2})^T (x − x_{t+2}) ≤ 0 at x_{t+2}, and the step Δx]

33 SQP PROPERTIES
- Without constraints, equivalent to Newton's method.
- With only equality constraints, equivalent to Lagrange root finding.
- Subtle implementation details: does the endpoint need to be strictly feasible, or just feasible up to a tolerance? How to perform a line search in the presence of inequalities?
- Implementation available in Matlab; FORTRAN packages too =(

