The Most Important Concept in Optimization (minimization)

The Most Important Concept in Optimization (minimization)
- A point is said to be an optimal solution of an unconstrained minimization if there exists no descent direction.
- A point is said to be an optimal solution of a constrained minimization if there exists no feasible descent direction.
- A descent direction may exist, but moving along it would leave the feasible region.

Descent Direction of f at x
- A direction d is a descent direction of f at x if moving along d with a sufficiently small stepsize decreases the objective value, i.e., there exists δ > 0 such that f(x + λd) < f(x) for all λ ∈ (0, δ).
- For differentiable f, any d with ∇f(x)ᵀd < 0 is a descent direction.

Feasible Direction of F at x
- A direction d is a feasible direction of F at x if moving along d from x with a sufficiently small stepsize does not leave the feasible region, i.e., there exists δ > 0 such that x + λd ∈ F for all λ ∈ (0, δ), where F is the feasible region.

Steepest Descent with Exact Line Search
Start with any x⁰. Having x^k, stop if ∇f(x^k) = 0; otherwise:
(i) Steepest descent direction: d^k = −∇f(x^k)
(ii) Stepsize by exact line search: λ_k = argmin_{λ ≥ 0} f(x^k + λ d^k)
(iii) Update: x^{k+1} = x^k + λ_k d^k
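For a convex quadratic f(x) = ½xᵀQx the exact line-search stepsize has a closed form, λ = gᵀg / gᵀQg with g = ∇f(x^k), so the scheme above can be sketched in a few lines of plain Python (the matrix Q, starting point, and tolerance below are illustrative choices, not from the slides):

```python
# Steepest descent with exact line search on f(x) = 1/2 x^T Q x.
# For quadratics, the exact line-search stepsize is lam = (g.g)/(g.Qg).

def matvec(Q, v):
    return [sum(Q[i][j] * v[j] for j in range(len(v))) for i in range(len(Q))]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def steepest_descent(Q, x0, tol=1e-12, max_iter=1000):
    x = list(x0)
    for _ in range(max_iter):
        g = matvec(Q, x)                           # gradient of 1/2 x^T Q x is Q x
        if dot(g, g) < tol:                        # stop when the gradient (nearly) vanishes
            return x
        lam = dot(g, g) / dot(g, matvec(Q, g))     # exact line-search stepsize
        x = [xi - lam * gi for xi, gi in zip(x, g)]  # move along d = -g
    return x

Q = [[2.0, 0.0], [0.0, 8.0]]                       # f(x, y) = x^2 + 4 y^2
x_star = steepest_descent(Q, [3.0, 1.0])
```

The iterate converges to the unique minimizer at the origin.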

Minimum Principle
Let f be a convex and differentiable function and F the feasible region. Then x* ∈ F is an optimal solution of min_{x ∈ F} f(x) if and only if ∇f(x*)ᵀ(x − x*) ≥ 0 for all x ∈ F.
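As a quick numeric sanity check of the minimum principle (the function and feasible set are illustrative, not from the slides): for the convex f(x) = (x − 2)² on F = [0, 1] the constrained minimizer is x* = 1, and f′(x*)(x − x*) ≥ 0 should hold for every x in F:

```python
# Check the minimum principle for f(x) = (x - 2)^2 on F = [0, 1].
# The unconstrained minimizer x = 2 lies outside F, so the constrained
# minimizer is the boundary point x* = 1.

def fprime(x):
    return 2.0 * (x - 2.0)            # derivative of (x - 2)^2

x_star = 1.0
samples = [i / 100.0 for i in range(101)]     # grid over F = [0, 1]
ok = all(fprime(x_star) * (x - x_star) >= 0.0 for x in samples)
```

Here f′(x*) = −2 and (x − x*) ≤ 0 on F, so the product is nonnegative everywhere, confirming optimality.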

Minimization Problem vs. Kuhn-Tucker Stationary-point Problem
MP: min f(x) subject to g(x) ≤ 0
KTSP: Find x̄, λ̄ ≥ 0 such that
  ∇f(x̄) + λ̄ᵀ∇g(x̄) = 0,  λ̄ᵀg(x̄) = 0,  g(x̄) ≤ 0
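A hedged one-dimensional example (the problem is chosen for illustration, not from the slides): for the MP min x² subject to g(x) = 1 − x ≤ 0, the optimum is x̄ = 1, and the KTSP conditions hold with multiplier λ̄ = 2:

```python
# KTSP check for: min x^2  s.t.  g(x) = 1 - x <= 0, at xbar = 1, lam = 2.
#   stationarity:    grad f + lam * grad g = 2*xbar + lam*(-1) = 0
#   complementarity: lam * g(xbar) = 0
#   feasibility:     g(xbar) <= 0 and lam >= 0

xbar, lam = 1.0, 2.0
g = 1.0 - xbar                                 # constraint value g(xbar)
stationarity = 2.0 * xbar + lam * (-1.0)       # should be exactly 0
complementarity = lam * g                      # should be exactly 0
feasible = (g <= 0.0) and (lam >= 0.0)
```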

Lagrangian Function
- For a fixed λ ≥ 0, let L(x, λ) = f(x) + λᵀg(x).
- For a fixed λ ≥ 0, if g(x) ≤ 0 then L(x, λ) ≤ f(x).
- If f and g are convex, then L(·, λ) is convex in x.
- If f and g are convex, the KTSP conditions are sufficient: any (x̄, λ̄) solving KTSP gives an optimal solution x̄ of MP.

KTSP with Equality Constraints? (Assume the h_j are linear functions)
Write each equality h_j(x) = 0 as the two inequalities h_j(x) ≤ 0 and −h_j(x) ≤ 0.
KTSP: Find x̄, λ̄ ≥ 0, β̄⁺ ≥ 0, β̄⁻ ≥ 0 such that
  ∇f(x̄) + λ̄ᵀ∇g(x̄) + (β̄⁺ − β̄⁻)ᵀ∇h(x̄) = 0
  λ̄ᵀg(x̄) = 0,  (β̄⁺)ᵀh(x̄) = 0,  (β̄⁻)ᵀ(−h(x̄)) = 0
  g(x̄) ≤ 0,  h(x̄) ≤ 0,  −h(x̄) ≤ 0

KTSP with Equality Constraints
KTSP: Find x̄, λ̄ ≥ 0, β̄ such that
  ∇f(x̄) + λ̄ᵀ∇g(x̄) + β̄ᵀ∇h(x̄) = 0,  λ̄ᵀg(x̄) = 0,  g(x̄) ≤ 0,  h(x̄) = 0
- If β̄ = β̄⁺ − β̄⁻ with β̄⁺ ≥ 0 and β̄⁻ ≥ 0, then β̄ is a free variable (unrestricted in sign).

Generalized Lagrangian Function
- For fixed λ ≥ 0 and free β, let L(x, λ, β) = f(x) + λᵀg(x) + βᵀh(x).
- For fixed λ ≥ 0, if g(x) ≤ 0 and h(x) = 0 then L(x, λ, β) ≤ f(x).
- If f and g are convex and h is linear, then L(·, λ, β) is convex in x.
- As before, the KTSP conditions with equality constraints are sufficient for optimality when f and g are convex and h is linear.

Lagrangian Dual Problem
Primal: min_x f(x) subject to g(x) ≤ 0, h(x) = 0

Dual: max_{λ ≥ 0, β} θ(λ, β)
where θ(λ, β) = inf_x L(x, λ, β) = inf_x [ f(x) + λᵀg(x) + βᵀh(x) ]
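A one-dimensional illustration (the problem is chosen for illustration, not from the slides): for the primal min x² subject to 1 − x ≤ 0, the dual function is θ(α) = min_x [x² + α(1 − x)] = α − α²/4, and a coarse grid search over α ≥ 0 recovers the dual optimum θ(2) = 1, matching the primal optimum f(1) = 1:

```python
# Dual function of: min x^2  s.t.  1 - x <= 0.
# L(x, a) = x^2 + a*(1 - x) is minimized over x at x = a/2
# (from d/dx L = 2x - a = 0), giving theta(a) = a - a^2/4.

def theta(a):
    x = a / 2.0                       # argmin_x L(x, a)
    return x * x + a * (1.0 - x)

grid = [i / 100.0 for i in range(501)]        # a in [0, 5]
best_a = max(grid, key=theta)
best_val = theta(best_a)
```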

Weak Duality Theorem
Let x̄ be a feasible solution of the primal problem and (λ̄, β̄) a feasible solution of the dual problem. Then f(x̄) ≥ θ(λ̄, β̄).
Corollary: sup { θ(λ, β) : λ ≥ 0 } ≤ inf { f(x) : g(x) ≤ 0, h(x) = 0 }

Weak Duality Theorem
Corollary: If f(x̄) = θ(λ̄, β̄), where x̄ is primal feasible and λ̄ ≥ 0, then x̄ and (λ̄, β̄) solve the primal and the dual problem, respectively. In this case there is no duality gap.

Saddle Point of Lagrangian
Let x̄, λ̄ ≥ 0, β̄ satisfy
  L(x̄, λ, β) ≤ L(x̄, λ̄, β̄) ≤ L(x, λ̄, β̄)  for all x and all λ ≥ 0, β.
Then (x̄, λ̄, β̄) is called a saddle point of the Lagrangian function.

Dual Problem of Linear Program
Primal LP: min cᵀx subject to Ax ≥ b, x ≥ 0
Dual LP: max bᵀy subject to Aᵀy ≤ c, y ≥ 0
※ All duality theorems hold and work perfectly!
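A small pure-Python check of LP weak duality on a hand-picked pair (the data are illustrative, not from the slides): for every primal-feasible x and dual-feasible y we must have cᵀx ≥ bᵀy, and at the optimum the two values coincide:

```python
# Primal: min c^T x  s.t.  Ax >= b, x >= 0.
# Dual:   max b^T y  s.t.  A^T y <= c, y >= 0.

c = [1.0, 2.0]
A = [[1.0, 1.0]]                 # single constraint: x1 + x2 >= 2
b = [2.0]

def primal_feasible(x):
    return all(xi >= 0 for xi in x) and all(
        sum(A[i][j] * x[j] for j in range(2)) >= b[i] for i in range(1))

def dual_feasible(y):
    return all(yi >= 0 for yi in y) and all(
        sum(A[i][j] * y[i] for i in range(1)) <= c[j] for j in range(2))

x = [2.0, 0.0]                   # primal optimal vertex, value c^T x = 2
y = [1.0]                        # dual optimal point, value b^T y = 2
primal_val = sum(ci * xi for ci, xi in zip(c, x))
dual_val = sum(bi * yi for bi, yi in zip(b, y))
```

Here the gap closes (primal_val = dual_val = 2), so by the corollary both points are optimal.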

Application of LP Duality (I): Farkas' Lemma
For any matrix A ∈ ℝ^{m×n} and any vector b ∈ ℝ^m, exactly one of the following two systems has a solution:
  (I) Ax = b, x ≥ 0
  (II) Aᵀy ≤ 0, bᵀy > 0
but never both.
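A hedged illustration of the alternative with A = I (2×2), chosen so both systems can be checked by inspection (the data are illustrative, not from the slides): for b = (1, 1) system (I) has the solution x = b itself; for b = (−1, 0) system (I) is infeasible and y = (−1, 0) solves (II):

```python
# Farkas' lemma alternative with A = I (2x2):
#   (I)  Ax = b, x >= 0        (II)  A^T y <= 0, b^T y > 0

def solves_I(b, x):
    # with A = I, the system Ax = b reduces to x == b
    return x == b and all(xi >= 0 for xi in x)

def solves_II(b, y):
    # with A = I, A^T y = y
    ATy_nonpos = all(yi <= 0 for yi in y)
    bTy = sum(bi * yi for bi, yi in zip(b, y))
    return ATy_nonpos and bTy > 0

case1 = solves_I([1.0, 1.0], [1.0, 1.0])       # system (I) solvable
case2 = solves_II([-1.0, 0.0], [-1.0, 0.0])    # system (II) solvable
```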

Application of LP Duality (II): The LSQ Normal Equation Always Has a Solution
For any matrix A ∈ ℝ^{m×n} and any vector b ∈ ℝ^m, consider min_x ‖Ax − b‖².
Claim: the normal equation AᵀAx = Aᵀb always has a solution, even when A is not of full rank.
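An illustrative check with hand-picked data (not from the slides): for A = [[1,1],[1,2],[1,3]] and b = (1, 2, 2), the normal equation AᵀAx = Aᵀb gives x = (2/3, 1/2), and the residual r = b − Ax is orthogonal to the columns of A (Aᵀr = 0), which characterizes the least-squares solution:

```python
# Least-squares fit min_x ||Ax - b||^2 via the normal equation.
A = [[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]]
b = [1.0, 2.0, 2.0]

# A^T A = [[3, 6], [6, 14]], A^T b = [5, 11]; solve the 2x2 system directly
AtA = [[3.0, 6.0], [6.0, 14.0]]
Atb = [5.0, 11.0]
det = AtA[0][0] * AtA[1][1] - AtA[0][1] * AtA[1][0]     # = 6
x = [(AtA[1][1] * Atb[0] - AtA[0][1] * Atb[1]) / det,   # = 2/3
     (AtA[0][0] * Atb[1] - AtA[1][0] * Atb[0]) / det]   # = 1/2

Ax = [row[0] * x[0] + row[1] * x[1] for row in A]
r = [bi - axi for bi, axi in zip(b, Ax)]                # residual b - Ax
Atr = [sum(A[i][j] * r[i] for i in range(3)) for j in range(2)]
```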

Dual Problem of Strictly Convex Quadratic Program
Primal QP: min ½xᵀQx + cᵀx subject to Ax ≤ b, with Q positive definite (strictly convex).
With the strict convexity assumption, the inner minimization of the Lagrangian has the closed-form solution x = −Q⁻¹(c + Aᵀα), which gives the
Dual QP: max_{α ≥ 0} −½(c + Aᵀα)ᵀQ⁻¹(c + Aᵀα) − bᵀα
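A one-dimensional sanity check (the problem is chosen for illustration, not from the slides): for the primal QP min ½x² subject to −x ≤ −1 (i.e. x ≥ 1), we have Q = 1, c = 0, A = [−1], b = [−1], so the dual is max_{α ≥ 0} −½α² + α, maximized at α = 1 with value ½, matching the primal optimum x = 1, f(x) = ½:

```python
# Dual of the 1-D strictly convex QP: min 1/2 x^2  s.t.  -x <= -1.
# theta(a) = -1/2 (c + A^T a)^2 / Q - b*a  with Q = 1, c = 0, A = [-1], b = [-1].

def dual(a):
    ATa = -1.0 * a                         # A^T alpha with A = [-1]
    return -0.5 * ATa * ATa - (-1.0) * a   # = -1/2 a^2 + a

grid = [i / 1000.0 for i in range(3001)]   # a in [0, 3]
a_star = max(grid, key=dual)
dual_opt = dual(a_star)
primal_opt = 0.5 * 1.0 ** 2                # f(x*) with x* = 1
```

The zero duality gap here is what the strict convexity assumption buys: the dual optimum equals the primal optimum.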