Introduction to Network Mathematics (1) - Optimization techniques Yuedong Xu 10/08/2012.



Purpose Many networking/system problems boil down to optimization problems – Bandwidth allocation – ISP traffic engineering – Route selection – Cache placement – Server sleep/wakeup – Wireless channel assignment – … and so on (numerous)

Purpose Optimization as a tool – shedding light on how well we can possibly do – guiding the design of distributed algorithms (rather than pure heuristics) – providing a bottom-up approach to reverse-engineer existing systems

Outline Toy Examples Overview Convex Optimization Linear Programming Linear Integer Programming Summary

Toys Toy 1: Find x to minimize f(x) := x^2. x* = 0 if there is no restriction on x; x* = 2 if 2 ≤ x ≤ 4.

Toys Toy 2: Find the minimum minimize subject to Optimal solution ?

Toys Toy 3: Find global minimum local minimum global minimum

Outline Toy Examples Overview Convex Optimization Linear Programming Linear Integer Programming

Overview Ingredients! Objective Function – A function to be minimized or maximized Unknowns or Variables – Affect value of objective function Constraints – Restrict unknowns to take on certain values but exclude others

Overview Formulation: Minimize f 0 (x) subject to f i (x) ≤ 0; i=1,…,m h i (x) = 0; i=1,…,p Objective function x : Decision variables Inequality constraint Equality constraint

Overview Optimization tree

Overview Our coverage Nonlinear Programs Convex Programs Linear Programs (Polynomial) Integer Programs (NP-Hard) Flow and Matching

Outline Toy Examples Overview Convex Optimization Linear Programming Linear Integer Programming

Convex Optimization Concepts – Convex combination: for x, y ∈ R^n and 0 ≤ λ ≤ 1, z = λx + (1−λ)y is a convex combination of x and y, and a strict convex combination if λ ≠ 0, 1.

Convex Optimization Concepts – Convex set: S ⊆ R^n is convex if it contains all convex combinations of pairs x, y ∈ S. (figure: a convex set vs. a nonconvex one)

Convex Optimization Concepts – Convex set: more complicated case The intersection of any number of convex sets is convex.

Convex Optimization Concepts – Convex function (note: mainland Chinese textbooks define convex/concave the opposite way from the convention abroad!): given a convex set S ⊆ R^n, c: S → R is a convex function if c(λx + (1−λ)y) ≤ λc(x) + (1−λ)c(y) for all 0 ≤ λ ≤ 1. (figure: the chord λc(x) + (1−λ)c(y) lies above the graph of c between x and y)

Convex Optimization Concepts – convex functions We are lucky that many networking problems have convex objectives. Exponential: e^(ax) is convex on R. Powers: x^a is convex on R+ when a ≥ 1 or a ≤ 0, and concave for 0 ≤ a ≤ 1. Logarithm: log x is concave on R+. Jensen's inequality: – if f(·) is convex, then f(E[x]) ≤ E[f(x)] – You can also check convexity by taking the second-order derivative
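The convexity inequality and Jensen's inequality above can be checked numerically; a minimal Python sketch (sampled points, an illustration rather than a proof):

```python
import math
import random

def is_midpoint_convex(f, xs):
    """Check f(lam*x + (1-lam)*y) <= lam*f(x) + (1-lam)*f(y) on sample pairs."""
    for x in xs:
        for y in xs:
            for lam in (0.25, 0.5, 0.75):
                z = lam * x + (1 - lam) * y
                if f(z) > lam * f(x) + (1 - lam) * f(y) + 1e-9:
                    return False
    return True

exp_convex = is_midpoint_convex(math.exp, [i / 2 for i in range(-6, 7)])
# log is concave, so -log should pass the convexity check on R+
log_concave = is_midpoint_convex(lambda t: -math.log(t), [0.1 * i for i in range(1, 30)])

# Jensen's inequality: f(E[x]) <= E[f(x)] for convex f, here f = exp
random.seed(0)
sample = [random.uniform(0, 2) for _ in range(10000)]
mean = sum(sample) / len(sample)
jensen_holds = math.exp(mean) <= sum(math.exp(s) for s in sample) / len(sample)
```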

Convex Optimization Concepts – Convex optimization: Minimize f_0(x) subject to f_i(x) ≤ 0, i = 1,…,m (convex) and h_i(x) = 0, i = 1,…,p (linear/affine). – A fundamental property: local optimum = global optimum

Convex Optimization Method – You may have used the Gradient method or Newton's method to find optima when the constraints are explicit box ranges (e.g., 0 ≤ x ≤ 10, 0 ≤ y ≤ 20, −∞ ≤ z ≤ ∞ ……). – Today, we are talking about more general constrained optimization

Convex Optimization How to solve a constrained optimization problem? Enumeration? Maybe, for a small, finite feasible set. Use constraints to reduce the number of variables? Works occasionally. The Lagrange multiplier method – a general method (harder to understand)

Convex Optimization Example: Minimize x^2 + y^2 subject to x + y = 1 Lagrange multiplier method: change the problem to an unconstrained problem: L(x, y, p) = x^2 + y^2 + p(1 − x − y) Think of p as a "price", and (1 − x − y) as the amount of "violation" of the constraint. Minimize L(x, y, p) over all x and y, keeping p fixed; obtain x*(p), y*(p). Then choose p to make sure the constraint is met. Magically, x*(p*) and y*(p*) is the solution to the original problem!

Convex Optimization Example: Minimize x^2 + y^2 subject to x + y = 1 Lagrangian: L(x, y, p) = x^2 + y^2 + p(1 − x − y) Setting ∂L/∂x and ∂L/∂y to 0, we get x = y = p/2. Since x + y = 1, we get p = 1. We get the same solution by substituting y = 1 − x.
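The worked example above can be traced in a few lines of Python, showing the dual function g(p) = p(1 − p/2) lower-bounding f* = 1/2 (a sketch of the slide's own example):

```python
def primal_min_for_price(p):
    # For fixed p, L(x, y, p) = x^2 + y^2 + p(1 - x - y) is minimized at x = y = p/2.
    x = y = p / 2.0
    return x, y

def g(p):
    # dual function: value of the Lagrangian at its minimizer, = p(1 - p/2)
    x, y = primal_min_for_price(p)
    return x**2 + y**2 + p * (1 - x - y)

# Choose p so the constraint x + y = 1 is met: p/2 + p/2 = 1  ->  p* = 1.
p_star = 1.0
x_star, y_star = primal_min_for_price(p_star)
f_star = x_star**2 + y_star**2

# Weak duality: g(p) <= f* for every p, with the tightest bound at p = p*.
bounds = [g(k / 10) for k in range(0, 30)]
```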

Convex Optimization General case: minimize f_0(x) subject to f_i(x) ≤ 0, i = 1, …, m (ignore equality constraints for now). Optimal value denoted by f*. Lagrangian: L(x, p) = f_0(x) + p_1 f_1(x) + … + p_m f_m(x). Define g(p) = inf_x (f_0(x) + p_1 f_1(x) + … + p_m f_m(x)). If p_i ≥ 0 for all i, and x is feasible, then g(p) ≤ f*.

Convex Optimization Revisit the earlier example: L(x, p) = x^2 + y^2 + p(1 − x − y), x* = y* = p/2, g(p) = p(1 − p/2). This is a concave function, with g(p*) = 1/2. We know f* is 1/2, and g(p) is a lower bound for f* for different values of p – the tightest lower bound occurs at p = p*.

Convex Optimization Duality For each p, g(p) gives a lower bound for f*. We want to find as tight a lower bound as possible: maximize g(p) subject to p ≥ 0. a) This is called the (Lagrange) "dual" problem; the original problem is the "primal" problem. b) Let the optimal value of the dual problem be denoted d*. We always have d* ≤ f* (called "weak duality"). c) If d* = f*, then we have "strong duality". The difference f* − d* is called the "duality gap".

Convex Optimization Price Interpretation Solving the constrained problem is equivalent to obeying hard resource limits. Imagine the resource limits can be violated: you pay a marginal price per unit amount of violation (or receive an amount if the limit is not met). When the duality gap is nonzero, you can benefit from the second scenario, even if the prices are set in unfavorable terms.

Convex Optimization Duality in algorithms An iterative algorithm produces at iteration j: a primal feasible x^(j), a dual feasible p^(j), with f_0(x^(j)) − g(p^(j)) → 0 as j → ∞. The optimal value f* lies in the interval [g(p^(j)), f_0(x^(j))].

Convex Optimization Complementary Slackness To make the duality gap zero, we need p_i f_i(x*) = 0 for all i. This means: p_i > 0 ⇒ f_i(x*) = 0, and f_i(x*) < 0 ⇒ p_i = 0. If a "price" is positive, then the corresponding constraint is tight (limiting); if a constraint is not active, then its "price" must be zero.

Convex Optimization KKT Optimality Conditions Satisfy the primal and dual constraints: f_i(x*) ≤ 0, p_i* ≥ 0. Satisfy complementary slackness: p_i* f_i(x*) = 0. Stationarity at the optimum: f_0'(x*) + Σ_i p_i* f_i'(x*) = 0. If the primal problem is convex and the KKT conditions are met, then x* and p* are primal/dual optimal with zero duality gap. KKT = Karush-Kuhn-Tucker
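A minimal numeric check of the KKT conditions on a hypothetical one-variable problem (minimize x^2 subject to x ≥ 2, i.e. f_1(x) = 2 − x ≤ 0; this problem and its multiplier p* = 4 are illustrative, not from the slides):

```python
# Candidate optimum and multiplier: stationarity 2x* - p* = 0 with x* = 2 gives p* = 4.
x_star, p_star = 2.0, 4.0

f1 = 2 - x_star                        # constraint value at x*
primal_feasible = f1 <= 1e-12          # f1(x*) <= 0
dual_feasible = p_star >= 0            # p* >= 0
slackness = abs(p_star * f1) < 1e-12   # complementary slackness: p* f1(x*) = 0
# stationarity: f0'(x*) + p* f1'(x*) = 2x* + p*(-1) = 0
stationary = abs(2 * x_star - p_star) < 1e-12

# sanity check: grid search over the feasible set x >= 2 finds the same minimum
grid_min = min(x * x for x in [2 + 0.001 * k for k in range(4000)])
```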

Convex Optimization Method – Take-home messages Convexity is important (non-convex problems are very difficult). The primal problem is not solved directly because of its complicated constraints. Use the Lagrangian dual approach to obtain the dual function. The KKT conditions guarantee strong duality for convex problems.

Convex Optimization Application: TCP Flow Control – What’s in your mind about TCP? Socket programming? Sliding window? AIMD congestion control? Retransmission? Anything else?

Convex Optimization Application: TCP Flow Control – Our questions: It seems that TCP works as the Internet scales. Why? How does TCP allocate bandwidth to flows traversing the same bottlenecks? Is TCP optimal in terms of resource allocation?

Convex Optimization Application: TCP Flow Control Network: links l with capacities c_l. Sources i: L(i) – the set of links used by source i (routing); U_i(x_i) – utility when the source rate is x_i. i) The larger the rate x_i, the more the happiness; ii) the increment of happiness shrinks as x_i increases.

Convex Optimization Application: TCP Flow Control – A simple network (figure: two links with capacities c_1 and c_2, shared by flows x_1, x_2, x_3)

Convex Optimization Application: TCP Flow Control – Primal problem
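The primal problem on this slide was shown as an image; the standard network utility maximization (NUM) formulation it corresponds to — reconstructed here from the slide's notation, not copied from the image — is:

```latex
\begin{aligned}
\max_{x \ge 0} \quad & \sum_{i} U_i(x_i) \\
\text{s.t.} \quad & \sum_{i \,:\, l \in L(i)} x_i \le c_l, \qquad \forall \text{ links } l
\end{aligned}
```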

Convex Optimization Application: TCP Flow Control – Lagrangian dual problem You can solve it using the KKT conditions! But you need global information!

Convex Optimization Application: TCP Flow Control – Question: Is distributed flow control possible without knowledge of the network?

Convex Optimization Application: TCP Flow Control – Primal-dual approach: Source updates sending rate x i Link generates congestion signal – Source’s action: If congestion severe, then reduce x i If no congestion, then increase x i – Congestion measures: Packet loss (TCP Reno) RTT (TCP Vegas)

Convex Optimization Application: TCP Flow Control – Gradient based Primal-dual algorithm: Source updates rate given congestion signal Link updates congestion signal Price of congestion at a link: packet drop prob., etc. Total prices of a flow along its route
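The gradient-based primal-dual iteration described above can be sketched for the earlier two-link network. The topology (x_1 on link 1, x_2 on link 2, x_3 traversing both), the log utilities, and unit capacities are assumptions for illustration:

```python
# Primal-dual sketch: sources pick rates from prices; links adjust prices from load.
routes = {1: [1], 2: [2], 3: [1, 2]}   # flow -> links it uses (assumed topology)
cap = {1: 1.0, 2: 1.0}                 # link capacities (assumed)
price = {1: 1.0, 2: 1.0}               # link "congestion prices"
step = 0.05

for _ in range(5000):
    # Source update: with U_i = log, the rate maximizing U_i(x_i) - q_i x_i
    # is x_i = 1/q_i, where q_i is the total price along the route.
    rate = {i: 1.0 / sum(price[l] for l in links) for i, links in routes.items()}
    # Link update (gradient ascent on the dual): raise price if overloaded.
    for l in cap:
        load = sum(rate[i] for i, links in routes.items() if l in links)
        price[l] = max(1e-6, price[l] + step * (load - cap[l]))

# Proportional-fair optimum for this topology: x1 = x2 = 2/3, x3 = 1/3.
```

No flow needs the whole network: each source only sees the summed price along its own route, which is exactly the distributed operation the slide describes.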

Convex Optimization Application: TCP Flow Control – Relation to real TCP

Convex Optimization Application: Fairness concerns – Many concepts regarding fairness Max-min, proportional fairness, etc. (add some examples to explain the concepts)

Convex Optimization Application: Fairness concerns – How does fairness relate to optimization? Reflected by utility function – Esp. max-min fairness as alpha  infinity
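The utility family usually used to encode fairness in this framework (the formula itself is not spelled out on the slide) is the alpha-fair utility; α = 1 gives proportional fairness, and α → ∞ approaches max-min fairness:

```latex
U_\alpha(x) =
\begin{cases}
\dfrac{x^{1-\alpha}}{1-\alpha}, & \alpha \ge 0,\ \alpha \ne 1,\\[4pt]
\log x, & \alpha = 1.
\end{cases}
```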

Convex Optimization Application: TCP Flow Control Take home messages: – TCP is solving a distributed optimization problem! – Primal-dual algorithm can converge! – Packet drop, and RTT are carrying “price of congestion”! – Fairness reflected by utility function

Outline Toy Examples Overview Convex Optimization Linear Programming Linear Integer Programming Summary

Linear Programming A subset of convex programming Goal: maximize or minimize a linear function on a domain defined by linear inequalities and equations Special properties

Linear Programming Formulation – General form: max or min a linear objective function, subject to a set of linear constraints (linear inequalities or linear equations).

Linear Programming Formulation – Canonical form: min c^T x subject to Ax ≥ b ("greater than" inequalities only), x ≥ 0 (non-negative variables).

Linear Programming Example Maximize a linear objective subject to linear constraints (figure: the constraints and objective plotted in the x–y plane).

Linear Programming Example – Observations: The optimum is at a corner! – Questions: When can an optimum be attained at a point that is not a corner? How do we find the optimum in a more general LP problem? (figure: feasible region)
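The "optimum at a corner" observation suggests a brute-force method for tiny LPs: enumerate all vertices of the feasible polytope and evaluate the objective at each. The LP data below is hypothetical (the slide's actual numbers were in an image):

```python
from itertools import combinations

# Hypothetical LP: maximize x + y  s.t.  x + 2y <= 4,  4x + 2y <= 12,  x, y >= 0.
# Each constraint is stored as (a, b, c) meaning a*x + b*y <= c.
cons = [(1, 2, 4), (4, 2, 12), (-1, 0, 0), (0, -1, 0)]

def intersect(c1, c2):
    """Solve the 2x2 system where both constraints hold with equality."""
    a1, b1, d1 = c1
    a2, b2, d2 = c2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        return None  # parallel constraints: no unique intersection
    return ((d1 * b2 - d2 * b1) / det, (a1 * d2 - a2 * d1) / det)

def feasible(pt):
    return all(a * pt[0] + b * pt[1] <= c + 1e-9 for a, b, c in cons)

# The LP optimum is attained at a vertex, so for small problems it suffices
# to enumerate pairwise intersections and keep the feasible ones.
vertices = [p for c1, c2 in combinations(cons, 2)
            if (p := intersect(c1, c2)) is not None and feasible(p)]
best = max(vertices, key=lambda p: p[0] + p[1])
```

This is only viable for toy instances; the simplex method visits corners far more selectively.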

Linear Programming (Dantzig 1951) Simplex method – developed for the US Air Force; very efficient in practice; exponential time in the worst case. (Khachiyan 1979) Ellipsoid method – not efficient in practice; polynomial time in the worst case.

Linear Programming Simplex method – An iterative method for finding the optimum – It moves along the corners of the convex polytope in a cost-descent sense; each corner is a basic feasible solution. – Local optimum = global optimum (convexity)

Linear Programming Application – Max flow problem
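The max-flow problem can indeed be written as an LP; a standard formulation (with edge capacities u_ij, source s, and sink t — notation assumed here, since the slide's formula was an image) is:

```latex
\begin{aligned}
\max_{f \ge 0} \quad & \sum_{j : (s,j) \in E} f_{sj} \\
\text{s.t.} \quad & f_{ij} \le u_{ij}, && \forall (i,j) \in E \quad \text{(capacity)} \\
& \textstyle\sum_{j} f_{ji} = \sum_{j} f_{ij}, && \forall i \ne s, t \quad \text{(flow conservation)}
\end{aligned}
```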

Outline Toy Examples Overview Convex Optimization Linear Programming Linear Integer Programming Summary

Linear Integer Programming Linear programming is powerful in many areas But ………

Linear Integer Programming Decision variables are integers – Building a link between two nodes, 0 or 1? – Selecting a route, 0 or 1? – Assigning servers to a location – Caching a movie, 0 or 1? – Assigning a wireless channel, 0 or 1? – and so on…… No fractional solutions!

Linear Integer Programming Graphical interpretation Only discrete values allowed

Linear Integer Programming Method – Enumeration – Tree search, dynamic programming, etc. Guaranteed "feasible" solution; complexity grows exponentially. (figure: an enumeration tree branching on x_1 = 0, 1, 2, then on x_2 = 0, 1, 2)

Linear Integer Programming Method – Linear relaxation plus rounding (figure: the LP solution at a vertex and a nearby integer solution, with objective direction −c^T x) a) Treat the variables as continuous; b) solve the LP efficiently; c) round the solution. No guarantee on the gap to optimality!

Linear Integer Programming Combined Method – Branch-and-bound algorithm Step 1: Solve the LP relaxation. Step 2: If the relaxed solution is integral, stop; otherwise branch on a fractional variable into subproblems and use the LP values to bound (prune) the search.

Linear Integer Programming Example – Knapsack problem Which boxes should be chosen to maximize the amount of money while still keeping the overall weight under 15 kg ?

Linear Integer Programming Example – Knapsack problem Objective Function Unknowns or Variables Constraints
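A dynamic-programming sketch for the knapsack slide. The box values and weights below are sample data (the slide's figure is not in the transcript), with the 15 kg capacity from the slide; a brute-force subset check confirms the DP answer:

```python
from itertools import combinations

# Sample boxes as (value in $, weight in kg); capacity 15 kg as on the slide.
items = [(4, 12), (2, 2), (2, 1), (1, 1), (10, 4)]
capacity = 15

# 0/1 knapsack DP over weight: best[w] = max value achievable within weight w.
best = [0] * (capacity + 1)
for value, weight in items:
    for w in range(capacity, weight - 1, -1):  # downward: each item used at most once
        best[w] = max(best[w], best[w - weight] + value)

# Brute force over all 2^n subsets (exponential — fine only for tiny n).
brute = max(sum(v for v, _ in sub)
            for r in range(len(items) + 1)
            for sub in combinations(items, r)
            if sum(w for _, w in sub) <= capacity)
```

The DP runs in O(n·W) time, which is pseudo-polynomial in the capacity W; this does not contradict the NP-hardness noted on the next slide.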

Linear Integer Programming Example – Knapsack problem Is the problem difficult? What is the complexity of your approach? It is NP-hard!

Outline Toy Examples Overview Convex Optimization Linear Programming Linear Integer Programming Summary

Summary Our course covers: a macroscopic view of optimization in networking; ways of solving optimization problems; applications to networking research.

Summary Keywords Convex optimization – Lagrangian dual problem, KKT Linear programming – Simplex, Interior-point Integer linear programming – Dynamic programming, Rounding, or Branch and bound

Summary Complexity ("solve" means finding the optimum) Linear programming – [P], fast to solve Nonlinear programming – Convex: [P], easy to solve – Non-convex: usually [NP], difficult to solve (Mixed) integer linear programming – Usually [NP], difficult to solve

Summary Thanks!

Backup slides

Convex Optimization Concepts – Convex function: more examples (see Boyd's book) We are lucky that many networking problems have convex objectives.

Convex Optimization Example – water-filling problem (presenter note to Tom: you can write the Lagrangian dual function on the whiteboard).

Convex Optimization Method – Roadmap Treat the original optimization problem as the Primal problem Transform primal problem into dual problem Solve dual problem and validate the optimality

Convex Optimization Method – Lagrangian dual function

Convex Optimization Method – For a given λ, solve the inner minimization of the dual problem.

Convex Optimization Method – Comparison Primal: min f_0(x) s.t. f_i(x) ≤ 0, i = 1,…,m; h_i(x) = 0, i = 1,…,p. Dual: min_x L(x, λ), which gives g(λ). If λ ≥ 0, the dual function value is a lower bound on the primal minimum. What shall we do next?

Convex Optimization Method – The Lagrangian dual problem Much easier after getting rid of ugly constraints!

Convex Optimization Method – Question: are these two problems the same?

Convex Optimization Method – Karush-Kuhn-Tucker (KKT) conditions For the convex problem, satisfying KKT means strong duality!