Seminar 236813: Approximation algorithms for LP/IP optimization problems Reuven Bar-Yehuda Technion IIT Slides and papers at:



Example VC
Given a graph G=(V,E) and a penalty p_v ∈ Z for each v ∈ V
Min Σ_v p_v·x_v
s.t.: x_v ∈ {0,1}
x_v + x_u ≥ 1 ∀ {v,u} ∈ E

Linear Programming (LP) / Integer Programming (IP)
Given a profit [penalty] vector p,
Maximize [Minimize] p·x
Subject to: linear constraints F(x)
IP: "x is an integer vector" is an additional constraint

Example VC
Given a graph G=(V,E) and a penalty vector p ∈ Z^n
Minimize p·x
Subject to: x ∈ {0,1}^n
x_i + x_j ≥ 1 ∀ {i,j} ∈ E
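As a sanity check, the VC integer program above can be solved on tiny instances by enumerating all 0/1 vectors. This is illustration only (exponential time); the graph and penalties below are made-up examples:

```python
from itertools import product

def min_vertex_cover_ip(n, edges, p):
    """Solve the VC integer program Min p.x s.t. x_i + x_j >= 1 for every
    edge {i,j}, x in {0,1}^n, by brute-force enumeration."""
    best, best_cost = None, float("inf")
    for x in product((0, 1), repeat=n):
        # covering constraints: every edge has at least one chosen endpoint
        if all(x[i] + x[j] >= 1 for i, j in edges):
            cost = sum(pi * xi for pi, xi in zip(p, x))
            if cost < best_cost:
                best, best_cost = x, cost
    return best, best_cost

# Example: a path 0-1-2 with penalties 3, 1, 3; vertex 1 covers both edges.
x, cost = min_vertex_cover_ip(3, [(0, 1), (1, 2)], [3, 1, 3])
```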

Example SC
Given a collection S_1, S_2, …, S_n of subsets of {1,2,3,…,m} and a penalty vector p ∈ Z^n
Minimize p·x
Subject to: x ∈ {0,1}^n
Σ_{i : j ∈ S_i} x_i ≥ 1 ∀ j = 1..m

Example Min Cut
Given a network N(V,E), s,t ∈ V, and a capacity vector p ∈ Z^|E|
Minimize p·x
Subject to: x ∈ {0,1}^|E|
Σ_{e ∈ P} x_e ≥ 1 ∀ s→t path P

Example Min Path
Given a digraph G(V,E), s,t ∈ V, and a length vector p ∈ Z^|E|
Minimize p·x
Subject to: x ∈ {0,1}^|E|
Σ_{e ∈ P} x_e ≥ 1 ∀ s→t cut P

Example MST (Minimum Spanning Tree)
Given a graph G(V,E) and a length vector p ∈ Z^|E|
Minimize p·x
Subject to: x ∈ {0,1}^|E|
Σ_{e ∈ P} x_e ≥ 1 ∀ cut P

Example Minimum Steiner Tree
Given a graph G(V,E), T ⊆ V, and a length vector p ∈ Z^|E|
Minimize p·x
Subject to: x ∈ {0,1}^|E|
Σ_{e ∈ P} x_e ≥ 1 ∀ cut P separating T

Example Generalized Steiner Forest
Given a graph G(V,E), T_1, T_2, …, T_k ⊆ V, and a length vector p ∈ Z^|E|
Min p·x
S.t.: x ∈ {0,1}^|E|
Σ_{e ∈ P} x_e ≥ 1 ∀ i, ∀ cut P separating T_i

Example IS (Maximum Independent Set)
Given a graph G=(V,E) and a profit vector p ∈ Z^n
Maximize p·x
Subject to: x ∈ {0,1}^n
x_i + x_j ≤ 1 ∀ {i,j} ∈ E

Maximum Independent Set in Interval Graphs
[Figure: nine activity intervals (Activity1..Activity9) laid out on a time axis]
Maximize Σ_I p_I·x_I
s.t. for each instance I: x_I ∈ {0,1}
for each time t: Σ_{I : t ∈ I} x_I ≤ 1
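This special case is solvable exactly by the classic weighted interval scheduling dynamic program, as the applications slide later notes. A minimal sketch (the interval data in the example is made up):

```python
from bisect import bisect_right

def max_weight_intervals(intervals):
    """Maximum-profit set of pairwise-disjoint intervals via the
    O(n log n) dynamic program. intervals: list of (start, end, profit)."""
    intervals = sorted(intervals, key=lambda t: t[1])   # sort by finish time
    ends = [e for _, e, _ in intervals]
    dp = [0] * (len(intervals) + 1)                     # dp[k]: best over first k
    for k, (s, e, w) in enumerate(intervals, start=1):
        # index of the last earlier interval finishing no later than s
        j = bisect_right(ends, s, 0, k - 1)
        dp[k] = max(dp[k - 1], dp[j] + w)               # skip it vs. take it
    return dp[-1]

# Example: [0,3] w=4, [2,5] w=5, [4,7] w=4 -> best is the two disjoint ones.
```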

The Local-Ratio Technique: Basic definitions
Given a penalty [profit] vector p,
Minimize [Maximize] p·x
Subject to: feasibility constraints F(x)
x is an r-approximation if F(x) holds and p·x ≤ r·(p·x*) [p·x ≥ r·(p·x*)]
An algorithm is an r-approximation if for any p, F it returns an r-approximation

The Local-Ratio Theorem:
x is an r-approximation with respect to p_1
x is an r-approximation with respect to p_2 = p − p_1
⇒ x is an r-approximation with respect to p
Proof (for minimization):
p_1·x ≤ r·p_1*
p_2·x ≤ r·p_2*
⇒ p·x ≤ r·(p_1* + p_2*) ≤ r·(p_1 + p_2)*

The Local-Ratio Theorem (Proof 2):
x is an r-approximation with respect to p_1
x is an r-approximation with respect to p_2 = p − p_1
⇒ x is an r-approximation with respect to p
Proof 2 (for minimization): Let x*, x_1*, x_2* be optimal solutions for p, p_1, p_2 respectively.
p_1·x ≤ r·(p_1·x_1*)
p_2·x ≤ r·(p_2·x_2*)
⇒ p·x ≤ r·(p_1·x_1* + p_2·x_2*) ≤ r·(p_1·x* + p_2·x*) = r·p·x*
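Written as a single chain of inequalities (minimization, p = p_1 + p_2, with x_1*, x_2*, x* optimal for p_1, p_2, p):

```latex
\[
p \cdot x \;=\; p_1 \cdot x + p_2 \cdot x
\;\le\; r\,(p_1 \cdot x_1^*) + r\,(p_2 \cdot x_2^*)
\;\le\; r\,(p_1 \cdot x^* + p_2 \cdot x^*)
\;=\; r\,(p \cdot x^*).
\]
```

The middle step uses optimality of x_1* and x_2* for their own weight vectors, which is exactly why the decomposition is "local".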

Special case: optimization is 1-approximation
x is an optimum with respect to p_1
x is an optimum with respect to p − p_1
⇒ x is an optimum with respect to p

A Local-Ratio Schema for Minimization [Maximization] problems:
Algorithm r-ApproxMin[Max](Set, p)
If Set = ∅ then return ∅;
If ∃ I ∈ Set with p(I) = 0 then return {I} ∪ r-ApproxMin(Set−{I}, p);
[If ∃ I ∈ Set with p(I) ≤ 0 then return r-ApproxMax(Set−{I}, p);]
Define "good" p_1;
REC = r-ApproxMin[Max](Set, p − p_1);
If REC is not an r-approximation w.r.t. p_1 then "fix it";
return REC;

The Local-Ratio Theorem: Applications
Applications to some optimization algorithms (r = 1):
- (MST) Minimum Spanning Tree (Kruskal)
- (SHORTEST-PATH) s-t Shortest Path (Dijkstra)
- (LONGEST-PATH) s-t DAG Longest Path (can be done with dynamic programming)
- (INTERVAL-IS) Independent Set in Interval Graphs (usually done with dynamic programming)
- (LONG-SEQ) Longest (weighted) monotone subsequence (can be done with dynamic programming)
- (MIN-CUT) Minimum Capacity s,t Cut (e.g. Ford, Dinitz)
Applications to some 2-approximation algorithms (r = 2):
- (VC) Minimum Vertex Cover (Bar-Yehuda and Even)
- (FVS) Vertex Feedback Set (Becker and Geiger)
- (GSF) Generalized Steiner Forest (Williamson, Goemans, Mihail, and Vazirani)
- (Min 2SAT) Minimum Two-Satisfiability (Gusfield and Pitt)
- (2VIP) Two Variable Integer Programming (Bar-Yehuda and Rawitz)
- (PVC) Partial Vertex Cover (Bar-Yehuda)
- (GVC) Generalized Vertex Cover (Bar-Yehuda and Rawitz)
Applications to some other approximations:
- (SC) Minimum Set Cover (Bar-Yehuda and Even)
- (PSC) Partial Set Cover (Bar-Yehuda)
- (MSP) Maximum Set Packing (Arkin and Hassin)
Applications to Resource Allocation and Scheduling: …

The creative part: find r-effective weights
p_1 is r-effective if every feasible solution is an r-approximation w.r.t. p_1, i.e. p_1·x ≤ r·p_1*
For VC (vertex cover): Edge, Matching, Greedy, Homogeneous

VC: Recursive implementation (edge by edge)
VC(V, E, p)
If E = ∅ return ∅;
If ∃v with p(v) = 0 return {v} ∪ VC(V−{v}, E−E(v), p);
Let (x,y) ∈ E;
Let ε = min{p(x), p(y)};
Define p_1(v) = ε if v ∈ {x,y}, and 0 otherwise;
Return VC(V, E, p − p_1)

VC: Iterative implementation (edge by edge)
VC(V, E, p)
for each e ∈ E:
  let ε = min{p(v) | v ∈ e};
  for each v ∈ e: p(v) = p(v) − ε;
return {v | p(v) = 0};
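A direct transcription of this edge-by-edge algorithm in Python (the graph in the example is made up). Every processed edge ends with a zero-weight endpoint, so the returned set is a cover, and by the Local-Ratio Theorem it is a 2-approximation:

```python
def lr_vertex_cover(vertices, edges, p):
    """Local-ratio 2-approximation for weighted Vertex Cover: for each edge,
    subtract eps = min endpoint weight from both endpoints; the vertices
    driven to weight zero form the cover."""
    p = dict(p)                          # work on a copy of the weights
    for u, v in edges:
        eps = min(p[u], p[v])            # the edge weight function p_1
        p[u] -= eps
        p[v] -= eps
    return {v for v in vertices if p[v] == 0}

# Example: path a-b-c with p = {a: 3, b: 1, c: 3}; b alone covers both edges.
```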

Min 5x_Bisli + 8x_Tea + 12x_Water + 10x_Bamba + 20x_Shampoo + 15x_Popcorn + 6x_Chocolate
s.t. x_Shampoo + x_Water ≥ 1

Movie: 1 for the price of 2


VC: Greedy (O(H(Δ))-approximation)
H(Δ) = 1 + 1/2 + 1/3 + … + 1/Δ = O(ln Δ)
Greedy_VC(V, E, p)
C = ∅;
while E ≠ ∅
  let v = arg min p(v)/d(v);
  C = C + {v};
  V = V − {v};
return C;
[Figure: tight example, a bipartite graph with vertex groups of sizes n/Δ, …, n/4, n/3, n/2 facing a group of n vertices]
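A runnable version of this greedy, assuming the graph is given as adjacency sets (the star graph in the example is made up):

```python
def greedy_vertex_cover(adj, p):
    """Greedy weighted VC: repeatedly take the vertex minimizing
    weight/degree, then delete it. O(H(Delta))-approximation."""
    adj = {v: set(ns) for v, ns in adj.items()}   # work on a copy
    p = dict(p)
    cover = set()
    while any(adj.values()):                      # while edges remain
        v = min((u for u in adj if adj[u]), key=lambda u: p[u] / len(adj[u]))
        cover.add(v)
        for u in adj.pop(v):                      # remove v and its edges
            adj[u].discard(v)
    return cover

# Example: star with cheap center c; greedy takes c (ratio 1/3 beats 10/1).
```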

VC: LR-Greedy (star by star)
LR_Greedy_VC(V, E, p)
C = ∅;
while E ≠ ∅
  let v = arg min p(v)/d(v);
  let ε = p(v)/d(v);
  C = C + {v};
  V = V − {v};
  for each u ∈ N(v): p(u) = p(u) − ε;
return C;

VC: LR-Greedy by reducing 2-effective homogeneous weights
Homogeneous = all vertices have the same "greedy value" p(v)/d(v)
LR_Greedy_VC(V, E, p)
C = ∅;
Repeat
  Let ε = min p(v)/d(v);
  For each v ∈ V: p(v) = p(v) − ε·d(v);
  Move from V to C all zero-weight vertices;
  Remove from V all zero-degree vertices;
Until E = ∅
Return C;
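A minimal sketch of this homogeneous-reduction loop (adjacency-set input as above; the triangle in the test is a made-up example, and a small tolerance guards against floating-point residue):

```python
def lr_homogeneous_vc(adj, p):
    """Local-ratio VC by repeatedly subtracting the 2-effective homogeneous
    weights eps*d(v), moving zero-weight vertices into the cover."""
    adj = {v: set(ns) for v, ns in adj.items()}
    p = dict(p)
    cover = set()
    while any(adj.values()):
        eps = min(p[v] / len(adj[v]) for v in adj if adj[v])
        for v in adj:
            if adj[v]:
                p[v] -= eps * len(adj[v])         # homogeneous weight eps*d(v)
        zeros = [v for v in adj if adj[v] and p[v] <= 1e-12]
        for v in zeros:                           # zero-weight vertices join C
            cover.add(v)
            for u in adj.pop(v):
                adj[u].discard(v)                 # zero-degree vertices drop out
    return cover
```

Each pass zeroes at least the arg-min vertex, so the loop terminates; on a unit-weight triangle one pass zeroes all three vertices.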

Example MST (Minimum Spanning Tree)
Given a graph G(V,E) and a length vector p ∈ Z^|E|
Minimize p·x
Subject to: x ∈ {0,1}^|E|
Σ_{e ∈ P} x_e ≥ 1 ∀ cut P

MST: Recursive implementation (homogeneous)
MST(V, E, p)
If V = ∅ return ∅;
If ∃ self-loop e return MST(V, E−{e}, p);
If ∃e with p(e) = 0 return {e} ∪ MST(V, E with e contracted, p);
Let ε = min{p(e) : e ∈ E};
Define p_1(e) = ε for all e ∈ E;
Return MST(V, E, p − p_1)

MST: Iterative implementation (homogeneous)
MST(V, E, p) = Kruskal's algorithm
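Kruskal is exactly the iterative view of the recursion above: scanning edges in increasing weight order is equivalent to repeatedly subtracting the minimum remaining weight from every edge and contracting the zeroed edges. A minimal union-find sketch (the edge list in the example is made up):

```python
def kruskal(n, edges):
    """Kruskal's MST on vertices 0..n-1. edges: list of (weight, u, v)."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]         # path halving
            x = parent[x]
        return x
    tree, total = [], 0
    for w, u, v in sorted(edges):                 # increasing weight order
        ru, rv = find(u), find(v)
        if ru != rv:                              # skip edges that became self-loops
            parent[ru] = rv                       # contract: merge components
            tree.append((u, v))
            total += w
    return tree, total

# Example: a 4-cycle with weights 1,2,3,4 plus a heavy chord of weight 10.
```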

Some effective weights (column placement partly reconstructed from a garbled table)

                VC       IS        MST  S.Path  Steiner  FVS  Min Cut
Edge            2
Matching        2
Cycle           2−1/k
Clique                   (k−1)/k
Star            2        (k+1)/2        1
Homogeneous     2        k         1                     2    2
Special trick                                                 1