1 The instructor will be absent on March 29th. The class resumes on March 31st.

2 Section 9-7 PTAS Error Ratio
Polynomial-Time Approximation Scheme (PTAS):
–A family of approximation algorithms.
–For any pre-specified error bound ε > 0, the family contains an approximation algorithm with error ratio at most ε.
–For every fixed ε, the complexity is still polynomial in n.

3 Planar Graph Definition: A graph is said to be embedded on a surface S if it can be drawn on S so that its edges intersect only at their end vertices. A graph is a planar graph if it can be embedded on a plane.

4 Examples

5 Face
Definition: A face is a region defined by a planar embedding.
–The unbounded face is called the exterior face.
–All other faces are called interior faces.

6 k-outerplanar
In a planar graph, we can associate each node with a level.
–Nodes on the exterior face are of level 1.
–After removing all level-1 nodes, the nodes on the exterior face of the remaining graph are of level 2, and so on.
A graph is k-outerplanar if it has no node with level greater than k.

7 Example

8 Level

9 Level

10 Level

11 Max Independent Set Problem
(cf. maximum vs. maximal.)
The maximum independent set problem on planar graphs is NP-hard.
For a k-outerplanar graph, an optimal solution for the maximum independent set problem can be found in O(8^k n) time,
–through the dynamic programming approach,
–where n is the number of vertices.

A Planar Graph with 9 Levels

Nodes on Level 1,4,7

Nodes on Level 2,5,8

Nodes on Level 3,6,9

Graph Obtained by Removing Nodes on Levels 3, 6, 9
This is a 2-outerplanar graph. The maximum independent set problem on it can be solved in O(8^k n) time, where k = 2.

17 Algorithm 9-7: An Approximation Algorithm for the Max Independent Set Problem on Planar Graphs
Step 1. For all i = 0, 1, 2, …, k, do
–(1.1) Let G_i be the graph obtained by deleting all nodes whose level is congruent to i (mod k+1). The remaining subgraphs are all k-outerplanar graphs.
–(1.2) For each k-outerplanar subgraph, find its maximum independent set. Let S_i denote the union of these solutions.
Step 2. Among S_0, S_1, …, S_k, choose the S_j with the maximum size and let it be our approximate solution S_APX.
Time complexity: O(8^k kn)
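To make the algorithm concrete, here is a short Python sketch of Algorithm 9-7. The two helpers passed in as arguments, node_levels (returning the outerplanarity level of every node) and mis_k_outerplanar (the exact O(8^k n) dynamic program on a k-outerplanar piece mentioned on slide 11), are assumed placeholders and are not implemented here.

def approx_max_independent_set(G, k, node_levels, mis_k_outerplanar):
    """Algorithm 9-7 (Baker-style shifting) for max independent set on a
    planar graph G, given as a dict {node: set of neighbours}.

    node_levels(G) -> {node: level} and mis_k_outerplanar(subgraph) -> set
    are assumed helpers (level computation and the exact O(8^k n) DP on
    k-outerplanar pieces); they are placeholders, not library calls.
    """
    levels = node_levels(G)
    best = set()
    for i in range(k + 1):
        # (1.1) Delete every node whose level is congruent to i (mod k+1);
        #       each remaining connected piece is k-outerplanar.
        kept = {v for v in G if levels[v] % (k + 1) != i}
        H = {v: G[v] & kept for v in kept}
        # (1.2) Solve each piece exactly and take the union S_i.
        S_i = set()
        for piece in connected_components(H):
            S_i |= mis_k_outerplanar(piece)
        # Step 2: keep the largest S_j seen so far as S_APX.
        if len(S_i) > len(best):
            best = S_i
    return best


def connected_components(H):
    """Split an adjacency dict into its connected induced subgraphs (BFS)."""
    seen, pieces = set(), []
    for s in H:
        if s in seen:
            continue
        comp, stack = set(), [s]
        seen.add(s)
        while stack:
            v = stack.pop()
            comp.add(v)
            for u in H[v]:
                if u not in seen:
                    seen.add(u)
                    stack.append(u)
        pieces.append({v: H[v] & comp for v in comp})
    return pieces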

18 Analysis
All nodes are divided into (k+1) classes:
–Class i contains the nodes whose level is congruent to i (mod k+1), for i = 0, 1, …, k.
For every independent set S, the average number of nodes per class is |S|/(k+1).
Hence there is at least one r such that at most 1/(k+1) of the vertices in S_OPT are at a level congruent to r (mod k+1).

19 Analysis (cont.)
Because at most |S_OPT|/(k+1) nodes are deleted, the set obtained by deleting the class-r nodes from S_OPT has at least |S_OPT|(1 − 1/(k+1)) = |S_OPT|·k/(k+1) nodes; it is an independent set of G_r, so
|S_r| ≥ |S_OPT|·k/(k+1).
According to our algorithm, |S_APX| ≥ |S_r|.

20 PTAS
If we set k = ⌈1/ε⌉ − 1, the above formula becomes |S_APX| ≥ |S_OPT|·k/(k+1) ≥ (1 − ε)|S_OPT| (spelled out below).
Thus for every given error bound ε, we have a corresponding k to guarantee that the approximate solution differs from the optimum one within this error ratio.
No matter how small ε is, the complexity of the algorithm is O(8^k kn), which is polynomial with respect to n.
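Written out, using only the quantities from slides 19-20:

\[
|S_{\mathrm{APX}}| \;\ge\; |S_r| \;\ge\; \frac{k}{k+1}\,|S_{\mathrm{OPT}}|
\;=\; \Bigl(1-\tfrac{1}{k+1}\Bigr)|S_{\mathrm{OPT}}|
\;\ge\; (1-\varepsilon)\,|S_{\mathrm{OPT}}|,
\qquad k=\lceil 1/\varepsilon\rceil-1,
\]

since k + 1 = ⌈1/ε⌉ ≥ 1/ε implies 1/(k+1) ≤ ε.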

21 0/1 Knapsack Problem
n objects, each with a weight w_i > 0 and a profit p_i > 0; capacity of knapsack: M.
Maximize p_1x_1 + p_2x_2 + … + p_nx_n
Subject to w_1x_1 + w_2x_2 + … + w_nx_n ≤ M, x_i = 0 or 1, 1 ≤ i ≤ n.

22 Greedy on Density
A bad instance for greedy-by-density (three items):
P: 2k+3, 2k, 2k
W: k+1, k, k
M = 2k: APX = 2k+3 (greedy takes only item 1), OPT = 4k (items 2 and 3).
M = 2k+1: APX = OPT (greedy now fits item 2 as well).
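For reference, a small Python sketch of the density-greedy heuristic being analyzed here; the instance at the bottom instantiates the table above with k = 10 (my own choice of k, not from the slides).

def greedy_by_density(profits, weights, M):
    """0/1 knapsack heuristic: scan items in order of profit/weight and
    take each one that still fits. Returns (total_profit, chosen_indices)."""
    order = sorted(range(len(profits)),
                   key=lambda i: profits[i] / weights[i], reverse=True)
    total, cap, chosen = 0, M, []
    for i in order:
        if weights[i] <= cap:
            cap -= weights[i]
            total += profits[i]
            chosen.append(i)
    return total, chosen


# The instance above with k = 10: profits (2k+3, 2k, 2k), weights (k+1, k, k).
k = 10
print(greedy_by_density([2 * k + 3, 2 * k, 2 * k], [k + 1, k, k], 2 * k))
# -> (23, [0]); the optimum for M = 2k is items 2 and 3 with profit 4k = 40.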

PTAS of 0/1 Knapsack Problem
We shall demonstrate how to obtain an approximation algorithm with error ratio ε, no matter how small ε is.

Step 1: Sort the items according to density.
[Table of the example instance: columns i, p_i, w_i, p_i/w_i]

25 Step 2: Calculate a number Q
Find the largest d such that W = w_1 + w_2 + … + w_d ≤ M.
If d = n or W = M, then
–set P_APX = p_1 + p_2 + … + p_d and INDICES = {1, 2, …, d} and stop;
–in this case, P_OPT = P_APX.
Otherwise, set Q = p_1 + p_2 + … + p_d + p_{d+1}.
For our case, d = 3 and Q = 234.
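A minimal Python sketch of Step 2, assuming the items are already sorted by density as in Step 1 (the function name and 0-based indexing are my own):

def compute_d_and_Q(p, w, M):
    """Step 2 of the slides: p and w are parallel lists, already sorted
    by non-increasing density p[i]/w[i] (Step 1).

    Returns (d, value, exact):
      - if all items fit or the prefix weight equals M exactly,
        value = p[0]+...+p[d-1] is already optimal (exact=True);
      - otherwise value = Q = p[0]+...+p[d-1]+p[d], and Q/2 <= P_OPT <= Q.
    """
    total_w = total_p = d = 0
    while d < len(p) and total_w + w[d] <= M:
        total_w += w[d]
        total_p += p[d]
        d += 1
    if d == len(p) or total_w == M:
        return d, total_p, True
    return d, total_p + p[d], False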

26 Characteristics of Q
1. p_1 + p_2 + … + p_d ≤ P_OPT
2. w_{d+1} ≤ M, therefore taking item d+1 alone is a feasible solution: p_{d+1} ≤ P_OPT
3. Q = p_1 + p_2 + … + p_d + p_{d+1} ≤ 2·P_OPT
4. P_OPT ≤ Q: since the items are in non-increasing density order and items 1, …, d+1 already overflow the capacity, Q is at least the fractional (and hence the integral) optimum.
Hence Q/2 ≤ P_OPT ≤ Q.

27 Step 3: Calculate a normalizing factor δ
δ = Q(ε/3)^2. Let ε be 0.6.
–In our case, δ = 234(0.6/3)^2 = 234(0.2)^2 = 9.36.
Calculate T = Q(ε/3).
–In our case, T = 234(0.6/3) = 46.8.

28 Step 4.1: SMALL & BIG
Let SMALL collect all items whose profits are smaller than or equal to T. Collect all other items into BIG.
In our case,
–SMALL = {4, 5, 6, 7, 8}
–BIG = {1, 2, 3}

29 Step 4.2: Normalize the items in BIG
Divide each profit in BIG by the normalizing factor δ and round down: p_i' = ⌊p_i/δ⌋.
In our case, p_1' = 9, p_2' = 6, p_3' = 5.

30 Step 4.3: Initialize an array A
Array A has size g. Each entry corresponds to a combination of the p_i'. Each entry A[i] consists of three fields:
–I: index of the combination
–P: sum of profits
–W: sum of weights

31 Step 4.4: Run dynamic programming on the items in BIG
When i = 1, p_1' = 9.

32 When i = 2, p_2' = 6.

33 When i = 3, p_3' = 5.

34 Step 5: For each entry of A, add items in SMALL using the greedy algorithm.
Step 6: Pick the largest resulting profit to be our approximate solution.
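Putting Steps 1-6 together, here is a compact Python sketch of the whole PTAS. It follows the slides' outline, but the data-structure details (a dictionary for array A, keeping only the lightest combination per normalized-profit sum, floor rounding p_i' = ⌊p_i/δ⌋) are assumptions rather than text taken from the slides.

import math

def knapsack_ptas(profits, weights, M, eps):
    """Sketch of the slides' PTAS for 0/1 knapsack (Steps 1-6).

    profits/weights are parallel lists, M is the capacity, eps the
    error bound. Returns an approximate total profit.
    """
    n = len(profits)
    # Step 1: sort items by density p_i / w_i, largest first.
    order = sorted(range(n), key=lambda i: profits[i] / weights[i], reverse=True)
    p = [profits[i] for i in order]
    w = [weights[i] for i in order]

    # Step 2: largest d with w_1 + ... + w_d <= M, then Q.
    total_w = total_p = d = 0
    while d < n and total_w + w[d] <= M:
        total_w += w[d]; total_p += p[d]; d += 1
    if d == n or total_w == M:
        return total_p                          # greedy prefix is optimal
    Q = total_p + p[d]                          # Q/2 <= P_OPT <= Q

    # Step 3: normalizing factor delta and threshold T.
    delta = Q * (eps / 3) ** 2
    T = Q * (eps / 3)

    # Step 4.1: SMALL (profit <= T) and BIG (profit > T).
    BIG = [i for i in range(n) if p[i] > T]
    SMALL = [i for i in range(n) if p[i] <= T]

    # Steps 4.2-4.4: DP over BIG with normalized profits p_i' = floor(p_i/delta).
    # A maps a normalized-profit sum to (real profit, weight); keeping only the
    # lightest entry per sum keeps |A| within g ~ (3/eps)^2.
    A = {0: (0, 0)}
    for i in BIG:
        pi_norm = math.floor(p[i] / delta)
        for key, (prof, wt) in list(A.items()):
            if wt + w[i] > M:
                continue
            new_key = key + pi_norm
            if new_key not in A or wt + w[i] < A[new_key][1]:
                A[new_key] = (prof + p[i], wt + w[i])

    # Steps 5-6: greedily add SMALL items (already in density order from
    # Step 1) to each entry of A, and keep the best total profit found.
    best = 0
    for prof, wt in A.values():
        cap, extra = M - wt, 0
        for i in SMALL:
            if w[i] <= cap:
                cap -= w[i]; extra += p[i]
        best = max(best, prof + extra)
    return best

Keeping only the minimum-weight entry per normalized-profit sum is what makes Step 4.4 take O(ng) rather than O(2^|BIG|) time, as discussed on slides 35-36.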

35 Why is it polynomial?
Step 4.4 is an exhaustive scanning step. Intuitively, it will take exponential time: 2^|BIG| combinations.
Actually, the size of array A is no larger than g: every feasible combination has total profit at most Q, so its normalized-profit sum is at most Q/δ = (3/ε)^2, and A keeps only one entry per distinct sum.

36 Time Complexity
Step 1: O(n log n)
Step 2: O(n)
Steps 4.1 to 4.2: O(n)
Step 4.3: O(g)
Step 4.4: O(ng)
Step 5: O(ng)
Step 6: O(g)
Total: O(n log n) + O(ng) = O(n log n) + O(n(3/ε)^2).

37 Error Analysis
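The details of this slide are not preserved in the transcript. With δ = Q(ε/3)^2 and T = Q(ε/3) as defined in Step 3, a standard error bound of this form is (this reconstruction is an assumption, not the slide's own text):

\[
P_{\mathrm{OPT}} - P_{\mathrm{APX}}
\;\le\; \frac{P_{\mathrm{OPT}}}{T}\,\delta \;+\; T
\;=\; \frac{\varepsilon}{3}\,P_{\mathrm{OPT}} \;+\; \frac{\varepsilon}{3}\,Q
\;\le\; \frac{\varepsilon}{3}\,P_{\mathrm{OPT}} \;+\; \frac{2\varepsilon}{3}\,P_{\mathrm{OPT}}
\;=\; \varepsilon\,P_{\mathrm{OPT}}.
\]

Here P_OPT/T bounds the number of BIG items in an optimal solution (each has profit greater than T), each of which loses less than δ to the rounding in Step 4.2; the greedy fill of SMALL items loses at most one item's profit, which is at most T; and the last inequality uses Q ≤ 2·P_OPT from slide 26.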

38 Exercise
Use the example given on p. 493.
–Run the PTAS with ε = 0.75.
–What is the solution you obtain?
–What is the error ratio?
Due: April 14th. The instructor will be absent next week.

39 Exercise
Write a program which runs the PTAS for the 0/1 Knapsack Problem. (Due: April 19th)
Input format: