# Optimization problems: INSTANCE, FEASIBLE SOLUTIONS, COST


Optimization problems are specified by: INSTANCE, FEASIBLE SOLUTIONS, COST.

Vertex Cover problem. INSTANCE: graph G = (V, E). FEASIBLE SOLUTIONS: S ⊆ V such that (∀ e ∈ E) S ∩ e ≠ ∅. COST: c(S) = |S|.

Set Cover problem. INSTANCE: family of sets A_1, ..., A_n ⊆ Ω. FEASIBLE SOLUTIONS: S ⊆ [n] such that ⋃_{i∈S} A_i = Ω. COST: c(S) = |S|.

Set Cover problem. INSTANCE: family of sets A_1, ..., A_n ⊆ Ω. FEASIBLE SOLUTIONS: S ⊆ [n] such that ⋃_{i∈S} A_i = Ω. COST: c(S) = |S|. Vertex Cover problem. INSTANCE: graph G = (V, E). FEASIBLE SOLUTIONS: S ⊆ V such that (∀ e ∈ E) S ∩ e ≠ ∅. COST: c(S) = |S|.

Set Cover problem. INSTANCE: family of sets A_1, ..., A_n ⊆ Ω. FEASIBLE SOLUTIONS: S ⊆ [n] such that ⋃_{i∈S} A_i = Ω. COST: c(S) = |S|. Vertex Cover is a special case of Set Cover: take A_i ⊆ E to be the set of edges adjacent to vertex i ∈ V; then S ⊆ V covers all edges iff ⋃_{i∈S} A_i = E.

Optimization problems: INSTANCE, FEASIBLE SOLUTIONS, COST. OPTIMAL SOLUTION: OPT = min { c(T) : T ∈ FEASIBLE SOLUTIONS }.

α-approximation algorithm: on every INSTANCE it outputs a feasible solution T such that c(T) ≤ α · OPT.

Last Class: 2-approximation algorithm for Vertex Cover; 2-approximation algorithm for Metric TSP; 1.5-approximation algorithm for Metric TSP.

This Class: (1+ε)-approximation algorithm for Knapsack; O(log n)-approximation algorithm for Set Cover.

Knapsack. INSTANCE: value v_i and weight w_i for each i ∈ {1, ..., n}; weight limit W. FEASIBLE SOLUTION: a collection of items S ⊆ {1, ..., n} with total weight ≤ W. COST (MAXIMIZE): the sum of the values of the items in S.

Knapsack. INSTANCE: value v_i and weight w_i for each i ∈ {1, ..., n}; weight limit W. FEASIBLE SOLUTION: a collection of items S ⊆ {1, ..., n} with total weight ≤ W. COST (MAXIMIZE): the sum of the values of the items in S. We had: a pseudo-polynomial algorithm with time O(Wn), and a pseudo-polynomial algorithm with time O(Vn), where V = v_1 + ... + v_n.
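The O(Wn) pseudo-polynomial algorithm is a standard weight-indexed dynamic program; a minimal Python sketch (function name ours):

```python
def knapsack_dp(values, weights, W):
    """Pseudo-polynomial knapsack: best[w] = max value achievable
    with total weight at most w.  Time O(W * n)."""
    n = len(values)
    best = [0] * (W + 1)
    for i in range(n):
        # iterate weights downward so each item is used at most once
        for w in range(W, weights[i] - 1, -1):
            best[w] = max(best[w], best[w - weights[i]] + values[i])
    return best[W]
```

The O(Vn) variant instead indexes the table by achievable value and stores the minimum weight per value; that version is the one the rounding below converts into an approximation algorithm.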

Knapsack. INSTANCE: value v_i and weight w_i for each i ∈ {1, ..., n}; weight limit W. FEASIBLE SOLUTION: a collection of items S ⊆ {1, ..., n} with total weight ≤ W. COST (MAXIMIZE): the sum of the values of the items in S. We had a pseudo-polynomial algorithm with time O(Vn), where V = v_1 + ... + v_n. GOAL: convert it into an approximation algorithm. IDEA: rounding.

Knapsack. Wlog all w_i ≤ W. Let M = maximum of the v_i. Round v_i → v_i' := ⌊n v_i / (Mε)⌋; then OPT' ≤ n²/ε. Let S = optimal solution in the original instance and S' = optimal solution in the modified instance. Will show: an optimal solution of the modified instance is an approximately optimal solution of the original.

Knapsack. v_i → v_i' := ⌊n v_i / (Mε)⌋; S = optimal solution in the original, S' = optimal solution in the modified. Will show: an optimal solution of the modified instance is approximately optimal in the original: (n/(Mε)) · Σ_{i∈S'} v_i ≥ Σ_{i∈S'} v_i' ≥ Σ_{i∈S} v_i' ≥ Σ_{i∈S} (n v_i/(Mε) − 1).

Knapsack. v_i → v_i' := ⌊n v_i / (Mε)⌋; S = optimal solution in the original, S' = optimal solution in the modified. As above, (n/(Mε)) · Σ_{i∈S'} v_i ≥ Σ_{i∈S'} v_i' ≥ Σ_{i∈S} v_i' ≥ Σ_{i∈S} (n v_i/(Mε) − 1). Dividing by n/(Mε): Σ_{i∈S'} v_i ≥ Σ_{i∈S} (v_i − Mε/n) ≥ OPT − Mε ≥ OPT(1 − ε), where the last step uses M ≤ OPT.

Running time? The pseudo-polynomial algorithm runs in time O(V'n), where V' = v_1' + ... + v_n', with M = maximum of the v_i and v_i → v_i' := ⌊n v_i / (Mε)⌋.

Running time? The pseudo-polynomial algorithm runs in time O(V'n), where V' = v_1' + ... + v_n', with M = maximum of the v_i and v_i → v_i' := ⌊n v_i / (Mε)⌋. Each v_i' ≤ n/ε, so V' ≤ n²/ε, and the running time is O(n³/ε).

FPTAS: a fully polynomial-time approximation scheme is a (1+ε)-approximation algorithm running in time poly(|INPUT|, 1/ε). We have an algorithm for the Knapsack problem which outputs a solution of value ≥ (1−ε)·OPT and runs in time O(n³/ε).
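The whole FPTAS (rounding plus the value-indexed dynamic program) fits in a short Python sketch; the function name and table layout are ours, but the rounding v_i' = ⌊n v_i/(Mε)⌋ is exactly the scheme above:

```python
def knapsack_fptas(values, weights, W, eps):
    """(1-eps)-approximation for knapsack via value rounding.
    Time O(n * V') = O(n^3 / eps)."""
    n = len(values)
    M = max(values)
    # rounded values v_i' = floor(n * v_i / (M * eps))
    vr = [int(n * v / (M * eps)) for v in values]
    V = sum(vr)
    INF = float('inf')
    # minw[i][v] = min weight of a subset of items 0..i-1 with rounded value v
    minw = [[INF] * (V + 1) for _ in range(n + 1)]
    minw[0][0] = 0
    for i in range(n):
        for v in range(V + 1):
            minw[i + 1][v] = minw[i][v]
            if v >= vr[i] and minw[i][v - vr[i]] + weights[i] < minw[i + 1][v]:
                minw[i + 1][v] = minw[i][v - vr[i]] + weights[i]
    # largest rounded value achievable within the weight limit
    best_v = max(v for v in range(V + 1) if minw[n][v] <= W)
    # reconstruct the chosen items and report their ORIGINAL total value
    chosen, v = [], best_v
    for i in range(n, 0, -1):
        if minw[i][v] != minw[i - 1][v]:
            chosen.append(i - 1)
            v -= vr[i - 1]
    return sum(values[i] for i in chosen)
```

By the analysis above, the returned value is at least (1−ε)·OPT for the original instance.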

Weighted set cover problem. INSTANCE: A_1, ..., A_m ⊆ Ω, weights w_1, ..., w_m. FEASIBLE SOLUTION: a collection S of the A_i covering Ω. OBJECTIVE (minimize): the cost Σ_{i∈S} w_i of the collection (in the unweighted version we have w_i = 1).

Weighted set cover problem. Greedy algorithm: pick the A_i with minimal w_i / |A_i|; remove the elements of A_i from Ω; repeat.

Negative example (last class): approximation ratio = Ω(log n).

Weighted set cover problem. Greedy algorithm: pick the A_i with minimal w_i / |A_i|; remove the elements of A_i from Ω; repeat. Theorem: this is an O(log n)-approximation algorithm.

Weighted set cover problem. Greedy algorithm: pick the A_i with minimal w_i / |A_i|; remove the elements of A_i from Ω; repeat. Theorem: this is an O(log n)-approximation algorithm. Charging argument: when A_i is picked, the cost of the solution increases by w_i; charge this to the newly covered elements, so everybody in A_i pays w_i / |A_i|.

Weighted set cover problem. Let B be a set of weight w. How much did the elements of B pay? While B is entirely uncovered, B itself is a candidate offering cost w / |B| per element ("pick me!").

Weighted set cover problem. Greedy algorithm: pick the A_i with minimal w_i / |A_i|; remove the elements of A_i from Ω; repeat. Theorem: this is an O(log n)-approximation algorithm. Whatever set A_i greedy actually picks at this step is at least as cheap per element ("sorry, A_i was cheaper"), so the first covered element of B paid less than w / |B|.

Weighted set cover problem. Continue: the number of uncovered elements of B went down by 1, so B now offers cost w / (|B| − 1) per element.

Weighted set cover problem. Greedy algorithm: pick the A_i with minimal w_i / |A_i|; remove the elements of A_i from Ω; repeat. Theorem: this is an O(log n)-approximation algorithm. Again the set A_j picked by greedy was at least as cheap, so the next covered element of B paid less than w / (|B| − 1).

Weighted set cover problem. Continue: the number of uncovered elements of B went down by 1 again, so B now offers cost w / (|B| − 2) per element.

Weighted set cover problem. Listing the elements of B in the order in which greedy covers them, they pay ≤ w/|B|, ≤ w/(|B|−1), ≤ w/(|B|−2), ..., ≤ w/2, ≤ w.

Weighted set cover problem. TOTAL PAID by B: ≤ w (1/|B| + 1/(|B|−1) + ... + 1/2 + 1) = w · O(ln |B|) = w · O(ln n). Summing over the sets B of an optimal cover, the total cost of greedy is at most O(ln n) · OPT.

Weighted set cover problem. INSTANCE: A_1, ..., A_m ⊆ Ω, weights w_1, ..., w_m. FEASIBLE SOLUTION: a collection S of the A_i covering Ω. OBJECTIVE (minimize): the cost of the collection. Greedy algorithm: pick the A_i with minimal w_i / |A_i|; remove the elements of A_i from Ω; repeat. Theorem: this is an O(log n)-approximation algorithm.
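The greedy rule above (minimize weight per newly covered element) is easy to sketch in Python; the function name is ours, and sets are given as Python `set` objects:

```python
def greedy_set_cover(sets, weights):
    """Greedy weighted set cover: repeatedly pick the set minimizing
    weight / (# newly covered elements).  O(log n)-approximation."""
    uncovered = set().union(*sets)
    picked = []
    while uncovered:
        # cost-effectiveness w_j / |A_j ∩ uncovered|; skip useless sets
        i = min((j for j in range(len(sets)) if sets[j] & uncovered),
                key=lambda j: weights[j] / len(sets[j] & uncovered))
        picked.append(i)
        uncovered -= sets[i]
    return picked
```

Note that after each pick the effective ratio of every remaining set is recomputed against the still-uncovered elements, which is exactly the "remove elements of A_i from Ω" step of the slides.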

Clustering. n points in R^m; d(i, j) = distance between points i and j. Partition the points into k clusters of small diameter, where diam(C) = max_{i,j∈C} d(i, j).

Clustering k = 3

Clustering k = 2

k-Clustering. INSTANCE: n points in R^m. FEASIBLE SOLUTION: a partition of [n] into C_1, ..., C_k. COST: max_{i∈[k]} diam(C_i), where diam(C) = max_{i,j∈C} d(i, j).

k-Clustering. GREEDY ALGORITHM: pick s_1 ∈ [n]; for i from 2 to k, pick s_i = the point farthest from {s_1, ..., s_{i−1}}; finally set C_i = {x ∈ [n] whose closest center is s_i}.

k-Clustering (figures): the greedy algorithm picks s_1, then s_2 = the point farthest from s_1, then s_3 = the point farthest from {s_1, s_2}, and finally assigns every point to its closest center.

k-Clustering. GREEDY ALGORITHM: pick s_1 ∈ [n]; for i from 2 to k, pick s_i = the point farthest from {s_1, ..., s_{i−1}}; set C_i = {x ∈ [n] whose closest center is s_i}. Theorem: the greedy algorithm is a 2-approximation algorithm.

Theorem: the greedy algorithm is a 2-approximation algorithm. Proof sketch: let s_{k+1} be the point farthest from the chosen centers and r = d(s_{k+1}, {s_1, ..., s_k}). Then d(s_i, s_j) ≥ r for all i ≠ j in {1, ..., k+1}, so by pigeonhole two of these k+1 points lie in the same cluster of an optimal solution, giving OPT ≥ r. On the other hand, every point is within distance r of its closest center, so every greedy cluster has diameter ≤ 2r ≤ 2·OPT.
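The farthest-point greedy above can be sketched in a few lines of Python (function name ours; Euclidean distance via `math.dist`):

```python
import math

def greedy_k_center(points, k):
    """Farthest-point greedy clustering: a 2-approximation for
    minimizing the maximum cluster diameter."""
    def d(p, q):
        return math.dist(p, q)
    centers = [0]                      # s_1: an arbitrary first point
    for _ in range(2, k + 1):
        # s_i = the point farthest from the current centers
        far = max(range(len(points)),
                  key=lambda x: min(d(points[x], points[c]) for c in centers))
        centers.append(far)
    # assign each point to its closest center
    clusters = [[] for _ in centers]
    for x in range(len(points)):
        j = min(range(len(centers)),
                key=lambda j: d(points[x], points[centers[j]]))
        clusters[j].append(x)
    return centers, clusters
```

Running one more iteration of the farthest-point rule yields exactly the point s_{k+1} and radius r used in the proof above.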