
1 Approximation Schemes via Sherali-Adams Hierarchy for Dense Constraint Satisfaction Problems and Assignment Problems
Yuichi Yoshida (NII & PFI), Yuan Zhou (CMU)

2 Constraint satisfaction problems (CSPs)
In Max-kCSP, we are given:
– a set of variables: V = {v_1, v_2, v_3, …, v_n}
– the domain of the variables: D
– a set of arity-k “local” constraints: C
Goal: find an assignment α : V → D that maximizes the number of satisfied constraints in C

3 Constraint satisfaction problems (CSPs)
In Max-kCSP, we are given:
– a set of variables: V = {v_1, v_2, v_3, …, v_n}
– the domain of the variables: D
– a set of arity-k “local” constraints: C
Goal: find an assignment α : V → D that maximizes the number of satisfied constraints in C
Example: MaxCut
– D = {0, 1}
– p_(i,j) = 1[v_i ≠ v_j]
Other examples: Max-3SAT, UniqueGames, …
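To make the Max-kCSP objective concrete, here is a minimal sketch (not from the paper; all names are illustrative) that evaluates a MaxCut instance as a Max-2CSP: each edge (i, j) is a binary constraint satisfied exactly when the two endpoints receive different values.

# Minimal illustration (assumed names, not the paper's code): MaxCut as a Max-2CSP.
def maxcut_value(edges, alpha):
    """edges: list of pairs (i, j); alpha: dict variable -> value in {0, 1}.
    Returns the number of satisfied constraints 1[v_i != v_j]."""
    return sum(1 for (i, j) in edges if alpha[i] != alpha[j])

edges = [(0, 1), (1, 2), (2, 0)]        # a triangle
alpha = {0: 0, 1: 1, 2: 0}              # the cut ({0, 2}, {1})
print(maxcut_value(edges, alpha))       # 2 of the 3 constraints are satisfied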

4 Assignment problems (APs)
In Max-kAP, we are given:
– a set of variables: V = {v_1, v_2, v_3, …, v_n}
– a set of arity-k “local” constraints: C
Goal: find a bijection π : V → {1, 2, …, n} (i.e. a permutation) that maximizes the number of satisfied constraints in C

5 Assignment problems (APs)
Examples
– MaxAcyclicSubgraph (MAS): constraints of the form π(u) < π(v)
– Betweenness: π(u) < π(v) < π(w) or π(w) < π(v) < π(u)
– MaxGraphIsomorphism (Max-GI): (π(u), π(v)) ∈ E(H), where H is a fixed graph
– Dense k-Subgraph (DkS): (π(u), π(v)) ∈ E(K_k), where K_k is a k-clique
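As a concrete illustration (assumed names, not from the paper), the sketch below evaluates the MaxAcyclicSubgraph objective for a candidate permutation π: a directed edge (u, v) is satisfied when π(u) < π(v).

# Minimal illustration (assumed names): MaxAcyclicSubgraph as a Max-2AP.
def mas_value(arcs, pi):
    """arcs: list of directed edges (u, v); pi: dict variable -> position (a bijection onto 1..n).
    Returns the number of arcs (u, v) with pi[u] < pi[v]."""
    return sum(1 for (u, v) in arcs if pi[u] < pi[v])

arcs = [(0, 1), (1, 2), (2, 0)]          # a directed 3-cycle
pi = {0: 1, 1: 2, 2: 3}                  # any ordering of a 3-cycle satisfies exactly 2 arcs
print(mas_value(arcs, pi))               # 2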

6 Approximation schemes
Max-kCSP and Max-kAP are NP-hard in general
Polynomial-time approximation scheme (PTAS): for any constant ε > 0, the algorithm runs in n^{O(1)} time and gives a (1-ε)-approximation
Quasi-PTAS: the algorithm runs in n^{O(log n)} time
Max-kCSP / Max-kAP admits a PTAS or quasi-PTAS when the instance is “dense” or “metric”

7 PTAS for dense/metric Max-kCSP
Max-kCSP is dense: it has Ω(n^k) constraints
– PTAS for dense MaxCut [dlV96]
– PTAS for dense Max-kCSP [AKK99, FK96, AdlVKK03]
Max-2CSP is metric: the edge weight ω satisfies ω(u, v) ≤ ω(u, w) + ω(w, v)
– PTAS for metric MaxCut [dlVK01]
– PTAS for metric MaxBisection [FdlVKK04]
– PTAS for locally dense Max-kCSP (a generalization of “metric”) [dlVKKV05]

8 Quasi-PTAS for dense Max-kAP
Max-kAP is dense:
– roughly speaking, the instance has Ω(n^k) constraints
In [AFK02]:
– (1-ε)-approximation for dense MAS and Betweenness in n^{O(1/ε^2)} time
– (1-ε)-approximation for dense DkS, Max-GI, and Max-kAP in n^{O(log n/ε^2)} time

9 Previous techniques
– Exhaustive search on a small set of variables [AKK99]
– The weak Szemerédi regularity lemma [FK96]
– Copying important variables [dlVK01]
– A variant of SVD [dlVKKV05]
– Linear programming relaxation for “assignment problems with extra constraints” [AFK02]
In this paper, we show: the standard Sherali-Adams LP relaxation hierarchy is a unified approach to all of these results!

10 Sherali-Adams LP relaxation hierarchy
A systematic way to write tighter and tighter LP relaxations [SA90]
In an r-round SA LP relaxation:
– for each set S = {v_1, …, v_r} of r variables, we have a local distribution over assignments μ_S = μ_{v_1, …, v_r}
– for any two sets S and T, the marginal distributions are consistent: μ_S(S∩T) = μ_T(S∩T)
Solving an r-round SA LP relaxation takes n^{O(r)} time.
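For concreteness, the r-round Sherali-Adams relaxation of Max-kCSP (with r ≥ k) can be written as the following LP; this is a sketch of the standard formulation in notation chosen here, and the paper's exact presentation may differ. Each constraint C is a predicate on its set S_C of at most k variables:

\begin{align*}
\max\quad & \sum_{C \in \mathcal{C}} \sum_{\alpha \in D^{S_C}} \mu_{S_C}(\alpha)\, C(\alpha) \\
\text{s.t.}\quad & \sum_{\alpha \in D^{S}} \mu_S(\alpha) = 1 \quad \text{for every } S \subseteq V,\ |S| \le r, \\
& \sum_{\alpha \in D^{S}:\ \alpha|_T = \beta} \mu_S(\alpha) = \mu_T(\beta) \quad \text{for every } T \subseteq S,\ \beta \in D^{T}, \\
& \mu_S(\alpha) \ge 0 .
\end{align*}

The second family of equalities enforces the consistency of marginals, μ_S(S∩T) = μ_T(S∩T) for any two sets S and T; there are n^{O(r)} LP variables and constraints, which is why solving the relaxation takes n^{O(r)} time.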

11 Our results
Sherali-Adams LP-based proofs for known results:
– the O(1/ε^2)-round SA LP relaxation gives a (1-ε)-approximation to dense or locally dense Max-kCSP, and to Max-kCSP with global cardinality constraints such as MaxBisection
– the O(log n/ε^2)-round SA LP relaxation gives a (1-ε)-approximation to dense or locally dense Max-kAP
New algorithms:
– quasi-PTAS for Max-k-HypergraphIsomorphism when one hypergraph is dense and the other one is locally dense

12 Our techniques
– Solve the Sherali-Adams LP relaxation for sufficiently many rounds (Ω(1/ε^2) or Ω((log n)/ε^2))
– Randomized conditioning operation to bring down the pairwise correlations
– Independent rounding for Max-kCSP
– Special rounding for Max-kAP

13 Conditioning operation
Randomly choose v from V and sample a ~ μ_{v}
For each local distribution μ_{v_1, …, v_r}, generate the new local distribution μ_{v_1, …, v_r} | v = a
r-round SA solution → (r-1)-round SA solution
Essentially from [RT12]:
– after t steps of conditioning, on average, μ_{v_1, …, v_k} is close to the product μ_{v_1} × … × μ_{v_k}; in particular, it is only ε-far after Ω(1/ε^2) steps (next slide)
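A minimal sketch of one conditioning step, with a toy data layout assumed here rather than taken from the paper: the SA solution mu maps each frozenset S of variables to a dictionary from assignment tuples (over sorted(S)) to probabilities.

import random

def condition_once(mu, variables):
    """One randomized conditioning step on a Sherali-Adams solution (sketch).
    mu: dict mapping frozenset S -> {tuple over sorted(S): probability}, for all |S| <= r.
    Picks a random variable v, samples a ~ mu[{v}], and returns the local
    distributions of the conditioned (r-1)-round solution, plus the pair (v, a)."""
    v = random.choice(variables)
    marginal = mu[frozenset([v])]
    keys, weights = zip(*marginal.items())
    a = random.choices(keys, weights=weights)[0][0]     # each key is a 1-tuple

    new_mu = {}
    for S, dist in mu.items():
        if v not in S:
            continue          # every T with |T| <= r-1 arises as S \ {v} for S = T ∪ {v}
        cols = sorted(S)
        i = cols.index(v)
        T = S - {v}
        cond = {}
        for assignment, p in dist.items():
            if assignment[i] == a:                      # keep assignments consistent with v = a
                rest = assignment[:i] + assignment[i + 1:]
                cond[rest] = cond.get(rest, 0.0) + p
        total = sum(cond.values())                      # = Pr[v = a] under mu_S; positive in a valid SA solution
        new_mu[T] = {key: p / total for key, p in cond.items()}
    return new_mu, v, a

After one call the local distributions live on sets that are one variable smaller, matching the "r-round SA solution → (r-1)-round SA solution" step on this slide.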

14 Independent rounding for Max-kCSP
After Ω(1/ε^2) steps of conditioning, on average, μ_{v_1, …, v_k} is only ε-far from μ_{v_1} × … × μ_{v_k}
Sample each v independently from μ_{v}; then the expected number of satisfied constraints is within the total statistical distance of the LP value
Therefore, the expected value is at least [Val of LP] − O(ε)·n^k
This is a (1-O(ε)) (multiplicative) approximation because of the density (the instance has Ω(n^k) constraints)
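A minimal sketch of the independent-rounding step (same toy data layout and assumed names as in the conditioning sketch above): each variable is drawn from its own one-variable marginal, ignoring all correlations.

import random

def independent_rounding(mu, variables):
    """Sample each variable independently from its singleton marginal mu[{v}].
    mu: dict mapping frozenset S -> {tuple over sorted(S): probability}."""
    alpha = {}
    for v in variables:
        marginal = mu[frozenset([v])]
        keys, weights = zip(*marginal.items())
        alpha[v] = random.choices(keys, weights=weights)[0][0]
    return alpha

# Toy usage: two binary variables, each uniform over {0, 1}.
mu = {frozenset([0]): {(0,): 0.5, (1,): 0.5},
      frozenset([1]): {(0,): 0.5, (1,): 0.5}}
print(independent_rounding(mu, [0, 1]))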

15 Rounding for Max-kAP
Independent sampling does not work:
– the objective value is good, but the resulting assignment might not be a permutation because of collisions
Our special rounding:
– view {μ_{v}(w)}_{v,w} as a doubly stochastic matrix, and therefore as a distribution over permutations
– distribution supported on one permutation → done ✔
– two permutations? → merge them
– even more permutations? → pick an arbitrary two, merge them, and iterate
Similar operation in [AFK02]
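One standard way to make "a doubly stochastic matrix is a distribution over permutations" explicit is the Birkhoff-von Neumann decomposition; the sketch below (assumed names, and not necessarily the paper's exact procedure) repeatedly extracts a permutation supported on positive entries. It assumes numpy and scipy; linear_sum_assignment finds a maximum-weight perfect matching.

import numpy as np
from scipy.optimize import linear_sum_assignment

def birkhoff_decomposition(M, tol=1e-9):
    """M: doubly stochastic n x n array {mu_v(w)}_{v,w}.
    Returns a list of (weight, permutation) pairs whose weights sum to ~1."""
    M = M.copy()
    decomposition = []
    while M.sum() > tol:
        # A permutation supported on strictly positive entries exists by Birkhoff's theorem;
        # entries at or below the tolerance are heavily penalized so the matching avoids them.
        scores = np.where(M > tol, M, -1e9)
        rows, cols = linear_sum_assignment(scores, maximize=True)
        weight = M[rows, cols].min()
        if weight <= tol:                 # numerical safeguard
            break
        M[rows, cols] -= weight
        decomposition.append((weight, cols))   # cols[v] = position assigned to variable v
    return decomposition

# Toy usage: the uniform 2x2 doubly stochastic matrix splits into two permutations.
M = np.array([[0.5, 0.5], [0.5, 0.5]])
for w, perm in birkhoff_decomposition(M):
    print(w, perm)

The resulting list is the "distribution of permutations" this slide refers to; the merging step on the next slides then reduces its support to a single permutation.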

16 Merging two permutations
1. View the two permutations as disjoint cycles
2. Break long cycles (length > n^{1/2}) into short ones (length ≤ n^{1/2})
3. In each cycle, choose Permutation 1 or Permutation 2 independently
Analysis of Step 2: we modify O(n^{1/2}) entries of Permutation 2, affecting only an O(n^{-1/2})-fraction of the constraints

17 Merging two permutations
1. View the two permutations as disjoint cycles
2. Break long cycles (length > n^{1/2}) into short ones (length ≤ n^{1/2})
3. In each cycle, choose Permutation 1 or Permutation 2 independently
Analysis of Step 3: a constraint whose variables all lie in distinct cycles keeps its expected value, because the per-cycle choices are independent; since all cycles are short, this covers all but an n^{-1/2}-fraction of the constraints

18 Merging two permutations
1. View the two permutations as disjoint cycles
2. Break long cycles (length > n^{1/2}) into short ones (length ≤ n^{1/2})
3. In each cycle, choose Permutation 1 or Permutation 2 independently
Conclusion: in this way, we get a permutation whose expected objective value is at least
(1 − O(n^{-1/2})) · [Indep. Sampling] ≥ (1 − O(n^{-1/2})) (1 − O(ε)) · [Val of LP]
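A minimal sketch of the merge step from slides 16-18, under the reading that within each cycle the two permutations use the same set of positions, so a per-cycle choice still yields a permutation. Names and the data layout are assumed, not taken from the paper; cycle_cap plays the role of n^{1/2}.

import random

def merge_permutations(pi1, pi2, cycle_cap):
    """Merge two permutations of {0, ..., n-1} into one.
    pi1, pi2: lists with pi[v] = position of variable v; cycle_cap: maximum cycle length."""
    n = len(pi1)
    inv1 = [0] * n                       # inv1[p] = variable that pi1 sends to position p
    for v, p in enumerate(pi1):
        inv1[p] = v
    pi2 = list(pi2)                      # Step 2 may modify a few entries of Permutation 2

    def cycles():
        # Cycles of the map v -> inv1[pi2[v]]; on each cycle, pi1 and pi2 cover the same positions.
        seen = [False] * n
        for start in range(n):
            if seen[start]:
                continue
            cyc, v = [], start
            while not seen[v]:
                seen[v] = True
                cyc.append(v)
                v = inv1[pi2[v]]
            yield cyc

    # Step 2: break long cycles by redirecting pi2 at chunk boundaries (O(n / cycle_cap) entries).
    for cyc in list(cycles()):
        if len(cyc) > cycle_cap:
            for i in range(0, len(cyc), cycle_cap):
                chunk = cyc[i:i + cycle_cap]
                pi2[chunk[-1]] = pi1[chunk[0]]   # close the chunk into its own short cycle

    # Step 3: in each (now short) cycle, take Permutation 1 or Permutation 2 independently.
    out = [None] * n
    for cyc in cycles():
        src = random.choice((pi1, pi2))
        for v in cyc:
            out[v] = src[v]
    return out

# Toy usage: merge the identity with a cyclic shift on 6 variables, capping cycle length at 2.
print(merge_permutations(list(range(6)), [(v + 1) % 6 for v in range(6)], 2))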

19 Future directions
Can we solve the Sherali-Adams LP faster (as in [GS12]) to get a PTAS for dense assignment problems?

20 Thanks

