Approximation Schemes for Dense Variants of Feedback Arc Set, Correlation Clustering, and Other Fragile Min Constraint Satisfaction Problems Warren Schudy Brown University Computer Science Joint work with Claire Mathieu, Marek Karpinski, and others

Outline
Overview
–Approximation algorithms
–No-regret learning
Approximate 2-coloring
–Algorithm
–Analysis
Open problems

Optimization and Approximation
Combinatorial optimization problems are ubiquitous; many are NP-complete.
Settle for, e.g., a 1.1-approximation: Cost(Output) ≤ 1.1 · Cost(Optimum).
A polynomial-time approximation scheme (PTAS) provides a (1+ε)-approximation for any ε > 0.
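
Spelled out (the standard definition, stated here for completeness rather than taken from the slide): for every fixed ε > 0 there is an algorithm A_ε with

    \mathrm{Cost}\big(A_\varepsilon(I)\big) \;\le\; (1+\varepsilon)\cdot \mathrm{Cost}\big(\mathrm{OPT}(I)\big)
    \qquad \text{in time } \mathrm{poly}(|I|),

where the degree of the polynomial may depend on ε (as in the n^O(1/ε²) runtimes cited later).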

At Microsoft Research Techfest 2009: (screenshot on slide)

Gale-Berlekamp Game
Invented by Andrew Gleason (1958). Minimize the number of lit light bulbs.
–NP-hard [RV ’08]
–PTAS with runtime n^O(1/ε²) [BFK ’03]
–We give a PTAS with linear runtime O(n²) + 2^O(1/ε²) [KS ‘09]
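
For concreteness, here is a tiny sketch of the objective in the standard Gale-Berlekamp switching game: an n × n grid of light bulbs with one toggle switch per row and per column (this background description and the helper below are mine, not from the slide).

    def lit_bulbs(grid, row_flip, col_flip):
        """Count bulbs that are lit after toggling the chosen switches.

        grid[i][j] is True if bulb (i, j) starts lit; row_flip[i] and
        col_flip[j] say whether the corresponding switch is toggled.
        A bulb's final state is its start state XOR both switch states.
        """
        return sum(
            grid[i][j] ^ row_flip[i] ^ col_flip[j]
            for i in range(len(grid))
            for j in range(len(grid[i]))
        )

The optimization problem is to choose row_flip and col_flip minimizing this count; the slide's point is that this is NP-hard yet admits a linear-time PTAS.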

Approximate 2-coloring (“Pessimist’s MAX CUT”, or MIN UNCUT)
Minimize the number of monochromatic edges.
General case:
–O(√log n) approximation is the best known [ACMM ‘05]
–no PTAS unless P=NP [PY ‘91]
Everywhere-dense case (all degrees Θ(n)):
–Previous best PTAS: n^O(1/ε²) [AKK ’95]
–We give a PTAS with linear runtime O(n²) + 2^O(1/ε²) [KS ‘09]
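
A minimal illustration of the objective (the helper name and graph representation are mine, not from the talk):

    def monochromatic_edges(edges, coloring):
        """Count edges whose two endpoints receive the same color.

        edges: iterable of (u, v) pairs; coloring: dict vertex -> color.
        """
        return sum(1 for u, v in edges if coloring[u] == coloring[v])

Approximate 2-coloring asks for the coloring minimizing this count; MAX CUT maximizes the complementary count of bichromatic edges.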

Generalization: Fragile dense MIN-2CSP
Min Constraint Satisfaction Problem (CSP): n variables taking values from a constant-sized domain; soft constraints, each depending on 2 variables. Objective: minimize the number of unsatisfied constraints.
Assumptions:
–Everywhere-dense, i.e. each variable appears in Ω(n) constraints
–The constraints are fragile, i.e. changing the value of a variable makes all satisfied constraints it participates in unsatisfied (for all assignments)
We give the first PTAS for all fragile everywhere-dense MIN-kCSPs. Its runtime is O(input size) + 2^O(1/ε²) [KS ‘09].
(Special cases: approximate 2-coloring and the Gale-Berlekamp game.)
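
One way to write the fragility condition in symbols (notation mine, following the slide's wording):

    \forall\, \text{assignments } x,\ \forall\, \text{variables } v,\ \forall\, \text{constraints } c \ni v:\quad
    c \text{ satisfied by } x \;\Longrightarrow\; c \text{ unsatisfied by every } x' \text{ with } x'_v \ne x_v \text{ and } x'_u = x_u \ (u \ne v).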

Correlation Clustering
Minimize the number of disagreements.
–2.5-approximation [ACN ‘05]
–No PTAS (in the adversarial model) unless P=NP [CGW ‘05]
If the number of clusters is limited to a constant d:
–Previous best PTAS runtime: n^O(1/ε²) [GG ’06]
–We give a PTAS with runtime O(n²) + 2^O(1/ε²) (linear time) [KS ‘09]
–Not fragile but rigid [KS ‘09]

More correlation clustering
Additional results:
–Various approximation results in an online model [MSS ‘10]
–Suppose the input is generated by adding noise to a base clustering. If all base clusters have size Ω(√n), then the semi-definite program reconstructs the base clustering [MS ‘10]
–Experiments with this SDP [ES ‘09]

Fully dense feedback arc set
Minimize the number of backwards edges.
Applications:
–Ranking by pairwise comparisons [Slater ‘61]
–Learning to order objects [CSS ‘97]
–Kemeny rank aggregation
NP-hard [ACN ’05, A ’06, CTY ‘07]. We give the first PTAS [MS ‘07].
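
The objective, as a minimal sketch (names are illustrative): given a proposed ranking, count the edges that point backwards.

    def backward_edges(arcs, ranking):
        """Count directed edges (u, v) where u is ranked after v.

        arcs: iterable of (u, v) pairs meaning "u should come before v";
        ranking: list of vertices from first to last.
        """
        position = {v: i for i, v in enumerate(ranking)}
        return sum(1 for u, v in arcs if position[u] > position[v])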

Generalization
Example: betweenness over objects A, B, C, D with constraints
1. B between A, C
2. B between A, D
3. A between C, D
4. C between B, D
Minimize the number of violated constraints.
Generalize to soft constraints depending on k objects. Assumptions:
–Complete, i.e. every set of k objects has a soft constraint
–The constraints are fragile, i.e. a satisfied constraint becomes unsatisfied if any single object is moved
We give the first PTAS for all complete fragile min ranking CSPs [KS ‘09].

Summary of PTASs
Everywhere-dense fragile Min k-CSP (special cases: approximate 2-coloring, Gale-Berlekamp Game):
–Previous work: n^O(1/ε²) [AKK ‘95, BFK ‘03]
–This work: O(input) + 2^O(1/ε²) [KS ‘09] (essentially optimal)
Complete correlation clustering with O(1) clusters:
–Previous work: n^O(1/ε²) [GG ‘06]
–This work: O(n²) + 2^O(1/ε²) [KS ‘09]
Complete fragile Min Ranking k-CSP (special cases: feedback arc set, betweenness):
–Previous work: none
–This work: Poly(n) · 2^Poly(1/ε) [MS ‘07, KS ‘09]

Outline
Overview
–Approximation algorithms
–No-regret learning
Approximate 2-coloring
–Algorithm
–Analysis
Open problems

External regret
Rock-paper-scissors history (table on slide). Example: the [external] regret of switching every play to P is 1 − (−2) = 3.
There exist algorithms with regret O(√t) after t rounds [FS ‘97].
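
In the standard formulation (background, not verbatim from the slide): if the algorithm plays actions a_1, …, a_t and g_τ(a) denotes the payoff action a would have earned in round τ, then

    \mathrm{ExternalRegret}_t \;=\; \max_{a}\ \sum_{\tau=1}^{t} g_\tau(a) \;-\; \sum_{\tau=1}^{t} g_\tau(a_\tau),

i.e. the gap to the best single fixed action in hindsight, which is what the slide's 1 − (−2) = 3 computation measures.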

Internal regret
Example: the [internal] regret of replacing every play of S by P is 2 − (−2) = 4.
–Regret O(√t) after t rounds using matrix inversion [FV ‘99]
–… using matrix-vector multiplication [MS ‘10]
Currently investigating another no-regret learning problem, related to dark pools, with Jenn Wortman Vaughan [SV].
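
For contrast, internal (swap) regret compares against rewriting the history with a single substitution rule; a standard way to write it (notation mine):

    \mathrm{InternalRegret}_t \;=\; \max_{a,\,b}\ \sum_{\tau:\ a_\tau = a} \big( g_\tau(b) - g_\tau(a) \big),

the gain from retroactively replacing every play of a by b, as in the slide's S → P example.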

Outline
Overview
–Approximation algorithms
–No-regret learning
Approximate 2-coloring
–Algorithm
–Analysis
Open problems

Reminder: approximate 2-coloring
Minimize the number of monochromatic edges. Assume all degrees are Ω(n).

Some instances are easy
Previously known additive error algorithms: Cost(Output) ≤ Cost(Optimum) + O(ε n²)
–[Arora, Karger, Karpinski ‘95]
–[Fernandez de la Vega ‘96]
–[Goldreich, Goldwasser, Ron ‘98]
–[Alon, Fernandez de la Vega, Kannan, Karpinski ‘99]
–[Frieze, Kannan ‘99]
–[Mathieu, Schudy ‘08]
Which instances are easy? Those where OPT = Ω(n²).
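
Spelling out the last point (a one-line calculation, not on the slide): if OPT ≥ c·n² for a constant c > 0, an additive guarantee becomes multiplicative, since

    \mathrm{Cost}(\mathrm{Output}) \;\le\; \mathrm{OPT} + O(\varepsilon n^{2})
    \;\le\; \mathrm{OPT} + O\!\Big(\tfrac{\varepsilon}{c}\Big)\,\mathrm{OPT}
    \;=\; \big(1 + O(\varepsilon/c)\big)\,\mathrm{OPT},

so rescaling ε yields a (1+ε)-approximation on such instances.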

Previous algorithm (1/3): “exhaustive sampling”
Let S be a random sample of V of size O(1/ε²)·log n.
For each coloring x_0 of S:
–Compute a coloring x_3 of V somehow…
Return the best coloring x_3 found.
Analysis version: let x_0 = x* restricted to S. Assumes OPT ≤ ε κ_0 n², where κ_0 is a constant.

Previous algorithm (2/3)
Define the margin of vertex v w.r.t. coloring x to be |(number of blue neighbors of v in x) − (number of red neighbors of v in x)|.
Partial coloring x_2 ← if the margin of v w.r.t. x_0 is large, then color v greedily w.r.t. x_0; else label v “ambiguous”.

Previous algorithm (3/3)
x_3 extends x_2 greedily.

Previous algorithm, and our changes to it (a code sketch follows below)
Let S be a random sample of V of size O(1/ε²)·log n (our version: constant size O(1/ε²)).
For each coloring x_0 of S:
–Intermediate step (ours): x_1 ← greedy coloring w.r.t. x_0
–Partial coloring x_2 ← if the margin of v is large, then color v greedily; else label v “ambiguous”
–Extend x_2 to a complete coloring x_3 greedily (our version: color the ambiguous vertices using an existing additive error algorithm)
Return the best coloring x_3 found.
Assume OPT ≤ ε κ n² for a suitable constant κ (the constant changes between the two versions).
Idea: use an additive error algorithm to color the ambiguous vertices.
Idea: two greedy phases before assigning ambiguity allow a constant sample size.
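
Below is a minimal runnable sketch of this outline in Python, to make the flow concrete. It is not the authors' implementation: the sample size, the margin threshold, and the additive_error_coloring subroutine (standing in for the additive-error algorithms cited on the “Some instances are easy” slide) are illustrative placeholders.

    from itertools import product
    import random

    def approx_2_coloring(vertices, adj, epsilon, additive_error_coloring):
        """Sketch of the sample-then-greedy outline above (placeholders noted).

        adj maps every vertex to its set of neighbors; vertex labels are
        assumed comparable (e.g. integers) so each edge is counted once.
        """
        vertices = list(vertices)
        sample_size = min(len(vertices), max(1, round(1.0 / epsilon ** 2)))
        sample = random.sample(vertices, sample_size)
        threshold = epsilon * len(vertices)  # illustrative margin threshold

        def margin(v, coloring):
            # (# blue neighbors) - (# red neighbors), ignoring uncolored ones
            blue = sum(1 for u in adj[v] if coloring.get(u) == "blue")
            red = sum(1 for u in adj[v] if coloring.get(u) == "red")
            return blue - red

        def greedy_color(v, coloring):
            # take the minority color among v's colored neighbors
            return "red" if margin(v, coloring) > 0 else "blue"

        def cost(coloring):
            # number of monochromatic edges, each counted once
            return sum(1 for v in vertices for u in adj[v]
                       if u < v and coloring[u] == coloring[v])

        best = None
        for colors in product(["red", "blue"], repeat=sample_size):
            x0 = dict(zip(sample, colors))
            # first greedy phase: color every vertex against the sample coloring
            x1 = {v: greedy_color(v, x0) for v in vertices}
            # second phase: keep only confident vertices; the rest are "ambiguous"
            x2 = {v: greedy_color(v, x1) for v in vertices
                  if abs(margin(v, x1)) >= threshold}
            ambiguous = [v for v in vertices if v not in x2]
            # color the ambiguous vertices with an additive-error algorithm
            x3 = dict(x2)
            x3.update(additive_error_coloring(ambiguous, adj, x2))
            if best is None or cost(x3) < cost(best):
                best = x3
        return best

Any function with the placeholder's signature that returns a coloring of the listed vertices makes the sketch run (even the trivial lambda vs, adj, partial: {v: "red" for v in vs}); the real algorithm plugs in one of the cited additive-error methods.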

Outline
Overview
–Approximation algorithms
–No-regret learning
Approximate 2-coloring
–Algorithm
–Analysis
Open problems

Plan of analysis
Main Lemma:
1. The coloring x_2 agrees with the optimal coloring x*.
2. Few mistakes are made when coloring the ambiguous vertices.

Relating x_1 to the OPT coloring
Lemma 2: with probability at least 90%, all vertices have margin w.r.t. x* within O(δn) of their margin w.r.t. x_1.
Proof plan: bound the number of miscolored vertices by O(δn).
In the figure, vertex F has 1 neighbor of one color and 3 of the other under the optimum assignment x*. Few vertices are miscolored because:
–Case 1: |1 − 3| > δn/3 (“F unbalanced”): Chernoff and Markov bounds.
–Case 2: |1 − 3| ≤ δn/3 (“F balanced”): fragility and density.

Proof that x_2 agrees with the optimal coloring x* (sketch, following the figure)
1. Assume F is colored by x_2.
In the figure, F has 3 neighbors of one color and 1 of the other under x*, so the optimality of x* determines F's color (blue in the figure).
By Lemma 2, F's margin w.r.t. x_1 is ≈ 3 − 1 >> 0, so the definition of x_2 gives F the same color (blue).

Proof ideas: few mistakes are made when coloring the ambiguous vertices
–Similar techniques imply every ambiguous vertex is balanced
–There are few such vertices

Outline
Overview
–Approximation algorithms
–No-regret learning
Approximate 2-coloring
–Algorithm
–Analysis
Open problems

Impossible extensions
Our results:
–Fragile everywhere-dense Min CSP
–Fragile fully-dense Min Rank CSP
Impossible extensions (no PTAS) unless P=NP:
–Fragile average-dense Min CSP
–Fragile everywhere-dense Min Rank CSP
–Everywhere-dense Correlation Clustering

Kemeny Rank Aggregation (1959)
1. Voters submit rankings of candidates
2. Translate the rankings into graphs
3. Add those graphs together
4. Find a feedback arc set of the resulting weighted graph
(Figure: three example rankings A>B>C, C>A>B, A>C>B and the corresponding graphs; a toy sketch follows below.)
Nice properties, e.g. Condorcet [YL ’78, Y ‘95]. We give the first PTAS [MS ‘07].
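
The four steps above fit in a few lines for a toy instance; this brute-force sketch (names and structure mine, exponential in the number of candidates) is only meant to make the definition concrete.

    from itertools import permutations

    def kemeny_ranking(rankings):
        """Brute-force Kemeny aggregation for a small candidate set.

        Builds a weighted "a above b" graph from the voters' rankings
        (steps 1-3), then picks the ordering minimizing the total weight
        of backward edges (step 4).  Assumes every voter ranks everyone.
        """
        candidates = sorted(rankings[0])
        # weight[(a, b)] = number of voters ranking a above b
        weight = {(a, b): 0 for a in candidates for b in candidates if a != b}
        for ranking in rankings:
            pos = {c: i for i, c in enumerate(ranking)}
            for a in candidates:
                for b in candidates:
                    if a != b and pos[a] < pos[b]:
                        weight[(a, b)] += 1

        def disagreements(order):
            pos = {c: i for i, c in enumerate(order)}
            # an edge (a, b) is backward if voters put a above b
            # but the aggregate order puts a below b
            return sum(w for (a, b), w in weight.items() if pos[a] > pos[b])

        return min(permutations(candidates), key=disagreements)

    # Example rankings from the slide's figure:
    # kemeny_ranking([["A", "B", "C"], ["C", "A", "B"], ["A", "C", "B"]])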

An Open Question
Real rankings often have ties, e.g. restaurant guides with ratings 1–5 (figure: items with ratings A: 5, B: 5, C: 4, D: 3).
A 1.5-approximation exists [A ‘07].
Interesting but difficult open question: is there a PTAS?

Summary of PTASs
Everywhere-dense fragile Min k-CSP:
–Special cases: approximate 2-coloring, Multiway cut, Gale-Berlekamp Game, Nearest codeword, MIN-kSAT (previous work n^O(1/ε²) [AKK ‘95, BFK ‘03]); Unique Games (no previous PTAS)
–Previous work for the general problem: none
–This work: O(input) + 2^O(1/ε²) [KS ‘09] (essentially optimal)
Fully-dense rigid Min 2-CSP:
–Special cases: correlation clustering with O(1) clusters (n^O(1/ε²) [GG ‘06]), consensus clustering with O(1) clusters (n^O(1/ε²) [BDD ‘09]), hierarchical clustering with O(1) clusters (no previous PTAS)
–Previous work for the general problem: none
–This work: O(n²) + 2^O(1/ε²) [KS ‘09]
Fully-dense fragile Min Ranking k-CSP:
–Special cases: feedback arc set, betweenness (no previous PTAS for either)
–Previous work for the general problem: none
–This work: Poly(n) · 2^Poly(1/ε) [MS ‘07, KS ‘09]

Questions?

My publications (not the real titles)
Correlation clustering and generalizations:
–K and S. PTAS for everywhere-dense fragile CSPs. In STOC.
–Elsner and S. Correlation clustering experiments. In ILP for NLP.
–M and S. Correlation clustering with noisy input. In SODA.
–M, Sankur, and S. Online correlation clustering. To appear in STACS.
Feedback arc set and generalizations:
–M and S. PTAS for fully dense feedback arc set. In STOC.
–K and S. PTAS for fully dense fragile Min Rank CSP. Arxiv preprint.
Additive error:
–M and S. Yet Another Algorithm for Dense Max Cut. In SODA.
No-regret learning:
–Greenwald, Li, and S. More efficient internal-regret-minimizing algorithms. In COLT.
–S and Vaughan. Regret bounds for the dark pools problem. In preparation.
Other:
–S. Finding strongly connected components in parallel using O(log² n) reachability queries. In SPAA.
–S. Optimal restart strategies for tree search. In preparation.
K. = Karpinski, M. = Mathieu, S. = Schudy

References
[A ‘06] = Alon. SIAM J. Discrete Math.
[ACMM ’05] = Agarwal, Charikar, Makarychev, and Makarychev. STOC.
[ACN ‘05] = Ailon, Charikar, and Newman. STOC.
[AFKK ‘03] = Alon, Fernandez de la Vega, Kannan, and Karpinski. JCSS.
[AKK ‘95] = Arora, Karger, and Karpinski. STOC.
[BFK ‘03] = Bazgan, Fernandez de la Vega, and Karpinski. Random Structures and Algorithms.
[CGW ‘05] = Charikar, Guruswami, and Wirth. JCSS.
[CS ‘98] = Chor and Sudan. SIAM J. Discrete Math.
[CTY ‘06] = Charbit, Thomassé, and Yeo. Comb., Prob. and Comp.
[GG ‘06] = Giotis and Guruswami. Theory of Computing.
[F ‘96] = Fernandez de la Vega. Random Structures and Algorithms.
[FK ‘99] = Frieze and Kannan. Combinatorica.
[FS ‘97] = Freund and Schapire. JCSS.
[FV ‘99] = Foster and Vohra. Games and Economic Behavior.
[GGR ‘98] = Goldreich, Goldwasser, and Ron. JACM.
[O ‘79] = Opatrny. SIAM J. Computing.
[PY ‘91] = Papadimitriou and Yannakakis. JCSS, 1991.
[RV ‘08] = Roth and Viswanathan. IEEE Trans. Info Theory.

Appendix

Approximate 3-coloring (MIN-3-UNCUT): not fragile
An uncut edge is a monochromatic edge.
Reduction (figure): a general MIN-2-UNCUT instance on n vertices is combined with a large complete tripartite graph (parts of 10n² vertices in the figure) to produce a dense MIN-3-UNCUT instance.
Therefore dense MIN-3-UNCUT is at least as hard as general MIN-2-UNCUT, so it has no PTAS unless P=NP.