
1 Approximations for Isoperimetric and Spectral Profile and Related Parameters. Joint work by Prasad Raghavendra (MSR New England), David Steurer (Princeton University), and Prasad Tetali (Georgia Tech).

2 Graph Expansion. Let G be a d-regular graph with n vertices. For a vertex set S, expansion(S) = (# edges leaving S) / (d·|S|): a random neighbor of a random vertex in S lies outside S with probability expansion(S). The conductance of G is Ф_G = minimum of expansion(S) over |S| ≤ n/2. Uniform Sparsest Cut Problem: given a graph G, compute Ф_G and find the set S achieving it. Extremely well studied, in many different contexts: pseudo-randomness, group theory, online routing, Markov chains, metric embeddings, …
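The two definitions on this slide are easy to check on a toy graph. A minimal sketch (the graph and function names are illustrative, not from the talk), computing expansion(S) and brute-forcing Ф_G:

```python
from itertools import combinations

def expansion(adj, S, d):
    """expansion(S) = (# edges leaving S) / (d * |S|) for a d-regular graph."""
    S = set(S)
    leaving = sum(1 for u in S for v in adj[u] if v not in S)
    return leaving / (d * len(S))

def conductance(adj, d):
    """Phi_G: minimum expansion over all sets of size at most n/2 (brute force)."""
    n = len(adj)
    best = 1.0
    for k in range(1, n // 2 + 1):
        for S in combinations(range(n), k):
            best = min(best, expansion(adj, S, d))
    return best

# The 4-cycle (2-regular): a single vertex has expansion 1;
# a pair of adjacent vertices {0,1} has expansion 2/(2*2) = 0.5.
cycle4 = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
```

The brute force is exponential, of course; the whole point of the talk is what can be done in polynomial time.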

3 Measuring Graph Expansion. With expansion(S) and Ф_G = minimum of expansion(S) over |S| ≤ n/2 as before, compare some examples: the complete graph (every set expands well), the path (the middle cut barely expands), and complete graphs joined by a perfect matching (large sets with tiny expansion). Typically, small sets expand to a greater extent than large ones.

4 Isoperimetric Profile. Ф(δ) = minimum of expansion(S) over |S| ≤ δn, where expansion(S) = (# edges leaving S) / (d·|S|). This conductance function was defined by [Lovász-Kannan] and used to obtain better mixing-time bounds for Markov chains. Ф(δ) is a decreasing function of δ. (Figure: the curve Ф(δ) against set size δ.)

5 Approximating the Isoperimetric Profile. Uniform Sparsest Cut asks for the lowest point on the curve Ф(δ). More generally: what is the value of Ф(δ) for a given graph G and a constant δ > 0? Gap Small Set Expansion Problem, GapSSE(η, δ): given a graph G and constants δ, η > 0, distinguish whether Ф(δ) < η or Ф(δ) > 1-η. This problem is closely tied to the Unique Games Conjecture (last talk of the day, “Graph Expansion and the Unique Games Conjecture” [R-Steurer 10]).

6 Algorithm. Theorem: for every constant δ > 0, there is a poly-time algorithm that finds a set S of size O(δ) with expansion(S) ≤ sqrt(Ф(δ)·O(log 1/δ)), i.e., a (Ф(δ) vs sqrt(Ф(δ)·log(1/δ))) approximation. For small enough δ, the algorithm cannot distinguish between Ф(δ) < η and Ф(δ) > 1-η. Theorem [R-Steurer-Tulsiani 10]: improving over this algorithm by better than a constant factor would suffice to distinguish Ф(δ) < η from Ф(δ) > 1-η.

7 A Spectral Relaxation. Let x = (x_1, x_2, …, x_n) be the 0/1 indicator function of the unknown least-expanding set S (x_i = 1 on S, 0 on the complement S^C). Then the number of edges leaving S is Σ_{(i,j)∈E} (x_i - x_j)², and |S| = |Support(x)| < δn. Relax the entries of x from {0,1} to real numbers.

8 Why is it spectral? Spectral Profile [Goel-Montenegro-Tetali]: Λ(δ) = minimum of ⟨x, Lx⟩/⟨x, x⟩ over x with |Support(x)| < δn, where L is the graph Laplacian; without the support constraint this quotient gives the smallest eigenvalue of L. Observation: Λ(δ) is the smallest possible eigenvalue of a principal submatrix L(S,S) of the Laplacian L of size at most δn.
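The observation on this slide can be checked directly on a small graph. A brute-force sketch (exponential, purely illustrative; the matrix and function names are mine), computing Λ(δ) as the minimum eigenvalue over principal submatrices of the normalized Laplacian:

```python
import numpy as np
from itertools import combinations

def spectral_profile_bruteforce(L, delta):
    """Lambda(delta): minimum eigenvalue of a principal submatrix L[S,S]
    with |S| <= delta*n, matching the slide's observation."""
    n = L.shape[0]
    k_max = int(delta * n)
    best = np.inf
    for k in range(1, k_max + 1):
        for S in combinations(range(n), k):
            idx = np.ix_(S, S)
            best = min(best, np.linalg.eigvalsh(L[idx])[0])
    return best

# Normalized Laplacian of the 4-cycle: I - A/d with d = 2.
A = np.array([[0, 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]], float)
L = np.eye(4) - A / 2
```

For the 4-cycle, δ = 1/4 allows only 1x1 submatrices (value 1), while δ = 1/2 allows adjacent pairs, whose 2x2 submatrix [[1, -1/2], [-1/2, 1]] has smallest eigenvalue 1/2.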

9 Rounding Eigenvectors. Cheeger’s inequality: if λ₂ is the second eigenvalue of the normalized adjacency matrix, there is a sparse cut of value at most sqrt(2(1-λ₂)). Rounding a vector x with |Support(x)| < δn in the same way gives: Lemma: there exists a set S of volume at most δ whose expansion is at most O(sqrt(Λ(δ))).
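Cheeger-style rounding is a sweep over thresholds of the vector. A minimal sketch (graph, vector, and function names are illustrative): sort vertices by coordinate magnitude and return the best prefix set within the support.

```python
import numpy as np

def sweep_cut(adj, d, x):
    """Cheeger-style rounding: sort vertices by |x_i| (largest first) and
    return the prefix set, within Support(x), of smallest expansion."""
    x = np.asarray(x, float)
    order = np.argsort(-np.abs(x))
    support = [int(i) for i in order if abs(x[i]) > 1e-12]
    best_S, best_val = None, float("inf")
    S = set()
    for i in support:
        S.add(i)
        leaving = sum(1 for u in S for v in adj[u] if v not in S)
        val = leaving / (d * len(S))
        if val < best_val:
            best_S, best_val = set(S), val
    return best_S, best_val

# 4-cycle; a vector roughly indicating {0, 1} rounds to that pair (expansion 0.5).
cycle4 = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
```

Note the returned set is no larger than Support(x), so a sparse x yields a small set, as the lemma requires.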

10 Spectral Profile [Goel-Montenegro-Tetali]. Unlike eigenvalues, Λ(δ) (minimized over |Support(x)| < δn) is not the optimum of a convex program. We exhibit an efficiently computable SDP that gives a good guarantee. Theorem: there exists an efficiently computable SDP relaxation Λ*(δ) of Λ(δ) such that for every graph G, Λ*(δ) ≤ Λ(δ) ≤ Λ*(δ/2)·O(log 1/δ).

11 Recap. Theorem (Approximating the Spectral Profile): there exists an efficiently computable SDP relaxation Λ*(δ) of Λ(δ) such that for every graph G, Λ*(δ) ≤ Λ(δ) ≤ Λ*(δ/2)·O(log 1/δ). Lemma (Cheeger-style Rounding): there exists a set S of volume at most δ whose expansion is at most O(sqrt(Λ(δ))). Theorem (Approximating the Isoperimetric Profile): for every constant δ > 0, there is a poly-time algorithm that finds a set S of size O(δ) with expansion(S) ≤ sqrt(Λ*(δ)·O(log 1/δ)).

12 Restricted Eigenvalue Problem. Given a matrix A, find a submatrix A[S,S] of size at most δn × δn with the least eigenvalue. Our algorithm applies to diagonally dominant matrices and yields a log(1/δ) approximation.

13 Approximating the Spectral Profile (minimizing over x with |Support(x)| < δn).

14 SDP Relaxation. Replace each x_i by a vector v_i; after normalization the denominator becomes Σ_i |v_i|² = n. Without loss of generality the x_i can be assumed nonnegative, which yields the constraint v_i·v_j ≥ 0. Enforcing sparsity: by the Cauchy-Schwarz inequality, (Σ_i x_i)² ≤ |Support(x)|·Σ_i x_i², so |Support(x)| < δn implies that the average pairwise correlation (1/n²)·Σ_{i,j} v_i·v_j is less than δ.

15 SDP Relaxation. Minimize the sum of squared edge lengths, subject to: positive inner products, v_i·v_j ≥ 0; average squared length = 1; average pairwise correlation < δ.
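The Cauchy-Schwarz step behind the sparsity constraint can be verified numerically. A small sketch (the vector construction is illustrative, not from the talk): any nonnegative vector supported on δn coordinates, scaled so the average squared length is 1, automatically satisfies the average-correlation constraint.

```python
import numpy as np

rng = np.random.default_rng(0)
n, delta = 100, 0.1
k = int(delta * n)

# A nonnegative vector supported on k = delta*n coordinates,
# scaled so that the average squared length (1/n) * sum x_i^2 equals 1.
x = np.zeros(n)
x[:k] = rng.random(k) + 0.1
x *= np.sqrt(n / np.dot(x, x))

avg_sq_len = np.dot(x, x) / n          # SDP constraint: = 1
avg_corr = x.sum() ** 2 / n**2         # (1/n^2) * sum_{i,j} x_i * x_j
# Cauchy-Schwarz: (sum x_i)^2 <= |Support(x)| * sum x_i^2 = k * n,
# hence avg_corr <= k/n = delta.
```

This is the rank-1 case of the SDP (v_i = x_i·e for a fixed unit vector e); the relaxation allows general vectors subject to the same three constraints.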

16 Rounding. Two-phase rounding: 1) transform the SDP vectors into an SDP solution with only non-negative coordinates; 2) use thresholding to convert the non-negative vectors into sparse vectors.

17 Making the SDP Solution Nonnegative. Let v be an n-dimensional real vector and let v* denote the unit vector along the direction of v. Map v to the following function over R^n: f_v = ||v|| · (square root of the probability density function of an n-dimensional Gaussian centered at v*). Here Ф(x) denotes the probability density function of a mean-0, variance-σ spherical Gaussian in n dimensions.

18 Properties. With Ф(x) as above: Lemma (pairwise correlation remains low if σ is small), so the SDP constraint “average pairwise correlation < δ” is approximately preserved. Pick σ = 1/sqrt(log(1/δ)).

19 Properties. Lemma (squared distances stretch by at most 1/σ²): |f_{v_1} - f_{v_2}|² ≤ O(1/σ²)·|v_1 - v_2|². For our choice of σ, squared distances are stretched by log(1/δ). Hence, at the cost of a log(1/δ) factor, we obtain a non-negative SDP solution.

20 Rounding a Positive Vector Solution. Pretend the vectors v_i are non-negative, i.e., v_i(t) ≥ 0 for every coordinate t. Rounding non-negative vectors: 1) sample a coordinate t; 2) compute the threshold θ = (average of v_i(t) over i)·(2/δ); 3) set x_i = max{v_i(t) - θ, 0} for all i. Observation: |Support(x)| < (δ/2)·n, since by Markov’s inequality at most a δ/2 fraction of the values v_i(t) can exceed θ.
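The support bound in the observation is just Markov’s inequality, and is easy to see in code. A sketch with made-up coordinate values (the data is illustrative, not from the talk):

```python
import numpy as np

def threshold_round(vals, delta):
    """Slide-20 style rounding of one nonnegative coordinate:
    theta = (average value) * (2/delta); x_i = max(v_i - theta, 0)."""
    vals = np.asarray(vals, float)
    theta = vals.mean() * (2.0 / delta)
    return np.maximum(vals - theta, 0.0)

n, delta = 1000, 0.1
# Illustrative coordinate values v_i(t): 30 large entries, the rest small.
v = np.full(n, 0.1)
v[:30] = 100.0

x = threshold_round(v, delta)
support = np.count_nonzero(x)
# Markov: at most a delta/2 fraction of coordinates exceed (2/delta) * mean,
# so support <= (delta/2) * n = 50 here.
```

Here exactly the 30 large entries survive the threshold, comfortably within the guaranteed bound of (δ/2)·n = 50.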

21 Open Problem. Find integrality gaps for the SDP relaxation (minimize the sum of squared edge lengths subject to v_i·v_j ≥ 0, average squared length = 1, average pairwise correlation < δ). The current best gaps have δ = 1/poly(log n); do there exist integrality gaps with δ = 1/poly(n)?

22 Thank You

23 Spectral Profile. Unlike the second eigenvalue, the spectral profile Λ(δ) (minimized over |Support(x)| < δn) is not the optimum of a convex program. Remarks: Λ(δ) ≤ minimum of Ф(δ₀) over all δ₀ ≤ δ; Λ(δ) is the smallest possible eigenvalue of a submatrix of L of size at most δn; the x_i can be assumed to be all positive.

24 Small Sets via the Spectral Profile. An analysis along the lines of the analysis of Cheeger’s inequality yields: Theorem [Raghavendra-Steurer-Tetali 10]: there is a poly-time algorithm that finds a set S of size O(δ) with expansion(S) ≤ sqrt(Λ*(δ)·O(log 1/δ)). So if there is a set S with expansion(S) = ε, the algorithm finds a set of expansion O(sqrt(ε·log(1/δ))), similar in behaviour to the Gaussian expansion profile.

25 Spectral Profile. From the second eigenvalue to the spectral profile over |Support(x)| < δn: replace each x_i by a vector v_i, subject to v_i·v_j ≥ 0.

26 Graph Expansion. (Recall: d-regular graph G; expansion(S) = (# edges leaving S)/(d·|S|); Ф_G = minimum of expansion(S) over |S| ≤ n/2; Uniform Sparsest Cut: compute Ф_G and find the achieving set S.) Approximation algorithms: Cheeger’s inequality [Alon][Alon-Milman]: if the normalized adjacency matrix of G has second eigenvalue λ₂, then (1-λ₂)/2 ≤ Ф_G ≤ sqrt(2(1-λ₂)). A log n approximation algorithm [Leighton-Rao]. A sqrt(log n) approximation algorithm using semidefinite programming [Arora-Rao-Vazirani]. Extremely well studied, in many different contexts: pseudo-randomness, group theory, online routing, Markov chains, metric embeddings, …

27 Limitations of Eigenvalues. The best lower bound Cheeger’s inequality gives on expansion is (1-λ₂)/2 < 1/2, while Ф(δ) can be close to 1. Consider the graph G on {0,1}^n connecting pairs of points at Hamming distance εn. Then the second eigenvalue is ≈ 1-ε and Ф(1/2) ≈ ε, yet Ф(δ) ≈ 1 for small δ (small sets have near-perfect expansion). A simple SDP relaxation cannot distinguish between “all small sets expand almost completely” and “there exists a small set with almost no expansion”.

28 A Conjecture. Small-Set Expansion Conjecture: ∀η > 0, ∃δ > 0 such that GapSSE(η, δ) is NP-hard, i.e., given a graph G it is NP-hard to distinguish YES: expansion(S) < η for some S of volume ≈ δ; NO: expansion(S) > 1-η for all S of volume ≈ δ.

29 Road Map. Algorithm: spectral profile. Reductions within expansion. Relationship with the Unique Games Conjecture.

30 Gaussian Graph. Vertices: all points in R^d (d-dimensional real space). Edges: weighted according to the following sampling procedure: sample a random Gaussian point x in R^d; perturb x to get y in R^d (e.g., y = (1-ε)x + sqrt(1-(1-ε)²)·g for a fresh Gaussian g); add an edge between x and y. Γ_ε(δ) = the Gaussian noise sensitivity of a set of measure δ; the least-expanding sets of measure δ are caps/threshold sets. (Figure: the Gaussian expansion curve Ф(δ) against set size δ.)
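The noise sensitivity of a cap can be estimated by direct sampling. A Monte-Carlo sketch, assuming the perturbation y = ρ·x + sqrt(1-ρ²)·g with correlation ρ = 1-ε (a standard parametrization; the function name and sample sizes are mine):

```python
import numpy as np

def gaussian_noise_expansion(rho, thresh, n_samples=200_000, seed=0):
    """Monte-Carlo estimate of the expansion of the halfspace {x > thresh}
    in the Gaussian graph: sample x, set y = rho*x + sqrt(1-rho^2)*g,
    and return P[y not in S | x in S]."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n_samples)
    g = rng.standard_normal(n_samples)
    y = rho * x + np.sqrt(1.0 - rho**2) * g
    in_S = x > thresh
    return float(np.mean(y[in_S] <= thresh))

# For a measure-1/2 halfspace (thresh = 0) the exact value is arccos(rho)/pi
# (Sheppard's formula), e.g. 1/2 at rho = 0 and -> 0 as rho -> 1.
```

At ρ = 0 the endpoints are independent, so a measure-1/2 cap has expansion 1/2; as ρ grows (ε shrinks) the expansion drops, tracing out the Gaussian curve on this slide.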

31 Approximating Expansion Profile

32 Reductions within Expansion

33 Theorem [Raghavendra-Steurer-Tulsiani 10]. For every positive integer q and constants ε, δ, γ, given a graph it is SSE-hard to distinguish between: (YES) there exist q disjoint small sets S_1, S_2, …, S_q, each of size close to 1/q, with expansion(S_i) ≤ ε; (NO) no set of size μ > δ has expansion less than Γ_{ε/2}(μ), the expansion of a set of measure μ in the Gaussian graph with parameter ε/2.

34 Informal Statement. Qualitative assumption: GapSSE is NP-hard. Quantitative statement: given a graph G, it is SSE-hard to distinguish whether there is a small set of size δ whose expansion is ε, or every small set of size μ (in a certain range) expands at least as much as the corresponding set in the Gaussian graph with noise ε. (Figure: the curve Ф(δ) against set size δ.)

35 Corollaries. Corollary: the algorithm in [Raghavendra-Steurer-Tetali 10] has a near-optimal guarantee assuming the SSE conjecture. Corollary: assuming GapSSE, there is no constant-factor approximation for Balanced Separator or Uniform Sparsest Cut.

36 Relation with Unique Games Conjecture

37 Unique Games. A unique game Γ: a graph of size n where each edge (A, B) has an associated map π, a bijection from the label set L(A) to L(B). A labelling satisfies (A, B) if π(label of A) = label of B. Goal: find an assignment of labels to vertices satisfying the maximum number of edges. Two-prover view: the referee samples (A, B, π) and sends A to Player 1 and B to Player 2 (no communication between players); Player 1 picks a in L(A), Player 2 picks b in L(B); the players win if π(a) = b. value(Γ) = the maximum success probability over all strategies of the players.
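The satisfied-edge count from this slide is mechanical to evaluate. A minimal sketch (the tiny instance and names are illustrative, not from the talk), with each bijection π stored as a dict:

```python
def game_value(edges, labelling):
    """Fraction of constraints satisfied: edge (A, B, pi) is satisfied
    when pi maps A's label to B's label."""
    satisfied = sum(1 for A, B, pi in edges if pi[labelling[A]] == labelling[B])
    return satisfied / len(edges)

# Tiny instance over label set {0, 1}:
edges = [
    ("u", "v", {0: 1, 1: 0}),   # v's label must differ from u's
    ("v", "w", {0: 0, 1: 1}),   # w's label must equal v's
]
```

Because each π is a bijection, fixing one vertex's label forces its neighbors' labels, which is exactly what makes these games "unique".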

38 Unique Games Conjecture [Khot ‘02]: ∀ε > 0, ∃q > 0: it is NP-hard to distinguish, for Γ with label set size q, YES: value(Γ) > 1-ε; NO: value(Γ) < ε.

39 Implications of UGC. UGC ⇒ Basic SDP is optimal for … Constraint Satisfaction Problems [Raghavendra`08] (Max Cut, Max 2Sat); Ordering CSPs [GMR`08] (Max Acyclic Subgraph, Betweenness); Grothendieck Problems [KNS`08, RS`09]; Metric Labeling Problems [MNRS`08] (Multiway Cut, 0-Extension); Kernel Clustering Problems [KN`08,10]; Strict Monotone CSPs [KMTV`10] (Vertex Cover, Hypergraph Vertex Cover); …

40 “Reverse Reductions”. UGC ⇒ Basic SDP is optimal for lots of optimization problems, e.g. Max Cut and Vertex Cover. If we could show “Basic SDP optimal for Problem X ⇒ UGC”, then a refutation of UGC would imply an improved algorithm for Problem X: a win-win situation. Parallel repetition is a natural candidate reduction for Problem X = Max Cut [Feige-Kindler-O’Donnell’07]. Bad news: this reduction cannot work [Raz’08, BHHRRS’08].

41 Small Set Expansion and Unique Games. Solving Unique Games ⟺ finding a small non-expanding set in the “label-extended graph”. Theorem [Raghavendra-Steurer 10]: Small Set Expansion Conjecture ⇒ Unique Games Conjecture. This establishes a reverse connection from a natural problem.

42 Implications of UGC. UGC ⇒ Basic SDP is optimal for … Constraint Satisfaction Problems [Raghavendra`08] (Max Cut, Max 2Sat); Ordering CSPs [GMR`08] (Max Acyclic Subgraph, Betweenness); Grothendieck Problems [KNS`08, RS`09]; Metric Labeling Problems [MNRS`08] (Multiway Cut, 0-Extension); Kernel Clustering Problems [KN`08,10]; Strict Monotone CSPs [KMTV`10] (Vertex Cover, Hypergraph Vertex Cover); Uniform Sparsest Cut [KNS`08, RS`09]; Minimum Linear Arrangement [KNS`08, RS`09]; … (Diagram: GapSSE ⇒ UGC, with expansion.)

43 Reverse Connections? Most known SDP integrality-gap instances for problems like Max Cut, Vertex Cover, and Unique Games are graphs that are “small-set expanders”. Theorem [Raghavendra-Steurer-Tulsiani 10]: Small Set Expansion Conjecture ⇒ Max Cut or Unique Games on small-set expanders is hard.

44 Approximating Spectral Profile


46 Roadmap. Introduction. Graph expansion: Cheeger’s inequality, Leighton-Rao, ARV. Expansion profile: small sets expand more than large ones; Cheeger’s inequality and SDPs fail; the GapSSE problem. Relation to the Unique Games Conjecture: definition of Unique Games, applications, lack of reverse reductions; label-extended graph ⟺ small sets; Small Set Expansion Conjecture ⇒ UGC; UGC with SSE is easy. Algorithm for SSE: spectral profile, SDP for the spectral profile, rounding algorithm. Reductions within expansion: GapSSE ⇒ Balanced Separator hardness.


48 NP-hard Optimization. Example, Max Cut: partition the vertices of a graph into two sets so as to maximize the number of cut edges. A fundamental graph partitioning problem and a benchmark for algorithmic techniques.

49 Approximating Max Cut. Trivial approximation: random assignment cuts ½ of the edges. First non-trivial approximation, based on a semidefinite relaxation (Basic SDP): the Goemans–Williamson algorithm (‘95), approximation ratio 0.878. Beyond Max Cut: an analogous Basic SDP relaxation exists for many other problems, and almost always Basic SDP gives the best known approximation (often strictly better than non-SDP methods).

50 Approximating Max Cut. Trivial approximation: random assignment cuts ½ of the edges. First non-trivial approximation, based on a semidefinite relaxation (Basic SDP): the Goemans–Williamson algorithm (‘95), approximation ratio 0.878. Can we beat the approximation guarantee of Basic SDP?

51 Unique Games Conjecture. Unique Games Conjecture [Khot’02] (roughly): it is NP-hard to approximate the value of Unique Games, a certain optimization problem: given equations of the form x_i - x_j = c_ij mod q, satisfy as many as possible. Can we beat the approximation guarantee of Basic SDP for Max Cut? No, assuming Khot’s Unique Games Conjecture! [KKMO`05, MOO`05, OW’08]
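The equational form of Unique Games on this slide is concrete enough to brute-force on tiny instances. A sketch (the instance is illustrative, not from the talk): try every assignment over Z_q and report the best satisfied fraction.

```python
from itertools import product

def max_satisfied(eqs, n_vars, q):
    """Brute-force optimum of a Unique Games instance given as equations
    x_i - x_j = c (mod q): try every assignment of Z_q values."""
    best = 0
    for assign in product(range(q), repeat=n_vars):
        sat = sum(1 for i, j, c in eqs if (assign[i] - assign[j]) % q == c)
        best = max(best, sat)
    return best / len(eqs)

# Three variables mod 3. Summing the three constraints gives 0 = 2 (mod 3),
# so they cannot all hold simultaneously, but any two of them can.
eqs = [(0, 1, 0), (1, 2, 1), (2, 0, 1)]
```

The conjecture is about distinguishing value close to 1 from value close to 0 when q is large; this q^n search is only to make the problem statement tangible.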

52 UGC: Is It True? UGC on expanding instances? [AKKSTV’08, AIMS’10]. UGC on product instances? [BHHRRS’08, Raz’08]. UGC for certain SDP hierarchies [RaghavendraS’09, KhotSaket’09]. Any non-trivial consequences if UGC is false? What are the hard instances?

53 “Reverse Reductions”. UGC ⇒ Basic SDP is optimal for lots of optimization problems, e.g. Max Cut and Vertex Cover. Basic SDP optimal for Small-Set Expansion ⇒ UGC: the first reduction from a natural combinatorial problem to Unique Games. Win-win situation: a refutation of UGC implies an improved algorithm for Small-Set Expansion (better than Basic SDP!).

54 Approximating Small-Set Expansion. Expansion profile of G at δ: the minimum of expansion(S) over all S with volume < δ. How well can we approximate the expansion profile of G for small δ? Basic SDP cannot distinguish between “all small sets expand almost completely” and “there exists a small set with almost no expansion”. Small-Set Expansion Conjecture: ∀ε > 0, ∃δ > 0: NP-hard to distinguish YES: expansion(S) < ε for some S of volume ≈ δ; NO: expansion(S) > 1-ε for all S of volume ≈ δ.

55 Unique 2-Prover Games. A unique game Γ: a distribution D over triples (A, B, π), where A and B are drawn from a universe of size n with label sets L(A) and L(B), and π is a bijection from L(A) to L(B). The referee samples (A, B, π) from D; Player 1 picks a in L(A), Player 2 picks b in L(B) (no communication between players); the players win if π(a) = b. value(Γ) = the maximum success probability over all strategies of the players.

56 Approximating Unique Games. Unique Games Conjecture [Khot ‘02]: ∀ε > 0, ∃q > 0: NP-hard to distinguish, for Γ with label set size q, YES: value(Γ) > 1-ε; NO: value(Γ) < ε. How well can we approximate the value of a unique game for large label sets? Basic SDP cannot distinguish between games with value ≈ 1 and ≈ 0 for large label sets.

57 Approximating Unique Games. Unique Games Conjecture: ∀ε > 0, ∃q > 0: NP-hard to distinguish, for Γ with label set size q, YES: value(Γ) > 1-ε; NO: value(Γ) < ε. Our main theorem: Small-Set Expansion Conjecture ⇒ Unique Games Conjecture.

58 Reduction: Small-Set Expansion → Unique Games. Task: find a non-expanding set of volume ≈ δ in a graph G. The referee samples R = 1/δ random edges M; A = one half of each edge, B = the other half of each edge. Player 1 picks a ∈ A, Player 2 picks b ∈ B; the players win if (a, b) ∈ M.

59 Completeness: Small Non-Expanding Set ⇒ (Partial) Strategy. Suppose expansion(S) < ε. Strategy for Player 1: pick a ∈ A if a is the unique intersection of A with S; otherwise, refuse to answer. With constant probability, A ∩ S = {a}. Conditioned on this event, the other half of a’s edge lies outside S with probability < ε, so the partial game value is > 1-ε. (The referee allows the players to refuse on a few queries.)

60 Soundness: Strategy ⇒ Small Non-Expanding Set. Standard trick: we can assume both players use the same strategy. Suppose the players win with probability > 1-ε. Idea: turn the strategy into a distribution over sets: sample R-1 vertices U and output S = { x | Player 1 picks x if A = U+x }. Easy to show: E[volume(S)] = 1/R = δ and E[# edges leaving S] / E[d·|S|] < ε. Are we done? No! Problem: volume(S) might not be concentrated around δ.

61 Soundness: Strategy ⇒ Small Non-Expanding Set. Suppose the players win with probability > 1-ε. Idea: the referee adds ε-noise to A and B, so each query consists of R-1 random vertices plus εR noise vertices. New distribution over sets: sample R-1 vertices U; S = { x | the players pick x with probability > ½ if A = U+x+noise }. Intuition: the players cannot distinguish x from the noise. Can show: ∀U, volume(S) ≤ δ/ε.

62 Summary. UGC ⇒ Basic SDP is optimal for lots of optimization problems, e.g. Max Cut and Vertex Cover. Basic SDP optimal for Small-Set Expansion ⇒ UGC. Open questions (+ subsequent work): more reverse reductions? Hardness results based on the Small-Set Expansion Conjecture? (Reductions between expansion problems [Raghavendra-S-Tulsiani’10].) How does the noise here compare to the noise in other hardness reductions? Better algorithms for Unique Games via Small-Set Expansion? (A 2^{n·poly(ε)} algorithm for (1-ε, ½)-UG via SSE [Arora-Barak-S’10].) Thanks! Questions?


68 APPENDIX

69 Rounding a UG strategy to a small non-expanding set


71 Unique Games. Decorated bipartite graph: each edge (u, v) has label sets L(u) and L(v) and a bijection π_uv : L(u) → L(v). A Unique Games instance Γ: the referee picks a random edge (u, v); Player 1 picks i in L(u), Player 2 picks j in L(v); each player knows only half of the referee’s edge; the players win if π_uv(i) = j. value(Γ) = the maximum success probability over all strategies of the players.

72 Approximating Max Cut. Trivial approximation: random assignment cuts ½ of the edges. First non-trivial approximation, based on a semidefinite relaxation (Basic SDP): the Goemans–Williamson algorithm (‘95), approximation ratio 0.878. General constraint satisfaction problems (CSPs): for every CSP there is a simple generic approximation algorithm matching the integrality gap of a certain SDP [RaghavendraS’09], with near-linear running time [S’10]. Can we beat the approximation guarantee of Basic SDP (for Max Cut)?

73 Combinatorial Optimization. Example, Max Cut: partition the vertices of a graph into two sets so as to maximize the number of cut edges. A concrete practical application: CAIDA at UCSD analyzes business relationships among Autonomous Systems in the internet using Max 2Sat ≈ Max Cut.

