Expander Flows, Graph Spectra and Graph Separators
Umesh Vazirani, U.C. Berkeley
Based on joint work with Khandekar and Rao, and with Orecchia, Schulman and Vishnoi

Graph Separators
Sparsest Cut / Edge Expansion: α(G) = min over S ⊆ V with |S| ≤ |V|/2 of |E(S, S̄)| / |S|
c-Balanced Separator: the same minimum, restricted to sets S with |S| ≥ c·|V|

Applications
- Clustering
- Image segmentation
- VLSI layout
- Underlie many divide-and-conquer graph algorithms

Interesting Techniques
- Spectral methods; connections to differential geometry and discrete isoperimetric inequalities
- Linear/semidefinite programming
- Measure concentration
- Metric embeddings

Geometrical view
Map vertices to points in some abstract space:
- points well-spread
- edges short
A "good bisection" of the space yields a sparse cut in the graph.

Spectral Method
Cut at random: sort vertices by x_i and cut at a random threshold.
Minimize sum of "edge lengths": Σ_{(i,j)∈E} (x_i − x_j)²
Spread out vertices: Σ_i x_i = 0, Σ_i x_i² = 1
[Cheeger '70] [Alon, Milman '85] [Jerrum, Sinclair '89]
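To make the spectral method concrete, here is a minimal Python sketch (my own code and toy graph, not from the talk): compute the second eigenvector of the graph Laplacian, which solves the relaxation above, and recover a cut by sweeping a threshold along it.

```python
# Minimal spectral partitioning sketch: Fiedler vector + sweep cut.
import numpy as np

def sweep_cut(adj):
    """Best threshold cut along the second eigenvector of the Laplacian."""
    n = adj.shape[0]
    L = np.diag(adj.sum(axis=1)) - adj          # combinatorial Laplacian
    _, vecs = np.linalg.eigh(L)                 # eigenvalues in ascending order
    order = np.argsort(vecs[:, 1])              # sort vertices by Fiedler value
    best, best_set = np.inf, None
    for k in range(1, n):                       # sweep over prefixes of the order
        mask = np.zeros(n, dtype=bool)
        mask[order[:k]] = True
        cut = adj[mask][:, ~mask].sum()         # edges crossing the cut
        expansion = cut / min(k, n - k)
        if expansion < best:
            best, best_set = expansion, order[:k]
    return best, best_set

# Toy example: two 4-cliques joined by a single bridge edge.
A = np.zeros((8, 8))
for block in (range(0, 4), range(4, 8)):
    for i in block:
        for j in block:
            if i != j:
                A[i, j] = 1
A[3, 4] = A[4, 3] = 1
print(sweep_cut(A))                             # finds the bridge: expansion 0.25
```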

Leighton-Rao '88
Cut along a ball of random radius.
Distances form a metric: they satisfy the triangle inequality w_ij + w_jk ≥ w_ik.
Minimize sum of "edge lengths": Σ_{(i,j)∈E} w_ij
Spread out vertices: Σ_{i<j} w_ij ≥ n² (up to normalization)
O(log n) approximation; approximate max-flow min-cut theorem for multi-commodity flows.

ARV '04
Unit ℓ₂² embedding: unit vectors v_i on the sphere in R^d satisfying the triangle inequality |v_i − v_j|² + |v_j − v_k|² ≥ |v_i − v_k|² (no obtuse angles).
Minimize sum of "edge lengths": Σ_{(i,j)∈E} |v_i − v_j|²
Spread out vertices: Σ_{i<j} |v_i − v_j|² ≥ n²
Procedure to recover a cut within O(√log n) of the sparsest.
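A rough sketch of this relaxation as a semidefinite program, solved here with cvxpy (my choice of tooling; the talk does not prescribe one). The Gram matrix X stands for the inner products v_i·v_j, so X[i,i] = 1 places the vectors on the unit sphere and the ℓ₂² triangle inequalities become linear constraints on X. With roughly n³ triangle constraints this is only practical for very small graphs.

```python
# ARV-style SDP relaxation sketch (small n only).
import itertools
import numpy as np
import cvxpy as cp

def arv_sdp(adj):
    n = adj.shape[0]
    X = cp.Variable((n, n), PSD=True)                    # Gram matrix of the v_i
    d2 = lambda i, j: X[i, i] + X[j, j] - 2 * X[i, j]    # |v_i - v_j|^2
    cons = [X[i, i] == 1 for i in range(n)]              # unit vectors
    cons += [d2(i, j) + d2(j, k) >= d2(i, k)             # ell_2^2 triangle inequality
             for i, j, k in itertools.permutations(range(n), 3)]
    cons += [sum(d2(i, j)                                # spread constraint
                 for i, j in itertools.combinations(range(n), 2)) >= n * n]
    obj = cp.Minimize(sum(adj[i, j] * d2(i, j)           # total "edge length"
                          for i, j in itertools.combinations(range(n), 2)))
    cp.Problem(obj, cons).solve()
    w, V = np.linalg.eigh(X.value)                       # recover vectors from X
    return V * np.sqrt(np.clip(w, 0, None))              # rows approximate the v_i
```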

ARV: Procedure to recover a cut
Slice with a randomly oriented "fat" hyperplane of width Δ ≈ 1/√log n through the unit sphere in R^d.
Discard pairs of points (u, v) that fall on opposite sides of the slab but are close in ℓ₂² distance.
Arrange the points according to their distance from the resulting set S, and cut along a ball of random radius r.
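For flavor only, a very loose Python sketch of this rounding step, applied to vectors such as those returned by arv_sdp above. The width Δ ≈ 1/√log n and the crude discarding rule here are illustrative stand-ins; the actual ARV analysis of which pairs are discarded, and why two large well-separated sets survive, is the technical heart of the paper.

```python
# Schematic fat-hyperplane rounding (not the real ARV procedure).
import numpy as np

def fat_hyperplane_round(vecs, rng=None):
    n, d = vecs.shape
    if rng is None:
        rng = np.random.default_rng()
    delta = 1.0 / np.sqrt(np.log(n) + 2)                 # slab width ~ 1/sqrt(log n)
    g = rng.standard_normal(d)
    proj = vecs @ g / np.sqrt(d)                         # random projection
    left = [u for u in range(n) if proj[u] >= delta / 2]
    right = [v for v in range(n) if proj[v] <= -delta / 2]
    # Stand-in for the discarding step: drop left points that have a close
    # (in ell_2^2 distance) partner on the right.
    S = [u for u in left
         if all(np.sum((vecs[u] - vecs[v]) ** 2) >= delta for v in right)]
    if not S:
        return None
    # Order points by ell_2^2 distance from S, cut along a ball of random radius.
    dist = np.array([min(np.sum((vecs[w] - vecs[u]) ** 2) for u in S)
                     for w in range(n)])
    r = rng.uniform(0, dist.max() if dist.max() > 0 else 1.0)
    return [w for w in range(n) if dist[w] <= r]
```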

Metric Embeddings
A finite metric space (X, d) is mapped by f into R^k with the L₂ norm; the distortion of f is the least c such that d(x, y) ≤ |f(x) − f(y)| ≤ c·d(x, y) (up to scaling).
[Bourgain '85] Every finite metric space can be embedded in L₂ with distortion O(log n).
Longstanding open question: better bound for L₁ metrics? [Enflo '69]
[Arora, Lee, Naor '05] Any finite L₁ metric can be embedded in L₂ with distortion O(√log n · log log n).

Today's Talk
Multi-commodity flow:
- Leighton-Rao: multi-commodity flow, O(n²).
- Arora, Hazan, Kale: O*(n²) implementation of ARV based on the expander-flow formalism; much faster in practice.
Single commodity flow:
- [Khandekar, Rao, V]: O*(min{n^{1.5}, n/α(G)}) single-commodity-flow based algorithm, O(log² n) approximation ratio.
- [Arora, Kale]: matrix multiplicative weights based algorithm, O(log n) approximation.
- [Orecchia, Schulman, V, Vishnoi]: O(log n) approximation using a KRV-style algorithm.

Expander Flows
Any algorithm for approximating sparse cuts must find a good cut, of expansion say β.
It must also certify that no cut is much smaller: to give a k-approximation, it must certify that no cut has expansion less than β/k.
Problem: there are exponentially many cuts.

Expander Flows
Take a candidate expander H on the vertex set of G. For each edge of H, route one unit of flow through G, with maximum congestion c on any edge of G.
Since H is an expander, Ω(|S|) units of flow must cross from S to S̄ for every cut (S, S̄); therefore |E(S, S̄)| = Ω(|S|/c), i.e. expansion = Ω(1/c).
Ideally c = O(1/α(G)).

Expander Flows
Max congestion c certifies expansion Ω(1/c).
ARV: max congestion = O(√log n / α(G)).
Leighton-Rao: H = complete graph, max congestion = O(log n / α(G)); tight example: G = an expander graph.
Motivating idea for ARV: write an LP to find the best embedding of H in G, plus exponentially many constraints saying H is an expander; the eigenvalue bound gives an efficient test for expansion, so the LP can be solved in polynomial time using the Ellipsoid algorithm.
[Arora, Hazan, Kale]: O*(n²) implementation of ARV.

KRV
If we knew a large number of vertices on each side of the cut, a single max-flow / min-cut computation (connect a source s to one side and a sink t to the other) should reveal a sparse cut. But this is circular…

Outline of Algorithm
Embed a candidate expander H in G with small congestion.
Test whether H is an expander (if so, done!).
Else a non-expanding cut in H gives a bipartition of G; route a flow in G across this bipartition.
Decompose the flow into flow paths and add the resulting matching to H.

Cut-Matching Game
Cut player: finds a bad (non-expanding) cut in H. Goal: minimize the number of iterations until H is an expander.
Matching player: picks a perfect matching across the cut. Goal: maximize the number of iterations until H is an expander.
Claim: there is a cut-player strategy that succeeds in O(log² n) rounds.
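A minimal, self-contained sketch of the game loop (function names and the stand-in matching player are mine). The cut player below is the charge-mixing strategy described on the next slides; in the actual algorithm the matching player's answer comes from a max-flow computation in G across the proposed bipartition, which is what ties the game back to finding a sparse cut or certifying expansion.

```python
# Cut-matching game skeleton with the KRV-style charge-mixing cut player.
import numpy as np

def cut_player(n, matchings, rng):
    """Propose a bisection by mixing random +/-1 charges along the matchings."""
    charge = rng.permutation([+1.0] * (n // 2) + [-1.0] * (n // 2))
    for M in matchings:                          # round-robin averaging steps
        for u, v in M:
            charge[u] = charge[v] = (charge[u] + charge[v]) / 2
    order = np.argsort(charge)
    return order[: n // 2], order[n // 2 :]      # (S, S-bar)

def matching_player(S, T):
    """Stand-in: any perfect matching across (S, T); KRV obtains one from a max-flow in G."""
    return list(zip(sorted(S), sorted(T)))

def cut_matching_game(n, rounds, seed=0):
    rng = np.random.default_rng(seed)
    matchings = []
    for _ in range(rounds):                      # O(log^2 n) rounds suffice (claim above)
        S, T = cut_player(n, matchings, rng)
        matchings.append(matching_player(S, T))
    return matchings                             # their union is the candidate expander H

H = cut_matching_game(n=16, rounds=20)
```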

Finding a cut: spectral-like method
After t iterations, H = {M₁, M₂, …, M_t}.
Randomly assign each vertex of V a charge of +1 or −1 (half the vertices each).
Mix the charges along the matchings M₁, M₂, …, M_t: for each matched edge (x, y), replace the two charges x and y by their average (x + y)/2.

Finding a cut: spectral-like method
Order the vertices according to the final charge present and cut in half: the n/2 vertices with the largest charge form S, the rest form S̄.
But how do we formalize this intuition?

Lift to R^n
We cannot directly formalize the previous intuition, so we lift the random walk to R^n: the walk embedding of H, with an n-dimensional vector associated with each vertex.
In each step, replace the vectors at the endpoints of a matched edge by their average vector.
A potential function measures the progress of this process; if the potential is small, H is an expander.
Relate the lifted process to the original random walk: each successive matching decreases the potential function.

Walk Embedding
H_t = {M₁, M₂, …, M_t}. Embed H in R^n: vertex i is mapped to P_i = (p_i1, …, p_in), where p_ij = Pr[walk started at j ends at i].
A small cut in the graph shows up as clusters in the walk embedding.
Potential: ψ(t) = Σ_i |P_i − (1/n, …, 1/n)|², with ψ(0) = n − 1.
Claim: ψ(t) ≤ 1/(4n²) implies α(H_t) ≥ ½.
We will show the potential reduces by a factor of (1 − 1/log n) in each iteration.
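A small numerical companion to this slide (my code, not from the talk): P starts as the identity matrix, each matching averages the two matched rows, and ψ measures the squared distance from the uniform distribution. Fed the matchings produced by the cut_matching_game sketch above, ψ starts at n − 1 and can be watched falling toward the 1/(4n²) threshold.

```python
# Walk embedding of H_t = {M_1, ..., M_t} and its potential psi.
import numpy as np

def walk_embedding_potential(n, matchings):
    P = np.eye(n)                                # P[i, j] = Pr[walk from j is now at i]
    uniform = np.full(n, 1.0 / n)
    psi = [np.sum((P - uniform) ** 2)]           # psi(0) = n - 1
    for M in matchings:
        for u, v in M:
            avg = (P[u] + P[v]) / 2              # averaging step of the lifted walk
            P[u], P[v] = avg, avg.copy()
        psi.append(np.sum((P - uniform) ** 2))
    return psi

# e.g. walk_embedding_potential(16, cut_matching_game(n=16, rounds=40))
```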

The Lifted Walk
Main question: how do we augment H_t = {M₁, M₂, …, M_t} by M_{t+1} so that H moves closer to an expander?
Potential: ψ(t) = Σ_i |P_i − (1/n, …, 1/n)|².
If M_{t+1} matches vertex u to vertex v, then each of P_u and P_v is replaced by (P_u + P_v)/2, so the potential reduction in the (t+1)-st step is |P_u − P_v|²/2.
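For completeness, a short derivation of that reduction (my own working, writing c = (1/n, …, 1/n) and using the identity ‖a‖² + ‖b‖² − ½‖a + b‖² = ½‖a − b‖²):

```latex
\Bigl(\|P_u - c\|^2 + \|P_v - c\|^2\Bigr) \;-\; 2\Bigl\|\tfrac{P_u + P_v}{2} - c\Bigr\|^2
  \;=\; \|P_u - c\|^2 + \|P_v - c\|^2 - \tfrac12\bigl\|(P_u - c) + (P_v - c)\bigr\|^2
  \;=\; \tfrac12\,\|P_u - P_v\|^2 .
```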

Potential Reduction
ψ = Σ_v |P_v − (1/n, …, 1/n)|², and the reduction in ψ is the sum of |P_u − P_v|²/2 over matched pairs.
In one dimension, cutting at the median and matching across the cut reduces the potential by Ω(ψ).
Going from n dimensions to one dimension via a random projection costs an O(log n) stretch, so the actual potential reduction is Ω(ψ/log n).
The original random walk on charges is exactly the projection of the lifted walk onto a random vector.

Running time
Number of iterations = O(log² n). Each iteration = 1 max-flow + O*(n) work = O*(m^{3/2}).
[Benczur-Karger '96] In O*(m) time, we can transform any graph G on n vertices into G' on the same vertices:
- G' has O(n log(n)/ε²) edges
- all cuts in G' have size within (1 ± ε) of those in G
Overall running time = O*(m + n^{3/2}).

Improving to O(log n) approximation
[Arora, Kale]: matrix-multiplicative-weights based combinatorial primal-dual schema for semidefinite programs.
[Orecchia, Schulman, V, Vishnoi]: a simple KRV-style algorithm.
Idea: to find M_{t+1}, perform t steps of the natural random walk (instead of the round-robin walk) on H_t = {M₁, M₂, …, M_t}.

Brief Sketch
Instead of showing that H has constant edge expansion after O(log² n) steps, we will show that the spectral gap of H is at least 1/log n, and therefore the conductance of H is at least 1/log n. Since the degree of H is log² n, this means its edge expansion is at least log n.

Why the natural walk?
Suppose the round-robin walk on M₁, …, M_k mixes perfectly on each of S and T. Then a single averaging step on M_{k+1} (a matching across S and T) ensures perfect mixing on the entire graph!

Matrix inequality
Question: replace the ½ self-loop with a ¾ self-loop in the round-robin random walk. For a matched edge with charges x and y, the averaging step becomes x ↦ 3x/4 + y/4 and y ↦ 3y/4 + x/4.
A matrix inequality then gives a way of relating the round-robin walk to the time-independent walk.
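Concretely, the lazier step changes only the averaging line of the charge-mixing cut player sketched earlier; a laziness parameter of 1/2 recovers the original step (the function name and parameter are mine).

```python
# Lazier averaging step: a 3/4 self-loop instead of 1/2.
def lazy_mix(charge, matchings, lazy=0.75):
    for M in matchings:
        for u, v in M:
            cu, cv = charge[u], charge[v]
            charge[u] = lazy * cu + (1 - lazy) * cv      # x -> 3x/4 + y/4
            charge[v] = lazy * cv + (1 - lazy) * cu      # y -> 3y/4 + x/4
    return charge
```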

Conclusions and Open Questions
Our algorithm is very similar to some heuristics; [Lang '04] is similar to one iteration of our algorithm.
METIS [Karypis-Kumar '99]:
- collapses random edges
- finds a good partition in the collapsed graph
- induces it back up to the original graph, using local search
Connections with these heuristics? Rigorous analysis?

When the Expansion is large …
We could have used the [Spielman-Teng '04] "nibble" algorithm instead of the walk embedding. But:

Algorithm     | Output sparsity | Running time
Spectral      | α^{1/2}         | n²/α²
Spielman-Teng | α^{1/3} log³ n  | n/α³
KRV           | α log² n        | min{n^{3/2}, n/α}

Conjecture: a single iteration of the round-robin walk + max-flow should give a sparse cut.

Limits to these methods
[Khot, Vishnoi]: Ω(log log n) integrality gap.
[Orecchia, Schulman, V, Vishnoi]: Ω(√log n) lower bound on the cut-matching game.
Is it possible to obtain an O(√log n) approximation algorithm using single-commodity flows via the cut-matching game?