Mixing Times of Self-Organizing Lists and Biased Permutations
Sarah Miracle, Georgia Institute of Technology

Joint work with Prateek Bhakta, Dana Randall and Amanda Streib

Biased Card Shuffling
[figure: a row of n cards with an adjacent pair i, j highlighted]
- Pick a pair of adjacent cards uniformly at random.
- Put card j ahead of card i with probability p_{j,i} = 1 - p_{i,j}.
Converges to: π(σ) = (1/Z) ∏_{i<j : j precedes i in σ} p_{j,i}/p_{i,j}.
This is related to the "Move-Ahead-One" algorithm for self-organizing lists.
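As a concrete illustration (not from the talk), here is a minimal Python sketch of one step of this nearest-neighbor chain M_nn; the dict p is an assumed encoding that stores both orientations, with p[(j, i)] = 1 - p[(i, j)]:

```python
import random

def mnn_step(sigma, p):
    """One step of M_nn: pick an adjacent pair of cards uniformly at random,
    then put card j ahead of card i with probability p[(j, i)] = 1 - p[(i, j)]."""
    k = random.randrange(len(sigma) - 1)   # left position of the adjacent pair
    i, j = sigma[k], sigma[k + 1]
    if random.random() < p[(j, i)]:        # reorder the pair as (j, i)
        sigma[k], sigma[k + 1] = j, i
    return sigma
```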

What is already known?
Uniform bias: If p_{i,j} = 1/2 ∀ i, j, then M_nn mixes in Θ(n³ log n) time. [Wilson]
Constant bias: If p_{i,j} = p > 1/2 ∀ i < j, then M_nn mixes in Θ(n²) time. [Benjamini, Berger, Hoffman, Mossel]
If p_{i,j} ≥ 1/2 ∀ i < j, we say the chain is positively biased.
Q: If the {p_{i,j}} are positively biased, is M_nn always rapidly mixing?
Conjecture (Fill): If the {p_{i,j}} satisfy a "regularity" condition, then the spectral gap is minimized when p_{i,j} = 1/2 ∀ i, j.
- The conjecture has been verified for n ≤ 4.

What is already known?
Uniform bias: If p_{i,j} = 1/2 ∀ i, j, then M_nn mixes in Θ(n³ log n) time. [Wilson]
Constant bias: If p_{i,j} = p > 1/2 ∀ i < j, then M_nn mixes in Θ(n²) time. [Benjamini, Berger, Hoffman, Mossel]
Linear extensions of a partial order: If p_{i,j} = 1/2 or 1 ∀ i < j, then M_nn mixes in O(n³ log n) time. [Bubley, Dyer]
M_nn is fast for two new classes: "choose your weapon" and "league hierarchies". [Bhakta, M., Randall, Streib]
M_nn can be slow even when the chain is positively biased. [Bhakta, M., Randall, Streib]

Talk Outline
1. Background ✔
2. New Classes of Bias where M_nn is fast
   - Choose your Weapon
   - League Hierarchies
3. M_nn can be slow

The mixing time
Definition: The total variation distance is
  ||P^t - π|| = max_{x∈Ω} (1/2) ∑_{y∈Ω} |P^t(x,y) - π(y)|.
Definition: Given ε, the mixing time is
  τ(ε) = min { t : ||P^{t'} - π|| < ε ∀ t' ≥ t }.
A Markov chain is rapidly mixing (or polynomially mixing) if τ(ε) is poly(n, log(ε^{-1})).
A Markov chain is slowly mixing if τ(ε) is at least exponential in n.
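On a small state space these definitions can be checked directly. A sketch, assuming numpy, with P a transition matrix and pi its stationary distribution as arrays (the conventional ε = 1/4 is used as a default):

```python
import numpy as np

def tv_distance(P, pi, t):
    """Worst-case total variation distance of P^t from pi:
    max over starts x of (1/2) * sum over y of |P^t(x, y) - pi(y)|."""
    Pt = np.linalg.matrix_power(P, t)
    return 0.5 * np.abs(Pt - pi).sum(axis=1).max()

def mixing_time(P, pi, eps=0.25, t_max=100000):
    """Smallest t with distance below eps; the worst-case distance is
    non-increasing in t, so the first such t works for all t' >= t."""
    for t in range(1, t_max):
        if tv_distance(P, pi, t) < eps:
            return t
    return None
```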

Choose your weapon
Given r_1, …, r_{n-1} with each r_i ≥ 1/2.
Thm 1: Let p_{i,j} = r_i ∀ i < j. Then M_nn is rapidly mixing.
Proof sketch:
A. Define an auxiliary Markov chain M'.
B. Show M' is rapidly mixing.
C. Compare the mixing times of M_nn and M'.
M' can swap pairs that are not nearest neighbors:
- It maintains the same stationary distribution.
- Its allowed moves are based on inversion tables.

Inversion Tables
I_σ(i) = # of elements j > i appearing before i in σ.
[figure: a permutation σ and its inversion table I_σ]
The map I is a bijection from S_n to T = { (x_1, x_2, ..., x_n) : 0 ≤ x_i ≤ n-i }.
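A quick illustrative sketch of the bijection (σ is a list over 1..n; the function names are hypothetical):

```python
def inversion_table(sigma):
    """I(i) = number of elements j > i that appear before i in sigma."""
    n = len(sigma)
    pos = {v: k for k, v in enumerate(sigma)}
    return [sum(1 for j in range(i + 1, n + 1) if pos[j] < pos[i])
            for i in range(1, n + 1)]

def from_inversion_table(x):
    """Inverse map: insert n, n-1, ..., 1 so that element i ends up with
    exactly x[i-1] larger elements before it (later, smaller insertions
    never change that count)."""
    n = len(x)
    sigma = []
    for i in range(n, 0, -1):
        sigma.insert(x[i - 1], i)
    return sigma
```

For example, inversion_table([3, 1, 2]) returns [1, 1, 0], and from_inversion_table([1, 1, 0]) recovers [3, 1, 2].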

Inversion Tables
M' on inversion tables:
- Choose a column i uniformly.
- W.p. r_i: subtract 1 from x_i (if x_i > 0).
- W.p. 1 - r_i: add 1 to x_i (if x_i < n-i).
M' is just a product of n independent biased random walks, so M' is rapidly mixing. ✓
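A sketch of one step of this walk (0-indexed, so column i stands for element i+1 and the constraint 0 ≤ x_i ≤ n-i becomes x[i] ≤ n-i-1; column n always holds 0, so it is skipped):

```python
import random

def m_prime_step(x, r):
    """One step of M' on inversion tables: a product of independent
    biased random walks, one per column."""
    n = len(x)
    i = random.randrange(n - 1)           # column n is frozen at 0
    if random.random() < r[i]:            # w.p. r_i: subtract 1 (if x_i > 0)
        if x[i] > 0:
            x[i] -= 1
    elif x[i] < n - i - 1:                # w.p. 1 - r_i: add 1 (if x_i < n-i)
        x[i] += 1
    return x
```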

Choose your weapon
Given r_1, …, r_{n-1} with each r_i ≥ 1/2.
Thm 1: Let p_{i,j} = r_i ∀ i < j. Then M_nn is rapidly mixing.
Proof sketch:
A. Define an auxiliary Markov chain M'. ✔
B. Show M' is rapidly mixing. ✔
C. Compare the mixing times of M_nn and M'.

Comparison Theorem (Diaconis, Saloff-Coste '93)
P is the unknown chain (M_nn); P̄ is the known chain (M').
For each edge (x,y) of P̄, make a path γ_{x,y} using edges in P, and let Γ(z,w) be the set of paths γ_{x,y} that use the edge (z,w).
Thm: Gap(P) ≥ (1/A) Gap(P̄), where
  A = max_{(z,w)} { (1/Q(z,w)) ∑_{γ_{x,y} ∋ (z,w)} |γ_{x,y}| π(x) P̄(x,y) }
and Q(z,w) = π(z) P(z,w).

M' on Permutations
- Choose a card i uniformly.
- Swap element i with the first j > i to its left w.p. r_i.
- Swap element i with the first j > i to its right w.p. 1 - r_i.
M' on inversion tables (for comparison): choose a column i uniformly; w.p. r_i subtract 1 from x_i (if x_i > 0); w.p. 1 - r_i add 1 to x_i (if x_i < n-i). These are the same moves: since I_σ(i) = # elements j > i appearing before i in σ, moving i ahead of the first larger element on its left decreases x_i by exactly 1.
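A sketch of the move on permutations (r is an assumed dict keyed by card value; card n never finds a larger card, so its probability is irrelevant and defaulted; if no larger card exists in the chosen direction, nothing happens):

```python
import random

def m_prime_perm_step(sigma, r):
    """One step of M' on permutations: choose a card i uniformly, then swap i
    with the first larger card to its left w.p. r_i, else to its right."""
    n = len(sigma)
    i = random.choice(sigma)              # choose a card uniformly
    k = sigma.index(i)
    step = -1 if random.random() < r.get(i, 0.5) else 1
    m = k + step
    while 0 <= m < n and sigma[m] < i:    # skip over smaller cards
        m += step
    if 0 <= m < n:                        # found the first j > i; swap
        sigma[k], sigma[m] = sigma[m], sigma[k]
    return sigma
```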

Comparing M' and M_nn
Want stationary distribution: π(σ) = (1/Z) ∏_{i<j : j precedes i in σ} p_{j,i}/p_{i,j}.
[figure: permutations σ and τ differing by one move of M']
In the example, P'(σ,τ) = r_2 = p_{2,3} and P'(τ,σ) = 1 - r_2 = p_{3,2}, and
  P'(σ,τ)/P'(τ,σ) = p_{2,3}/p_{3,2} = (p_{2,1} p_{2,3} p_{1,3}) / (p_{1,2} p_{3,2} p_{3,1}) = π(τ)/π(σ),
since p_{i,j} = r_i depends only on the smaller index, so the factors for the cards jumped over cancel in pairs. Hence M' is reversible with respect to π.

Comparing M' and M_nn
[figure: a single move of M' between permutations σ and τ]

League Hierarchy
Let T be a binary tree with leaves labeled {1,…,n}. Given q_v ≥ 1/2 for each internal vertex v.
Thm 2: Let p_{i,j} = q_{i∧j} for all i < j, where i∧j denotes the lowest common ancestor of leaves i and j in T. Then M_nn is rapidly mixing.
[figure: the root splits the players into an A-League and a B-League; the vertices below split each league into a 1st Tier and a 2nd Tier, and so on]
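A small sketch of how the bias table p_{i,j} = q_{i∧j} could be built; the encoding of T as nested tuples and of q as a dict keyed by an internal vertex's leaf set is an illustrative choice, not from the talk:

```python
def league_bias(tree, q):
    """tree: nested tuples with integer leaves, e.g. ((1, 2), (3, 4)).
    q: dict mapping frozenset-of-leaves of an internal vertex v to q_v >= 1/2.
    Returns p with p[(i, j)] = q_{i ∧ j} for each pair i < j."""
    def leaves(t):
        return {t} if isinstance(t, int) else leaves(t[0]) | leaves(t[1])

    p = {}
    def walk(t):
        if isinstance(t, int):
            return
        L, R = leaves(t[0]), leaves(t[1])
        qv = q[frozenset(L | R)]          # this vertex is the LCA of every cross pair
        for i in L:
            for j in R:
                p[(min(i, j), max(i, j))] = qv
        walk(t[0])
        walk(t[1])
    walk(tree)
    return p
```

For instance, with T = ((1, 2), (3, 4)), the pair (1, 3) gets the root's bias while (1, 2) gets the bias of the root's left child.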

League Hierarchy
Let T be a binary tree with leaves labeled {1,…,n}. Given q_v ≥ 1/2 for each internal vertex v.
Thm 2: Let p_{i,j} = q_{i∧j} for all i < j. Then M_nn is rapidly mixing.
Proof sketch:
A. Define an auxiliary Markov chain M'.
B. Show M' is rapidly mixing.
C. Compare the mixing times of M_nn and M'.
M' can swap pairs that are not nearest neighbors:
- It maintains the same stationary distribution.
- Its allowed moves are based on the binary tree T.

Thm 2: Proof sketch
Markov chain M' allows a transposition if it corresponds to a move of an ASEP (asymmetric simple exclusion process) at one of the internal vertices: at vertex v, the pair swaps toward order w.p. q_v and out of order w.p. 1 - q_v.
[figure: a swap at the vertex 2∧4, made w.p. q_{2∧4} in one direction and 1 - q_{2∧4} in the other]
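For intuition, one step of such a biased exclusion process on a 0/1 string might look like this sketch (q stands for the bias of a single vertex; s is a list of 0s and 1s recording which side of v each card comes from):

```python
import random

def asep_step(s, q):
    """One step of a biased exclusion process: pick an adjacent pair
    uniformly; if the entries differ, put the 1 first w.p. q."""
    k = random.randrange(len(s) - 1)
    if s[k] != s[k + 1]:
        if random.random() < q:
            s[k], s[k + 1] = 1, 0         # toward order, w.p. q
        else:
            s[k], s[k + 1] = 0, 1         # out of order, w.p. 1 - q
    return s
```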

What about the Stationary Distribution?
Want stationary distribution: π(σ) = (1/Z) ∏_{i<j : j precedes i in σ} p_{j,i}/p_{i,j}.
[figure: permutations σ and τ differing by a transposition of cards 5 and 6]
In the example, P'(σ,τ) = 1 - q_{5∧6} = p_{6,5} and P'(τ,σ) = q_{5∧6} = p_{5,6}. Every card k between 5 and 6 lies outside the subtree of 5∧6, so k∧5 = k∧6 and p_{k,5} = p_{k,6}; these factors cancel in the ratio π(τ)/π(σ), leaving
  P'(σ,τ)/P'(τ,σ) = p_{6,5}/p_{5,6} = π(τ)/π(σ).
So M' is reversible with respect to π.

Thm 2: Proof sketch
Markov chain M' allows a transposition if it corresponds to an ASEP move at one of the internal vertices. Each ASEP is rapidly mixing, so M' is rapidly mixing. ✓
By comparison, M_nn is also rapidly mixing if {p_{i,j}} is weakly regular, i.e., for all i, p_{i,j} is decreasing in j for j > i.

Comparing M' and M_nn
[figure: a single move of M' between permutations σ and τ]

So M_nn is rapidly mixing when:
- Choose your weapon: p_{i,j} = r_i ≥ 1/2 ∀ i < j
- League hierarchies: p_{i,j} = q_{i∧j} > 1/2 ∀ i < j

Talk Outline
1. Background
2. New Classes of Bias where M_nn is fast
   - Choose your Weapon
   - League Hierarchies
3. M_nn can be slow

But….. M_nn can be slow
Thm 3: There are examples of positively biased {p_{i,j}} for which M_nn is slowly mixing.
1. Reduce to biased lattice paths.
Set p_{i,j} = 1 if i < j ≤ n/2 or n/2 < i < j, so the cards 1,…,n/2 always stay in order among themselves, as do the cards n/2+1,…,n. Treating the small cards as 1's and the large cards as 0's, a permutation σ corresponds to a sequence of n/2 1's and n/2 0's, i.e., a lattice path.

Slow mixing example
Thm 3: There are examples of positively biased {p} for which M_nn is slowly mixing.
1. Reduce to biased lattice paths.
2. Define a bias on individual cells (a non-uniform growth process). [Greenberg, Pascoe, Randall]
Each choice of p_{i,j} with i ≤ n/2 < j determines the bias on the square (i, n-j+1):
  p_{i,j} = 1 if i < j ≤ n/2 or n/2 < i < j,
  p_{i,j} = 1/2 + 1/n² if i + (n-j+1) < M, where M = n^{2/3},
  p_{i,j} = 1 - δ otherwise.
The bias fluctuates between weak (1/2 + 1/n²) and strong (1 - δ) ("fluctuating bias").
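Putting the three cases together, a sketch of the fluctuating-bias table (the thresholds follow the slide; delta and the function name are illustrative):

```python
def fluctuating_bias(i, j, n, delta):
    """p_{i,j} for the slow-mixing example, assuming i < j."""
    M = n ** (2 / 3)
    half = n // 2
    if j <= half or i > half:
        return 1.0                        # within a half: cards stay in order
    if i + (n - j + 1) < M:
        return 0.5 + 1.0 / n ** 2         # weak bias near the corner
    return 1 - delta                      # strong bias elsewhere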

Slow mixing example
Thm 3: There are examples of positively biased {p} for which M_nn is slowly mixing.
1. Reduce to biased lattice paths. ✓
2. Define a bias on individual cells. ✓
3. Show that there is a "bad cut" (S, Ω\S) in the state space.
This implies that M_nn can take exponential time to reach stationarity. Therefore biased card shuffling can be slow too!
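A standard way to certify such a bad cut is via conductance, which the slide's cut argument follows in spirit: an exponentially small conductance over some S with π(S) ≤ 1/2 forces exponential mixing time. A sketch for checking small examples (assuming numpy arrays P, pi and a boolean indicator S of the cut):

```python
import numpy as np

def conductance(P, pi, S):
    """Phi(S) = Q(S, S^c) / pi(S), with Q(x, y) = pi(x) * P(x, y)."""
    S = np.asarray(S, dtype=bool)
    Q = pi[:, None] * P                   # flow matrix Q(x, y)
    return Q[S][:, ~S].sum() / pi[S].sum()
```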

Open Problems
1. Is M_nn always rapidly mixing when the {p_{i,j}} are positively biased and satisfy a regularity condition (i.e., p_{i,j} is decreasing in i and j)?
2. When does bias speed up or slow down a chain?

Thank you!