1 Random Walks Ben Hescott CS591a1 November 18, 2002

2 Random Walk Definition Given an undirected, connected graph G(V,E) with |V| = n, |E| = m, a random "step" in G is a move from some node u to a uniformly random neighbor v. A random walk is a sequence of these random steps starting from some initial node.
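Below is a minimal sketch of this definition in Python (my own illustration, not part of the slides): the graph is stored as an adjacency list and each step picks a uniformly random neighbor.

```python
import random

def random_walk(adj, start, steps):
    """Return the sequence of nodes visited by a random walk of `steps` steps."""
    path = [start]
    u = start
    for _ in range(steps):
        u = random.choice(adj[u])   # move to a uniformly random neighbor
        path.append(u)
    return path

# Example: a 4-cycle 0-1-2-3-0
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
print(random_walk(adj, start=0, steps=10))
```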

3 Points to note The process is discrete. G is not necessarily planar. G is not necessarily complete (fully connected). A walk can double back on itself and revisit nodes. Staying in the same place can also be counted as a move (a self-loop).

4 Questions How many steps to get from u to v? How many steps to get back to the initial node? How many steps to visit every node? These are easy questions to answer if we consider a simple example.

5 Regular Graphs Take the complete graph K_n as the simplest regular example. The expected number of steps to get from vertex u to v is n-1. The expected number of steps to get back to the starting point is n (this holds for any regular graph, since the stationary distribution is uniform). The expected number of steps to visit every node is O(n lg n), by the coupon-collector argument on slide 20.
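A quick simulation can sanity-check the return-time claim; this sketch (function names are mine) estimates the expected return time on an n-cycle, which is 2-regular, so the estimate should come out close to n.

```python
import random

def mean_return_time(adj, v, trials=20000):
    """Average number of steps a walk started at v takes to come back to v."""
    total = 0
    for _ in range(trials):
        u, steps = v, 0
        while True:
            u = random.choice(adj[u])
            steps += 1
            if u == v:
                break
        total += steps
    return total / trials

n = 6
cycle = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
print(mean_return_time(cycle, 0))   # should be close to n = 6
```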

6 Triangle Example Consider the probability of being at each particular vertex at each step of the walk. These probabilities form a vector at every step; starting at vertex a of the triangle the vectors are (1, 0, 0), then (0, 1/2, 1/2), then (1/2, 1/4, 1/4), and so on.

7 Transition Matrix We can use a matrix to represent the transition probabilities: consider the adjacency matrix A and the diagonal matrix D with entries 1/d(i), where d(i) is the degree of node i. Then we can define the matrix M = DA. For the triangle d(i) = 2, so M has 0 on the diagonal and 1/2 in every off-diagonal entry. Note that for the triangle Pr[a to b] = Pr[b to a].
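As a sketch of this construction with numpy (variable names are mine, not the slides'), here is M = DA for the triangle; the rows sum to 1, as they must for a transition matrix.

```python
import numpy as np

A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)
degrees = A.sum(axis=1)            # each vertex of the triangle has degree 2
D = np.diag(1.0 / degrees)
M = D @ A                          # M[i, j] = Pr[step from i to j]
print(M)                           # every off-diagonal entry is 1/2
print(M.sum(axis=1))               # rows sum to 1
```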

8 Markov Chains - Generalized Random Walks A Markov Chain is a stochastic process defined on a set of states with a matrix P of transition probabilities. The process is discrete: it is in exactly one state at each time step (0, 1, 2, …). The next move does not depend on previous moves; formally, Pr[X_{t+1} = j | X_0 = i_0, …, X_{t-1} = i_{t-1}, X_t = i] = Pr[X_{t+1} = j | X_t = i] = P_ij.

9 Markov Chain Definitions Define the row vector q_t whose i-th entry is the probability that the chain is in state i at time t. Note: q_{t+1} = q_t P, and hence q_t = q_0 P^t. Notice that we can then calculate everything given q_0 and P.
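A small illustration of q_t = q_0 P^t on the triangle chain (again my own code, not the slides'): the distribution converges to the uniform vector (1/3, 1/3, 1/3).

```python
import numpy as np

A = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)
P = np.diag(1.0 / A.sum(axis=1)) @ A   # triangle transition matrix

q = np.array([1.0, 0.0, 0.0])          # start at vertex a with probability 1
for t in range(1, 11):
    q = q @ P                          # one step of the chain
    print(t, q)                        # approaches (1/3, 1/3, 1/3)
```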

10 More Definitions Consider the question: where am I after t steps, and is this my first visit to node j? Define the t-step probability r_ij^t, the probability that, starting at state i, the first visit to state j occurs at exactly time t. The probability of visiting state j at some time t > 0 when started at state i is f_ij = sum over t > 0 of r_ij^t. The expected number of steps to get from state i to j is h_ij = sum over t > 0 of t·r_ij^t, given f_ij = 1, and h_ij = ∞ otherwise.

11 Even More Definitions Consider f_ii. State i is called transient if f_ii < 1. State i is called persistent if f_ii = 1. If state i is persistent and h_ii is infinite then i is null-persistent. If state i is persistent and h_ii is finite then i is non-null-persistent. It turns out that in a finite Markov Chain every state is either transient or non-null-persistent.

12 Almost there A strong component of a directed graph G is a subgraph C of G in which for every pair of nodes i, j there is a directed path from i to j and from j to i. A Markov Chain is irreducible if its underlying graph consists of a single strong component. A stationary distribution for a Markov Chain with transition matrix P is a distribution π s.t. πP = π. The periodicity of state i is the maximum integer T for which there exist a q_0 and an a > 0 s.t., for all t, if q_t(i) > 0 then t lies in the arithmetic progression {a + kT}. A state is periodic if T > 1 and aperiodic otherwise. An ergodic Markov Chain is one where all states are aperiodic and non-null-persistent.

13 Fundamental Theorem of Markov Chains Given any finite, irreducible, aperiodic Markov Chain, all of the following hold –The chain is ergodic –There is a unique stationary distribution π, with π_i > 0 for every state i –h_ii = 1/π_i for every state i –Given N(i,t), the number of times the chain visits state i in t steps, lim_{t→∞} N(i,t)/t = π_i
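As a rough numerical illustration of the last two bullets (the chain below and all names are my own, chosen only for the demo): the fraction of time spent in state i approaches π_i, so the mean return time to i approaches 1/π_i.

```python
import numpy as np

# A small ergodic chain; its stationary distribution is (1/4, 1/2, 1/4).
P = np.array([[0.5,  0.5,  0.0],
              [0.25, 0.5,  0.25],
              [0.0,  0.5,  0.5]])
rng = np.random.default_rng(0)

state, visits, T = 0, np.zeros(3), 200000
for _ in range(T):
    visits[state] += 1
    state = rng.choice(3, p=P[state])   # one random transition

print(visits / T)   # empirical N(i,t)/t, close to (0.25, 0.5, 0.25)
# So the mean return times 1/pi_i should be about (4, 2, 4).
```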

14 Random Walk is a Markov Chain Consider G a connected, non-bipartite, undirected graph with |V| = n, |E| = m. There is a corresponding Markov Chain M_G: take the states of the chain to be the set of vertices, and define the transition matrix by P_uv = 1/d(u) if (u,v) is an edge of G, and P_uv = 0 otherwise.

15 Interesting facts M_G is irreducible since G is connected and undirected. Notice the periodicity is the gcd of the lengths of all closed walks in G –The smallest closed walk has length 2: go one step and come back –Since G is non-bipartite there is an odd-length cycle –The gcd is therefore 1, so M_G is aperiodic

16 Fundamental Theorem Holds So we have a stationary distribution π with πP = π. But what is it? Good news: it has a simple closed form, π_v = d(v)/2m for every vertex v. We also get h_vv = 1/π_v = 2m/d(v).
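Here is a quick numpy check (the graph and names are my own) that π_v = d(v)/2m is indeed stationary for the walk's transition matrix.

```python
import numpy as np

# A small non-bipartite graph: a triangle 0-1-2 with a pendant vertex 3 attached to 0.
A = np.array([[0, 1, 1, 1],
              [1, 0, 1, 0],
              [1, 1, 0, 0],
              [1, 0, 0, 0]], dtype=float)
d = A.sum(axis=1)
P = np.diag(1.0 / d) @ A
pi = d / d.sum()                   # d(v) / 2m
print(pi)
print(pi @ P)                      # equals pi (up to floating point)
```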

17 Hitting Time Generally h_ij, the expected number of steps needed to reach node j when starting from node i, is called the hitting time. The commute time C_ij is the expected number of steps to reach node j when starting from i and then return back to i. For adjacent nodes i and j the commute time is bounded by 2m. Hitting time and commute time are related by C_ij = h_ij + h_ji.
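This sketch (my own, using a 4-node path as the example) estimates hitting times by simulation and adds them to get the commute time; on this path the two endpoint-to-endpoint hitting times are both 9, so the commute time is about 18.

```python
import random

def hitting_time(adj, i, j, trials=5000):
    """Estimate h_ij: expected number of steps to reach j starting from i."""
    total = 0
    for _ in range(trials):
        u, steps = i, 0
        while u != j:
            u = random.choice(adj[u])
            steps += 1
        total += steps
    return total / trials

adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}   # path graph 0-1-2-3
h03 = hitting_time(adj, 0, 3)
h30 = hitting_time(adj, 3, 0)
print(h03, h30, h03 + h30)   # commute time C_03 = h_03 + h_30, about 18
```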

18 Lollipop The hitting time from i to j is not necessarily the same as the hitting time from j to i. Consider the kite or lollipop graph: a clique on n/2 nodes with a path of n/2 nodes attached. Here the expected time from the end of the path to the clique is Θ(n^2), while the expected time from the clique to the end of the path is Θ(n^3).

19 Cover Time How long until we visit every node? The expected number of steps to reach every node of a graph G starting from node i is called the cover time, C_i(G). Consider the maximum cover time over all starting nodes. [Matthews] The maximum cover time of any graph with n nodes is at most (1 + 1/2 + … + 1/n) times the maximum hitting time between any two nodes; a spanning-tree argument also bounds the cover time by 2m(n-1).

20 Coupon Collector You want to collect n different coupons, and every day Stop and Shop sends you one coupon chosen at random - how long do you have to wait before you can buy food? This is essentially the cover time on a complete graph, and here the cover time is O(n lg n).
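An illustrative coupon-collector simulation (all names are mine): the expected number of draws is about n·H_n = Θ(n lg n), which matches the complete-graph cover time.

```python
import random

def coupons_needed(n, trials=2000):
    """Average number of uniform draws until all n coupon types are seen."""
    total = 0
    for _ in range(trials):
        seen, draws = set(), 0
        while len(seen) < n:
            seen.add(random.randrange(n))
            draws += 1
        total += draws
    return total / trials

n = 50
harmonic = sum(1.0 / k for k in range(1, n + 1))
print(coupons_needed(n), n * harmonic)   # the two numbers should be close
```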

21 Mixing Rate Going back to probability, we could ask how quickly we converge to the stationary (limiting) distribution. We call this rate the mixing rate of the random walk. We saw that q_t = q_0 M^t converges to π; how fast does it converge?

22 Mixing with the Eigenvalues How do we calculate the mixing rate - yes, eigenvalues! Since we are representing the probability transitions as a matrix, why not use spectral techniques? We need the graph G to be non-bipartite, and we first need the transition matrix M to be symmetric, which is not true unless G is regular.

23 More decomposition We need to make M symmetric. Recall that M = DA, where D is the diagonal matrix with entries 1/d(u), d(u) the degree of node u. Consider N = D^{1/2} A D^{1/2}, i.e. the matrix with entries N_uv = A_uv / sqrt(d(u)d(v)). The claim is that this lets us work with spectral techniques: N is symmetric, and N = D^{-1/2} M D^{1/2} is similar to M, so it has the same eigenvalues. Now the mixing rate is governed by λ, the second largest eigenvalue of N in absolute value: the walk converges to π roughly like λ^t.
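A small spectral sketch (the notation and the example graph are mine): N with entries A_uv/sqrt(d(u)d(v)) is symmetric, its largest eigenvalue is 1, and the second largest absolute eigenvalue gives the mixing rate.

```python
import numpy as np

# Triangle with a pendant vertex attached: connected and non-bipartite.
A = np.array([[0, 1, 1, 1],
              [1, 0, 1, 0],
              [1, 1, 0, 0],
              [1, 0, 0, 0]], dtype=float)
d = A.sum(axis=1)
Dinv_sqrt = np.diag(1.0 / np.sqrt(d))
N = Dinv_sqrt @ A @ Dinv_sqrt           # N[u, v] = A[u, v] / sqrt(d(u) d(v))
print(np.allclose(N, N.T))              # True: N is symmetric
lam = np.sort(np.abs(np.linalg.eigvalsh(N)))[::-1]
print(lam)                              # lam[0] == 1; lam[1] is the mixing rate
```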

24 Graph Connectivity Recall the graph connectivity problem: given an undirected graph G(V,E), we want to find out whether node s is connected to node t. We can do this in deterministic polynomial time, but what about space - we would like to solve it using a small amount of space. Recall that the maximum hitting time of a graph is at most about n^3. So try a walk of 2n^3 steps from s: if s and t are connected, the walk reaches t with probability at least 1/2 (by Markov's inequality), and we need only O(lg n) space to count the steps and store the current node.
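A minimal sketch of this randomized connectivity test (my own code, not the lecture's): walk 2n^3 steps from s and report whether t was ever reached; only the current node and a step counter need to be stored.

```python
import random

def probably_connected(adj, s, t):
    """One-sided-error test: 'True' is always correct; 'False' may rarely be wrong."""
    n = len(adj)
    u = s
    for _ in range(2 * n ** 3):
        if u == t:
            return True
        u = random.choice(adj[u])
    return u == t

adj = {0: [1], 1: [0, 2], 2: [1], 3: [4], 4: [3]}   # two components
print(probably_connected(adj, 0, 2))   # True
print(probably_connected(adj, 0, 3))   # False: s and t are disconnected
```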

25 Sampling So what's the fuss? We can use random walks to sample: since we have the powerful notion of a stationary distribution, and on a regular graph this is the uniform distribution, we can get at random elements. More importantly, we can get a random sample from an exponentially large set.

26 Covering Problems Jerrum, Valiant, and Vazirani - the Babai product estimator, or enumeration via self-reducibility. Given a set V and subsets V_1, V_2, V_3, …, V_m such that –For all i, |V_i| is polynomial-time computable –We can sample from each V_i uniformly at random –For all v, we can determine efficiently whether v is in V_i Can we compute the size of the union of the subsets, or enumerate V, in polynomial time?

27 Permanent We want to count the number of perfect matchings in a bipartite graph. This is the permanent of a matrix: given an n x n matrix A, the permanent is per(A) = sum over all permutations σ of {1,…,n} of the product A_{1,σ(1)} A_{2,σ(2)} ··· A_{n,σ(n)}. Computing it exactly is #P-complete.
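For intuition only, here is a brute-force permanent over all permutations (my own sketch; it runs in exponential time, which is exactly why approximation is interesting). For the biadjacency matrix of a bipartite graph, per(A) counts the perfect matchings.

```python
import itertools
import numpy as np

def permanent(A):
    """Brute-force permanent: sum over permutations of products of entries."""
    n = A.shape[0]
    return sum(np.prod([A[i, sigma[i]] for i in range(n)])
               for sigma in itertools.permutations(range(n)))

# Biadjacency matrix of K_{3,3}: every permutation is a perfect matching.
A = np.ones((3, 3))
print(permanent(A))    # 3! = 6 perfect matchings
```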

28 Commercial Break #P is the coolest class, defined as a counting class: count the number of solutions to a problem. It is hard - [Toda] PH is contained in P^#P. Want to show hyp-PH is also contained in #P.

29 How to use a random walk for permanent approximation Given a graph G with d(u) > n/2, we want to generate a random perfect matching. First notice that the input graph is bipartite. Instead of walking on G itself, consider a graph of perfect matchings: let each node be a perfect matching in G - the problem is how to connect them. We need to also consider near-perfect matchings, matchings with n/2 - 1 edges.

30 Permanent Cont. Connect two near-perfect matchings with an edge if they have n/2 - 2 edges in common, and connect each perfect matching to all of the near-perfect matchings contained in it, to create a graph H. Notice the degree of H is bounded by 3n. Walk randomly for a polynomial number of steps in H - if the final node is a perfect matching, good - otherwise try again.

31 Volume We want to be able to calculate the volume of a convex body in n dimensions. Computing the volume of a convex polytope exactly is #P-hard. No fear, randomization is here: we want to fit this problem into the enumeration framework above.

32 Volume Cont. Given a convex body C in n dimensions, assume that C contains the origin. Further assume that C contains the unit ball and is itself contained in a ball of radius r < n^{3/2}. Define C_i as the intersection of C with the ball around the origin of radius 2^{i/n}.

33 Volume Cont. Then we get the telescoping product Vol(C) = Vol(C_0) · Π_i Vol(C_i)/Vol(C_{i-1}), and we know Vol(C_0), the volume of the unit ball. Now we only need to be able to get an element of each C_i uniformly at random - use a walk. It is difficult to walk on the body directly, so create a grid and walk on the grid points inside the C_i's. The stationary distribution is then not uniform but a distribution with density function proportional to the local conductance.

34 Partial Reference List L. Lovász. Random Walks on Graphs: A Survey. Combinatorics, Paul Erdős is Eighty, Volume 2, pp. 1-46. R. Motwani and P. Raghavan. Randomized Algorithms. Cambridge University Press, 1995.

