Random Walks on Graphs


Random Walks

Random Walks on Graphs

At any node, go to one of the neighbors of the node with equal probability.
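This rule is easy to simulate. A minimal sketch in Python (the 4-cycle adjacency list is a made-up example graph, not one from the slides):

```python
import random

def random_walk(graph, start, steps):
    """Simple random walk: at each node, move to a uniformly chosen neighbor."""
    node = start
    path = [node]
    for _ in range(steps):
        # every neighbor is equally likely
        node = random.choice(graph[node])
        path.append(node)
    return path

# Hypothetical example: a 4-cycle, given as an adjacency list.
cycle = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
walk = random_walk(cycle, start=0, steps=10)
```

Each call produces a different path; only the distribution over paths is fixed.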

Let’s start with a natural question on general graphs

Getting back home. Lost in a city, you want to get back to your hotel. How should you do this? Depth First Search: requires a good memory and a piece of chalk.

Getting back home. Lost in a city, you want to get back to your hotel. How should you do this? How about walking randomly? No memory, no chalk, just coins…

Will this work? Is Pr[ reach home ] = 1? When will I get home? What is E[ time to reach home ]? I have a curfew of 10 PM!

Relax, Dude! Yes, Pr[ will reach home ] = 1

Furthermore: if the graph has n nodes and m edges, then E[ time to visit all nodes ] ≤ 2m × (n − 1), and E[ time to reach home ] is at most this.

Cover times. Let us define a couple of useful things. Cover time from u: C_u = E[ time to visit all vertices | start at u ]. Cover time of the graph: C(G) = max_u { C_u }.

Cover Time Theorem. If the graph G has n nodes and m edges, then the cover time of G is C(G) ≤ 2m(n − 1). Any graph on n vertices has fewer than n²/2 edges, hence C(G) < n³ for all graphs G.
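The bound is easy to sanity-check empirically. A sketch (the path graph and trial count are my choices, not from the slides):

```python
import random

def cover_time_estimate(graph, start, trials=2000):
    """Average, over many independent walks, the number of steps
    until every vertex has been visited."""
    total = 0
    for _ in range(trials):
        node, seen, steps = start, {start}, 0
        while len(seen) < len(graph):
            node = random.choice(graph[node])
            seen.add(node)
            steps += 1
        total += steps
    return total / trials

# Hypothetical example: the path 0-1-2-3, so n = 4 and m = 3.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
est = cover_time_estimate(path, start=0)
# Theorem's bound: C(G) <= 2m(n-1) = 24; the walk covers far faster.
```

Starting from an endpoint of a path, covering means hitting the far end, whose expected hitting time is (n − 1)² = 9 here, comfortably under the bound of 24.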

Let’s prove that Pr[ eventually get home ] = 1

We will eventually get home. Look at the first n steps. There is a non-zero chance p_1 that we get home. Suppose we fail. Then, wherever we are, there is a chance p_2 > 0 that we hit home in the next n steps from there. Probability of failing to reach home by time kn = (1 − p_1)(1 − p_2) ⋯ (1 − p_k) → 0 as k → ∞.

In fact, Pr[ we don't get home by 2k C(G) steps ] ≤ (½)^k. Recall: C(G) = cover time of G ≤ 2m(n − 1).

An averaging argument. Suppose I start at u. Then E[ time to hit all vertices | start at u ] ≤ C(G). Hence Pr[ time to hit all vertices > 2C(G) | start at u ] ≤ ½. Why? Use Markov's inequality: for a nonnegative random variable X, Pr[ X > 2E[X] ] ≤ ½.

So let's walk some more! Pr[ time to hit all vertices > 2C(G) | start at u ] ≤ ½. Suppose at time 2C(G) I am at some node v, with more nodes still to visit. Pr[ haven't hit all vertices in 2C(G) more time | start at v ] ≤ ½. Chance that you failed both times ≤ ¼!

The power of independence. It is like flipping a coin with tails probability q ≤ ½. The probability that you get k tails is q^k ≤ (½)^k (because the trials are independent!). Hence Pr[ haven't hit everyone in time k × 2C(G) ] ≤ (½)^k. Exponential in k!

We've proved that if CoverTime(G) ≤ 2m(n − 1), then Pr[ home by time 4km(n − 1) ] ≥ 1 − (½)^k.

Cover Time Theorem. If the graph G has n nodes and m edges, then the cover time of G is C(G) ≤ 2m(n − 1).

Electrical Networks again. Let H_uv = E[ time to reach v | start at u ].

Theorem: if each edge is a unit resistor, then H_uv + H_vu = 2m × Resistance_uv.

If u and v are neighbors, then Resistance_uv ≤ 1, so H_uv + H_vu ≤ 2m.

We will use this to prove the Cover Time theorem: C_u ≤ 2m(n − 1) for all u.
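The neighbor case can be checked by simulation; a sketch (the 3-vertex path is a made-up example, not a graph from the slides):

```python
import random

def hitting_time_estimate(graph, u, v, trials=5000):
    """Empirical estimate of H_uv = E[ time to reach v | start at u ]."""
    total = 0
    for _ in range(trials):
        node, steps = u, 0
        while node != v:
            node = random.choice(graph[node])
            steps += 1
        total += steps
    return total / trials

# Hypothetical example: the path 0-1-2, so m = 2 edges.
# Vertices 0 and 1 are neighbors, and Resistance_01 = 1 (the direct
# edge is the only path between them), so the theorem predicts
# H_01 + H_10 = 2m * 1 = 4.
path = {0: [1], 1: [0, 2], 2: [1]}
commute = hitting_time_estimate(path, 0, 1) + hitting_time_estimate(path, 1, 0)
```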

Suppose G is an example graph on vertices 1, …, 6 (the figure from the slides is not reproduced in this transcript).

Pick a spanning tree of G. Say 1 was the start vertex. Then
C_1 ≤ H_12 + H_21 + H_13 + H_35 + H_56 + H_65 + H_53 + H_34 + H_43 + H_31
    = (H_12 + H_21) + (H_35 + H_53) + (H_56 + H_65) + (H_34 + H_43) + (H_31 + H_13).
Each H_uv + H_vu ≤ 2m, and there are (n − 1) edges in the tree, so C_u ≤ (n − 1) × 2m.

A randomized algorithm for 2SAT. Example instance:
Q = (X_1 ∨ ¬X_2) ∧ (¬X_3 ∨ X_2) ∧ (¬X_1 ∨ X_4) ∧ (X_3 ∨ X_4)

Start with X_i = True for all i.
While Q contains an unsatisfied clause and #steps < 2n²:
  pick an unsatisfied clause C arbitrarily
  pick one of the two literals in C u.a.r. and flip its truth value
  if Q is now satisfied, then output yes.
Output no.
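The loop above can be sketched directly in Python. This is my illustrative implementation, with clauses encoded as pairs of signed integers: literal k means X_k and −k means ¬X_k.

```python
import random

def randomized_2sat(clauses, n, max_steps=None):
    """Random-walk 2SAT, following the slide's loop: repeatedly pick an
    unsatisfied clause and flip one of its two literals at random.
    Returns a satisfying assignment (dict) or None after 2n^2 steps."""
    if max_steps is None:
        max_steps = 2 * n * n
    assign = {i: True for i in range(1, n + 1)}  # start with all True

    def value(lit):
        v = assign[abs(lit)]
        return v if lit > 0 else not v

    for _ in range(max_steps):
        unsat = [c for c in clauses if not (value(c[0]) or value(c[1]))]
        if not unsat:
            return assign                 # Q is satisfied: output yes
        a, b = unsat[0]                   # an unsatisfied clause, arbitrarily
        lit = random.choice((a, b))       # one of its two literals, u.a.r.
        assign[abs(lit)] = not assign[abs(lit)]  # flip its truth value
    if all(value(a) or value(b) for a, b in clauses):
        return assign
    return None                           # output no
```

On the Q from the slide, `clauses = [(1, -2), (-3, 2), (-1, 4), (3, 4)]`; the all-True start already satisfies it, so the walk halts immediately. When Q is unsatisfiable, the loop runs out of steps and the algorithm answers no.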

Analysis

1. A*: any satisfying assignment of Q.
2. f(A): the number of variables of Q whose value in A is the same as in A*.
3. f(A) takes values in {0, 1, …, n}.
4. If f(A) = n, then A must be A*.
5. At each step, f(A) increases or decreases by 1.
6. At each step, Pr[ f(A) increases by 1 ] ≥ 1/2. (In an unsatisfied clause, at least one of the two literals must disagree with A*, so a uniform flip agrees with A* with probability at least 1/2.)

So f(A) performs a random walk on the line 0, 1, …, n, biased toward n, and the algorithm succeeds once the walk hits n.
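The slides stop here, so the following closing step is my reconstruction of the standard argument: in the worst case f(A) behaves like an unbiased walk, and the expected time T for such a walk on {0, 1, …, n} to reach n is at most n².

```latex
% h(i) = expected steps for the unbiased walk to first hit n, from i.
% Boundary: h(n) = 0; reflection at 0 gives h(0) = 1 + h(1).
% Recurrence: h(i) = 1 + \tfrac12 h(i-1) + \tfrac12 h(i+1),
% whose solution is h(i) = n^2 - i^2 \le n^2. By Markov's inequality,
\[
  \Pr\bigl[\,T > 2n^2\,\bigr]
  \;\le\; \frac{\mathbb{E}[T]}{2n^2}
  \;\le\; \frac{n^2}{2n^2}
  \;=\; \frac12 ,
\]
% so one run of 2n^2 steps finds a satisfying assignment
% (when one exists) with probability at least 1/2.
```

Repeating the 2n²-step run k times drives the failure probability down to (½)^k, just as in the cover-time argument earlier.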