
1 What if time ran backwards? If X_n, 0 ≤ n ≤ N, is a Markov chain, what about Y_n = X_{N-n}? If X_n follows the stationary distribution π, then Y_n has stationary transition probabilities q_ij = π_j p_ji / π_i.
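
A minimal numerical sketch of this construction (not from the slides; the 3-state matrix P is an arbitrary illustration): compute π, form the reversed-chain matrix q_ij = π_j p_ji / π_i, and check that it is a proper transition matrix.

```python
import numpy as np

# Arbitrary 3-state transition matrix used only for illustration
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

# Stationary distribution pi: left eigenvector of P for eigenvalue 1
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
pi = pi / pi.sum()

# Reversed-chain transition probabilities: q_ij = pi_j * p_ji / pi_i
Q = P.T * pi[None, :] / pi[:, None]

print("rows of Q sum to 1:", np.allclose(Q.sum(axis=1), 1.0))
print("chain reversible (Q == P)?", np.allclose(Q, P))
```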

2 Reversible chains. X is called reversible if its time-reversal Y has the same transition probabilities as X. Example: if P is symmetric, the stationary distribution is uniform, and so the chain is reversible.

3 Full balance. Write the flux out of j as π_j Σ_{i≠j} p_ji and the flux in to j as Σ_{i≠j} π_i p_ij. If we have a large number of particles following the same Markov chain, and the system is in equilibrium, there should be about the same number moving in and out of state j at any one time: Σ_{i≠j} π_i p_ij = π_j Σ_{i≠j} p_ji, or equivalently π_j = Σ_i π_i p_ij. This is the equation of full balance.

4 Detailed balance. Reversible processes satisfy π_i p_ij = π_j p_ji for all i, j in S, or equivalently p_ij / p_ji = π_j / π_i. This is the law of detailed balance.
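
A quick check (illustrative, not part of the slides) that a symmetric P satisfies detailed balance with the uniform distribution, and that detailed balance implies full balance:

```python
import numpy as np

# Symmetric transition matrix (illustrative example)
P = np.array([[0.6, 0.3, 0.1],
              [0.3, 0.4, 0.3],
              [0.1, 0.3, 0.6]])

pi = np.full(3, 1.0 / 3.0)            # uniform distribution

# Detailed balance: pi_i * p_ij = pi_j * p_ji for all i, j
flux = pi[:, None] * P
print("detailed balance holds:", np.allclose(flux, flux.T))

# Detailed balance implies full balance: pi P = pi
print("pi is stationary:", np.allclose(pi @ P, pi))
```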

5 A useful result. If X is an irreducible Markov chain satisfying detailed balance π_i p_ij = π_j p_ji for some π and all i, j in S, then it is reversible and positive persistent with stationary distribution π. Proof: Need only show that π is stationary: Σ_i π_i p_ij = Σ_i π_j p_ji = π_j.

6 A birth and death chain. At each stage the chain can move from i only to i+1 or i-1, with probabilities p_i > 0 and q_i > 0 (q_0 = 0), or stay with probability r_i = 1 - p_i - q_i. Full balance equation: π_k (p_k + q_k) = π_{k-1} p_{k-1} + π_{k+1} q_{k+1}. Assuming detailed balance for j = k-1 (the induction hypothesis, π_{k-1} p_{k-1} = π_k q_k) and substituting, we get π_k p_k = π_{k+1} q_{k+1}, which is detailed balance for j = k.

7 Birth and death chain, cont. Thus we have detailed balance. Now build up the stationary distribution: π_k = π_0 ∏_{i=1}^{k} (p_{i-1} / q_i), with π_0 chosen so that the probabilities sum to 1. Random walk reflected at the origin: p_i = 1 - q_i = p. Let p < 1/2; then π_k = π_0 (p/(1-p))^k, a geometric distribution with π_0 = (1-2p)/(1-p).
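
A small sketch (illustrative, not from the slides) that builds π for the reflected random walk from the detailed-balance recursion on a truncated state space and compares it with the geometric form above:

```python
import numpy as np

p = 0.3                      # upward probability; p < 1/2 so the chain is positive recurrent
K = 50                       # truncate the state space for the numerical check

# Detailed balance recursion: pi_k * p = pi_{k+1} * (1 - p)
pi = np.empty(K)
pi[0] = 1.0
for k in range(1, K):
    pi[k] = pi[k - 1] * p / (1.0 - p)
pi /= pi.sum()               # normalize (truncation error is tiny for K = 50)

# Closed form: geometric with ratio p/(1-p)
ratio = p / (1.0 - p)
pi_exact = (1.0 - ratio) * ratio ** np.arange(K)

print("max difference:", np.abs(pi - pi_exact).max())
```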

8 Ehrenfest diffusion. Two containers, labeled 0 and 1. N molecules. One molecule is chosen at random and moved to the other container. Describe the system using an N-digit binary number, giving 2^N possible states. Micro-level process X: p_{x,y} = 1/N if x and y differ in exactly one digit, 0 otherwise.

9 Ehrenfest, cont. π = ? Period? Detailed balance? Microreversibility: p_{x,y} = p_{y,x} for all x, y.

10 Ehrenfest, cont. If molecules are indistinguishable we get a macro-level description of the process: Y_k = number of molecules in container 0. This is a birth and death chain with p_i = (N-i)/N and q_i = i/N. Stationary distribution: π_0 = ?
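
A sketch (not from the slides) applying the birth-and-death recursion of slide 7 to the macro-level Ehrenfest rates above; it recovers the Binomial(N, 1/2) stationary distribution and hence π_0 = 2^{-N}:

```python
from math import comb
import numpy as np

N = 20                                   # number of molecules
i = np.arange(N + 1)
p = (N - i) / N                          # up rates: a molecule moves into container 0
q = i / N                                # down rates: a molecule leaves container 0

# Detailed balance recursion: pi_{k+1} = pi_k * p_k / q_{k+1}
pi = np.empty(N + 1)
pi[0] = 1.0
for k in range(N):
    pi[k + 1] = pi[k] * p[k] / q[k + 1]
pi /= pi.sum()

binomial = np.array([comb(N, k) for k in range(N + 1)]) / 2.0 ** N
print("matches Binomial(N, 1/2):", np.allclose(pi, binomial))
print("pi_0 =", pi[0], "= 2^-N =", 2.0 ** -N)
```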

11 Loschmidt's paradox. How can thermodynamics (with entropy increasing with time) be deduced from elementary physics (which is time-reversible)? To be reversible we need both P(X_k = x | X_0 = x') = P(X_k = x' | X_0 = x) and, for y = Σ_i x_i, P(Y_k = y | Y_0 = y') = P(Y_k = y' | Y_0 = y). Let y be small, and y' about N/2. Then the second equation will not hold. Which side is larger?

12 The ergodic theorem for Markov chains. Let X be positive persistent with stationary distribution π. Then if f satisfies Σ_i |f(i)| π_i < ∞, we have (1/n) Σ_{k=1}^{n} f(X_k) → Σ_i f(i) π_i almost surely. This is the law of large numbers for Markov chains (and is essentially proved by the usual law of large numbers).
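
A minimal illustration (the chain and f are arbitrary examples, not from the slides): average f along one long simulated path and compare with the stationary expectation Σ_i f(i) π_i.

```python
import numpy as np

rng = np.random.default_rng(0)

# Small illustrative chain and an arbitrary function f on its state space
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])
f = np.array([1.0, -2.0, 5.0])

# Stationary distribution for the exact answer
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
pi /= pi.sum()

# Simulate one long path and average f along it
n, x, total = 200_000, 0, 0.0
for _ in range(n):
    x = rng.choice(3, p=P[x])
    total += f[x]

print("path average  :", total / n)
print("sum_i f(i)pi_i:", f @ pi)
```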

13 Markov chain Monte Carlo integration. The ergodic theorem suggests that one can compute this type of expectation/integral by generating a Markov chain with the right stationary distribution, and then just averaging function values. Example: likelihood L(θ) = h(x;θ)/c(θ), where c(θ) = Σ_x h(x;θ). Often this normalizing constant is too complicated to calculate exactly.

14 Likelihood, cont. Let f(x) = g(x)/c be a fixed pmf such that h(x;θ) > 0 implies f(x) > 0. The mle of θ maximizes L(θ) ∝ [h(x;θ)/g(x)] / [c(θ)/c]. For any θ we can compute h(x;θ)/g(x) but not c(θ)/c. Write c(θ)/c = Σ_y h(y;θ)/c = Σ_y [h(y;θ)/g(y)] f(y) = E_f[h(Y;θ)/g(Y)]. If we can draw samples from f we can estimate this expectation. But f may be a difficult multivariate distribution to sample from.
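
A sketch of this estimator under made-up assumptions (not from the slides): take h(x;θ) = θ^x/x!, so c(θ) = e^θ, and let f be a Poisson(1.5) reference pmf that we can sample directly; then c(θ)/c is estimated by averaging h(Y;θ)/g(Y) = (θ/1.5)^Y.

```python
import numpy as np
from math import exp

rng = np.random.default_rng(1)

# Hypothetical unnormalized family: h(x; theta) = theta^x / x!, so c(theta) = e^theta.
# Reference pmf f = Poisson(ref), i.e. g(x) = ref^x / x! and c = e^ref.
theta, ref = 2.3, 1.5

y = rng.poisson(ref, size=100_000)        # samples from f (here f is easy to sample directly)

# c(theta)/c = E_f[ h(Y; theta) / g(Y) ] = E_f[ (theta/ref)^Y ]
estimate = np.mean((theta / ref) ** y)

print("MC estimate of c(theta)/c :", estimate)
print("exact value exp(theta-ref):", exp(theta - ref))
```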

15 The MCMC approach. Instead of drawing samples from f, we draw samples from a Markov chain which has f as its stationary distribution.
- Burn-in (to get to the stationary distribution)
- Sampling
- Exact simulation by coupling backwards in time
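
One standard way to build such a chain is a Metropolis sampler. The sketch below (the target g and the proposal are illustrative assumptions, not from the slides) needs only an unnormalized pmf g ∝ f, discards a burn-in, and then records samples.

```python
import numpy as np
from math import comb

rng = np.random.default_rng(2)

M = 10
g = np.array([comb(M, x) for x in range(M + 1)], dtype=float)   # unnormalized target, f = g / g.sum()

def metropolis(n_steps, burn_in=1_000):
    x, counts = 0, np.zeros(M + 1)
    for step in range(n_steps):
        y = x + rng.choice([-1, 1])                  # symmetric random-walk proposal
        if 0 <= y <= M and rng.random() < min(1.0, g[y] / g[x]):
            x = y                                    # accept; otherwise stay at x
        if step >= burn_in:
            counts[x] += 1                           # record samples after burn-in
    return counts / counts.sum()

empirical = metropolis(200_000)
print("empirical:", np.round(empirical, 3))
print("target f :", np.round(g / g.sum(), 3))
```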

