
1 IEG5300 Tutorial 5: Continuous-time Markov Chain. Peter Chen Peng. Adapted from Qiwen Wang's Tutorial Materials.

2 Outline
- Continuous-time Markov chain
- Chapman-Kolmogorov equation
- Kolmogorov's Backward & Forward equations
- Limiting probability P_j
- Time-reversible Markov process
- Summary

3 Continuous-time Markov chain ― Properties
- X(t) = i means that the process is in state i at time t.
- {X(t)} always corresponds to an embedded discrete-time Markov chain, with the constraint P_ii = 0 for all i.
- When the process is in state i at time t, the remaining time until it makes a transition to another state (≠ i) is an exponential r.v. with rate v_i; by memorylessness, this remaining time does not depend on t, i.e. on how long the process has already been in state i.
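
A minimal Python sketch of this dynamic (the state space, the rates v_i, and the embedded transition matrix P below are made-up illustrative values, not taken from the tutorial): hold an Exp(v_i) time in state i, then jump according to P_ij.

    import numpy as np

    def simulate_ctmc(v, P, start, t_end, seed=0):
        """Simulate a continuous-time Markov chain.
        v[i]    : transition rate out of state i
        P[i][j] : embedded (jump) chain probabilities, with P[i][i] = 0
        Returns the (time, state) pairs visited up to time t_end."""
        rng = np.random.default_rng(seed)
        t, state = 0.0, start
        path = [(t, state)]
        while True:
            t += rng.exponential(1.0 / v[state])    # holding time ~ Exp(v_i)
            if t > t_end:
                break
            state = rng.choice(len(v), p=P[state])  # jump via the embedded chain
            path.append((t, state))
        return path

    # Illustrative 3-state example (hypothetical rates).
    v = np.array([1.0, 2.0, 0.5])
    P = np.array([[0.0, 0.7, 0.3],
                  [0.5, 0.0, 0.5],
                  [0.9, 0.1, 0.0]])
    print(simulate_ctmc(v, P, start=0, t_end=5.0))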

4 Continuous-time Markov chain ― Notations
- P{X(t+s) = j | X(s) = i, X(u) = x(u) for all u < s} = P{X(t+s) = j | X(s) = i} = P_ij(t) // stationary transition probability.
- P_ii = 0 for all i, but P_ii(t) ≠ 0 for t ≥ 0.
- v_i = transition rate in state i; q_ij = v_i·P_ij = instantaneous transition rate from state i to state j // so v_i = Σ_j q_ij and P_ij = q_ij / v_i.
- T_i = waiting time for a transition in state i, exponential with rate v_i.

5 Continuous-time Markov chain ― Example
Consider two machines that are maintained by a single repairman. Machine i functions for an exponential time with rate μ_i before breaking down, i = 1, 2. The repair times for either machine are exponential with rate μ. Analyze this as a Markov process.
Define 5 states:
- 0 ― both machines up
- 1 ― 1st down, 2nd up
- 2 ― 1st up, 2nd down
- 3 ― both machines down, 1st under repair
- 4 ― both machines down, 2nd under repair
Draw a transition diagram. The transition rates are v_0 = μ_1 + μ_2, v_1 = μ + μ_2, v_2 = μ_1 + μ, v_3 = v_4 = μ; a sketch of the full set of rates follows below.
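
A short Python sketch of the individual rates q_ij for this 5-state model, under the slide's assumptions (the numeric rates passed in at the end are arbitrary illustrative values):

    import numpy as np

    def repairman_rates(mu1, mu2, mu):
        """Instantaneous transition rates q[i][j] for the two-machine,
        one-repairman example, with states 0..4 as defined above."""
        q = np.zeros((5, 5))
        q[0, 1] = mu1   # machine 1 fails
        q[0, 2] = mu2   # machine 2 fails
        q[1, 0] = mu    # machine 1 repaired
        q[1, 3] = mu2   # machine 2 also fails (1st stays under repair)
        q[2, 0] = mu    # machine 2 repaired
        q[2, 4] = mu1   # machine 1 also fails (2nd stays under repair)
        q[3, 2] = mu    # 1st repaired, 2nd still down
        q[4, 1] = mu    # 2nd repaired, 1st still down
        return q

    q = repairman_rates(mu1=1.0, mu2=2.0, mu=3.0)   # illustrative rates
    print(q.sum(axis=1))   # v_i = sum_j q_ij -> [3. 5. 4. 3. 3.]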

6 Birth-death Process
A special case of Markov process in which q_01 = λ_0 and, for all other states i, q_{i,i+1} = λ_i, q_{i,i−1} = μ_i, and every other q_ij = 0. Correspondingly, P_01 = 1.
The Poisson process is a special case of the birth-death process, with constant birth rate λ and death rate 0.
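
As a small illustration (a hypothetical finite state space 0..N, not from the slides), the rate matrix of a birth-death process has non-zero entries only just above and just below the diagonal:

    import numpy as np

    def birth_death_rates(birth, death):
        """q[i][j] for a birth-death process on states 0..N.
        birth[i] = lambda_i for i = 0..N-1;  death[i] = mu_{i+1} for i = 0..N-1."""
        n = len(birth) + 1
        q = np.zeros((n, n))
        for i in range(n - 1):
            q[i, i + 1] = birth[i]   # q_{i,i+1} = lambda_i
            q[i + 1, i] = death[i]   # q_{i+1,i} = mu_{i+1}
        return q

    # Truncated example with constant birth rate 1.0 and death rate 1.5.
    print(birth_death_rates(birth=[1.0] * 3, death=[1.5] * 3))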

7 Chapman-Kolmogorov equation
P_ij(t) is the probability that, given the process is currently in state i, it will be in state j after an additional time t: P_ij(t) = P{X(t+s) = j | X(s) = i}, with Σ_j P_ij(t) = 1.
Chapman-Kolmogorov equation: P_ij(t+s) = Σ_k P_ik(t)·P_kj(s) // by conditioning on the state after the first time period t.
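
A quick numerical check of this identity (a sketch; it uses the standard fact that the matrix P(t) = (P_ij(t)) equals e^(Qt), where Q is the generator with off-diagonal entries q_ij and diagonal entries −v_i, and an arbitrary illustrative generator):

    import numpy as np
    from scipy.linalg import expm

    # Illustrative generator: off-diagonals are q_ij, diagonal is -v_i.
    Q = np.array([[-3.0,  2.0,  1.0],
                  [ 1.0, -1.5,  0.5],
                  [ 2.0,  2.0, -4.0]])

    P = lambda t: expm(Q * t)        # transition probability matrix P_ij(t)

    t, s = 0.7, 1.3
    lhs = P(t + s)                   # P_ij(t+s)
    rhs = P(t) @ P(s)                # sum_k P_ik(t) P_kj(s)
    print(np.allclose(lhs, rhs))     # True: Chapman-Kolmogorov holds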

8 Kolmogorov's Backward & Forward equations
By letting either s or t in the C-K equation be infinitesimal, we get:
- Kolmogorov's Backward equation: P'_ij(t) = Σ_{k≠i} q_ik·P_kj(t) − v_i·P_ij(t)
- Kolmogorov's Forward equation: P'_ij(t) = Σ_{k≠j} P_ik(t)·q_kj − v_j·P_ij(t)
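
A sketch of a numerical sanity check of both equations in matrix form, P'(t) = Q·P(t) (backward) and P'(t) = P(t)·Q (forward), using a finite-difference derivative and another illustrative generator:

    import numpy as np
    from scipy.linalg import expm

    Q = np.array([[-2.0,  2.0,  0.0],      # illustrative generator
                  [ 1.0, -3.0,  2.0],
                  [ 0.5,  0.5, -1.0]])

    P = lambda s: expm(Q * s)
    t, h = 0.8, 1e-6

    deriv    = (P(t + h) - P(t - h)) / (2 * h)   # numerical P'(t)
    forward  = P(t) @ Q                          # forward equation
    backward = Q @ P(t)                          # backward equation
    print(np.allclose(deriv, forward,  atol=1e-4),
          np.allclose(deriv, backward, atol=1e-4))   # True True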

9 Kolmogorov's Backward & Forward equations ― Example
Find P_ij(t) by the Kolmogorov Forward equation in the 2-state birth-death process (states: 0 = off, 1 = on; birth rate λ from 0 to 1, death rate μ from 1 to 0).
P'_01(t) = P_00(t)·q_01 − P_01(t)·v_1 = (1 − P_01(t))·λ − P_01(t)·μ = λ − (λ+μ)·P_01(t)
By solving this 1st-order ODE, P_01(t) = A·e^(−(λ+μ)t) + λ/(λ+μ). The boundary condition P_01(0) = 0 gives A = −λ/(λ+μ).
∴ P_01(t) = λ/(λ+μ)·(1 − e^(−(λ+μ)t)), and P_00(t) = 1 − λ/(λ+μ)·(1 − e^(−(λ+μ)t)).

10 Kolmogorov's Backward & Forward equations ― Example (cont'd)
Similarly, P'_10(t) = P_11(t)·q_10 − P_10(t)·v_0 = μ − (λ+μ)·P_10(t), so
P_10(t) = μ/(λ+μ)·(1 − e^(−(λ+μ)t)) and P_11(t) = 1 − μ/(λ+μ)·(1 − e^(−(λ+μ)t)).
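
A sketch comparing these closed-form expressions with the matrix exponential of the 2-state generator (the values of λ, μ, and t are arbitrary illustrative numbers):

    import numpy as np
    from scipy.linalg import expm

    lam, mu, t = 1.2, 0.7, 0.9
    Q = np.array([[-lam,  lam],    # state 0 = off: leaves at rate lambda
                  [  mu,  -mu]])   # state 1 = on:  leaves at rate mu

    Pt  = expm(Q * t)
    p01 = lam / (lam + mu) * (1 - np.exp(-(lam + mu) * t))
    p10 = mu  / (lam + mu) * (1 - np.exp(-(lam + mu) * t))
    print(np.allclose(Pt[0, 1], p01), np.allclose(Pt[1, 0], p10))   # True True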

11 Limiting Probability P_j of P_ij(t)
Definition. A continuous-time Markov chain is said to be ergodic when lim_{t→∞} P_ij(t) exists for all j and the limiting value is independent of the initial state i. Let P_j denote lim_{t→∞} P_ij(t).
By the flow conservation law (balance equations), the rate into a state equals the rate out of it: v_j·P_j = Σ_{k≠j} q_kj·P_k for all j, together with Σ_j P_j = 1. // obtained from the Forward equation by letting t → ∞.
Similar to π_j in the discrete case, P_j is the long-run proportion of time the ergodic Markov process spends in state j.
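
A sketch of how the limiting probabilities can be computed in practice: solve the balance equations (in matrix form, P·Q = 0 together with Σ_j P_j = 1) for an illustrative generator.

    import numpy as np

    Q = np.array([[-3.0,  2.0,  1.0],     # illustrative generator
                  [ 1.0, -1.5,  0.5],
                  [ 2.0,  2.0, -4.0]])

    # Balance equations P Q = 0 plus the normalization row sum(P) = 1.
    n = Q.shape[0]
    A = np.vstack([Q.T, np.ones(n)])
    b = np.zeros(n + 1); b[-1] = 1.0
    P_limit, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(P_limit, P_limit @ Q)           # limiting probabilities, residual ~ 0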

12 Long-run Probability π_j of P_ij
- When a continuous-time Markov chain is positive recurrent, so is the embedded discrete-time Markov chain.
- If a continuous-time Markov chain is irreducible and positive recurrent, then it is ergodic. In that case, the embedded discrete-time Markov chain has a unique long-run distribution {π_j}, which is the solution to π_j = Σ_i π_i·P_ij with Σ_j π_j = 1.
- Similarly, the limiting probabilities of the continuous-time chain satisfy P_j = (π_j / v_j) / Σ_i (π_i / v_i).
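
A sketch tying the two distributions together numerically: build the embedded chain from P_ij = q_ij / v_i, find its stationary π, convert via π_j / v_j, and normalize (illustrative rates again):

    import numpy as np

    q = np.array([[0.0, 2.0, 1.0],    # illustrative q_ij (zero diagonal)
                  [1.0, 0.0, 0.5],
                  [2.0, 2.0, 0.0]])
    v = q.sum(axis=1)                 # v_i = sum_j q_ij
    P_embed = q / v[:, None]          # embedded chain: P_ij = q_ij / v_i

    # Stationary distribution of the embedded chain: pi = pi P, sum(pi) = 1.
    n = len(v)
    A = np.vstack([P_embed.T - np.eye(n), np.ones(n)])
    b = np.zeros(n + 1); b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)

    # Limiting probabilities of the continuous-time chain.
    P_limit = (pi / v) / np.sum(pi / v)
    print(pi, P_limit)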

13 Time Reversible Markov Process
For an ergodic Markov process {X(t)}, 0 ≤ t ≤ T, its reversed process {Y(t)}, Y(t) = X(T − t), also corresponds to an embedded discrete-time Markov chain. If the π_i exist, Y(t) has the same rates v_i, and if the corresponding embedded discrete-time Markov chain is time reversible, then the process itself is time reversible. In that case, the rate from i to j equals the rate from j to i:
P_i·q_ij = P_j·q_ji for all i ≠ j (equivalently, the reversed embedded chain has the same transition probabilities, Q_ij = P_ij).
- An ergodic birth-death process is time reversible.
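
A sketch checking the detailed-balance condition P_i·q_ij = P_j·q_ji numerically for a small birth-death chain (illustrative birth and death rates):

    import numpy as np

    # Birth-death rates on states 0..3 (illustrative values).
    lam = [1.0, 2.0, 0.5]            # lambda_0, lambda_1, lambda_2
    mu  = [1.5, 1.0, 2.0]            # mu_1, mu_2, mu_3
    n = 4
    q = np.zeros((n, n))
    for i in range(n - 1):
        q[i, i + 1] = lam[i]
        q[i + 1, i] = mu[i]

    # Limiting probabilities from the balance equations.
    Q = q - np.diag(q.sum(axis=1))
    A = np.vstack([Q.T, np.ones(n)]); b = np.zeros(n + 1); b[-1] = 1.0
    P_limit, *_ = np.linalg.lstsq(A, b, rcond=None)

    # Detailed balance: P_i q_ij == P_j q_ji for every pair (i, j).
    flow = P_limit[:, None] * q
    print(np.allclose(flow, flow.T))   # True: the birth-death chain is reversible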

14 Burke's Theorem
If λ < sμ, the stationary output (departure) process of an M/M/s queue is a Poisson process with intensity λ. // The M/M/s queue is time reversible in the stationary state, so the reversed flow balances and departures look like arrivals.
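
A rough simulation sketch for the case s = 1 (an M/M/1 queue with λ < μ). This only checks that the long-run departure rate is λ, i.e. that inter-departure times average 1/λ, which is a much weaker statement than the full theorem; all numbers are illustrative.

    import numpy as np

    def mm1_departure_times(lam, mu, t_end=50_000.0, seed=1):
        """Simulate an M/M/1 queue and return its departure times."""
        rng = np.random.default_rng(seed)
        t, n = 0.0, 0                 # current time, number in system
        departures = []
        while t < t_end:
            if n == 0:
                t += rng.exponential(1 / lam)          # only an arrival can happen
                n = 1
            else:
                t += rng.exponential(1 / (lam + mu))   # race: arrival vs. service
                if rng.random() < lam / (lam + mu):
                    n += 1                             # arrival wins
                else:
                    n -= 1                             # service wins: a departure
                    departures.append(t)
        return np.array(departures)

    lam, mu = 1.0, 1.5
    deps = mm1_departure_times(lam, mu)
    steady = deps[deps > 1_000.0]                      # discard warm-up period
    print(np.mean(np.diff(steady)))                    # close to 1/lam = 1.0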

15 Summary
- Chapman-Kolmogorov equation: P_ij(t+s) = Σ_k P_ik(t)·P_kj(s)
- Kolmogorov's Backward equation: P'_ij(t) = Σ_{k≠i} q_ik·P_kj(t) − v_i·P_ij(t)
- Kolmogorov's Forward equation: P'_ij(t) = Σ_{k≠j} P_ik(t)·q_kj − v_j·P_ij(t)

16 Summary
- Limiting probability P_j: P_j = lim_{t→∞} P_ij(t)
- Time-reversible Markov process: P_i·q_ij = P_j·q_ji

