Presentation on theme: "Meaning of Markov Chain". Presentation transcript:

1 Meaning of Markov Chain

2 Meaning of Markov Chain A Markov chain is a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event.

3 Every state of a Markov chain falls into one of two classes: recurrent or transient. A state is recurrent if, starting from that state, the chain returns to it with probability 1. A state is transient if, starting from that state, the probability of ever returning is strictly less than 1; with positive probability the chain never comes back. (Figure: two three-state chain diagrams, one illustrating a recurrent state and the other a transient state.)
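To make the distinction concrete, here is a minimal simulation sketch (not from the original slides; the three-state chain and the function name estimate_return_prob are illustrative assumptions). State 2 is absorbing, hence recurrent; states 0 and 1 are transient because the chain can be absorbed and never return:

```python
import numpy as np

# Illustrative three-state chain: state 2 is absorbing (recurrent), while
# states 0 and 1 are transient, since from either one the chain may be
# absorbed in state 2 and never come back.
P = np.array([
    [0.0, 0.5, 0.5],
    [0.5, 0.0, 0.5],
    [0.0, 0.0, 1.0],
])

def estimate_return_prob(P, state, n_trials=10_000, horizon=200, seed=0):
    """Fraction of trials in which the chain, started at `state`,
    returns to `state` within `horizon` steps."""
    rng = np.random.default_rng(seed)
    returns = 0
    for _ in range(n_trials):
        s = state
        for _ in range(horizon):
            s = rng.choice(len(P), p=P[s])
            if s == state:
                returns += 1
                break
    return returns / n_trials

print(estimate_return_prob(P, 0))  # well below 1 (about 0.25): transient
print(estimate_return_prob(P, 2))  # 1.0: recurrent (absorbing) state
```

For this chain the exact return probability from state 0 is 0.5 × 0.5 = 0.25, which the simulation estimate should approach.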

4 Consider now a finite-state Markov chain and suppose that the states are numbered so that T = {1, 2, ..., t} denotes the set of transient states. Let

P_T = \begin{pmatrix} P_{11} & P_{12} & \cdots & P_{1t} \\ \vdots & \vdots & \ddots & \vdots \\ P_{t1} & P_{t2} & \cdots & P_{tt} \end{pmatrix}

and note that, since P_T specifies only the transition probabilities from transient states into transient states, some of its row sums are less than 1 (otherwise, T would be a closed class of states). For transient states i and j, let s_{ij} denote the expected number of time periods that the Markov chain is in state j, given that it starts in state i. Let \delta_{i,j} = 1 when i = j, and 0 otherwise. Conditioning on the initial transition gives

s_{ij} = \delta_{i,j} + \sum_{k} P_{ik} s_{kj} = \delta_{i,j} + \sum_{k=1}^{t} P_{ik} s_{kj}

where the first sum runs over all states and the second only over the transient states.

5 Here the final equality follows because it is impossible to go from a recurrent state to a transient state, implying that s_{kj} = 0 whenever k is a recurrent state. Let S denote the matrix of values s_{ij}, i, j = 1, ..., t. That is,

S = \begin{pmatrix} s_{11} & s_{12} & \cdots & s_{1t} \\ \vdots & \vdots & \ddots & \vdots \\ s_{t1} & s_{t2} & \cdots & s_{tt} \end{pmatrix}

In matrix notation, the equation for the s_{ij} becomes

S = I + P_T S

where I is the identity matrix of size t, P_T holds the transient-to-transient transition probabilities, and S holds the expected numbers of periods spent in each transient state. Rearranging,

S - P_T S = I
(I - P_T) S = I

and therefore

S = (I - P_T)^{-1}
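In code this is a single matrix inversion. A minimal NumPy sketch, assuming P_T is supplied as an array with the transient states indexed 0 to t−1 (the function name expected_visits is an illustrative choice):

```python
import numpy as np

# Minimal sketch: given the transient-to-transient block P_T of a finite
# Markov chain, compute S = (I - P_T)^{-1}. Entry S[i, j] is the expected
# number of time periods spent in transient state j when starting from
# transient state i.
def expected_visits(P_T: np.ndarray) -> np.ndarray:
    t = P_T.shape[0]
    return np.linalg.inv(np.eye(t) - P_T)
```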

6 Example: Consider the gambler's ruin problem with p = 0.4 and N = 7. Starting with 3 units, determine (a) the expected amount of time the gambler has 5 units, and (b) the expected amount of time the gambler has 2 units.

Solution: The matrix P_T, which specifies P_{ij} for i, j ∈ {1, 2, 3, 4, 5, 6}, and the corresponding 6 × 6 identity matrix I are

P_T = \begin{pmatrix}
0 & 0.4 & 0 & 0 & 0 & 0 \\
0.6 & 0 & 0.4 & 0 & 0 & 0 \\
0 & 0.6 & 0 & 0.4 & 0 & 0 \\
0 & 0 & 0.6 & 0 & 0.4 & 0 \\
0 & 0 & 0 & 0.6 & 0 & 0.4 \\
0 & 0 & 0 & 0 & 0.6 & 0
\end{pmatrix}
\qquad
I = \begin{pmatrix}
1 & 0 & 0 & 0 & 0 & 0 \\
0 & 1 & 0 & 0 & 0 & 0 \\
0 & 0 & 1 & 0 & 0 & 0 \\
0 & 0 & 0 & 1 & 0 & 0 \\
0 & 0 & 0 & 0 & 1 & 0 \\
0 & 0 & 0 & 0 & 0 & 1
\end{pmatrix}

Applying the equation above, we form I - P_T:

7

I - P_T = \begin{pmatrix}
1 & -0.4 & 0 & 0 & 0 & 0 \\
-0.6 & 1 & -0.4 & 0 & 0 & 0 \\
0 & -0.6 & 1 & -0.4 & 0 & 0 \\
0 & 0 & -0.6 & 1 & -0.4 & 0 \\
0 & 0 & 0 & -0.6 & 1 & -0.4 \\
0 & 0 & 0 & 0 & -0.6 & 1
\end{pmatrix}

Inverting I - P_T (this computation was done using MATLAB) gives

S = (I - P_T)^{-1} = \begin{pmatrix}
1.6149 & 1.0248 & 0.6314 & 0.3691 & 0.1943 & 0.0777 \\
1.5372 & 2.5619 & 1.5784 & 0.9228 & 0.4857 & 0.1943 \\
1.4206 & 2.3677 & 2.9990 & 1.7533 & 0.9228 & 0.3691 \\
1.2458 & 2.0763 & 2.6299 & 2.9990 & 1.5784 & 0.6314 \\
0.9835 & 1.6391 & 2.0763 & 2.3677 & 2.5619 & 1.0248 \\
0.5901 & 0.9835 & 1.2458 & 1.4206 & 1.5372 & 1.6149
\end{pmatrix}

Hence the expected amount of time the gambler has 5 units is s_{3,5} = 0.9228, and the expected amount of time with 2 units is s_{3,2} = 2.3677.
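The slides' MATLAB inversion can be reproduced with a short NumPy sketch (variable names are illustrative; array indices are shifted down by one because NumPy arrays are 0-indexed, so fortune i lives at index i−1):

```python
import numpy as np

# Gambler's ruin with p = 0.4 and N = 7: transient states are fortunes 1..6,
# stored here at indices 0..5. From fortune i the gambler moves up one unit
# with probability 0.4 and down one unit with probability 0.6; fortunes 0
# and 7 are absorbing and are therefore left out of P_T.
p, t = 0.4, 6
P_T = np.diag([p] * (t - 1), 1) + np.diag([1 - p] * (t - 1), -1)

S = np.linalg.inv(np.eye(t) - P_T)

print(round(S[2, 4], 4))  # s_{3,5} = 0.9228: expected periods with 5 units
print(round(S[2, 1], 4))  # s_{3,2} = 2.3677: expected periods with 2 units
```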

8 For i ∈ T and j ∈ T, the quantity f_{ij}, equal to the probability that the Markov chain ever makes a transition into state j given that it starts in state i, is easily determined from P_T. To determine the relationship, start by deriving an expression for s_{ij} by conditioning on whether state j is ever entered. This yields

s_{ij} = E[time in j | start in i, ever transit to j] f_{ij} + E[time in j | start in i, never transit to j] (1 - f_{ij})
       = (\delta_{i,j} + s_{jj}) f_{ij} + \delta_{i,j} (1 - f_{ij})
       = \delta_{i,j} + f_{ij} s_{jj}

since s_{jj} is the expected number of additional time periods spent in state j given that it is eventually entered from state i. Solving the preceding equation yields

f_{ij} = \frac{s_{ij} - \delta_{i,j}}{s_{jj}}

Example: What is the probability that the gambler ever has a fortune of 1?

Solution: Since s_{3,1} = 1.4206, s_{1,1} = 1.6149, and \delta_{3,1} = 0,

f_{3,1} = \frac{s_{3,1}}{s_{1,1}} = \frac{1.4206}{1.6149} = 0.8797
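This final step is a one-line computation on the matrix S from the previous slide. A short sketch (the helper name hit_prob is an illustrative assumption):

```python
import numpy as np

# Probability of ever entering transient state j from transient state i,
# f_ij = (s_ij - delta_ij) / s_jj, using the gambler's-ruin S from above.
p, t = 0.4, 6
P_T = np.diag([p] * (t - 1), 1) + np.diag([1 - p] * (t - 1), -1)
S = np.linalg.inv(np.eye(t) - P_T)

def hit_prob(S: np.ndarray, i: int, j: int) -> float:
    """0-indexed: pass i-1, j-1 for the slides' 1-indexed states."""
    delta = 1.0 if i == j else 0.0
    return (S[i, j] - delta) / S[j, j]

print(round(hit_prob(S, 2, 0), 4))  # f_{3,1} = 1.4206 / 1.6149 = 0.8797
```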
THE END. THANK YOU
