Presentation on theme: "{ X_n : n = 0, 1, 2, ... } is a discrete time stochastic process — Markov Chains" — Presentation transcript:

1–5  Markov Chains
 { X_n : n = 0, 1, 2, ... } is a discrete-time stochastic process.
 If X_n = i, the process is said to be in state i at time n.
 { i : i = 0, 1, 2, ... } is the state space.
 If P(X_{n+1} = j | X_n = i, X_{n-1} = i_{n-1}, ..., X_0 = i_0) = P(X_{n+1} = j | X_n = i) = P_{ij}, the process is said to be a Discrete Time Markov Chain (DTMC).
 P_{ij} is the transition probability from state i to state j.

6 P : transition matrix
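The matrix displayed on this slide did not survive the transcript; the general form of a transition matrix, consistent with the definitions above, is:

```latex
P =
\begin{pmatrix}
P_{00} & P_{01} & P_{02} & \cdots \\
P_{10} & P_{11} & P_{12} & \cdots \\
P_{20} & P_{21} & P_{22} & \cdots \\
\vdots & \vdots & \vdots & \ddots
\end{pmatrix},
\qquad P_{ij} \ge 0, \qquad \sum_{j} P_{ij} = 1 \ \text{for every row } i.
```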

7–9  Example 1: The probability that it will rain tomorrow depends only on whether or not it rains today:
 P(rain tomorrow | rain today) = α
 P(rain tomorrow | no rain today) = β
 State 0 = rain; State 1 = no rain
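Assuming the two symbols lost in transcription are α and β (matching the two values 0.7 and 0.4 given later on slide 27), the transition matrix these slides displayed for Example 1 would be:

```latex
P =
\begin{pmatrix}
\alpha & 1-\alpha \\
\beta  & 1-\beta
\end{pmatrix}
=
\begin{pmatrix}
0.7 & 0.3 \\
0.4 & 0.6
\end{pmatrix}
```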

10–13  Example 4: A gambler wins $1 with probability p and loses $1 with probability 1 − p. She starts with $N and quits if she reaches either $M or $0. X_n is the amount of money the gambler has after playing n rounds.
 P(X_n = i+1 | X_{n-1} = i, X_{n-2} = i_{n-2}, ..., X_0 = N) = P(X_n = i+1 | X_{n-1} = i) = p (i ≠ 0, M)
 P(X_n = i−1 | X_{n-1} = i, X_{n-2} = i_{n-2}, ..., X_0 = N) = P(X_n = i−1 | X_{n-1} = i) = 1 − p (i ≠ 0, M)
 P_{i,i+1} = P(X_n = i+1 | X_{n-1} = i);  P_{i,i-1} = P(X_n = i−1 | X_{n-1} = i)

14   P_{i,i+1} = p;  P_{i,i-1} = 1 − p for i ≠ 0, M
 P_{0,0} = 1;  P_{M,M} = 1 (0 and M are called absorbing states)
 P_{i,j} = 0 otherwise
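These transition probabilities determine the probability h_i of reaching $M before going broke when starting with $i: it satisfies h_i = p·h_{i+1} + (1 − p)·h_{i−1} with boundary conditions h_0 = 0 and h_M = 1. A minimal sketch that solves this linear system (the function name and the numbers in the example are illustrative, not from the slides):

```python
import numpy as np

def win_prob(M, p):
    """Solve h_i = p*h_{i+1} + (1-p)*h_{i-1}, h_0 = 0, h_M = 1.

    Returns the vector h of probabilities of reaching $M before $0,
    indexed by the starting fortune i = 0, ..., M.
    """
    A = np.zeros((M + 1, M + 1))
    b = np.zeros(M + 1)
    A[0, 0] = 1.0   # absorbing state 0: h_0 = 0
    A[M, M] = 1.0   # absorbing state M: h_M = 1
    b[M] = 1.0
    for i in range(1, M):
        # h_i - p*h_{i+1} - (1-p)*h_{i-1} = 0
        A[i, i] = 1.0
        A[i, i + 1] = -p
        A[i, i - 1] = -(1 - p)
    return np.linalg.solve(A, b)

h = win_prob(10, 0.5)   # fair game: h_i = i/M, e.g. h[4] = 0.4
```

For a fair game (p = 1/2) the solution is linear in the starting fortune; for p ≠ 1/2 it matches the classical gambler's-ruin formula (1 − r^i)/(1 − r^M) with r = (1 − p)/p.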

15   Random walk: A Markov chain whose state space is 0, ±1, ±2, ..., with P_{i,i+1} = p = 1 − P_{i,i-1} for i = 0, ±1, ±2, ... and 0 < p < 1, is said to be a random walk.
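A quick sketch of sampling a path of this chain, assuming nothing beyond the definition above (each step is +1 with probability p, −1 otherwise; the function name and parameters are illustrative):

```python
import random

def random_walk(steps, p=0.5, seed=0):
    """Return a sample path X_0 = 0, X_1, ..., X_steps of the random walk."""
    rng = random.Random(seed)
    path = [0]
    for _ in range(steps):
        step = 1 if rng.random() < p else -1  # +1 w.p. p, -1 w.p. 1-p
        path.append(path[-1] + step)
    return path

path = random_walk(20, p=0.5)
```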

16  Chapman-Kolmogorov Equations
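The standard statement of the Chapman-Kolmogorov equations, writing P^(n)_ij = P(X_{n+m} = j | X_m = i) for the n-step transition probability, is P^(n+m)_ij = Σ_k P^(n)_ik · P^(m)_kj, i.e. P^(n+m) = P^n · P^m in matrix form. A quick numerical check on the two-state rain chain of Example 1 (using α = 0.7, β = 0.4):

```python
import numpy as np

# Two-state rain chain from Example 1 (state 0 = rain, state 1 = no rain)
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

n, m = 2, 3
lhs = np.linalg.matrix_power(P, n + m)                              # P^(n+m)
rhs = np.linalg.matrix_power(P, n) @ np.linalg.matrix_power(P, m)   # P^n P^m
assert np.allclose(lhs, rhs)   # Chapman-Kolmogorov: the two agree
```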


27  Example 1 (continued): The probability it will rain tomorrow depends only on whether it rains today: P(rain tomorrow | rain today) = α; P(rain tomorrow | no rain today) = β. What is the probability that it will rain four days from today, given that it is raining today? Let α = 0.7 and β = 0.4. State 0 = rain; State 1 = no rain.
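The question above is answered by the four-step transition matrix: the entry P^4_00 is the probability of rain four days out given rain today. A sketch of the computation:

```python
import numpy as np

# State 0 = rain, state 1 = no rain; alpha = 0.7, beta = 0.4
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

P4 = np.linalg.matrix_power(P, 4)   # four-step transition matrix
rain_in_4_days = P4[0, 0]           # P(X_4 = 0 | X_0 = 0) = 0.5749
```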


33 Unconditional probabilities
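The slides under this heading were not captured in the transcript; the standard relation is P(X_n = j) = Σ_i α_i P^(n)_ij, where α_i = P(X_0 = i) is the initial distribution. A sketch using the rain chain of Example 1, with an assumed, purely illustrative initial distribution (not from the slides):

```python
import numpy as np

P = np.array([[0.7, 0.3],      # rain chain: state 0 = rain, state 1 = no rain
              [0.4, 0.6]])
alpha = np.array([0.4, 0.6])   # assumed P(X_0 = rain) = 0.4 (illustrative)

dist_4 = alpha @ np.linalg.matrix_power(P, 4)   # distribution of X_4
p_rain_4 = dist_4[0]                            # unconditional P(X_4 = rain)
```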


37 Classification of States
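The slide bodies under this heading were not captured in the transcript; a hedged summary of the standard classification definitions for a DTMC, consistent with the notation above:

```latex
% Standard classification of states for a DTMC
\begin{itemize}
  \item State $j$ is \emph{accessible} from state $i$ if $P^{(n)}_{ij} > 0$ for some $n \ge 0$.
  \item States $i$ and $j$ \emph{communicate} ($i \leftrightarrow j$) if each is accessible
        from the other; communication partitions the state space into classes.
  \item State $i$ is \emph{recurrent} if $\sum_{n=1}^{\infty} P^{(n)}_{ii} = \infty$
        (return to $i$ is certain), and \emph{transient} otherwise.
\end{itemize}
```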


41 Properties


45 Classification of States (continued)


