Presentation on theme: "Markov Chain" — Presentation transcript:

1 Markov Chain

2 Stochastic Processes A stochastic process is a collection of random variables {X(t), t ∈ T}. The index t is often interpreted as time, and X(t) is called the state of the process at time t. The set T is called the index set of the process. If T is countable, the stochastic process is said to be a discrete-time process. If T is an interval of the real line, the stochastic process is said to be a continuous-time process. The state space E is the set of all possible values that the random variables X(t) can assume.

3 Stochastic Processes Examples: 1. Winnings of a gambler at successive games of blackjack; T = {1, 2, ..., 10}, E = set of integers. 2. Temperatures in Laurel, MD on September 17, 2008; T = [12 a.m. Wed., 12 a.m. Thurs.), E = set of real numbers.

4 Markov Chains A discrete-time stochastic process {X_n, n = 0, 1, 2, ...} with a countable state space is called a Markov chain provided that P(X_{n+1} = j | X_n = i, X_{n-1} = i_{n-1}, ..., X_0 = i_0) = P(X_{n+1} = j | X_n = i) for all states i_0, ..., i_{n-1}, i, j and all n ≥ 0. In words: given the present state, the future is independent of the past.
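The definition above can be illustrated with a short simulation: the next state is drawn using only the current state's row of the transition matrix, never the earlier history. This is a minimal sketch; the two-state chain and its probabilities are illustrative assumptions, not taken from the slides.

```python
import random

def simulate_chain(P, states, start, n_steps, seed=0):
    """Simulate a discrete-time Markov chain.

    P[i][j] is the probability of moving from state i to state j;
    each row of P must sum to 1. Only the current state is used to
    draw the next one (the Markov property).
    """
    rng = random.Random(seed)
    i = states.index(start)
    path = [start]
    for _ in range(n_steps):
        # Draw the next state from row i of P.
        i = rng.choices(range(len(states)), weights=P[i])[0]
        path.append(states[i])
    return path

# Hypothetical two-state chain (numbers are assumptions for illustration).
P = [[0.7, 0.3],   # from state "sunny"
     [0.4, 0.6]]   # from state "rainy"
path = simulate_chain(P, ["sunny", "rainy"], "sunny", 10)
print(path)
```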

5 Markov Chains

6 The condition of soil fertility can be assumed to fall into one of three states: good, fair, or poor.

7 Transition Probabilities
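Transition probabilities are usually collected into a matrix P whose entry P[i][j] gives the one-step probability of moving from state i to state j, and the n-step probabilities follow from matrix powers (the Chapman-Kolmogorov equations). Below is a sketch using the three soil-fertility states from the previous slide; the matrix values are assumptions for illustration, not from the presentation.

```python
def mat_mul(A, B):
    """Multiply two matrices given as lists of rows."""
    n, m, p = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

def n_step(P, n):
    """n-step transition matrix P^n via repeated multiplication
    (Chapman-Kolmogorov)."""
    result = [[float(i == j) for j in range(len(P))] for i in range(len(P))]
    for _ in range(n):
        result = mat_mul(result, P)
    return result

# Hypothetical one-step transition matrix for soil fertility:
# states 0=good, 1=fair, 2=poor (numbers are illustrative assumptions).
P = [[0.6, 0.3, 0.1],
     [0.2, 0.5, 0.3],
     [0.1, 0.3, 0.6]]
P2 = n_step(P, 2)
print(P2[0])  # two-step probabilities good -> {good, fair, poor}
```

Each row of P (and of any power of P) sums to 1, which is a quick sanity check on a hand-built matrix.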

8 Classification of the States
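Classifying states typically starts from which states can reach which others: states i and j communicate when each is reachable from the other with positive probability, and the state space partitions into communicating classes. A minimal sketch of that partition, on an assumed example chain with an absorbing state:

```python
def reachable(P, i):
    """States reachable from i with positive probability (graph search)."""
    seen, stack = {i}, [i]
    while stack:
        u = stack.pop()
        for v, p in enumerate(P[u]):
            if p > 0 and v not in seen:
                seen.add(v)
                stack.append(v)
    return seen

def communicating_classes(P):
    """Partition states into classes where i and j reach each other."""
    n = len(P)
    reach = [reachable(P, i) for i in range(n)]
    classes, assigned = [], set()
    for i in range(n):
        if i in assigned:
            continue
        cls = {j for j in reach[i] if i in reach[j]}
        classes.append(sorted(cls))
        assigned |= cls
    return classes

# Hypothetical chain where state 2 is absorbing (illustrative numbers).
P = [[0.5, 0.5, 0.0],
     [0.3, 0.3, 0.4],
     [0.0, 0.0, 1.0]]
print(communicating_classes(P))  # -> [[0, 1], [2]]
```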

9 Steady State Probability
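The steady-state (stationary) distribution pi satisfies pi = pi P with the entries of pi summing to 1. One standard way to approximate it is power iteration: start from any distribution and repeatedly multiply by P until it stops changing. A sketch, reusing the assumed soil-fertility matrix from above:

```python
def steady_state(P, tol=1e-12, max_iter=100000):
    """Approximate the stationary distribution pi = pi P by
    repeatedly multiplying an initial distribution by P."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(max_iter):
        new = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
        if max(abs(a - b) for a, b in zip(pi, new)) < tol:
            return new
        pi = new
    return pi

# Hypothetical transition matrix (illustrative assumption, as before).
P = [[0.6, 0.3, 0.1],
     [0.2, 0.5, 0.3],
     [0.1, 0.3, 0.6]]
pi = steady_state(P)
print(pi)  # long-run fractions of time in good / fair / poor
```

Power iteration converges for a finite, irreducible, aperiodic chain; alternatively one can solve the linear system pi = pi P directly.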

