# Markov Chain


## Stochastic Processes

A stochastic process is a collection of random variables {X_t : t ∈ T}. The index t is often interpreted as time, and X_t is called the state of the process at time t. The set T is called the index set of the process. If T is countable, the stochastic process is said to be a discrete-time process; if T is an interval of the real line, it is said to be a continuous-time process. The state space E is the set of all possible values that the random variables X_t can assume.
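As a concrete sketch of a discrete-time process, the simple random walk below has index set T = {0, 1, ..., n} and state space E = the integers (the function name and parameters are illustrative, not from the slides):

```python
import random

def random_walk(n_steps, seed=0):
    """Simulate a simple random walk: X_0 = 0 and each step moves +1 or -1
    with equal probability. This is a discrete-time stochastic process with
    index set T = {0, 1, ..., n_steps} and state space E = the integers."""
    rng = random.Random(seed)
    x = 0
    path = [x]
    for _ in range(n_steps):
        x += rng.choice([-1, 1])  # next state differs from the current by 1
        path.append(x)
    return path

# One realization of the process: a list of n_steps + 1 states.
path = random_walk(10)
```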

Examples:

1. Winnings of a gambler at successive games of blackjack; T = {1, 2, ..., 10}, E = set of integers.
2. Temperatures in Laurel, MD, on September 17, 2008; T = [12 a.m. Wed., 12 a.m. Thurs.), E = set of real numbers.

## Markov Chains

A stochastic process {X_n : n = 0, 1, 2, ...} with a countable state space is called a Markov chain provided that

P(X_{n+1} = j | X_n = i, X_{n−1} = i_{n−1}, ..., X_0 = i_0) = P(X_{n+1} = j | X_n = i)

for all states i_0, ..., i_{n−1}, i, j and all n ≥ 0. In words: given the present state, the future of the process is independent of its past.
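The Markov property can be checked empirically. In this sketch (a two-state chain whose transition probabilities are chosen arbitrarily for illustration), the estimated probability of moving to state 1 from state 0 comes out about the same whether the state before that was 0 or 1:

```python
import random

# Illustrative two-state transition matrix (not from the slides):
# row i gives the probabilities of moving from state i to states 0 and 1.
P = [[0.7, 0.3],
     [0.4, 0.6]]

rng = random.Random(42)
x, path = 0, [0]
for _ in range(200000):
    # The next state is drawn using only the current state x.
    x = 0 if rng.random() < P[x][0] else 1
    path.append(x)

def cond_freq(path, prev2, prev1):
    """Relative frequency of X_{n+1} = 1 given X_{n-1} = prev2 and X_n = prev1."""
    hits = total = 0
    for a, b, c in zip(path, path[1:], path[2:]):
        if a == prev2 and b == prev1:
            total += 1
            hits += (c == 1)
    return hits / total

# Both estimates should be close to P[0][1] = 0.3, regardless of X_{n-1}.
est_after_00 = cond_freq(path, 0, 0)
est_after_10 = cond_freq(path, 1, 0)
```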


Example: the state of soil fertility can be assumed to be in one of three conditions: good, fair, or poor.

## Transition Probabilities

The one-step transition probability p_ij = P(X_{n+1} = j | X_n = i) is the probability that the chain moves from state i to state j in one step. Collecting these values gives the transition matrix P = (p_ij), whose entries are nonnegative and whose rows each sum to 1.
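A minimal sketch for the soil-fertility example: a hypothetical transition matrix over the states good, fair, and poor (the probabilities are illustrative assumptions, not from the slides) and a simulator that draws each next state from the row of the current state:

```python
import random

# Hypothetical one-step transition matrix for soil fertility,
# with states 0 = good, 1 = fair, 2 = poor. Values are illustrative only.
P = [
    [0.6, 0.3, 0.1],
    [0.2, 0.5, 0.3],
    [0.1, 0.3, 0.6],
]

def simulate(P, start, n_steps, seed=0):
    """Simulate a Markov chain: the next state is drawn from the row of P
    indexed by the current state, so it depends on the present state only."""
    rng = random.Random(seed)
    state = start
    path = [state]
    for _ in range(n_steps):
        r, cum = rng.random(), 0.0
        for j, p in enumerate(P[state]):
            cum += p
            if r < cum:
                state = j
                break
        path.append(state)
    return path

# A 1000-step trajectory starting from good soil.
path = simulate(P, start=0, n_steps=1000)
```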

## Classification of the States

State j is accessible from state i if the chain can reach j from i in some number of steps; states i and j communicate if each is accessible from the other. Communication partitions the state space into communicating classes, which underlie the further classification of states as recurrent or transient.
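For a finite chain, the communicating classes can be computed directly from the transition matrix via reachability; a sketch (matrix Q and function names are illustrative assumptions):

```python
def reachable(P):
    """Boolean reachability matrix: R[i][j] is True if state j can be
    reached from state i in zero or more steps (transitive closure,
    Floyd-Warshall style)."""
    n = len(P)
    R = [[i == j or P[i][j] > 0 for j in range(n)] for i in range(n)]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                R[i][j] = R[i][j] or (R[i][k] and R[k][j])
    return R

def communicating_classes(P):
    """Group states into communicating classes: i and j belong to the same
    class iff each is reachable from the other."""
    R = reachable(P)
    n = len(P)
    seen, classes = set(), []
    for i in range(n):
        if i in seen:
            continue
        cls = {j for j in range(n) if R[i][j] and R[j][i]}
        seen |= cls
        classes.append(sorted(cls))
    return classes

# Illustrative chain: states 0 and 1 communicate, state 2 is absorbing.
Q = [
    [0.5, 0.5, 0.0],
    [0.4, 0.4, 0.2],
    [0.0, 0.0, 1.0],
]
# communicating_classes(Q) -> [[0, 1], [2]]
```

Here {2} is a closed class (once entered, never left), while {0, 1} is not closed, so in this finite chain states 0 and 1 are transient and state 2 is recurrent.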