
Markov Chain

Stochastic Processes

A stochastic process is a collection of random variables {X(t) : t ∈ T}. The index t is often interpreted as time, and X(t) is called the state of the process at time t. The set T is called the index set of the process. If T is countable, the stochastic process is said to be a discrete-time process; if T is an interval of the real line, it is said to be a continuous-time process. The state space E is the set of all possible values that the random variables X(t) can assume.

Stochastic Processes: Examples

1. Winnings of a gambler at successive games of blackjack; T = {1, 2, ..., 10}, E = the set of integers.
2. Temperatures in Laurel, MD on September 17, 2008; T = [12 a.m. Wed., 12 a.m. Thu.), E = the set of real numbers.

Markov Chains

A discrete-time stochastic process {X_n : n = 0, 1, 2, ...} is called a Markov chain provided that

P(X_{n+1} = j | X_n = i, X_{n-1} = i_{n-1}, ..., X_0 = i_0) = P(X_{n+1} = j | X_n = i)

for all states i_0, ..., i_{n-1}, i, j and all n ≥ 0. That is, given the present state, the future of the process is independent of the past.
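The Markov property can be illustrated with a short simulation: the next state is drawn using only the current state, never the earlier history. The two-state "dry/wet" chain and its transition matrix below are illustrative assumptions, not taken from the slides.

```python
import random

# Hypothetical two-state chain: 0 = "dry", 1 = "wet".
# The transition probabilities here are illustrative assumptions.
P = [[0.8, 0.2],
     [0.4, 0.6]]

def step(state, rng):
    """Draw the next state using only the current state (Markov property)."""
    return 0 if rng.random() < P[state][0] else 1

def simulate(n, start=0, seed=42):
    """Simulate n steps of the chain; the history beyond path[-1] is never consulted."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path

path = simulate(10)
```

Note that `step` takes only the current state as input, which is exactly the memorylessness that the definition above formalizes.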

Markov Chains: Example

The state of soil fertility can be assumed to fall into one of three conditions: good, medium, or poor.

Transition Probabilities
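Continuing the soil-fertility example, the one-step transition probabilities can be arranged in a matrix whose rows sum to 1, and multi-step probabilities follow by matrix multiplication (the Chapman–Kolmogorov equations). The numerical values below are illustrative assumptions, not from the slides.

```python
# Illustrative transition matrix for the soil-fertility chain with states
# good (0), medium (1), poor (2); the probabilities are assumptions.
P = [
    [0.6, 0.3, 0.1],  # from good
    [0.2, 0.5, 0.3],  # from medium
    [0.1, 0.3, 0.6],  # from poor
]

# Each row must sum to 1: from every state the chain moves somewhere.
assert all(abs(sum(row) - 1.0) < 1e-12 for row in P)

# Two-step transition probabilities: P2[i][j] = sum over k of P[i][k] * P[k][j]
n = len(P)
P2 = [[sum(P[i][k] * P[k][j] for k in range(n)) for j in range(n)]
      for i in range(n)]
```

For example, `P2[0][2]` is the probability that soil in good condition is poor two periods later.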

Classification of the States

Steady State Probability
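A steady-state (stationary) distribution π satisfies π = πP with the entries of π summing to 1. One way to approximate it is power iteration: start from any distribution and multiply by P repeatedly. The sketch below reuses the assumed soil-fertility matrix from above.

```python
def steady_state(P, iters=1000):
    """Approximate the stationary distribution pi (pi = pi * P)
    by repeatedly multiplying a starting distribution by P."""
    n = len(P)
    pi = [1.0 / n] * n          # start from the uniform distribution
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Illustrative soil-fertility transition matrix (assumed probabilities).
P = [[0.6, 0.3, 0.1],
     [0.2, 0.5, 0.3],
     [0.1, 0.3, 0.6]]
pi = steady_state(P)
```

For a finite, irreducible, aperiodic chain this iteration converges to the unique stationary distribution regardless of the starting vector; alternatively π can be found exactly by solving the linear system π(P − I) = 0 with Σπ_j = 1.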
