
1 Discrete-Time Markov Chains

2 Introduction (© Tallal Elshabrawy)
Markov modeling is an extremely important tool in the field of modeling and analysis of telecommunication networks.
Example: Markov models are applicable in the following networking problems:
- Connection Admission Control (CAC)
- Bandwidth Allocation
- Congestion Control
- Routing
- Queuing and Scheduling

3 Markov Property
Let X(t) denote a stochastic process. The Markov property states that, conditional on X(t) = x, the future {X(u), u > t} and the past {X(u), u < t} are statistically independent.

4 Markov Process = Random State Process
A Markov process is characterized by a state variable. The state is a random process with the property that knowledge of the value of the state at any given time t removes the need for any knowledge of the system's past.
Example: the state could be the number of packets inside a queue with exponential inter-arrival times and exponential service times.

5 Types of Markov Process
A Markov process may be classified along two axes: time (discrete or continuous) and amplitude (discrete or continuous).
Markov processes with discrete amplitude are termed Markov chains. The amplitude in a Markov chain is discrete in the sense that values are drawn from a state space S, where S = {s0, s1, s2, ...}.

6 Discrete-Time Markov Chains
Consider a Markov chain {X(t)} in discrete time t = 0, 1, 2, ..., taking values X(0), X(1), X(2), ..., X(n).
Take t = 1 as the present: X(0) is the past, X(1) is the present, and X(2), ..., X(n) are the future. From the Markov property, the past X(0) is irrelevant in determining the future with respect to t = 1:
Pr[X(2), ..., X(n) | X(1), X(0)] = Pr[X(2), ..., X(n) | X(1)]

7 Discrete-Time Markov Chains
Continuing in the same way, the joint distribution factors into the initial condition and the transition probabilities:
Pr[X(0), X(1), ..., X(n)] = Pr[X(0)] Pr[X(1)|X(0)] Pr[X(2)|X(1)] ... Pr[X(n)|X(n-1)]

8 Theorem
The statistics of a Markov chain X(.) are completely specified by:
- The initial distribution (the distribution of X(0))
- The transition probabilities (the distribution of X(k) given X(k-1)) for all k
Goal: to reduce the complexity of describing random processes.
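The theorem above can be made concrete in code: knowing only the initial distribution and the transition probabilities is enough to generate the whole chain. A minimal sketch in Python, using a hypothetical 3-state chain (the distribution q0 and matrix P below are illustrative values, not from the slides):

```python
import random

# Hypothetical 3-state chain: initial distribution q0 and one-step
# transition matrix P (each row sums to 1). Illustrative values only.
q0 = [1.0, 0.0, 0.0]
P = [[0.5, 0.3, 0.2],
     [0.1, 0.6, 0.3],
     [0.2, 0.2, 0.6]]

def sample(dist):
    """Draw a state index from a probability distribution."""
    r, acc = random.random(), 0.0
    for i, p in enumerate(dist):
        acc += p
        if r < acc:
            return i
    return len(dist) - 1

def simulate(q0, P, n):
    """Generate X(0), X(1), ..., X(n) using only q0 and P, illustrating
    that these two objects fully specify the chain's statistics."""
    x = sample(q0)
    path = [x]
    for _ in range(n):
        x = sample(P[x])  # next state depends only on the current state
        path.append(x)
    return path

path = simulate(q0, P, 10)
```

Note that `simulate` never looks further back than the current state, which is exactly the Markov property at work.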

9 Further Reduction: Time Invariance
A Markov process X(.) is said to be time-homogeneous if and only if the distribution of X(t) given X(s), for s < t, depends only on the difference t - s, i.e.,
Pr[X(t) = Sj | X(s) = Si] = Pij(t - s) for all s < t.
Time homogeneity means the transition probabilities are unchanged by a shift of the time origin.

10 Modeling & Notations
Assume the Markov chain may assume one of M states, i.e., S = {S0, S1, ..., S(M-1)}.
The chain can be fully characterized by:
- An Mx1 vector Q(0) = {Pr[X(0) = Si]}, 0 ≤ i ≤ M-1
- An MxM matrix P(t) = {Pij(t)}, 0 ≤ i ≤ M-1, 0 ≤ j ≤ M-1, where
  Pij(t) = Pr[X(s+t) = Sj | X(s) = Si]
  Pji(t) = Pr[X(s+t) = Si | X(s) = Sj]

11 Example
[Figure: a three-state chain S0, S1, S2; the state distribution is obtained by matrix multiplication.]

12 Stationary Markov Chains
A chain is stationary if Q(t) = {Pr[X(t) = Si]}, 0 ≤ i ≤ M-1, does not depend on t.
Stationary means: the distribution of the position is insensitive to time.
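A stationary distribution can be found numerically by repeatedly applying the transition matrix until the distribution stops changing (power iteration). A sketch, assuming a hypothetical ergodic 3-state matrix P (illustrative values, not from the slides):

```python
def step(q, P):
    """One step of q^T := q^T P for a row distribution q."""
    M = len(P)
    return [sum(q[i] * P[i][j] for i in range(M)) for j in range(M)]

# Hypothetical ergodic transition matrix (rows sum to 1).
P = [[0.5, 0.3, 0.2],
     [0.1, 0.6, 0.3],
     [0.2, 0.2, 0.6]]

q = [1.0, 0.0, 0.0]
for _ in range(200):  # power iteration; converges for an ergodic chain
    q = step(q, P)

# q is now (approximately) stationary: applying P no longer changes it.
q_next = step(q, P)
```

The stopping criterion here is crude (a fixed iteration count); a practical implementation would iterate until the change between q and q_next falls below a tolerance.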

13 Chapman-Kolmogorov
Consider time points s < t < u: with respect to time t, s is the past, t is the present, and u is the future. Conditioning on the intermediate state Sk at time t gives the Chapman-Kolmogorov equations:
Pr[X(u) = Sj | X(s) = Si] = Σk Pr[X(t) = Sk | X(s) = Si] Pr[X(u) = Sj | X(t) = Sk]
True for all Si, Sj and for all time points s < t < u.

14 Example
To move from state S0 to S1 within time 2t (i.e., s = 0, u = 2t), there are three possibilities:
1. A transition from S0 to S0 in (0, t) followed by a transition from S0 to S1 in (t, 2t)
2. A transition from S0 to S1 in (0, t) followed by a transition from S1 to S1 in (t, 2t)
3. A transition from S0 to S2 in (0, t) followed by a transition from S2 to S1 in (t, 2t)
Hence P01(2t) = P00(t)P01(t) + P01(t)P11(t) + P02(t)P21(t).
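The path enumeration above is exactly one entry of a matrix product, which is worth verifying numerically. A sketch with a hypothetical matrix P(t) (illustrative values, not from the slides):

```python
def matmul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Hypothetical transition matrix over one interval of length t.
Pt = [[0.5, 0.3, 0.2],
      [0.1, 0.6, 0.3],
      [0.2, 0.2, 0.6]]

# The three possibilities from the example:
# S0->S0->S1, S0->S1->S1, S0->S2->S1.
path_sum = Pt[0][0] * Pt[0][1] + Pt[0][1] * Pt[1][1] + Pt[0][2] * Pt[2][1]

# Chapman-Kolmogorov in matrix form: P(2t) = P(t) P(t).
P2t = matmul(Pt, Pt)
# path_sum equals the (0, 1) entry of P2t.
```

Summing over intermediate states is precisely what the inner loop of matrix multiplication does, which is why Chapman-Kolmogorov takes such a compact matrix form.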

15 Chapman-Kolmogorov: Matrix Notation
In matrix notation, for time points s < t < u, the Chapman-Kolmogorov equations read
P(u - s) = P(t - s) P(u - t),
i.e., P(t + s) = P(t) P(s) for a time-homogeneous chain.

16 One-Step Transition Probabilities
For a discrete-time chain, the one-step transition probability matrix is P = P(1), with Pij = Pr[X(t+1) = Sj | X(t) = Si]. By the Chapman-Kolmogorov equations, the n-step transition matrix is P(n) = P^n.
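By the Chapman-Kolmogorov equations, the n-step transition matrix is the n-th power of the one-step matrix. A minimal sketch, with a hypothetical one-step matrix P (illustrative values, not from the slides):

```python
def matmul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_power(P, n):
    """n-step transition matrix P(n) = P^n by repeated multiplication.
    P(0) is the identity: with zero steps, the state cannot change."""
    size = len(P)
    result = [[float(i == j) for j in range(size)] for i in range(size)]
    for _ in range(n):
        result = matmul(result, P)
    return result

# Hypothetical one-step transition matrix (rows sum to 1).
P = [[0.5, 0.3, 0.2],
     [0.1, 0.6, 0.3],
     [0.2, 0.2, 0.6]]

P4 = mat_power(P, 4)  # 4-step transition probabilities
```

Each power of a stochastic matrix is again stochastic, so every row of P4 still sums to 1.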

17 Summary
Discrete-Time Homogeneous Markov Chain: a complete statistical characterization consists of
- Q(0): the initial probability distribution
- P(1): the one-step transition probability matrix
Q^T(t) = Q^T(0)P(t), i.e., Pr[X(t) = Sj] = Σi Pr[X(0) = Si] Pr[X(t) = Sj | X(0) = Si]
Q(t) is a probability vector, i.e., Σi Pr[X(t) = Si] = 1.
The matrix P(t) is stochastic:
- All elements ≥ 0
- The sum of each row is equal to 1 (Σj Pij(t) = 1)
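The two summary facts (P(t) is stochastic, and Q^T(t) = Q^T(0)P(t) propagates the distribution forward) can be checked directly in code. A sketch with a hypothetical chain (illustrative values, not from the slides):

```python
def row_stochastic(P, tol=1e-12):
    """Check the two stochastic-matrix properties: entries are
    nonnegative and every row sums to 1."""
    return (all(p >= 0 for row in P for p in row)
            and all(abs(sum(row) - 1.0) < tol for row in P))

def propagate(q0, P, t):
    """Q^T(t) = Q^T(0) P^t: push the initial distribution t steps forward."""
    M = len(P)
    q = list(q0)
    for _ in range(t):
        q = [sum(q[i] * P[i][j] for i in range(M)) for j in range(M)]
    return q

# Hypothetical 3-state chain.
P = [[0.5, 0.3, 0.2],
     [0.1, 0.6, 0.3],
     [0.2, 0.2, 0.6]]
q0 = [1.0, 0.0, 0.0]

qt = propagate(q0, P, 5)  # Q(5): still a probability vector
```

Because each step multiplies a probability vector by a row-stochastic matrix, qt automatically remains a probability vector, matching the summary's claim about Q(t).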

