Copyright © 2005 Department of Computer Science
CPSC 641 Winter 2011

Markov Chains
Plan:
–Introduce basics of Markov models
–Define terminology for Markov chains
–Discuss properties of Markov chains
–Show examples of Markov chain analysis
  On-Off traffic model
  Markov-Modulated Poisson Process
  Erlang B blocking formula
  TCP congestion window evolution

Definition: Markov Chain
–A discrete-state Markov process
–Has a set S of discrete states, |S| > 1
–Changes randomly between states in a sequence of discrete steps
–Continuous-time process, although the states are discrete
–Very general modeling technique used for system state, occupancy, traffic, queues, ...
–Analogy: Finite State Machine (FSM) in CS
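As an illustration of the FSM analogy, here is a minimal Python sketch (not from the slides; the three state names and the transition probabilities are made up) of a discrete-state chain that changes state randomly according to its current state's transition row:

```python
# Illustrative sketch: a three-state Markov chain simulated step by step,
# analogous to a probabilistic finite state machine. States and
# probabilities below are hypothetical.
import random

states = ["Idle", "Busy", "Blocked"]          # the discrete state set S
P = {                                          # transition probabilities p_ij
    "Idle":    {"Idle": 0.6, "Busy": 0.4, "Blocked": 0.0},
    "Busy":    {"Idle": 0.3, "Busy": 0.5, "Blocked": 0.2},
    "Blocked": {"Idle": 0.0, "Busy": 0.7, "Blocked": 0.3},
}

def step(state):
    """Pick the next state using only the current state's transition row."""
    nxt = list(P[state].keys())
    weights = list(P[state].values())
    return random.choices(nxt, weights=weights, k=1)[0]

state = "Idle"
for _ in range(10):
    state = step(state)
    print(state)
```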

Some Terminology (1 of 3)
–Markov property: behaviour of a Markov process depends only on what state it is in, and not on its past history (i.e., how it got there, or when)
–A manifestation of the memoryless property, from the underlying assumption of exponential distributions
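The memoryless property mentioned here can be checked numerically: for an exponential random variable X, P(X > s + t | X > s) = P(X > t). A small sketch (the rate and time values are arbitrary):

```python
# Sketch: numerical check of the memoryless property of the exponential
# distribution, which underlies the Markov property.
import math

rate, s, t = 1.5, 2.0, 1.0
p_cond = math.exp(-rate * (s + t)) / math.exp(-rate * s)   # P(X > s+t) / P(X > s)
p_uncond = math.exp(-rate * t)                              # P(X > t)
print(abs(p_cond - p_uncond) < 1e-12)                       # True
```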

Some Terminology (2 of 3)
–The time spent in a given state on a given visit is called the sojourn time
–Sojourn times are exponentially distributed and independent
–Each state i has a parameter q_i that characterizes its sojourn behaviour
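A brief sketch of this idea (the per-state rates q_i below are hypothetical): sojourn times in state i are drawn from an exponential distribution with rate q_i, so their mean is 1/q_i.

```python
# Sketch (assumed rates, not from the slides): exponential sojourn times.
# The time spent in state i on one visit is Exponential(q_i), mean 1/q_i;
# successive sojourn times are independent.
import random

q = {"Idle": 0.5, "Busy": 2.0}            # hypothetical sojourn rates q_i

def sojourn_time(state):
    return random.expovariate(q[state])   # exponential with rate q_i

samples = [sojourn_time("Busy") for _ in range(100_000)]
print(sum(samples) / len(samples))        # close to 1/q_Busy = 0.5
```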

Some Terminology (3 of 3)
–The probability of changing from state i to state j is denoted by p_ij
–This is called the transition probability (the continuous-time analogue is the transition rate)
–Often expressed in matrix format
–These are the important parameters that characterize the system behaviour
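For example, with made-up numbers, the p_ij values can be collected into a matrix whose rows each sum to 1; raising the matrix to the n-th power gives the n-step transition probabilities.

```python
# Sketch (illustrative numbers): the transition probabilities p_ij in
# matrix format; P^n gives the n-step transition probabilities.
import numpy as np

P = np.array([[0.9, 0.1],     # p_00, p_01
              [0.5, 0.5]])    # p_10, p_11

print(P.sum(axis=1))                   # each row sums to 1
print(np.linalg.matrix_power(P, 10))   # 10-step transition probabilities
```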

Properties of Markov Chains
–Irreducibility: every state is reachable from every other state (i.e., there are no useless, redundant, or dead-end states)
–Ergodicity: a Markov chain is ergodic if it is irreducible, aperiodic, and positive recurrent (i.e., it can eventually return to a given state within finite time, and there are different path lengths for doing so)
–Stationarity: stable behaviour over time
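Irreducibility can be tested mechanically as reachability in the directed graph of nonzero transition probabilities. A small sketch with a hypothetical three-state matrix:

```python
# Sketch: checking irreducibility by reachability. A chain is irreducible
# if every state can reach every other state; here we test this with a
# breadth-first search over the graph of nonzero p_ij entries.
from collections import deque

P = [[0.0, 1.0, 0.0],       # hypothetical example matrix
     [0.5, 0.0, 0.5],
     [0.0, 1.0, 0.0]]

def reachable(P, start):
    seen, queue = {start}, deque([start])
    while queue:
        i = queue.popleft()
        for j, pij in enumerate(P[i]):
            if pij > 0 and j not in seen:
                seen.add(j)
                queue.append(j)
    return seen

n = len(P)
irreducible = all(len(reachable(P, i)) == n for i in range(n))
print(irreducible)   # True: every state reaches every other state
```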

Analysis of Markov Chains
–The analysis of Markov chains focuses on the steady-state behaviour of the system
–Called equilibrium, or long-run, behaviour as time t approaches infinity
–Well-defined state probabilities p_i (non-negative, normalized, exclusive)
–Flow balance equations can be applied
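A minimal sketch of this analysis (with an illustrative two-state matrix): solve the balance equations pi P = pi together with the normalization constraint that the state probabilities sum to 1.

```python
# Sketch (illustrative numbers): steady-state probabilities p_i from the
# balance equations pi P = pi plus the normalization sum(pi) = 1.
import numpy as np

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

n = P.shape[0]
A = np.vstack([P.T - np.eye(n), np.ones(n)])   # balance rows + normalization row
b = np.append(np.zeros(n), 1.0)
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)   # approximately [5/6, 1/6] for this matrix
```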

Examples of Markov Chains
–Traffic modeling: On-Off process
–Interrupted Poisson Process (IPP)
–Markov-Modulated Poisson Process
–Computer repair models (server farm)
–Erlang B blocking formula
–Birth-Death processes
–M/M/1 queueing analysis
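As a taste of the Erlang B example, the blocking probability can be computed with the standard recursion B(0, a) = 1 and B(c, a) = a B(c-1, a) / (c + a B(c-1, a)), where a is the offered load in Erlangs and c is the number of servers. A short sketch:

```python
# Sketch: Erlang B blocking probability via the standard recursion.
def erlang_b(c, a):
    """Blocking probability for c servers and offered load a (Erlangs)."""
    b = 1.0                       # B(0, a) = 1
    for k in range(1, c + 1):
        b = (a * b) / (k + a * b)
    return b

print(erlang_b(10, 5.0))   # blocking probability with 10 servers, 5 Erlangs offered
```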