Lecture 5

This lecture is about:
Introduction to Queuing Theory
Queuing Theory Notation (reading: Bertsekas/Gallager, Section 3.3; Kleinrock, Book I)
Basics of Markov Chains (reading: Bertsekas/Gallager, Appendix A; Kleinrock, Book I; Markov Chains by J. R. Norris)

Queuing Theory

Queuing Theory deals with systems of the following type:

[Diagram: Input Process → Server Process(es) → Output]

Typically we are interested in how much queuing occurs or in the delays at the servers.

Queuing Theory Notation

A standard notation is used in queuing theory to denote the type of system we are dealing with. Typical examples are:

M/M/1   Poisson Input / Poisson Server / 1 Server
M/G/1   Poisson Input / General Server / 1 Server
D/G/n   Deterministic Input / General Server / n Servers
E/G/∞   Erlangian Input / General Server / Infinite Servers

The first letter indicates the input process, the second letter is the server process, and the number is the number of servers. (M = Memoryless = Poisson)

The M/M/1 Queue

The simplest queue is the M/M/1 queue. Recall that a Poisson process has the following characteristic:

P{A(t + τ) − A(t) = n} = e^(−λτ) (λτ)^n / n!,   n = 0, 1, 2, ...

where A(t) is the number of events (arrivals) up to time t. Let us assume that the arrival process is Poisson with mean rate λ and the service process is Poisson with mean rate μ.
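To make the counting formula concrete, here is a minimal Python sketch. The rate λ = 0.5 arrivals per minute and the window τ = 10 minutes are illustrative values chosen for the example, not figures from the lecture:

```python
from math import exp, factorial

lam = 0.5   # illustrative arrival rate (arrivals per minute), not from the lecture
tau = 10.0  # illustrative length of the observation window (minutes)

def poisson_count_prob(n, lam, tau):
    """P{A(t + tau) - A(t) = n} for a Poisson process with rate lam."""
    return exp(-lam * tau) * (lam * tau) ** n / factorial(n)

for n in range(8):
    print(f"P{{A(t+{tau:g}) - A(t) = {n}}} = {poisson_count_prob(n, lam, tau):.4f}")

# The probabilities over all n sum to 1, and the mean number of arrivals is lam * tau = 5.
```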

Poisson Processes (a refresher)

Interarrival times are i.i.d. and exponentially distributed with parameter λ. If t_n is the time of packet n and τ_n = t_(n+1) − t_n, then:

P{τ_n ≤ s} = 1 − e^(−λs),   s ≥ 0

For every t ≥ 0 and δ ≥ 0:

P{A(t + δ) − A(t) = 0} = 1 − λδ + o(δ)
P{A(t + δ) − A(t) = 1} = λδ + o(δ)
P{A(t + δ) − A(t) ≥ 2} = o(δ)
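These facts are easy to check by simulation. The sketch below uses an illustrative rate λ = 2 and an arbitrary sample size (neither comes from the slides): it draws exponential interarrival times and compares their empirical distribution with 1 − e^(−λs):

```python
import random
import math

random.seed(1)
lam = 2.0          # illustrative rate
N = 100_000        # number of interarrival times to draw

taus = [random.expovariate(lam) for _ in range(N)]

# The empirical mean should be close to 1/lam.
print("empirical mean:", sum(taus) / N, " expected:", 1 / lam)

# Compare the empirical CDF with 1 - exp(-lam * s) at a few points.
for s in (0.1, 0.5, 1.0):
    empirical = sum(t <= s for t in taus) / N
    print(f"P{{tau <= {s}}}: empirical {empirical:.4f}  theory {1 - math.exp(-lam * s):.4f}")
```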

Poisson Processes (a refresher)

If two or more Poisson processes (A_1, A_2, ..., A_k) with different means (λ_1, λ_2, ..., λ_k) are merged, then the resultant process is Poisson with a mean given by:

λ = λ_1 + λ_2 + ... + λ_k

If a Poisson process is split into two (or more) streams by independently assigning arrivals to the streams, then the resultant processes are both Poisson. Because of the memoryless property of the Poisson process, an ideal tool for investigating this type of system is the Markov chain.
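Both properties can be illustrated with a short simulation. In this sketch the rates λ_1 = 1 and λ_2 = 3 and the splitting probability are arbitrary illustrative choices: two independent Poisson streams are merged by interleaving their arrival times, and the merged stream is then split by independent coin flips:

```python
import random

random.seed(2)
lam1, lam2 = 1.0, 3.0      # illustrative rates for the two streams
T = 10_000.0               # simulation horizon

def poisson_arrival_times(lam, horizon):
    """Arrival times of a Poisson process with rate lam on [0, horizon]."""
    times, t = [], 0.0
    while True:
        t += random.expovariate(lam)
        if t > horizon:
            return times
        times.append(t)

merged = sorted(poisson_arrival_times(lam1, T) + poisson_arrival_times(lam2, T))
print("merged rate:", len(merged) / T, " expected:", lam1 + lam2)

# Split the merged stream by assigning each arrival to stream A independently with prob p.
p = 0.25
stream_a = [t for t in merged if random.random() < p]
print("split rate: ", len(stream_a) / T, " expected:", p * (lam1 + lam2))
```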

On the Buses (a paradoxical property of Poisson processes)

You are waiting for a bus. The timetable says that buses are every 30 minutes. (But who believes bus timetables?) As a mathematician, you have observed that, in fact, the buses form a Poisson process with a mean arrival rate such that the expected time between buses is 30 minutes. You arrive at a random time at the bus stop. What is your expected wait for a bus? What is the expected time since the last bus?

The tempting answer is 15 minutes: after all, the buses are, on average, 30 minutes apart.

The correct answer is 30 minutes, for both questions. As we have said, a Poisson process is memoryless, so logically the expected waiting time must be the same whether we arrive just after the previous bus or a full hour after the previous bus.
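The paradox is easy to verify numerically. This sketch (horizon, seed and sample counts are arbitrary) generates bus arrivals with exponentially distributed gaps of mean 30 minutes, drops an observer at a uniformly random time, and measures the average wait for the next bus and the average time since the previous one:

```python
import random
import bisect

random.seed(3)
MEAN_GAP = 30.0          # mean time between buses (minutes)
HORIZON = 1_000_000.0    # long simulation horizon (minutes)

# Bus arrival times: a Poisson process with rate 1 / MEAN_GAP.
buses, t = [], 0.0
while t < HORIZON:
    t += random.expovariate(1.0 / MEAN_GAP)
    buses.append(t)

waits, ages = [], []
for _ in range(100_000):
    s = random.uniform(buses[0], buses[-2])   # observer arrives at a random time
    i = bisect.bisect_right(buses, s)
    waits.append(buses[i] - s)                # time until the next bus
    ages.append(s - buses[i - 1])             # time since the previous bus

print("mean wait for next bus:   ", sum(waits) / len(waits))   # close to 30, not 15
print("mean time since last bus: ", sum(ages) / len(ages))     # also close to 30
```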

Introduction to Markov Chains

Some process (or time series) {X_n | n = 0, 1, 2, ...} takes values in the nonnegative integers. The process is a Markov chain if, whenever it is in state i, the probability of being in state j next is p_ij. This is, of course, another way of saying that a Markov chain is memoryless. The p_ij are the transition probabilities.

Visualising Markov Chains (the confused hippy hitcher example)

[State diagram: towns A, B and C, with the transition probabilities 3/4, 1/4, 1/3, 2/3 and 1/2 labelling the arcs between them]

A hitchhiking hippy begins at A town. For some reason he has poor short-term memory and travels at random according to the probabilities shown. What is the chance he is back at A after 2 days? What about after 3 days? Where is he likely to end up?

The Hippy Hitcher (continued)

After 1 day he will be in B town with probability 3/4 or in C town with probability 1/4. The probability of returning to A via B after a further day is 3/12, and via C it is 1/8, a total of 3/8. We can perform similar calculations for 3 or 4 days, but it quickly becomes tedious, and finding which city he is most likely to end up in by this route is effectively impossible.
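For anyone who wants to check the arithmetic, here is a small NumPy sketch. The transitions out of A and B are the ones quoted above; the remaining transition out of C (the arc C → B) is assumed here to be 1/2, simply the leftover probability mass, since that entry is not spelled out in the text:

```python
import numpy as np

# Transition matrix, rows = current town (A, B, C), columns = next town.
# Rows A and B follow the probabilities quoted above; the C -> B entry of 1/2
# is an assumption (the remaining probability mass in row C).
P = np.array([
    [0.0, 3/4, 1/4],   # from A
    [1/3, 0.0, 2/3],   # from B
    [1/2, 1/2, 0.0],   # from C (C -> B assumed)
])

start = np.array([1.0, 0.0, 0.0])          # the hippy starts in A town

two_days = start @ P @ P
three_days = start @ P @ P @ P
print("P(back at A after 2 days):", two_days[0])    # 3/8, matching the slide
print("P(at A after 3 days):     ", three_days[0])
```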

Transition Matrix

Instead we can represent the transitions as a matrix P, whose (i, j) entry is the probability of moving from town i to town j.

[Matrix diagram for towns A, B and C, with entries annotated "Prob of going to B from A" and "Prob of going to A from C"]

Markov Chain Transition Basics

The p_ij are the transition probabilities of a chain. They have the following properties:

p_ij ≥ 0 for all i, j,   and   Σ_j p_ij = 1 for every i

The corresponding probability matrix is:

P = [p_ij]   (the matrix whose (i, j) entry is p_ij)

Transition Matrix

Define π_n as a distribution vector representing the probabilities of each state at time step n. We can now define one step in our chain as:

π_(n+1) = π_n P

And clearly, by iterating this, after m steps we have:

π_(n+m) = π_n P^m
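As a minimal illustration of this iteration, the sketch below uses a made-up two-state chain (not one from the lecture) and multiplies a distribution row vector by P repeatedly, then checks that the result agrees with the m-step formula:

```python
import numpy as np

# A hypothetical two-state chain, used only to illustrate the iteration.
P = np.array([
    [0.9, 0.1],
    [0.4, 0.6],
])

pi = np.array([1.0, 0.0])          # pi_0: start in state 0 with certainty
for n in range(5):
    pi = pi @ P                    # pi_{n+1} = pi_n P
    print(f"pi_{n + 1} =", pi)

# Equivalently, pi_m = pi_0 P^m in one shot:
print("pi_5 via matrix power:", np.array([1.0, 0.0]) @ np.linalg.matrix_power(P, 5))
```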

The Return of the Hippy Hitcher

What does this imply for our hippy? We know the initial state vector:

π_0 = (1, 0, 0)   (he starts in A town)

So we can calculate π_n with a little drudge work. (If you get bored raising P to the power n, then you can use a computer.) But which city is the hippy likely to end up in? We want to know:

lim_(n→∞) π_n
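Letting the computer do the drudge work for the hippy's chain (using the same matrix as before, with the C → B entry again assumed to be 1/2), we can watch π_n settle down as n grows:

```python
import numpy as np

P = np.array([
    [0.0, 3/4, 1/4],
    [1/3, 0.0, 2/3],
    [1/2, 1/2, 0.0],   # C -> B entry assumed to be 1/2
])
pi0 = np.array([1.0, 0.0, 0.0])    # he starts in A town

for n in (1, 2, 5, 20, 50):
    pi_n = pi0 @ np.linalg.matrix_power(P, n)
    print(f"pi_{n} =", np.round(pi_n, 4))

# The vectors for n = 20 and n = 50 agree to the printed precision:
# pi_n has converged, and this limit tells us where the hippy is likely to end up.
```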

Invariant (or equilibrium) probabilities

Assuming the limit exists, the distribution vector π = lim_(n→∞) π_n is known as the vector of invariant or equilibrium probabilities. We might think of them as being the proportion of the time that the system spends in each state or, alternatively, as the probability of finding the system in a given state at a particular time. They can be found by finding a distribution which solves the equation:

π = πP   (together with Σ_i π_i = 1)

We will formalise these ideas in a subsequent lecture.
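Rather than iterating, the equilibrium distribution can also be found directly by solving π = πP together with the normalisation Σ_i π_i = 1. A sketch, again using the assumed matrix for the hippy's chain:

```python
import numpy as np

P = np.array([
    [0.0, 3/4, 1/4],
    [1/3, 0.0, 2/3],
    [1/2, 1/2, 0.0],   # C -> B entry assumed to be 1/2
])

# pi P = pi  <=>  (P^T - I) pi^T = 0.  Replace one redundant equation
# by the normalisation sum(pi) = 1 and solve the resulting linear system.
n = P.shape[0]
A = np.vstack([(P.T - np.eye(n))[:-1], np.ones(n)])   # last row enforces normalisation
b = np.zeros(n)
b[-1] = 1.0
pi = np.linalg.solve(A, b)
print("equilibrium probabilities (A, B, C):", np.round(pi, 4))

# Check: pi is unchanged by one step of the chain.
print("pi P =", np.round(pi @ P, 4))
```

Under this assumed matrix, B town comes out with the largest equilibrium probability, which answers the earlier question of where the hippy is likely to end up.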

Some Notation for Markov Chains

Formally, a process X_n is a Markov chain with initial distribution λ and transition matrix P if:

1. P{X_0 = i} = λ_i (where λ_i is the i-th element of λ)
2. P{X_(n+1) = j | X_n = i, X_(n−1) = x_(n−1), ..., X_0 = x_0} = P{X_(n+1) = j | X_n = i} = p_ij

For short we say X_n is Markov(λ, P). We now introduce the notation for an n-step transition:

p_ij^(n) = P{X_(m+n) = j | X_m = i}

And note in passing that:

p_ij^(m+n) = Σ_k p_ik^(m) p_kj^(n)

This is the Chapman-Kolmogorov equation.
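In matrix form, the Chapman-Kolmogorov equation says that the (m+n)-step transition matrix is the product of the m-step and n-step matrices, P^(m+n) = P^m P^n, which is easy to sanity-check numerically (again with the assumed matrix for the hippy's chain):

```python
import numpy as np

P = np.array([
    [0.0, 3/4, 1/4],
    [1/3, 0.0, 2/3],
    [1/2, 1/2, 0.0],   # C -> B entry assumed to be 1/2
])

m, n = 3, 4
lhs = np.linalg.matrix_power(P, m + n)                              # P^(m+n)
rhs = np.linalg.matrix_power(P, m) @ np.linalg.matrix_power(P, n)   # P^m P^n
print("Chapman-Kolmogorov holds:", np.allclose(lhs, rhs))
```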