2003 Fall Queuing Theory Midterm Exam (Time limit: 2 hours)


1. (10%) For the Markov chain in Figure 1, which states are:
(a) recurrent? (2%)
(b) transient? (2%)
(c) aperiodic? (2%)
(d) periodic? (2%)
(e) Is this chain reducible? Why or why not? (2%)
[Figure 1: a five-state transition diagram; not reproduced in this transcript.]
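Figure 1 is missing, but the classification asked for in question 1 can be checked mechanically. The sketch below uses a hypothetical five-state chain (its edges are invented for illustration, not taken from the exam): on a finite chain, a state is recurrent iff its communicating class is closed, and its period is the gcd of the lengths of cycles through it.

```python
from math import gcd

# Hypothetical 5-state chain (Fig. 1 is not reproduced in this transcript);
# edges[i] lists the states reachable from i in one step (entries of P > 0).
edges = {1: [2], 2: [1, 3], 3: [4], 4: [5], 5: [4]}

def reachable(start):
    """All states reachable from `start` (including `start` itself)."""
    seen, stack = {start}, [start]
    while stack:
        for j in edges[stack.pop()]:
            if j not in seen:
                seen.add(j)
                stack.append(j)
    return seen

reach = {i: reachable(i) for i in edges}

# On a FINITE chain, i is recurrent iff every state it can reach
# can also reach it back (i.e., its communicating class is closed).
recurrent = {i for i in edges if all(i in reach[j] for j in reach[i])}
transient = set(edges) - recurrent

def period(i):
    """gcd of cycle lengths through i, via BFS levels within i's class."""
    cls = reach[i] & {j for j in edges if i in reach[j]}
    dist, frontier, g = {i: 0}, [i], 0
    while frontier:
        nxt = []
        for u in frontier:
            for v in edges[u]:
                if v in cls:
                    if v not in dist:
                        dist[v] = dist[u] + 1
                        nxt.append(v)
                    # Each in-class edge contributes dist[u]+1-dist[v] to the gcd.
                    g = gcd(g, dist[u] + 1 - dist[v])
        frontier = nxt
    return g
```

For this invented chain, states 4 and 5 form a closed class (recurrent, period 2) while 1, 2, 3 are transient; the exam's Figure 1 would be answered the same way once its edges are substituted.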

2. (6%) Identify the following systems in Figure 2 and give their complete Kendall notation (e.g., X/X/X/X/X).
(a) (3%)
(b) (3%)
[Figure 2: two queuing-system diagrams, not reproduced in this transcript; the surviving labels are "G G 100 G" for (a) and "M" for (b).]

3. (15%) The transition probability matrix of a discrete-state, discrete-time Markov chain is given by P [matrix omitted in this transcript].
(a) Find the stationary state probability vector. (5%)
(b) Find [expression omitted]. (5%)
(c) Find the general form for [expression omitted]. (5%)
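The exam's matrix P is an image that did not survive, so as a worked stand-in the sketch below uses a hypothetical 2-state chain P = [[1-a, a], [b, 1-b]], for which both the stationary vector and the general form of P^n are known in closed form (eigenvalues 1 and 1-a-b), and checks them numerically.

```python
# Hypothetical 2-state example (the exam's matrix is omitted from the
# transcript): P = [[1-a, a], [b, 1-b]] with 0 < a, b < 1.
a, b = 0.3, 0.2
P = [[1 - a, a], [b, 1 - b]]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def matpow(X, n):
    R = [[1.0, 0.0], [0.0, 1.0]]  # identity
    for _ in range(n):
        R = matmul(R, X)
    return R

# Stationary vector from pi P = pi and pi . 1 = 1:  pi = (b, a) / (a + b).
pi = (b / (a + b), a / (a + b))

def closed_form(n):
    """P^n via spectral decomposition; second eigenvalue is 1 - a - b."""
    r = (1 - a - b) ** n
    return [[pi[0] + pi[1] * r, pi[1] - pi[1] * r],
            [pi[0] - pi[0] * r, pi[1] + pi[0] * r]]
```

The same recipe answers (a) and (c) for the exam's actual matrix: solve pi P = pi with the normalization constraint, then diagonalize P to read off the general form of P^n.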

4. (14%) Given the differential-difference equations [omitted in this transcript], define the Laplace transform [definition omitted]. For the initial condition we assume [condition omitted]. Transform the differential-difference equations to obtain a set of linear difference equations.
(a) Show that the solution to the set of equations is [expression omitted]. (10%)
(b) From (a), find the result for the case [omitted]. (4%)
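The expressions in question 4 were images and are missing. If, as the phrasing suggests, they are the standard pure-birth differential-difference equations (the classic transform exercise in Kleinrock, Book I), the intended derivation would run as follows; this is an assumption about the omitted content, not a reconstruction of the exam's exact symbols.

```latex
\begin{aligned}
\frac{dP_k(t)}{dt} &= -\lambda P_k(t) + \lambda P_{k-1}(t), \quad k \ge 1,
\qquad
\frac{dP_0(t)}{dt} = -\lambda P_0(t),\\[4pt]
P_k^*(s) &= \int_0^\infty e^{-st} P_k(t)\,dt,
\qquad P_k(0) = \delta_{k0}.\\[4pt]
\text{Transforming: } \quad
s P_k^*(s) - \delta_{k0} &= -\lambda P_k^*(s) + \lambda P_{k-1}^*(s)
\;\Longrightarrow\;
P_k^*(s) = \frac{\lambda^k}{(s+\lambda)^{k+1}},\\[4pt]
\text{and inverting: } \quad
P_k(t) &= \frac{(\lambda t)^k}{k!}\, e^{-\lambda t},
\end{aligned}
```

i.e., the pure-birth process yields the Poisson distribution, which is the usual answer to part (b) of this style of question.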

5. (15%) Consider an M/M/1 system with parameters λ and μ in which customers are impatient. Specifically, upon arrival a customer estimates his queuing time and then joins the queue with probability [omitted] or leaves with probability [omitted]. The estimate is [omitted] when the new arrival finds k in the system. Assume [omitted].
(a) In terms of [omitted], find the equilibrium probabilities p_k of finding k in the system. Give an expression for p_0 in terms of the system parameters. (5%)
(b) For [omitted], under what conditions will the equilibrium solution hold? (5%)
(c) For [omitted], find p_k explicitly and find the average number in the system. (5%)
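The exact join-probability expression is missing, but any impatience rule of this kind produces a birth-death chain with state-dependent arrival rate λ·β(k), solvable by the usual product form p_k ∝ Π λβ(i)/μ. The sketch below uses the hypothetical choice β(k) = 1/(k+1) ("discouraged arrivals"), for which p_k is Poisson with mean λ/μ; the exam's actual β(k) would simply be substituted.

```python
# Generic birth-death solver: a customer who finds k in the system joins
# with probability beta(k), so the effective arrival rate in state k is
# lam * beta(k).  beta(k) = 1/(k+1) is a hypothetical illustration
# (classic "discouraged arrivals"), not the exam's omitted expression.
lam, mu = 2.0, 1.0
beta = lambda k: 1.0 / (k + 1)

N = 200  # truncation level; the tail is negligible for this beta
w = [1.0]
for k in range(1, N):
    # product-form weight: w[k] = prod_{i<k} lam*beta(i)/mu
    w.append(w[-1] * lam * beta(k - 1) / mu)
total = sum(w)
p = [x / total for x in w]          # equilibrium probabilities p_k
mean = sum(k * pk for k, pk in enumerate(p))  # average number in system
# For this beta the closed form is Poisson(lam/mu), so mean = lam/mu.
```

Note that with this particular β(k) the system is stable for every λ and μ, which is the kind of observation part (b) asks for.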

6. (15%) Consider an M/M/1 queuing system with arrival rate λ and service rate μ.
(a) What three properties would make a Markov chain ergodic? (6%)
(b) Prove that the limiting distribution exists only when λ < μ. (4%)
(c) When λ < μ, argue that the M/M/1 queuing system is ergodic. (5%)
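As a numerical companion to parts (b) and (c): when ρ = λ/μ < 1, the stationary distribution is geometric, p_k = (1-ρ)ρ^k, and it satisfies the local balance equations λ·p_k = μ·p_{k+1}. The sketch below verifies this for one choice of rates (the values 3 and 5 are illustrative, not from the exam).

```python
# M/M/1 sanity check: for rho = lam/mu < 1 the stationary distribution
# is geometric, p_k = (1 - rho) * rho**k, and satisfies local balance
# lam * p_k = mu * p_{k+1}.
lam, mu = 3.0, 5.0
rho = lam / mu
p = [(1 - rho) * rho**k for k in range(50)]

# Local balance holds term by term.
assert all(abs(lam * p[k] - mu * p[k + 1]) < 1e-12 for k in range(49))

mean = rho / (1 - rho)  # E[N] = rho / (1 - rho)
```

For ρ ≥ 1 no such normalizable solution exists, which is the substance of part (b).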

7. (25%) Consider the discrete-time birth-death chain shown in Figure 3. The death rate is p and the birth rate is 1 − p. The ratio between birth rate and death rate is σ = (1 − p)/p.
(a) Derive [expression omitted] using the notation provided above. (10%) (Hint: it is the probability that the chain starts at state i and visits state 0 before it visits state m.)
(b) Show with derivation that this system is [omitted]. (5%)
[Figure 3: state-transition diagram over states 1, 2, 3, …; not reproduced in this transcript.]
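Part (a) is a gambler's-ruin hitting probability. Assuming the chain steps down with probability p and up with probability 1 − p (σ = (1 − p)/p, as stated), the standard result for σ ≠ 1 is h_i = (σ^{m−i} − 1)/(σ^m − 1), obtained by solving the first-step recurrence h_i = p·h_{i−1} + (1 − p)·h_{i+1} with h_0 = 1 and h_m = 0. The sketch below checks that the closed form satisfies exactly this recurrence and its boundary conditions (the values of p and m are illustrative).

```python
# Gambler's-ruin sketch for question 7 (Figure 3 is not reproduced):
# from state i the chain moves down with probability p, up with 1 - p;
# sigma = (1 - p)/p is the birth/death ratio.
p, m = 0.4, 6
sigma = (1 - p) / p

def h(i):
    """P(hit 0 before m | start at i), closed form for sigma != 1."""
    return (sigma ** (m - i) - 1) / (sigma ** m - 1)

# Boundary conditions: certain ruin at 0, impossible at m.
assert abs(h(0) - 1) < 1e-12 and abs(h(m)) < 1e-12

# First-step equation h_i = p*h_{i-1} + (1-p)*h_{i+1} for 0 < i < m.
assert all(abs(h(i) - (p * h(i - 1) + (1 - p) * h(i + 1))) < 1e-12
           for i in range(1, m))
```

The derivation behind the closed form: the characteristic roots of the recurrence are 1 and p/(1 − p) = 1/σ, so h_i = A + B·σ^{−i}, and the boundary conditions fix A and B.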