CDA6530: Performance Models of Computers and Networks Chapter 3: Review of Practical Stochastic Processes.

Definition A stochastic process X = {X(t), t ∈ T} is a collection of random variables (rvs): one rv X(t) for each t ∈ T. Index set T: the set of possible values of t (here t means time). If T is countable, X is a discrete-time process; if T is the real numbers, X is a continuous-time process. State space: the set of possible values of X(t).

Counting Process A stochastic process that represents the number of events that have occurred by time t; a continuous-time, discrete-state process. {N(t), t ≥ 0} is a counting process if: N(0) = 0; N(t) ≥ 0; N(t) is non-decreasing in t; N(t) − N(s) is the number of events that happen in the interval (s, t].

Counting Process A counting process has independent increments if the numbers of events in disjoint intervals are independent: P(N1 = n1, N2 = n2) = P(N1 = n1)P(N2 = n2) if N1 and N2 count events in disjoint intervals. A counting process has stationary increments if the number of events in [t1+s, t2+s] has the same distribution as the number of events in [t1, t2] for any s > 0.

Bernoulli Process Nt: the number of successes by discrete time t = 0, 1, … is a counting process with independent and stationary increments. p: prob. of success; a Bernoulli trial happens at each discrete time step. Note: t is discrete. Nt ~ B(t, p), i.e., P(Nt = n) = C(t, n) p^n (1−p)^(t−n) for n ≤ t. E[Nt] = tp, Var[Nt] = tp(1−p).
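Since Nt is a sum of t independent Bernoulli trials, its binomial pmf, mean, and variance can be checked directly; the values t = 20, p = 0.3 below are arbitrary:

```python
from math import comb

def bernoulli_counting_pmf(t, p, n):
    """P(N_t = n) for a Bernoulli process: binomial with t trials."""
    return comb(t, n) * p**n * (1 - p)**(t - n)

t, p = 20, 0.3
pmf = [bernoulli_counting_pmf(t, p, n) for n in range(t + 1)]
mean = sum(n * q for n, q in enumerate(pmf))
var = sum((n - mean)**2 * q for n, q in enumerate(pmf))
print(mean, var)  # theory: E[N_t] = tp = 6, Var[N_t] = tp(1-p) = 4.2
```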

Bernoulli Process X: the time between successes has a geometric distribution: P(X = n) = (1−p)^(n−1) p.

Little o notation Definition: f(h) is o(h) if lim_{h→0} f(h)/h = 0. Examples: f(h) = h^2 is o(h); f(h) = h is not; f(h) = h^r with r > 1 is o(h); sin(h) is not. If f(h) and g(h) are o(h), then f(h) + g(h) = o(h). Note: h is continuous.

Example: Exponential R.V. An exponential r.v. X with parameter λ has CDF P(X < h) = 1 − e^(−λh), h > 0. For small h, P(X < h) = λh + o(h) (Taylor expansion of 1 − e^(−λh)).
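A quick numerical check of both the o(h) behavior and memorylessness (the rate λ = 2 is arbitrary):

```python
import math

lam = 2.0                               # rate of the exponential r.v.
F = lambda x: 1 - math.exp(-lam * x)    # CDF: P(X < x)

# P(X < h) = lam*h + o(h): the error divided by h vanishes as h -> 0
for h in (0.1, 0.01, 0.001):
    print(h, (F(h) - lam * h) / h)

# Memorylessness: P(X > s+t | X > s) = P(X > t)
s, t = 1.3, 0.7
lhs = (1 - F(s + t)) / (1 - F(s))
rhs = 1 - F(t)
assert abs(lhs - rhs) < 1e-12
```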

Poisson Process A counting process {N(t), t ≥ 0} with rate λ (t is continuous): N(0) = 0; independent and stationary increments; P(N(h) = 1) = λh + o(h); P(N(h) ≥ 2) = o(h). Thus, P(N(h) = 0) = 1 − λh + o(h). Notation: Pn(t) = P(N(t) = n).

Drift Equations Pn(t+Δt) = Pn(t)(1 − λΔt) + P_{n−1}(t)λΔt + o(Δt), which gives dPn(t)/dt = −λPn(t) + λP_{n−1}(t) for n ≥ 1.

For n = 0, P0(t+Δt) = P0(t)(1 − λΔt) + o(Δt). Thus dP0(t)/dt = −λP0(t), so P0(t) = e^(−λt). Thus the inter-arrival time is exponentially distributed with the same rate λ. Recall the exponential r.v.: FX(x) = 1 − e^(−λx), so P(X > t) = e^(−λt); {X > t} means that at time t there is still no arrival. X(n), the time until the n-th arrival, is an Erlang r.v. of order n (the sum of n i.i.d. exponential inter-arrival times).

N(t) has a Poisson distribution: Pn(t) = (λt)^n e^(−λt) / n!. You can think of a Poisson r.v. as the marginal distribution of a Poisson process observed at a fixed time t.

Poisson Process Take an i.i.d. sequence of exponential rvs {Xi} with rate λ. Define N(t) = max{n | Σ_{1≤i≤n} Xi ≤ t}; then {N(t)} is a Poisson process. Meaning: a Poisson process is composed of many independent arrivals with exponential inter-arrival times.
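This construction can be simulated directly (seeded for reproducibility): generate exponential inter-arrival times and count how many fit in [0, t]; the sample mean of N(t) should be close to λt:

```python
import random

def poisson_process_count(lam, t, rng):
    """N(t): number of exponential(lam) inter-arrival times fitting in [0, t]."""
    total, n = 0.0, 0
    while True:
        total += rng.expovariate(lam)
        if total > t:
            return n
        n += 1

rng = random.Random(42)
lam, t, trials = 3.0, 2.0, 20000
counts = [poisson_process_count(lam, t, rng) for _ in range(trials)]
mean = sum(counts) / trials
print(mean)  # should be close to lam * t = 6
```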

Poisson Process If N(t) is a Poisson process and one event occurs in [0, t], then the time of the event, denoted by the r.v. X, is uniformly distributed in [0, t]: f_{X|N(t)=1}(x|1) = 1/t, 0 ≤ x ≤ t. Meaning: given that an arrival happens, it is equally likely to happen at any time in the interval. The exponential distr. is memoryless; this is one reason the parameter λ is called the arrival "rate": an arrival is equally probable at any time.

Poisson Process If N1(t) and N2(t) are independent Poisson processes with rates λ1 and λ2, then N(t) = N1(t) + N2(t) is a Poisson process with rate λ = λ1 + λ2. Intuitive explanation: a Poisson process arises from many independent entities (n), each with a small chance (p) of arriving; the arrival rate is proportional to the population size, λ = np. Merging the arrivals of two large independent groups of entities still gives a Poisson process.
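The superposition property can be verified exactly: the pmf of N1(t) + N2(t) is the convolution of two Poisson pmfs, which equals the Poisson pmf with rate λ1 + λ2 (the rates below are arbitrary):

```python
import math

def poisson_pmf(mu, n):
    return math.exp(-mu) * mu**n / math.factorial(n)

lam1, lam2, t = 1.5, 2.5, 1.0
for n in range(10):
    # P(N1(t) + N2(t) = n): convolution of the two Poisson pmfs
    conv = sum(poisson_pmf(lam1 * t, k) * poisson_pmf(lam2 * t, n - k)
               for k in range(n + 1))
    merged = poisson_pmf((lam1 + lam2) * t, n)
    assert abs(conv - merged) < 1e-12
print("merged process is Poisson with rate", lam1 + lam2)
```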

Poisson Process N(t) is a Poisson process with rate λ; {Mn} is a Bernoulli process with success prob. p. Construct a new process L(t) by counting the n-th event of N(t) only when the n-th Bernoulli trial succeeds (Mn > Mn−1). Then L(t) is a Poisson process with rate λp (thinning). Useful in analysis based on random sampling.
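The thinning property can also be verified exactly by conditioning on N(t) = n and keeping each of the n events with probability p (the truncation at n = 60 makes the Poisson tail negligible; the parameters are arbitrary):

```python
import math

def poisson_pmf(mu, n):
    return math.exp(-mu) * mu**n / math.factorial(n)

def binom_pmf(n, k, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

lam, p, t = 4.0, 0.25, 1.0
for k in range(8):
    # P(L(t) = k) = sum over n >= k of P(N(t) = n) * P(k successes in n trials)
    thinned = sum(poisson_pmf(lam * t, n) * binom_pmf(n, k, p)
                  for n in range(k, 60))
    assert abs(thinned - poisson_pmf(lam * p * t, k)) < 1e-9
print("thinned process is Poisson with rate", lam * p)
```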

Example 1 A web server's failures are described by a Poisson process with rate λ = 2.4/day, i.e., the time between failures, X, is an exponential r.v. with mean E[X] = 10 hrs. P(time between failures < 1 day) = ? P(5 failures in 1 day) = ? P(N(5) < 10) = ? If we look in on the system at a random time, what is the prob. of no failures during the next 24 hours? Each failure is a memory failure with prob. 1/9 and a CPU failure with prob. 8/9, independently of other failures. What is the process governing memory failures?
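One way to work the numbers (the answers below are computed here, not taken from the slides), using the exponential CDF, the Poisson pmf, memorylessness, and thinning:

```python
import math

lam = 2.4  # failures per day

def poisson_pmf(mu, n):
    return math.exp(-mu) * mu**n / math.factorial(n)

# P(time between failures < 1 day) = 1 - e^(-lam)
p_lt_day = 1 - math.exp(-lam)

# P(exactly 5 failures in 1 day): Poisson with mean lam * 1
p_5_in_day = poisson_pmf(lam * 1, 5)

# P(N(5) < 10): fewer than 10 failures in 5 days, mean lam * 5 = 12
p_lt_10_in_5 = sum(poisson_pmf(lam * 5, n) for n in range(10))

# Memorylessness: looking in at a random time changes nothing, so
# P(no failures in the next 24 hours) = e^(-lam)
p_none_next_day = math.exp(-lam)

# Thinning: memory failures form a Poisson process with rate lam/9
memory_rate = lam / 9

print(p_lt_day, p_5_in_day, p_lt_10_in_5, p_none_next_day, memory_rate)
```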

Example 2 The arrival of claims at an insurance company follows a Poisson process; on average the company gets 100 claims per week. Each claim size follows an exponential distribution with mean $700.00. The company offers two types of policies: the first has no deductible and the second has a $250.00 deductible. If the claim sizes and policy types are independent of each other and of the number of claims, and twice as many policy holders have deductibles as not, what is the mean liability of the company in any 13-week period? First, the claims can be split (thinned) into two Poisson arrival processes: X, the no-deductible claims, and Y, the deductible claims. Second, what is the formula for the liability?
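A sketch of the computation (worked here, not from the slides), using the tail formula E[max(X−d, 0)] = m·e^(−d/m) for an exponential claim with mean m, and the 2:1 split between deductible and no-deductible policies:

```python
import math

weeks = 13
total_rate = 100.0          # claims per week
mean_claim = 700.0          # exponential mean claim size ($)
deductible = 250.0

# Thinning: 2/3 of claims carry the deductible, 1/3 do not
rate_no_ded = total_rate / 3
rate_ded = 2 * total_rate / 3

# For X ~ Exp(mean m): E[max(X - d, 0)] = m * exp(-d/m)
payout_no_ded = mean_claim
payout_ded = mean_claim * math.exp(-deductible / mean_claim)

# Compound Poisson mean over the period: rate * time * mean payout per claim
mean_liability = weeks * (rate_no_ded * payout_no_ded
                          + rate_ded * payout_ded)
print(round(mean_liability, 2))
```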

Birth-Death Process A continuous-time, discrete-space stochastic process {N(t), t ≥ 0}, N(t) ∈ {0, 1, ...}; N(t) is the population at time t. P(N(t+h) = n+1 | N(t) = n) = λn h + o(h); P(N(t+h) = n−1 | N(t) = n) = μn h + o(h); P(N(t+h) = n | N(t) = n) = 1 − (λn + μn)h + o(h). λn: birth rates; μn: death rates, μ0 = 0. Q: what is Pn(t) = P(N(t) = n), n = 0, 1, ...?

Birth-Death Process The drift equation is similar to the Poisson process one: dPn(t)/dt = λ_{n−1}P_{n−1}(t) + μ_{n+1}P_{n+1}(t) − (λn + μn)Pn(t). If μi = 0 and λi = λ for all i, the B-D process reduces to a Poisson process. Initial condition: Pn(0).

Stationary Behavior of B-D Process Most real systems reach equilibrium as t → ∞: no change in Pn(t) as t changes, and no dependence on the initial condition. Pn = lim_{t→∞} Pn(t). The drift equation becomes: 0 = λ_{n−1}P_{n−1} + μ_{n+1}P_{n+1} − (λn + μn)Pn.

Transition State Diagram Balance equations: rate of transitions into n = rate of transitions out of n; rate of transitions to the left = rate of transitions to the right across any cut: λn Pn = μ_{n+1} P_{n+1}.

Probability requirement: Σ_n Pn = 1.
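A minimal sketch of solving the balance equations numerically: start from P0 = 1, apply P_{n+1} = (λn/μ_{n+1})Pn, then normalize. With constant rates λ < μ (an M/M/1-type system, truncated at a large nmax) the result is the geometric distribution Pn = (1−ρ)ρ^n with ρ = λ/μ:

```python
def bd_steady_state(lam, mu, nmax):
    """Solve the birth-death balance equations lam[n]*P[n] = mu[n+1]*P[n+1],
    truncated at nmax, then normalize so the P[n] sum to 1."""
    p = [1.0]
    for n in range(nmax):
        p.append(p[-1] * lam[n] / mu[n + 1])
    total = sum(p)
    return [x / total for x in p]

# Constant rates with lam < mu: steady state is geometric, P[n] = (1-rho)*rho^n
lam_rate, mu_rate, nmax = 2.0, 5.0, 200
P = bd_steady_state([lam_rate] * nmax, [0.0] + [mu_rate] * nmax, nmax)
rho = lam_rate / mu_rate
print(P[0], 1 - rho)   # both should be 0.6
```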

Markov Process The prob. of the future state depends only on the present state. {X(t), t ≥ 0} is a MP if for any times t1 < … < t_{n+1} and any states x1, …, x_{n+1}: P(X(t_{n+1}) = x_{n+1} | X(t1) = x1, …, X(tn) = xn) = P(X(t_{n+1}) = x_{n+1} | X(tn) = xn). B-D processes and Poisson processes are MPs.

Markov Chain A discrete-state MP is called a Markov Chain (MC): discrete-time MC or continuous-time MC. First, consider the discrete-time MC. Define the transition prob. matrix P = [pij], where pij = P(X_{n+1} = j | X_n = i).

Chapman-Kolmogorov Equation What is the state after n transitions? A: define the n-step transition probabilities p_ij^(n) = P(X_{m+n} = j | X_m = i). The C-K equation: p_ij^(m+n) = Σ_k p_ik^(m) p_kj^(n), because the chain must pass through some intermediate state k after the first m steps.

If the MC has a finite number of states, define the n-step transition prob. matrix P(n) = [p_ij^(n)]. The C-K equation means: P(m+n) = P(m)P(n), and hence P(n) = P^n.
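The identity P(m+n) = P(m)P(n) can be checked on a small example (the matrix values below are illustrative, not from the slides):

```python
import numpy as np

# A small transition matrix (rows sum to 1); values are illustrative
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Chapman-Kolmogorov: P(m+n) = P(m) @ P(n)
P2 = P @ P
P3 = P @ P2
assert np.allclose(np.linalg.matrix_power(P, 5), P2 @ P3)

# Row i of P^n is the distribution after n steps starting from state i;
# for an ergodic chain the rows converge to the stationary distribution
print(np.linalg.matrix_power(P, 50))
```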

Markov Chain Irreducible MC: every state can be reached from every other state. Periodic MC: a state i has period k if any return to state i occurs in a multiple of k steps; if k = 1 the state is called aperiodic. An MC is aperiodic if all its states are aperiodic.

An irreducible, aperiodic, finite-state MC is ergodic: it has a stationary (steady-state) prob. distr. π satisfying π = πP, Σi πi = 1.

Example Markov on-off model (or 0-1 model). Q: what is the steady-state prob.?

An Alternative Calculation Use the balance equation: rate of transitions to the left = rate of transitions to the right.
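The slide's transition diagram is not in the transcript, so assume hypothetical transition probabilities P(0→1) = p and P(1→0) = q; the balance equation pπ0 = qπ1 together with π0 + π1 = 1 then gives π0 = q/(p+q), π1 = p/(p+q):

```python
# On-off chain with assumed transition probs (the diagram is lost):
# P(0->1) = p, P(1->0) = q
p, q = 0.2, 0.3

# Balance: pi0 * p = pi1 * q, with pi0 + pi1 = 1
pi0 = q / (p + q)
pi1 = p / (p + q)
print(pi0, pi1)  # 0.6 0.4

# Cross-check against the full transition matrix: pi = pi P
P = [[1 - p, p], [q, 1 - q]]
assert abs(pi0 * P[0][0] + pi1 * P[1][0] - pi0) < 1e-12
```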

Discrete-Time MC State Staying Time Xi: the number of time steps an MC stays in the same state i. P(Xi = k) = P_ii^(k−1)(1 − P_ii), so Xi follows a geometric distribution; the average staying time is 1/(1 − P_ii). In a continuous-time MC, the staying time follows an exponential distribution.

Homogeneous Continuous-Time Markov Chain P(X(t+h) = j | X(t) = i) = λij h + o(h). The state holding time is exponentially distributed with rate νi = Σ_{j≠i} λij. Why? Because the minimum of independent exponential r.v.s is still exponential, with rate equal to the sum of the individual rates.

Steady-State For an ergodic continuous-time MC, define πi = P(X = i). Consider the state transition diagram: the rate of transit out of state i equals the rate of transit into state i: πi Σ_{j≠i} λij = Σ_{j≠i} πj λji.

Infinitesimal Generator Define Q = [qij], where qij = λij for i ≠ j and qii = −Σ_{j≠i} λij. Q is called the infinitesimal generator. The balance equations can then be written compactly as πQ = 0.
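A sketch of computing the steady state from πQ = 0 together with Σi πi = 1 (the generator below is illustrative, not from the slides):

```python
import numpy as np

# Infinitesimal generator for a 3-state CTMC (rates are illustrative):
# off-diagonal q_ij = lambda_ij, diagonal q_ii = -(sum of the row's rates)
Q = np.array([[-3.0,  2.0,  1.0],
              [ 1.0, -4.0,  3.0],
              [ 2.0,  2.0, -4.0]])

# Steady state: solve pi Q = 0 with sum(pi) = 1 as a least-squares system
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)
assert np.allclose(pi @ Q, 0, atol=1e-10)
```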

Discrete vs. Continuous MC Discrete: jumps at time ticks; staying time: geometric distr.; transition matrix P; steady state: π = πP; state transition diagram has self-jump loops, with probabilities on the arcs. Continuous: jumps at continuous t; staying time: exponential distr.; infinitesimal generator Q; steady state: πQ = 0; state transition diagram has no self-jump loops, with transition rates on the arcs.

Semi-Markov Process X(t): discrete state, continuous time. State jumps follow a Markov Chain Zn; the holding time in state i follows some distribution Y(i). If every Y(i) is exponentially distributed, X(t) is a continuous-time Markov Chain.

Steady State Let π′j = lim_{n→∞} P(Zn = j) and πj = lim_{t→∞} P(X(t) = j). Then πj = π′j E[Y(j)] / Σi π′i E[Y(i)]. Why? The long-run fraction of time spent in state j weights the jump-chain probability π′j by the mean holding time E[Y(j)].
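A sketch of the weighting formula, with an illustrative jump chain and mean holding times (neither is from the slides):

```python
import numpy as np

# Jump chain Z_n (embedded MC) with an illustrative transition matrix
Z = np.array([[0.0, 1.0],
              [0.6, 0.4]])

# Stationary distr. of the jump chain: pi' = pi' Z, sum(pi') = 1
A = np.vstack([Z.T - np.eye(2), np.ones(2)])
pi_jump, *_ = np.linalg.lstsq(A, np.array([0.0, 0.0, 1.0]), rcond=None)

# Mean holding times E[Y(i)] in each state (only the means matter
# for the time-stationary probabilities)
hold = np.array([2.0, 0.5])

# pi_j = pi'_j E[Y(j)] / sum_i pi'_i E[Y(i)]
pi_time = pi_jump * hold / (pi_jump * hold).sum()
print(pi_time)
```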