References for M/G/∞ Input Process


Internet Analysis - Performance Models
G.U. Hwang
Next Generation Communication Networks Lab., Division of Applied Mathematics, KAIST

References for M/G/∞ Input Process
- M. Krunz and A. Makowski, "Modeling video traffic using M/G/∞ input processes," IEEE JSAC, vol. 16, pp. 733-748, 1998.
- K. Park and W. Willinger (Eds.), Self-Similar Network Traffic and Performance Evaluation, John Wiley & Sons, 2000.
- B. Tsybakov and N. D. Georganas, "Overflow and losses in a network queue with a self-similar input," Queueing Systems, vol. 35, pp. 201-235, 2000.
- M. Zukerman, T. D. Neame and R. G. Addie, "Internet traffic modeling and future technology implications," INFOCOM 2003, pp. 587-596.

The M/G/∞ arrival model. Consider a discrete-time system with an infinite number of servers. During time slot [n, n+1), we have Poisson arrivals with rate λ, and each arrival requires a service time X distributed according to a p.m.f. s_k, k ≥ 1, where E[X] < ∞. c.f. a customer arriving at the M/G/∞ system can be considered as a burst. When there are b_n busy servers at the beginning of slot [n, n+1), the number of packets generated in that slot is b_n. c.f. each burst generates packets throughout its holding time. We assume the system is in steady state.
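As a concrete illustration, the arrival model above can be simulated slot by slot; the sketch below tracks the busy-server count b_n (function names are mine, not from the slides).

```python
import math
import random

def poisson_sample(rng, lam):
    # Knuth's method: count uniforms until their product drops below e^{-lam}
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p > L:
        k += 1
        p *= rng.random()
    return k - 1

def simulate_mginf_input(lam, service_pmf, n_slots, seed=0):
    """Simulate b_n for the discrete-time M/G/infinity input process.

    Each slot, Poisson(lam) bursts arrive; each burst holds a server for
    X slots, with X drawn from service_pmf (a dict {value: probability}).
    Returns the list [b_0, ..., b_{n_slots-1}] of busy-server counts.
    """
    rng = random.Random(seed)
    values = list(service_pmf)
    weights = [service_pmf[v] for v in values]
    active = []   # remaining holding times of bursts currently in the system
    b = []
    for _ in range(n_slots):
        n_new = poisson_sample(rng, lam)
        active += rng.choices(values, weights=weights, k=n_new)
        b.append(len(active))                        # busy servers this slot
        active = [x - 1 for x in active if x > 1]    # age each burst by one slot
    return b
```

In steady state b_n should be Poisson with mean λE[X], which gives a quick sanity check on the simulation.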

The {b_n} process (figure: burst arrivals and their holding times).

The process {b_n} of the M/G/∞ arrivals. Let Y_k denote a Poisson random variable with parameter λP{X ≥ k}, counting the bursts that arrived in slot [n−k, n−k+1) and are still in the system in slot [n, n+1). Then
b_n = Σ_{k=1}^{∞} Y_k, a Poisson r.v. with parameter λE[X].
(figure: time slots n−5 through n+4)

The stationary version of {b_n}_{n≥0}. b_0, the initial number of bursts, is a Poisson r.v. with parameter λE[X]; the length of each initial burst is distributed according to the forward recurrence time X_r of X. (figure: the forward recurrence time)
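In discrete time the forward recurrence time has the explicit p.m.f. P{X_r = k} = P{X ≥ k}/E[X]; a small helper (name mine) computes it from a service-time p.m.f.:

```python
def forward_recurrence_pmf(pmf):
    """P{X_r = k} = P{X >= k} / E[X] for an integer-valued p.m.f. {value: prob}."""
    mean = sum(x * p for x, p in pmf.items())
    max_x = max(pmf)
    tail = lambda k: sum(p for x, p in pmf.items() if x >= k)
    return {k: tail(k) / mean for k in range(1, max_x + 1)}
```

For example, if X is 1 or 3 with equal probability, then E[X] = 2 and X_r puts mass 1/2, 1/4, 1/4 on 1, 2, 3.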

The autocovariance function of {b_n}. Let γ(k) = Cov(b_n, b_{n+k}). For the stationary process,
γ(k) = λ Σ_{m=k+1}^{∞} P{X ≥ m} = λ E[(X − k)^+].
The autocorrelation function of {b_n} is
ρ(k) = γ(k)/γ(0) = E[(X − k)^+]/E[X].
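With integer-valued service times the autocovariance λE[(X − k)^+] reduces to a finite sum; a sketch (function names mine):

```python
def mginf_autocov(lam, pmf, k):
    """Cov(b_n, b_{n+k}) = lam * E[(X - k)^+] for the M/G/infinity input."""
    return lam * sum(p * max(x - k, 0) for x, p in pmf.items())

def mginf_autocorr(pmf, k):
    """rho(k) = E[(X - k)^+] / E[X]; note it does not depend on lam."""
    mean = sum(x * p for x, p in pmf.items())
    return sum(p * max(x - k, 0) for x, p in pmf.items()) / mean
```

At lag 0 this recovers Var[b_n] = λE[X], the variance of the Poisson marginal.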

The M/G/∞ arrival model is:
long-range dependent if E[X²] = ∞;
short-range dependent if E[X²] < ∞.

A Pareto distribution. A random variable Y is said to have a Pareto distribution if its distribution function is given by
F(y) = 1 − (δ/y)^γ, y ≥ δ,
where 0 < γ < 2 is the shape parameter and δ (> 0) is called the location parameter. Remarks: If 0 < γ < 2, then Y has infinite variance. If 0 < γ ≤ 1, then Y also has infinite mean.

The expectation of the Pareto distribution: E[Y] = γδ/(γ − 1) for γ > 1.
The forward recurrence time Y_r of the Pareto distribution has tail
P{Y_r > y} = (1/γ)(δ/y)^{γ−1} for y ≥ δ,
so Y_r is again heavy-tailed, with tail index γ − 1.
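The Pareto distribution above can be sampled by inverse transform, y = δ·u^{−1/γ} with u uniform on (0, 1]; a sketch (names mine). Because the median δ·2^{1/γ} is finite for every γ, it is a more robust empirical check than the possibly infinite-variance sample mean.

```python
import random

def pareto_sample(rng, gamma, delta):
    # Inverse transform: F(y) = 1 - (delta/y)^gamma  =>  y = delta * u^(-1/gamma)
    u = 1.0 - rng.random()   # uniform on (0, 1], avoids u = 0
    return delta * u ** (-1.0 / gamma)

def pareto_mean(gamma, delta):
    # E[Y] = gamma * delta / (gamma - 1) for gamma > 1, infinite otherwise
    return gamma * delta / (gamma - 1) if gamma > 1 else float("inf")
```

With γ = 1.5 and δ = 1 the mean is 3 and the median is 2^{2/3} ≈ 1.587.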

The M/Pareto arrival process. When the service times are Pareto distributed as above, we obtain the M/Pareto input process (also called the Poisson Pareto Burst Process). Now let A(t) be the total amount of work arriving in the period (0, t]. We assume that each burst in the system generates r bits per slot.

The mean and variance of A(t): E[A(t)] = λ r E[X] t, and Var[A(t)] grows like t^{3−γ} for large t. If we define H = (3 − γ)/2 with 1 < γ < 2, then the M/Pareto input process is asymptotically self-similar with Hurst parameter H. c.f. Var[Y_t] = t^{2H} Var[Y_1] for a self-similar process Y_t.

A sample path of the M/Pareto arrivals: λ = 0.4, γ = 1.18, H = 0.9153 (figure).

The autocorrelation function (figure).

c.f. M/G/∞ for S.R.D. Krunz and Makowski, "Modeling video traffic using M/G/∞ input processes," IEEE JSAC, vol. 16, pp. 733-748, 1998. The M/G/∞ input process is used to model video traffic encoded by DCT.

Fractional Brownian Motion. Consider a self-similar process Y_t with wide-sense stationary increments X_n. Recall that
r(t, s) = E[Y_t Y_s] = (1/2)(|t|^{2H} + |s|^{2H} − |t − s|^{2H}).
For 0 < H ≤ 1, we can show that the function r(t, s) is nonnegative definite, i.e., for any real numbers t_1, …, t_n and u_1, …, u_n,
Σ_{i=1}^{n} Σ_{j=1}^{n} r(t_i, t_j) u_i u_j ≥ 0.
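The nonnegative-definiteness claim can be spot-checked numerically by evaluating the displayed quadratic form for random vectors u (an illustration, not a proof; function names are mine):

```python
import random

def fbm_r(t, s, H):
    # r(t, s) = 0.5 * (|t|^{2H} + |s|^{2H} - |t - s|^{2H})
    return 0.5 * (abs(t) ** (2 * H) + abs(s) ** (2 * H) - abs(t - s) ** (2 * H))

def quad_form(times, u, H):
    # sum_{i,j} r(t_i, t_j) u_i u_j, which should be >= 0 for 0 < H <= 1
    return sum(fbm_r(ti, tj, H) * ui * uj
               for ti, ui in zip(times, u)
               for tj, uj in zip(times, u))
```

For H = 1 the check is exact: r(t, s) = ts, so the quadratic form collapses to (Σ_i t_i u_i)² ≥ 0.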

Definition of a joint normal distribution. The vector X = (X_1, …, X_k) is said to have a joint normal distribution N(0, Σ) if its joint characteristic function is given by
φ(u) = E[exp(i uᵀX)] = exp(−(1/2) uᵀ Σ u),
where E[X_i] = 0 for all 1 ≤ i ≤ k and Σ = (σ_{mn}) is the covariance matrix defined by σ_{mn} = E[X_m X_n] for 1 ≤ m, n ≤ k.

Definition of a Gaussian process. A stochastic process Y_t is Gaussian if every finite collection {Y_{t_1}, Y_{t_2}, …, Y_{t_n}} has a joint normal distribution for all n. From classical probability theory, there exists a Gaussian process whose finite-dimensional distributions are joint normal distributions N(0, Σ) with Σ = (r(t_i, t_j)).

A self-similar Gaussian process Y_t with stationary increments X_n and 0 < H < 1 is called a fractional Brownian motion (fBm). If E[Y_t] = 0 and E[Y_t²] = σ² |t|^{2H} for some σ > 0 for a Gaussian process, then we get
E[Y_t Y_s] = (σ²/2)(|t|^{2H} + |s|^{2H} − |t − s|^{2H}).

Theorem. Suppose that a stochastic process Y_t is a Gaussian process with zero mean, Y_0 = 0, E[Y_t²] = σ² |t|^{2H} for some σ > 0 and 0 < H < 1, and has stationary increments; then {Y_t} is a fractional Brownian motion. c.f. The self-similarity follows because Y_{at} and a^H Y_t are both zero-mean Gaussian processes with the same covariance function, hence equal in distribution.
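Since fBm on a finite grid is just a zero-mean Gaussian vector with the covariance above, exact sample paths can be generated by a Cholesky factorization of the covariance matrix (an O(n³) sketch with names of my own choosing; faster circulant-embedding methods exist):

```python
import math
import random

def fbm_cov(t, s, H, sigma=1.0):
    # E[Y_t Y_s] = (sigma^2/2)(|t|^{2H} + |s|^{2H} - |t-s|^{2H})
    return 0.5 * sigma**2 * (abs(t)**(2*H) + abs(s)**(2*H) - abs(t - s)**(2*H))

def cholesky(A):
    # Plain Cholesky factorization A = L L^T for a positive-definite matrix
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(A[i][i] - s)
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]
    return L

def fbm_path(times, H, sigma=1.0, rng=None):
    """Exact fBm samples (Y_{t_1}, ..., Y_{t_n}) on a grid of distinct t > 0."""
    rng = rng or random.Random(0)
    R = [[fbm_cov(t, s, H, sigma) for s in times] for t in times]
    L = cholesky(R)
    z = [rng.gauss(0.0, 1.0) for _ in times]
    return [sum(L[i][k] * z[k] for k in range(i + 1)) for i in range(len(times))]
```

A quick check of the construction: over many paths, the sample second moment of Y_1 should be close to σ² · 1^{2H} = 1.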

c.f. The fractional Gaussian noise. The increment process of fractional Brownian motion with Hurst parameter H is called fractional Gaussian noise (fGn) with Hurst parameter H.

Consider a queueing system with input process A_t = λt + σY_t, where Y_t is a normalized fBm, i.e., E[Y_1²] = 1. Then the queue content process q(t) is given by
q(t) = sup_{s ≤ t} [A(t) − A(s) − C(t − s)],
where C is the output link capacity. Assume that q = lim_{t→∞} q(t) exists.
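In discrete time the queue content satisfies the equivalent Lindley recursion q_{n+1} = max(q_n + a_n − C, 0), where a_n is the work arriving in slot n; a sketch (names mine):

```python
def queue_contents(arrivals, C, q0=0.0):
    """Lindley recursion: q_{n+1} = max(q_n + a_n - C, 0)."""
    q, path = q0, []
    for a in arrivals:
        q = max(q + a - C, 0.0)
        path.append(q)
    return path
```

Feeding this recursion with any simulated arrival sequence (e.g. the M/G/∞ or M/Pareto input) gives the queue sample path directly.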

A lower bound for the queue length. Since Y_t has stationary increments, we get, for every fixed t ≥ 0,
P{q > x} ≥ P{A(t) − Ct > x}.

Hence, from the fact that Y_t ~ N(0, t^{2H}), we get
P{q > x} ≥ sup_{t ≥ 0} [1 − Φ((x + (C − λ)t) / (σ t^H))],
where Φ(x) denotes the distribution function of a standard normal r.v.
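The bound can be evaluated numerically by maximizing over t on a grid. The sketch below assumes the input process A_t = λt + σY_t with Y_t a normalized fBm (my reading of the garbled slide; parameter names are mine):

```python
from math import erf, sqrt

def Phi(z):
    # Standard normal c.d.f. via the error function
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def queue_tail_lower_bound(x, lam, C, sigma, H, t_max=200.0, steps=20000):
    """sup_{t>0} P{A_t - C*t > x} with A_t = lam*t + sigma*Y_t, Y_t ~ N(0, t^{2H})."""
    best = 0.0
    for i in range(1, steps + 1):
        t = t_max * i / steps
        z = (x + (C - lam) * t) / (sigma * t ** H)
        best = max(best, 1.0 - Phi(z))
    return best
```

For H > 1/2 the maximizing t grows with x, which is what produces the heavier-than-exponential (Weibull-like) queue tail under LRD input.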


The superposition of ON/OFF sources. Consider an ON/OFF source with the following properties: the ON periods follow a heavy-tailed distribution; the OFF periods are either heavy-tailed or light-tailed with finite variance. The superposition of N such ON/OFF sources is shown to behave like fractional Brownian motion when N is sufficiently large.

Traffic model in the backbone. T. Karagiannis et al., "A nonstationary Poisson view of Internet traffic," INFOCOM 2004, pp. 1558-1569. Traffic appears Poisson at sub-second time scales.

The complementary distribution function of the packet interarrival times closely follows an exponential distribution (figure).

Traffic follows a non-stationary Poisson process at multi-second time scales (figure: points of rate change, the relative magnitude of the change in slope, and change-free regions).

The change of Hurst parameters. Hurst parameters are estimated over time intervals of length 20 sec (figure). The reasons for the changes: self-similarity of the original traffic; changes in routing; changes in the number of active sources.

Autocorrelation of the magnitude of rate changes (i.e., the height of the spikes in Fig. 7): a negative correlation at lag 1, with most other lags inside the 95% confidence interval around 0 (figure).

The complementary distribution function of the lengths of the change-free intervals (the stationary intervals) follows an exponential distribution (figure). A Markovian random walk model would therefore be a good candidate.

Traffic appears LRD at large time scales (figure: original ACF vs. ACF using moving averages).

Summary. Due to the high variability of Internet traffic, it is very difficult to construct good mathematical models and, additionally, to estimate the traffic parameters. Continuous traffic measurements should therefore be made, so that performance models reflect changes in Internet traffic characteristics.