Presentation on theme: "T.C ATILIM UNIVERSITY MODES ADVANCED SYSTEM SIMULATION MODES 650."— Presentation transcript:


2 A COMPREHENSIVE REVIEW OF METHODS FOR SIMULATION OUTPUT ANALYSIS
Christos Alexopoulos, School of Industrial and Systems Engineering, Georgia Institute of Technology, Atlanta, Georgia 30332-0205, U.S.A.
Presented by: Adel Agila

3 Introduction
Simulation output analysis:
– Point estimator and confidence interval
– Variance estimation (σ²) → confidence interval
Independent and identically distributed (IID) data:
– Suppose X_1, …, X_m are IID

4 The Methods
– The Replication/Deletion Approach
– The Regenerative Method
– The Batch Means Method: Nonoverlapping Batch Means (NBM), Overlapping Batch Means (OBM)
– Consistent Batch Means Estimation Methods
– The Standardized Time Series Method (STS): the Weighted Area Estimator, Batched Area Estimators

5 Types of simulations with regard to output analysis:
– Finite-horizon simulations: the simulation starts in a specific state and is run until some terminating event occurs (e.g., a bank that opens and closes each day). The output process is not expected to achieve any steady-state behavior.
– Steady-state simulations: the purpose is the study of the long-run behavior of the system of interest (e.g., a hospital that operates continuously).

6 Finite-horizon simulations
n output data X_1, X_2, ..., X_n are collected with the objective of estimating μ = E[X̄_n], where X̄_n is the sample mean of the data. X_i can be, for example, the transit time of unit i through a network, or the total time a station is busy during the ith hour. Then X̄_n is an unbiased estimator of μ. However, the X_i are generally dependent random variables.

7 Finite-horizon simulations
Let S² = (1/(n−1)) Σ_{i=1}^{n} (X_i − X̄_n)² be the sample variance of the data. Then the estimator S²/n is a biased estimator of Var(X̄_n). If the X_i are positively correlated, E[S²/n] < Var(X̄_n), so confidence intervals based on S²/n are too narrow. To overcome the problem, run k independent replications.

8 Replications
Let X_ij be the jth observation in replication i. The replicate averages Y_i = (1/n) Σ_{j=1}^{n} X_ij are IID random variables. Their mean Ȳ = (1/k) Σ_{i=1}^{k} Y_i is an unbiased estimator of μ, and their sample variance S_Y² = (1/(k−1)) Σ_{i=1}^{k} (Y_i − Ȳ)² is an unbiased estimator of Var(Y_i). An approximate 1−α CI for μ (0 < α < 1) is Ȳ ± t_{k−1,1−α/2} √(S_Y²/k), where t_{k−1,1−α/2} is the upper 1−α/2 critical point of the t distribution with k−1 degrees of freedom.
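The replication CI above can be sketched as follows. The toy model is hypothetical (each replicate average is the mean of 100 exponential(1) service times, so the true μ is 1.0), and the critical value t_{9,0.975} ≈ 2.262 is hard-coded rather than computed:

```python
import math
import random
import statistics

def replication_ci(replicate_means, t_crit):
    """Approximate CI for mu: Ybar +/- t_{k-1,1-a/2} * sqrt(S^2 / k)."""
    k = len(replicate_means)
    ybar = statistics.fmean(replicate_means)
    s2 = statistics.variance(replicate_means)   # unbiased: divides by k-1
    half = t_crit * math.sqrt(s2 / k)
    return ybar - half, ybar + half

# Hypothetical toy model: k = 10 replicate averages, each the mean of
# 100 exponential(1) service times (true mu = 1.0).
random.seed(1)
reps = [statistics.fmean(random.expovariate(1.0) for _ in range(100))
        for _ in range(10)]
lo, hi = replication_ci(reps, t_crit=2.262)     # t_{9, 0.975} ~ 2.262
print(lo, hi)
```

Because the replicate averages are IID, the usual t-based interval applies even though observations within a replication are dependent.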

9 STEADY-STATE ANALYSIS
Stationary process: the process X = {X_i} is called stationary if the joint distribution of (X_{i+j_1}, X_{i+j_2}, ..., X_{i+j_k}) is independent of i for all indices j_1, j_2, ..., j_k and all k ≥ 1.
Weakly stationary process: if E(X_i) = μ and Var(X_i) ≡ σ_X² < ∞ for all i, and Cov(X_i, X_{i+j}) is independent of i, then X is called weakly stationary. (Below, σ² denotes the (asymptotic) variance parameter of X.)

10 Stationary Process
[Figures: sample paths of (a) a stationary time series with positive autocorrelation, (b) a stationary time series with negative autocorrelation, and (c) a nonstationary time series with an upward trend.]
The stochastic process X is stationary if, for all t_1, …, t_k ∈ T and all shifts s, the vector (X_{t_1+s}, …, X_{t_k+s}) has the same joint distribution as (X_{t_1}, …, X_{t_k}).

11 Stationary Process
For a discrete-time stationary process X = {X_i : i ≥ 1} with mean μ and variance σ_X² = Cov(X_i, X_i), the variance of the sample mean is Var(X̄_n) = (1/n) Σ_{j=−(n−1)}^{n−1} (1 − |j|/n) R_j, where R_j = Cov(X_i, X_{i+j}). The (asymptotic) variance parameter is then σ² = lim_{n→∞} n Var(X̄_n). For IID data, σ² = σ_X².

12 Stationary Process
The expected value of the variance estimator S²/n behaves as follows:
– If the X_i are independent, then S²/n is an unbiased estimator of Var(X̄_n).
– If the autocorrelation is positive, then S²/n is biased low as an estimator of Var(X̄_n).
– If the autocorrelation is negative, then S²/n is biased high as an estimator of Var(X̄_n).
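The downward bias under positive autocorrelation is easy to see empirically. The sketch below uses a hypothetical AR(1) process (φ = 0.8, so the X_i are positively correlated) and compares the average of S²/n against the empirical variance of the sample mean across many runs:

```python
import random
import statistics

# Hypothetical AR(1) demo: with positive autocorrelation, S^2/n is
# biased low as an estimator of Var(Xbar_n).
def ar1_path(n, phi=0.8, seed=0):
    rng = random.Random(seed)
    x, out = 0.0, []
    for _ in range(n):
        x = phi * x + rng.gauss(0, 1)
        out.append(x)
    return out

n, runs = 200, 500
means, naive = [], []
for r in range(runs):
    xs = ar1_path(n, seed=r)
    means.append(statistics.fmean(xs))
    naive.append(statistics.variance(xs) / n)   # S^2/n, ignores correlation

true_var = statistics.variance(means)           # empirical Var(Xbar) over runs
print(statistics.fmean(naive), true_var)        # first number is much smaller
```

For this φ the asymptotic inflation factor is (1+φ)/(1−φ) = 9, so the naive estimator understates Var(X̄_n) severely.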

13 Stationary Process
Cov(X, Y) = Σ_i (X_i − X̄)(Y_i − Ȳ) / N = Σ_i x_i y_i / N, where
– N is the number of scores in each set of data;
– X̄ (Ȳ) is the mean of the N scores in the first (second) data set;
– X_i (Y_i) is the ith raw score in the first (second) set of scores;
– x_i = X_i − X̄ and y_i = Y_i − Ȳ are the ith deviation scores;
– Cov(X, Y) is the covariance of corresponding scores in the two sets of data.
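The covariance formula above, as a minimal sketch (population form, dividing by N as on the slide):

```python
def covariance(xs, ys):
    """Population covariance: sum of products of deviation scores over N."""
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    # Sum (X_i - Xbar)(Y_i - Ybar) / N == sum x_i * y_i / N
    return sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / n

print(covariance([1, 2, 3], [2, 4, 6]))   # deviations (-1,0,1) and (-2,0,2): 4/3
```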

14 Functional Central Limit Theorem (FCLT) Assumption
Suppose the series σ² = Σ_{j=−∞}^{∞} R_j is convergent, where R_j = Cov(X_i, X_{i+j}), and σ² > 0. Here σ² is the (asymptotic) variance parameter of X and equals lim_{n→∞} n Var(X̄_n). Then, as n → ∞, the scaled partial-sum process converges in distribution to a standard Brownian motion: (S_{⌊nt⌋} − ⌊nt⌋μ)/(σ√n) ⇒ W(t), t ≥ 0.

15
The variance of the sample mean in terms of the autocovariance function is Var(X̄_n) = (1/n) Σ_{j=−(n−1)}^{n−1} (1 − |j|/n) R_j. Assumption: 0 < σ² < ∞, which implies n Var(X̄_n) → σ² as n → ∞. The paper focuses on methods for obtaining CIs for μ, which involve estimating σ². (CI: 1−α confidence interval.)
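The variance-of-the-sample-mean formula can be coded directly from a vector of autocovariances R_0, …, R_{n−1} (assumed given):

```python
def var_of_mean(R, n):
    """Var(Xbar_n) = (1/n) * (R_0 + 2 * sum_{j=1}^{n-1} (1 - j/n) * R_j).

    R is the list of autocovariances [R_0, R_1, ..., R_{n-1}]; the
    symmetric terms R_{-j} = R_j are folded into the factor of 2.
    """
    return (R[0] + 2 * sum((1 - j / n) * R[j] for j in range(1, n))) / n

# IID case (R_j = 0 for j >= 1): reduces to sigma_X^2 / n
print(var_of_mean([1.0, 0.0, 0.0, 0.0], 4))   # 0.25
```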

16 The Replication/Deletion Approach
Make k independent replications, each of length l + n observations. Discard the first l observations from each run (the warm-up period). Use the IID sample means Y_i(l, n) = (1/n) Σ_{j=l+1}^{l+n} X_ij. Compute the point estimate Ȳ(l, n) = (1/k) Σ_{i=1}^{k} Y_i(l, n). If k is large, compute the approximate 1−α CI for μ: Ȳ(l, n) ± t_{k−1,1−α/2} √(S²/k).

17 The Replication/Deletion Approach
Choosing l, n, and k:
(a) As l increases for fixed n, the "systematic" error in each Y_i(l, n) due to the initial conditions decreases.
(b) As n increases for fixed l, the systematic and sampling errors in Y_i(l, n) decrease.
(c) The number of replications k does not affect the bias of the Y_i(l, n).
(d) For fixed n, the CI is valid only if l/ln k → ∞ as k → ∞, i.e., l must increase faster than ln k.
The replication method is therefore relatively expensive.
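A minimal sketch of replication/deletion, assuming a hypothetical AR(1) process started far from its steady-state mean of 0, so that deleting the warm-up observations matters:

```python
import random
import statistics

# Hypothetical AR(1): X_t = 0.9*X_{t-1} + eps_t, started at x0 = 10,
# far from the steady-state mean 0 (initial-condition bias).
def run(length, seed, phi=0.9, x0=10.0):
    rng = random.Random(seed)
    x, out = x0, []
    for _ in range(length):
        x = phi * x + rng.gauss(0, 1)
        out.append(x)
    return out

k, l, n = 20, 200, 800
# Y_i(l, n): average of the last n observations of replication i
ys = [statistics.fmean(run(l + n, seed=i)[l:]) for i in range(k)]
point = statistics.fmean(ys)   # point estimate of mu (true mu = 0 here)
print(point)
```

With l = 0 the same point estimate would be pulled noticeably toward the starting state x0 = 10.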

18 The Regenerative Method
The basic concept underlying this approach is that for many systems a simulation run can be divided into a series of IID cycles, such that the evolution of the system in each cycle is a probabilistic replica of the evolution in any other cycle. The method assumes random time indices 1 ≤ T_1 < T_2 < … (the regeneration times) at which the process probabilistically restarts.

19 The Regenerative Method
Example with regeneration times 53, 62, 70, …: the cycle lengths are Y_1 = 62 − 53 = 9, Y_2 = 70 − 62 = 8, Y_3 = 7, and Z_1 = 62 − 24 = 38. These per-cycle quantities are used to obtain estimates of the expected value of some random variable X.
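The cycle bookkeeping can be sketched as follows. The regeneration times 53, 62, 70 reproduce the slide's Y_1 and Y_2; the per-cycle rewards and the ratio-estimator form are hypothetical illustration:

```python
# Regenerative ratio estimator sketch: mu_hat = sum(reward_i) / sum(tau_i),
# where tau_i is the length of cycle i and reward_i the quantity
# accumulated over cycle i (rewards below are hypothetical).
times = [53, 62, 70]                             # regeneration times T_1 < T_2 < ...
tau = [b - a for a, b in zip(times, times[1:])]  # cycle lengths: [9, 8]
reward = [38.0, 31.0]                            # hypothetical per-cycle rewards
mu_hat = sum(reward) / sum(tau)                  # ratio estimator for mu
print(tau, mu_hat)
```

Because the (reward, length) pairs are IID across cycles, standard ratio-estimator CIs apply.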


21 Disadvantages
The regenerative method is difficult to apply in practice because the majority of simulations have either no regeneration points or very long cycle lengths.

22 The Batch Means Method: Nonoverlapping Batch Means (NBM)
Used to compute point and CI estimators for the mean μ. Suppose the sample is X_1, X_2, …, X_n. Divide the sample into k batches of m observations each (n = km). Then, for i = 1, 2, …, k, the ith batch consists of the observations X_{(i−1)m+1}, X_{(i−1)m+2}, …, X_{im}, and the ith batch mean is Y_{i,m} = (1/m) Σ_{j=1}^{m} X_{(i−1)m+j}. The NBM-based estimator of the mean is X̄_n = (1/k) Σ_{i=1}^{k} Y_{i,m}.

23 Nonoverlapping Batch Means (NBM)
[Diagram: the n observations are split into k contiguous batches; batch 1 holds m observations with batch mean Y_{1,m}, …, batch k holds m observations with batch mean Y_{k,m}.]

24 Nonoverlapping Batch Means (NBM)
Suppose the batch means become approximately uncorrelated as m → ∞. The NBM estimator for σ² is V̂_N = (m/(k−1)) Σ_{i=1}^{k} (Y_{i,m} − X̄_n)², and an approximate 1−α confidence interval for μ is X̄_n ± t_{k−1,1−α/2} √(V̂_N/n).
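A minimal NBM sketch, assuming k divides n:

```python
import statistics

def nbm_variance(xs, k):
    """NBM estimator of sigma^2: (m/(k-1)) * sum_i (Y_{i,m} - Xbar_n)^2."""
    n = len(xs)
    m = n // k                                     # batch size, assuming k | n
    ybars = [statistics.fmean(xs[i*m:(i+1)*m]) for i in range(k)]
    xbar = statistics.fmean(ybars)                 # equals Xbar_n when k | n
    return m * sum((y - xbar) ** 2 for y in ybars) / (k - 1)

# Tiny deterministic check: batches (1,2) and (3,4), means 1.5 and 3.5,
# grand mean 2.5, so the estimate is 2 * (1 + 1) / 1 = 4.
print(nbm_variance([1, 2, 3, 4], 2))   # 4.0
```

In practice m must be large enough that the batch means are nearly uncorrelated; the tiny input here only exercises the arithmetic.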

25 Consistent Batch Means Estimation Methods
Alternative batching rules that yield strongly consistent estimators for σ².
The Assumption of Strong Approximation (ASA): there exist a constant λ ∈ (0, 1/2] and a finite random variable C such that, as n → ∞, |S_n − nμ − σW(n)| ≤ C n^{1/2−λ} with probability 1, where W is a standard Brownian motion process. λ → 1/2 corresponds to near-normal marginals and low correlation among the X_i; λ → 0 corresponds to the absence of one of these properties.

26 The Assumption of Strong Approximation (ASA)
Theorem: suppose the ASA holds, and let m_n be the batch sizes and k_n the batch counts, chosen so that, as n → ∞, the stated growth conditions hold for some finite integer q ≥ 1. Then the NBM estimator converges to σ² with probability 1, and the studentized sample mean converges in distribution to N(0, 1), where N(0, 1) is a standard normal random variable.

27 Overlapping Batch Means (OBM)
For a given m, this method uses all n − m + 1 overlapping batches to estimate μ and σ²: the first batch is X_1, …, X_m; the second batch is X_2, …, X_{m+1}; etc. The OBM estimator of μ is X̄_n, and the jth overlapping batch mean is Y_{j,m} = (1/m) Σ_{i=j}^{j+m−1} X_i. The OBM estimator of σ² is V̂_O = (nm/((n−m+1)(n−m))) Σ_{j=1}^{n−m+1} (Y_{j,m} − X̄_n)², where k = n/m.
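A sliding-window sketch of the OBM estimator of σ², using the standard nm/((n−m+1)(n−m)) scaling:

```python
def obm_variance(xs, m):
    """OBM estimator of sigma^2 over all n-m+1 overlapping batch means."""
    n = len(xs)
    xbar = sum(xs) / n
    s = sum(xs[:m])                      # sliding-window batch sum
    ybars = [s / m]
    for j in range(m, n):
        s += xs[j] - xs[j - m]           # slide the window by one observation
        ybars.append(s / m)
    scale = n * m / ((n - m + 1) * (n - m))
    return scale * sum((y - xbar) ** 2 for y in ybars)

# Tiny deterministic check: batches (1,2), (2,3), (3,4) have means
# 1.5, 2.5, 3.5 around the grand mean 2.5, so the estimate is (4/3)*2 = 8/3.
print(obm_variance([1, 2, 3, 4], 2))
```

The sliding window makes the whole computation O(n) rather than O(nm).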

28 Overlapping Batch Means (OBM)
[Diagram: overlapping batches of length m slide one observation at a time, giving batch means Y_{1,m}, Y_{2,m}, ….]

29 NBM vs. OBM
Under mild conditions, the two estimators have similar (asymptotic) bias. For the variances, Var(V̂_O) ≈ (2/3) Var(V̂_N) as n and m grow; thus the OBM method gives better asymptotic performance than NBM.

30 The Standardized Time Series Method (STS)
The estimator is based on the STS applied to batches. For the sample X_1, X_2, …, X_n, define D_{0,n} = 0 and D_{i,n} = i(X̄_n − X̄_i), for i = 1, …, n; scale the sequence D_{i,n} by σ√n and the time index by setting t = i/n. The STS is then T_n(t) = D_{⌊nt⌋,n}/(σ√n). If X satisfies a FCLT, then as n → ∞, T_n converges in distribution to a standard Brownian bridge.

31 Standardized Time Series
Define the 'centered' partial sums of the X_i as D_{i,n} = i(X̄_n − X̄_i), i.e., the partial sum S_i subtracted from its fitted value iX̄_n. By the Central Limit Theorem, (S_n − nμ)/(σ√n) converges in distribution to N(0, 1). Define the continuous-time process T_n(t) = D_{⌊nt⌋,n}/(σ√n), t ∈ [0, 1]. Question: how does T_n(t) behave as n increases?
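T_n can be computed on the grid t = i/n. The sketch below assumes σ is known (hypothetical IID N(0,1) input, so σ = 1); in practice σ is exactly the unknown being estimated. Note that T_n(1) = 0 by construction, matching a Brownian bridge tied down at both ends:

```python
import math
import random

# Standardized time series on the grid t = i/n:
#   T_n(i/n) = i * (Xbar_n - Xbar_i) / (sigma * sqrt(n)).
def sts_path(xs, sigma=1.0):
    n = len(xs)
    prefix, s = [], 0.0
    for x in xs:
        s += x
        prefix.append(s)                 # running partial sums S_i
    xbar_n = prefix[-1] / n
    return [i * (xbar_n - prefix[i - 1] / i) / (sigma * math.sqrt(n))
            for i in range(1, n + 1)]

random.seed(3)
path = sts_path([random.gauss(0, 1) for _ in range(1000)])
# path starts and ends at 0; for large n it looks like a Brownian bridge
print(path[-1])
```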

32–35
[Figures: sample paths of T_n(t) for increasing sample sizes, from n = 100 up to n = 1,000,000, illustrating convergence to a Brownian-bridge-like path.]

36 The Weighted Area Estimator
To estimate σ², define the square of the weighted area under the standardized time series, A(f; n) = [(1/n) Σ_{i=1}^{n} f(i/n) σ T_n(i/n)]². Its limiting functional is A(f) = [∫_0^1 f(t) σ B(t) dt]², where B is a standard Brownian bridge and the weight function f satisfies normalization conditions ensuring E[A(f)] = σ². If the above hold, then A(f; n) ⇒ A(f), where ⇒ denotes convergence in distribution and Nor(0, 1) denotes the standard normal random variable.

37 Under the FCLT assumption, the continuous mapping theorem (CMT) implies A(f) = σ² χ₁², where χ₁² denotes a chi-squared random variable with 1 degree of freedom, and Var[A(f)] = Var[σ² χ₁²] = 2σ⁴.

38 Batched Area Estimators
Divide the run into k contiguous, nonoverlapping batches, form an STS estimator from each batch, and take the average of the estimators. The STS from batch i (i = 1, 2, …, k) is T_{i,m}(t) = ⌊mt⌋(Ȳ_{i,m} − Ȳ_{i,⌊mt⌋})/(σ√m), where Ȳ_{i,j} is the mean of the first j observations in batch i (i = 1, …, k; j = 1, …, m). If the FCLT assumption holds, then the T_{i,m} converge jointly to independent Brownian bridges.

39 Batched Area Estimators
In the limit, (i) the Z_i, i = 1, …, k, are IID standard normal random variables; (ii) the Z_i are independent of the B_s; and (iii) B_s denotes a standard Brownian bridge on [s, s+1], for s = 0, 1, …, k−1. The area estimator from batch i is A_i(f; m) = [(1/m) Σ_{j=1}^{m} f(j/m) σ T_{i,m}(j/m)]².

40 Batched Area Estimators
The batched area estimator for σ² is A(f; k, m) = (1/k) Σ_{i=1}^{k} A_i(f; m). Since the T_{i,m}, i = 1, …, k, converge to independent Brownian bridges as m becomes large (with fixed k), we assume that the A_i(f; m) are asymptotically independent as m → ∞. Then A(f; k, m) ⇒ (σ²/k) χ_k², and the variance of the batched area estimator satisfies Var[A(f; k, m)] → 2σ⁴/k as m → ∞.
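The per-batch and batched area estimators can be sketched with the constant weight f ≡ √12 (the classic normalization making the weighted area of a Brownian bridge standard normal). Note that σ cancels out of σT_{i,m}(j/m), so the estimator is computable from data alone:

```python
import math

def area_estimator(batch):
    """Weighted area estimator A(f; m) for one batch, weight f = sqrt(12).

    Uses sigma*T_m(j/m) = j*(Ybar_m - Ybar_j)/sqrt(m), which requires no
    knowledge of sigma; E[A(f; m)] -> sigma^2 as m grows.
    """
    m = len(batch)
    ybar_m = sum(batch) / m
    area, s = 0.0, 0.0
    for j, x in enumerate(batch, start=1):
        s += x                                    # partial sum S_j
        sigma_t = j * (ybar_m - s / j) / math.sqrt(m)
        area += math.sqrt(12.0) * sigma_t / m     # (1/m) * f(j/m) * sigma*T(j/m)
    return area ** 2

def batched_area(xs, k):
    """Batched area estimator: average of the k per-batch estimators."""
    m = len(xs) // k                              # batch size, assuming k | n
    return sum(area_estimator(xs[i*m:(i+1)*m]) for i in range(k)) / k

# Tiny deterministic check on one batch (1, 2)
print(area_estimator([1, 2]))   # 0.375
```

Averaging over k batches reduces the estimator's variance by a factor of k, at the cost of the usual batching trade-off in m.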

