Validation and Verification of Simulation Models

1 Validation and Verification of Simulation Models

2 Outline
- Introduction
- Definitions of validation and verification
- Techniques for verification of simulation models
- Techniques for validation of simulation models
- Statistical methods for comparing real-world observations with simulation output data
  - Inspection approach
  - Confidence-interval approach
- Summary

3 Introduction
- One of the most difficult problems facing the simulation analyst is determining whether a simulation model is an accurate representation of the actual system being studied (i.e., whether the model is valid).
- If the simulation model is not valid, then any conclusions derived from it are of virtually no value.
- Validation and verification are two of the most important steps in any simulation project.

4 What are Validation and Verification?
- Validation is the process of determining whether the conceptual model is an accurate representation of the actual system being analyzed. Validation deals with building the right model.
- Verification is the process of determining whether the simulation computer program works as intended (i.e., debugging the computer program). Verification deals with building the model right.
[Diagram: the real-world system is linked to the conceptual model by validation, the conceptual model is linked to the simulation program by verification, and the simulation program is linked back to the real-world system by validation.]

5 Techniques for Verification of Simulation Models
- Use good programming practice:
  - Write and debug the computer program in modules or subprograms.
  - In general, it is always better to start with a “moderately detailed” model and embellish it later, if needed.
- Use a “structured walk-through”:
  - Have more than one person read the computer program.
- Use a “trace” (a small sketch follows this slide):
  - The analyst may use a trace to print out intermediate results and compare them with hand calculations to see whether the program is operating as intended.
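
As an illustration of the trace technique, the following is a minimal, hand-checkable Python sketch (the function name, the single-server logic, and the tiny input lists are illustrative assumptions, not material from the slides). It prints one trace line per customer so the intermediate results can be compared with hand calculations.

def trace_single_server(interarrivals, services):
    """Trace a FIFO single-server queue driven by fixed input data."""
    arrival = 0.0
    server_free_at = 0.0
    print(f"{'cust':>4} {'arrive':>8} {'start':>8} {'finish':>8} {'wait':>8}")
    for i, (ia, st) in enumerate(zip(interarrivals, services), start=1):
        arrival += ia                         # arrival time of customer i
        start = max(arrival, server_free_at)  # service begins when the server is free
        finish = start + st
        server_free_at = finish
        print(f"{i:>4} {arrival:>8.2f} {start:>8.2f} {finish:>8.2f} {start - arrival:>8.2f}")

# Small deterministic inputs so every printed line can be verified by hand.
trace_single_server(interarrivals=[1.0, 0.5, 2.0], services=[1.2, 0.8, 0.3])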

6 Techniques for Verification of Simulation Models (continued)
- Check the simulation output for reasonableness:
  - Run the simulation model for a variety of input scenarios and check whether the output is reasonable.
  - In some instances, certain measures of performance can be computed exactly and used for comparison.
- Animate:
  - Using animation, the users see dynamic displays (moving pictures) of the simulated system.
  - Since the users are familiar with the real system, they can detect programming and conceptual errors.

7 Techniques for Verification of Simulation Models (continued)
- Compare final simulation output with analytical results:
  - The simulation response may be verified by running a simplified version of the simulation program with a known analytical result. If the results of the simulation do not deviate significantly from the known mean response, the true distributions can then be used.
  - For example, for a queuing simulation model, queuing theory can be used to estimate steady-state responses (e.g., mean time in queue, average utilization). These formulas, however, assume exponential interarrival and service times with n servers (M/M/n). A sketch of this comparison follows below.
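
A hedged sketch of this comparison, assuming an M/M/1 special case with illustrative parameter values (none of the names or numbers come from the slides): the simulated mean time in queue is checked against the textbook steady-state formula Wq = ρ/(μ − λ).

import random

def simulate_mm1_mean_wait(lam, mu, n_customers, seed=1):
    """Estimate the mean time in queue for an M/M/1 system via the Lindley recursion."""
    rng = random.Random(seed)
    wait, total, counted = 0.0, 0.0, 0
    warmup = n_customers // 10                     # crude warm-up deletion
    for i in range(n_customers):
        # W_{i+1} = max(0, W_i + S_i - A_{i+1})
        wait = max(0.0, wait + rng.expovariate(mu) - rng.expovariate(lam))
        if i >= warmup:
            total += wait
            counted += 1
    return total / counted

lam, mu = 0.8, 1.0                                 # illustrative arrival and service rates
analytic_wq = (lam / mu) / (mu - lam)              # M/M/1 steady-state mean wait in queue
simulated_wq = simulate_mm1_mean_wait(lam, mu, n_customers=200_000)
print(f"analytic Wq = {analytic_wq:.3f}, simulated Wq = {simulated_wq:.3f}")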

8 Techniques for Validation of Simulation Models
- A three-step approach for developing a valid and credible model:
1. Develop a model with high face validity:
  - The objective of this step is to develop a model that, on the surface, seems reasonable to people who are familiar with the system under study.
  - This step can be achieved through discussions with system experts, observation of the system, or the use of intuition.
  - It is important for the modeler to interact with the client on a regular basis throughout the process.
  - It is important for the modeler to perform a structured walk-through of the conceptual model before key people to ensure the correctness of the model's assumptions.

9 Techniques for Validation of Simulation Models (continued)
2. Test the assumptions of the model empirically:
  - In this step, the assumptions made in the initial stages of model development are tested quantitatively. For example, if a theoretical distribution has been fitted to some observed data, graphical methods and goodness-of-fit tests are used to test the adequacy of the fit (a small sketch follows this slide).
  - Sensitivity analysis can be used to determine whether the output of the model changes significantly when an input distribution or the value of an input variable is changed. If the output is sensitive to some aspect of the model, that aspect must be modeled very carefully.
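
As a small illustration of this step, the sketch below fits an exponential distribution to a set of observed interarrival times and applies a Kolmogorov-Smirnov goodness-of-fit test. The data are synthetic stand-ins for real observations, and the use of scipy is an assumption; the slides do not prescribe a particular tool.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
observed = rng.exponential(scale=2.0, size=200)    # stand-in for measured interarrival times

# Fit an exponential distribution (location fixed at 0, so only the scale/mean is estimated).
loc, scale = stats.expon.fit(observed, floc=0)

# K-S test of the observed data against the fitted distribution. Estimating the
# parameter from the same data makes the test conservative; in practice a
# Lilliefors-type correction or a bootstrap could be used instead.
stat, p_value = stats.kstest(observed, "expon", args=(loc, scale))
print(f"fitted mean = {scale:.2f}, K-S statistic = {stat:.3f}, p-value = {p_value:.3f}")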

10 Techniques for Validation of Simulation Models (continued)
3. Determine how representative the simulation output data are:
  - The most definitive test of a model's validity is determining how closely the simulation output resembles the output from the real system.
  - The Turing test can be used to compare the simulation output with the output from the real system. The output data from the simulation are presented to people knowledgeable about the system in exactly the same format as the system data. If the experts can differentiate between the simulation and the system outputs, their explanation of how they did so can be used to improve the model.
  - Statistical methods are available for comparing the output from the simulation model with the output from the real-world system.

11 Statistical Methods for Comparing Real-World Observations with Simulation Output Data
- Suppose X_1, X_2, ..., X_m are observations from the real-world system and Y_1, Y_2, ..., Y_n are output data from the simulation model.
- Both the real-world system outputs and the simulation outputs are almost always non-stationary (the distribution of the successive observations changes over time) and autocorrelated (the observations are correlated with each other).
- Therefore, because classical statistical tests (the t-test, chi-square test, K-S test, etc.) assume IID data, they cannot be used directly to compare the two data sets to determine model validity. The small sketch below illustrates the autocorrelation problem.
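
To make the autocorrelation point concrete, here is a small sketch (illustrative assumptions throughout: a single-server queue with a made-up traffic intensity) showing that successive waiting times from a queueing simulation are far from independent.

import numpy as np

rng = np.random.default_rng(42)
lam, mu, n = 0.9, 1.0, 50_000
waits = np.empty(n)
w = 0.0
for i in range(n):
    # Lindley recursion: W_{i+1} = max(0, W_i + S_i - A_{i+1})
    w = max(0.0, w + rng.exponential(1 / mu) - rng.exponential(1 / lam))
    waits[i] = w

lag1 = np.corrcoef(waits[:-1], waits[1:])[0, 1]
print(f"lag-1 autocorrelation of successive waiting times: {lag1:.2f}")   # well above 0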

12 Statistical Methods for Comparing Real-World Observations with Simulation Output Data (continued)
- Two approaches for comparing the outputs from the real-world system with the simulation outputs are:
  - Inspection approach
  - Confidence-interval approach

13 Inspection Approach
- Run the simulation model with historical system input data (e.g., actual observed interarrival and service times) instead of sampling from the input probability distributions, and compare the system and model output data.
- The system and the model then experience exactly the same observations of the input random variables. This approach results in the model and system outputs being positively correlated. A minimal sketch follows below.
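
A minimal sketch of this trace-driven idea (the single-server model, the historical arrays, and the observed delays are all made-up illustrations, not data from the slides): the model is driven with the inputs the real system actually saw, and its per-customer delays are listed next to the delays observed in the system.

def single_server_delays(interarrivals, services):
    """Per-customer time in queue for a FIFO single-server model."""
    arrival = server_free_at = 0.0
    delays = []
    for ia, st in zip(interarrivals, services):
        arrival += ia
        start = max(arrival, server_free_at)
        delays.append(start - arrival)
        server_free_at = start + st
    return delays

# Historical inputs recorded from the real system (made-up numbers), plus the
# delays that were actually observed in the real system for those customers.
hist_interarrivals = [1.1, 0.4, 2.3, 0.9, 1.7]
hist_services      = [1.0, 0.9, 0.5, 1.2, 0.8]
system_delays      = [0.0, 0.6, 0.0, 0.1, 0.0]          # X_j from the real system

model_delays = single_server_delays(hist_interarrivals, hist_services)   # Y_j
for j, (x, y) in enumerate(zip(system_delays, model_delays), start=1):
    print(f"pair {j}: system = {x:.2f}  model = {y:.2f}  difference = {x - y:+.2f}")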

14 Inspection Approach
[Diagram: the same historical system input data are fed both to the actual system and to the simulation model; the actual system produces the system output data and the simulation model produces the model output data, which are then compared.]

15 Inspection Approach
- If X is the output from the real-world system and Y is the corresponding output from the model, we are interested in estimating μ = E(X) − E(Y).
- We make n experiments (using the historical data) and compute X_j − Y_j (for j = 1, 2, ..., n) as an estimate of μ. (Note that X_j and Y_j use exactly the same interarrival times and service times.)
- The X_j and Y_j can then be plotted with time on the horizontal axis and the real and simulated outputs on the vertical axis. The user can then eyeball the time paths to see whether the model accurately represents the real-world system.

16 Inspection Approach
- Due to the positive correlation between X and Y, Var(X − Y) is much smaller than it would be if X and Y were independent, which makes X_j − Y_j a much better estimator of μ. Reason:
  - In general, Var(X − Y) = Var(X) + Var(Y) − 2Cov(X, Y).
  - If X and Y are independent, Cov(X, Y) = 0 and Var(X − Y) = Var(X) + Var(Y).
  - But if X and Y are positively correlated, Cov(X, Y) > 0, leading to a smaller value of Var(X − Y).
- The small numerical sketch below illustrates this.
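
The following toy sketch (entirely synthetic numbers, assumed for illustration) checks the identity numerically: shared randomness between X and Y, as happens when the system and the model see the same input trace, makes Var(X − Y) much smaller than Var(X) + Var(Y).

import numpy as np

rng = np.random.default_rng(7)
n = 100_000
common = rng.normal(size=n)                     # shared randomness (same input trace)
x = 5.0 + common + 0.3 * rng.normal(size=n)     # "system" output X
y = 4.8 + common + 0.3 * rng.normal(size=n)     # "model" output Y, positively correlated with X

var_diff     = np.var(x - y)
var_if_indep = np.var(x) + np.var(y)            # what Var(X - Y) would be if Cov(X, Y) = 0
print(f"Var(X - Y) with positive correlation: {var_diff:.3f}")
print(f"Var(X) + Var(Y) (independent case):   {var_if_indep:.3f}")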

17 Inspection Approach
- The inspection approach (also called the trace-driven method) may provide valuable insight into the adequacy of the simulation model for some simulation studies.
- In fact, it may be the only feasible approach because of severe limitations on the amount of data available on the operation of some real-world systems (e.g., military systems).
- Care must, however, be taken in interpreting the results of this approach.

18 Confidence-Interval Approach
- A more reliable approach for comparing a simulation model with the real-world system.
- Requires a large amount of data.
- Suppose we collect m independent sets of data from the system and n independent sets of data from the model (m and n can be equal).
- Let X_j be the average of the observations in the jth set of system data, with mean μ_X = E(X_j), and let Y_j be the output from the jth replication of the simulation model, with mean μ_Y = E(Y_j).
- The objective is to build a confidence interval for μ_X − μ_Y.

19 Confidence-Interval Approach
- In the case of correlated outputs, where X_j is correlated with Y_j (e.g., when using trace-driven simulation), let m = n and pair the X_j's and Y_j's. Let Z_j = X_j − Y_j for j = 1, 2, ..., n. The Z_j's are IID random variables with E(Z_j) = μ_X − μ_Y.
- Let Z̄(n) = (1/n) Σ_{j=1..n} Z_j and S²(n) = [1/(n − 1)] Σ_{j=1..n} (Z_j − Z̄(n))².
- Then the approximate 100(1 − α) percent confidence interval is Z̄(n) ± t_{n−1, 1−α/2} · sqrt(S²(n)/n).
- If the confidence interval does not include zero, then the observed difference between μ_X and μ_Y is statistically significant at level α. A minimal computational sketch follows below.
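
A minimal computational sketch of this paired-t interval, assuming illustrative paired averages (the numbers below are invented) and using scipy only to obtain the t critical value:

import math
from statistics import mean, stdev
from scipy import stats

system_avgs = [4.2, 3.9, 4.6, 4.1, 4.4, 3.8, 4.0, 4.3]   # X_j, averages from the system
model_avgs  = [4.0, 4.1, 4.4, 3.9, 4.5, 3.7, 4.2, 4.1]   # Y_j, averages from the model

z = [x - y for x, y in zip(system_avgs, model_avgs)]      # Z_j = X_j - Y_j
n = len(z)
z_bar = mean(z)                                           # Zbar(n)
s = stdev(z)                                              # sample standard deviation of the Z_j
alpha = 0.10
t_crit = stats.t.ppf(1 - alpha / 2, df=n - 1)             # t_{n-1, 1-alpha/2}
half_width = t_crit * s / math.sqrt(n)

lo, hi = z_bar - half_width, z_bar + half_width
print(f"{100 * (1 - alpha):.0f}% CI for E(X) - E(Y): ({lo:.3f}, {hi:.3f})")
print("difference is statistically significant at this level" if lo > 0 or hi < 0
      else "no statistically significant difference detected")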

20 Summary
- Validation and verification are two of the most important steps in any simulation study.
- Validation is not something to be attempted only after the simulation model has already been developed, and only if there is time and money remaining. Instead, validation should be carried out throughout the entire simulation study.