Building Valid, Credible, and Appropriately Detailed Simulation Models

1 Chapter 5: Building Valid, Credible, and Appropriately Detailed Simulation Models

2 Outline
Introduction and definitions
Guidelines for determining the level of model detail
Verification of simulation computer programs
Techniques for increasing model validity and credibility
Management’s role in the simulation process
Statistical procedures for comparing real-world observations and simulation output data
2019/4/28 Prof. Huei-Wen Ferng

3 Verification vs. Validation
Verification is concerned with determining whether the conceptual simulation model has been correctly translated into a computer “program”.
Validation is the process of determining whether a simulation model is an accurate representation of the system.

4 Perspectives on Validation
If a simulation model is “valid”, then it can be used to make decisions about the system.
The ease or difficulty of the validation process depends on the complexity of the system being modeled and on whether a version of the system currently exists.
A simulation model of a complex system can only be an approximation to the actual system.
A model that is valid for one purpose may not be valid for another.

5 Credibility and Accreditation
A simulation model is credible when the manager and other key project personnel accept it as “correct”. Note that a credible model is not necessarily a valid one.
Accreditation is an official determination that a simulation model is acceptable for a particular purpose.

6 Things That Help Establish Credibility
The manager’s understanding of and agreement with the model’s assumptions
Demonstration that the model has been validated and verified
The manager’s ownership of and involvement with the project
Reputation of the model developers

7 Issues of Accreditation
Verification and validation that have been done
Simulation model development and use history
Quality of the data that are available
Quality of the documentation
Known problems or limitations of the simulation model

8 Timing and Relationship

9 Validation and Output Analysis
Validation should be contrasted with output analysis.
Output analysis is a statistical issue concerned with estimating a simulation model’s true measures of performance.

10 Validation and Output Analysis (Cont’d)
The error in a simulation estimate of a system measure of performance can be decomposed into two terms:
(simulation estimate - model’s true measure) + (model’s true measure - system’s true measure)
Output analysis is concerned with the first term (estimation error); validation is concerned with the second term (how well the model represents the system).

12 Before the Simulation
A simulation practitioner must determine:
which aspects of the system actually need to be incorporated into the model;
what level of detail is appropriate;
which aspects can safely be ignored.
It is rarely necessary to have a one-to-one correspondence between each element of the system and each element of the model.

13 Some General Guidelines
Carefully define the specific issues to be investigated.
The entity in the simulation does not have to be the same as the entity in the corresponding system.
Use subject-matter experts (SMEs) and sensitivity analyses to help determine the level of model detail.
A common mistake made by novices is to include an excessive amount of model detail; do not have more detail in the model than is necessary to address the issues of interest.
The level of model detail should be consistent with the type of data available.
Time and money constraints are a major factor.
Use a “coarse” simulation model or an analytic model to identify which factors have a significant impact on system performance; a “detailed” simulation model is then built.

15 Eight Techniques
Write and debug the computer program in modules or subprograms.
Have more than one person review the computer program.
Run the simulation under a variety of settings of the input parameters.
Trace the program.

16 Eight Techniques (Cont’d)
Run the model under simplifying assumptions for which its true characteristics are known or can easily be computed.
Observe an animation of the simulation output.
Check the input: e.g., compare the sample mean and variance of each generated input distribution with their desired values.
Use a commercial simulation package to reduce the amount of programming required.
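The "simplifying assumptions" technique above can be sketched as follows: simplify the model to an M/M/1 queue, whose true mean delay in queue Wq = lambda / (mu * (mu - lambda)) is known analytically, and check the simulated estimate against it. The rates, run length, and seed below are illustrative choices, not values from the text:

```python
import random

def mm1_avg_delay(lam, mu, n_customers, seed):
    """Average delay in queue for an M/M/1 FIFO queue, simulated via
    Lindley's recurrence: D[i+1] = max(0, D[i] + S[i] - A[i+1])."""
    rng = random.Random(seed)
    delay, total = 0.0, 0.0
    for _ in range(n_customers):
        total += delay                      # accumulate this customer's delay
        service = rng.expovariate(mu)       # exponential service time
        next_gap = rng.expovariate(lam)     # exponential interarrival time
        delay = max(0.0, delay + service - next_gap)
    return total / n_customers

lam, mu = 1.0, 1.25                          # rho = 0.8, a moderately loaded queue
analytic_wq = lam / (mu * (mu - lam))        # known M/M/1 result
simulated_wq = mm1_avg_delay(lam, mu, 200_000, seed=42)
print(f"analytic Wq = {analytic_wq:.3f}, simulated Wq = {simulated_wq:.3f}")
```

A simulated value far from the analytic 3.2 would point to a bug in the queue logic or in the random-variate generation.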

18 Six Classes of Techniques for Increasing Validity and Credibility
Collect high-quality information and data on the system
Interact with the manager on a regular basis
Maintain an assumptions document and perform a structured walk-through
Validate components of the model by using quantitative techniques
Validate the output from the overall simulation model
Use animation

19 Collect High-Quality Information and Data on the System
In developing a simulation model, the analyst should make use of:
Conversations with SMEs (a simulation model is not an abstraction developed by an analyst working in isolation)
Observations of the system. Potential pitfalls with observed data:
  Data are not representative of what one really wants to model
  Data are not of the appropriate type or format
  Data may contain measurement, recording, or rounding errors
  Data may be “biased” because of self-interest
  Data may have inconsistent units

20 Collect High-Quality Information and Data on the System (Cont’d)
Existing theory
Relevant results from similar simulation studies
Experience and intuition of the modelers

21 Interact with the Manager on a Regular Basis -- Benefits
As the simulation study proceeds and the nature of the problem becomes clearer, this information should be conveyed to the manager.
The manager’s interest and involvement in the study are maintained.
The manager’s knowledge of the system contributes to the actual validity of the model.
The model is more credible because the manager understands and accepts its assumptions.

22 Assumptions Document
An overview section
Detailed descriptions of each subsystem, in bullet format, and how the subsystems interact
What simplifying assumptions were made and why
Summaries of data
Sources of important or controversial information

23 Structured Walk-Through
A structured walk-through will increase both the validity and the credibility of the simulation model.
The structured walk-through of the assumptions document might be called conceptual model validation.

24 Validate Components of the Model by Using Quantitative Techniques
Sensitivity analysis is used to determine which model factors have an important impact on the desired measures of performance.
Examples of such factors include:
The value of a parameter
The choice of a distribution
The entity moving through the simulated system
The level of detail for a subsystem
What data are most crucial to collect
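As a sketch of the second kind of factor (the choice of a distribution), the hypothetical example below runs the same single-server queue with exponential versus deterministic service times of equal mean. Queueing theory (the Pollaczek-Khinchine formula) predicts roughly half the mean delay in the deterministic case, so this factor clearly has an important impact:

```python
import random

def avg_delay(sample_service, lam, n_customers, seed):
    """Average delay in a single-server FIFO queue with Poisson arrivals
    (rate lam) and service times drawn by sample_service(rng)."""
    rng = random.Random(seed)
    delay, total = 0.0, 0.0
    for _ in range(n_customers):
        total += delay
        delay = max(0.0, delay + sample_service(rng) - rng.expovariate(lam))
    return total / n_customers

lam, mean_service = 1.0, 0.8                 # utilization rho = 0.8 in both runs
expo = avg_delay(lambda r: r.expovariate(1 / mean_service), lam, 200_000, seed=1)
det = avg_delay(lambda r: mean_service, lam, 200_000, seed=1)
print(f"exponential service: Wq ~ {expo:.2f}; deterministic service: Wq ~ {det:.2f}")
```

Had the two runs produced nearly equal delays, the service-time distribution could safely be modeled coarsely; here the large gap says it must be chosen with care.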

25 Validate the Output from the Overall Simulation Model
Comparing the model and subsystem output data with those from the existing system might be called results validation.

26 Animation
An animation can be an effective way to find invalid model assumptions and to enhance the credibility of a simulation model.

28 Some of the Responsibilities of the Manager
Formulating problem objectives
Directing personnel to provide information and data to the simulation modeler and to attend the structured walk-through
Interacting with the simulation modeler on a regular basis
Using the simulation results as an aid in the decision-making process

30 Three Approaches
Inspection approach
Confidence-interval approach based on independent data
Time-series approaches

31 Inspection Approach
Suppose that X1, X2, ..., Xm are observations from a real-world system and that Y1, Y2, ..., Yn are output data from a corresponding simulation model.
The difficulty with this (basic) inspection approach is that each statistic computed from these data is essentially a sample of size 1.
Example 5.34 illustrates the danger of using inspection.

33 Inspection Approach (Cont’d)
Correlated inspection approach: drive the simulation model with the historical input data actually observed from the system, so that model outputs can be compared with system outputs under the same input conditions.
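A minimal sketch of the correlated inspection idea, assuming the model is a single-server FIFO queue (Lindley's recurrence); the recorded trace values are hypothetical:

```python
def delays_from_trace(interarrivals, services):
    """Replay a recorded input trace through the model's queue logic
    (Lindley's recurrence), returning one delay per customer.
    interarrivals[i] is the gap between customer i and customer i+1."""
    delay, delays = 0.0, []
    for gap, service in zip(interarrivals, services):
        delays.append(delay)
        delay = max(0.0, delay + service - gap)
    return delays

# Hypothetical input trace recorded from the real system
interarrivals = [1.2, 0.4, 2.0, 0.3, 0.9, 1.5]
services = [1.0, 1.1, 0.8, 1.3, 0.7, 0.9]
model_delays = delays_from_trace(interarrivals, services)
print(model_delays)
```

Each entry of model_delays can then be paired with the delay actually observed for the same customer in the real system, removing the input variability that plagues the basic inspection approach.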

35 Confidence-Interval Approach
Suppose we collect m independent sets of data from the system and n independent sets of data from the model.
Let Xj be the average of the observations in the jth set of system data.
Let Yj be the average of the observations in the jth set of model data.
Two approaches for building a confidence interval for E(X) - E(Y): paired-t (see Sec. 10.2) and Welch (see Example 5.38).
Two difficulties: it may require a large amount of data, and it provides no information about the autocorrelation structures of the two output processes.
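The paired-t computation (for the case m = n) can be sketched with the standard library alone; the ten paired averages and the critical value t_{9,0.975} = 2.262 below are illustrative, not data from the text:

```python
from math import sqrt
from statistics import mean, stdev

def paired_t_ci(system_avgs, model_avgs, t_crit):
    """Paired-t confidence interval for E[X] - E[Y], built from n paired
    (system, model) averages; t_crit is t_{n-1, 1-alpha/2} from a t table."""
    diffs = [x - y for x, y in zip(system_avgs, model_avgs)]
    n = len(diffs)
    half_width = t_crit * stdev(diffs) / sqrt(n)
    return mean(diffs) - half_width, mean(diffs) + half_width

# Hypothetical paired averages, e.g. average daily delay: system vs. model
x = [4.2, 3.9, 4.5, 4.1, 3.8, 4.4, 4.0, 4.3, 3.7, 4.6]
y = [4.0, 4.1, 4.3, 3.9, 3.9, 4.2, 4.1, 4.0, 3.8, 4.4]
lo, hi = paired_t_ci(x, y, t_crit=2.262)   # t_{9, 0.975} for a 95% interval
print(f"95% CI for E[X] - E[Y]: ({lo:.3f}, {hi:.3f})")
```

Because the resulting interval contains 0, this hypothetical data would give no evidence of a significant system-model difference at the 0.05 level.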

36 Time-Series Approaches
A time series is a finite realization of a stochastic process.
Requires only one set of each type of output data.
May yield information on the autocorrelation structures of the two output processes.
For examples, see Chapter 9.
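The sample autocorrelation function is the basic ingredient of such comparisons. A minimal sketch, using an arbitrary AR(1)-style toy series in place of real simulation output:

```python
import random

def sample_autocorrelation(series, max_lag):
    """Sample autocorrelation rho_hat(k) = C(k) / C(0), where
    C(k) = (1/n) * sum_i (x_i - xbar) * (x_{i+k} - xbar)."""
    n = len(series)
    xbar = sum(series) / n
    c0 = sum((x - xbar) ** 2 for x in series) / n
    return [
        sum((series[i] - xbar) * (series[i + k] - xbar) for i in range(n - k))
        / (n * c0)
        for k in range(max_lag + 1)
    ]

# Toy AR(1)-style series standing in for correlated simulation output
rng = random.Random(0)
x, val = [], 0.0
for _ in range(5000):
    val = 0.9 * val + rng.gauss(0, 1)     # lag-1 correlation near 0.9
    x.append(val)
acf = sample_autocorrelation(x, 5)
print([round(r, 3) for r in acf])
```

Comparing the estimated autocorrelation structure of the model's output process with that of the system's output process is what distinguishes time-series approaches from the confidence-interval approach.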

