Building Valid, Credible, and Appropriately Detailed Simulation Models


Chapter 5 Building Valid, Credible, and Appropriately Detailed Simulation Models

Outline Introduction and definitions Guidelines for determining the level of model detail Verification of simulation computer programs Techniques for increasing model validity and credibility Management’s role in the simulation process Statistical procedures for comparing real-world observations and simulation output data 2019/4/28 Prof. Huei-Wen Ferng

Verification vs. Validation Verification is concerned with determining whether the conceptual simulation model has been correctly translated into a computer “program”. Validation is the process of determining whether a simulation model is an accurate representation of the system.

Perspectives on Validation If a simulation model is “valid”, then it can be used to make decisions about the system. The ease or difficulty of the validation process depends on the complexity of the system being modeled and on whether a version of the system currently exists. A simulation model of a complex system can only be an approximation of the actual system. A model that is valid for one purpose may not be valid for another.

Credibility and Accreditation A simulation model is credible when it is accepted as correct by the manager and other key project personnel. Note that a credible model is not necessarily valid. Accreditation is an official determination that a simulation model is acceptable for a particular purpose.

Things That Help Establish Credibility The manager’s understanding and agreement Demonstration that the model has been validated and verified The manager’s ownership of and involvement with the project Reputation of the model developers

Issues of Accreditation Verification and validation that have been done Simulation model development and use history Quality of data that are available Quality of the documentation Known problems or limitations with the simulation model

Timing and Relationship

Validation and Output Analysis Validation should be contrasted with output analysis. Output analysis is a statistical issue concerned with estimating a simulation model’s true measures of performance.

Validation and Output Analysis (Cont’d) The error in a simulation estimate of a system performance measure can be decomposed into two terms: the difference between the estimate and the model’s true measure of performance, plus the difference between the model’s true measure and the system’s true measure. Output analysis is concerned with the first term; validation is concerned with the second term.

Before the Simulation A simulation practitioner must determine what aspects of the system actually need to be incorporated into the model, at what level of detail, and what aspects can be safely ignored. It is rarely necessary to have a one-to-one correspondence between each element of the system and each element of the model.

Some General Guidelines
Carefully define the specific issues to be investigated.
The entity in the simulation does not have to be the same as the entity in the corresponding system.
Use subject-matter experts (SMEs) and sensitivity analyses to help determine the level of model detail.
A common mistake for novices is to include an excessive amount of model detail; do not have more detail in the model than is necessary to address the issue of interest.
The level of model detail should be consistent with the type of data available.
Time and money constraints are a major factor.
Use a “coarse” simulation model or an analytic model to identify which factors have a significant impact on system performance; a “detailed” simulation model is then built.

Eight Techniques
Write and debug the computer program in modules or subprograms.
Have more than one person review the computer program.
Run the simulation under a variety of settings of the input parameters.
Trace the program.

Eight Techniques (Cont’d)
Run the model under simplifying assumptions for which its true characteristics are known or can easily be computed.
Observe an animation of the simulation output.
Check the input probability distributions and parameters.
Use a commercial simulation package to reduce the amount of programming required.
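The “simplifying assumptions” technique can be sketched in code (illustrative, not from the slides): simplify the model to an M/M/1 queue, whose expected wait in queue has the known closed form Wq = lam / (mu * (mu - lam)), and check the program’s estimate against it. The function name and parameters are hypothetical.

```python
import random

def mm1_avg_wait(lam, mu, n_customers, seed=0):
    """Estimate the average wait in queue of an M/M/1 queue using the
    Lindley recurrence: W_{i+1} = max(0, W_i + S_i - A_{i+1})."""
    rng = random.Random(seed)
    wait = total = 0.0
    for _ in range(n_customers):
        total += wait
        service = rng.expovariate(mu)   # service time of customer i
        gap = rng.expovariate(lam)      # interarrival time to customer i+1
        wait = max(0.0, wait + service - gap)
    return total / n_customers

lam, mu = 0.5, 1.0
analytic = lam / (mu * (mu - lam))      # known M/M/1 result: Wq = 1.0 here
simulated = mm1_avg_wait(lam, mu, n_customers=200_000)
print(f"analytic Wq = {analytic:.3f}, simulated Wq = {simulated:.3f}")
```

A large gap between the two numbers would indicate a programming error, since under these simplifying assumptions the model’s true characteristic is known exactly.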

Six Classes of Techniques for Increasing Validity and Credibility
Collect high-quality information and data on the system.
Interact with the manager on a regular basis.
Maintain an assumptions document and perform a structured walk-through.
Validate components of the model by using quantitative techniques.
Validate the output from the overall simulation model.
Animation.

Collect High-Quality Information and Data on the System
In developing a simulation model, the analyst should make use of:
Conversations with SMEs (a simulation model is not an abstraction developed by an analyst working in isolation).
Observations of the system, with the caveats that:
data may not be representative of what one really wants to model;
data may not be of the appropriate type or format;
data may contain measurement, recording, or rounding errors;
data may be “biased” because of self-interest;
data may have inconsistent units.

Collect High-Quality Information and Data on the System (Cont’d)
Existing theory.
Relevant results from similar simulation studies.
Experience and intuition of the modelers.

Interact with the Manager on a Regular Basis -- Benefits
As the simulation study proceeds and the nature of the problem becomes clearer, this information should be conveyed to the manager.
The manager’s interest and involvement in the study are maintained.
The manager’s knowledge of the system contributes to the actual validity of the model.
The model is more credible since the manager understands and accepts the model’s assumptions.

Assumptions Document
An overview section.
Detailed descriptions of each subsystem, in bullet format, and how the subsystems interact.
The simplifying assumptions that were made, and why.
Summaries of data.
Sources of important or controversial information.

Structured Walk-Through A structured walk-through will increase both the validity and credibility of the simulation model. The structured walk-through of the assumptions document might be called conceptual model validation.

Validate Components of the Model by Using Quantitative Techniques
Sensitivity analysis is used to determine which model factors have an important impact on the desired measures of performance. Factors include:
the value of a parameter;
the choice of a distribution;
the entity moving through the simulated system;
the level of detail for a subsystem;
which data are most crucial to collect.
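Sensitivity to the choice of a distribution can be illustrated with a small experiment (an illustrative sketch, not from the slides; the queue and parameters are hypothetical): hold the mean service time fixed and change only the distribution’s shape, then compare the performance measure.

```python
import random

def avg_wait(service_sampler, lam=0.8, n=100_000, seed=1):
    """Average wait in a single-server queue via the Lindley recurrence."""
    rng = random.Random(seed)
    wait = total = 0.0
    for _ in range(n):
        total += wait
        wait = max(0.0, wait + service_sampler(rng) - rng.expovariate(lam))
    return total / n

# Same mean service time (1.0), different shape:
w_exp = avg_wait(lambda rng: rng.expovariate(1.0))  # exponential services
w_det = avg_wait(lambda rng: 1.0)                   # deterministic services
print(f"exponential: {w_exp:.2f}  deterministic: {w_det:.2f}")
# By the Pollaczek-Khinchine formula, the deterministic case has roughly
# half the mean wait, so this factor clearly matters for this measure.
```

A factor that moves the output this much must be modeled (and its data collected) carefully; a factor with negligible effect can be simplified.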

Validate the Output from the Overall Simulation Model Comparing the model and system output data for the existing system might be called results validation.

Animation An animation can be an effective way to find invalid model assumptions and to enhance the credibility of a simulation model.

Some Responsibilities of the Manager
Formulating problem objectives.
Directing personnel to provide information and data to the simulation modeler and to attend the structured walk-through.
Interacting with the simulation modeler on a regular basis.
Using the simulation results as an aid in the decision-making process.

Three Approaches Inspection approach Confidence-interval approach based on independent data Time-series approaches

Inspection Approach Suppose that X1, X2, ..., Xm are observations from a real-world system and that Y1, Y2, ..., Yn are output data from a corresponding simulation model. The difficulty with this (basic) inspection approach is that each statistic is essentially a sample of size 1. Example 5.34 illustrates the danger of using inspection.

Inspection Approach (Cont’d) Correlated inspection approach: drive the model with the historical input data actually observed from the system, so that model and system outputs are compared under the same conditions.
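As a hedged sketch of why this helps (hypothetical queue and parameters, not from the slides): when the “model” reuses the “system’s” arrival stream instead of drawing its own, the difference between their average waits fluctuates far less from replication to replication, so a comparison of the two outputs is much more informative.

```python
import random
import statistics

def avg_wait(arrivals, services):
    """Average wait in a single-server FIFO queue (Lindley recurrence)."""
    wait = total = 0.0
    for gap, svc in zip(arrivals, services):
        total += wait
        wait = max(0.0, wait + svc - gap)
    return total / len(arrivals)

def wait_difference(shared_arrivals, seed, n=5_000):
    """Difference between 'system' and 'model' average waits; the model
    either reuses the system's arrival stream or draws its own."""
    rng = random.Random(seed)
    arr_sys = [rng.expovariate(1.0) for _ in range(n)]
    arr_mod = arr_sys if shared_arrivals else [rng.expovariate(1.0) for _ in range(n)]
    svc_sys = [rng.uniform(0.7, 0.9) for _ in range(n)]  # low-variance services
    svc_mod = [rng.uniform(0.7, 0.9) for _ in range(n)]
    return avg_wait(arr_sys, svc_sys) - avg_wait(arr_mod, svc_mod)

independent = [wait_difference(False, seed) for seed in range(30)]
correlated = [wait_difference(True, seed) for seed in range(30)]
print("stdev of difference, independent inputs:", statistics.stdev(independent))
print("stdev of difference, shared inputs:     ", statistics.stdev(correlated))
```

The smaller spread in the correlated case is the point of the technique: discrepancies that remain are attributable to the model rather than to input noise.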

Confidence-Interval Approach
Suppose we collect m independent sets of data from the system and n independent sets of data from the model.
Let Xj be the average of the observations in the jth set of system data.
Let Yj be the average of the observations in the jth set of model data.
Two approaches: the paired-t approach (see Ch. 10.2) and the Welch approach (see Example 5.38).
Two difficulties: the approach may require a large amount of data, and it provides no information about the autocorrelation structures of the two output processes.
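The paired-t computation is short enough to sketch (the data below are hypothetical, and 2.262 is the critical point t_{0.975, 9} for n = 10 pairs):

```python
import math
import statistics

def paired_t_ci(x, y, t_crit):
    """Paired-t confidence interval for E[X] - E[Y].

    x[j]: average of the jth set of system data; y[j]: jth set of model data.
    t_crit: upper critical point t_{1-alpha/2, n-1} (2.262 for n=10, alpha=0.05).
    """
    d = [xi - yi for xi, yi in zip(x, y)]   # paired differences
    n = len(d)
    mean = statistics.mean(d)
    half = t_crit * statistics.stdev(d) / math.sqrt(n)
    return mean - half, mean + half

# Ten paired daily averages (hypothetical numbers, for illustration only):
system = [4.2, 3.9, 5.1, 4.4, 4.8, 3.7, 4.0, 4.6, 5.0, 4.3]
model  = [4.0, 4.1, 4.9, 4.2, 4.7, 3.9, 3.8, 4.4, 5.2, 4.1]
lo, hi = paired_t_ci(system, model, t_crit=2.262)
print(f"95% CI for system - model: [{lo:.3f}, {hi:.3f}]")
```

Here the interval covers zero, so at this sample size the data are consistent with the model; an interval excluding zero would flag a detectable discrepancy between model and system.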

Time-Series Approaches
A time series is a finite realization of a stochastic process.
Requires only one set of each type of output data.
May yield information on the autocorrelation structures of the two output processes.
For examples, see Ch. 9.
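A small illustration of estimating that autocorrelation structure (illustrative code, not from the slides; the AR(1) process stands in for an autocorrelated simulation output such as successive waiting times):

```python
import random

def autocorr(x, lag):
    """Sample autocorrelation of a series x at the given lag."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    cov = sum((x[i] - mean) * (x[i + lag] - mean) for i in range(n - lag)) / n
    return cov / var

rng = random.Random(0)
x, prev = [], 0.0
for _ in range(10_000):
    prev = 0.9 * prev + rng.gauss(0.0, 1.0)  # AR(1) process with coefficient 0.9
    x.append(prev)
print(autocorr(x, 1))  # should be near 0.9: successive outputs are far from independent
```

A strong lag-1 autocorrelation like this is exactly what the confidence-interval approach above ignores, and what a time-series approach can capture from a single run of each process.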