MODES-650 Advanced System Simulation 02.12.2010 Presented by Olgun Karademirci VERIFICATION AND VALIDATION OF SIMULATION MODELS.


MODES-650 Advanced System Simulation Presented by Olgun Karademirci VERIFICATION AND VALIDATION OF SIMULATION MODELS Robert G. Sargent

Professor, Dept. of Electrical Engineering and Computer Science, Syracuse University, N.Y., U.S.A. His interests:
 modeling methodologies
 model validation, and methodology areas of discrete event simulation
 both strategic and tactical experimental design and data analysis

OUTLINE
 Purpose and Scope
 Introduction
 Basic Approaches for Validation
 Two Paradigms for Verification and Validation
 Validation Techniques
 Data Validity
 Conceptual Model Validation
 Computerized Model Verification
 Operational Validity
 Documentation
 Recommended Procedure
 Accreditation
 Summary

Purpose and Scope  The author’s purpose in this paper is to discuss verification and validation of simulation models.  Distinct approaches and techniques for verifying and validating simulation models are discussed throughout the article.

Introduction  Model validation: substantiation that a computerized model, within its domain of applicability, possesses a satisfactory range of accuracy consistent with the intended application of the model.  Model verification: ensuring that the computer program of the computerized model and its implementation are correct.

Introduction  Model accreditation; determines if a model satisfies specified model accreditation criteria according to a specified process.  Model credibility; is concerned with developing in (potential) users the confidence they require in order to use a model and in the information derived from that model.

Basic Approaches for Validation There are four basic approaches for deciding whether a simulation model is valid:
 One approach, and a frequently used one, is for the model development team itself to make the decision as to whether the simulation model is valid.
 If the size of the simulation team developing the model is not large, a better approach than the first is to have the user(s) of the model heavily involved with the model development team in deciding the validity of the simulation model.

Basic Approaches for Validation  Another approach, usually called “independent verification and validation” (IV&V), uses a third (independent) party to decide whether the simulation model is valid. There are two common ways that the third party conducts IV&V: (a) IV&V is conducted concurrently with the development of the simulation model and (b) IV&V is conducted after the simulation model has been developed. It is the author’s opinion that first one is the better of the two ways to conduct IV&V.  The last approach for determining whether a model is valid is to use a scoring model. This approach is seldom used in practice.

Two Paradigms for Verification and Validation

Validation Techniques
 Animation
 Comparison to Other Models
 Degenerate Tests
 Event Validity
 Extreme Condition Tests
 Face Validity
 Historical Data Validation
 Historical Methods
 Internal Validity
 Multistage Validation
 Operational Graphics
 Parameter Variability - Sensitivity Analysis
 Predictive Validation
 Traces
 Turing Tests
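Two of the techniques listed, degenerate tests and extreme condition tests, can be sketched in code. The single-server queue model below is a hypothetical illustration, not taken from the paper; the tests check that its average queue length moves in the expected direction as input parameters are pushed toward limiting values.

```python
import random

def mean_queue_length(arrival_rate, service_rate, n_customers=50_000, seed=1):
    """Toy single-server FIFO queue simulation (hypothetical model under test).

    Uses the Lindley-style recursion: a customer starts service at the later of
    its arrival time and the time the server frees up. Returns the time-average
    number waiting, estimated via Little's law (L_q = lambda * W_q)."""
    random.seed(seed)
    arrival_time = 0.0
    server_free_at = 0.0
    total_wait = 0.0
    for _ in range(n_customers):
        arrival_time += random.expovariate(arrival_rate)
        service_start = max(arrival_time, server_free_at)
        total_wait += service_start - arrival_time
        server_free_at = service_start + random.expovariate(service_rate)
    mean_wait = total_wait / n_customers
    return arrival_rate * mean_wait

# Degenerate test: as the arrival rate approaches the service rate,
# the average queue length should grow, not stay flat.
low  = mean_queue_length(arrival_rate=0.3, service_rate=1.0)
high = mean_queue_length(arrival_rate=0.9, service_rate=1.0)
assert high > low

# Extreme condition test: with almost no arrivals the queue should be empty.
assert mean_queue_length(arrival_rate=0.001, service_rate=1.0) < 0.01
```

The assertions encode expected limiting behavior rather than exact values, which is what makes them usable even when no analytic solution is available.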

Data Validity Data validity is often not considered to be part of model validation because it is difficult, time consuming, and costly to obtain appropriate, accurate, and sufficient data; this is often the reason that attempts to validate a model fail. Data are needed for three purposes:
 for building the conceptual model,
 for validating the model,
 for performing experiments with the validated model.
In model validation we are usually concerned only with data for the first two purposes.

Data Validity To build a conceptual model we must have sufficient data on the problem entity to develop theories that can be used to build the model. In addition, behavioral data are needed on the problem entity to be used in the operational validity step of comparing the problem entity’s behavior with the model’s behavior. (Usually, these data are system input/output data.)

Conceptual Model Validation Conceptual model validity is determining that:
 the theories and assumptions underlying the conceptual model are correct, and
 the model’s representation of the problem entity and the model’s structure, logic, and mathematical and causal relationships are “reasonable” for the intended purpose of the model.

Conceptual Model Validation The primary validation techniques used for these evaluations are face validation and traces. Face validation has experts on the problem entity evaluate the conceptual model to determine if it is correct and reasonable for its purpose.
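A trace, the second technique just named, follows specific entities through the model so a reviewer can check its logic step by step. Below is a minimal, hypothetical sketch; the entity names, event wording, and fixed service time are illustrative only.

```python
# Minimal trace sketch (hypothetical): each state change of a tracked entity
# is logged so a reviewer can walk through the model logic by hand.
TRACE_ENABLED = True
trace_log = []

def trace(time, entity, event):
    if TRACE_ENABLED:
        trace_log.append(f"t={time:6.2f}  {entity}  {event}")

def run_toy_model():
    # Hard-coded arrival times keep the trace deterministic and checkable.
    arrivals = [(1.0, "cust-1"), (1.5, "cust-2")]
    server_free_at = 0.0
    for t_arrive, name in arrivals:
        trace(t_arrive, name, "arrives")
        start = max(t_arrive, server_free_at)
        if start > t_arrive:
            trace(t_arrive, name, "joins queue")
        trace(start, name, "service begins")
        server_free_at = start + 2.0   # fixed service time of 2.0
        trace(server_free_at, name, "departs")

run_toy_model()
print("\n".join(trace_log))
```

Reading the printed trace, an expert can confirm by hand that cust-2 correctly waits in the queue until cust-1 departs, which is exactly the kind of step-by-step check a trace supports.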

Computerized Model Verification Computerized model verification ensures that the computer programming and implementation of the conceptual model are correct. The major factor affecting verification is whether a simulation language or a higher-level programming language such as FORTRAN, C, Java, or C++ is used. Simulation languages may be either general purpose or special purpose.

Computerized Model Verification There are two basic approaches for testing simulation software: static testing and dynamic testing (Fairley 1976).
 In static testing the computer program is analyzed to determine whether it is correct by using such techniques as structured walk-throughs, correctness proofs, and examining the structural properties of the program.
 In dynamic testing the computer program is executed under different conditions, and the values obtained (including those generated during the execution) are used to determine whether the computer program and its implementation are correct.
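Dynamic testing can be illustrated on a single model component. The sketch below is a hypothetical example, not from the paper: it executes an interarrival-time generator under different conditions and checks the generated values against known theoretical properties of the exponential distribution.

```python
import random
import statistics

def sample_interarrival_times(rate, n, seed=42):
    """Component under test (hypothetical): draws n exponential
    interarrival times with the given rate parameter."""
    rng = random.Random(seed)
    return [rng.expovariate(rate) for _ in range(n)]

# Dynamic test 1: the sample mean should be close to the
# theoretical mean 1/rate.
samples = sample_interarrival_times(rate=2.0, n=100_000)
assert abs(statistics.mean(samples) - 0.5) < 0.01

# Dynamic test 2: execute under a different condition (another rate)
# and re-check against theory.
samples = sample_interarrival_times(rate=0.25, n=100_000)
assert abs(statistics.mean(samples) - 4.0) < 0.1

# Dynamic test 3: values generated during execution must be non-negative.
assert all(x >= 0 for x in samples)
```

Running the same component under several conditions and checking the generated values is the essence of dynamic testing; a static check of the same code would instead inspect its source (e.g., in a walk-through) without executing it.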

Operational Validity Operational validation is determining whether the simulation model’s output behavior has the accuracy required for the model’s intended purpose over the domain of the model’s intended applicability. All of the validation techniques discussed in Section 4 are applicable to operational validity. Which techniques and whether to use them objectively or subjectively must be decided by the model development team and the other interested parties.

Operational Validity İf an sys-tem is not observable, which is often the case, it is usually not possible to obtain a high degree of confidence in the model. In this situation the model output behavior(s) should be explored as thoroughly as possible and comparisons made to other valid models whenever possible.

Operational Validity: Comparisons of Output Behaviors
There are three basic approaches used in comparing the simulation model output behavior to either the system output behavior or another model’s output behavior:
 the use of graphs to make a subjective decision,
 the use of confidence intervals to make an objective decision,
 the use of hypothesis tests to make an objective decision.
It is preferable to use confidence intervals or hypothesis tests for the comparisons because these allow for objective decisions. However, the statistical assumptions these methods require are often difficult to satisfy in practice, and so the use of graphs is the most commonly used approach for operational validity.
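The confidence-interval approach can be sketched as an interval on the difference between the system's and the model's mean output. This is a minimal illustration, not from the paper: the throughput figures are hypothetical, and the large-sample normal approximation is a simplifying assumption.

```python
import statistics

def diff_confidence_interval(system_obs, model_obs, z=1.96):
    """Approximate 95% CI for (system mean - model mean), assuming
    independent samples and a normal approximation for the difference
    (a simplifying assumption for this sketch)."""
    d = statistics.mean(system_obs) - statistics.mean(model_obs)
    se = (statistics.variance(system_obs) / len(system_obs)
          + statistics.variance(model_obs) / len(model_obs)) ** 0.5
    return d - z * se, d + z * se

# Hypothetical observed (system) vs simulated (model) daily throughput.
system_data = [102, 98, 105, 99, 101, 97, 103, 100]
model_data  = [100, 104, 99, 101, 98, 102, 100, 103]
lo, hi = diff_confidence_interval(system_data, model_data)

# If the interval contains 0, the data give no objective evidence that the
# model's mean output differs from the system's at this confidence level.
print(f"95% CI for mean difference: ({lo:.2f}, {hi:.2f})")
```

Unlike a graphical comparison, the decision rule here is explicit, which is what makes the confidence-interval approach objective.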

Operational Validity: Comparisons of Output Behaviors, Graphical Comparisons of Data
Three types of graphs are used: histograms, box (and whisker) plots, and behavior graphs using scatter plots.


Documentation Documentation on model verification and validation is usually critical in convincing users of the “correctness” of a model and its results, and should be included in the simulation model documentation. Both detailed and summary documentation are desired. The detailed documentation should include specifics on the tests, evaluations made, data, results, etc. The summary documentation should contain a separate evaluation table for data validity, conceptual model validity, computer model verification, operational validity, and an overall summary.

Summary Model verification and validation are critical in the development of a simulation model. Unfortunately, there is no set of specific tests that can easily be applied to determine the “correctness” of a model. Furthermore, no algorithm exists to determine what techniques or procedures to use. In this paper we discussed ‘practical approaches’ to verification and validation of simulation models.