Slide 1: A Framework for Computer-aided Validation
Presented by Bret Michael
Joint work with Doron Drusinsky and Man-Tak Shing
Naval Postgraduate School, Monterey, CA
NASA IV&V Facility Workshop on Validation, Morgantown, WV, September 24, 2007

Slide 2: Disclaimer
The views and conclusions in this talk are those of the author and should not be interpreted as necessarily representing the official policies or endorsements, either expressed or implied, of the U.S. Government.

Slide 3: Conventional Approach to Conducting IV&V
Relies on:
- Manual examination of software requirements and design artifacts
- Manual and tool-based code analysis
- Systematic or random independent testing of target code
Poses seemingly insurmountable challenges:
- Most of these techniques are ineffective for validating the correctness of the developer's cognitive understanding of the requirements
- For complex software-intensive systems, manual IV&V techniques are inadequate for locating subtle errors in the software
- For example, sequencing behaviors are only observable at runtime and at such a fine granularity of time that human intervention at runtime is impractical

Slide 4: Software Automation
- Holds the key to the validation and verification of the behaviors of complex software-intensive systems
- Relies on formal specification of system behaviors
- Requires breaking from time-honored rules of thumb about how to conduct IV&V
- Enables IV&V teams to:
  - Accelerate their productivity
  - Cope with the impacts of accelerating technological change, or what Alan Greenspan refers to as the "revolution in information technology"

Slide 5: IEEE Definitions
- Validation: "The process of evaluating a system or component during or at the end of the development process to determine whether it satisfies specified requirements"
- Verification: "The process of evaluating a system or component to determine whether the products of a given development phase satisfy the conditions imposed at the start of that phase"

Slide 6: Current IEEE Standards View of Validation and Verification (V&V)
Checking the:
- Correctness of a target system or component against a formal model that is derived from the natural language requirements
- Consistency and completeness of the formal models, without ensuring that the developer understands the requirements and that the formal models correctly match the developer's cognitive intent of the requirements

Slide 7: IV&V Team's Independent Requirements Effort
- Describe the necessary attributes, characteristics, and qualities of any system developed to solve the problem and satisfy the intended use and user needs
- Ensure that its cognitive understanding of the problem and the requirements for any system solving the problem are correct before performing IV&V on developer-produced systems

Slide 8: Proposed Framework
- Incorporates advanced computer-aided validation techniques into the IV&V of software systems
- Allows the IV&V team to capture both:
  - Its own understanding of the problem
  - The expected behavior of any proposed system for solving the problem, via an executable system reference model

Slide 9: Terminology as Used in the Framework
- Developer-generated requirements: the requirements artifacts produced by the developer of a system
- System reference model (SRM): the artifacts developed by the IV&V team's own requirements effort

Slide 10: Contents of an SRM
- Use cases and UML artifacts
- Formal assertions to describe precisely the behaviors necessary to satisfy system goals (i.e., to solve the problem) with respect to:
  - What the system should do
  - What the system should not do
  - How the system should respond under non-nominal circumstances

Slide 11: Prerequisites for Using Computer-Based V&V Technology
Development of formal, executable representations of a system's properties, expressed as a set of desired system behaviors

Slide 12: Classes of System Behaviors
- Logical behavior: describes the cause and effect of a computation, typically represented as functional requirements of a system
- Sequencing behavior: describes behaviors that consist of sequences of events, conditions and constraints on data values, and timing; in its vanilla form it specifies sets of legal (or illegal) sequences (sketched below)
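To make the vanilla sequencing idea concrete, here is a minimal hand-written Java sketch of a sequence monitor, not a StateRover artifact; the open/read/close event names and the class itself are hypothetical examples only.

```java
// Minimal sketch of a sequencing assertion (illustrative, not StateRover output):
// the only legal sequences are "open", then any number of "read" events, then
// "close". Any other ordering drives the monitor into a violated state.
public class OpenReadCloseSequenceAssertion {
    private enum State { IDLE, OPEN, DONE, VIOLATED }
    private State state = State.IDLE;

    public void event(String name) {
        switch (state) {
            case IDLE:
                state = "open".equals(name) ? State.OPEN : State.VIOLATED;
                break;
            case OPEN:
                if ("read".equals(name))       { /* stay in OPEN */ }
                else if ("close".equals(name)) { state = State.DONE; }
                else                           { state = State.VIOLATED; }
                break;
            case DONE:
                state = State.VIOLATED;   // no further events are legal after "close"
                break;
            case VIOLATED:
                break;                    // remain violated
        }
    }

    public boolean isViolated() { return state == State.VIOLATED; }
}
```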

Slide 13: Beyond Pure Sequencing
- Timing constraints: describe the timely start and/or termination of successful computations at a specific point in time
  - Example: the deadline of a periodic computation or the maximum response time of an event handler (sketched below)
- Time-series constraints: describe the timely execution of a sequence of data values within a specific duration of time
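The maximum-response-time example above can be written as a small runtime monitor. The sketch below is hand-written and illustrative, assuming a hypothetical 200 ms deadline and a simple event/handler notification API; it is not a StateRover-generated assertion.

```java
// Hand-written sketch of a timing-constraint assertion: every monitored event
// must be handled within an assumed MAX_RESPONSE_MS deadline. Names are illustrative.
public class ResponseTimeAssertion {
    private static final long MAX_RESPONSE_MS = 200;  // assumed maximum response time
    private long eventTimeMs = -1;
    private boolean violated = false;

    /** Called when the monitored event arrives. */
    public void eventReceived(long nowMs) { eventTimeMs = nowMs; }

    /** Called when the event handler finishes; flags a missed deadline. */
    public void handlerFinished(long nowMs) {
        if (eventTimeMs >= 0 && nowMs - eventTimeMs > MAX_RESPONSE_MS) {
            violated = true;
        }
        eventTimeMs = -1;
    }

    public boolean isViolated() { return violated; }
}
```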

Slide 14: Use Cases and UML Artifacts of the SRM

Slide 15: Categories of Formal Specifications of Behavior
- Assertion-oriented specifications
  - High-level requirements are decomposed into more precise requirements that are mapped one-to-one to formal assertions
- Model-oriented specifications
  - A single monolithic formal model (either a state-based or an algebraic system) captures the combined expected behavior described by the lower-level specifications of behavior
  - Describes the expected behavior of a conceptualized system from the IV&V team's understanding of the problem space
  - May differ significantly from the system design models created by the developers in their design space

Slide 16: Example of Conducting Assertion-oriented Specification
Start with a high-level requirement:
- R1. The track processing system can only handle a workload not exceeding 80% of its maximum load capacity at runtime
Reify R1 into a lower-level requirement:
- R1.1. Whenever the track-count (cnt) average arrival rate (ART) exceeds 80% of MAX_COUNT_PER_MIN, cnt ART must be reduced back to 50% of MAX_COUNT_PER_MIN within 2 minutes, and cnt ART must remain below 60% of MAX_COUNT_PER_MIN for at least 10 minutes

Slide 17: Continuation of Example
Map R1.1 to a formal assertion expressed as a Statechart assertion
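The Statechart diagram itself is not reproduced in this transcript, so the following is only a hedged, hand-written Java sketch of the state logic R1.1 calls for. The MAX_COUNT_PER_MIN value, the class name, the sampling interface, and the choice of when the 10-minute window starts are assumptions made for illustration; the actual artifact in the talk is a StateRover Statechart assertion.

```java
// Hand-written sketch of the R1.1 logic as a three-state monitor (one possible
// interpretation of the requirement; not the StateRover Statechart assertion).
public class R11Assertion {
    private static final double MAX_COUNT_PER_MIN = 1000.0;   // assumed capacity
    private enum State { NOMINAL, OVERLOAD, RECOVERY }
    private State state = State.NOMINAL;
    private long overloadStartMs, recoveryStartMs;
    private boolean violated = false;

    /** Feed one sampled average arrival rate (cnt ART) with its timestamp. */
    public void sample(double cntArt, long nowMs) {
        if (violated) return;
        switch (state) {
            case NOMINAL:
                if (cntArt > 0.8 * MAX_COUNT_PER_MIN) {
                    state = State.OVERLOAD;                    // exceeded 80% of capacity
                    overloadStartMs = nowMs;
                }
                break;
            case OVERLOAD:
                if (cntArt <= 0.5 * MAX_COUNT_PER_MIN) {
                    state = State.RECOVERY;                    // reduced to 50% in time
                    recoveryStartMs = nowMs;                   // assumed start of 10-minute window
                } else if (nowMs - overloadStartMs > 2 * 60_000) {
                    violated = true;                           // not back to 50% within 2 minutes
                }
                break;
            case RECOVERY:
                if (cntArt >= 0.6 * MAX_COUNT_PER_MIN) {
                    violated = true;                           // rose to 60% or more too soon
                } else if (nowMs - recoveryStartMs >= 10 * 60_000) {
                    state = State.NOMINAL;                     // stayed below 60% for 10 minutes
                }
                break;
        }
    }

    public boolean isViolated() { return violated; }
}
```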

Slide 18: Advantages of Using an Assertion-Oriented Specification Approach
- Requirements are traceable because they are represented, one-to-one, by assertions (acting as watchdogs for the requirements)
- A monolithic model, by contrast, is the sum of all concerns: on detecting a violation of the formal specification, it is difficult to map that violation to a specific human-driven requirement
- Assertion-oriented specifications have a lower maintenance cost than their model-oriented counterpart when requirements change (i.e., the ability to adjust the model)

Slide 19: Continuation of Advantages
- Assertions can be constructed to represent illegal behaviors, whereas the monolithic model typically only represents "good behavior"
- It is much easier to trace the expected and actual behaviors of the target system to the required behaviors in the requirements space, and the formal assertions can be used directly as input to the verifiers in the verification dimension

Slide 20: Continuation of Advantages
- The conjunction of all the assertions becomes a "single" formal model of a conceptualized system from the requirements space
- This conjunction can be used to check for inconsistencies and other gaps in the specifications with the help of computer-aided tools

Slide 21: Validation of Formal Assertions
- Formal assertions must be executable to allow the modelers to visualize the true meaning of the assertions via scenario simulations
- One way to do this is to use an iterative process that allows the modeler to:
  - Write formal specifications using Statechart assertions
  - Validate the correctness of the assertions via simulated test scenarios within the JUnit test framework (a sketch follows below)
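A minimal sketch of the second step, using the JUnit 4 API and the hand-written R11Assertion sketch from slide 17 as a stand-in for a Statechart assertion; the real workflow drives StateRover-generated assertion classes in the same way. The numeric values assume the 1000-per-minute capacity used in that sketch.

```java
import static org.junit.Assert.*;
import org.junit.Test;

// Scenario-based validation of the R1.1 monitor sketch (illustrative only).
public class R11AssertionScenarioTest {

    @Test
    public void overloadRecoveredInTimeIsAccepted() {
        R11Assertion a = new R11Assertion();
        a.sample(850.0, 0);            // exceeds 80% of the assumed capacity
        a.sample(480.0, 60_000);       // back under 50% after 1 minute
        a.sample(550.0, 11 * 60_000);  // stays under 60% through the 10-minute window
        assertFalse(a.isViolated());
    }

    @Test
    public void overloadNotRecoveredWithinTwoMinutesIsRejected() {
        R11Assertion a = new R11Assertion();
        a.sample(850.0, 0);            // overload begins
        a.sample(700.0, 3 * 60_000);   // still above 50% after 3 minutes
        assertTrue(a.isViolated());
    }
}
```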

Slide 22: Validation of a Statechart Assertion via Scenario-based Testing

Slide 23: Process for Validating Assertions (Utilizing the Executable SRM)
- Start by testing individual assertions using scenario-based test cases to validate the correctness of the logical and temporal meaning of the assertions
- Next, test the assertions using scenario-based test cases subjected to the constraints imposed by the objects in the SRM conceptual model
- Then use an automated tool to exercise all assertions together to detect any conflicts in the formal specification

Slide 24: A Process for Formal Specification and Computer-aided Validation

Slide 25: Runtime Verification (RV)
- Uses executable SRMs
- Monitors the runtime execution of a system and checks the observed runtime behavior against the system's formal specification
- Serves as an automated observer of the program's behavior and compares it with the expected behavior per the formal specification
- Requires that the software artifacts produced by the developer be instrumented (sketched below)
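A hedged sketch of the instrumentation point RV needs: a hypothetical TrackProcessor interface stands in for the developer-produced code, and the R11Assertion sketch from slide 17 plays the role of the executable SRM observer. None of these names come from the talk's tooling.

```java
// Sketch of RV instrumentation (illustrative only): each observable runtime event
// is forwarded to the SRM assertion acting as the automated observer before the
// developer-produced code handles it.
public class InstrumentedTrackProcessor {

    /** Stand-in for the developer's target code; this interface is assumed. */
    public interface TrackProcessor {
        void process(double cntArt, long nowMs);
    }

    private final TrackProcessor target;   // system under test
    private final R11Assertion monitor;    // executable SRM assertion (observer)

    public InstrumentedTrackProcessor(TrackProcessor target, R11Assertion monitor) {
        this.target = target;
        this.monitor = monitor;
    }

    public void onTrackBatch(double cntArt, long nowMs) {
        monitor.sample(cntArt, nowMs);     // report the runtime event to the observer
        if (monitor.isViolated()) {
            System.err.println("RV: R1.1 violated at t=" + nowMs + " ms");
        }
        target.process(cntArt, nowMs);     // then let the target code run as usual
    }
}
```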

Slide 26: Execution-based Model Checking (EMC)
- Can be used if state-based design models are available
- A combination of RV and Automatic Test Generation (ATG): large volumes of automatically generated tests are used to exercise the program or system under test (SUT), with RV on the other end checking the SUT's conformance to the formal specification (illustrated below)
- Examples of ATG tools that can be used in combination with RV to conduct EMC:
  - StateRover's white-box automatic test generator (WBATG)
  - NASA's Java PathFinder (JPF)
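The following is only a toy illustration of the RV-plus-ATG combination, reusing the sketches from slides 17 and 25; a naive random generator stands in for StateRover's WBATG or JPF, neither of whose APIs is shown or implied here.

```java
import java.util.Random;

// Toy EMC loop: auto-generate many event/time sequences, drive the instrumented
// SUT with them, and let the SRM assertion check conformance on each run.
public class ExecutionBasedModelCheckSketch {

    /** Trivial stand-in SUT that never sheds load, so R1.1 can be violated. */
    static class NaiveProcessor implements InstrumentedTrackProcessor.TrackProcessor {
        public void process(double cntArt, long nowMs) { /* no load shedding */ }
    }

    public static void main(String[] args) {
        Random rng = new Random(42);
        for (int run = 0; run < 1_000; run++) {                // large volume of generated tests
            R11Assertion monitor = new R11Assertion();          // fresh observer per run
            InstrumentedTrackProcessor sut =
                new InstrumentedTrackProcessor(new NaiveProcessor(), monitor);
            long t = 0;
            for (int step = 0; step < 50 && !monitor.isViolated(); step++) {
                double cntArt = rng.nextDouble() * 1000.0;      // generated input event
                t += 1 + rng.nextInt(60_000);                   // generated time advance
                sut.onTrackBatch(cntArt, t);                    // RV checks each step
            }
            if (monitor.isViolated()) {
                System.out.println("Run " + run + " exposes a violation of R1.1");
                break;                                          // keep the first failing sequence
            }
        }
    }
}
```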

Slide 27: Execution-based Model Checking of State-based Design Models

Slide 28: Three Ways in Which to Use the Auto-generated Tests
- To search for severe programming errors, of the kind that induces a JUnit error status, such as a NullPointerException
- To identify test cases that violate temporal assertions
- To identify input sequences that lead the statechart under test to particular states of interest
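One way to picture these three uses is as a triage of each generated run's outcome. The helper below is a hypothetical illustration only, reusing the R11Assertion sketch; the enum, the Runnable/BooleanSupplier parameters, and the classification scheme are assumptions, not StateRover's API.

```java
import java.util.function.BooleanSupplier;

// Illustrative triage of a single auto-generated test run: severe error,
// temporal-assertion violation, or an input sequence that drives the
// statechart under test to a state of interest.
public class GeneratedTestOutcome {

    public enum Kind { SEVERE_ERROR, ASSERTION_VIOLATION, STATE_OF_INTEREST, UNINTERESTING }

    public static Kind classify(Runnable generatedTest,
                                R11Assertion monitor,
                                BooleanSupplier reachedStateOfInterest) {
        try {
            generatedTest.run();                        // replay one generated event sequence
        } catch (RuntimeException e) {                  // e.g. NullPointerException, a JUnit error status
            return Kind.SEVERE_ERROR;                   // way 1: severe programming error
        }
        if (monitor.isViolated()) {
            return Kind.ASSERTION_VIOLATION;            // way 2: a temporal assertion was violated
        }
        if (reachedStateOfInterest.getAsBoolean()) {
            return Kind.STATE_OF_INTEREST;              // way 3: input led to a target state
        }
        return Kind.UNINTERESTING;
    }
}
```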

Slide 29: Example
- A StateRover-generated WBTestCase creates sequences of events and conditions for the statechart under test
- It generates only sequences consisting of events that the SUT or some assertion is sensitive to: it repeatedly observes all events that potentially affect the SUT in its current configuration state, selects one of those events, and fires the SUT with that event

Slide 30: Hybrid Model- and Specification-based WBATG
- StateRover's WBTestCase auto-generates:
  - Events
  - Time-advance increments, for the correct generation of timeoutFire events
  - External data objects of the type that the statechart prototype refers to
- The WBATG observes all entities, namely the SUT and all embedded assertions, and collects all possible events from all of those entities

Slide 31: Verification of Target Code
- If only executable code is available, the IV&V team can use the StateRover white-box tester in tandem with the executable assertions of the SRM to automate the testing of the target code produced by the developer
- The executable assertions of the SRM:
  - Keep track of the set of possible next events to drive the SUT
  - Serve as the observer for the RV during the test

Slide 32: Automated Testing Using the System Reference Model

Slide 33: Manual Examination of the Developer-Generated Requirements
- The IV&V team can use the SRM to validate the textual descriptions of the requirements produced by the developer
- Start by associating the developer-generated requirements with the use cases to obtain the context for assessing the requirements
- Next, trace the developer-generated requirements to the other artifacts; for example, trace the requirements to the:
  - Activity and sequence diagrams, to help identify the subsystems or components responsible for the system requirements
  - Domain model, to identify the correct naming of the objects and events
- Then use the traces to identify the critical components of the target system for more thorough testing

Slide 34: Recap
- The IV&V team needs to capture its own understanding of the problem to be solved and the expected behavior of any system for solving the problem, using SRMs
- Complex system sequencing behaviors can mainly be understood, and their formal specifications most effectively validated, via execution-based techniques
- We advocate the use of assertion-oriented specification
- We presented a framework for incorporating computer-aided validation into the IV&V of complex reactive systems
- We described how the SRM can be used to automate the testing of the software artifacts produced by the developer of the system

Slide 35: Challenge for NASA's Software Engineering Community
Taking the proposed validation framework from being exotic to being ubiquitous while harnessing:
- "Creative destruction," coined by the late Joseph Schumpeter
  - Reallocate resources to new, productive business practices (the antithesis of catering to the human need for stability and permanence)
- "Disruptive innovation," coined by Clayton Christensen
  - Cause a technological innovation, product, or service to overturn the existing dominant technology or status quo product in the market

