Verification and Validation Overview


1 Verification and Validation Overview
References:
- Schach, Object-Oriented and Classical Software Engineering, McGraw-Hill
- Pressman, Software Engineering: A Practitioner's Approach, McGraw-Hill
- Pfleeger, Software Engineering: Theory and Practice, Prentice Hall
- Davis, Software Requirements: Objects, Functions, and States, Prentice Hall

2 Purpose of V&V (groups of 3, 5 minutes)
What is verification and validation? What is the purpose of verification and validation?

3 Purpose of V&V

- Give programmers information they can use to prevent faults
- Give management information to evaluate risk
- Provide software that is reasonably defect-free
- Achieve a "testable" design (one that can be easily verified)
- Validate the software (demonstrate that it works)

4 Definitions-1 (note: these are the usual definitions, but they do not match all authors or the 1990 IEEE glossary)

- Validation: evaluation of an object to demonstrate that it meets expectations. (Did we build the right system?)
- Verification: evaluation of an object to demonstrate that it meets its specification. (Did we build the system right?) Also: evaluation of the work product of a development phase to determine whether the product satisfies the conditions imposed at the start of the phase.
- Correct program: the program matches its specification.
- Correct specification: the specification matches the client's intent.

5 Definitions-2

Error (a.k.a. mistake)
- A human activity that leads to the creation of a fault
- A human error results in a fault, which may, at runtime, result in a failure
- Kaner: "It's an error if the software doesn't do what the user intended"

Fault (a.k.a. bug)
- May result in a failure
- A discrepancy between what something should contain (in order for failure to be impossible) and what it does contain
- The physical manifestation of an error

Failure (a.k.a. symptom, problem, incident)
- Observable misbehavior
- Actual output does not match the expected output
- Can only happen when a thing is being used
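The error/fault/failure chain can be seen in a toy Python sketch (my own illustration, not from the slides; the `largest` function and its bug are hypothetical). The fault is always present in the code, but a failure is only observed for certain inputs:

```python
def largest(numbers):
    """Return the largest element of a non-empty list.

    The programmer's error (a thought mistake) produced a fault:
    'best' is initialized to 0 instead of numbers[0].
    """
    best = 0  # fault: wrong initial value
    for n in numbers:
        if n > best:
            best = n
    return best

print(largest([3, 1, 4]))   # 4  -- fault present, but no failure observed
print(largest([-5, -2]))    # 0  -- failure: expected output is -2
```

Testing only lists containing a positive number would never reveal the fault; the discrepancy becomes an observable failure only when an all-negative list is used.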

6 Definitions-3

- Fault identification: the process of determining what fault caused a failure
- Fault correction: the process of changing a system to remove a fault
- Debugging: the act of finding and fixing program errors

7 Definitions-4

- Testing: the act of designing, debugging, and executing tests
- Test: a sample execution to be examined
- Test case: a particular set of inputs and the expected output
- Oracle: any means used to predict the outcome of a test
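These terms can be illustrated with a small Python sketch (my own example, not from the slides): a hypothetical `slow_sqrt` implementation under test, a test case consisting of an input plus expected behavior, and an oracle that predicts the correct outcome using the trusted `math.sqrt`:

```python
import math

def slow_sqrt(x):
    """Implementation under test (hypothetical): square root by bisection."""
    lo, hi = 0.0, max(1.0, x)
    for _ in range(100):
        mid = (lo + hi) / 2
        if mid * mid < x:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def oracle(x, result, tol=1e-6):
    """Oracle: judge a test's outcome against a trusted reference."""
    return abs(result - math.sqrt(x)) < tol

# A test case: a particular input together with the means to
# judge the output. Running it is a test.
x = 2.0
assert oracle(x, slow_sqrt(x))
```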

8 Definitions-5

Significant test case
- A test case with a high probability of detecting an error
- One test case may be more significant than another

Significant test set
- A test set with a high probability of detecting an error
- A test set is more significant than another if the first is a superset of the second
- The number of test cases does not determine the significance

Regression testing: rerun a test suite to see whether a change fixed a bug and whether the change introduced a new one
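Regression testing can be sketched in a few lines of Python (a hypothetical example of my own, not from the slides): the suite of (input, expected) pairs is rerun after every change, and a new pair is added whenever a bug is found, so a later fix cannot silently reintroduce it:

```python
# Regression suite: (args, expected) pairs, rerun after every change.
REGRESSION_SUITE = [
    (([3, 1, 4],), 4),
    (([-5, -2],), -2),   # added when an all-negative-list bug was found
]

def run_suite(func, suite):
    """Return the list of failing cases: (args, expected, actual)."""
    return [(args, exp, func(*args)) for args, exp in suite
            if func(*args) != exp]

def largest(numbers):
    """Corrected version of a hypothetical function under test."""
    best = numbers[0]
    for n in numbers:
        if n > best:
            best = n
    return best

assert run_suite(largest, REGRESSION_SUITE) == []
```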

9 Definitions-6

Let S be a relation, the specification of a program. Let P be the implementation of the program.
- R is the range; r ∈ R.
- D is the domain; d ∈ D.
- S(r, d): the specification.
- P(r, d): the implementation.

10 Definitions-7

R: range, r ∈ R. D: domain, d ∈ D. S(r, d): the specification. P(r, d): the implementation.
- Failure: P(r, d) but not S(r, d)
- Test case: a pair (r, d) such that S(r, d)
- Test set T: a finite set of test cases
- P passes T if: ∀ t ∈ T, t = (r, d) ⇒ (S(r, d) ⇒ P(r, d))
- T is ideal if: (∃ d, r | S(r, d) ∧ ¬P(r, d)) ⇒ (∃ t ∈ T | t = (r′, d′) ∧ S(r′, d′) ∧ ¬P(r′, d′))

11 Definitions-7 (in groups, translate these into English)

- Failure: P(r, d) but not S(r, d)
- Test case: a pair (r, d) such that S(r, d)
- Test set T: a finite set of test cases
- P passes T if: ∀ t ∈ T, t = (r, d) ⇒ (S(r, d) ⇒ P(r, d))
- T is ideal if: (∃ d, r | S(r, d) ∧ ¬P(r, d)) ⇒ (∃ t ∈ T | t = (r′, d′) ∧ S(r′, d′) ∧ ¬P(r′, d′))

12 Take Home Message

If your test set does not identify any bugs, was your testing successful?

13 Parnas: “There are only three engineering techniques for verification”
- Mathematical analysis
- Exhaustive case analysis
- Prolonged realistic testing

14 Parnas: “There are only three engineering techniques for verification”
- Mathematical analysis
  - Works well for continuous functions (software engineering is more difficult than other engineering)
  - Cannot interpolate reliably for discrete functions
- Exhaustive case analysis
- Prolonged realistic testing

15 Parnas: “There are only three engineering techniques for verification”
- Mathematical analysis
- Exhaustive case analysis
  - Only possible for systems with a small state space
- Prolonged realistic testing
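When the state space really is small, exhaustive case analysis is mechanical. A Python sketch (my own hypothetical example, not from the slides) verifying a bit trick over its entire 8-bit input space, all 256 cases:

```python
def lowest_set_bit(x):
    """Implementation under analysis: isolate the lowest set bit
    of an 8-bit value using the x & -x trick."""
    return x & (-x) & 0xFF

def spec(x):
    """Specification: the lowest power of two dividing x (0 for 0)."""
    if x == 0:
        return 0
    b = 1
    while not (x & b):
        b <<= 1
    return b

# Exhaustive case analysis: check every possible input.
assert all(lowest_set_bit(x) == spec(x) for x in range(256))
```

For a 32-bit input the same loop would need about four billion cases, and for two 64-bit inputs it is hopeless, which is why the technique applies only to small state spaces.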

16 Hierarchy of V&V techniques (not exhaustive)

Dynamic techniques
- Testing
Static techniques
- Informal analysis: walkthrough, inspection, review
- Formal analysis: static analysis, symbolic execution, model checking, proofs

17 Hierarchy of V&V techniques

Dynamic techniques
- Testing
Static techniques
- Informal analysis: walkthrough, inspection, review
- Formal analysis: static analysis, symbolic execution, model checking, proofs

Note: this does not match Van Vliet's definition of testing. I use "testing" to mean that the program is being executed; he does not.

18 Types of Faults (in groups, 3 minutes)
List all the types and causes of faults: what can go wrong in the development process?

19 Some Types of Faults

- Algorithmic: the algorithm or logic does not produce the proper output for the given input
- Syntax: improper use of language constructs
- Computation (precision): a formula's implementation is wrong, or the result is not computed to the correct degree of accuracy
- Documentation: the documentation does not match what the program does
- Stress (overload): data structures are filled past capacity
- Capacity: the system's performance becomes unacceptable as activity reaches its specified limit
- Timing: the code coordinating events is inadequate
- Throughput: the system does not perform at the required speed
- Recovery: a failure is encountered and the system does not behave correctly
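A computation (precision) fault, for instance, can be reproduced in a few lines of Python (a hypothetical illustration of my own, not from the slides): naive floating-point summation produces a result that is not accurate to the degree a reader might expect:

```python
import math

values = [0.1] * 10

# Precision fault: naive summation accumulates rounding error.
naive = sum(values)
print(naive)                      # 0.9999999999999999, not 1.0
assert naive != 1.0

# Two remedies: tolerance-based comparison, or exact summation.
assert math.isclose(naive, 1.0)
assert math.fsum(values) == 1.0   # fsum tracks partials exactly
```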

20 Causes of Faults

- Requirements: incorrect or missing requirements
- System design: incorrect translation; incorrect design specification
- Program design: incorrect design specification
- Program implementation: incorrect design interpretation
- Unit testing: incorrect documentation; incorrect semantics
- System testing: incomplete testing
- New faults introduced while correcting others

21 Some Verification and Validation Techniques

Phases: requirements, design, implementation, testing, operation, maintenance
Techniques: reviews (walkthroughs/inspections), synthesis, traditional testing, runtime monitoring, model checking, correctness proofs

22 Effectiveness of Fault Detection Techniques

23 What does this slide say? (Groups: 2 min.)

24 Error Estimates

- 3 errors per 1000 keystrokes for trained typists
- 1 bug per 100 lines of code (after publication)
- 1.5 bugs per line of code (all together, including typing errors)
- Testing is 30-90% of the cost of a product
- Probability of correctly changing a program: 50% if fewer than 10 lines; 20% if between 10 and 50 lines
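Applied as back-of-the-envelope arithmetic (using the slide's rough estimates; the 10,000-line program is hypothetical):

```python
# Expected residual bugs using the "1 bug per 100 LOC after
# publication" rule of thumb -- an estimate, not a measurement.
lines_of_code = 10_000
post_release_rate = 1 / 100

expected_bugs = lines_of_code * post_release_rate
print(expected_bugs)   # 100.0
```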

