CS4311 Spring 2011 Verification & Validation Dr. Guoqiang Hu Department of Computer Science UTEP.


1 CS4311 Spring 2011 Verification & Validation Dr. Guoqiang Hu Department of Computer Science UTEP

2 Announcement
What: CS4390 (cross-listed with CS5390): “Software Testing”. Testing only, nothing else!
When: this summer
Prerequisite: CS4311
* Note: the V&V content covered in this course (CS4311) can be considered an introduction to the above course.

3 What Are Verification and Validation?
(Groups of 2, 5 minutes: Why? Who? What? Against what? When? How?)
Verification: evaluating the product (a system or component) of a development phase to determine whether it satisfies the specification established at the start of that phase. Did we build the system or component right?
Validation: evaluating the product (mostly the system or a subsystem) during or at the end of the development process to determine whether it satisfies the specified requirements. Did we build the right system?

4 V&V Activities and the Software Lifecycle
Requirements engineering:
- Determine general test strategy/plan (techniques, criteria, team)
- Test the requirements specification for: completeness; consistency; feasibility (functional and performance requirements); testability (specific, unambiguous, quantitative, traceable)
- Generate acceptance/validation testing data
Design:
- Determine integration test strategy
- Assess/test the design: completeness; consistency; handling scenarios; traceability (to and fro) (design walkthrough, inspection)

5 V&V Activities and the Software Lifecycle
Implementation:
- Determine unit test strategy
- Techniques (static vs. dynamic): read it/have it read; code walkthrough/formal code inspection; formal verification/proof; manual testing; tools and supporting scaffolding (driver/harness, stub)
Maintenance:
- Determine regression test strategy
- Documentation maintenance (vital)
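The driver and stub mentioned above can be sketched in a few lines. This is a minimal illustration, not from the slides: `report_grade`, `stub_lookup`, and `run_unit_tests` are hypothetical names; the stub stands in for a real data source the unit would normally depend on, and the driver exercises the unit in isolation.

```python
# Unit under test. The score lookup is injected as a parameter so a
# stub can replace the real database call during unit testing.
def report_grade(student_id, lookup_score):
    """Map a numeric score to a letter grade."""
    score = lookup_score(student_id)
    if score >= 90:
        return "A"
    elif score >= 80:
        return "B"
    else:
        return "C"

# Stub: a stand-in for the real data source, returning canned values.
def stub_lookup(student_id):
    return {"s1": 95, "s2": 83, "s3": 70}[student_id]

# Driver: runs the unit with the stub in place of the real dependency.
def run_unit_tests():
    assert report_grade("s1", stub_lookup) == "A"
    assert report_grade("s2", stub_lookup) == "B"
    assert report_grade("s3", stub_lookup) == "C"
    return "all unit tests passed"

print(run_unit_tests())
```

Injecting the dependency (rather than hard-coding it) is what makes the unit testable without its real collaborators.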

6 V&V Activities and the Software Lifecycle The “V” model:

7 Where Do the Errors Come From?
(Groups of 2, 3 minutes: What kinds of errors? Who?)
Kinds of errors: missing information; wrong information/design/implementation; extra information
Facts about errors:
- To err is human (but different people have different error rates).
- Different studies indicate 30 to 85 errors per 1000 lines.
- After extensive testing, 0.5 to 3 errors per 1000 lines remain.
- The longer an error goes undetected, the more costly it is to correct.

8 Basic Concepts
Correct specification: the specification matches the client’s intent.
Correct program: the program matches its specification.
Error: a human activity that leads to the creation of a fault.
Fault/bug: the physical manifestation of an error; it can cause a failure.
Failure: the state when a fault is encountered during execution.
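The error/fault/failure distinction can be seen in a tiny example (a hypothetical `max_value` function, not from the slides): the programmer's mistaken assumption is the error, the wrong initializer it produced is the fault, and the fault only surfaces as a failure on certain inputs.

```python
def max_value(values):
    # Fault: `largest` starts at 0 instead of values[0], because the
    # programmer (erroneously) assumed all inputs are non-negative.
    largest = 0
    for v in values:
        if v > largest:
            largest = v
    return largest

# The fault stays latent when some value is positive: no failure observed.
assert max_value([3, 1, 7]) == 7
# The same fault produces a failure when every value is negative:
# the function returns 0, but the correct answer is -2.
assert max_value([-5, -2]) == 0
```

This is why a passing test run never proves the absence of faults: it only shows that the inputs tried did not trigger a failure.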

9 Basic Concepts
Fault identification: the process of determining what fault caused a failure.
Fault correction: the process of changing a system to remove a fault.
Debugging: the process of finding and fixing program faults.
Testing: designing and executing tests, and (if bugs are present) debugging.
Test case: a particular set of inputs and the expected output.

10 Basic Concepts
Test set: a finite set of test cases working together with the same purpose.
Test objective: the main goal of a particular test, e.g., finding faults (fault detection), or demonstrating reliability/building confidence (no or low failure rate in normal use). The test objective affects the test strategy, test criteria, and test selection.
Test criteria: specify the testing requirements/stopping rule/measurement; closely linked to test techniques. E.g., for coverage-based test techniques: 100% statement coverage, 100% branch coverage, or both.
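A coverage-based criterion can be made concrete with a sketch (the `classify` function and its test set are illustrative, not from the slides): each test case is an (input, expected output) pair, and the set below exercises both outcomes of every branch.

```python
def classify(n):
    """Unit under test with two decisions (four branch outcomes)."""
    if n < 0:
        return "negative"
    if n == 0:
        return "zero"
    return "positive"

# Test set: three test cases, each a (input, expected output) pair.
# Together they cover every branch: n < 0 true and false, n == 0
# true and false, satisfying a 100% branch-coverage criterion.
test_set = [(-1, "negative"), (0, "zero"), (5, "positive")]

for inp, expected in test_set:
    assert classify(inp) == expected
```

Dropping the `(5, "positive")` case would still execute most lines but leave the `n == 0` false branch uncovered, which is why branch coverage is a stronger stopping rule than statement coverage.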

11 The General Approaches of Verification & Validation
(Groups of 2, 2 minutes: How?)

12 The Exhaustive Test Examples
(Groups of 2, 3 minutes)
How many test cases are required? How long will it take?

13 The Exhaustive Test Examples
(Groups of 2, 3 minutes)
[figure: A, B, C]

14 The Exhaustive Test Examples
(Groups of 2, 3 minutes)
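The arithmetic behind the exhaustive-testing exercise above can be sketched as follows. The concrete numbers are assumptions for illustration (the slides' actual examples are in the omitted figures): a function of two 32-bit integer inputs, tested at an optimistic one billion cases per second.

```python
# Exhaustive testing of a function with two 32-bit integer inputs:
# every possible input pair is a distinct test case.
cases = 2 ** (32 + 32)                  # 2**64 input pairs

rate = 10 ** 9                          # optimistic: 1e9 tests per second
seconds_per_year = 60 * 60 * 24 * 365

years = cases / (rate * seconds_per_year)
print(f"{cases} cases, roughly {years:.0f} years at 1e9 tests/s")
```

Even under these generous assumptions the run takes centuries (on the order of 585 years), which is why exhaustive testing is infeasible and test selection techniques are needed.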

15 The Purpose of Testing
The purpose of testing is NOT to prove the program correct. Instead, it is to find problems in the program so that the found problems can be fixed.
Interesting phenomena of testing results:
- Successful test: the more faults a test case/set finds, the more successful it is.
- Quality of the code: the more faults are found in a unit of code, the worse the quality of that code normally is. Studies have found that more faults may still go undetected in the same piece of code, although the quality of the code does improve once the found faults are fixed.

16 Hierarchy of V&V Techniques
V&V
- Static techniques
  - Informal analysis: review, inspection, walkthrough
  - Formal analysis
- Dynamic techniques
  - Testing
  - Symbolic execution

17 Hierarchy of Testing
Testing
- Program testing
  - Unit testing
    - Black box: equivalence, boundary, decision table, state transition, use case, domain analysis
    - White box: control flow, data flow
  - Integration testing: top down, bottom up, big bang, sandwich
  - System testing
    - Function
    - Properties: performance, reliability, availability, security, usability, documentation, portability, capacity
    - Acceptance testing: ad hoc, pilot, alpha, beta
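Two of the black-box techniques in this hierarchy, equivalence partitioning and boundary-value analysis, can be sketched against a hypothetical unit (the `is_valid_percentage` function and its valid range 0..100 are assumptions for illustration, not from the slides):

```python
def is_valid_percentage(n):
    """Unit under test: accepts integers in the range 0..100."""
    return 0 <= n <= 100

# Equivalence partitioning: the input domain splits into three classes
# (below range, in range, above range); one representative each.
equivalence_cases = [(-10, False), (50, True), (150, False)]

# Boundary-value analysis: test at and adjacent to each edge of the
# valid range, where off-by-one faults cluster.
boundary_cases = [(-1, False), (0, True), (1, True),
                  (99, True), (100, True), (101, False)]

for inp, expected in equivalence_cases + boundary_cases:
    assert is_valid_percentage(inp) == expected
```

Both techniques derive test cases purely from the specification, never from the code, which is what makes them black-box.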


19 Types of System Testing
Function testing: the integrated system performs the specified functions
- Scenarios
- Black-box techniques
Properties testing: the integrated system is tested against non-functional requirements (performance, reliability, security, usability, etc.)
Performance:
- Stress test: maximum throughput
- Overload test: exceed the specification
- Volume test: sustained large throughput
- Response time
- Robustness: test things not specified, but quite possible
- Recovery: crash, or recoverable?
Acceptance testing: customers test the system
- Pilot (initial, customer), alpha test (in-house), beta test (on-site)
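A response-time check, one of the performance tests listed above, reduces to timing an operation against a budget. This is a minimal sketch: `handle_request` and the 0.5-second requirement are hypothetical stand-ins, not from the slides.

```python
import time

def handle_request():
    """Stand-in for the operation whose response time is under test."""
    return sum(range(100_000))

# Time one invocation against the (assumed) 0.5 s response-time budget.
start = time.perf_counter()
result = handle_request()
elapsed = time.perf_counter() - start

assert elapsed < 0.5, f"response time {elapsed:.3f}s exceeds 0.5s budget"
```

A real performance test would repeat the measurement many times and check a percentile rather than a single run, since individual timings vary.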

20 Types of Faults (not exhaustive)
- Algorithmic: the algorithm or logic does not produce the proper output for the given input
- Syntax: improper use of language constructs
- Computation (precision): a formula’s implementation is wrong, or its result is not computed to the correct degree of accuracy
- Documentation: the documentation does not match what the program does
- Stress (overload): data structures are filled past capacity
- Capacity: system performance becomes unacceptable as activity reaches its specified limit
- Timing: the coordination of events is not correct
- Throughput: the system does not perform at the required speed
- Recovery: a failure is encountered and the system does not recover (crashes)
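A stress (overload) test probes the first of these structural limits: fill a data structure past its capacity and check that it degrades gracefully instead of crashing or silently corrupting data. The `FixedBuffer` class below is a hypothetical example, not from the slides.

```python
class FixedBuffer:
    """A bounded buffer that rejects items once full (graceful overload)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.items = []

    def add(self, item):
        if len(self.items) >= self.capacity:
            return False          # overload: reject rather than crash
        self.items.append(item)
        return True

# Stress test: push past capacity and verify graceful degradation.
buf = FixedBuffer(capacity=3)
results = [buf.add(i) for i in range(5)]

assert results == [True, True, True, False, False]
assert len(buf.items) == 3        # nothing lost or corrupted below capacity
```

A buffer with a stress fault would instead crash, overwrite existing items, or report success while dropping data; this test would expose all three behaviors.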

21 Who Is Involved?
- Professional/third-party testers: organize and run tests
- Analysts: involved in system requirements definition and specification
- Designers: involved in the design; understand the proposed solution and its constraints
- Implementers: involved in the implementation; understand the constraints associated with it
- Configuration management representative: arranges for changes to be made consistently across all artifacts
The advice: view testing as part of the development process. Testing is the last line of defense: errors indicate that there is a problem with the development process.

22 Test Plan
Objectives of a test plan:
- Facilitate the task of testing (strategy): the scope, approach, resources, and schedule; test techniques; test criteria; test documentation requirements
- Avoid repetition
- Improve test coverage
- Improve test efficiency
- Provide structure for final tests
- Improve communication about testing
- Provide structure for organizing, scheduling, and managing

23 Test Plan Report
Table of Contents
DOCUMENT CONTROL
  Approval
  Document Change Control
  Distribution List
  Change Summary
1. INTRODUCTION
  1.1. Purpose
  1.2. Scope
  1.3. System Overview
  1.4. Test Approach Overview
2. APPLICABLE REFERENCES
3. TESTING APPROACH
4. TEST SCHEDULE
5. TEST XX

24 3. TESTING APPROACH
- Specify the types of tests to be performed
- List specific tests (test descriptions are in section 5)
- May include:
  - Test management requirements: how testing is to be managed
  - Personnel requirements
  - Hardware requirements
  - Software requirements
  - Cost

25 Section 5 and Later
- Test no.
- Current status (passed / failed / pending)
- Test title
- Testing approach
- Concluding remarks
- Testing team
- Date completed

26 Assignments
Lead: V&V
Due date: Tuesday, April 19, 2011
Due time: midnight MDT

