
Slide 1: ICS 52: Introduction to Software Engineering
Lecture Notes for Summer Quarter, 2003. Michele Rousseau, Topic 11.
Partially based on lecture notes written by Sommerville, Frost, Van Der Hoek, Taylor & Tonne. Duplication of course material for any commercial purpose without the written permission of the lecturers is prohibited.

Slide 2: Today's Lecture
- Quality assurance
- An introduction to testing

Slide 3: ICS 52 Life Cycle
- Requirements phase: verify
- Design phase: verify
- Implementation phase: test
- Testing phase: verify

Slide 4: Implementation/Testing Interaction
- Implementation (previous lecture)
- Testing (this lecture)

Slide 5: The Seriousness of the Problem
- Mars Pathfinder: metric or English system
- Audi 5000: sudden acceleration, feature or fault?
- Mariner 1 launch: veered off course
- AT&T telephone network: down for 9 hours
- Ariane 5
- Pentium: FPU error
- X-ray machine: over-radiation (the Therac-25)
- LAS (London Ambulance Service)

Slide 6: Impact of Failures
- Not just "out there": Mars Pathfinder, Mariner 1, Ariane 5
- But also "at home": your car, your call to your mom, your homework, your hospital visit
Peter Neumann's Risks Forum: http://catless.ncl.ac.uk/Risks

Slide 7: Quality Assurance
- What qualities do we want to assure?
- Correctness (most important?)
- How do we assure correctness? By running tests. How else?
- Can qualities other than correctness be "assured"?
- How is testing done?
- When is testing done?
- Who tests?
- What are the problems?

Slide 8: Software Qualities
Correctness, reliability, robustness, performance, usability, verifiability, maintainability, repairability, safety, evolvability, reusability, portability, survivability, understandability.
We want to show that the relevant qualities exist.

Slide 9: Quality Assurance
- Assure that each of the software qualities is met: goals set in the requirements specification, goals realized in the implementation
- Sometimes easy, sometimes difficult: portability versus safety
- Sometimes immediate, sometimes delayed: understandability versus evolvability
- Sometimes provable, sometimes doubtful: size versus correctness

Slide 10: Verification and Validation
- Verification: "Are we building the product right?" (Boehm). The software should conform to its specification. Techniques: testing, reviews, walk-throughs, inspections. Checks internal consistency and consistency with the previous step.
- Validation: "Are we building the right product?" The software should do what the user really requires; ascertaining that the software meets the customer's intent.
- Correctness has no meaning independent of specifications.

Slide 11: Problem #1: Eliciting the Customer's Intent
The actual specification, the "correct" specification, and the real needs may all differ. No matter how sophisticated the QA process is, there is still the problem of creating the initial specification.

Slide 12: Problem #2: QA Is Tough
- Complex data communications: electronic funds transfer
- Distributed processing: web search engine
- Stringent performance objectives: air traffic control system
- Complex processing: medical diagnosis system
Sometimes the software system is extremely complicated, making it tremendously difficult to perform QA.

Slide 13: Problem #3: Management Aspects of QA
- Who does what part of the testing? The QA (Quality Assurance) team? Are developers involved? How independent is the independent testing group?
- What happens when bugs are found?
- What is the reward structure?
(Diagram: project management overseeing the development group and the QA group.)

Slide 14: Problem #4: QA vs. Developers
- Quality assurance lays out the rules: you will check in your code every day, you will comment your code, you will ...
- Quality assurance also uncovers the faults: raps developers on the knuckles, creates an image of "competition"
- Quality assurance is viewed as cumbersome: "Just let me code"
- What about rewards?
Quality assurance has a negative connotation.

Slide 15: Problem #5: You Can't Test Exhaustively
A small program whose loop executes up to 20 times can have 10^14 possible paths. Executing one test per millisecond, it would take about 3,170 years to test this program. Out of the question.
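The slide's figure can be checked with a few lines of arithmetic (a back-of-the-envelope sketch in Python; the 10^14 path count is taken from the slide as given):

```python
# Back-of-the-envelope check of the exhaustive-path claim.
paths = 10**14                      # possible paths (from the slide)
tests_per_second = 1_000            # one test per millisecond
seconds = paths / tests_per_second  # 10**11 seconds of testing
years = seconds / (365 * 24 * 3600)
print(round(years))                 # about 3170 years
```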

Slide 16: Simple Example: A 32-Bit Multiplier
- Input: two 32-bit integers
- Output: the 64-bit product of the inputs
- Testing hardware: checks one billion products per second (roughly one check per 2^-30 seconds)
- How long to check all possible products? 2^64 * 2^-30 = 2^34 seconds, about 512 years
- What if the implementation is based on table lookups?
- How would you know that the spec is correct?
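The slide's arithmetic can be reproduced directly (a sketch using only the slide's own numbers); note that the 512-year figure uses the common approximation of 2^25 seconds per year:

```python
# Exhaustively checking a 32-bit multiplier: 2**64 input pairs.
pairs = 2**64
checks_per_second = 2**30             # roughly one billion checks/second
seconds = pairs // checks_per_second  # 2**34 seconds
years_approx = seconds / 2**25        # slide's approximation: 2**9 = 512 years
years_exact = seconds / (365 * 24 * 3600)
print(years_approx, round(years_exact))  # 512.0, about 545
```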

Slide 17: An Idealized View of QA
Complete formal specs of the problem to be solved, turned by correctness-preserving transformations into a design in formal notation, then code in a verifiable language, then executable machine code, executed on verified hardware.

Slide 18: A Realistic View of QA
A mixture of formal and informal specifications, manually transformed into a design in mixed notation, then code in C++, Ada, Java, ..., compiled by a commercial compiler into Pentium machine code, executed on commercial hardware with commercial firmware.

Slide 19: The V & V Process
- Is a whole life-cycle process: V & V must be applied at each stage in the software process.
- Has two principal objectives: the discovery of defects in a system, and the assessment of whether or not the system is usable in an operational situation.

Slide 20: Static and Dynamic Verification
- Software inspections: concerned with analysis of the static system representation to discover problems (static verification). May be supplemented by tool-based document and code analysis.
- Software testing: concerned with exercising and observing product behaviour (dynamic verification). The system is executed with test data and its operational behaviour is observed.

Slide 21: Static and Dynamic V&V (diagram slide)

Slide 22: V & V Confidence
Depends on the system's purpose, user expectations, and the marketing environment:
- Software function: the level of confidence depends on how critical the software is to an organisation
- User expectations: users may have low expectations of certain kinds of software
- Marketing environment: getting a product to market early may be more important than finding defects in the program

Slide 23: V & V Planning
- Careful planning is essential
- Start early (remember the V model); perpetual testing
- Balance static verification and testing
- Define standards for the testing process rather than describing product tests

Slide 24: Static Analysis
- Software inspection: examine the source representation with the aim of discovering anomalies and defects
- May be used before implementation
- May be applied to any representation of the system (requirements, design, test data, etc.)
- A very effective technique for discovering errors
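As a toy illustration of static verification (an assumed example, not a tool mentioned in the course), a checker can flag anomalies by analyzing the source text without ever executing it; here, bare `except:` handlers in Python, which can silently swallow defects:

```python
import ast

def find_bare_excepts(source):
    """Toy static check: report the line numbers of bare 'except:'
    handlers. The source is parsed, never executed."""
    hits = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.ExceptHandler) and node.type is None:
            hits.append(node.lineno)
    return hits

sample = """
try:
    risky()
except:
    pass
"""
print(find_bare_excepts(sample))  # [4]
```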

Slide 25: Inspection Success
- Many different defects may be discovered in a single inspection. In testing, one defect may mask another, so several executions are required.
- Inspections reuse domain and programming knowledge, so reviewers are likely to have seen the types of error that commonly arise.

Slide 26: Inspections and Testing
- Inspections and testing are complementary, not opposing, verification techniques
- Both should be used during the V & V process
- Inspections can check conformance with a specification, but cannot check conformance with the customer's real requirements and cannot validate dynamic behaviour
- Inspections cannot check non-functional characteristics such as performance, usability, etc.


Slide 28: Testing
- The only validation technique for non-functional requirements
- Should be used in conjunction with static verification to provide full V&V coverage
"Program testing can be used to show the presence of bugs, but never to show their absence." (E. W. Dijkstra)

Slide 29: What Is Testing?
- Exercising a module, a collection of modules, or a system: use predetermined inputs (the "test case"), capture actual outputs, compare actual outputs to expected outputs
- Actual outputs equal to expected outputs: the test case succeeds
- Actual outputs unequal to expected outputs: the test case fails
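The cycle described above (predetermined input, captured actual output, comparison with expected output) can be sketched in a few lines; `absolute` is a hypothetical unit under test, not part of the course material:

```python
def run_test_case(fn, test_input, expected):
    """Run one predetermined test case: capture the actual output
    and compare it with the expected output."""
    actual = fn(*test_input)
    return "pass" if actual == expected else "fail"

def absolute(x):
    """Hypothetical unit under test."""
    return x if x >= 0 else -x

print(run_test_case(absolute, (-3,), 3))  # pass
print(run_test_case(absolute, (0,), 0))   # pass
```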

Slide 30: Limits of Software Testing
- "Good" testing will find bugs
- "Good" testing is based on requirements, i.e. it tries to find differences between the expected and the observed behavior of systems or their components
- V&V should establish confidence that the software is fit for purpose
- BUT remember: testing can only prove the presence of bugs, never their absence; it cannot prove the software is defect-free
- Rather, the software must be good enough for its intended use, and the type of use determines the degree of confidence that is needed

Slide 31: Testing Terminology
- Failure: incorrect or unexpected output, judged against the specification; a symptom of a fault
- Fault: an invalid execution state; a symptom of an error; may or may not produce a failure
- Error: a defect, anomaly, or "bug" in the source code; may or may not produce a fault
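A small hypothetical Python example illustrates the chain: an error in the source may or may not produce a fault, and a fault may or may not surface as a failure:

```python
def count_below(values, limit):
    """Intended spec: count how many values are STRICTLY below limit.
    The '<=' is an error (bug) in the source; it should be '<'."""
    return sum(1 for v in values if v <= limit)

# No value equals limit: the error stays latent, no fault, no failure.
print(count_below([1, 2, 3], 10))  # 3, which matches the spec

# A value equals limit: the error produces a fault that becomes a
# visible failure, since the spec expects 2 but we get 3.
print(count_below([1, 2, 3], 3))   # 3, but the spec says 2
```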

Slide 32: Testing and Debugging
- Defect testing and debugging are different processes
- V&V establishes the existence of defects in a program
- Debugging locates and repairs them

Slide 33: The Debugging Process (diagram slide)

Slide 34: Testing Goals
- Reveal failures/faults/errors
- Locate failures/faults/errors
- Show system correctness
- Improve confidence that the system performs as specified (verification)
- Improve confidence that the system performs as desired (validation)
Desired qualities: accurate, complete/thorough, repeatable, systematic.

Slide 35: Test Tasks
- Devise test cases: target specific areas of the system, create specific inputs, create expected outputs
- Choose test cases: not all need to be run all the time (regression testing)
- Run test cases: can be labor intensive
All in a systematic, repeatable, and accurate manner.
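The three tasks can be sketched as data plus a small runner (hypothetical Python names throughout; tagging cases is one simple way to support regression selection):

```python
# Each test case bundles a name, tags, specific inputs, and an
# expected output. Tags let a regression run select a subset.
test_cases = [
    {"name": "sort_empty", "tags": {"sort"}, "input": [], "expected": []},
    {"name": "sort_basic", "tags": {"sort"}, "input": [3, 1, 2], "expected": [1, 2, 3]},
    {"name": "sort_dupes", "tags": {"sort", "regression"}, "input": [2, 2, 1], "expected": [1, 2, 2]},
]

def run_suite(fn, cases, tag=None):
    """Run all cases, or only those carrying the given tag."""
    selected = [c for c in cases if tag is None or tag in c["tags"]]
    return {c["name"]: fn(c["input"]) == c["expected"] for c in selected}

print(run_suite(sorted, test_cases, tag="regression"))  # {'sort_dupes': True}
```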

Slide 36: Levels of Testing
- Unit/component testing: testing of a code unit (subprogram, class, method/function, small subsystem); often requires test drivers
- Integration testing: testing of interfaces between units; incremental or "big bang" approach?; often requires drivers and stubs
- System or acceptance testing: testing the complete system for satisfaction of the requirements; often performed by the user/customer
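A minimal sketch of a unit test using a driver and a stub (all names hypothetical): the stub stands in for a dependency the unit normally calls, so the unit can be exercised in isolation, and the driver supplies the predetermined inputs:

```python
def price_with_tax(item, rate_lookup):
    """Unit under test: price of an item including regional tax."""
    price, region = item
    return round(price * (1 + rate_lookup(region)), 2)

def stub_rate_lookup(region):
    """Stub: replaces the real tax-rate service with canned data."""
    return {"CA": 0.0725}.get(region, 0.0)

def test_driver():
    """Driver: exercises the unit with predetermined inputs."""
    assert price_with_tax((100.0, "CA"), stub_rate_lookup) == 107.25
    assert price_with_tax((100.0, "XX"), stub_rate_lookup) == 100.0
    return "all unit tests passed"

print(test_driver())
```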

Slide 37: What Is the Problem We Need to Address?
We want to verify software, so we need to test, so we need to decide on test cases. But no set of test cases guarantees the absence of bugs. So: what is a systematic approach to the selection of test cases that will lead to the accurate, acceptably thorough, and repeatable identification of errors, faults, and failures?

Slide 38: Two Approaches
- White box (or glass box) testing: structural testing. Test cases are designed, selected, and run based on the structure of the code. Scale: tests the nitty-gritty. Drawback: needs access to the source code.
- Black box testing: specification-based testing. Test cases are designed, selected, and run based on the specifications. Scale: tests the overall system behavior. Drawback: less systematic.
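A small hypothetical example of how the two approaches pick test cases differently for the same function: black-box cases come from the specification alone, white-box cases from reading the code so that every branch executes at least once:

```python
def classify(n):
    """Spec: return 'negative', 'zero', or 'positive'."""
    if n < 0:
        return "negative"
    elif n == 0:
        return "zero"
    return "positive"

# Black-box cases: chosen from the specification, without reading the code.
black_box = [(-5, "negative"), (0, "zero"), (7, "positive")]

# White-box cases: chosen by inspecting the code so each branch
# (n < 0, n == 0, fall-through) is covered.
white_box = [(-1, "negative"), (0, "zero"), (1, "positive")]

for cases in (black_box, white_box):
    assert all(classify(n) == expected for n, expected in cases)
print("both suites pass")
```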

Slide 39: Test Oracles
- Provide a mechanism for deciding whether a test case execution succeeds or fails
- Critical to testing; used in both white box and black box testing
- Difficult to automate: typically relies on humans and human intuition; formal specifications may help

Slide 40: Example
Your test shows cos(0.5) = 0.8775825619. You have to decide whether this answer is correct, so you need an oracle:
- Draw a triangle and measure the sides
- Look up the cosine of 0.5 in a book
- Compute the value using a Taylor series expansion
- Check the answer with your desk calculator
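The Taylor-series oracle from the list is easy to automate (a sketch; the observed value is the one from the slide). The oracle is an independent computation used only to judge the test outcome:

```python
import math

def cos_taylor(x, terms=10):
    """Oracle: cos(x) from its Taylor series, computed independently
    of the implementation under test."""
    return sum((-1)**k * x**(2*k) / math.factorial(2*k) for k in range(terms))

observed = 0.8775825619          # output of the implementation under test
oracle = cos_taylor(0.5)
print(abs(observed - oracle) < 1e-9)  # True: the test case succeeds
```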

Slide 41: Use the Principles
- Rigor and formality
- Separation of concerns: modularity, abstraction
- Anticipation of change
- Generality
- Incrementality

