Slide 1: Informatics 43: Introduction to Software Engineering, Lecture 8
Department of Informatics, UC Irvine
Software Design and Collaboration Laboratory (SDCL), sdcl.ics.uci.edu
Duplication of course material for any commercial purpose without the explicit written permission of the professor is prohibited.

Slide 2: Today's Lecture
Quality assurance
Testing
Structural testing
Specification-based testing

Slide 3: What Do These Have in Common?
Airbus 320
Audi 5000
Mariner 1 launch
AT&T telephone network
Ariane 5
Word 3.0 for Mac
Radiation therapy machine
NSA
Y2K

Slide 4: They All Failed!
Airbus 320
Audi 5000
Mariner 1 launch
AT&T telephone network
Ariane 5
Word 3.0 for Mac
Radiation therapy machine
NSA
Y2K

Slide 5: They All Failed!
Airbus 320
– http://catless.ncl.ac.uk/Risks/10.02.html#subj1.1
Audi 5000
– "Unintended" acceleration problem
– A figure of speech: "We're Audi 5000!"
Mariner 1 launch
– http://catless.ncl.ac.uk/Risks/5.73.html#subj2.1
AT&T telephone network
– Ripple effect, from switch to switch; network down/dark for 2-3 days
Ariane 5
– http://catless.ncl.ac.uk/Risks/18.24.html#subj2.1
Word 3.0 for Mac
– "Plagued with bugs"; later replaced for free by Word 3.0.1
Radiation therapy machine
– http://courses.cs.vt.edu/~cs3604/lib/Therac_25/Therac_5.html
NSA
– Spy computer crash; system down/dark for a couple of days
Y2K

Slide 6: Impact of Failures
Not just "out there"
– Space shuttle
– Mariner 1
– Ariane 5
– NSA
But also "at home"
– Your car
– Your call to your mom
– Your wireless network, social network, mobile app
– Your homework
– Your hospital visit
Peter Neumann's Risks Digest: http://catless.ncl.ac.uk/Risks

Slide 7: Verification and Validation
Verification
– Ensure software meets specifications
– Internal consistency
– "Are we building the product right?"
Validation
– Ensure software meets customer's intent
– External consistency
– "Are we building the right product?"

Slide 8: Software Qualities
Correctness, reliability, efficiency, integrity, usability, maintainability, testability, flexibility, portability, reusability, interoperability, performance, etc.

Slide 9: Quality Assurance
Assure that each of the software qualities is met
– Goals set in the requirements specification
– Goals realized in the implementation
Sometimes easy, sometimes difficult
– Portability versus safety
Sometimes immediate, sometimes delayed
– Understandability versus evolvability
Sometimes provable, sometimes doubtful
– Size versus correctness

Slide 10: An Idealized View of QA
Complete formal specification of the problem to be solved
→ Design, in formal notation
→ Code, in a verifiable language
→ Executable machine code
→ Execution on verified hardware
Each step is a correctness-preserving transformation

Slide 11: A Realistic View of QA
Mixture of formal and informal specifications
→ (manual transformation) Design, in mixed notation
→ (manual transformation) Code, in C++, Java, Ada, …
→ (compilation by commercial compiler) (Intel Pentium-based) machine code
→ Execution on commercial hardware, with commercial firmware

Slide 12: First Complication
(Diagram: real needs vs. "correct" specification vs. actual specification)
No matter how sophisticated the QA process, the problem of creating the initial specification remains

Slide 13: Second Complication
Complex data communications
– Electronic fund transfer
Distributed processing
– Web search engine
Stringent performance objectives
– Air traffic control system
Complex processing
– Medical diagnosis system
Sometimes the software system is extremely complicated, making it tremendously difficult to perform QA

Slide 14: Third Complication
It is difficult to divide the particular responsibilities involved in performing quality assurance among:
– Project management
– Development group
– Quality assurance group

Slide 15: Fourth Complication
Quality assurance lays out the rules
– You will check in your code every day
– You will comment your code
– You will…
Quality assurance also uncovers the faults
– Taps developers on their fingers
– Creates an image of "competition"
Quality assurance is viewed as cumbersome, "heavy"
– "Just let me code"
Quality assurance has a negative connotation

Slide 16: Available Techniques
Formal program verification
Static analysis of program properties
– Concurrent programs: deadlock, starvation, fairness
– Performance: min/max response time
Code reviews and inspections
Testing
Most techniques are geared towards verifying correctness

Slide 17: Reminder: Use the Principles
Rigor and formality
Separation of concerns
– Modularity
– Abstraction
Anticipation of change
Generality
Incrementality

Slide 18: Testing
Exercise a module, collection of modules, or system
– Use predetermined inputs ("test cases")
– Capture actual outputs
– Compare actual outputs to expected outputs
Actual outputs equal to expected outputs → test case succeeds
Actual outputs not equal to expected outputs → test case fails

Slide 19: V-Model of Development and Testing
(V-model diagram: each development activity on the left of the V has a matching test activity on the right)
– Develop Requirements (Requirements Review) ↔ Develop Acceptance Tests (Acceptance Test Review) ↔ Execute System Tests
– Design (Design Review) ↔ Develop Integration Tests (Integration Tests Review) ↔ Execute Integration Tests
– Code (Code Review) ↔ Develop Unit Tests (Unit Tests Review) ↔ Execute Unit Tests

Slide 20: Testing Terminology
Failure
– Incorrect or unexpected output
– Symptom of a fault
Fault
– Invalid execution state
– Symptom of an error
– May or may not produce a failure
Error
– Defect or anomaly in source code
– Commonly referred to as a "bug"
– May or may not produce a fault
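A hypothetical sketch (not from the slides) of the error/fault/failure chain, based on the homeworkAverage method that appears later in the lecture; the class and method names here are illustrative only.

    // Error -> fault -> failure, in one small example.
    public class TerminologyDemo {
        static float homeworkAverageBuggy(float[] scores) {
            float min = 0;                      // ERROR: defect in the source; should be a very large value
            float total = 0;
            for (int i = 0; i < scores.length; i++) {
                if (scores[i] < min)            // FAULT: with all-positive scores min is never updated,
                    min = scores[i];            // leaving an invalid execution state (min != true minimum)
                total += scores[i];
            }
            total = total - min;
            return total / (scores.length - 1); // FAILURE: {90, 95, 85} yields 135.0 instead of 92.5
        }

        public static void main(String[] args) {
            System.out.println(homeworkAverageBuggy(new float[] {90, 95, 85})); // 135.0: the error surfaces as a failure
            System.out.println(homeworkAverageBuggy(new float[] {90, 95, 0}));  // 92.5: same error, no failure at all
        }
    }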

Slide 21: Testing Goals
Reveal failures/faults/errors
Locate failures/faults/errors
Show system correctness
– Within the limits of optimistic inaccuracy
Improve confidence that the system performs as specified (verification)
Improve confidence that the system performs as desired (validation)
"Program testing can be used to show the presence of bugs, but never to show their absence" [Dijkstra]

Slide 22: Levels of Testing
Unit testing
– Testing of a single code unit
– Requires use of test drivers
Integration testing
– Testing of interfaces among integrated units
– Incremental or "big bang"
– Often requires test drivers and test stubs
Acceptance testing
– Testing of the complete system for satisfaction of the requirements

Slide 23: Test Tasks
Devise test cases
– Target specific areas of the system
– Create specific inputs
– Create expected outputs
Choose test cases
– Not all need to be run all the time (regression testing)
Run test cases
– Can be labor intensive
– Opportunity for automation
All in a systematic, repeatable, and accurate manner

Slide 24: Test Automation
Opportunities
– Test execution
– Scaffolding
Executing test cases
– Most repetitive, non-creative aspect of the test process
– Design once, execute many times
– Tool support available: JUnit for Java, xUnit in general (see the sketch below)
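As a sketch of what such automated execution can look like with JUnit 4: each test pairs a predetermined input with an expected output, and the framework runs and reports them. Grades.homeworkAverage is a hypothetical home for the method shown on the later slides, not a name from the lecture.

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    // Design once, execute many times: the tests run unattended on every build.
    public class GradesTest {

        @Test
        public void dropsTheSingleLowestScore() {
            assertEquals(92.5, Grades.homeworkAverage(new float[] {90, 95, 85}), 0.001);
        }

        @Test
        public void dropsOnlyOneOfSeveralMinima() {
            assertEquals(87.0, Grades.homeworkAverage(new float[] {87, 86, 86, 88}), 0.001);
        }
    }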

Slide 25: Scaffolding
Term borrowed from construction and civil engineering
Additional code to support development
– Usually not included or visible in the deployed/shipped code
– Not experienced by the end user
Test driver
– A function or program ("main") for driving a test
Test stub
– A replacement for the "real code" that is called by the program under test
Test harness
– A replacement for any (possibly many) other parts of the deployed system
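A rough sketch of a driver and a stub working together; GradeServer, StubGradeServer, and Grades are hypothetical names invented for illustration, not part of the lecture.

    // Hypothetical interface that the code under test depends on.
    interface GradeServer {
        float[] fetchScores(String studentId);
    }

    // Test stub: replaces the "real", network-backed GradeServer with canned data,
    // so the unit can be tested with no server at all.
    class StubGradeServer implements GradeServer {
        public float[] fetchScores(String studentId) {
            return new float[] {90, 95, 85};   // predetermined input
        }
    }

    // Test driver: a throwaway main() that exercises the unit and reports the result.
    public class AverageTestDriver {
        public static void main(String[] args) {
            GradeServer server = new StubGradeServer();
            float actual = Grades.homeworkAverage(server.fetchScores("s123"));
            System.out.println(actual == 92.5f ? "test passed" : "test FAILED, got " + actual);
        }
    }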

Slide 26: Test Oracles
Provide a mechanism for deciding whether a test case execution succeeds or fails
Critical to testing
– Used in white box testing
– Used in black box testing
Difficult to automate
– Typically relies on humans
– Typically relies on human intuition
– Formal specifications may help

Slide 27: Oracle Example: Cosine
Your test execution shows cos(0.5) = 0.87758256189
You have to decide whether this answer is correct
You need an oracle
– Draw a triangle and measure the sides
– Look up the cosine of 0.5 in a book
– Compute the value using a Taylor series expansion (see the sketch below)
– Check the answer with your desk calculator
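The Taylor-series oracle can itself be coded; a minimal sketch (not from the slides), which accepts the tested value only if an independent approximation agrees within a small tolerance.

    // Oracle sketch: cos(x) = 1 - x^2/2! + x^4/4! - ...
    public class CosineOracle {
        static double taylorCos(double x, int terms) {
            double sum = 0, term = 1;          // term starts at x^0 / 0! = 1
            for (int n = 0; n < terms; n++) {
                sum += term;
                term *= -x * x / ((2 * n + 1) * (2 * n + 2));  // next term of the series
            }
            return sum;
        }

        public static void main(String[] args) {
            double actual = 0.87758256189;     // the output under test: cos(0.5)
            double expected = taylorCos(0.5, 10);
            System.out.println(Math.abs(actual - expected) < 1e-9 ? "accept" : "reject");
        }
    }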

Slide 28: Two Approaches
White box testing
– Structural testing
– Test cases designed, selected, and run based on the structure of the code
– Scale: tests the nitty-gritty
– Drawback: needs access to the source
Black box testing
– Specification-based testing
– Test cases designed, selected, and run based on the specifications
– Scale: tests the overall system behavior
– Drawback: less systematic

Slide 29: Structural Testing
Use the source code to derive test cases
– Build a graph model of the system (control flow, data flow)
– State test cases in terms of graph coverage
Choose test cases that guarantee different types of coverage
– Node coverage
– Edge coverage
– Loop coverage
– Condition coverage
– Path coverage

Slide 30: Example: Building the program graph
1  Node getSecondElement() {
2    Node head = getHead();
3    if (head == null)
4      return null;
5    if (head.next == null)
6      return null;
7    return head.next.node;
8  }
(Control-flow graph: one node per numbered statement, nodes 2-7, with branches at lines 3 and 5)

Slide 31: Example: Averaging homework grades!
1   float homeworkAverage(float[] scores) {
2     float min = 99999;
3     float total = 0;
4     for (int i = 0; i < scores.length; i++) {
5       if (scores[i] < min)
6         min = scores[i];
7       total += scores[i];
8     }
9     total = total - min;
10    return total / (scores.length - 1);
11  }
(Control-flow graph: nodes 2-7, 9, 10, with a branch at line 5 and a loop back edge from line 7 to line 4)

Slide 32: Node Coverage
Select test cases such that every node in the graph is visited
– Also called statement coverage
Guarantees that every statement in the source code is executed at least once
Selects a minimal number of test cases
Test case: { 2 }

Slide 33: Edge Coverage
Select test cases such that every edge in the graph is visited
– Also called branch coverage
Guarantees that every branch in the source code is executed at least once
More thorough than node coverage
– More likely to reveal logical errors
Test case: { 1, 2 }
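Expressed against the homeworkAverage method of slide 31, the two coverage levels might look like the sketch below; the { 2 } and { 1, 2 } inputs are the ones named on the slides, while Grades is a hypothetical class name.

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;
    import static org.junit.Assert.assertTrue;

    public class CoverageTest {

        @Test
        public void nodeCoverage() {
            // Input { 2 } visits every node: the loop body, the true branch of the if,
            // and the final statements. With one score the method divides zero by zero,
            // so the expected output is itself a question for the oracle; here we only
            // record that the result is not a number.
            float result = Grades.homeworkAverage(new float[] {2});
            assertTrue(Float.isNaN(result));
        }

        @Test
        public void edgeCoverage() {
            // Input { 1, 2 } additionally takes the false branch of the if (2 is not
            // below the current minimum) and the loop's back edge, covering every edge.
            assertEquals(2.0, Grades.homeworkAverage(new float[] {1, 2}), 0.001);
        }
    }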

Slide 34: Other Coverage Criteria
Loop coverage
– Select test cases such that every loop boundary and interior is tested
  (boundary: 0 iterations; interior: 1 iteration and > 1 iterations)
– Watch out for nested loops
– Less precise than edge coverage
Condition coverage
– Select test cases such that all conditions are tested: if (a > b || c > d) … (see the sketch below)
– More precise than edge coverage
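A small sketch of condition coverage for the if (a > b || c > d) example: each atomic condition is made to evaluate both true and false across the test set, whereas branch coverage alone would be satisfied by just the first and third calls. The values are illustrative, not from the slides.

    public class ConditionCoverageDemo {
        static boolean check(int a, int b, int c, int d) {
            return a > b || c > d;
        }

        public static void main(String[] args) {
            System.out.println(check(5, 1, 0, 9));  // a > b true; c > d not evaluated (short-circuit) -> true
            System.out.println(check(0, 9, 5, 1));  // a > b false, c > d true  -> true
            System.out.println(check(0, 9, 0, 9));  // a > b false, c > d false -> false
        }
    }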

Slide 35: Other Coverage Criteria
Path coverage
– Select test cases such that every path in the graph is visited
– Loops are a problem: 0, 1, average, max iterations
Most thorough…
…but is it feasible?

Slide 36: Challenges
Structural testing can cover all nodes or edges without revealing obvious faults
– No matter what the input, the program always returns 0 (illustrated below)
Some nodes, edges, or loop combinations may be infeasible
– Unreachable/unexecutable code
"Thoroughness"
– A test suite that guarantees edge coverage also guarantees node coverage…
– …but it may not find as many faults as a different test suite that only guarantees node coverage
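A hypothetical illustration of the first point (not from the slides): two tests reach full node and edge coverage of a faulty function, yet both pass and the fault goes unnoticed.

    public class CoverageBlindSpot {
        static int abs(int x) {
            if (x > 0)
                return x;
            return x;          // fault: should be -x
        }

        public static void main(String[] args) {
            // These two tests execute every node and every edge of abs()'s graph,
            // so 100% node and edge coverage is reported while abs(-3) is still wrong.
            System.out.println(abs(5) == 5 ? "passed" : "FAILED");   // covers the true branch
            System.out.println(abs(0) == 0 ? "passed" : "FAILED");   // covers the false branch, and still passes
        }
    }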

Slide 37: More Challenges
Interactive programs
Listeners or event-driven programs
Concurrent programs
Exceptions
Self-modifying programs
Mobile code
Constructors/destructors
Garbage collection

Slide 38: Specification-Based Testing
Use specifications to derive test cases
– Requirements
– Design
– Function signature
Based on some kind of input domain
Choose test cases that guarantee a wide range of coverage
– Typical values
– Boundary values
– Special cases
– Invalid input values

Slide 39: "Some Kind of Input Domain"
Determine a basis for dividing the input domain into subdomains
– Subdomains may overlap
Possible bases
– Size
– Order
– Structure
– Correctness
– Your creative thinking
Select test cases from each subdomain
– One test case may suffice

Slide 40: Example
1   float homeworkAverage(float[] scores) {
2     float min = 99999;
3     float total = 0;
4     for (int i = 0; i < scores.length; i++) {
5       if (scores[i] < min)
6         min = scores[i];
7       total += scores[i];
8     }
9     total = total - min;
10    return total / (scores.length - 1);
11  }

Slide 41: Possible Bases
Input domain: float[]
Basis: array length, with subdomains:
– Empty array
– One element
– Two or three elements (small)
– Lots of elements (large)

Slide 42: Possible Bases
Input domain: float[]
Basis: position of the minimum score, with subdomains:
– Smallest element first
– Smallest element somewhere in the middle
– Smallest element last

Slide 43: Possible Bases
Input domain: float[]
Basis: number of minima, with subdomains:
– Unique minimum (1 minimum)
– A few minima (e.g., 2 minima)
– All data equal (all minima)

Slide 44: Testing Matrix
Columns of the matrix:
– Test case (input)
– Basis (subdomain)
– Expected output
– Notes

Slide 45: homeworkAverage 1 (basis: array length)
Test case (input)                                  | Subdomain | Expected output | Notes
()                                                 | Empty     | 0.0             | 99999!
(87.3)                                             | One       | 87.3            | crashes!
(90, 95, 85)                                       | Small     | 92.5            |
(80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91)   | Large     | 86.0            |
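Each row of such a matrix maps directly onto one automated test; a sketch for the "small" and "large" rows is below (Grades is again a hypothetical class name). The empty and one-element rows are the interesting ones to step through by hand, since the implementation misbehaves there, as the notes column records.

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class HomeworkAverageMatrixTest {

        @Test
        public void smallArray() {
            assertEquals(92.5, Grades.homeworkAverage(new float[] {90, 95, 85}), 0.001);
        }

        @Test
        public void largeArray() {
            assertEquals(86.0, Grades.homeworkAverage(
                new float[] {80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91}), 0.001);
        }
    }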

Slide 46: homeworkAverage 2 (basis: position of minimum)
Test case (input)     | Subdomain | Expected output | Notes
(80, 87, 88, 89)      | First     | 88.0            |
(87, 88, 80, 89)      | Middle    | 88.0            |
(99, 98, 0, 97, 96)   | Middle    | 97.5            |
(87, 88, 89, 80)      | Last      | 88.0            |

Slide 47: homeworkAverage 3 (basis: number of minima)
Test case (input)     | Subdomain | Expected output | Notes
(80, 87, 88, 89)      | One       | 88.0            |
(87, 86, 86, 88)      | Several   | 87.0            |
(99, 98, 0, 97, 0)    | Several   | 73.5            |
(88, 88, 88, 88)      | All       | 88.0            |

Slide 48: How to Avoid Problems of Structural Testing
Interactive programs
Listeners or event-driven programs
Concurrent programs
Exceptions
Self-modifying programs
Mobile code
Constructors/destructors
Garbage collection

