Slide 1: Informatics 43: Introduction to Software Engineering
Lecture 8-2, May 21, 2015
Emily Navarro
Department of Informatics, UC Irvine, Software Design and Collaboration Laboratory (SDCL), sdcl.ics.uci.edu
Duplication of course material for any commercial purpose without the explicit written permission of the professor is prohibited.

Slide 2: Today's Lecture
- Quality assurance
- Testing

Slide 4: What Do These Have in Common?
- Airbus 320
- Audi 5000
- Mariner 1 launch
- AT&T telephone network
- Ariane 5
- Word 3.0 for Mac
- Radiation therapy machine
- NSA
- Y2K

Slide 5: They All Failed!
- Airbus 320
- Audi 5000
- Mariner 1 launch
- AT&T telephone network
- Ariane 5
- Word 3.0 for Mac
- Radiation therapy machine
- NSA
- Y2K

Slide 6: They All Failed!
- Airbus 320: http://catless.ncl.ac.uk/Risks/10.02.html#subj1.1
- Audi 5000: "unintended" acceleration problem
- Mariner 1 launch: http://catless.ncl.ac.uk/Risks/5.73.html#subj2.1
- AT&T telephone network: ripple effect from switch to switch; network down/dark for 2-3 days
- Ariane 5: http://catless.ncl.ac.uk/Risks/18.24.html#subj2.1
- Word 3.0 for Mac: "plagued with bugs"; later replaced for free by Word 3.0.1
- Radiation therapy machine: http://courses.cs.vt.edu/~cs3604/lib/Therac_25/Therac_5.html
- Y2K

Slide 7: Y2K Facts
- Bug description
  - Date formats were MM/DD/YY, e.g., 01/01/98, 02/02/99, 03/03/00
  - 98 -> 1998, 99 -> 1999
  - But does 00 mean 2000 or 1900?
  - Does 1999 turn to 19100?
- Effects: relatively minor
- Cost: $300 billion!
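
A minimal sketch (not from the slides) of how the two failure modes on this slide arise, assuming a program that stores years either as two digits or as an offset from 1900 (as C's tm_year and Java's deprecated Date.getYear() do):

    public class Y2kDemo {
        public static void main(String[] args) {
            // Two-digit years make the century ambiguous.
            int twoDigitYear = 0;                      // "00": 1900 or 2000?
            System.out.println(1900 + twoDigitYear);   // prints 1900, even if 2000 was meant

            // Years stored as an offset from 1900, displayed by naive
            // string concatenation, turn 2000 into "19100".
            int yearsSince1900 = 100;                  // the year 2000
            System.out.println("19" + yearsSince1900); // prints 19100
        }
    }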

Slide 8: Impact of Failures
Not just "out there":
- Space shuttle
- Mariner 1
- Ariane 5
But also "at home":
- Your car
- Your call to your mom
- Your wireless network, social network, mobile app
- Your homework
- Your hospital visit
Peter Neumann's Risks Digest: http://catless.ncl.ac.uk/Risks

Slide 10: Verification and Validation
- Verification: "Are we building the product right?"
  - Ensure software meets specifications
  - Internal consistency
  - "Implement the idea properly"
  - e.g., testing, inspections, program analysis
- Validation: "Are we building the right product?"
  - Ensure software meets the customer's intent
  - External consistency
  - "Implement the proper idea"
  - e.g., usability testing, user feedback

Slide 11: Software Qualities
Correctness, reliability, efficiency, integrity, usability, maintainability, testability, flexibility, portability, reusability, interoperability, performance, etc.

Slide 12: Quality Assurance
- All activities designed to measure and improve quality in a product
- Assure that each of the software qualities is met
  - Goals set in the requirements specification
  - Goals realized in the implementation
- Sometimes easy, sometimes difficult (portability versus safety)
- Sometimes immediate, sometimes delayed (understandability versus evolvability)
- Sometimes provable, sometimes doubtful (size versus correctness)

Slide 13: An Idealized View of QA
Complete formal specification of the problem to be solved
-> Design, in formal notation
-> Code, in a verifiable language
-> Executable machine code
-> Execution on verified hardware
Each step is a correctness-preserving transformation.

Slide 14: A Realistic View of QA
Mixture of formal and informal specifications
-> (manual transformation) Design, in mixed notation
-> (manual transformation) Code, in C++, Java, Ada, ...
-> (compilation by commercial compiler) (Intel Pentium-based) machine code
-> Execution on commercial hardware, with commercial firmware

Slide 15: First Complication
[Diagram: the gap between real needs, the "correct" specification, and the actual specification]
No matter how sophisticated the QA process, the problem of creating the initial specification remains.

Slide 16: Second Complication
There are often multiple, sometimes conflicting, qualities to be tested, making QA a challenge.

Slide 17: Third Complication
Sometimes the software system is extremely complicated, making it tremendously difficult to perform QA:
- Complex data communications (electronic fund transfer)
- Distributed processing (web search engine)
- Stringent performance objectives (air traffic control system)
- Complex processing (medical diagnosis system)

Slide 18: Fourth Complication
It is difficult to divide the particular responsibilities involved in performing quality assurance among project management, the development group, and the quality assurance group.

Slide 19: Fourth Complication
- Quality assurance lays out the rules
  - "You will check in your code every day"
  - "You will comment your code"
  - "You will..."
- Quality assurance also uncovers the faults
  - Taps developers on their fingers
  - Creates an image of "competition"
- Quality assurance is viewed as cumbersome, "heavy"
  - "Just let me code"
- Quality assurance has a negative connotation

Slide 20: Available Techniques
- Formal program verification
- Static analysis of program properties
  - Concurrent programs: deadlock, starvation, fairness
  - Performance: min/max response time
- Code reviews and inspections
- Testing
Most techniques are geared toward verifying correctness.

Slide 21: Which Technique to Use?
- There is no "silver bullet" of testing; a mixture of techniques is needed
- Different approaches are needed for different faults (e.g., testing for race conditions vs. performance issues)
- Different approaches are needed at different times (e.g., unit testing vs. usability testing vs. system testing)

Slide 22: How Do We Know When We Are Done?
- We can never find all faults, but we cannot test forever!
- We aim to reveal as many faults as possible in a given period of time
  - More faults found and fixed = good
  - More bugs found = more bugs not found
- Aim to meet the quality requirements established for the project

Slide 23: How Do We Know When We Are Done?
[Graph: number of problems found per hour over time, Day 1 through Day 5]

Slide 24: How Do We Know When We Are Done?
[Same graph as Slide 23]
We could stop testing when the problem find rate stabilizes near zero.

Slide 25: How Do We Know When We Are Done?
[Graph: confidence in the module being tested rises with the number of test cases with correct outputs, approaching but never reaching 100%. Sweet spot?]

Slide 26: How Do We Know When We Are Done?
We can pepper the code with defects and observe how many of the seeded defects are discovered.
Scenario:
- The program is seeded with 10 defects
- After some test cases are executed:
  - 7 of the seeded defects are found
  - 45 nonseeded defects are found
- Since 70% of the seeded defects are found (and 30% are not), assume that the nonseeded defects follow the same pattern
- 45 is 70% of about 64, so there are about 19 (64 - 45) defects left to be found
This technique assumes that nonseeded defects are similar to the seeded ones.
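
The slide's arithmetic, as a minimal sketch (not from the slides) using the scenario's numbers:

    public class SeedingEstimate {
        public static void main(String[] args) {
            int seeded = 10;          // defects deliberately planted
            int seededFound = 7;      // seeded defects discovered by testing
            int nonSeededFound = 45;  // real (nonseeded) defects discovered

            // Assume real defects are found at the same rate as seeded ones.
            double findRate = (double) seededFound / seeded;             // 0.7
            long estimatedTotal = Math.round(nonSeededFound / findRate); // ~64
            long estimatedRemaining = estimatedTotal - nonSeededFound;   // ~19

            System.out.println("Estimated total nonseeded defects: " + estimatedTotal);
            System.out.println("Estimated defects left to find: " + estimatedRemaining);
        }
    }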

Slide 27: Reminder: Use the Principles
- Rigor and formality
- Separation of concerns
  - Modularity
  - Abstraction
- Anticipation of change
- Generality
- Incrementality

Slide 28: Today's Lecture
- Quality assurance
- Testing

Slide 29: Testing
Using a set of techniques to detect and correct errors in a software product.
Exercise a module, collection of modules, or system:
- Use predetermined inputs (a "test case")
- Capture actual outputs
- Compare actual outputs to expected outputs
  - Actual outputs equal to expected outputs -> test case succeeds
  - Actual outputs not equal to expected outputs -> test case fails

Slide 30: Testing Process Model
1. Decide what to test.
2. Select a test case input.
3. Determine the expected output E.
4. Run the system with the test case input.
5. Capture the actual output A.
6. Compare E and A. Different? Inform the programmer.
7. Loop back to 1 or 2, if time permits.
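
A minimal sketch (not from the slides) of steps 2 through 6 for a single test case; the systemUnderTest function is a hypothetical stand-in:

    public class TestLoop {
        // Hypothetical system under test: should return the absolute value.
        static int systemUnderTest(int input) {
            return input < 0 ? -input : input;
        }

        public static void main(String[] args) {
            int input = -5;                       // step 2: select a test case input
            int expected = 5;                     // step 3: determine the expected output E
            int actual = systemUnderTest(input);  // steps 4-5: run and capture the actual output A
            if (expected != actual) {             // step 6: compare E and A
                System.out.println("FAIL: expected " + expected + ", got " + actual);
            } else {
                System.out.println("PASS");
            }
        }
    }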

Slide 31: V-Model of Development and Testing
- Develop Requirements (Requirements Review), and develop Acceptance Tests (Acceptance Test Review) -> Execute System Tests
- Design (Design Review), and develop Integration Tests (Integration Tests Review) -> Execute Integration Tests
- Code (Code Review), and develop Unit Tests (Unit Tests Review) -> Execute Unit Tests

Slide 32: Spiral Model
[Diagram: spiral model; each cycle starts with risk analysis, then proceeds through rapid prototype, specification, design, and implementation, with verification at each step]

Slide 33: The RUP Model
[Diagram: phases (Inception, Elaboration, Construction, Transition) crossed with process workflows (Business Modeling, Requirements, Analysis & Design, Implementation, Test, Deployment) and supporting workflows (Configuration Mgmt, Management, Environment), over Preliminary Iteration(s) and Iterations #1 through #m+1]

Slide 34: Extreme Programming
[Diagram not captured in the transcript]

Slide 36: Test-Driven Development
[Diagram not captured in the transcript]
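
The slide's diagram is not in the transcript. As a stand-in, a minimal sketch of the test-first idea using JUnit 4 (JUnit is mentioned later in the deck); the Stack class is hypothetical and does not exist yet when the test is written:

    import static org.junit.Assert.assertEquals;
    import org.junit.Test;

    // Red: write this test first and watch it fail (Stack does not exist yet).
    // Green: write just enough Stack code to make it pass.
    // Refactor: clean up the code while the test stays green.
    public class StackTest {
        @Test
        public void pushThenPopReturnsTheSameElement() {
            Stack<Integer> stack = new Stack<>(); // hypothetical class under development
            stack.push(42);
            assertEquals(Integer.valueOf(42), stack.pop());
        }
    }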

Slide 37: Testing Terminology
- Error: a human action that produces an incorrect result; may or may not produce a fault
- Fault: a condition that may (or may not) cause a failure; caused by an error; commonly referred to as a "bug"
- Failure: the inability of a system to perform a function according to its specifications; the result of a fault

Slide 38: Error, Fault, Failure
Error (in the programmer's mind) -> Fault or defect (in the code) -> Failure (in execution or output)
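
A minimal illustration (not from the slides) of the chain: the programmer's error leaves a fault in the code, and the fault produces a failure only on some inputs:

    public class OffByOne {
        // Fault: the programmer's error ("<" should be "<=") lives in the code.
        static int sum1ToN(int n) {
            int sum = 0;
            for (int i = 1; i < n; i++) { // fault: the loop stops one short
                sum += i;
            }
            return sum;
        }

        public static void main(String[] args) {
            System.out.println(sum1ToN(1)); // prints 0, expected 1: the fault causes a failure
            System.out.println(sum1ToN(0)); // prints 0, expected 0: same fault, no failure observed
        }
    }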

Slide 42: Testing Goals
- Detect failures/faults/errors
- Locate failures/faults/errors
- Fix failures/faults/errors
- Show system correctness, within the limits of optimistic inaccuracy
- Improve confidence that the system performs as specified (verification)
- Improve confidence that the system performs as desired (validation)
"Program testing can be used to show the presence of bugs, but never to show their absence." [Dijkstra]

Slide 43: Testing Process Quality Goals
Accurate, complete, thorough, repeatable, and systematic.

Slide 44: Test Planning
- Set a quality goal for the project
- Choose test methodologies and techniques
- Assign resources
- Bring in tools
- Set a schedule

Slide 45: Who Does the Testing?
- Programmers: unit testing
- Testers: non-programmers
- Users: acceptance testing, alpha testing, beta testing

Slide 46: Levels of Testing
- Unit testing: testing of a single code unit; requires the use of test drivers
- Functional/integration testing: testing of interfaces among integrated units
  - Incremental or "big bang"
  - Often requires test drivers and test stubs
- System/acceptance testing: testing of the complete system for satisfaction of requirements

Slide 47: Levels of Testing (Example)
[Class diagram: a ZotMyHealth app with LoginManager, SettingsManager, CalorieTracker, WorkoutTracker, and SleepTracker; CalorieTracker holds a MealList (List mealList, App connectedApp) with void addMeal(...) and void deleteMeal(...); Meal has String mealName, int numCalories, and void setNumCalories(...)]
- Unit testing: Meal.setNumCalories(...)
- Functional/integration testing: CalorieTracker.addMeal(Meal m)
- System/acceptance testing: add a meal, delete a workout, login, logout

Slide 48: Test Tasks
Test case: a group of input values that cause a program to take some defined action, with an expected output.
- Devise test cases
  - Target specific areas of the system
  - Create specific inputs
  - Create expected outputs
- Choose test cases
  - Not all need to be run all the time (regression testing)
- Run test cases
  - Can be labor intensive; an opportunity for automation
All in a systematic, repeatable, and accurate manner.

Slide 49: How to Choose Test Cases (I)
There is usually an infinite number of possible test cases for a given function. There are too many input-output pairs to exhaustively verify, so we must take a small sample.
Example: multiplier
- Input: two integers
- Output: one integer

    int multiplier(int a, int b) {
        return a * b;
    }

Slide 50: How to Choose Test Cases (II)
Practically, testing can select only a very small set of inputs; our goal should be to choose the best ones.
What are the best five test cases for a multiplier? (That is: which five test cases, if they reveal no bugs, will give you the most confidence that the multiplier is working correctly?)
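
One reasonable selection (not an official answer from the slides), written as a JUnit test; the Multiplier class is a hypothetical wrapper around the function above. The idea is to cover zero, the identity, both sign combinations, and typical values:

    import static org.junit.Assert.assertEquals;
    import org.junit.Test;

    public class MultiplierTest {
        @Test
        public void fiveRepresentativeCases() {
            assertEquals(0, Multiplier.multiplier(0, 7));         // zero
            assertEquals(7, Multiplier.multiplier(1, 7));         // identity
            assertEquals(-21, Multiplier.multiplier(-3, 7));      // one negative operand
            assertEquals(21, Multiplier.multiplier(-3, -7));      // two negative operands
            assertEquals(56088, Multiplier.multiplier(123, 456)); // typical values
        }
    }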

Slide 51: How to Choose Test Cases (III)
- Intuition
- Specification (black-box testing)
  - Equivalence class partitioning
  - Boundary-value analysis
- Code (white-box testing)
  - Path analysis
- Existing test cases (regression testing)
- Faults
  - Error guessing
  - Error-prone analysis

Slide 52: Test Automation
- Opportunities: test execution, scaffolding
- Executing test cases
  - The most repetitive, non-creative aspect of the test process
  - Design once, execute many times
  - Tool support available (JUnit for Java, xUnit in general)
- White-box testing can be partially automated
- Black-box testing requires "formal" specifications to automate

Slide 53: Scaffolding
- Term borrowed from construction and civil engineering
- Additional code to support development
  - Usually not included or visible in the deployed/shipped code
  - Not experienced by the end user
- Test driver: a function or program ("main") for driving a test
- Test stub: a replacement for the "real code" that is called by the program
(See the sketch after the next slide.)

Slide 54: Test Drivers/Stubs
[Diagram not captured in the transcript]
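
The slide's figures are not in the transcript. As a stand-in, a minimal sketch of both pieces of scaffolding, assuming a hypothetical BillingService under test that normally calls a real TaxRateService:

    // Unit under test: depends on a collaborator we replace during testing.
    interface TaxRateService {
        double rateFor(String state);
    }

    class BillingService {
        private final TaxRateService taxRates;
        BillingService(TaxRateService taxRates) { this.taxRates = taxRates; }
        double totalWithTax(double subtotal, String state) {
            return subtotal * (1 + taxRates.rateFor(state));
        }
    }

    public class BillingDriver {
        public static void main(String[] args) {
            // Test stub: stands in for the real tax service with a canned answer.
            TaxRateService stub = state -> 0.10;

            // Test driver: a "main" that exercises the unit and checks the output.
            BillingService billing = new BillingService(stub);
            double actual = billing.totalWithTax(100.0, "CA");
            double expected = 110.0;
            System.out.println(Math.abs(actual - expected) < 1e-9 ? "PASS" : "FAIL: " + actual);
        }
    }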

Slide 56: Test Oracles
- Provide a mechanism for deciding whether a test case execution succeeds or fails
- Critical to testing; used in both white-box and black-box testing
- Difficult to automate
  - Typically relies on humans and human intuition
  - Formal specifications may help

Slide 57: Oracle Example: Cosine
Your test execution shows cos(0.5) = 0.87758256189. You have to decide whether this answer is correct; you need an oracle.
- Draw a triangle and measure the sides
- Look up the cosine of 0.5 in a book
- Compute the value using a Taylor series expansion
- Check the answer with your desk calculator
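
A minimal sketch (not from the slides) of the Taylor-series oracle: compute an independent approximation of cos(x) and accept the tested value only if the two agree within a tolerance.

    public class CosineOracle {
        // Independent oracle: cos(x) = sum over k of (-1)^k x^(2k) / (2k)!
        static double cosTaylor(double x, int terms) {
            double term = 1.0, sum = 1.0;
            for (int k = 1; k < terms; k++) {
                term *= -x * x / ((2 * k - 1) * (2 * k)); // next Taylor term from the previous one
                sum += term;
            }
            return sum;
        }

        public static void main(String[] args) {
            double tested = 0.87758256189;      // value produced by the code under test
            double oracle = cosTaylor(0.5, 10); // independent approximation
            System.out.println(Math.abs(tested - oracle) < 1e-9 ? "accept" : "reject");
        }
    }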

Slide 58: Two Approaches
- Black-box testing (specification-based testing)
  - Test cases designed, selected, and run based on the specifications
  - Scale: tests the higher-level system behavior
  - Drawback: less systematic
- White-box testing (structural testing)
  - Test cases designed, selected, and run based on the structure of the code
  - Scale: tests the nitty-gritty
  - Drawback: needs access to the source

Slide 59: Reminder
Discussion tomorrow; bring a laptop or tablet.

Slide 60: Next Time
Black-box (specification-based) testing.

