1 Software Engineering Lecture 14: Testing Techniques and Strategies

2 Today’s Topics
- Chapters 17 & 18 in SEPA 5/e
- Testing Principles & Testability
- Test Characteristics
- Black-Box vs. White-Box Testing
- Flow Graphs & Basis Path Testing
- Testing & Integration Strategies

3 Software Testing
- Opportunities for human error: specifications, design, coding, communication
- “Testing is the ultimate review”
- Can take 30-40% of total effort
- For critical applications, can be 3 to 5 times all other efforts combined!

4 Testing Objectives
- Execute a program with the intent of finding errors
- Good tests have a high probability of discovering errors
- Successful tests uncover errors
- ‘No errors found’: not a good test!
- Verifying functionality is a secondary goal

5 Testing Principles
- Tests should be traceable to requirements
- Tests should be planned before testing begins
- Pareto principle: the majority of errors can be traced to a minority of components
- Component testing first, then integrated testing
- Exhaustive testing is not possible
- Independent tests are more effective

6 Software Testability
Characteristics that lead to testable software:
- Operability
- Observability
- Controllability
- Decomposability
- Simplicity
- Stability
- Understandability

7 Operability
- System has few bugs
- No bugs block execution of tests
- Product evolves in functional stages
The better it works, the more efficiently it can be tested

8 Observability
- Distinct output for each input
- States & variables may be queried
- Past states are logged
- Factors affecting output are visible
- Incorrect output is easily identified
- Internal errors are reported
- Source code is accessible
What you see is what you test

9 Controllability
- All possible outputs can be generated by some input
- All code is executable by some input
- States & variables can be directly controlled
- Input/output is consistent and structured
- Tests can be specified, automated, and reproduced
The better we can control the software, the more the testing can be automated

10 Decomposability
- Independent modules
- Modules can be tested separately
By controlling the scope of testing, we can more quickly isolate problems and perform smarter retesting

11 Simplicity
- Minimum feature set
- Minimal architecture
- Code simplicity
The less there is to test, the more quickly we can test it

12 Stability
- Changes made to the system:
  - are infrequent
  - are controlled
  - don’t invalidate existing tests
- Software recovers from failure
The fewer the changes, the fewer the disruptions to testing

13 Understandability
- Design is well understood
- Dependencies are well understood
- Design changes are communicated
- Documentation is:
  - accessible
  - well-organized
  - specific, detailed, and accurate
The more information we have, the smarter we will test

14 Test Characteristics
- A good test has a high probability of finding an error
- A good test is not redundant
- A good test should be “best of breed”
- A good test is neither too simple nor too complex

15 Test Case Design
- ‘Black Box’ Testing: consider only inputs and outputs
- ‘White Box’ or ‘Glass Box’ Testing: also consider internal logic paths, program states, intermediate data structures, etc.

16 White-Box Testing
- Guarantee that all independent paths have been tested
- Exercise all conditions for both ‘true’ and ‘false’
- Execute all loops at their boundary conditions
- Exercise internal data structures
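As an illustration, consider white-box tests for a small hypothetical function: each condition is exercised for both ‘true’ and ‘false’, and the range boundaries are hit explicitly.

```python
# Hypothetical function used to illustrate white-box testing.
def clamp(value, low, high):
    """Return value limited to the range [low, high]."""
    if value < low:    # condition 1
        return low
    if value > high:   # condition 2
        return high
    return value       # both conditions false

# White-box tests: every branch taken, boundaries exercised.
assert clamp(-1, 0, 10) == 0    # condition 1 true
assert clamp(11, 0, 10) == 10   # condition 2 true
assert clamp(5, 0, 10) == 5     # both conditions false
assert clamp(0, 0, 10) == 0     # lower boundary
assert clamp(10, 0, 10) == 10   # upper boundary
```

Note that a black-box test of `clamp` might check only typical values; the white-box view forces the below-range and above-range branches to be executed as well.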

17 Why White-Box Testing?
- More errors lurk in ‘special case’ code that is infrequently executed
- Control flow can’t be predicted accurately in black-box testing
- Typographical errors can happen anywhere!

18 Basis Path Testing
- White-box method [McCabe ‘76]
- Analyze the procedural design
- Define a basis set of execution paths
- Test cases for the basis set execute every program statement at least once

19 Basis Path Testing [2] Flow Graph: Representation of Structured Programming Constructs [From SEPA 5/e]

20 Cyclomatic Complexity
V(G) = E - N + 2 = 4
Independent paths:
1: 1, 11
2: 1, 2, 3, 4, 5, 10, 1, 11
3: 1, 2, 3, 6, 8, 9, 10, 1, 11
4: 1, 2, 3, 6, 7, 9, 10, 1, 11
V(G): upper bound on the number of tests needed to ensure all code has been executed [From SEPA 5/e]
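The slide’s numbers can be checked directly. The sketch below reconstructs the flow graph’s edge list from the four independent paths listed above (the edge list is an assumption inferred from those paths, not given on the slide) and computes V(G) = E - N + 2:

```python
# Edges reconstructed from the independent paths on the slide
# (node numbers as in the SEPA 5/e flow-graph example).
edges = [
    (1, 2), (2, 3), (3, 4), (3, 6), (4, 5), (5, 10),
    (6, 7), (6, 8), (7, 9), (8, 9), (9, 10), (10, 1), (1, 11),
]
nodes = {n for edge in edges for n in edge}

# Cyclomatic complexity: V(G) = E - N + 2
v_of_g = len(edges) - len(nodes) + 2
print(v_of_g)  # 4, matching the slide's four independent paths
```

With 13 edges and 11 nodes, V(G) = 13 - 11 + 2 = 4, so four basis paths suffice to cover every statement.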

21 Black Box Testing
- Focus on functional requirements
- Incorrect / missing functions
- Interface errors
- Errors in external data access
- Performance errors
- Initialization and termination errors

22 Black Box Testing [2]
- How is functional validity tested?
- What classes of input will make good test cases?
- Is the system sensitive to certain inputs?
- How are data boundaries isolated?

23 Black Box Testing [3]
- What data rates and volume can the system tolerate?
- What effect will specific combinations of data have on system operation?
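One standard answer to the “classes of input” and “data boundaries” questions above is equivalence partitioning with boundary value analysis. A minimal sketch, using an invented validator:

```python
# Hypothetical validator used to illustrate equivalence partitioning:
# accepts ages in the range 18..65 inclusive.
def is_eligible(age):
    return 18 <= age <= 65

# One test per equivalence class, plus values at and just outside
# each boundary.
assert not is_eligible(17)   # invalid class: below range (boundary - 1)
assert is_eligible(18)       # valid class: lower boundary
assert is_eligible(40)       # valid class: interior value
assert is_eligible(65)       # valid class: upper boundary
assert not is_eligible(66)   # invalid class: above range (boundary + 1)
```

Five cases cover three equivalence classes and both boundaries, where exhaustive testing of every age would be wasteful.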

24 Comparison Testing
- Compare software versions
- “Regression testing”: finding the outputs that changed
- Improvements vs. degradations
- Net effect depends on frequency and impact of degradations
- When the error rate is low, a large corpus can be used
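A comparison run can be sketched as follows; the two version functions are invented stand-ins for two builds of the same software, with a deliberately introduced degradation:

```python
# Sketch of comparison (regression) testing: run the old and new
# versions over the same input corpus and report every input whose
# output changed between versions.
def old_version(x):
    return x * 2

def new_version(x):
    return x + x if x >= 0 else 0   # hypothetical behavior change

corpus = range(-3, 4)
changed = [x for x in corpus if old_version(x) != new_version(x)]
print(changed)  # [-3, -2, -1]: the negative inputs regressed
```

Each changed output must then be triaged by hand as an improvement or a degradation; the comparison itself only flags the differences.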

25 Generic Testing Strategies
- Testing starts at the module level and moves “outward”
- Different testing techniques are used at different times
- Testing by developer(s) and independent testers
- Testing and debugging are separate activities

26 Verification and Validation
- Verification: “Are we building the product right?”
- Validation: “Are we building the right product?”
- Achieved by life-cycle SQA activities, assessed by testing
- “You can’t create quality by testing”

27 Organization of Testing [From SEPA 5/e]

28 How Much Test Time is Necessary?
Logarithmic Poisson execution-time model: with a sufficient fit, the model predicts the testing time required to reach an acceptable failure rate [From SEPA 5/e]

29 Unit Testing [From SEPA 5/e]

30 Top-Down Integration
PRO: higher-level (logic) modules tested early
CON: lower-level (reusable) modules tested late
[From SEPA 5/e]

31 Bottom-Up Integration
PRO: lower-level (reusable) modules tested early
CON: higher-level (logic) modules tested late
[From SEPA 5/e]
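Both strategies rely on throwaway scaffolding: top-down integration substitutes stubs for unfinished lower-level modules, while bottom-up integration uses drivers to exercise finished lower-level modules. A minimal sketch, with all names invented:

```python
# Top-down: the high-level monitor logic is tested before the real
# sensor module exists, using a stub that returns a canned reading.
def sensor_stub():
    return 42  # canned value standing in for real hardware input

def monitor(read_sensor):
    """High-level module under test; its dependency is injected."""
    return "ALARM" if read_sensor() > 40 else "OK"

# Bottom-up: a throwaway driver feeds test inputs to a finished
# low-level module directly.
def checksum(data):
    """Low-level module under test."""
    return sum(data) % 256

def checksum_driver():
    assert checksum([1, 2, 3]) == 6
    assert checksum([255, 1]) == 0   # wraps at 256

assert monitor(sensor_stub) == "ALARM"
checksum_driver()
```

The stub and the driver are both discarded once the real neighboring modules are integrated, which is part of the cost trade-off between the two strategies.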

32 Hybrid Approaches
- Sandwich Integration: combination of top-down and bottom-up
- Critical Modules:
  - address several requirements
  - high level of control
  - complex or error prone
  - definite performance requirements
- Test critical modules ASAP!

33 Questions?

34 Software Engineering for Information Technology Lecture 12: System Design

35 Today’s Topics
- Design Elements
- Principles for Quality Design
- Modularity & Partitioning
- Effective Modular Design
- Architectural Styles
- Mapping Models to Modules

36 Design Elements
- Data Design: data structures for data objects
- Architectural Design: modular structure of the software
- Interface Design: internal / external communication
- Component-Level Design: procedural description of modules

37 Design Elements Linked to Analysis Models (increasing detail) [From SEPA 5/e]

38 Evaluating a Design
- A design must implement:
  - explicit requirements (from the analysis model)
  - the customer’s implicit requirements
- A design must be readable and understandable by coders & testers
- A good design provides a complete view of data, function, and behavior

39 Design Principles [Davis ‘95]
- Consider more than one design alternative
- Design should be traceable to the analysis model
- Use design patterns
- Design structure should reflect the structure of the problem domain
- Consistent style, well-defined interfaces

40 Design Principles [2]
- Structured to accommodate change (easy to modify & update)
- Structured to degrade gently
- “Design is not coding, coding is not design”
- Assess quality during creation
- Review the design for semantic errors

41 Design Process Goals
- A hierarchical organization that makes use of the control characteristics of the software
- A modular design that logically partitions the software into functional elements
- Useful abstractions for both data and procedures

42 Design Goals [2]
- Modules should be functionally independent
- Module interfaces should have minimal complexity
- Explicit linking of design elements to requirements analysis models

43 Modularity and Software Cost [From SEPA 5/e]

44 Modular Design [Meyer ‘88]
- Decomposability: effective decomposition reduces complexity
- Composability: enables reuse of existing design elements
- Understandability: modules that can be understood in isolation are easier to build and change

45 Modular Design [2]
- Continuity: changes to requirements should trigger localized changes to specific modules
- Protection: error conditions should be considered on a per-module basis

46 Architectural Terminology [From SEPA 5/e]

47 Partitioning
- Horizontal: branches for each major function
- Vertical: control & execution are top-down
- Increased horizontal partitioning means an increased number of interfaces
- Vertically partitioned structures are more resilient to change

48 [From SEPA 5/e] Partitioning Examples

49 Procedural Layering [From SEPA 5/e]

50 Effective Modular Design
- Functional independence:
  - maximize cohesion of modules
  - minimize coupling between modules
  - promote robustness in the design
- Cohesion: one task per procedure is optimal
- Coupling: minimize module interconnection
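The coupling terms can be made concrete. The sketch below (all names invented) contrasts common coupling, where modules communicate through a shared global, with the preferred data coupling, where everything a module needs arrives through its parameter list:

```python
# Common coupling: the module depends on shared global state, so its
# behavior can change without any change to its call site.
tax_rate = 0.2

def total_with_global(price):
    return price * (1 + tax_rate)   # hidden dependency on module state

# Data coupling: the interface is explicit and minimal; the module can
# be understood, tested, and reused in isolation.
def total_with_param(price, rate):
    return price * (1 + rate)

# Same result, but only the second version makes its inputs visible.
assert total_with_param(100, 0.2) == total_with_global(100)
```

Reducing coupling this way is also what makes modules independently testable, tying this slide back to the decomposability criterion from the testing lecture.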

51 Types of Coupling [From SEPA 5/e]

52 Design Heuristics
- Reduce coupling (implode modules)
- Improve cohesion (explode modules)
- Minimize fan-out & strive for fan-in
- Keep the scope of effect within the scope of control
- Reduce interface complexity
- Predictable “black box” modules
- Controlled entry (no GOTOs!)

53 Program Structures [From SEPA 5/e]

54 Architectural Styles
- Data-Centered
- Data-Flow
- Call-and-Return:
  - main program / subprogram
  - remote procedure call
- Layered

55 Data-Centered Architecture [From SEPA 5/e]

56 Data Flow Architectures [From SEPA 5/e]

57 Layered Architecture [From SEPA 5/e]

58 Mapping Models to Modules
- Goal: map DFDs to a modular architecture
- Transform Mapping: data flow is modeled as a series of functions with input / output
- Transaction Mapping: data flow is modeled as a chain of events (transactions)
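As a small illustration of transform mapping, the sketch below factors a data flow into an incoming-flow module, a central transform, and an outgoing-flow module (all names invented; a real mapping would start from the DFD’s flow boundaries):

```python
# Incoming flow: acquire and convert raw data into internal form.
def read_input(raw):
    return [int(tok) for tok in raw.split()]

# Transform center: the core computation of the data flow.
def transform(values):
    return sum(values) / len(values)

# Outgoing flow: convert the result into its external form.
def write_output(result):
    return f"mean = {result:.1f}"

# The module structure mirrors the flow: input -> transform -> output.
print(write_output(transform(read_input("2 4 6"))))  # mean = 4.0
```

The factoring puts each region of the data flow in its own module, so the program structure can be read straight off the flow boundaries.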

59 Level 0 DFD for SafeHome [From SEPA 5/e]

60 Level 1 DFD for SafeHome [From SEPA 5/e]

61 Level 2 DFD for SafeHome Refines “monitor sensors” process [From SEPA 5/e]

62 Level 3 DFD for SafeHome Refines “monitor sensors” process, with flow boundaries [From SEPA 5/e]

63 First-Level Factoring
Flow boundaries are used to determine program structure and modules; additional factoring introduces more detail [From SEPA 5/e]

64 Questions?

