Software Engineering Lecture 14: Testing Techniques and Strategies

Today's Topics
- Chapters 17 & 18 in SEPA 5/e
- Testing Principles & Testability
- Test Characteristics
- Black-Box vs. White-Box Testing
- Flow Graphs & Basis Path Testing
- Testing & Integration Strategies

Software Testing
- Opportunities for human error: specifications, design, coding, communication
- "Testing is the ultimate review"
- Can take 30-40% of total effort
- For critical applications, can be 3 to 5 times all other efforts combined!

Testing Objectives
- Execute a program with the intent of finding errors
- Good tests have a high probability of discovering errors
- Successful tests uncover errors
- "No errors found" does not mean it was a good test!
- Verifying functionality is a secondary goal

Testing Principles
- Tests should be traceable to requirements
- Tests should be planned before testing begins
- Pareto principle: the majority of errors trace back to a minority of components
- Test individual components first, then test them integrated
- Exhaustive testing is not possible
- Independent (third-party) testing is more effective

Software Testability
Characteristics that lead to testable software:
- Operability
- Observability
- Controllability
- Decomposability
- Simplicity
- Stability
- Understandability

Operability
- System has few bugs
- No bugs block the execution of tests
- Product evolves in functional stages
"The better it works, the more efficiently it can be tested."

Observability
- Distinct output for each input
- States and variables may be queried
- Past states are logged
- Factors affecting output are visible
- Incorrect output is easily identified
- Internal errors are reported
- Source code is accessible
"What you see is what you test."

Controllability
- All possible outputs can be generated by some input
- All code is executable by some input
- States and variables can be controlled directly
- Input and output formats are consistent and structured
- Tests are specified, automated, and reproducible
"The better we can control the software, the more the testing can be automated."

Decomposability
- Software is built from independent modules
- Modules can be tested separately
"By controlling the scope of testing, we can more quickly isolate problems and perform smarter retesting."

Simplicity
- Minimum feature set
- Minimal architecture
- Code simplicity
"The less there is to test, the more quickly we can test it."

Stability
- Changes to the system are infrequent, controlled, and do not invalidate existing tests
- Software recovers well from failures
"The fewer the changes, the fewer the disruptions to testing."

Understandability
- The design is well understood
- Dependencies are well understood
- Design changes are communicated
- Documentation is accessible, well organized, specific, detailed, and accurate
"The more information we have, the smarter we will test."

Test Characteristics
- A good test has a high probability of finding an error
- A good test is not redundant
- A good test should be "best of breed"
- A good test is neither too simple nor too complex

Test Case Design
- "Black box" testing: consider only inputs and outputs
- "White box" (or "glass box") testing: also consider internal logic paths, program states, intermediate data structures, etc.

White-Box Testing
- Guarantee that all independent paths have been exercised
- Exercise all conditions on both their true and false sides
- Execute all loops at their boundary conditions
- Exercise internal data structures
Each criterion is exercised in the sketch below.
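To make the branch and loop criteria concrete, here is a minimal Python sketch; the function and its tests are hypothetical, not from the lecture. Each assertion forces a different path through the code.

```python
# A minimal white-box testing sketch (hypothetical function and tests).
# The goal is to exercise both sides of every decision and the
# boundary iterations of the loop (zero, one, and many passes).

def clamp_sum(values, limit):
    """Sum the positive values, capping the result at limit."""
    total = 0
    for v in values:          # loop: test with 0, 1, and many elements
        if v > 0:             # decision: test both true and false sides
            total += v
    return min(total, limit)  # capping decision: test capped and uncapped

# Each test targets a specific path through the code above.
assert clamp_sum([], 10) == 0          # loop body never executes
assert clamp_sum([5], 10) == 5         # one iteration, condition true
assert clamp_sum([-3], 10) == 0        # one iteration, condition false
assert clamp_sum([6, 7], 10) == 10     # many iterations, result capped
```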

Why White-Box Testing?
- More errors occur in "special case" code that is infrequently executed
- Control flow cannot be predicted accurately by black-box testing alone
- Typographical errors can happen anywhere!

Basis Path Testing
- A white-box method [McCabe '76]
- Analyze the procedural design
- Define a basis set of execution paths
- Test cases for the basis set execute every program statement at least once

Basis Path Testing [2]
[Figure: flow-graph notation for the structured programming constructs. From SEPA 5/e]

Cyclomatic Complexity
V(G) = E - N + 2 = 4 for the flow graph in the figure
Independent paths:
  1: 1, 11
  2: 1, 2, 3, 4, 5, 10, 1, 11
  3: 1, 2, 3, 6, 8, 9, 10, 1, 11
  4: 1, 2, 3, 6, 7, 9, 10, 1, 11
V(G) is an upper bound on the number of tests needed to ensure all code has been executed. [From SEPA 5/e]
The computation is sketched below.
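As a sanity check, V(G) can be computed mechanically from the flow graph. A small sketch, assuming an adjacency-list encoding of the slide's example graph (the edge set is reconstructed from the four paths listed above):

```python
# Compute cyclomatic complexity V(G) = E - N + 2 from a flow graph
# given as an adjacency list. Node numbering follows the slide's
# example graph, reconstructed from its independent paths.

flow_graph = {
    1: [2, 11],
    2: [3],
    3: [4, 6],
    4: [5],
    5: [10],
    6: [7, 8],
    7: [9],
    8: [9],
    9: [10],
    10: [1],
    11: [],
}

nodes = len(flow_graph)
edges = sum(len(succ) for succ in flow_graph.values())
v_g = edges - nodes + 2
print(f"E={edges}, N={nodes}, V(G)={v_g}")  # E=13, N=11, V(G)=4
```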

Black-Box Testing
Focus on the functional requirements; find:
- Incorrect or missing functions
- Interface errors
- Errors in external data access
- Performance errors
- Initialization and termination errors

Black-Box Testing [2]
Questions to answer:
- How is functional validity tested?
- What classes of input will make good test cases?
- Is the system sensitive to certain inputs?
- How are data boundaries isolated?

Black-Box Testing [3]
- What data rates and data volume can the system tolerate?
- What effect will specific combinations of data have on system operation?
Boundary-focused test selection is sketched below.
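Equivalence partitioning and boundary-value analysis are the classic black-box answers to "what classes of input make good test cases". A minimal sketch with an assumed 18-65 validity range on a hypothetical function; none of these specifics come from the lecture:

```python
# Hedged illustration of equivalence partitioning and boundary-value
# analysis for a hypothetical age-validation function. The three
# partitions (below range, in range, above range) and the 18..65
# limits are assumptions made for the example.

def is_eligible(age):
    """Valid applicants are between 18 and 65 inclusive."""
    return 18 <= age <= 65

# One representative value per equivalence class...
assert is_eligible(40) is True    # valid class
assert is_eligible(5) is False    # invalid: below range
assert is_eligible(90) is False   # invalid: above range

# ...plus values at and adjacent to each boundary, where errors cluster.
for age, expected in [(17, False), (18, True), (19, True),
                      (64, True), (65, True), (66, False)]:
    assert is_eligible(age) is expected
```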

Comparison Testing
- Compare software versions against each other
- "Regression testing": finding the outputs that changed
- Weigh improvements against degradations
- The net effect depends on the frequency and impact of degradations
- When the error rate is low, a large corpus of inputs can be used
A comparison run is sketched below.
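A comparison run can be sketched as executing both versions over a shared corpus and reporting the outputs that differ; deciding whether each difference is an improvement or a degradation still takes human judgment. All names below are placeholders:

```python
# Minimal regression-comparison sketch: run the old and new versions
# of a function over a corpus of inputs and report changed outputs.
# old_version, new_version, and the corpus are illustrative stand-ins.

def old_version(x):
    return x * 2

def new_version(x):
    return x * 2 if x >= 0 else 0   # behavior changed for negatives

corpus = range(-3, 4)
changed = [(x, old_version(x), new_version(x))
           for x in corpus
           if old_version(x) != new_version(x)]

for x, old, new in changed:
    # Inspect each difference: improvement or degradation?
    print(f"input={x}: old={old}, new={new}")
```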

Generic Testing Strategies
- Testing starts at the module level and moves "outward"
- Different testing techniques are used at different times
- Testing is done both by the developer(s) and by independent testers
- Testing and debugging are separate activities

Verification and Validation
- Verification: "Are we building the product right?"
- Validation: "Are we building the right product?"
- Quality is achieved by life-cycle SQA activities and assessed by testing
- "You can't create quality by testing"

Organization of Testing [From SEPA 5/e]

How Much Test Time Is Necessary?
- Logarithmic Poisson execution-time model
- Given a sufficiently good fit, the model predicts the testing time required to reach an acceptably low failure rate
The model is written out below. [From SEPA 5/e]
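The transcript does not reproduce the model itself; for reference, the logarithmic Poisson execution-time model as usually stated (our reconstruction, so treat the notation as an assumption) gives the expected cumulative number of failures f(t) after t units of execution time as:

```latex
% Logarithmic Poisson execution-time model (reconstruction):
%   f(t) : expected cumulative failures after execution time t
%   l_0  : initial failure intensity (failures per unit time)
%   p    : exponential reduction in failure intensity per failure observed
f(t) = \frac{1}{p}\,\ln\!\left(l_0\, p\, t + 1\right)
```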

Unit Testing [From SEPA 5/e]

Top-Down Integration
- PRO: higher-level (logic) modules are tested early
- CON: lower-level (reusable) modules are tested late
[From SEPA 5/e]

Bottom-Up Integration
- PRO: lower-level (reusable) modules are tested early
- CON: higher-level (logic) modules are tested late
Stubs and drivers are sketched below. [From SEPA 5/e]
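In top-down integration, missing subordinate modules are replaced by stubs; in bottom-up integration, completed low-level modules are exercised by drivers. A minimal sketch with hypothetical module names:

```python
# Sketch of integration scaffolding. The stub stands in for an
# unfinished low-level module (top-down); the assertions at the
# bottom act as a driver for the module under test (bottom-up).
# All names here are hypothetical.

def sensor_reading_stub():
    """Stub: returns a canned value until the real driver is integrated."""
    return 42.0

def monitor(read_sensor):
    """Higher-level logic under test; the sensor function is injected."""
    value = read_sensor()
    return "ALARM" if value > 40.0 else "OK"

# Driver code exercising the top-level logic against the stub.
assert monitor(sensor_reading_stub) == "ALARM"
assert monitor(lambda: 10.0) == "OK"
```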

Hybrid Approaches
- Sandwich integration: a combination of top-down and bottom-up
- Critical modules:
  - address several requirements
  - exercise a high level of control
  - are complex or error-prone
  - have definite performance requirements
- Test critical modules ASAP!

Questions?

Software Engineering for Information Technology Lecture 12: System Design

Today's Topics
- Design Elements
- Principles for Quality Design
- Modularity & Partitioning
- Effective Modular Design
- Architectural Styles
- Mapping Models to Modules

Design Elements
- Data design: data structures for data objects
- Architectural design: modular structure of the software
- Interface design: internal and external communication
- Component-level design: procedural description of modules

[Figure: design elements linked to analysis models, with increasing detail. From SEPA 5/e]

Evaluating a Design
- A design must implement all explicit requirements (the analysis model) and the customer's implicit requirements
- A design must be readable and understandable by coders and testers
- A good design provides a complete view of data, function, and behavior

Design Principles [Davis '95]
- Consider more than one design alternative
- The design should be traceable to the analysis model
- Use design patterns
- The design structure should reflect the structure of the problem domain
- Consistent style, well-defined interfaces

Design Principles [2]
- Structured to accommodate change (easy to modify and update)
- Structured to degrade gently
- "Design is not coding, coding is not design"
- Assess quality during creation
- Review the design for semantic errors

Design Process Goals
- A hierarchical organization that makes use of the control characteristics of the software
- A modular design that logically partitions the software into functional elements
- Useful abstractions for both data and procedures

Design Goals [2]
- Modules should be functionally independent
- Modular interfaces should have minimal complexity
- Explicit linking of design elements to requirements analysis models

Modularity and Software Cost [From SEPA 5/e]

Modular Design [Meyer '88]
- Decomposability: effective decomposition reduces complexity
- Composability: enables reuse of existing design elements
- Understandability: modules that can be understood in isolation are easier to build and change

Modular Design [2]
- Continuity: changes to requirements should trigger localized changes to specific modules
- Protection: error conditions should be contained on a per-module basis

Architectural Terminology [From SEPA 5/e]

Partitioning
- Horizontal partitioning: separate branches for each major function
- Vertical partitioning: control and execution are distributed top-down
- More horizontal partitioning means more interfaces
- Vertically partitioned structures are more resilient to change

Partitioning Examples [From SEPA 5/e]

Procedural Layering [From SEPA 5/e]

Effective Modular Design
- Functional independence: maximize the cohesion of modules, minimize the coupling between modules, and promote robustness in the design
- Cohesion: one task per procedure is optimal
- Coupling: minimize module interconnection (contrasted in the sketch below)
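A before/after sketch of coupling (all names are illustrative): the first version communicates through a shared global (common coupling), while the second passes only the data it needs (data coupling), so it can be tested and reused in isolation.

```python
# Tightly coupled: both caller and callee depend on hidden shared
# state, so the function cannot be understood or tested alone.
_state = {"rate": 0.2}

def tax_tight(amount):
    return amount * _state["rate"]

# Loosely coupled: the dependency is an explicit parameter; the
# module's interface tells the whole story.
def tax_loose(amount, rate):
    return amount * rate

assert tax_tight(100) == tax_loose(100, 0.2) == 20.0
```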

Types of Coupling [From SEPA 5/e]

Design Heuristics
- Reduce coupling (implode modules)
- Improve cohesion (explode modules)
- Minimize fan-out and strive for fan-in (measured in the sketch below)
- Keep the scope of effect within the scope of control
- Reduce interface complexity
- Prefer predictable "black box" modules
- Use controlled entry points (no GOTOs!)
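Fan-out (how many modules a module calls) and fan-in (how many modules call it) can be read off a static call graph. A small sketch over a made-up call graph:

```python
# Measure fan-out and fan-in from a hypothetical static call graph,
# mapping each module to the modules it calls.

calls = {
    "main":   ["parse", "report"],
    "parse":  ["util"],
    "report": ["util"],
    "util":   [],
}

fan_out = {m: len(callees) for m, callees in calls.items()}
fan_in = {m: sum(m in callees for callees in calls.values()) for m in calls}

print(fan_out)  # {'main': 2, 'parse': 1, 'report': 1, 'util': 0}
print(fan_in)   # {'main': 0, 'parse': 1, 'report': 1, 'util': 2}
```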

Program Structures [From SEPA 5/e]

Architectural Styles
- Data-centered
- Data-flow
- Call-and-return: main program / subprogram, remote procedure call
- Layered

Data-Centered Architecture [From SEPA 5/e]

Data Flow Architectures [From SEPA 5/e]
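The data-flow (pipe-and-filter) style in the figure is often realized directly as function composition: each filter transforms its input and the pipeline wires the filters together. A minimal sketch with arbitrary example filters:

```python
# Pipe-and-filter sketch: each filter is a small function with no
# shared state; the pipeline is just their composition.

def to_words(text):
    return text.split()

def drop_short(words, min_len=3):
    return [w for w in words if len(w) >= min_len]

def count(words):
    return len(words)

def pipeline(text):
    return count(drop_short(to_words(text)))

print(pipeline("to be or not to be that is the question"))  # 4
```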

Layered Architecture [From SEPA 5/e]

Mapping Models to Modules
- Goal: map DFDs onto a modular architecture
- Transform mapping: the data flow is modeled as a series of functions with distinct input and output paths
- Transaction mapping: the data flow is modeled as a chain of events (transactions)
Transform mapping is sketched below.
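A transform-mapped design factors the incoming flow, the transform center, and the outgoing flow into separate modules coordinated by a main controller. A minimal sketch; the stages shown are hypothetical, not from the SafeHome example:

```python
# Transform mapping sketch: one module per flow region, dispatched
# by a top-level controller along the flow boundaries.

def read_input():            # incoming flow: get and validate raw data
    return "  42  "

def transform(raw):          # transform center: the core computation
    return int(raw.strip()) * 2

def write_output(result):    # outgoing flow: format and emit the result
    print(f"result = {result}")

def main_controller():       # top module coordinates the three regions
    write_output(transform(read_input()))

main_controller()  # prints: result = 84
```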

Level 0 DFD for SafeHome [From SEPA 5/e]

Level 1 DFD for SafeHome [From SEPA 5/e]

Level 2 DFD for SafeHome: refines the "monitor sensors" process [From SEPA 5/e]

Level 3 DFD for SafeHome: refines the "monitor sensors" process, with flow boundaries [From SEPA 5/e]

First-Level Factoring
- Flow boundaries are used to determine the program structure and modules
- Additional factoring introduces more detail
[From SEPA 5/e]

Questions?