
1

2 References & User group
- References: Software Testing and Analysis, Mauro Pezze; Software Engineering, Ian Sommerville, Eighth Edition (2007)
- User group: AAST_Software_Verification_SE_492@yahoogroups.com

3 Lecture 1: Software Confirmation
- 1.1 Verification and Validation Basic Concepts
- 1.2 Verification vs. Validation Planning
- 1.3 Software Inspections
- 1.4 Automated Static Analysis
- 1.5 Cleanroom Software Development

4 1.1 Verification and Validation Basic Concepts
- Verification: "Are we building the product right?"
  - The software should conform to its specification.
- Validation: "Are we building the right product?"
  - The software should do what the user really requires, i.e. are the customer's needs (requirements) met?

5 The V & V process
- It is a whole life-cycle process: V & V must be applied at each stage in the software process.
- It has two principal objectives:
  - the discovery of defects in a system;
  - the assessment of whether or not the system is useful and usable in an operational situation.

6 V & V confidence
- Confidence depends on the system's purpose, user expectations and the marketing environment.
  - Software function: the level of confidence depends on how critical the software is to an organisation.
  - User expectations: users may have low expectations of certain kinds of software.
  - Marketing environment: getting a product to market early may be more important than finding defects in the program.

7 Static and dynamic verification
- Static verification (software inspections): concerned with analysis of the static system representation to discover problems; may be supplemented by tool-based document and code analysis.
- Dynamic verification (software testing): concerned with exercising and observing product behaviour; the system is executed with test data and its operational behaviour is observed.

8 Static and dynamic V&V

9 Types of testing
- Defect testing
  - Tests designed to discover system defects.
  - A successful defect test is one which reveals the presence of defects in a system.
- Validation testing
  - Intended to show that the software meets its requirements.
  - A successful test is one that shows that a requirement has been properly implemented.

10 Testing and debugging
- Defect testing and debugging are distinct processes.
- Verification and validation is concerned with establishing the existence of defects in a program.
- Debugging is concerned with locating and repairing these errors.
- Debugging involves formulating hypotheses about program behaviour and then testing these hypotheses to find the system error.

11 The debugging process

12 1.2 V & V planning
- Careful planning is required to get the most out of testing and inspection processes.
- Planning should start early in the development process.
- The plan should identify the balance between static verification and testing.
- Test planning is about defining standards for the testing process rather than describing product tests.

13 The V-model of development: V & V planning is part of the development process.

14 The software test plan

15 1.3 Software Inspections
- These involve people examining the source representation with the aim of discovering anomalies and defects.
- Inspections do not require execution of a system, so they may be used before implementation.
- They may be applied to any representation of the system (requirements, design, configuration data, test data, etc.).
- They have been shown to be an effective technique for discovering program errors.

16 Inspection pre-conditions
- A precise specification must be available.
- Team members must be familiar with the organisation's standards.
- Syntactically correct code or other system representations must be available.
- An error checklist should be prepared.
- Management must accept that inspection will increase costs early in the software process.
- Management should not use inspections for staff evaluation, i.e. finding out who makes mistakes.

17 The inspection process

18 Inspection roles

19 Inspection checklists
- A checklist of common errors should be used to drive the inspection.
- Error checklists are programming-language dependent and reflect the characteristic errors that are likely to arise in the language.
- In general, the 'weaker' the type checking, the larger the checklist.
- Examples: initialisation, constant naming, loop termination, array bounds, etc. (illustrated in the fragment below).
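To make these checklist items concrete, here is a small, deliberately faulty C fragment (hypothetical code, not taken from the lecture); each comment marks the checklist item an inspector would flag:

    /* Deliberately faulty fragment: each comment marks a checklist item an inspector would flag. */

    #define max 10                      /* constant naming: lower-case macro is easily mistaken for a variable */

    int sum_all(const int values[max])
    {
        int total;                      /* initialisation: 'total' is read before ever being set to 0 */
        for (int i = 0; i <= max; i++)  /* loop termination / array bounds: '<=' indexes one element past the end */
            total += values[i];
        return total;
    }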

20 Inspection checks 1

21 Inspection checks 2

22 Inspection rate
- About 500 statements/hour during overview.
- About 125 source statements/hour during individual preparation.
- 90-125 statements/hour can be inspected during the inspection meeting.
- Inspection is therefore an expensive process.
- Inspecting 500 lines costs about 40 man-hours of effort, i.e. about £2800 at UK rates.
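A rough sanity check on these figures, assuming a typical four-person inspection team (the team size is my assumption, not stated on the slide): 500 statements inspected at roughly 100 statements/hour needs about 5 hours of meetings, or about 20 person-hours across the team; adding individual preparation at 125 statements/hour per person (about 4 hours each, another 16 person-hours) plus the overview brings the total close to 40 man-hours, and £2800 for 40 man-hours implies a rate of roughly £70 per person-hour.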

23 1.4 Automated Static Analysis (not manual, as inspections are)
- Static analysers are software tools for source text processing.
- They analyse the program text and try to discover potentially incorrect conditions and bring these to the attention of the V & V team.
- They are very effective as an aid to inspections: they are a supplement to, but not a replacement for, inspections.

24 Examples of Static Analysis Tools
- Vigilant Sentry: advanced static code analysis for C and C++.
- TechExcel DevTrack: track bugs with configurable workflows, process tracking and customisable reports.
- LDRA Testbed: a fully automated tool for static analysis and code coverage.
- PVS-Studio: static code analyser for C/C++/C++11 that integrates into Visual Studio 2005/2008/2010.

25 Static analysis checks

26 Stages of static analysis
- Control flow analysis: checks for loops with multiple exit or entry points, finds unreachable code, etc.
- Data use analysis: detects uninitialised variables, variables written twice without an intervening use, variables which are declared but never used, etc.
- Interface analysis: checks the consistency of routine and procedure declarations and their use.
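As a concrete illustration (a hypothetical C fragment, not from the lecture), control flow and data use analysis applied to the code below could report a declared-but-unused variable, a possibly uninitialised variable, a value written twice without an intervening use, and unreachable code:

    int classify(int x)
    {
        int unused;                 /* data use analysis: declared but never used */
        int sign;                   /* data use analysis: may be used before initialisation (when x == 0) */
        int result;

        if (x > 0)
            sign = 1;
        else if (x < 0)
            sign = -1;

        result = sign;              /* 'sign' is uninitialised on the x == 0 path */
        result = sign * 2;          /* data use analysis: 'result' written twice without an intervening use */

        return result;
        result = 0;                 /* control flow analysis: unreachable code after 'return' */
    }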

27 Stages of static analysis (cont.)
- Information flow analysis: identifies the dependencies of output variables. It does not detect anomalies itself but highlights information for code inspection or review.
- Path analysis: identifies paths through the program and sets out the statements executed in each path. Again, potentially useful in the review process.
- All these stages generate vast amounts of information, so they must be used with care.

28 1.5 Cleanroom Software Development
- The name is derived from the 'cleanroom' process in semiconductor fabrication (see next slide).
- It is a software development process intended to produce software with a certifiable level of reliability.
- The philosophy is defect avoidance rather than defect removal.
- The process is based on:
  - incremental development;
  - formal specification;
  - static verification using correctness arguments;
  - statistical testing to determine program reliability.

29 NASA's Glenn Research Center cleanroom (used to create pure semiconductors)

30 The Cleanroom process

31 Cleanroom process teams
- Specification team: responsible for developing and maintaining the system specification.
- Development team: responsible for developing and verifying the software. The software is NOT executed or even compiled during this process.
- Certification team: responsible for developing a set of statistical tests to exercise the software after development. Reliability growth models are used to determine when reliability is acceptable.

32 Assignment 1
Verification and validation measurements can be accomplished using software metrics such as primitive defect/error/fault metrics, which could be:
1. Number of faults detected in each module
2. Number of requirements, design, and coding faults found during unit and integration testing
3. Number of errors by type (e.g., logic, computational, interface, documentation)
4. Number of errors by cause or origin
5. Number of errors by severity (e.g., severe, average, minor)

33 Assignment 1 (cont.)
- Fault density (FD): this measure is computed by dividing the number of faults (weighted by severity) by the size (usually in KLOC, thousands of lines of code).
- FD can be used to predict remaining faults by comparison with the expected fault density, and to determine whether sufficient testing has been completed based on predetermined goals.
- It may be weighted by severity using the equation:
  FD = (W1·S/N + W2·A/N + W3·M/N) / Size

34 Assignment 1 (cont.)
where:
  N = total number of faults
  S = number of severe faults
  A = number of average-severity faults
  M = number of minor-severity faults
  W1, W2, W3 = weighting factors (defaults are 10, 3, and 1)
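As a minimal sketch of this formula in C (the function name, the use of double arithmetic, and the example numbers are my own choices; only the default weights 10, 3 and 1 come from the slide):

    #include <stdio.h>

    /* Severity-weighted fault density: FD = (W1*S/N + W2*A/N + W3*M/N) / Size */
    double fault_density(double n, double s, double a, double m,
                         double w1, double w2, double w3, double size)
    {
        return (w1 * s / n + w2 * a / n + w3 * m / n) / size;
    }

    int main(void)
    {
        /* Illustrative numbers only (not the assignment data):
           20 faults in total, of which 2 severe, 6 average and 12 minor, in a 4000-line program. */
        double fd = fault_density(20, 2, 6, 12, 10.0, 3.0, 1.0, 4000);
        printf("FD = %.6f\n", fd);   /* prints 0.000625 */
        return 0;
    }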

35 Assignment 1 (cont.)
Given, for a software application:
  total number of faults = 145
  number of severe faults = 15
  number of average-severity faults = 50
  number of minor-severity faults = 80
  W1 = 10, W2 = 3, W3 = 1
  size of the program = 9000 lines
  expected FD = 0.0003
(a) Compute the practical fault density FD in this first case.

36 Assignment 1 (cont.)
(b) Consider another program application with the same length, whose fault classification is as follows:
  total number of faults = 145
  number of severe faults = 48
  number of average-severity faults = 72
  number of minor-severity faults = 25
  W1 = 10, W2 = 3, W3 = 1
Compute the practical fault density FD in this second case.

37 Assignment 1 (cont.)
(c) Consider the first program application with a different length (size = 15000 lines) and the same fault classification as in (a). Compute the practical fault density FD in this third case.
(d) Comment on your results from (a), (b) and (c).

