Slide 1: Verification & Validation
IS301 – Software Engineering
Lecture #30 – M. E. Kabay, PhD, CISSP
Assoc. Prof. Information Assurance
Division of Business & Management, Norwich University
Note content copyright © 2004 Ian Sommerville. NU-specific content copyright © 2004 M. E. Kabay. All rights reserved.

Slide 2: Topics
- Verification and validation planning
- Software inspections
- Automated static analysis
- Cleanroom software development

Slide 3: Verification and Validation
Assuring that the software system meets the user's needs.

Slide 4: Objectives
- To introduce software verification and validation and to discuss the distinction between them
- To describe the program inspection process and its role in V & V
- To explain static analysis as a verification technique
- To describe the Cleanroom software development process

Slide 5: Verification vs. Validation
- Verification: "Are we building the product right?" The software should conform to its specification.
- Validation: "Are we building the right product?" The software should do what the user really requires.

Slide 6: V & V Process
- A whole life-cycle process: V & V at each stage in the software process
- Two principal objectives:
  - Discover defects in the system
  - Assess whether the system is usable in an operational situation

Slide 7: Static and Dynamic Verification
- Software inspections: analysis of the static system representation (static verification); may be supplemented by tool-based document and code analysis
- Software testing: exercising and observing product behavior (dynamic verification); the system is executed with test data and its operational behavior observed

Slide 8: Static and Dynamic V&V [diagram]

Slide 9: Program Testing
- Can reveal the presence of errors, but NOT their absence
- A successful test discovers one or more errors
- The only validation technique for non-functional requirements
- Should be used in conjunction with static verification to provide full V&V coverage
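A defect test is designed around inputs likely to expose errors, such as boundary values. A minimal sketch (the function and its bug are hypothetical, not from the lecture):

```c
#include <assert.h>

/* Hypothetical routine under test. The (assumed) specification says
 * the range is INCLUSIVE, but the code uses '<' instead of '<=' --
 * a classic boundary defect that only a boundary-value test reveals. */
int in_range(int x, int lo, int hi) {
    return lo < x && x < hi;    /* bug: should be lo <= x && x <= hi */
}
```

An interior value such as in_range(5, 1, 10) passes and reveals nothing; the boundary input in_range(10, 1, 10) returns 0 where the specification demands 1, so that test is "successful" in the sense used above: it discovers an error.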

Slide 10: Types of Testing
- Defect testing: tests designed to discover system defects; a successful defect test reveals the presence of defects in the system
- Statistical testing: tests designed to reflect the frequency of user inputs; used for reliability estimation

Slide 11: V & V Goals
- Establish confidence that the software is fit for purpose
- This does NOT mean completely free of defects: good enough for its intended use
- The type of use determines the degree of confidence needed

Slide 12: V & V Confidence
Depends on the system's purpose, user expectations, and the marketing environment:
- Software function: the level of confidence depends on how critical the software is to the organization
- User expectations: users may have low expectations of certain kinds of software
- Marketing environment: getting the product to market early may be more important than finding defects in the program. Sound familiar?

Slide 13: Testing and Debugging
- Defect testing and debugging are distinct processes
- Verification and validation are concerned with establishing the existence of defects in a program
- Debugging is concerned with locating and repairing those errors
- Debugging involves formulating hypotheses about program behavior, then testing these hypotheses to find the system error(s)

Slide 14: Debugging Process [diagram]

Slide 15: V & V Planning
- Careful planning is required to get the most out of testing and inspection processes
- Start early in the development process
- Identify the balance between static verification and testing
- Test planning is about defining standards for the testing process rather than describing product tests

Slide 16: V-Model of Development [diagram]

Slide 17: Structure of a Software Test Plan
- Testing process
- Requirements traceability
- Tested items
- Testing schedule
- Test recording procedures
- Hardware and software requirements
- Constraints

Slide 18: Software Inspections
- People examine a representation of the system to find anomalies and defects
- May be applied to any representation of the system: requirements, design, test data...
- Do not require execution of the system, so they can be used before implementation
- A very effective technique for discovering errors

Slide 19: Inspection Success
- In testing, one defect may mask another, so several executions are required
- Inspections reuse domain and programming knowledge, so reviewers are likely to have seen the types of error that commonly arise
- Thus many different defects may be discovered in a single inspection

Slide 20: Inspections and Testing
- Inspections and testing are complementary, not opposing, verification techniques
- Both should be used during the V & V process
- Inspections can check conformance with a specification, but not conformance with the customer's real requirements
- Inspections cannot check non-functional characteristics such as performance and usability

Slide 21: Program Inspections
- A formalized approach to document reviews
- Intended explicitly for defect DETECTION (not correction)
- Defects may be logical errors, anomalies in the code that might indicate an erroneous condition (e.g., an uninitialized variable), or non-compliance with standards

Slide 22: Inspection Pre-Conditions
- A precise specification must be available
- Team members must be familiar with the organization's standards
- Syntactically correct code must be available
- An error checklist should be prepared
- Management must accept that inspection will increase costs early in the software process
- Management must not use inspections for staff evaluations; i.e., finding errors does not necessarily mean the programmer is BAD

Slide 23: Inspection Process [diagram]

Slide 24: Inspection Procedure
- A system overview is presented to the inspection team
- Code and associated documents are distributed to the inspection team in advance
- The inspection takes place and discovered errors are noted
- Modifications are made to repair discovered errors
- Re-inspection may or may not be required

Slide 25: Inspection Teams
- Made up of at least 4 members:
  - Author of the code being inspected
  - Inspector, who finds errors, omissions, and inconsistencies
  - Reader, who reads the code to the team
  - Moderator, who chairs the meeting and notes discovered errors
- Other roles: scribe, chief moderator

Slide 26: Inspection Checklists
- A checklist of common errors should be used to drive the inspection
- The error checklist is programming-language dependent: weaker type checking needs a larger checklist
- Examples: initialization, constant naming, loop termination, array bounds, etc.

Slide 27: Inspection Checks (1): Data Faults
- Are all program variables initialized before their values are used?
- Have all constants been named?
- Should the lower bound of arrays be 0, 1, or something else?
- Should the upper bound of arrays be equal to the size of the array or (size – 1)?
- If character strings are used, is a delimiter explicitly assigned?
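Two of these checklist questions, initialization and the (size – 1) upper bound, can be illustrated with a small hypothetical C fragment:

```c
#define SIZE 5   /* named constant rather than a magic number */

/* Sums a SIZE-element array. The accumulator is initialized before
 * its value is used, and the loop index runs 0 .. SIZE-1: writing
 * 'i <= SIZE' here would read one element past the end of the array. */
int sum_array(const int *a) {
    int total = 0;                  /* initialized before use */
    for (int i = 0; i < SIZE; i++)  /* upper bound is size - 1, not size */
        total += a[i];
    return total;
}
```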

Slide 28: Inspection Checks (2): Control Faults
- For each conditional statement, is the condition correct?
- Is each loop certain to terminate?
- Are compound statements correctly bracketed?
- In case statements, are all possible cases accounted for?
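The loop-termination question is worth a concrete (hypothetical) example: whether a loop terminates can hinge on the exact form of its condition.

```c
/* Counts the steps of a countdown by 2. With the condition 'i > 0'
 * the loop terminates for every n; had it been written 'i != 0',
 * an odd n would step over zero and the loop would never end. */
int count_steps(int n) {
    int steps = 0;
    for (int i = n; i > 0; i -= 2)
        steps++;
    return steps;
}
```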

Slide 29: Inspection Checks (3): Input/Output and Interface Faults
- Are all input variables used?
- Are all output variables assigned a value before they are output?
- Do all function and procedure calls have the correct number of parameters?
- Do formal and actual parameter types match?
- Are the parameters in the right order?
- If components access shared memory (globals), do they have the same model of the shared memory structure?

Slide 30: Inspection Checks (4): Storage and Exception Management Faults
- If a linked structure is modified, have all links been correctly reassigned?
- If dynamic storage is used, has space been allocated correctly?
- Is space explicitly deallocated after it is no longer required?
- Have all possible error conditions been taken into account?
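The linked-structure and deallocation questions can be made concrete with a small hypothetical list operation: the inspection checks that the surviving link is saved before the node is freed, and that the node is freed exactly once.

```c
#include <stdlib.h>

/* Hypothetical singly linked list node. */
struct node { int value; struct node *next; };

/* Removes and frees the head node, returning the new head.
 * The 'next' link is saved BEFORE free(): reading head->next
 * after freeing head would be a use-after-free fault. */
struct node *pop_head(struct node *head) {
    if (head == NULL)
        return NULL;
    struct node *rest = head->next;  /* reassign the link first */
    free(head);                      /* deallocate when no longer required */
    return rest;
}
```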

Slide 31: Inspection Rate
Estimating the cost of inspecting 500 lines of code:
- Overview: 500 statements/hour = 1 hour x 4 people = 4 person-hours
- Individual preparation: 125 source statements/hour = 4 hours x 4 people = 16 person-hours
- Meeting: ~100 statements/hour can be inspected with 4 people in the team = ~5 hours x 4 people = ~20 person-hours
- Inspecting 500 lines thus takes ~40 person-hours; inspection is therefore an expensive process
- Estimate programmer salary at $80K per 2,000 hours, ~$40/hour; multiply by 2 for extended costs = $80/hour
- Therefore the cost of 40 person-hours of effort is ~$3,200
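The arithmetic above can be checked with a short calculation (the rates and the $80/hour loaded rate are the slide's estimates, not universal figures):

```c
/* Total person-hours to inspect 'loc' lines with a team of 'team'
 * people, at the slide's rates: 500 stmts/hr overview, 125 stmts/hr
 * individual preparation, ~100 stmts/hr in the inspection meeting. */
double inspection_hours(double loc, double team) {
    return loc / 500 * team    /* overview    */
         + loc / 125 * team    /* preparation */
         + loc / 100 * team;   /* meeting     */
}

/* Cost at the slide's loaded rate of $80/hour. */
double inspection_cost(double loc, double team) {
    return inspection_hours(loc, team) * 80.0;
}
```

For 500 lines and a 4-person team this gives 4 + 16 + 20 = 40 person-hours, i.e. $3,200, matching the slide.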

Slide 32: Automated Static Analysis
- Static analyzers are software tools for source text processing
- They parse the program text, try to discover potentially erroneous conditions, and report them to the V & V team
- An effective aid to inspections: a supplement to, but not a replacement for, inspections

Slide 33: Static Analysis Checks [table]

Slide 34: Stages of Static Analysis (1)
- Control flow analysis: checks for loops with multiple exit or entry points, finds unreachable code, etc.
- Data use analysis: detects uninitialized variables, variables written twice without an intervening assignment, variables which are declared but never used, etc.
- Interface analysis: checks the consistency of routine and procedure declarations and their use

Slide 35: Stages of Static Analysis (2)
- Information flow analysis: identifies the dependencies of output variables. Does not detect anomalies itself, but highlights information for code inspection or review.
- Path analysis: identifies paths through the program and sets out the statements executed in each path. Again, potentially useful in the review process.
- Both of these stages generate vast amounts of information and must be used with care.

Slide 36: LINT Static Analysis

    138% more lint_ex.c
    #include <stdio.h>
    printarray (Anarray)
    int Anarray;
    {
        printf("%d", Anarray);
    }

    main ()
    {
        int Anarray[5]; int i; char c;
        printarray (Anarray, i, c);
        printarray (Anarray);
    }
    139% cc lint_ex.c
    140% lint lint_ex.c
    lint_ex.c(10): warning: c may be used before set
    lint_ex.c(10): warning: i may be used before set
    printarray: variable # of args.  lint_ex.c(4) :: lint_ex.c(10)
    printarray, arg. 1 used inconsistently  lint_ex.c(4) :: lint_ex.c(10)
    printarray, arg. 1 used inconsistently  lint_ex.c(4) :: lint_ex.c(11)
    printf returns value which is always ignored

Slide 37: Use of Static Analysis
- Particularly valuable for a language such as C: weak typing means many errors go undetected by the compiler
- Less cost-effective for languages like Java: strong type checking can detect many errors during compilation

Slide 38: Cleanroom Software Development
- Name derived from the 'cleanroom' process in semiconductor fabrication
- Philosophy is defect avoidance rather than defect removal
- The software development process is based on:
  - Incremental development
  - Formal specification
  - Static verification using correctness arguments
  - Statistical testing to determine program reliability

Slide 39: Cleanroom Process [diagram]

Slide 40: Cleanroom Process Characteristics
- Formal specification using a state transition model
- Incremental development
- Structured programming: limited control and abstraction constructs are used
- Static verification using rigorous inspections
- Statistical testing of the system (covered in Ch. 21, but not in this course)

Slide 41: Incremental Development [diagram]

Slide 42: Formal Specification and Inspections
- A state-based model serves as the system specification
- The inspection process checks the program against this model
- The programming approach is defined so that the correspondence between the model and the system is clear
- Mathematical arguments (not proofs) are used to increase confidence in the inspection process
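A state-based specification of this kind can be sketched as a pure transition function (the states and events here are invented for illustration, not taken from the Cleanroom literature): each case can be compared line by line against the state diagram during inspection, without ever running the program.

```c
/* Hypothetical three-state model of a job. The transition function
 * IS the specification: an inspector checks each case against the
 * state diagram rather than executing the code. */
typedef enum { IDLE, RUNNING, DONE } state_t;
typedef enum { EV_START, EV_FINISH, EV_RESET } event_t;

state_t transition(state_t s, event_t e) {
    switch (s) {
    case IDLE:    return (e == EV_START)  ? RUNNING : IDLE;
    case RUNNING: return (e == EV_FINISH) ? DONE    : RUNNING;
    case DONE:    return (e == EV_RESET)  ? IDLE    : DONE;
    }
    return s;   /* defensive: unreachable when all states are covered */
}
```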

Slide 43: Cleanroom Process Teams
- Specification team: develops and maintains the system specification
- Development team: develops and verifies the software; the software is NOT executed or even compiled during this process
- Certification team: develops a set of statistical tests to exercise the software after development; reliability growth models are used to determine when reliability is acceptable

Slide 44: Cleanroom Process Evaluation
- Results at IBM have been impressive: few discovered faults in delivered systems
- Independent assessment: the process is no more expensive than other approaches, with fewer errors than in a traditional development process
- It is not clear how this approach can be transferred to an environment with less skilled or less highly motivated engineers

Slide 45: Key Points (1)
- Verification and validation are not the same thing. Verification shows conformance with a specification; validation shows that the program meets the customer's needs.
- Test plans should be drawn up to guide the testing process.
- Static verification techniques involve examination and analysis of the program for error detection.

Slide 46: Key Points (2)
- Program inspections are very effective in discovering errors: program code is checked by a small team to locate software faults
- Static analysis tools can discover program anomalies which may be an indication of faults in the code
- The Cleanroom development process depends on incremental development, static verification, and statistical testing

Slide 47: Homework
- Required by Wed 17 Nov 2004 (for 32 points): 22.1 – & (think hard)
- Optional by Mon 29 Nov 2004: for up to 25 extra points, write detailed answers for any or all of 22.6,

Slide 48: DISCUSSION

