Verification and Validation

What's the difference?
- Verification: "Are you building the product right?" The software must conform to its specification.
- Validation: "Are you building the right product?" The software should do what the user really requires.

Verification and Validation Process
To be effective, V&V must be applied at each stage of the software development process.
Objectives:
- Discovery of defects in the system
- Assessment of whether or not the system is usable in an operational situation

Static and Dynamic Verification
Software inspections (static):
- Concerned with the analysis of static system representations to discover errors
- May be supplemented by tool-based analysis of documents and program code
Software testing (dynamic):
- Concerned with exercising the product using test data and observing its behavior

Requirements Verification and Validation
Requirements validation: check that the right product is being built.
- Ensures that the software being developed (or changed) will satisfy its stakeholders
- Checks the software requirements specification against the stakeholders' goals and requirements
Requirements verification: check that the product is being built right.
- Ensures that each step followed in the process of building the software yields the right products
- Checks the consistency of the software requirements specification artefacts, and of other software development products (design, implementation, ...), against the specification

Requirements Verification and Validation (2)
Help ensure delivery of what the client wants, and need to be performed at every stage of the (requirements) process:
- Elicitation: checking back with the elicitation sources ("So, are you saying that ...?")
- Analysis: checking that the domain description and requirements are correct
- Specification: checking that the defined system requirements will meet the user requirements under the stated assumptions about the domain/environment; checking conformity to well-formedness rules, standards, ...

The World and the Machine [1] (or: the problem domain and the system)
- Specification (S)
- Domain properties (D): assumptions about the environment of the system-to-be
- Requirements (R)
- Hardware (C)
- Software (P)
Validation question (do we build the right system?): if the domain (excluding the system-to-be) has the properties D, and the system-to-be has the properties S, then the requirements R will be satisfied: D and S ⇒ R.
Verification question (do we build the system right?): if the hardware has the properties C, and the software has the properties P, then the system specification S will be satisfied: C and P ⇒ S.
Conclusion: D and C and P ⇒ R.
[1] M. Jackson, 1995

Example
Requirement (R): Reverse thrust shall only be enabled when the aircraft is moving on the runway.
Domain properties:
(D1) Deploying reverse thrust in mid-flight has catastrophic effects.
(D2) Wheel pulses are on if and only if the wheels are turning.
(D3) The wheels are turning if and only if the plane is moving on the runway.
System specification (S): The system shall allow reverse thrust to be enabled if and only if wheel pulses are on.
Does D1 and D2 and D3 and S ⇒ R? Are the domain assumptions (D) right? Are the requirement (R) and the specification (S) what is really needed?
Here the argument fails: assumption D3 is false, because the plane may hydroplane on a wet runway and move without its wheels turning.
(based on P. Heymans, 2005)
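To make the check explicit, here is a minimal propositional sketch of the entailment; the atoms moving, turning, pulses, and reverse are introduced purely for illustration and are not in the original slides:

```latex
\begin{align*}
S   &: \mathit{reverse} \leftrightarrow \mathit{pulses}\\
D_2 &: \mathit{pulses}  \leftrightarrow \mathit{turning}\\
D_3 &: \mathit{turning} \leftrightarrow \mathit{moving}\\
\therefore\ R &: \mathit{reverse} \rightarrow \mathit{moving}
  &&\text{(chaining gives } \mathit{reverse} \leftrightarrow \mathit{moving}\text{)}
\end{align*}
```

When D3 fails, as it does under hydroplaning (the plane moves but the wheels do not turn), the chain from S to R breaks down: wheel pulses stay off and reverse thrust is locked out exactly when it is needed.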

Program Testing
- Can only reveal the presence of errors, never prove their absence
- A successful test is one that discovers one or more errors
- The only validation technique that should be used for non-functional (e.g. performance) requirements
- Should be used in conjunction with static verification to ensure full product coverage

Types of Testing
Defect testing:
- Tests designed to discover system defects
- A successful defect test reveals the presence of defects in the system
Statistical testing:
- Tests designed to reflect the frequency of real user inputs
- Used for reliability estimation
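As a rough illustration of statistical testing, the sketch below draws inputs according to an assumed operational profile and estimates the probability of failure on demand; the profile weights, the system_under_test function, and its injected defect are all hypothetical:

```cpp
#include <iostream>
#include <random>

// Hypothetical system under test: returns false on failure.
bool system_under_test(int input) {
    return input != 42;  // injected defect: fails on one specific input
}

int main() {
    // Assumed operational profile: 90% of real inputs are small values,
    // 10% are large; the weights are illustrative only.
    std::mt19937 rng(12345);
    std::discrete_distribution<int> category({90, 10});
    std::uniform_int_distribution<int> small(0, 99), large(100, 9999);

    const int runs = 100000;
    int failures = 0;
    for (int i = 0; i < runs; ++i) {
        int input = (category(rng) == 0) ? small(rng) : large(rng);
        if (!system_under_test(input)) ++failures;
    }
    // Estimated probability of failure on demand under this profile.
    std::cout << "POFOD ~= " << static_cast<double>(failures) / runs << '\n';
}
```

Because the test inputs mirror the operational profile, the measured failure rate estimates the reliability users would actually experience, which is the point of statistical testing.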

Verification and Validation Goals
- Establish confidence that the software is fit for its intended purpose
- The process may or may not have removed all defects from the software
- The intended use of the product determines the degree of confidence in the product that is needed

Confidence Parameters
- Software function: how critical is the software to the organization?
- User expectations: certain kinds of software have low user expectations
- Marketing environment: getting a product to market early might be more important than finding all of its defects

Testing and Debugging
These are two distinct processes:
- Verification and validation is concerned with establishing the existence of defects in a program
- Debugging is concerned with locating and repairing those defects
Debugging involves formulating a hypothesis about the program's behavior and then testing that hypothesis to find the error.

Planning
- Careful planning is required to get the most out of the testing and inspection processes
- Planning should start early in the development process
- The plan should identify the balance between static verification and testing
- Test planning must define standards for the testing process, not just describe product tests

[Figure: the V-model of development]

Software Test Plan Components
- Testing process
- Requirements traceability
- Items tested
- Testing schedule
- Test recording procedures
- Testing hardware and software requirements
- Testing constraints

Software Inspections
- People examine a source representation with the aim of discovering anomalies and defects
- Do not require execution of the system, so they may be used before implementation
- May be applied to any representation of the system (document, model, test data, code, etc.)

Inspection Success
- A very effective technique for discovering defects
- Several defects may be discovered in a single inspection, whereas in testing one defect may mask another
- They reuse domain and programming knowledge, allowing reviewers to help authors avoid making common errors

Inspections and Testing
- These are complementary processes
- Inspections can check conformance with a specification, but not conformance with the customer's real needs
- Testing must be used to check non-functional system characteristics such as performance and usability

Program Inspections
- Formalize the approach to document reviews
- The focus is on defect detection, not defect correction
- Defects uncovered may be logic errors, coding errors, or non-compliance with development standards

Inspection Preconditions
- A precise specification must be available
- Team members must be familiar with the organization's standards
- All representations must be syntactically correct
- An error checklist must be prepared in advance
- Management must accept that inspections will increase early development costs
- Inspections must not be used to evaluate staff performance

Inspection Procedure
1. A system overview is presented to the inspection team
2. Code and associated documents are distributed to the team in advance
3. Errors discovered during the inspection are recorded
4. Product modifications are made to repair the defects
5. Re-inspection may or may not be required

Inspection Teams
Have at least four team members:
- the product author
- an inspector, who looks for errors, omissions, and inconsistencies
- a reader, who reads the code to the team
- a moderator, who chairs the meeting and records the errors uncovered

Inspection Checklists
- Checklists of common errors should be used to drive the inspection
- The error checklist should be language dependent
- The weaker the type checking in the language, the larger the checklist is likely to become

Inspection Fault Classes
- Data faults (e.g. array bounds)
- Control faults (e.g. loop termination)
- Input/output faults (e.g. is all input data read?)
- Interface faults (e.g. parameter assignment)
- Storage management faults (e.g. memory leaks)
- Exception management faults (e.g. are all error conditions trapped?)
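As a hedged illustration (not from the original slides), the deliberately faulty C++ fragment below packs one instance of several of these classes into a few lines; each comment names the fault an inspector's checklist is meant to catch:

```cpp
#include <cstdio>

void report(const int* values, int n) {
    int total;                       // data fault: 'total' is never initialized
    for (int i = 0; i <= n; ++i)     // data/control fault: off-by-one reads
        total += values[i];          //   past the array bound when i == n
    double* buffer = new double[n];  // storage management fault: 'buffer'
                                     //   is never deleted (memory leak)
    std::FILE* f = std::fopen("out.txt", "w");
    std::fprintf(f, "%d\n", total);  // exception management fault: fopen may
    std::fclose(f);                  //   return NULL, which is never checked
}
```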

Inspection Rate
- 500 statements per hour during the overview
- 125 statements per hour during individual preparation
- 90-125 statements per hour during the team inspection itself
- Including preparation time, each 100 lines of code costs roughly one person-day (with a four-person team)

Automated Static Analysis
- Performed by software tools that process a source code listing
- Can be used to flag potentially erroneous conditions for the inspection team to examine
- Should be used to supplement the reviews done by human inspectors

Static Analysis Checks
- Data faults (e.g. variables used before initialization)
- Control faults (e.g. unreachable code)
- Input/output faults (e.g. the same variable output twice)
- Interface faults (e.g. parameter type mismatches)
- Storage management faults (e.g. suspect pointer arithmetic)

Static Analysis Stages (part 1)
- Control flow analysis: checks loops for multiple entry points or exits; finds unreachable code
- Data use analysis: finds uninitialized variables and variables that are declared but never used
- Interface analysis: checks the consistency of function prototypes and their uses

Static Analysis Stages (part 2)
- Information flow analysis: examines the dependencies of output variables; highlights places for inspectors to look at closely
- Path analysis: identifies the paths through the program and determines the order of statements executed on each path
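The fragment below is an illustrative sketch (not from the slides) of code containing exactly the kinds of anomalies the control-flow and data-use stages above would report:

```cpp
// Deliberately anomalous code for a static analyzer to flag.
int classify(int x) {
    int unused_flag = 0;  // data use analysis: assigned but never read
    int result;
    if (x > 0) {
        result = 1;
    }                     // data use analysis: 'result' is read while
    return result;        //   uninitialized whenever x <= 0
    result = -1;          // control flow analysis: unreachable code
}
```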

Defect Testing
Component testing:
- Usually the responsibility of the component's developer
- Tests derived from the developer's experience
Integration testing:
- The responsibility of an independent testing team
- Tests based on the system specification

Testing Priorities
- Only exhaustive testing can show that a program is defect free, and exhaustive testing is not possible
- Tests must therefore exercise the system's capabilities, not its components
- Testing old capabilities is more important than testing new capabilities
- Testing typical situations is more important than testing boundary value cases

[Figure: the defect testing process]

Testing Approaches (covered fairly well in CIS 375)
- Functional testing: black-box techniques
- Structural testing: white-box techniques
- Integration testing: incremental black-box techniques
- Object-oriented testing: cluster or thread testing techniques
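A minimal sketch of the black-box/white-box distinction, using an assumed absolute-value function: the functional cases come from the specification's input partitions, while the structural case is chosen by reading the code to cover a specific branch.

```cpp
#include <cassert>

// Function under test (assumed for illustration).
int absolute(int x) { return x < 0 ? -x : x; }

int main() {
    // Functional (black-box) tests: one case per input partition
    // of the specification, plus the boundary between them.
    assert(absolute(-5) == 5);   // negative partition
    assert(absolute(7) == 7);    // positive partition
    assert(absolute(0) == 0);    // boundary value

    // Structural (white-box) test: chosen by inspecting the code
    // so that the 'x < 0' branch is definitely executed.
    assert(absolute(-1) == 1);
    return 0;
}
```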

Interface Testing
- Needed whenever modules or subsystems are combined to create a larger system
- The goal is to identify faults due to interface errors or to invalid assumptions about interfaces
- Particularly important in object-oriented systems development

Interface Types
- Parameter interfaces: data passed from one component to another
- Shared memory interfaces: a block of memory is shared between components
- Procedural interfaces: a set of procedures encapsulated in a package or sub-system
- Message passing interfaces: sub-systems request services from each other

Interface Errors
- Interface misuse: a calling component uses the wrong parameter order, number, or types
- Interface misunderstanding: a calling component makes incorrect assumptions about the component being called
- Timing errors: race conditions and data synchronization errors

Interface Testing Guidelines
- Design tests so that the actual parameters passed are at the extreme ends of the formal parameter ranges
- Test pointer parameters with null values
- Design tests that cause components to fail
- Use stress testing in message passing systems
- In shared memory systems, vary the order in which components are activated
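A hedged sketch of the first two guidelines: it assumes a component exposing sum(const int* data, size_t n), specified (for illustration) to return 0 for a null array, and drives it with a null pointer and with extreme parameter values.

```cpp
#include <cassert>
#include <climits>
#include <cstddef>

// Component under test (assumed for illustration): sums n ints,
// defined here to return 0 when given a null array.
long long sum(const int* data, std::size_t n) {
    if (data == nullptr) return 0;
    long long total = 0;
    for (std::size_t i = 0; i < n; ++i) total += data[i];
    return total;
}

int main() {
    // Guideline: test pointer parameters with null values.
    assert(sum(nullptr, 3) == 0);

    // Guideline: pass actual parameters at the extreme ends
    // of the formal parameter ranges.
    int extremes[] = {INT_MAX, INT_MIN, 0};
    assert(sum(extremes, 3) == (long long)INT_MAX + INT_MIN);
    return 0;
}
```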

Testing Workbenches
- Provide a range of tools to reduce the time required for testing and its total cost
- Usually implemented as open systems, since testing needs tend to be organization specific
- Difficult to integrate with closed design and analysis workbenches

[Figure: a testing workbench]

Testing Workbench Adaptation
- Scripts may be developed for user interface simulators, and patterns for test data generators
- Expected test outputs may need to be prepared for comparison with actual outputs
- Special-purpose file comparison programs may also be useful (see the sketch below)
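As a small illustration of that last point, a special-purpose comparator often needs to ignore volatile fields such as timestamps when diffing expected against actual output. The sketch below is an assumption-laden example (the file names and the '#'-prefix rule for volatile lines are invented for illustration):

```cpp
#include <fstream>
#include <iostream>
#include <string>

// Read the next line that is not a volatile '#'-prefixed field
// (e.g. a timestamp), so such lines cannot cause false mismatches.
static bool next_line(std::ifstream& in, std::string& line) {
    while (std::getline(in, line))
        if (line.rfind("#", 0) != 0) return true;
    return false;
}

int main() {
    std::ifstream expected("expected.txt"), actual("actual.txt");
    std::string e, a;
    int line_no = 0;
    bool have_e = next_line(expected, e), have_a = next_line(actual, a);
    while (have_e && have_a) {
        ++line_no;
        if (e != a) {
            std::cout << "Mismatch at line " << line_no << ": '"
                      << e << "' vs '" << a << "'\n";
            return 1;
        }
        have_e = next_line(expected, e);
        have_a = next_line(actual, a);
    }
    if (have_e != have_a) { std::cout << "Files differ in length\n"; return 1; }
    std::cout << "Outputs match\n";
    return 0;
}
```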

System Testing
- Testing of critical systems must often rely on simulators for sensor and actuator data, rather than endanger people or profit
- Tests of normal operation should be driven by a safely obtained operational profile
- Tests of exceptional conditions will need to involve simulators
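A sketch of the idea, with an assumed alarm check and a scripted sensor simulator that can be driven into exceptional states no one would want to produce on real hardware (all names and limits are hypothetical):

```cpp
#include <cassert>

// Assumed system under test: alarms when temperature is out of range.
bool over_limit(double temperature_celsius) {
    return temperature_celsius > 120.0 || temperature_celsius < -40.0;
}

// Sensor simulator: replays a scripted profile, including exceptional
// readings that would be dangerous to produce on the real plant.
struct SimulatedSensor {
    const double* profile;
    int index;
    double read() { return profile[index++]; }
};

int main() {
    const double normal_profile[]  = {20.0, 35.5, 50.0};
    const double failure_profile[] = {119.0, 150.0, -60.0};

    SimulatedSensor normal{normal_profile, 0};   // operational-profile tests
    for (int i = 0; i < 3; ++i) assert(!over_limit(normal.read()));

    SimulatedSensor failing{failure_profile, 0}; // exceptional-condition tests
    assert(!over_limit(failing.read()));  // 119: still inside the limits
    assert(over_limit(failing.read()));   // 150: exceptional condition
    assert(over_limit(failing.read()));   // -60: exceptional condition
    return 0;
}
```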

Arithmetic Errors
- Use the language's exception handling mechanisms to trap errors
- Use explicit error checks for all identified error conditions
- Avoid error-prone arithmetic operations when possible
- Never use floating-point numbers (their rounding behavior is hard to predict)
- Shut the system down, using graceful degradation, if exceptions are detected
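A hedged sketch of the first two guidelines in C++: an explicit check guards a division, and the resulting exception is trapped rather than allowed to crash the system. The dose/units names are hypothetical.

```cpp
#include <iostream>
#include <stdexcept>

// Explicit error check: the division is guarded rather than trusted.
int units_per_dose(int total_units, int doses) {
    if (doses <= 0)
        throw std::domain_error("doses must be positive");
    return total_units / doses;
}

int main() {
    try {
        std::cout << units_per_dose(100, 0) << '\n';
    } catch (const std::domain_error& e) {
        // The exception handler traps the error; a critical system would
        // now degrade gracefully rather than continue blindly.
        std::cerr << "arithmetic error trapped: " << e.what() << '\n';
    }
    return 0;
}
```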

Algorithmic Errors
- Harder to detect than arithmetic errors
- Always err on the side of safety
- Use reasonableness checks on all outputs that can affect people or profit
- Set delivery limits for specified time periods, if the application domain calls for them
- Have the system request operator intervention any time a judgement call must be made
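A minimal sketch of a reasonableness check combined with a per-period delivery limit, in the spirit of Sommerville's insulin pump example; all names and limit values here are hypothetical:

```cpp
#include <iostream>

// Hypothetical safety limits for a dose-delivery controller.
const int MAX_SINGLE_DOSE = 4;   // reasonableness check on one output
const int MAX_DAILY_DOSE  = 25;  // delivery limit for a time period

int delivered_today = 0;

// Returns the dose actually delivered; rejects unreasonable values
// and requests operator intervention instead of guessing.
int deliver(int computed_dose) {
    if (computed_dose < 0 || computed_dose > MAX_SINGLE_DOSE) {
        std::cerr << "Unreasonable dose " << computed_dose
                  << ": operator intervention required\n";
        return 0;                          // err on the side of safety
    }
    if (delivered_today + computed_dose > MAX_DAILY_DOSE) {
        std::cerr << "Daily delivery limit reached\n";
        return 0;
    }
    delivered_today += computed_dose;
    return computed_dose;
}

int main() {
    deliver(3);    // normal case
    deliver(40);   // caught by the reasonableness check
    return 0;
}
```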