University of Toronto Department of Computer Science © 2004-5 Steve Easterbrook. This presentation is available free for non-commercial use with attribution.


Slide 1: Lecture 19: Verification and Validation
- Some refreshers:
  - Summary of modelling techniques seen so far
  - Recap on definitions for V&V
- Validation techniques:
  - Inspection (see lecture 6)
  - Model checking (see lecture 16)
  - Prototyping
- Verification techniques:
  - Consistency checking
  - Making specifications traceable (see lecture 21)
- Independent V&V

Slide 2: We've seen the following UML diagrams:
- Activity diagrams
  - capture business processes involving concurrency and synchronization
  - good for analyzing dependencies between tasks
- Class diagrams
  - capture the structure of the information used by the stakeholders
  - good for analyzing the relationships between data items
  - also help to identify a modular structure for the system
- Statecharts
  - capture all possible responses of a system (or object) to events
  - good for modeling the dynamic behavior of a class of objects
  - good for analyzing event ordering, reachability, deadlock, etc.
- Use cases
  - capture the user's view of the system and its main functions
  - good starting point for specification of functionality
  - good visual overview of the main functional requirements
- Sequence diagrams (collaboration diagrams are similar)
  - capture an individual scenario (one path through a use case)
  - good for modelling interactions between users and system objects
  - good for identifying which objects (classes) participate in each use case
  - help you check that you identified all the necessary classes and operations

Slide 3: ...and the following non-UML diagrams:
- Goal models
  - capture strategic goals of stakeholders
  - good for exploring 'how' and 'why' questions with stakeholders
  - good for analysing trade-offs, especially over design choices
- Fault tree models (as an example risk analysis technique)
  - capture potential failures of a system and their root causes
  - good for analysing risk, especially in safety-critical applications
- Strategic dependency models (i*)
  - capture relationships between actors in an organisational setting
  - help to relate stakeholders' goals to their organisational setting
  - good for understanding how the organisation will be changed
- Entity-relationship models
  - capture the relational structure of information to be stored
  - good for understanding constraints and assumptions about the subject domain
  - good basis for database design
- Mode class tables, event tables and condition tables (SCR)
  - capture the dynamic behaviour of a real-time reactive system
  - good for representing the functional mapping of inputs to outputs
  - good for making behavioural models precise, for automated reasoning

Slide 4: Verification and Validation
[Diagram omitted: the problem situation is captured in a problem statement, which is implemented as a system; validation relates the problem statement back to the problem situation, while verification relates the implementation to the problem statement.]
- Validation: "Are we building the right system?"
  - Does our problem statement accurately capture the real problem?
  - Did we account for the needs of all the stakeholders?
- Verification: "Are we building the system right?"
  - Does our design meet the spec?
  - Does our implementation meet the spec?
  - Does the delivered system do what we said it would do?
  - Are our requirements models consistent with one another?

Slide 5: Refresher: V&V Criteria
- Some distinctions:
  - Domain properties: things in the application domain that are true anyway
  - Requirements: things in the application domain that we wish to be made true
  - Specification: a description of the behaviours the program must have in order to meet the requirements
- Two verification criteria (written out symbolically below):
  - The Program, running on a particular Computer, satisfies the Specification
  - The Specification, given the Domain properties, satisfies the Requirements
- Two validation criteria:
  - Did we discover (and understand) all the important Requirements?
  - Did we discover (and understand) all the relevant Domain properties?
(Source: adapted from Jackson, 1995. Diagram of the Application Domain and the Machine Domain omitted.)
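The two verification criteria are often written as entailments, in the style of Jackson and Zave's reference model for requirements. The notation below is a sketch in that spirit rather than something taken from the lecture, with P = program, C = computer, S = specification, D = domain properties, R = requirements:

```latex
% Criterion 1: the program P, running on computer C, satisfies the specification S
C, P \models S
% Criterion 2: the specification S, given the domain properties D, satisfies the requirements R
S, D \models R
```

The two validation criteria have no such formalisation: they ask whether D and R themselves faithfully capture the domain and the stakeholders' needs.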

Slide 6: V&V Example
- Example:
  - Requirement R: "Reverse thrust shall only be enabled when the aircraft is moving on the runway"
  - Domain properties D:
    - Wheel pulses on if and only if wheels turning
    - Wheels turning if and only if moving on runway
  - Specification S: Reverse thrust enabled if and only if wheel pulses on
- Verification:
  - Does the flight software, P, running on the aircraft flight computer, C, correctly implement S?
  - Does S, in the context of assumptions D, satisfy R? (see the sketch below)
- Validation:
  - Are our assumptions, D, about the domain correct? Did we miss any?
  - Are the requirements, R, what is really needed? Did we miss any?
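To make "does S, in the context of D, satisfy R?" concrete, here is a minimal sketch that treats the example as four propositions and exhaustively checks that every truth assignment satisfying S and D also satisfies R. The proposition names and the helper function are invented for illustration; they are not part of the lecture.

```python
from itertools import product

def check_s_and_d_entail_r():
    """Return any truth assignments that satisfy S and D but violate R."""
    counterexamples = []
    for pulses, turning, on_runway, reverse_on in product([False, True], repeat=4):
        D = (pulses == turning) and (turning == on_runway)  # domain properties
        S = (reverse_on == pulses)                          # specification
        R = (not reverse_on) or on_runway                   # requirement: reverse thrust -> on runway
        if D and S and not R:
            counterexamples.append((pulses, turning, on_runway, reverse_on))
    return counterexamples

if __name__ == "__main__":
    cex = check_s_and_d_entail_r()
    print("S and D entail R" if not cex else f"Counterexamples: {cex}")
```

A check like this addresses only the verification question: it says nothing about whether D is actually true of the real aircraft, or whether R is what is really needed, which are the validation questions on the slide.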

Slide 7: The Inquiry Cycle
A cycle: Prior knowledge (e.g. customer feedback) → Observe (what is wrong with the current system?) → Model (describe/explain the observed problems) → Design (invent a better system) → Intervene (replace the old system) → back to Observe.
Note the similarity with the process of scientific investigation: requirements models are theories about the world; designs are tests of those theories.
- Prior knowledge ~ initial hypotheses
- Observe ~ look for anomalies: what can't the current theory explain?
- Model ~ create/refine a better theory
- Design ~ design experiments to test the new theory
- Intervene ~ carry out the experiments (manipulate the variables)

Slide 8: Shortcuts in the Inquiry Cycle
The same cycle (prior knowledge → observe → model → design → intervene) admits two shortcuts that provide feedback without waiting to replace the old system:
- Prototyping: build a prototype and get users to try it, then observe what is wrong with the prototype.
- Model analysis: analyze the model and check its properties, then observe what is wrong with the model.

Slide 9: Prototyping
"A software prototype is a partial implementation constructed primarily to enable customers, users, or developers to learn more about a problem or its solution." [Davis 1990]
"Prototyping is the process of building a working model of the system." [Agresti 1986]
- Approaches to prototyping:
  - Presentation prototypes
    - used for proof of concept, explaining design features, etc.
    - explain, demonstrate and inform, then throw away
  - Exploratory prototypes
    - used to determine problems, elicit needs, clarify goals, compare design options
    - informal, unstructured and thrown away
  - Breadboards or experimental prototypes
    - explore technical feasibility; test suitability of a technology
    - typically no user/customer involvement
  - Evolutionary prototypes (e.g. "operational prototypes", "pilot systems")
    - development seen as a continuous process of adapting the system
    - the "prototype" is an early deliverable, to be continually improved

Slide 10: Throwaway or Evolve?
- Throwaway prototyping
  - Purpose: to learn more about the problem or its solution; discard after the desired knowledge is gained
  - Use: early or late
  - Approach: horizontal (build only one layer, e.g. the UI); "quick and dirty"
  - Advantages:
    - learning medium for better convergence
    - early delivery → early testing → less cost
    - successful even if it fails!
  - Disadvantages:
    - wasted effort if requirements change rapidly
    - often replaces proper documentation of the requirements
    - may set customers' expectations too high
    - can get developed into the final product
- Evolutionary prototyping
  - Purpose: to learn more about the problem or its solution, and reduce risk by building parts early
  - Use: incremental; evolutionary
  - Approach: vertical (partial implementation of all layers); designed to be extended/adapted
  - Advantages:
    - requirements not frozen
    - return to the last increment if an error is found
    - flexible(?)
  - Disadvantages:
    - can end up with a complex, unstructured system which is hard to maintain
    - early architectural choices may be poor
    - optimal solutions not guaranteed
    - lacks control and direction
Brooks: "Plan to throw one away - you will anyway!"

Slide 11: Model Analysis
- Verification: "Is the model well-formed?"
  - Are the parts of the model consistent with one another?
- Validation:
  - Animation of the model on small examples
  - Formal challenges: "if the model is correct then the following property should hold..."
  - 'What if' questions:
    - reasoning about the consequences of particular requirements
    - reasoning about the effect of possible changes
    - "will the system ever do the following..."
  - State exploration
    - e.g. use model checking to find traces that satisfy some property (see the sketch below)
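As a rough illustration of the state-exploration idea, the sketch below runs a breadth-first search over a toy state machine and returns a trace of events reaching a state that satisfies a given property, which is the flavour of question ("will the system ever do the following...") that a model checker answers. The state machine, its event names, and the find_trace helper are all invented for this example; real model checkers work on much richer modelling languages.

```python
from collections import deque

# A toy state machine: each state maps to a list of (event, next_state) pairs.
transitions = {
    "idle":    [("start", "running")],
    "running": [("pause", "paused"), ("stop", "idle"), ("fail", "error")],
    "paused":  [("resume", "running"), ("stop", "idle")],
    "error":   [],
}

def find_trace(initial, property_holds):
    """Breadth-first search for a reachable state satisfying the property.
    Returns the list of events leading to it, or None if no such state is reachable."""
    queue = deque([(initial, [])])
    visited = {initial}
    while queue:
        state, trace = queue.popleft()
        if property_holds(state):
            return trace
        for event, target in transitions[state]:
            if target not in visited:
                visited.add(target)
                queue.append((target, trace + [event]))
    return None

if __name__ == "__main__":
    # "Will the system ever do the following?": can the error state ever be reached?
    print(find_trace("idle", lambda s: s == "error"))  # prints ['start', 'fail']
```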

Slide 12: Basic Cross-Checks for UML
- Use case diagrams
  - Does each use case have a user?
  - Does each user have at least one use case?
  - Is each use case documented (using sequence diagrams or equivalent)?
- Class diagrams
  - Does the class diagram capture all the classes mentioned in other diagrams?
  - Does every class have methods to get/set its attributes?
- Sequence diagrams (a sketch of automating some of these checks follows this slide)
  - Is each class in the class diagram?
  - Can each message be sent?
    - Is there an association connecting the sender and receiver classes on the class diagram?
    - Is there a method call in the sending class for each sent message?
    - Is there a method call in the receiving class for each received message?
- Statechart diagrams
  - Does each statechart diagram capture (the states of) a single class? Is that class in the class diagram?
  - Does each transition have a trigger event?
    - Is it clear which object initiates each event?
    - Is each event listed as an operation for that object's class in the class diagram?
  - Does each state represent a distinct combination of attribute values?
    - Is it clear which combination of attribute values?
    - Are all those attributes shown on the class diagram?
  - Are there method calls in the class diagram for each transition?
    - ...a method call that will update attribute values for the new state?
    - ...method calls that will test any conditions on the transition?
    - ...method calls that will carry out any actions on the transition?
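Several of these cross-checks are mechanical enough to automate once the models are available as data. The sketch below checks sequence-diagram messages against a deliberately simplified, invented dictionary representation of a class diagram: it flags classes missing from the class diagram, missing associations between sender and receiver, and messages for which the receiving class has no corresponding method. Real UML tools perform such checks against the full UML metamodel rather than ad hoc dictionaries.

```python
# Invented, simplified model representations for illustration only.
class_diagram = {
    "Customer": {"methods": {"get_name"}, "associations": {"Order"}},
    "Order":    {"methods": {"add_item", "total"}, "associations": {"Customer"}},
}

# Each sequence-diagram message: (sender class, receiver class, method invoked).
sequence_messages = [
    ("Customer", "Order", "add_item"),
    ("Customer", "Order", "cancel"),  # no such method on Order: should be reported
]

def cross_check(classes, messages):
    """Report messages that violate the basic sequence/class-diagram cross-checks."""
    problems = []
    for sender, receiver, method in messages:
        if sender not in classes or receiver not in classes:
            problems.append(f"{sender}->{receiver}: class missing from class diagram")
            continue
        if receiver not in classes[sender]["associations"]:
            problems.append(f"{sender}->{receiver}: no association between sender and receiver")
        if method not in classes[receiver]["methods"]:
            problems.append(f"{sender}->{receiver}.{method}(): receiving class has no such method")
    return problems

if __name__ == "__main__":
    for problem in cross_check(class_diagram, sequence_messages):
        print(problem)
```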

Slide 13: Independent V&V
- V&V performed by a separate contractor
  - Independent V&V fulfills the need for an independent technical opinion
  - Costs between 5% and 15% of development costs
  - Studies show up to a fivefold return on investment:
    - errors are found earlier, so they are cheaper to fix and cheaper to re-test
    - clearer specifications
    - the developer is more likely to use best practices
- Three types of independence:
  - Managerial independence:
    - responsibility is separate from that of developing the software
    - can decide when and where to focus the V&V effort
  - Financial independence:
    - costed and funded separately
    - no risk of diverting resources when the going gets tough
  - Technical independence:
    - different personnel, to avoid analyst bias
    - use of different tools and techniques
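As a rough worked illustration of those cost figures (the development budget is hypothetical; only the 5-15% range and the fivefold figure come from the slide):

```latex
% Hypothetical \$10M development, with IV&V costed at 10% of development cost
\text{IV\&V cost} \approx 0.10 \times \$10\text{M} = \$1\text{M}
% At up to a fivefold return on investment
\text{potential saving} \le 5 \times \$1\text{M} = \$5\text{M}
```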

Slide 14: Summary
- Validation checks that you are solving the right problem:
  - Prototyping: gets customer feedback early
  - Inspection: domain experts read the spec carefully
  - Formal analysis: mathematical analysis of your models
  - ...plus meetings and regular communication with stakeholders
- Verification checks that your engineering steps are sound:
  - Consistency checking: do the models agree with one another?
  - Traceability: do the design, code and test cases reflect the requirements?
- Use appropriate V&V:
  - early customer feedback if your models are just sketches
  - analysis and consistency checking if your models are specifications
  - independence is important if your system is safety-critical