1 James Nowotarski 10 October 2006 SE 325/425 Principles and Practices of Software Engineering Autumn 2006

2 Today's Agenda
Requirements process recap: 30 minutes
V model: 45 minutes
Testing techniques: 15 minutes
*** Break ***
Current event reports: 30 minutes
Testing techniques: 60 minutes
Wrap-up: 15 minutes

3 Context
Framework activities: Communication (project initiation, requirements), Planning & Managing, Modeling (analysis, design), Construction (code, test), Deployment (delivery, support)
Requirements engineering tasks (Ch. 7-8): elicitation, elaboration, specification
Primary deliverables: functional reqts, non-functional reqts, analysis model, software reqts spec

4 Begin with the end in mind: Sample SRS
Overview
Revision History
Table of Contents
1.0 Introduction
  1.1 Purpose
  1.2 Scope
  1.3 References
  1.4 Assumptions and Dependencies
2.0 Use-Cases
3.0 Requirements
  3.1 Functional Requirements
  3.2 Non-Functional Requirements
    3.2.1 Usability
    3.2.2 Reliability
    3.2.3 Performance
    3.2.4 Supportability
4.0 Online User Documentation and Help System Requirements
5.0 Design Constraints
6.0 Purchased Components
7.0 Interfaces
  7.1 User Interfaces
  7.2 Hardware Interfaces
  7.3 Software Interfaces
  7.4 Communication Interfaces
8.0 Licensing Requirements
9.0 Legal, Copyright, and Other Notices
10.0 Applicable Standards
Index
Glossary

5 5 Functional vs. Non-Functional A functional requirement (FR) describes what the system needs to do. Example: ‘The system shall display the current customer balance’.

6 6 Functional vs. Non-Functional A non-functional requirement (NFR) describes a constraint upon the solution space. Examples: Performance, flexibility, reliability, usability, portability, maintainability, safety, and security. Also called “quality” requirements, “ilities”, or even “systemic” requirements. Emergent Properties: An NFR that is realized through the careful implementation of other requirements on which it depends. Example: “The query must return its results in less than three seconds” is only realizable once the architecture and much of the system functionality has been implemented.
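
A minimal sketch, not from the slides, of how a performance NFR such as the three-second query becomes checkable once the functionality exists; run_query is a hypothetical stand-in for the real query interface.

import time

def run_query(sql):
    # Hypothetical stand-in for the implemented system's query interface.
    time.sleep(0.1)
    return ["row 1", "row 2"]

def test_query_response_time():
    # NFR: "The query must return its results in less than three seconds."
    start = time.perf_counter()
    results = run_query("SELECT balance FROM customer WHERE id = 42")
    elapsed = time.perf_counter() - start
    assert results, "functional check: the query returned no results"
    assert elapsed < 3.0, f"NFR violated: query took {elapsed:.2f}s"

test_query_response_time()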

7 7 The Requirements Process Elicitation: Proactively working with stakeholders to discover their needs, identify potential conflicts, and establish a clear scope and boundaries for the project. Elaboration (Analysis): Gaining a deeper understanding of the product and its interactions. Specification: Production of a series of documents that capture the system and software requirements in order to support their systematic review, evaluation, and approval. Validation: Inspecting requirements to ensure their correctness. Management: Issues such as software configuration management, traceability, impact analysis, and version control.

8 Elicitation
Key question: What does the system need to do? How well does it need to do it?
Deliverables: functional requirements, quality requirements
Steps: 1. Review as-is system; 2. Identify requirements of to-be system
Techniques: re-engineering, AHP, interviewing, prototyping, observation, surveys/focus groups, Joint Application Design (JAD), benchmarking
Roles: business analyst
Estimating guidelines

9 9 Joint Application Design (JAD)

10 Elicitation Techniques: AHP (diagram)
Goal: Develop Software
Criteria (quality reqts): Performance (.08), Usability (.64), Flexibility (.28)
Alternatives: Architecture Choice 1 (.41), Architecture Choice 2 (.59)
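
A minimal sketch of how AHP turns pairwise comparisons into priority weights like those on the slide; the comparison matrix below is invented for illustration, and the column-normalization shortcut stands in for the full eigenvector calculation.

def ahp_priorities(matrix):
    # Approximate AHP priority vector: normalize each column, then average
    # across each row (the common column-normalization shortcut).
    n = len(matrix)
    col_sums = [sum(matrix[i][j] for i in range(n)) for j in range(n)]
    normalized = [[matrix[i][j] / col_sums[j] for j in range(n)] for i in range(n)]
    return [sum(row) / n for row in normalized]

# Hypothetical pairwise comparisons of the quality requirements
# (Performance vs. Usability vs. Flexibility): 5 means "strongly more
# important", 1/5 the reverse. These values are illustrative only.
criteria = [
    [1,   1/5, 1/3],   # Performance
    [5,   1,   3  ],   # Usability
    [3,   1/3, 1  ],   # Flexibility
]
print(ahp_priorities(criteria))  # roughly [0.11, 0.63, 0.26]: usability dominates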

11 11 Requirement Qualities Each individual requirement should be: Concise Correct Non-ambiguous Feasible Verifiable Traceable Manageable

12 Today's Agenda
Requirements process recap: 30 minutes
V model: 45 minutes
Testing techniques: 15 minutes
*** Break ***
Current event reports: 30 minutes
Testing techniques: 60 minutes
Wrap-up: 15 minutes

13 Where time is spent on big systems projects
Activity: % of time
Planning and Modeling: 33%
???: 50%
All other Construction and Deployment: 17%

14 Why we test
We test software because, under normal development practices, we cannot guarantee its correctness.
Testing is the art of devising and executing test cases that have a high likelihood of finding errors.
A small subset of faults accounts for most failures during operation; we need to "test smart" in order to find these faults.

15 Verification & Validation
Testing is just part of a broader topic referred to as Verification and Validation (V&V).
Pressman/Boehm: Verification: Are we building the product right? Validation: Are we building the right product?
IEEE Std 1012-1998: Requirements validation is the process of evaluating an implemented system to determine whether it conforms to the specified requirements.

16 Some overriding principles
DRTFT: Do it right the first time
Stage containment
"The first mistake that people make is thinking that the testing team is responsible for assuring quality." -- Brian Marick

17 Stage Containment (diagram)
The framework activities (Communication: project initiation, requirements; Planning & Managing; Modeling: analysis, design; Construction: code, test; Deployment: delivery, support) are annotated with the point of error origination and the point of error detection; the labels error, defect, and fault indicate how far a problem escapes before it is caught.

18 V-Model (diagram)
Left side (flow of work): Requirements, Functional Design, Technical Design, Detailed Design, Code
Right side (testing): Unit Test, Integration Test, System Test, Acceptance Test
Testing: test that the product implements the specification
Legend: flow of work, verification, validation

19 Terminology
Testing: Ensures that the components of the application are put together correctly, according to the different levels of specification. Testing consists of exercising a newly integrated portion of the application by running a number of test cases in controlled mode. Each test case is designed to test a statement of the specification.
Verification: The checking of a deliverable against a standard of work set out for the process that produces the deliverable. The sources for this standard include: 1) exit criteria defined for each task; 2) design and coding standards set up by the project team; 3) evidence that the process prescribed for executing the activity has been followed; 4) consistency and completeness checks; 5) models and templates described as part of the application architecture; 6) standards set up for using the technical architecture. Verification is usually done through reviews, inspections, and walkthroughs.
Validation: The checking of a deliverable against the specification/requirements implemented by that deliverable (e.g., checking a database design against the data model).
Stage Containment: A project management objective driven by the desire to minimize the number of defects or faults discovered after the work has been completed and handed off to the next stage of the development process. Activities to achieve stage containment include verification, validation, and testing.
Entry/Exit Criteria: Predefined standards that deliverables must meet before exiting one development stage and entering another. A team handing work off to another part of the project must fully satisfy its exit criteria, while the receiving team verifies that the work meets its standard entry criteria. The entry criteria for a receiving team are frequently the same as the exit criteria for the delivering team.
V Model: The V model of verification, validation, and testing provides a structured testing framework throughout the development process and ensures that both verification and validation are applied to deliverables within a system.

20 V-Model: Optional Tests (diagram, same V-model as slide 18)
May include: operational readiness test, benefits realization test
May include: performance test, usability test, stress test, inter-application integration test, hardware/software integration test, restart/recovery test

21 V-Model: The Testing Process (diagram, same V-model as slide 18)
For each activity on the left side of the V: develop test conditions and cycles; develop entry and exit criteria for the corresponding test.
For each activity on the right side of the V: set up the environment and test team; execute the test, capture actual and expected results, identify errors, correct them, and regression test.

22 Good tests
High probability of finding an error
Not redundant
"Bang for buck"

23 Test Coverage Metrics
Statement coverage: goal is to execute each statement at least once.
Branch coverage: goal is to execute each branch at least once.
Path coverage: a path is a feasible sequence of statements that can be taken during the execution of the program.
What % of each type of coverage does this test execution provide? (Refers to the flow graph on the slide marking tested vs. not-tested nodes.)
Statement coverage: 5/10 = 50%; branch coverage: 2/6 ≈ 33%; path coverage: 1/4 = 25%. Where does the 4 come from?
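
A small illustration of the three metrics on an invented function (not the flow graph on the slide), showing how a single test execution scores differently under each measure.

def classify(a, b):
    result = []
    if a > 0:            # branch 1: true / false
        result.append("a positive")
    if b > 0:            # branch 2: true / false
        result.append("b positive")
    return result

# A single test execution:
classify(1, -1)
# Statement coverage: the "b positive" append is never executed.
# Branch coverage: only the true side of branch 1 and the false side of
# branch 2 are exercised (2 of 4 branch outcomes).
# Path coverage: only 1 of the 4 feasible paths (TT, TF, FT, FF) is taken,
# so statement, branch, and path coverage give progressively lower figures.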

24 24 Design for testability Understandability/Simplicity Operability/Controllability Antibugging
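
A minimal sketch, with invented names, of what these properties can look like in code: the injected date gives controllability, the assertion is a simple antibugging measure, and the single plain branch keeps the logic understandable.

from datetime import date

def compute_discount(order_total, today=None):
    # Controllability: a test can inject the date instead of relying on the
    # system clock, so the date-dependent branch is easy to drive.
    today = today or date.today()

    # Antibugging: make bad input fail loudly, close to its source.
    assert order_total >= 0, f"negative order total: {order_total}"

    if today.month == 12:   # keep the logic simple and understandable
        return order_total * 0.90
    return order_total

assert compute_discount(100, today=date(2006, 12, 1)) == 90.0
assert compute_discount(100, today=date(2006, 7, 1)) == 100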

26 V-Model (diagram, as on slide 18), highlighting Unit Testing

27 Unit testing
Focuses on a single software component or module.
The design description guides test generation to ensure coverage of important control paths and to test the boundaries of the module.
Focuses on internal processing logic and data structures.
Specific tests / common errors

28 Unit test environment (Pressman, 6th ed., Figure 13.4)
Because no unit operates in a vacuum, it is necessary to create stubs and drivers: a driver applies test cases to the module to be tested and collects the results, while stubs stand in for the modules it calls.
The test cases exercise the module's interface, local data structures, boundary conditions, independent paths, and error handling paths.
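
A minimal sketch of this environment using Python's unittest, with invented module names: the test case class plays the driver role, and a canned rate-lookup function plays the stub.

import unittest

def get_exchange_rate_stub(currency):
    # Stub: stands in for a subordinate module (e.g., a rate service the
    # unit would normally call) and returns a canned answer.
    return {"EUR": 0.80, "GBP": 0.55}[currency]

def convert(amount, currency, rate_lookup=get_exchange_rate_stub):
    # Module under test: interface, local data, and an error handling path.
    if amount < 0:
        raise ValueError("amount must be non-negative")
    return round(amount * rate_lookup(currency), 2)

class ConvertDriver(unittest.TestCase):
    # Driver: applies test cases to the module and checks the results.
    def test_nominal(self):
        self.assertEqual(convert(100, "EUR"), 80.0)

    def test_boundary(self):
        self.assertEqual(convert(0, "GBP"), 0.0)

    def test_error_path(self):
        with self.assertRaises(ValueError):
            convert(-1, "EUR")

if __name__ == "__main__":
    unittest.main()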

29 V-Model (diagram, as on slide 18), highlighting Integration Testing

30 Integration testing
Integration testing is a systematic technique for constructing the software architecture while at the same time conducting tests to uncover errors associated with interfacing.
"Big bang" integration is not advisable. Use incremental, piecemeal approaches: top-down, bottom-up, sandwich.
Regression testing: ensure changes do not introduce unintended side effects.

31 Top-down integration
Modules are integrated by moving downward through the control hierarchy (diagram: modules M1 through M8).
Depth-first approach: incorporates all components on a major control path. Example: M1, M2, M6, M8.
Breadth-first approach: incorporates all components directly subordinate at each level. Example: M2, M3, M4.
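
A rough sketch, with invented behavior, of how a depth-first top-down step might look in code: the control module M1 is exercised first with stubs for its subordinates, and the real M2 is later swapped in along the control path.

def m2_stub(data):
    return "canned-m2-result"       # placeholder until the real M2 is integrated

def m6_stub(data):
    return "canned-m6-result"

def m1(data, m2=m2_stub, m6=m6_stub):
    # Top-level control logic under test; its collaborators are injectable.
    return {"validated": m2(data), "formatted": m6(data)}

# Early integration test: real M1, stubbed M2 and M6.
assert m1("order-42") == {"validated": "canned-m2-result",
                          "formatted": "canned-m6-result"}

# Later the real M2 replaces its stub and the same test driver is re-run.
def m2_real(data):
    return data.upper()

assert m1("order-42", m2=m2_real)["validated"] == "ORDER-42"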

32 Bottom-up integration (diagram)
Low-level modules are combined into clusters (cluster 1, cluster 2, cluster 3), each exercised by a driver (D1, D2, D3) until the controlling modules (Ma, Mb, Mc) are integrated and the drivers are removed.
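
A rough sketch, with invented modules, of the bottom-up counterpart: a temporary driver (D1) exercises a low-level cluster until the controlling module exists.

def parse_record(line):             # low-level module in cluster 1
    return line.strip().split(",")

def total_amount(records):          # low-level module in cluster 1
    return sum(float(fields[1]) for fields in records)

def driver_d1():
    # D1: feeds test data through the cluster and checks the combined result.
    records = [parse_record(" a,10.0 "), parse_record("b,2.5")]
    assert total_amount(records) == 12.5

driver_d1()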

33 Regression testing
The re-execution of some subset of tests that have already been conducted, to ensure that changes have not introduced unintended side effects. Whenever change is introduced, or existing tests uncover errors that are then fixed, there is an opportunity for new errors to be introduced. Supported by capture/playback tools.
Regression testing includes: a representative sample of tests that will exercise all software functions; additional tests that focus on software functions that are likely to be affected by the change; tests focusing on the software components that have been changed.
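
The selection described above could be sketched as follows; the test names and the mapping from components to affected tests are hypothetical.

def select_regression_suite(representative, affected_map, changed_components):
    # 1) start from a representative sample that exercises all functions,
    # 2) add tests for functions likely to be affected by the change,
    # 3) which also pulls in the tests for the components that changed.
    selected = set(representative)
    for component in changed_components:
        selected.update(affected_map.get(component, []))
    return sorted(selected)

affected_map = {
    "billing": ["test_invoice_total", "test_late_fee"],
    "login":   ["test_password_rules"],
}
suite = select_regression_suite(
    representative=["test_report_layout"],
    affected_map=affected_map,
    changed_components=["billing"],
)
print(suite)  # ['test_invoice_total', 'test_late_fee', 'test_report_layout']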

34 Today's Agenda
Requirements process recap: 30 minutes
V model: 45 minutes
Testing techniques: 15 minutes
*** Break ***
Current event reports: 30 minutes
Testing techniques: 60 minutes
Wrap-up: 15 minutes

35 35 White box vs. Black box

36 White box vs. Black box (diagram)
Inputs designed to test a system function; outputs should match the intended functionality.

37 White box vs. Black box
White box (structural testing): logic paths, loops, internal variables, error conditions
Black box (functional testing): incorrect/missing functions, interface errors, performance errors
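
A minimal sketch, with an invented shipping-cost function, of how the two views differ: the black-box test comes from the stated function alone, while the white-box tests are chosen by reading the code's branches.

def shipping_cost(weight_kg):
    if weight_kg <= 0:
        raise ValueError("weight must be positive")
    if weight_kg <= 2:       # internal branch a white-box test would target
        return 5.00
    return 5.00 + (weight_kg - 2) * 1.50

# Black-box test: derived from the functional statement
# "orders over 2 kg pay 1.50 per extra kg", with no knowledge of the code.
assert shipping_cost(4) == 8.00

# White-box tests: chosen from the structure, so that the branch boundary
# and the error condition path are both exercised.
assert shipping_cost(2) == 5.00
try:
    shipping_cost(0)
except ValueError:
    pass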

38 V-Model (diagram): Unit Test and Integration Test sit on the white-box side; System Test and Acceptance Test sit on the black-box side.

39 Basis Path Testing
First proposed by Tom McCabe in 1976.
Enables the test case designer to derive a logical complexity measure of the procedural design.
Uses this measure as the basis for defining an upper bound on the number of execution paths needed to guarantee that every statement in the program is executed at least once.
Uses a notation known as a flow graph; each structured construct has a corresponding flow graph symbol.

40 Flow Graph Notation (diagram)
Symbols for the structured constructs: sequence, if, case, while, until.
Each circle (node) represents one or more nonbranching source code statements.

41 Flow chart and corresponding flow graph (diagram)
The flow chart has nodes 1 through 11; in the corresponding flow graph the sequential nodes 2,3 and 4,5 are merged, giving nodes 1; 2,3; 4,5; 6; 7; 8; 9; 10; 11.

42 Compound logic
A compound condition occurs when one or more Boolean operators (logical OR, AND, NAND, NOR) is present in a conditional statement.
Example: if a OR b then do X else do Y end if
(Diagram: the compound condition is drawn with a separate predicate node for a and for b, each with its own true/false branches leading to x and y.)

43 Independent paths
Any path through the program that introduces at least one new set of processing statements or a new condition. In terms of a flow graph, an independent path must move along at least one edge that has not previously been traversed.
In the previous example:
path 1: 1-11
path 2: 1-2-3-4-5-10-1-11
path 3: 1-2-3-6-8-9-10-1-11
path 4: 1-2-3-6-7-9-10-1-11
The path 1-2-3-4-5-10-1-2-3-6-8-9-10-1-11 is NOT an independent path because it does not traverse any new edges.

44 Basis Set
These paths constitute a basis set for the flow graph. Design tests to execute these paths.
Guarantees: every statement has been executed at least once; every condition has been executed on both its true and false sides.
There is more than one correct set of basis paths for a given problem.
How many paths should we look for? Calculate the cyclomatic complexity V(G):
V(G) = E - N + 2 (E = number of edges, N = number of nodes)
V(G) = P + 1 (P = number of predicate nodes)
V(G) = R (R = number of regions)
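
A minimal sketch of the E - N + 2 calculation, using the edges implied by the independent paths listed on slide 43; the edge list is my reading of that flow graph, with the merged nodes 2,3 and 4,5 treated as single nodes.

def cyclomatic_complexity(edges):
    # V(G) = E - N + 2, where N counts the nodes appearing in the edge list.
    nodes = {n for edge in edges for n in edge}
    return len(edges) - len(nodes) + 2

# Edges implied by the four independent paths on slide 43.
edges = [
    (1, "2,3"), ("2,3", "4,5"), ("4,5", 10), (10, 1), (1, 11),
    ("2,3", 6), (6, 7), (6, 8), (7, 9), (8, 9), (9, 10),
]
print(cyclomatic_complexity(edges))  # 4, matching the four independent paths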

45 An Example: Procedure Average (Pressman, p. 397)
* This procedure computes the average of 100 or fewer numbers that lie between bounding values; it also computes the sum and the total number valid.
INTERFACE RETURNS average, total.input, total.valid;
INTERFACE ACCEPTS value, minimum, maximum;
TYPE value[1:100] IS SCALAR ARRAY;
TYPE average, total.input, total.valid, minimum, maximum, sum IS SCALAR;
TYPE i IS INTEGER;

46 Continued…
i = 1;
total.input = total.valid = 0;
sum = 0;
DO WHILE value[i] <> -999 AND total.input < 100
  increment total.input by 1;
  IF value[i] >= minimum AND value[i] <= maximum
    THEN increment total.valid by 1;
      sum = sum + value[i]
    ELSE skip
  ENDIF
  increment i by 1;
ENDDO
IF total.valid > 0
  THEN average = sum / total.valid;
  ELSE average = -999;
ENDIF
END average
(The statements are numbered 1-13 on the slide to form the flow graph nodes used on the next two slides.)

47 Steps for deriving test cases
1. Use the design or code as a foundation and draw the corresponding flow graph (diagram: nodes 1-13 for procedure Average).
2. Determine the cyclomatic complexity of the resultant flow graph:
V(G) = 17 edges - 13 nodes + 2 = 6
V(G) = 5 predicate nodes + 1 = 6

48 Steps for deriving test cases (continued)
3. Determine a basis set of linearly independent paths:
Path 1: 1-2-10-11-13
Path 2: 1-2-10-12-13
Path 3: 1-2-3-10-11-13
Path 4: 1-2-3-4-5-8-9-2…
Path 5: 1-2-3-4-5-6-8-9-2…
Path 6: 1-2-3-4-5-6-7-8-9-2…
4. Prepare test cases that will force execution of each path in the basis set.
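
As a sketch of step 4, a rough Python translation of the Average procedure together with a few candidate inputs; the translation and the choice of cases are illustrative and do not claim to cover every basis path.

def average(values, minimum, maximum):
    # Rough translation of the pseudocode: average at most 100 values,
    # stop at the -999 sentinel, count only values inside [minimum, maximum].
    total_input = total_valid = 0
    total_sum = 0.0
    i = 0
    while i < len(values) and values[i] != -999 and total_input < 100:
        total_input += 1
        if minimum <= values[i] <= maximum:
            total_valid += 1
            total_sum += values[i]
        i += 1
    avg = total_sum / total_valid if total_valid > 0 else -999
    return avg, total_input, total_valid

# Candidate test cases, each aimed at a different kind of path:
assert average([-999], 0, 100) == (-999, 0, 0)          # sentinel first: loop never entered
assert average([10, 20, -999], 0, 100) == (15.0, 2, 2)  # all values within bounds
assert average([10, 500, -999], 0, 100) == (10.0, 2, 1) # one value outside the bounds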

49 Infeasible Paths
Some paths are infeasible.
Begin
1. Readln(a);
2. If a > 15 then
3.   b := b + 1;
   else
4.   c := c + 1;
5. if a < 10 then
6.   d := d + 1;
7. end
V(G) = 3, so there are three basis paths:
Path 1: 1,2,3,5,7
Path 2: 1,2,4,5,7
Path 3: 1,2,3,5,6,7
Which of these paths is non-executable, and why?

50 Basis Path Testing: Activity
In small groups:
Derive a flow graph from the code (see next slide).
Define independent paths (basis set).
Derive tests to drive each path in the basis set.

51 Activity Code (valid PINs are 10000-19999):
Procedure Validate_Pin (Valid_Pin, Return_Code)
  Valid_Pin = FALSE
  Return_Code = GOOD
  Pin_Count = 0
  do until Valid_Pin = TRUE or Pin_Count > 2 or Return_Code = CANCEL
  begin
    get Pin_Number (Pin_Number, Return_Code)
    if (Return_Code ≠ CANCEL)
    begin
      call Validate Pin_Number (Pin_Number, Valid_Pin)
      if (Valid_Pin = FALSE) then
      begin
        output "Invalid PIN, please re-enter PIN"
        Pin_Count = Pin_Count + 1
      end
    end
  end
  return (Valid_Pin, Return_Code)

52 For October 17
Read Pressman Chapters 21-23 (Project Planning and Management)
Mid-term quiz due (see course home page)
Current event reports: Castiglione, Hill, Hogan, Semler

53 53 Extra slides

54 Change Control Process (diagram)
Stages along the timeline: create initial sections; create/modify draft; review draft (V&V); create changes to incorporate changes needed in document; document approved (create, review, revise, review, approved).
Document under development and user change control; document in production and under formal change control.

55 Waterfall model
System requirements, Software requirements, Analysis, Program design, Coding, Testing, Operations
Source: Royce, W., "Managing the Development of Large Software Systems."

56 RUP Artifacts by Phase and Discipline (table)
Phases (columns): Inception, Elaboration, Construction, Transition
Disciplines (rows): Business Modeling, Requirements, Analysis & Design, Implementation, Test, Deployment
Artifacts shown: Vision, Use Cases (20-80%), Actors, Software Req Spec, Glossary, Software Arch Doc, Build Plan, Build, Test Results, Test Plan, Test Script, Test Data, Test Strategy, Deployment Plan, Training Materials, Support Materials, Acceptance Test Results, Change Requests, Product, Executable Architecture, User Interface Prototype, User Interface Design, Use Case Realization, Design Model, Database Design, Business Architecture

57 RUP Artifacts by Phase and Discipline, continued (table)
Phases (columns): Inception, Elaboration, Construction, Transition
Disciplines (rows): Configuration and Change Management, Project Management, Environment
Artifacts shown: Risk List, Risk Mgmt Plan, Business Case, QA Plan, Software Dev Plan, Dev Case (Process), Tools, Guidelines, Templates, Support, CM Plan, CM Environment, Change Requests

58 Core Concepts (diagram): People, Process, Technology … for the delivery of technology-enabled business solutions.
The focus of SE 425 is the process component of software engineering.

59 V-Model (diagram): Code, Unit Test, Integration Test, System Test, Acceptance Test

