
1 Software Engineering Process, 2007

2 Motivation for Software Engineering
IT projects have a terrible track record
– A 1995 Standish Group study (CHAOS) found that only 16.2% of IT projects were successful and over 31% were canceled before completion, costing over $81 B in the U.S. alone
The need for IT projects keeps increasing
– In 2000, there were 300,000 new IT projects
– In 2001, over 500,000 new IT projects were started

3 The Four “P’s” of Software Engineering
– People (by whom it is done)
– Process (the manner in which it is done)
– Project (the doing of it)
– Product (the application artifacts)

4 People
“It’s always a people problem” (Gerald Weinberg, “The Secrets of Consulting”)
Developer productivity: 10-to-1 range
Improvements:
– Team selection
– Team organization
– Motivation

5 People (continued)
Other success factors:
– Matching people to tasks
– Career development
– Balance: individual and team
– Clear communication

6 Optimal Size for Interaction (Approximate)
[Figure: effectiveness per developer vs. number of people with whom the developer must frequently interact. At zero, the developer communicates regularly with no one: no communication time is lost, but the developer is too isolated and has no help.]

7 Optimal Size for Interaction (Approximate)
[Figure: effectiveness per developer vs. number of people with whom the developer must frequently interact. At zero, the developer is too isolated and has no help; at eleven, communication time outweighs the benefits of interaction. The approximate optimal range is about 3 to 7 people.]

8 Organizations for Larger Projects: team of leaders

9 Quality
Application must satisfy predetermined standards.
Methods to attain quality goals:
– Inspection: team-oriented process for ensuring quality, applied to all stages of the process.

10 Quality
Application must satisfy a predetermined quality level.
Methods to attain the quality level:
– Inspection: team-oriented process for ensuring quality, applied to all stages of the process.
– Formal methods: mathematical techniques to convince ourselves and peers that our programs do what they are meant to do; applied selectively.
– Testing: at the unit (component) level and at the whole-application level.
– Project control techniques: predict costs and schedule; control artifacts (versions, scope, etc.).

11 Process
A set of activities carried out to produce an application.

12 A Defined Process

13 Process (continued)
Types: management and technical
– Development fundamentals
– Quality assurance
– Risk management
– Lifecycle planning

14 Process (continued)
– Customer orientation
– Process maturity improvement
– Rework avoidance

15 Process
Development sequence:
– Waterfall
– Iterative
Process frameworks:
– Personal Software Process℠ (PSP)
– Team Software Process℠ (TSP)
– Capability Maturity Model℠ (CMM), for organizations
Standards:
– Institute of Electrical and Electronics Engineers (IEEE)
– International Organization for Standardization (ISO)
– ...

16 What Is a Project?
A project is “a temporary endeavor undertaken to accomplish a unique product or service” (PMBOK® Guide 2000, p. 4)
Attributes of projects:
– unique purpose
– temporary
– require resources, often from various areas
– should have a primary sponsor and/or customer
– involve uncertainty

17 The Triple Constraint
Every project is constrained in different ways by its:
– Scope goals: What is the project trying to accomplish?
– Time goals: How long should it take to complete?
– Cost goals: What should it cost?
It is the project manager’s duty to balance these three often-competing goals.

18 The Triple Constraint

19 The Triple Constraint
Fast, cheap, good. Choose two.

20 The Triple Constraint
Know which of these are fixed and which are variable for every project.

21 The Variables of Project Management
The following factors can be varied somewhat:
1. The total cost of the project (e.g., increase expenditures)
2. The capabilities of the product (e.g., subtract from a list of features)
3. The quality of the product (e.g., increase the mean time between failures)
4. The date on which the job is completed (e.g., reduce the schedule by 20%, or postpone the project’s completion date by one month)

22 Project Variables
[Figure: four project variables plotted on separate axes, with targets marked: cost (target $70K), duration (target 30 weeks), defect density (target 4 defects/KLOC), and capability (target 100%).]

23 Project Variables
[Figure: the same four axes with this project’s actuals plotted against the targets: cost $90K (target $70K), duration 20 weeks (target 30 weeks), defect density 1 defect/KLOC (target 4 defects/KLOC), capability 100% (target 100%).]

24 Product
The “tangible” dimension
– Product size management
– Product characteristics and requirements
– Feature creep management

25 Software Engineering Roadmap

26 Software Engineering Roadmap (next chapter)
[Figure: roadmap of planning and development activities.]
– Plan project
– Identify corporate practices: assess capability, specify standards (e.g., CMM level)
– Plan development process
– Plan configuration management: how to manage documents and code (document: SCMP)
– Plan quality assurance: how to ensure quality (document: SQAP)
– Plan verification & validation: verify the product satisfies requirements; validate each phase by showing it succeeded (document: SVVP)
Development phases:
– Analyze requirements
– Design
– Implement
– Test units
– Integrate & test system
– Maintain


28 Typical Project Roadmap
1. Understand the nature and scope of the proposed product
2. Select the development process and create a plan
3. Gather requirements
4. Design and build the product
5. Test the product
6. Deliver and maintain the product

29 Planning
– Determine requirements
– Determine resources
– Select lifecycle model
– Determine product features strategy

30 Tracking
– Cost, effort, schedule
– Planned vs. actual
– What to do when things go off plan?

31 Measurements
To date and projected:
– Cost
– Schedule
– Effort
– Product features
Alternatives:
– Earned value analysis
– Defect rates
– Productivity (e.g., SLOC)
– Complexity (e.g., function points)
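As a concrete illustration of the earned value analysis listed above, here is a minimal Java sketch. The class name, field names, and the sample figures are hypothetical illustrations (they are not from the slides); it simply computes the standard earned value (EV), cost performance index (CPI), and schedule performance index (SPI).

// Minimal earned-value sketch (hypothetical names and figures, for illustration only).
public class EarnedValue {
    public static void main(String[] args) {
        double budgetAtCompletion = 70_000.0; // BAC: total planned budget
        double percentComplete    = 0.40;     // fraction of the work actually done
        double plannedValue       = 35_000.0; // PV: value of work scheduled by now
        double actualCost         = 40_000.0; // AC: money spent so far

        // Earned value: budgeted cost of the work actually performed.
        double earnedValue = percentComplete * budgetAtCompletion;

        // CPI > 1 means under budget; SPI > 1 means ahead of schedule.
        double costPerformanceIndex     = earnedValue / actualCost;
        double schedulePerformanceIndex = earnedValue / plannedValue;

        System.out.printf("EV = %.0f, CPI = %.2f, SPI = %.2f%n",
                earnedValue, costPerformanceIndex, schedulePerformanceIndex);
    }
}

With the sample figures above this prints EV = 28000, CPI = 0.70, SPI = 0.80, i.e., the project is over budget and behind schedule.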

32 Project Phases

33 Lifecycle Relationships

34 History (in brief)

35 Structured Programming
[Figure: top-down functional decomposition. The function handleAccount(…) calls getDetailsFromUser(…), getAccount(…), and doTransaction(…); getDetailsFromUser(…) in turn calls getName(…), and getAccount(…) calls getFirstName(…).]
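A minimal Java sketch of the top-down decomposition shown in the figure: only the function names come from the slide; the parameters, bodies, and return values are placeholder assumptions added for illustration.

// Sketch of top-down structured programming: a top-level routine delegates to
// lower-level routines. Only the function names come from the slide.
public class AccountProgram {

    public static void main(String[] args) {
        handleAccount("deposit");
    }

    // Top of the decomposition: orchestrates the lower-level steps.
    static void handleAccount(String transactionType) {
        String name = getDetailsFromUser();
        String account = getAccount(name);
        doTransaction(account, transactionType);
    }

    static String getDetailsFromUser() {
        return getName(); // placeholder: would prompt the user
    }

    static String getAccount(String name) {
        return getFirstName(name) + "-checking"; // placeholder lookup
    }

    static void doTransaction(String account, String type) {
        System.out.println(type + " on " + account);
    }

    static String getName() { return "Jane Doe"; }

    static String getFirstName(String name) { return name.split(" ")[0]; }
}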

36 Object Orientation
[Figure: software design entities (Account with getDetails(), Transaction with execute(), Customer with getFirstName()) in direct correspondence with real-world concepts.]
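A minimal Java sketch of the correspondence shown in the figure: the class and method names come from the slide, while the fields, constructors, and bodies are hypothetical additions for illustration.

// Sketch: each class mirrors a real-world concept directly.
// Class and method names are from the slide; the rest is illustrative.
class Customer {
    private final String firstName;

    Customer(String firstName) { this.firstName = firstName; }

    String getFirstName() { return firstName; }
}

class Account {
    private final Customer owner;
    private double balance;

    Account(Customer owner, double balance) {
        this.owner = owner;
        this.balance = balance;
    }

    String getDetails() {
        return owner.getFirstName() + "'s account, balance " + balance;
    }

    void add(double amount) { balance += amount; }
}

class Transaction {
    private final Account account;
    private final double amount;

    Transaction(Account account, double amount) {
        this.account = account;
        this.amount = amount;
    }

    void execute() { account.add(amount); }
}

public class ObjectOrientationDemo {
    public static void main(String[] args) {
        Account account = new Account(new Customer("Jane"), 100.0);
        new Transaction(account, 25.0).execute();
        System.out.println(account.getDetails()); // Jane's account, balance 125.0
    }
}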

37 Components
– Software may be built using existing services
– Reusable software
– Make-or-(better)-buy
– Design for changeability and exchangeability

38 What Is a Component?
Definition: a component denotes a self-contained entity that
– provides its functionality via standard interfaces,
– uses functionality from its environment via standard interfaces, and
– may support builder tools in plugging components and applications together.
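A minimal Java sketch of that definition; the interface and class names are hypothetical. The component exposes a provided interface, declares a required interface for what it needs from its environment, and hides its implementation behind both.

// Sketch of a component with explicit provided and required interfaces.
// All names here are hypothetical illustrations of the definition above.

// Provided interface: what the component offers to its clients.
interface AccountService {
    double balanceOf(String accountId);
}

// Required interface: what the component needs from its environment.
interface AccountStore {
    double loadBalance(String accountId);
}

// The component itself: self-contained, wired to its environment only through interfaces.
class AccountComponent implements AccountService {
    private final AccountStore store; // supplied by the environment

    AccountComponent(AccountStore store) { this.store = store; }

    @Override
    public double balanceOf(String accountId) {
        return store.loadBalance(accountId);
    }
}

public class ComponentDemo {
    public static void main(String[] args) {
        // The environment plugs in an implementation of the required interface.
        AccountService service = new AccountComponent(id -> 42.0);
        System.out.println(service.balanceOf("A-1")); // 42.0
    }
}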

39 Component: hidden implementation

40 Distributed component

41 Expectations for Software Engineering Process

42 Five Key Expectations
1. Predetermine quantitative quality goals
2. Accumulate data for subsequent use
3. Keep all work visible
4. A. Design only against requirements; B. Program only against designs; C. Test only against requirements and designs
5. Measure quality
(In the figure, the expectations are annotated as influenced by people, used for process development, part of the project, and an aspect of the product.)

43 Artifacts and Roles
Artifacts: the entities that software engineering deals with.
– Document
– Model: a view of the application (design, etc.)
– Component: physical (source code, etc.)
Workers: responsibilities allocated to people (roles).
[Figure: symbols and examples for the software development process terms “worker” and “worker instance” (e.g., Joe Smith).]

44 Software Engineering Process alternatives

45 Main Phases of the Software Process
1. Requirements Analysis (answers “WHAT?”): specifying what the application must do
2. Design (answers “HOW?”): specifying what the parts will be and how they will fit together
3. Implementation (a.k.a. “coding”): writing the code
4. Testing (a type of verification): executing the application with test data for input
5. Maintenance (repair or enhancement): repairing defects and adding capability

46 Software Process Phases
– Requirements analysis: text produced, e.g., “… The application shall display the balance in the user’s bank account. …”
– Design: diagrams and text, e.g., “… The design will consist of the classes CheckingAccount, SavingsAccount, …”
– Implementation: source and object code, e.g., class CheckingAccount { double balance; … }
– Testing: test cases and test results, e.g., “… With test case: deposit $44.92 / deposit $32.00 / withdraw $ / … the balance was $ , which is correct. …”
– Maintenance: modified design, code, and text. E.g., defect repair: “Application crashes when the balance is $0 and an attempt is made to withdraw funds. …”; e.g., enhancement: “Allow operation with pesos.”
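Expanding the Implementation and Testing fragments above into a runnable sketch: only the class name, the balance field, and the two deposit amounts come from the slide; the deposit/withdraw methods and the concrete withdrawal amount are assumptions added for illustration, since the slide elides them.

// A runnable sketch of the CheckingAccount fragment from the slide.
class CheckingAccount {
    double balance;

    void deposit(double amount) { balance += amount; }

    void withdraw(double amount) { balance -= amount; }
}

public class CheckingAccountTest {
    public static void main(String[] args) {
        CheckingAccount account = new CheckingAccount();
        account.deposit(44.92);
        account.deposit(32.00);
        account.withdraw(10.00); // assumed amount; the slide leaves it blank

        double expected = 44.92 + 32.00 - 10.00;
        // Unit-level check; enable assertions with: java -ea CheckingAccountTest
        assert Math.abs(account.balance - expected) < 1e-9 : "unexpected balance";
        System.out.println("balance = " + account.balance); // approximately 66.92
    }
}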

47 The Waterfall Model
[Figure: waterfall phases flowing downward: concept analysis, analysis, design, implementation & unit testing, integration, system testing, maintenance.]

48 The Waterfall Software Process
[Figure: phases (activities) plotted against time: requirements analysis, design, implementation, testing, maintenance, with milestones and the product release marked. Two phases may occur at the same time for a short period.]

49 Why a Pure Waterfall Process Is Usually Not Practical
– We don’t know up front everything that is wanted and needed
  – it is usually hard to visualize every detail in advance
– We can only estimate the costs of implementing requirements
  – to gain confidence in an estimate, we need to design and actually implement parts, especially the riskiest ones
  – we will probably need to modify requirements as a result
– We often need to execute intermediate builds
  – stakeholders need to gain confidence
  – designers and developers need confirmation they’re building what’s needed and wanted
– Team members can’t be idle while the requirements are being completed
  – typically put people to work on several phases at once

50 Spiral Development Sequence
[Figure: a cycle repeated until the targeted requirements are complete. Step n: analyze requirements (product: requirements specifications); step n+1: design (product: class models + …); step n+2: implement (product: code + …); step n+3: test (product: test results + …).]

51 The Spiral Process
[Figure: iterations plotted against time; within each iteration, requirements analysis, design, coding, and testing. Milestones mark each completed intermediate version (typically a prototype) and the final product release.]

52 Iterative Development
[Figure: each iteration runs through analyze requirements, design, implement, test units, integrate, and test the whole, while updating the SRS (Software Requirements Specification), SDD (Software Design Document), SPMP (Software Project Management Plan), the source code, and the test documentation.]

53 Iterative Development

54 The Unified (Software Development) Process: Classification of Iterations
– Inception iterations: preliminary interaction with stakeholders (primarily the customer, users, financial backers, etc.)
– Elaboration iterations: finalization of what’s wanted and needed; set the architecture baseline
– Construction iterations: result in initial operational capability
– Transition iterations: complete the product release

55 UP Terminology (1)
[Figure: a matrix. The rows are the disciplines Requirements, Analysis, Design, Implementation, and Test (UP calls these “disciplines”; classically they were called “phases”). The columns are the classification of iterations: Inception (preliminary iterations), Elaboration, Construction, and Transition, each made up of individual iterations (Iter. #1 … #n, #n+1 … #m, #m+1 … #k, …).]

56 UP Terminology (2)
[Figure: mapping between UP terminology and classical terminology. UP “Requirements” and “Analysis” correspond to classical “Requirements analysis”; “Design” to “Design”; “Implementation” to “Implementation” and “Integration”; “Test” to “Test”.]

57 Unified Process Matrix
[Figure: the discipline-by-iteration matrix again (Inception, Elaboration, Construction, Transition across the top; Requirements, Analysis, Design, Implementation, Test down the side), with a callout marking the amount of effort expended on the requirements discipline during the first Construction iteration.]

58 The Six UP Models (Views of the Application)
– Use-case model
– Analysis model
– Design model
– Deployment model
– Implementation model
– Test model

59 Project Documentation
[Figure: the documentation set, spanning customer-oriented and developer-oriented documents.]
– SPMP: software project management plan (project status)
– SCMP: software configuration management plan (configuration)
– SQAP: software quality assurance plan (quality assurance)
– SVVP: software validation & verification plan (verification & validation)
– SRS: software requirements specifications (requirements)
– SDD: software design document (design: architecture and detailed design)
– STD: software test documentation (testing)
– Source code (code)
– User’s manual (operation)

60 Software Engineering Process Quality

61 Meaning of “V&V” (Boehm)
– Verification: are we building the thing right?
– Validation: are we building the right thing?

62 QA Involvement
[Figure: QA activities mapped onto the project steps 1. Specify how to manage project documents, 2. Identify process, 3. Plan, 4. Design and build, 5. Deliver & maintain the product. QA develops and/or reviews configuration management plans and standards (step 1), develops and/or reviews the provision for QA activities, reviews the process for conformance to organizational policy (step 2), and reviews, inspects, and tests during steps 4 and 5.]

63 Inspection Process & Example Times
[Figure: the nominal inspection process is 1. Planning, 2. Preparation, 3. Inspection, 4. Rework, 5. Follow-up, 6. Improve process; the non-nominal path adds an Overview step and a Causal Analysis step.]

64 Time/Costs per 100 LOC* (one company’s estimates)
– Planning: 1 hr (1 person)
– [Overview: 1 hr (3-5 people)]
– Preparation: 1 hr (2-4 people)
– Inspection meeting: 1 hr (3-5 people)
– Rework: 1 hr (1 person)
– [Analysis: 1 hr (3-5 people)]
– Total: approx. … person-hours
* lines of non-commented code

65 Hours Per Defect: One Estimate
[Table: hours to detect and repair a defect, depending on when it is found.]
– If found at inspection time: detect 0.7 to …, repair 0.3 to …, total 1.0 to …
– If found at integration time: detect … to 10, repair …, total … to 19+

66 Prepare For & Conduct Inspections
1. Build inspections into the project schedule
  – plan to inspect all phases, starting with requirements
  – allow for preparation (time consuming!) and meeting time
2. Prepare for collection of inspection data
  – include number of defects per work unit (e.g., KLOC) and time spent
  – develop forms: include description, severity, and type
  – decide who, where, and how to store and use the metric data (default: appoint a single person to be responsible; failure to decide usually results in discarding the data)
3. Assign roles to participants
  – three are adequate (author; moderator/recorder; reader)
  – two are far better than none (author; inspector)
4. Ensure every participant prepares
  – bring defects pre-entered on forms to the inspection meeting
Adapted from Software Engineering: An Object-Oriented Perspective by Eric J. Braude (Wiley 2001), with permission.
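As a sketch of the inspection data collection described in item 2, here is a minimal Java example. The record fields and the defect-density calculation are assumptions made for illustration; they are not part of the original slide or of any real inspection tool.

import java.util.List;

// Hypothetical sketch of an inspection defect form and a defects-per-KLOC
// metric, for illustration only; not from the original slides.
public class InspectionMetrics {

    enum Severity { MINOR, MAJOR, CRITICAL }

    // One row of the inspection form: description, severity, and type.
    record Defect(String description, Severity severity, String type) {}

    // Defect density: defects per thousand lines of inspected code.
    static double defectsPerKloc(List<Defect> defects, int linesInspected) {
        return defects.size() / (linesInspected / 1000.0);
    }

    public static void main(String[] args) {
        List<Defect> defects = List.of(
                new Defect("Balance not updated after withdrawal", Severity.MAJOR, "logic"),
                new Defect("Field name misspelled in design reference", Severity.MINOR, "documentation"));

        // 2 defects found in 400 inspected lines -> 5.0 defects/KLOC.
        System.out.printf("Defect density: %.1f defects/KLOC%n", defectsPerKloc(defects, 400));
    }
}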

67 Software Engineering Process Documentation

68 Project Documentation
– SPMP: software project management plan (project status)
– SCMP: software configuration management plan (configuration)
– SQAP: software quality assurance plan (quality assurance)
– SVVP: software validation & verification plan (verification & validation)

69 Example of Documentation Set (with Dynamic Content Shown)
[Figure: a hyperlinked documentation set: SPMP (software project management plan), SRS (software requirements specifications), SDD (software design document), STP (software test plan), SCMP (software configuration management plan), and source code, with references to all other documents. The dynamic components (marked *) are project status, configuration, test results, and updates.]

70 Configuration Items (CIs)
Units tracked officially
– down to the smallest unit worth tracking
– includes most official documents
[Figure: successive versions of a “Payroll” application, each made up of the configuration items A1, S6, E3, C4, and D5.]

71 Configuration Items (CIs)
Units tracked officially
– down to the smallest unit worth tracking
– includes most official documents
[Figure: successive versions of the “Payroll” application in which the configuration items change; e.g., S6 is replaced by S7, and a later version adds F1 alongside A1, D5, S7, C4, and E3.]
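To make the idea of tracked configuration items concrete, here is a minimal Java sketch. The item identifiers are taken from the figure, but the data structure, the version labels, and the revision numbers are hypothetical illustrations, not a real configuration management tool.

import java.util.List;

// Hypothetical sketch: a product version is just a named set of configuration
// items, so successive versions can be compared item by item.
public class ConfigurationItems {

    // One tracked unit: an identifier plus its own revision number.
    record ConfigurationItem(String id, int revision) {}

    record ProductVersion(String name, List<ConfigurationItem> items) {}

    public static void main(String[] args) {
        ProductVersion earlier = new ProductVersion("Payroll (earlier)", List.of(
                new ConfigurationItem("A", 1), new ConfigurationItem("S", 6),
                new ConfigurationItem("E", 3), new ConfigurationItem("C", 4),
                new ConfigurationItem("D", 5)));

        // Next version: S is revised from 6 to 7; everything else is unchanged.
        ProductVersion later = new ProductVersion("Payroll (later)", List.of(
                new ConfigurationItem("A", 1), new ConfigurationItem("S", 7),
                new ConfigurationItem("E", 3), new ConfigurationItem("C", 4),
                new ConfigurationItem("D", 5)));

        System.out.println(earlier.name() + ": " + earlier.items());
        System.out.println(later.name() + ": " + later.items());
    }
}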

72 Summary
– Software engineering is an extensive challenge
– Major process models: waterfall, spiral, incremental
– Capability frameworks: CMM, TSP, PSP
– Quality is the professional difference: metrics to define it, inspection throughout, rigorous testing, and a continuous self-improvement process
– Documentation: SCMP, SVVP, SQAP, SPMP, SRS, SDD, STP, code, user’s manual

