
1 James Nowotarski 6 November 2008 SE 325/425 Principles and Practices of Software Engineering Autumn 2008

2 Today's Agenda
Topic (duration):
- Recap last class (30 minutes)
- Software process improvement (45 minutes)
- *** Break ***
- Current events (15 minutes)
- Software metrics (60 minutes)

3 Categories of software risk
- Project
- Technical
- Business
- Legal

4 Risk management
"It is futile to try to eliminate risk" -- Peter Drucker, management guru

5 Risk management process
Identify -> Analyze -> Plan -> Control
(Diagram: balancing the cost of protection against the cost of exposure, $$)

6 Risk management process: artifacts
- Identify: list of risks
- Analyze: probability, impact, risk exposure, cutoff
- Plan: mitigation plan, monitoring plan, contingency plan
- Control
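The Analyze and Plan artifacts on this slide rest on a simple calculation: risk exposure is commonly taken as probability times impact, and a cutoff decides which risks get mitigation and contingency plans. A minimal sketch in Python; the risks, probabilities, impacts, and cutoff value below are illustrative assumptions, not course data:

```python
# Minimal sketch: rank risks by exposure (probability x impact) and apply a cutoff.
# All risk names, probabilities, impacts, and the cutoff value are illustrative assumptions.

risks = [
    {"name": "Key developer leaves",       "probability": 0.30, "impact_usd": 120_000},
    {"name": "Requirements change late",   "probability": 0.60, "impact_usd": 80_000},
    {"name": "Third-party API is delayed", "probability": 0.20, "impact_usd": 40_000},
]

CUTOFF_USD = 20_000  # risks at or above this exposure get mitigation/contingency plans

for risk in risks:
    risk["exposure_usd"] = risk["probability"] * risk["impact_usd"]

# Highest exposure first, so planning effort goes where the expected loss is largest.
for risk in sorted(risks, key=lambda r: r["exposure_usd"], reverse=True):
    action = "plan mitigation" if risk["exposure_usd"] >= CUTOFF_USD else "monitor only"
    print(f'{risk["name"]}: exposure ${risk["exposure_usd"]:,.0f} -> {action}')
```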

7 "Worrying about what might go wrong may not be as glamorous a job as speculating about the future, but it is a more essential job right now."
Carr, N. (2003, May). IT doesn't matter. Harvard Business Review. Retrieved September 8, 2008, from EBSCO Host, Business Source Premier database.
Carr advised focusing on low-cost, low-risk IT investments.

8 What is SE?
Software Engineering Body of Knowledge:
- Software requirements
- Software design
- Software construction
- Software testing
- Software maintenance
- Software configuration management
- Software engineering management
- Software engineering process (tonight)
- Software engineering tools and methods
- Software quality
Source: Guide to the Software Engineering Body of Knowledge. (2004). IEEE. www.swebok.org

9 Tom West: A Good Man in a Storm

10 Layered behavioral model

11 High-Performance Teams

12 Characteristics of High-Performing Teams

13 Splitting/Multitasking (Figures 8.10 and 8.11)

14 Splitting/Multitasking
- A scheduling technique used to get a better project schedule and/or increase resource utilization.
- Involves interrupting work on an activity to employ the resource on another activity, then returning the resource to finish the interrupted work.
- Is feasible when startup and shutdown costs are low.
- Is considered the major reason why projects fail to meet schedule.

15 If you do only one thing to improve...
- Decrease the amount of task switching
- Fragmentation is due mostly to managerial sloppiness
- It doesn't have to occur, and won't if you take pains to stop it
- You'll thus reduce frustration and waste and make it possible for your teams to take off
-- Tom DeMarco, project management guru, 1995

16 The Five-Stage Team Development Model (Figure 11.1)

17 The Punctuated Equilibrium Model of Group Development (Figure 11.2)

18 Today's Agenda
Topic (duration):
- Recap last class (30 minutes)
- Software process improvement (45 minutes)
- *** Break ***
- Current events (15 minutes)
- Software metrics (60 minutes)

19 Where to find software process improvement ideas

20 Software process assessment and improvement
(Relationship diagram: the software process is examined by a Software Process Assessment; the assessment identifies capabilities and risk, leads to Capability Determination, and identifies modifications for Software Process Improvement; Capability Determination motivates Software Process Improvement.)

21 Software Process Improvement Models
A number of models enable software development organizations to compare their practices to a set of "best practices".
IT-specific models:
- ISO 15504
- ISO 9000-3
- TickIT
- Capability Maturity Model Integration (CMMI)
General models:
- Total Quality Management (TQM)
- Six Sigma

22 Capability Maturity Model Integration (CMMI)
"the de facto process improvement framework for software developers" - Gartner Group

23 What is CMMI?
- CMMI = Capability Maturity Model Integration
- Developed in 1991 by the Software Engineering Institute (SEI) to assess the software engineering capability of government contractors
- A framework for software process improvement (SPI) that has gained wide acceptance in the industry
- A roadmap of effective practices that build on one another in a logical progression: a coherent, ordered set of incremental improvements

24 What is SEI?
- SEI = Software Engineering Institute
- Federally funded research & development center
- Sponsored by the Department of Defense
- Affiliated with Carnegie Mellon University in Pittsburgh
- Established in 1984
- Research and publications oriented
- Mission is to improve the state of the practice of software engineering

25 Brief History - CMMI
- 1989: Publication of Managing the Software Process by Watts Humphrey
- 1991: Capability Maturity Model for Software (CMM) v1.0 released by the Software Engineering Institute (SEI)
- 1993: CMM v1.1 released
- 1994: Systems Engineering (SE) CMM released
- 2001: CMM Integration (CMMI)-SE/SW v1.0 released
- 2002: CMMI-SE/SW/IPPD/SS v1.1 released
- 2006: CMMI-Dev v1.2 released (August)

26 A proliferation of models
Different capability maturity models:
- Software CMM (SW)
- Systems Engineering CMM (SE)
- Integrated Product and Process Development CMM (IPPD)
- Supplier Sourcing (SS)
- Software Acquisition (ACQ)
- Services (SVC)
- Team Software Process
- Personal Software Process
- People CMM (P-CMM)

27 Why CMMI?
Benefits:
- Practical
- Structured
- Proven reputation
Quantitative benefits (median):
- Cost: 34%
- Productivity: 61%
- Time to market: 50%
- Post-release defects: 48%
- Customer satisfaction: 14%
- Return on investment: 4:1

28 CMMI Maturity Levels
- Initial (1): process poorly controlled and unpredictable
- Managed (2): process characterized for projects and is often reactive
- Defined (3): process characterized for the organization and is proactive
- Quantitatively Managed (4): process measured and controlled
- Optimized (5): process improvement ("nirvana")

29 Process areas (PAs)
(Diagram: maturity levels contain process areas)

30 CMMI Process Areas
Level 2 - Managed:
- Requirements Management
- Project Planning
- Project Monitoring & Control
- Supplier Agreement Management
- Measurement & Analysis
- Product & Process Quality Assurance
- Configuration Management
Level 3 - Defined:
- Requirements Development
- Technical Solution
- Product Integration
- Verification
- Validation
- Organization Process Focus
- Organization Process Definition
- Organizational Training
- Integrated Project Management
- Risk Management
- Decision Analysis & Resolution
Level 4 - Quantitatively Managed:
- Organizational Process Performance
- Quantitative Project Management
Level 5 - Optimized:
- Causal Analysis & Resolution
- Organizational Innovation & Deployment

31 Process areas (PAs)
(Diagram: maturity levels contain process areas; process area categories also contain process areas; process areas contain specific goals; specific goals are achieved by specific practices)

32 Process areas (PAs)
- Process area: "A cluster of related practices in an area that, when performed collectively, satisfy a set of goals considered important for making significant improvement in that area."
- Specific goals: what must be achieved to satisfy the process area
- Specific practices: refine a goal into a set of process-related activities

33 Process areas (PAs): example
- Maturity level: Level 2 - Managed
- Process area category: Project management
- Process area: Project planning
- Specific goal: Establish estimates
- Specific practice: Determine estimates of effort and cost
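To make the hierarchy on this slide easier to follow, here is a minimal sketch that encodes the same example (maturity level, process area category, process area, specific goal, specific practice) as nested data. The field names and structure are illustrative choices, not official CMMI terminology or a defined data model:

```python
# Minimal sketch of the slide's example, encoded as nested data.
# Field names are illustrative assumptions, not part of the CMMI specification.

cmmi_example = {
    "maturity_level": "Level 2 - Managed",
    "process_area_category": "Project management",
    "process_area": "Project planning",
    "specific_goals": [
        {
            "goal": "Establish estimates",
            "specific_practices": ["Determine estimates of effort and cost"],
        }
    ],
}

# Walk the hierarchy top-down, mirroring "process areas contain specific goals,
# and specific goals are achieved by specific practices".
print(cmmi_example["maturity_level"], "->", cmmi_example["process_area"])
for goal in cmmi_example["specific_goals"]:
    for practice in goal["specific_practices"]:
        print(" ", goal["goal"], "->", practice)
```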

34 Level 1: Initial
- Instability
- Dependence on "heroes"
- Inability to meet targets
- Key process areas: none

35 Class Activity
Summarize and explain to the rest of the class: the 22 key process areas

36 Appraisal process
The Standard CMMI Appraisal Method for Process Improvement (SCAMPI) is the appraisal process; it uses the CMMI as its reference model.

37 CMMI Appraisal Method
1. Team selection
2. Maturity questionnaire
3. Response analysis
4. On-site visit: interviews & document reviews
5. Findings based on the CMMI
6. PA profile

38 Appraisal Process
For internal purposes:
- Performed in an open, collaborative environment
- Focuses on improving the organization's software process
For external credential:
- Performed in a more audit-oriented environment
- Focuses on identifying risks associated with a contractor
- Team's recommendation will help select contractors or set fees

39 CMMI Issues in the Real World
- "Level envy"
- Areas not addressed: business strategy and linkage to IT; operations, help desk, support; management of the IT human resource; application portfolio; tools
- Many question whether it is worth the effort to pursue levels 4 and 5

40 Process Maturity Profile
Based on assessments of 1,124 organizations, 1998 through August 2002 (% of organizations):
- Initial: 19.3%
- Repeatable: 43.2%
- Defined: 23.4%
- Managed: 7.3%
- Optimized: 6.8%

41 Process Maturity Profile, April 2002 - June 2006

42 Time to Move Up
(Box plot: number of months to move to the next maturity level for the transitions 1-to-2, 2-to-3, 3-to-4, and 4-to-5, showing the median, 25th and 75th percentiles, and smallest/largest non-outlier values; medians fall roughly in the 17-28 month range. Recommended time between appraisals: 18-30 months.)

43 CMMI Market Pressure
- Marketing tool to win clients, who are based predominantly in the US and Europe
- Clients using Indian service providers should have certain key processes in place: service level agreements, identifying business requirements, scoping requirements, managing changes
- Many, if not most, of the publicly acknowledged Level 5 CMM-certified organizations are in India

44 CMMI-based Software Process Improvement (SPI)
Time and cost often exceed expectations:
- 18-24 months to advance 1 level
- Can cost $2K per software engineer per year
- 1-2% full-time resources (e.g., 5-10 people in a 500-person organization)
- 2-4% of the rest of the organization's time
Key success factors:
- Senior management is engaged
- Participation and buy-in at all levels, including middle management and technical staff
- Clearly stated, well understood SPI goals
- Clear assignment of responsibility
- Software Engineering Process Group (SEPG) staffed by highly respected people

45 For more information: http://www.sei.cmu.edu/cmmi/cmmi.html

46 Today's Agenda
Topic (duration):
- Recap last class (30 minutes)
- Software process improvement (45 minutes)
- *** Break ***
- Current events (15 minutes)
- Software metrics (60 minutes)

47 Why Measure?
- "You can't control what you can't measure" (Tom DeMarco)
- "Show me how you will measure me, and I will show you how I will perform" (Eli Goldratt)
- "Anything that can't be measured doesn't exist" (Locke, Berkeley, Hume)

48 Scope of our discussion
Sample IT organization: a Director of IS/IT overseeing a Manager of Systems Development & Maintenance (Financial Systems, Manufacturing Systems, Customer Fulfillment Systems) and a Manager of Computer Operations. Our focus: systems development & maintenance.

49 Examples of systems development metrics
Category | Metric | Units of measure
Speed of delivery | Delivery rate | Elapsed months / function point
Schedule reliability | Duration variance | Schedule variance %
Software quality | Fault density | Faults / function point
Productivity | Productivity rate | Function points / staff month
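A minimal sketch of how the four metrics in this table could be computed for a single project release. All input values and field names below are illustrative assumptions, not course or industry data:

```python
# Minimal sketch: compute the four systems development metrics for one release.
# All input values and field names are illustrative assumptions.

release = {
    "function_points": 800,        # developed function points
    "elapsed_months": 12.0,        # actual calendar duration
    "planned_months": 10.0,        # committed schedule
    "staff_months": 90.0,          # total effort expended
    "faults_first_3_months": 96,   # faults reported in first three months of operation
}

delivery_rate     = release["elapsed_months"] / release["function_points"]       # months per FP
schedule_variance = (release["elapsed_months"] - release["planned_months"]) / release["planned_months"]
fault_density     = release["faults_first_3_months"] / release["function_points"]  # faults per FP
productivity_rate = release["function_points"] / release["staff_months"]         # FPs per staff month

print(f"Speed of delivery:    {delivery_rate:.4f} elapsed months/function point")
print(f"Schedule reliability: {schedule_variance:+.1%} duration variance")
print(f"Software quality:     {fault_density:.3f} faults/function point")
print(f"Productivity:         {productivity_rate:.1f} function points/staff month")
```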

50 Example: Speed of delivery
(Scatter plot: elapsed months vs. developed function points; each point is a single project release; average elapsed months = 14.8, n = 33. Industry average line is determined from Software Productivity Research.)

51 Example: Schedule reliability
(Scatter plot: schedule variance above commitment vs. developed function points; each point is a single project release, n = 33. Industry average line is determined from Software Productivity Research.)

52 Example: Software quality
(Scatter plot: faults reported over the first three months in operation vs. developed function points, n = 27. The industry average line is an estimate for faults found in the first three months of operation; assuming half the total faults are found in that period, it is one half of the industry average for total faults from C. Jones, Applied Software Measurement, 1996, p. 232.)
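The estimation logic behind this benchmark (roughly half of all faults surface in the first three months of operation, so the three-month benchmark is half the total-fault benchmark) can be written down directly. The total fault density used below is a placeholder, not the figure from C. Jones:

```python
# Minimal sketch of the slide's assumption: about half of all faults are found
# in the first three months of operation, so the 3-month benchmark is half the
# total-fault benchmark. The total fault density below is a placeholder value,
# NOT the figure from C. Jones, Applied Software Measurement.

total_faults_per_fp = 0.5          # placeholder industry total-fault density (faults per FP)
fraction_found_in_3_months = 0.5   # the slide's stated assumption

def expected_faults_first_3_months(function_points: float) -> float:
    """Benchmark faults expected in the first three months for a release of this size."""
    return function_points * total_faults_per_fp * fraction_found_in_3_months

print(expected_faults_first_3_months(800))  # e.g., an 800-FP release -> 200 expected faults
```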

53 Example: Productivity
(Scatter plot: function points per staff month vs. developed function points; each point is a single project release, n = 33. Industry average line is determined from Software Productivity Research.)

54 Objectives of Software Measurement

55 Objectives of Software Measurement
- Help a systems development unit understand its performance
- Evaluate performance relative to goals
- Allow for comparisons to, e.g.: other organizations; alternative development approaches (custom, packaged, outsourced, etc.) and technologies; other standards/targets
- Improve estimating ability
- Promote desired behaviors, e.g., reuse

56 Hawthorne Effect
- Famous study conducted at the Hawthorne Works of the Western Electric Company
- Plant managers implemented changes in working conditions and recorded data on the plant's production output
- They found that production increased no matter what changes in working conditions they implemented!
- What does this example reveal about how people act when they know that an experiment is being conducted?

57 Goal Question Metric
(Diagram: goals at the top, each goal linked to questions, each question linked to metrics)

58 Goal Question Metric
- Technique for identifying suitable measurements to collect
- Assumption: it is only worthwhile measuring things to satisfy goals
- Goals are desired end states
- Questions identify the information needs associated with goals and help determine whether or not goals are being met
- Metrics are specific items that can be measured to answer the questions

59 GQM Example
Goal: improve timeliness of change request processing from the project manager's viewpoint
Questions:
- What is the current change request processing speed?
- Is the performance of the process improving?
Metrics:
- Average cycle time
- Standard deviation
- % cases outside upper limit
- (Current average / Baseline average) x 100
- Subjective rating from project manager
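A minimal sketch of how the quantitative metrics listed for this goal could be computed from change-request cycle-time data. The sample cycle times, the baseline average, and the upper limit are illustrative assumptions:

```python
# Minimal sketch: compute the GQM example's metrics from change-request cycle times.
# The cycle times, baseline average, and upper limit are illustrative assumptions.
import statistics

cycle_times_days = [4.0, 6.5, 3.0, 9.0, 5.5, 12.0, 4.5]  # days to process each change request
baseline_average_days = 7.0   # average cycle time when measurement started
upper_limit_days = 10.0       # agreed threshold for "too slow"

average = statistics.mean(cycle_times_days)
std_dev = statistics.stdev(cycle_times_days)
improvement_index = (average / baseline_average_days) * 100   # below 100 means faster than baseline
pct_outside_limit = 100 * sum(t > upper_limit_days for t in cycle_times_days) / len(cycle_times_days)

print(f"Average cycle time:          {average:.1f} days")
print(f"Standard deviation:          {std_dev:.1f} days")
print(f"(Current/Baseline) x 100:    {improvement_index:.0f}%")
print(f"% cases outside upper limit: {pct_outside_limit:.0f}%")
```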

60 Case Study Exercise
1. Get team assignment (5 teams)
2. Read the case study
3. Fill in the worksheet: identify 1 goal; identify 2-3 questions pertinent to this goal; identify at least 1 metric (indicator) per question
4. Brief the class

61 For November 13
- Readings
- Current event reports (overflow)

62 Extra Slides

63 Characteristics of High-Performing Teams
- Shared purpose and goals
- Clear roles and organization
- Well-defined processes
- High team development and capability
- Open communication
- Civilized disagreement
- Atmosphere of creativity/innovation
- High performance standards
- Accountability and rewards

64 Maslow's Hierarchy

65 Expectancy Theory
M = V * E * I (Motivation = Valence * Expectancy * Instrumentality)

66 Managing Diversity

67 Managing Diversity
- Functional
- Cultural
- Generational
- Gender
- Organizational

68 Global delivery
- Planning; high-level tasks
- Execution
- Common processes, technology, and tools

69 Hofstede Cultural Dimensions Framework
- Individualism versus collectivism: identifies whether a culture holds individuals or the group responsible for each member's welfare.
- Power distance: describes the degree to which a culture accepts status and power differences among its members.
- Uncertainty avoidance: identifies a culture's willingness to accept uncertainty and ambiguity about the future.
- Masculinity-femininity: describes the degree to which the culture emphasizes competitive and achievement-oriented behavior or displays concerns for relationships.

70 Sample Country Clusters on Hofstede's Dimensions of Individualism-Collectivism and Power Distance (Figure 15.5)

71 Why do some virtual teams thrive while others stumble? Differentiators:
1. Tools for social networking and collaboration
2. Include some people who already know each other
3. "Boundary spanners": grow them; staff them (~15% of team)
4. Encourage frequent communication, but don't force social events

72 Why do some virtual teams thrive while others stumble? Differentiators (cont.):
5. Assign tasks to team members in different locations that allow them to move ahead at their own pace
6. Work that is interesting, compelling, meaningful (lack of this is one of the biggest reasons virtual teams fail)
7. Solicit volunteers as much as possible

73 Sources of improvement ideas
- Post-mortems
- Metrics
- Benchmarking with other organizations
- Published best practices
- Standards
- Process frameworks, also known as "meta-models" or "software process improvement models"

74 GQM Example (High Level)
Goal: improve systems delivery performance
Questions and metrics:
- What is the quality of our deliverables? -> Fault density
- How predictable is our process? -> Duration variance percentage
- How quickly do we deliver? -> Delivery rate
- How efficient are we? -> Productivity rate

75 Measurement and Continuous Improvement
(Diagram: measurement and continuous improvement reinforce each other)

76 Measurement and Continuous Improvement
Measurement supports continuous improvement:
- Focuses program objectives
- Enables tracking of improvement progress
- Enables communication of program benefit
Continuous improvement supports measurement:
- Clarifies measurement's purpose and role
- Clarifies which measures to collect
- Provides a mechanism for acting on findings
- Enables top-to-bottom organizational support

77 Continuous Process Improvement Approach to Quality and Measurement (Plan-Do-Check-Act)
1. Plan: identify performance standards and goals
2. Do: measure project performance
3. Check: compare metrics against goals
4. Act: eliminate causes of deficient performance (fix defects, fix root causes)
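The Check and Act steps amount to a simple comparison loop: for each collected metric, compare measured performance against its goal and flag deficiencies for root-cause work. A minimal sketch; the goals, measurements, and metric names are illustrative assumptions, not course data:

```python
# Minimal sketch of the Check/Act steps: compare measured performance against goals
# and flag metrics needing corrective action. Goals and measurements are illustrative.

goals = {            # Plan: performance standards and goals
    "fault_density_per_fp": 0.10,            # at most
    "productivity_fp_per_staff_month": 8.0,  # at least
}
measured = {         # Do: measured project performance
    "fault_density_per_fp": 0.14,
    "productivity_fp_per_staff_month": 8.9,
}

# Check: compare metrics against goals (lower is better for fault density,
# higher is better for productivity).
deficient = []
if measured["fault_density_per_fp"] > goals["fault_density_per_fp"]:
    deficient.append("fault_density_per_fp")
if measured["productivity_fp_per_staff_month"] < goals["productivity_fp_per_staff_month"]:
    deficient.append("productivity_fp_per_staff_month")

# Act: eliminate causes of deficient performance (fix defects, then fix root causes).
for metric in deficient:
    print(f"{metric}: misses goal -> investigate root cause and plan corrective action")
```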

78 Metrics Program Change Plan
(Chart: a program management / quality management change plan spanning four phases, Enable Change, Achieve-1 Change, Achieve-2 Change, and Sustain Change, across people, process, and technology. Elements include: metrics awareness education, metrics network, "vital few" metrics definitions and implementation, metrics strategy, commitment/ownership, roles & responsibilities, distributed support units, measurement process definition and improvement, metrics repository and tools, technology strategy, KM support for measurement, community of practice, large project network, metrics rollout education/training, pilot project group, ongoing metrics education/training, system building improvement goals, metrics definition & implementation for delivery centers, metrics embedded in system building methods, dashboard metrics implementation, piloting selected projects and delivery centers, and enabling large projects and remaining centers.)

79 Measurement Program Mortality
Most programs fail, usually within 2 years.
(Chart: cumulative starts vs. cumulative successes of measurement programs by year, 1980-1991, with number of companies on the vertical axis.)

80 Reasons for Metric Program Failure
- Lack of [visible] executive sponsorship
- Lack of alignment with organizational goals
- Tendency to collect too much data
- Measures not calibrated, normalized, or validated (not comparing apples to apples)
- Fear of [individual] evaluation
- Learning curve (e.g., function points)
- Cost overhead

81 Key Success Factors
- Ensure that measurement is part of something larger, typically performance improvement ("Trojan Horse" strategy)
- Ensure alignment with organizational goals
- Start small, iterate; strongly recommend doing a pilot test
- Automate capture of metrics data
- Rigorously define a limited, balanced set of metrics ("vital few"; portfolio approach; comparability; aggregate appropriately)
- Focus should be on processes, not individuals
- Obtain [visible] executive sponsorship
- Understand and address the behavioral implications

82 Other Quotes
"Count what is countable, measure what is measurable, and what is not measurable, make measurable" - Galileo

83 Other Quotes
"In God we trust – All others must bring data" - W. Edwards Deming

