CMMI® Version 1.3 and Beyond November 2010


1 CMMI® Version 1.3 and Beyond November 2010
Mike Phillips, Software Engineering Institute, Carnegie Mellon University
Excerpted by Pat Wegerson for North Star INCOSE, 17 February 2011
® CMMI is registered in the U.S. Patent and Trademark Office by Carnegie Mellon University.
© 2004 by Carnegie Mellon University

2 Organizations Are Complex Systems
[Diagram: the organizational system as an input-output flow of materials, energy, and information. Inputs (human, financial, technological, and material resources) pass through five interacting subsystems (strategic, technological, managerial, human/cultural, and structural) to produce outputs (products and services). Adapted from Kast and Rosenzweig, 1972.]

3 What Is a Process?
A process is a set of interrelated activities that transform inputs into outputs to achieve a given purpose.

Another definition of process: a continuing development involving many changes, or a particular method for doing something, usually involving a number of steps or operations (Webster's, 1976).

While process is often described as one leg of the process-people-technology triad, it may also be considered the "glue" that unifies the other two. Process improvement flows from and extends the general management theories developed over the past ~30 years (Juran, Deming, Crosby, etc.).

4 Processes can make the difference!
How Do You Want to Work?

Without process:
- Random motion: lots of energy, not much progress
- No teamwork: individual effort
- Frequent conflict
- You never know where you'll end up

With process:
- Directed motion: every step brings you closer to the goal
- Coordinated efforts
- Cooperation
- Predictable results

5 Symptoms of Process Failure
Quality problems:
- Too much rework
- No product documentation
- Functions that don't work correctly
- Customer complaints after delivery
- Delivery of embarrassing products
- Wide variation in how people perform identical tasks
- Work with wrong versions of work products

No view to the future:
- No concern for process improvement
- No feedback on process effectiveness
- Program cancellation

The consequences of process failure can have monumental effects on your program.

6 Civilian AT&L Workforce (2006)
Workforce Challenges (Huntsville, AL SPIN, June 26, 2008)

"DoD faces significant challenges related to mitigating the pending departure of its highly experienced and seasoned talent – the critical challenge"
- Frank Anderson, Jr., Director, AT&L Human Capital Initiatives and President, Defense Acquisition University, 2007

| Generation (birth years) | National (2005) Workforce (millions) | % | DoD (2006) Workforce | % | Civilian AT&L (2006) Workforce | % |
| Traditionalists (born before 1946) | 11.5 | 7.5% | 45,625 | 6.7% | 8,322 | 7.4% |
| Baby Boomers ( ) | 61.5 | 42.0% | 438,971 | 64.5% | 77,779 | 68.7% |
| Generation X ( ) | 43.5 | 29.5% | 132,948 | 19.5% | 17,581 | 15.5% |
| Generation Y ( ) | 31.5 | 21.0% | 62,676 | 9.2% | 9,394 | 8.3% |
| Millennium ( present) | 51.0 | 0% | 153 | | | |

The Baby Boomer and older generations comprise 71 percent of the DoD workforce, 76 percent of the AT&L civilian workforce, and 50 percent of the national workforce.

Source: Anderson 2007, NDIA STEM Initiative Strategy Session

7 Characteristics of Effective Processes
Effective processes are:
- Simple: a 1" thick index to the procedures is a problem sign
- Documented: written, readable, and accessible
- Trained: "go read the procedures" is not training; training can be formal or via mentoring
- Practiced: universally, all the time, by all the staff
- Supported: appropriate tools on hand; built into planning processes
- Stable: changing processes produce changing results
- Well-defined gates: verifiable checkpoints established throughout the process
- Flexible: no process is perfect; allow some adaptation to meet changing conditions
- Enforced: encouraged, audited, rewarded
- Trackable: you must be able to monitor use
- Measurable: know what it costs and how it benefits

In the absence of these characteristics, you are more likely to encounter resistance and your process is more likely to fail.

8 Process Definition Inputs
[Diagram: process definition inputs (strategic plans, goals, and objectives; process needs; policies; process scope) feed the resulting process assets (process architecture; process descriptions, procedures, and instructions; asset library; measurement repository).]

9 Signs that Processes Are Insufficient
Unmet commitments: late delivery; last-minute crunches; spiraling costs.
Little or no management visibility: you're always being surprised.
Quality problems: too much rework; functions do not work correctly; customer dissatisfaction post-delivery; continuing high costs.
Poor morale: frustration; is anyone in charge?

Example: the Mars Climate Orbiter (September 1999), a $125 million orbiter, was lost because one team used English units of measure while another used metric units. Another example is the Petrobras oil-drilling platform that sank before it was installed and pumping oil.

Process failure can result from improper implementation, lack of discipline, noncompliance, and poor execution.

10 Process Improvement
Whether intentional or not, you already have processes in place. Are they the RIGHT processes? Something is wrong:
- if no one uses the processes (except under duress)
- if everyone has their own interpretation of the process
- if you find you are always tailoring your processes

You may need to invest in process improvement if any of these hold.

11 SEI's IDEAL(SM) Approach

12 Common Misconceptions
I don't need process, I have...
- really good people
- advanced technology
- an experienced manager

Process...
- interferes with creativity
- equals bureaucracy plus regimentation
- is only useful on large projects
- hinders agility in fast-moving markets
- costs too much

These are excuses, not real justifications for skipping process. Some think that experience, skill, or technology is a substitute for process; some think that process is stifling and expensive. They are wrong: studies show significant improvements resulting from process improvement. Even "fast track" programs, despite the need for speed, are in definite need of controls and procedures so that they don't get out of control.

13 Threats to Process Improvement
Senior management problems:
- Change or loss of sponsorship
- Inadequate support and resources
- Desire for quick fixes
- Unreasonable expectations
- Termination before institutionalization
- Inconsistent reinforcement

Middle management resistance:
- "If it ain't broke, don't fix it"
- "Flavor of the day"
- "This is another management initiative I can outlast"

Understand resistance before trying to eliminate it. It may be justified!

Any one of these items alone could derail your process improvement program; you are always fighting something, and you have to be able to go from the symptom to the problem. A typical symptom: the production rate drops right after a process improvement is implemented, the PM says "this isn't working," and reverts to the old process (quick fixes, unreasonable expectations, termination before institutionalization).

14 Interchangeable parts
How Can Process Help? Process supports the goals of the company, enabling:
- Repeatability
- Insight and oversight
- Control and tracking
- Measurement
- Improvement
- Training
- Transformation (via consistency, integration, and coordination)

Early emphasis on process came from the US military, which has been using process longer than anyone (there is, for example, a 60-page specification for chocolate chip cookies). Interchangeable weapon parts were made possible by process and proved an early advantage for US manufacturing and the US military; repeatability is the big advantage. Process also underpins the move toward a net-centric world: there is no networked, coordinated action without process.

15 Value of Fixing Defects Early
[Chart: error correction costs by phase. The relative cost to correct an error rises steeply over time, from detailed design through implementation, integration, and validation to operation.]
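The escalation in the chart can be sketched with a simple model in which the cost to fix a defect multiplies by a fixed factor at each later phase. The 10x-per-phase factor below is an illustrative assumption (the rule of thumb often attributed to Boehm), not a value taken from the chart:

```python
# Hypothetical cost-escalation model: the cost to correct an error
# multiplies by a fixed factor at each later lifecycle phase.
PHASES = ["Detailed Design", "Implementation", "Integration",
          "Validation", "Operation"]

def relative_fix_cost(phase: str, factor: float = 10.0) -> float:
    """Relative cost to correct an error first found in `phase`,
    normalized to 1.0 for errors caught in Detailed Design."""
    return factor ** PHASES.index(phase)

for phase in PHASES:
    print(f"{phase:16s} {relative_fix_cost(phase):>8,.0f}x")
```

Under these assumptions, an error that would cost 1 unit to fix in design costs 10,000 units once the system is in operation, which is the qualitative shape of the curve.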

16 Early Defect Detection
| Maturity Level | Requirements | Design | Code | Software Test | System Test | Field Use | Relative Cost |
| Level 5 | 5% | 20% | 40% | 20% | 10% | <5% | $800 |
| Level 4 | 3% | 12% | 30% | 30% | 20% | 5% | $1,000 |
| Level 3 | 0% | 2% | 20% | 38% | 32% | 8% | $1,400 |
| Level 2 | 0% | 0% | 3% | 30% | 50% | 17% | $2,500 |
| Level 1 | 0% | 0% | 2% | 15% | 50% | 33% | $4,000 |

Source: SEPG Asia Pacific 2009, presented by Ravindra Nath, KUGLER MAAG CIE GmbH
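The table's cost pattern can be reproduced with a back-of-the-envelope calculation: weight the fraction of defects found in each phase by a per-phase removal cost. The unit costs below are hypothetical (chosen only to show the mechanism, not figures from the slide); the distributions follow the Level 5 and Level 1 rows, reading "<5%" as 5%:

```python
# Expected defect-removal cost = sum over phases of
# (fraction of defects found in that phase) x (unit cost of removal there).
# Unit costs are illustrative assumptions; distributions follow the table.
UNIT_COST = {"Requirements": 100, "Design": 300, "Code": 600,
             "Software Test": 1500, "System Test": 3000, "Field Use": 8000}

LEVEL_5 = {"Requirements": 0.05, "Design": 0.20, "Code": 0.40,
           "Software Test": 0.20, "System Test": 0.10, "Field Use": 0.05}
LEVEL_1 = {"Requirements": 0.00, "Design": 0.00, "Code": 0.02,
           "Software Test": 0.15, "System Test": 0.50, "Field Use": 0.33}

def expected_cost(found_by_phase: dict) -> float:
    """Expected removal cost per defect for a given phase distribution."""
    return sum(frac * UNIT_COST[phase] for phase, frac in found_by_phase.items())

print(expected_cost(LEVEL_5))  # most defects caught early, so cheaper
print(expected_cost(LEVEL_1))  # defects escape to late phases, so costlier
```

With these assumed unit costs the Level 1 distribution comes out several times more expensive per defect than Level 5, matching the direction of the $4,000 vs. $800 column.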

17 Late Discovery of System-Level Problems
[Diagram: a V-model showing where faults are introduced (requirements engineering, system design, software architectural design, component software design, code development) versus where they are found (unit test, integration test, system test, acceptance test). The estimated nominal cost of fault removal grows from 1x for faults caught where they were introduced to roughly 5x, 16x, 40x, and 110x as discovery slips toward acceptance test.]

Sources:
- NIST Planning Report 02-3, The Economic Impacts of Inadequate Infrastructure for Software Testing, May 2002
- D. Galin, Software Quality Assurance: From Theory to Implementation, Pearson/Addison-Wesley (2004)
- B.W. Boehm, Software Engineering Economics, Prentice Hall (1981)

18 An Investment Is Required
[Chart: productivity over time dips during the transition state between the present state and the desired state; reaching the desired state requires investing through the dip.]

19 Critical Success Factors for Process Improvement
- Commitment to improve must start at the top.
- First understand the current process.
- Structured change must become a way of life.
- Improvement requires investment.
- When failure occurs, focus on the process, not the people.
- Institutionalizing improvements requires vigilance and periodic reinforcement.

This is particularly difficult to achieve in the DoD context because personnel are lost regularly and frequently. Commitment from top management is crucial: process improvement is slow and resource intensive, and without ongoing top management commitment you won't get the resources or the time you need. Your current processes are the starting point, so don't discard them carelessly: people understand them, people use them, and they were created to solve a problem, so for the most part they probably work. And you never finish process improvement; the culture of the organization must be led to understand this. (An analogy from an NPR "All Things Considered" report on Tim Duncan's free-throw shooting problems in the 2003 NBA playoffs: the world record holder for free throws, Dr. Tom Amberry, an 80-plus-year-old podiatrist from LA, described his process; hired by the Lakers, he improved Shaq's performance considerably.)

20 What Is CMMI?

21 "M" Is for Model
Models are simplified views of the real world. The real world includes integrated product teams, systems engineering, people issues, organizational culture, technology, and marketing; the CMMI model captures maturity levels, process areas, and practices. Process descriptions, models, and instantiations are below the level of detail of the CMMs. "The Capability Maturity Model for Software" (TR 24) sets the domain boundary.

"All models are wrong, but some are useful." - George Box

22 Improving processes for better products
CMMI in a Nutshell
CMMI is a collection of characteristics of effective processes that provides guidance for improving an organization's processes and its ability to manage the development, acquisition, and maintenance of products or services. CMMI places proven approaches into a structure that:
- helps an organization examine the effectiveness of its processes
- establishes priorities for improvement
- helps implement these improvements

23 CMMI Product Suite
Models: CMMI for Development; CMMI for Acquisition; CMMI for Services
SCAMPI(SM) (Standard CMMI Appraisal Method for Process Improvement): Class A (results in ratings); Class B (deployment); Class C (approach)
Training: Introduction to CMMI; advanced training courses

24 CMMI Framework
Training:
- Understanding CMMI High Maturity Practices
- CMMI-Based Process Improvement Overview
- Implementing CMMI for High Performance, an Executive Seminar
- Introduction to CMMI (for Development)
- Introduction to CMMI for Services
- Acquisition Supplement for CMMI
- Services Supplement for CMMI
- CMMI Level 2 for Practitioners (DEV only)
- CMMI Level 3 for Practitioners (DEV only)
- CMMI Level 2 for Practitioners (ACQ only)
- CMMI Instructor Training
- CMMI Upgrade Training
- Intermediate Concepts of CMMI
- SCAMPI Lead Appraiser(SM) Training

Appraisals:
- SCAMPI Class A Appraisal Method (results in ratings)
- SCAMPI Class B Appraisal Method
- SCAMPI Class C Appraisal Method
- Appraisal Requirements for CMMI

The CMMI Model Foundation (CMF) comprises the process areas common to all constellations; these PAs are used by an organization and its projects to manage and support what they do. What they do is defined in each constellation's process areas: in the Development constellation, the Engineering PAs and SAM; in the Acquisition constellation, the six PAs in the Acquisition category. REQM is an exception: it is a CMF PA located in Engineering in DEV and in Project Management in ACQ. Models, training materials, and appraisal materials are comprised of the CMF (always) plus additions from one or more constellations.

25 Five Reasons to Adopt CMMI
CMMI helps your organization to:
- Improve delivery of performance, cost, and schedule
- Collaborate with external stakeholders and integrate their expectations into day-to-day activities
- Provide competitive world-class products and services
- Implement an integrated enterprise business and engineering perspective
- Use common, integrated, and improving processes for systems and software

26 Evolution of Process Capability
| Level | Process Characteristics |
| 5 | Process improvement is institutionalized |
| 4 | Product and process are quantitatively controlled |
| 3 | Software engineering and management processes are defined and integrated |
| 2 | Project management system is in place; performance is repeatable |
| 1 | Process is informal and unpredictable |

[Chart: at each higher level, the distribution of predicted performance (time, $, ...) narrows around the target.]

27 Example Chart from CMMI Level 5 Company: Multi-Performance Results Summary
December 19, 2008

| Measure | Performance Result |
| Cost | Firm fixed price upon acceptance of requirements specifications |
| Schedule | Not to exceed 8% of committed schedule; weekly status reporting with ability to detect a one-day schedule slip; time in test and rework time significantly less than customer historical average |
| Quality | Acceptance test defects significantly lower than customer historical average; company will fix defects found in production use free for the life of the product |

The Schedule Performance Index (SPI) is a measure of project efficiency used by project management to gauge progress. An SPI of 1 or greater is the goal, since it indicates the project is on track and has favorable conditions for meeting its commitments.
© 2008 by Carnegie Mellon University
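The SPI mentioned in the notes is the standard earned-value ratio EV/PV; a minimal sketch (the dollar figures in the example are made up for illustration):

```python
def schedule_performance_index(earned_value: float, planned_value: float) -> float:
    """SPI = EV / PV. A value of 1.0 or greater means the work completed
    to date matches or exceeds the work planned to date."""
    if planned_value <= 0:
        raise ValueError("planned value must be positive")
    return earned_value / planned_value

# Example: $90k of work completed against $100k planned to date.
spi = schedule_performance_index(90_000, 100_000)
print(f"SPI = {spi:.2f}")  # SPI = 0.90, i.e. behind schedule
```

A weekly SPI trend is one way to satisfy the slide's "detect a one-day schedule slip" reporting goal: any sustained dip below 1.0 flags slippage before milestones are missed.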

28 Example Chart from CMMI Level 5 Company: Their Results vs. Industry Average
December 19, 2008

| Measure | Industry | ML5 Company |
| Schedule deviation | >50% | <10% |
| Number of defects in delivered product (size: 100,000 source lines of code) | > | <15 |
| % of design and code inspected | < | |
| Time to accept 100,000 SLOC product | 10 months | weeks |
| % of defects removed prior to system test | <60% | >85% |
| % of development time fixing system test defects | >33% | <10% |
| Cost of quality | >50% | <35% |
| Warranty on products | ? | Lifetime |

© 2008 by Carnegie Mellon University

29 CMMI Models

30 Sequence of Models

31 CMMI Models for Three Constellations
CMMI-DEV provides guidance for measuring, monitoring, and managing development processes.
CMMI-ACQ provides guidance to enable informed and decisive acquisition leadership.
CMMI-SVC provides guidance for those providing services within organizations and to external customers.
16 core process areas are common to all three models.
The CMMI-SVC model includes the 16 core PAs, one shared PA (SAM), 6 unique PAs, and 1 PA addition, plus several modifications that explain how CMMI-SVC is to be interpreted in a services environment.

32 CMMI Core PAs
Core PAs are common to all three CMMI models (CMMI-DEV, CMMI-ACQ, and CMMI-SVC). Core PAs include informative material that interprets the goals and practices for each model's area of interest.

33 Process Area Components
A process area (PA) comprises:
- Purpose statement, introductory notes, and related process areas (informative)
- Specific goals (SG) and generic goals (GG) (required)
- Specific practices (SP) and generic practices (GP) (expected)
- Example work products, subpractices, and generic practice elaborations (informative)

34 CMMI Model Structure
The three constellations are incremental frameworks for continuous process improvement. Benchmark ratings are expressed as maturity levels or capability levels, achieved through goals and process areas.

CMMI Model Foundation (core process areas): Requirements Management; Project Planning; Project Monitoring & Control; Measurement & Analysis; Configuration Management; Process and Product QA; Integrated Project Management; Risk Management; Decision Analysis & Resolution; Organizational Process Focus; Organizational Process Definition; Organizational Training; Causal Analysis & Resolution; Organizational Process Performance; Organizational Performance Management; Quantitative Project Management

CMMI-DEV adds: Requirements Development; Technical Solution; Product Integration; Verification; Validation; Supplier Agreement Management

CMMI-SVC adds: Capacity & Availability Management; Incident Resolution and Prevention; Supplier Agreement Management; Service Continuity; Service Delivery; Service System Development; Service System Transition; Strategic Service Management

CMMI-ACQ adds: Agreement Management; Acquisition Requirements Development; Acquisition Technical Management; Acquisition Validation; Acquisition Verification; Solicitation and Supplier Agreement Development

Institutionalization (via the generic practices): policies; plans; resources; responsibilities; training; managing configurations; stakeholder involvement; monitoring and control; objective evaluation; management visibility; defined process; improvement information

35 Critical Distinctions Among Processes
performed vs. managed: the extent to which the process is planned, performance is managed against the plan, and corrective actions are taken when needed

managed vs. defined: the scope of application of the process descriptions, standards, and procedures (i.e., project vs. organization)

36 Understanding Levels
Levels are used in CMMI to describe an evolutionary path for an organization that wants to improve the processes it uses to develop and maintain its products and services. CMMI supports two improvement paths:
- continuous: enabling an organization to incrementally improve processes corresponding to an individual process area (or set of process areas) selected by the organization
- staged: enabling the organization to improve a set of related processes by incrementally addressing successive predefined sets of process areas

37 Staged Representation: PAs by Maturity Level
| Level | Name | Focus |
| 5 | Optimizing | Continuous process improvement |
| 4 | Quantitatively Managed | Quantitative management |
| 3 | Defined | Process standardization |
| 2 | Managed | Basic project management |
| 1 | Initial | |

As maturity level rises, quality and productivity increase while risk and rework decrease.

38 Achieving Maturity Levels
ML1 Initial: processes are ad hoc and chaotic.
ML2 Managed (GG 2; all ML2 PAs): adhere to policy; follow documented plans and processes; apply adequate resources; assign responsibility and authority; train people; apply CM; monitor, control, and evaluate the process; identify and involve stakeholders; review with management.
ML3 Defined (GG 2 and GG 3; all ML2 and ML3 PAs): tailor the project's process from the organization's standard processes; understand processes qualitatively; ensure that projects contribute to organization assets.
ML4 Quantitatively Managed (GG 2 and GG 3; all ML2 through ML4 PAs): measure process performance; stabilize processes with control charts; deal with causes of special variation.
ML5 Optimizing (GG 2 and GG 3; all ML2 through ML5 PAs): prevent defects; proactively improve; insert and deploy innovative technology.
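The staged rules above are mechanical enough to sketch in code. This is a toy illustration, not an appraisal method: PA abbreviations follow the v1.3 CMMI-DEV set listed later in the deck, and satisfaction of a PA or generic goal is reduced to a boolean:

```python
# Toy staged-representation check: the organization is at the highest
# maturity level whose cumulative PA set is satisfied with the required
# generic goals (GG2 for ML2; GG2 and GG3 for ML3 and above).
ML_PAS = {
    2: {"CM", "MA", "PMC", "PP", "PPQA", "REQM", "SAM"},
    3: {"DAR", "IPM", "OPD", "OPF", "OT", "PI", "RD", "RSKM", "TS", "VAL", "VER"},
    4: {"OPP", "QPM"},
    5: {"CAR", "OPM"},
}

def maturity_level(satisfied_pas: set, achieved_ggs: set) -> int:
    level, required = 1, set()
    for ml in (2, 3, 4, 5):
        required |= ML_PAS[ml]                    # PA sets are cumulative
        needed_ggs = {2} if ml == 2 else {2, 3}   # GG3 applies from ML3 up
        if required <= satisfied_pas and needed_ggs <= achieved_ggs:
            level = ml
        else:
            break
    return level
```

For example, an organization satisfying all ML2 and ML3 PAs with GG2 and GG3 achieved comes out at maturity level 3; satisfying only the ML2 PAs with GG2 comes out at level 2.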

39 Continuous Representation: PAs by Categories (and Potentially Across Constellations)
Categories: Process Management; Project/Work Management; Support; (Product) Engineering; Acquisition Engineering; Service Establishment and Delivery

40 Achieving Capability Levels (CLs) for a Process Area
CL0 Incomplete: the process is not performed or only partially performed; a few GPs or SPs may be implemented.
CL1 Performed (GG 1; all SPs): perform the work.
CL2 Managed (GG 1 and GG 2; all SPs): adhere to policy; follow documented plans and processes; apply adequate resources; assign responsibility and authority; train people; apply CM; monitor, control, and evaluate the process; identify and involve stakeholders; review with management.
CL3 Defined (GG 1, GG 2, and GG 3; all SPs): the project's process is tailored from the organization's standard processes; the process is understood qualitatively; the process contributes to the organization's assets.

41 Summary of Generic Goals and Practices
GG 1: Achieve Specific Goals
- GP 1.1: Perform Specific Practices

GG 2: Institutionalize a Managed Process
- GP 2.1: Establish an Organizational Policy
- GP 2.2: Plan the Process
- GP 2.3: Provide Resources
- GP 2.4: Assign Responsibility
- GP 2.5: Train People
- GP 2.6: Control Work Products
- GP 2.7: Identify and Involve Relevant Stakeholders
- GP 2.8: Monitor and Control the Process
- GP 2.9: Objectively Evaluate Adherence
- GP 2.10: Review Status with Higher Level Management

GG 3: Institutionalize a Defined Process
- GP 3.1: Establish a Defined Process
- GP 3.2: Collect Process Related Experiences

Adapted from Cepeda Systems & Software Analysis, Inc.

42 CMMI for Development

43 CMMI for Development Model
CMMI-DEV contains 22 PAs: the 16 core PAs present in all CMMI models, 1 shared PA (Supplier Agreement Management, shared with CMMI-SVC), and 5 development-specific PAs.

44 Comparison of Models

| Measure | DEV V1.1 Staged | DEV V1.1 Cont. | DEV V1.2 | DEV V1.3 | ACQ V1.2 | ACQ V1.3 | SVC V1.2 | SVC V1.3 |
| Pages | 715 | 710 | 560 | 468 | 428 | 423 | 531 | 506 |
| Specific Practices | 185 | 189 | 173 | 167 | 161 | 163 | 182 | 181 |

Process Areas: 25 (DEV V1.1), 22 (DEV and ACQ, V1.2/V1.3), 24 (SVC)
Generic Goals: 2, 5, 3
Generic Practices: 12, 17, 13
Specific Goals: 55, 50, 49, 46, 47, 52, 53

© 2010 Carnegie Mellon University

45 Development-Specific PAs
Development-specific PAs, beyond the 16 project, organizational, and support process areas of the CMMI Model Framework (CMF): Requirements Development; Technical Solution; Product Integration; Verification; Validation; plus Supplier Agreement Management (shared with SVC).

Key acquirer roles: business analysis/relationship management (incl. requirements); systems engineering, architecture strategy, and system acceptance; contract development and supplier management; program ownership / project management.

Key supplier roles: application design/development; system maintenance; desktop / service / help desk; hosting; data center / mainframe.

Both developers and acquirers have strong, albeit different, needs for process. Acquisition is a team sport, with the acquirer and the developer on the same team; each has a role to play, and process is an important element of each role. The goal of both is to translate an operational need into a fieldable and supportable product. The acquirer plans the acquisition, develops an RFP, solicits proposals, selects a source, monitors the project throughout development, accepts the product, and transitions it to operations. The developer plans the development, designs the product, develops it, integrates and tests it, and delivers it. The processes used by the developer and the acquirer are different but similar: CMMI-ACQ addresses the acquirer's processes, CMMI-DEV the developer's.

46 CMMI-DEV PAs by Maturity Level
5 Optimizing: Causal Analysis and Resolution; Organizational Performance Management
4 Quantitatively Managed: Organizational Process Performance; Quantitative Project Management
3 Defined: Decision Analysis and Resolution; Integrated Project Management; Organizational Process Definition; Organizational Process Focus; Organizational Training; Product Integration; Requirements Development; Risk Management; Technical Solution; Validation; Verification
2 Managed: Configuration Management; Measurement and Analysis; Project Monitoring and Control; Project Planning; Process and Product Quality Assurance; Requirements Management; Supplier Agreement Management

For the V1.3 release, there were no changes that affected the DEV PAs' positioning by maturity level.

47 CMMI-DEV PAs by Category
Process Management: Organizational Innovation and Deployment (OID); Organizational Process Definition (OPD); Organizational Process Focus (OPF); Organizational Process Performance (OPP); Organizational Training (OT)
Project Management: Integrated Project Management (IPM); Project Monitoring and Control (PMC); Project Planning (PP); Quantitative Project Management (QPM); Requirements Management (REQM); Risk Management (RSKM); Supplier Agreement Management (SAM)
Support: Causal Analysis and Resolution (CAR); Configuration Management (CM); Decision Analysis and Resolution (DAR); Measurement and Analysis (MA); Process and Product Quality Assurance (PPQA)
Engineering: Product Integration (PI); Requirements Development (RD); Technical Solution (TS); Validation (VAL); Verification (VER)

For the V1.3 release, REQM was moved from "Engineering" to "Project Management."

48 Product Integration
SG 1: Prepare for Product Integration
- SP 1.1 Establish an Integration Strategy
- SP 1.2 Establish the Product Integration Environment
- SP 1.3 Establish Product Integration Procedures and Criteria
SG 2: Ensure Interface Compatibility
- SP 2.1 Review Interface Descriptions for Completeness
- SP 2.2 Manage Interfaces
SG 3: Assemble Product Components and Deliver the Product
- SP 3.1 Confirm Readiness of Product Components for Integration
- SP 3.2 Assemble Product Components
- SP 3.3 Evaluate Assembled Product Components
- SP 3.4 Package and Deliver the Product or Product Component

V1.3 changes: revised the purpose statement to address proper behavior instead of proper function, thereby more explicitly including quality attributes along with required functionality; changed the emphasis on an integration sequence to an emphasis on an integration strategy; described what an integration strategy is and how it relates to an integration sequence.

49 Requirements Development
SG 1: Develop Customer Requirements
- SP 1.1 Elicit Needs
- SP 1.2 Transform Stakeholder Needs into Customer Requirements
SG 2: Develop Product Requirements
- SP 2.1 Establish Product and Product Component Requirements
- SP 2.2 Allocate Product Component Requirements
- SP 2.3 Identify Interface Requirements
SG 3: Analyze and Validate Requirements
- SP 3.1 Establish Operational Concepts and Scenarios
- SP 3.2 Establish a Definition of Required Functionality and Quality Attributes
- SP 3.3 Analyze Requirements
- SP 3.4 Analyze Requirements to Achieve Balance
- SP 3.5 Validate Requirements

V1.3 changes: revised SP 1.2 to add that customer requirements should be prioritized based on their criticality to the customer and other stakeholders; broadened the emphasis from "operational scenarios" to a more balanced "scenarios (operational, sustainment, and development)"; added a focus on architectural requirements; revised SG 3 and SP 3.2 because quality attributes need to be considered in addition to functionality; added informative material noting that requirements can be monitored through development based on their criticality to the customer.

50 Technical Solution
SG 1: Select Product Component Solutions
- SP 1.1 Develop Alternative Solutions and Selection Criteria
- SP 1.2 Select Product Component Solutions
SG 2: Develop the Design
- SP 2.1 Design the Product or Product Component
- SP 2.2 Establish a Technical Data Package
- SP 2.3 Design Interfaces Using Criteria
- SP 2.4 Perform Make, Buy, or Reuse Analyses
SG 3: Implement the Product Design
- SP 3.1 Implement the Design
- SP 3.2 Develop Product Support Documentation

51 Validation
SG 1: Prepare for Validation
SP 1.1 Select Products for Validation
SP 1.2 Establish the Validation Environment
SP 1.3 Establish Validation Procedures and Criteria
SG 2: Validate Product or Product Components
SP 2.1 Perform Validation
SP 2.2 Analyze Validation Results

52 Verification
SG 1: Prepare for Verification
SP 1.1 Select Work Products for Verification
SP 1.2 Establish the Verification Environment
SP 1.3 Establish Verification Procedures and Criteria
SG 2: Perform Peer Reviews
SP 2.1 Prepare for Peer Reviews
SP 2.2 Conduct Peer Reviews
SP 2.3 Analyze Peer Review Data
SG 3: Verify Selected Work Products
SP 3.1 Perform Verification
SP 3.2 Analyze Verification Results

53 V1.3 CMMI Model Updates: Core PAs

54 CMMI Product Suite, Version 1.3
Version 1.3 focused on, but was not limited to, the following:
- High maturity
- Appraisal efficiency
- Consistency across constellations
- Simplifying the generic practices
Version 1.3 was change request (CR) driven.

55 What Is a CMMI Model?
A CMMI model is a subset of the CMMI Product Suite that covers a particular area of interest. Currently, there are three models that address the following:
- The development of products and services
- The acquisition of products and services
- The establishment, management, and delivery of services
Every CMMI model is a process improvement approach that provides organizations with the essential elements of effective processes; can be used to guide improvement across a team, project, division, or entire organization; and helps to set process improvement goals and priorities, provide guidance for quality processes, and provide a point of reference for appraising current processes.

56 How Similar Are Core PAs?
Core process areas appear in all CMMI models; however, these process areas are not identical across all models:
- Informative material can differ so that users can interpret goals and practices for the area of interest addressed by the model.
- Sometimes practices in one model differ from those in another (e.g., Project Planning).

57 V1.3 Model Architecture Changes
IPPD/Teaming: Removed the IPPD addition from CMMI-DEV and in its place added teaming practices from CMMI-ACQ and CMMI-SVC. These practices are not optional.
Amplifications: Removed the "amplification" model component.
CMMI-ACQ: Renamed the "Acquisition" process area category "Acquisition Engineering." Moved AM and SSAD from the Acquisition PA category to the Project Management PA category.
CMMI-DEV: Moved REQM from the Engineering PA category to the Project Management PA category.

58 V1.3 Changes to GGs, GPs, and GP Elaborations
- Positioned generic goals, generic practices, and GP elaborations in one central location as the first section of Part 2 in all three models.
- Simplified GG1 to make it more readable.
- Renamed GP 2.6 "Control Work Products."
- Added "selected work products" to the GP 2.9 statement.
- Simplified the GP 3.2 statement by replacing "collect work products, measures, measurement results, and improvement information" with "collect process related experiences."
- Eliminated GG4 and GG5.

59 Core PAs by Maturity Level
Level 5 (Optimizing). Focus: Continuous Process Improvement. Process areas: Causal Analysis and Resolution; Organizational Performance Management.
Level 4 (Quantitatively Managed). Focus: Quantitative Management. Process areas: Organizational Process Performance; Quantitative Project Management.
Level 3 (Defined). Focus: Process Standardization. Process areas: Decision Analysis and Resolution; Integrated Project Management; Organizational Process Definition; Organizational Process Focus; Organizational Training; Risk Management.
Level 2 (Managed). Focus: Basic Project Management. Process areas: Configuration Management; Measurement and Analysis; Project Monitoring and Control; Project Planning; Process and Product Quality Assurance; Requirements Management.
Level 1 (Initial).
Quality and productivity increase toward the higher levels; risk and rework increase toward the lower levels.

60 Core PAs by Category Process Management Support
Process Management: Organizational Process Definition (OPD), Organizational Process Focus (OPF), Organizational Performance Management (OPM), Organizational Process Performance (OPP), Organizational Training (OT)
Support: Causal Analysis and Resolution (CAR), Configuration Management (CM), Decision Analysis and Resolution (DAR), Measurement and Analysis (MA), Process and Product Quality Assurance (PPQA)
Project and Work Management: Integrated Project Management (IPM), Project Monitoring and Control (PMC), Project Planning (PP), Quantitative Project Management (QPM), Requirements Management (REQM), Risk Management (RSKM), (+) Supplier Agreement Management (SAM)
SAM is a shared PA instead of a core PA. "Work" is substituted for "Project" in CMMI-SVC titles.

61 Core PAs: Support Category
Configuration Management: Establish and maintain the integrity of work products using configuration identification, configuration control, configuration status accounting, and configuration audits.
Decision Analysis and Resolution: Analyze possible decisions using a formal evaluation process that evaluates identified alternatives against established criteria.
Measurement and Analysis: Develop and sustain a measurement capability used to support management information needs.
Process and Product Quality Assurance: Provide staff and management with objective insight into processes and associated work products.
V1.3 changes:
- CM: Clarified that CM can apply to hardware, equipment, and other tangible assets.
- DAR: Added guidance on defining the scope of the decision and communicating results.
- MA: More clearly distinguished among information needs and objectives, measurement objectives, and business/project objectives. Included a table of examples (as in ACQ) in DEV and SVC.
- PPQA: Clarified that PPQA also applies to organization-level activities and work products.

62 Core PAs: Process Management Category
Organizational Process Definition: Establish and maintain a usable set of organizational process assets, work environment standards, and rules and guidelines for teams.
Organizational Process Focus: Plan, implement, and deploy organizational process improvements based on a thorough understanding of the current strengths and weaknesses of the organization's processes and process assets.
Organizational Training: Develop the skills and knowledge of people so they can perform their roles effectively and efficiently.
V1.3 changes:
- OPD: Converted the goal on teaming to a single practice, which is no longer an "addition" for IPPD only.
- OPF: Simplified SP 3.4 by replacing "process-related work products, measures, and improvement information" with "process related experiences."
- OT: Expanded applicability to training development and delivery methods such as self study, mentoring, and online training.

63 Core PAs: Project and Work Management Category -1
Integrated Project Management: Establish and manage the project and the involvement of relevant stakeholders according to an integrated and defined process that is tailored from the organization's set of standard processes.
Project Monitoring and Control: Provide an understanding of the project's progress so that appropriate corrective actions can be taken when the project's performance deviates significantly from the plan.
Project Planning: Establish and maintain plans that define project activities.
V1.3 changes:
- IPM: Simplified SP 1.7 by replacing "work products, measures, and documented experiences" with "process related experiences." Converted the goal on IPPD or integrated teaming to a single practice (IPPD is no longer an addition).
- PMC: Added guidance for monitoring risks, data management, stakeholder involvement, project progress, and milestone reviews.
- PP: Added guidance on determining the project lifecycle and milestones. Added subpractices on determining data rights and the need for configuration control, and on determining communication requirements and other continuing resource needs.

64 Core PAs: Project and Work Management Category -2
Requirements Management: Manage requirements of the project's products and product components and ensure alignment between those requirements and the project's plans and work products.
Risk Management: Identify potential problems before they occur so that risk-handling activities can be planned and invoked as needed across the life of the product or project to mitigate adverse impacts on achieving objectives.
V1.3 changes:
- REQM: Changed the focus of SP 1.5 so that it now reads "Ensure that project plans and work products remain aligned with requirements."
- RSKM: Included examples related to architectural risks, the use of industry standards to identify risks, FMEA, and consequence monetization. Provided guidance on maintaining risk parameters through the life of the project.

65 SAM – the Shared PA
SG 1: Establish Supplier Agreements
SP 1.1 Determine Acquisition Type
SP 1.2 Select Suppliers
SP 1.3 Establish Supplier Agreements
SG 2: Satisfy Supplier Agreements
SP 2.1 Execute the Supplier Agreement
SP 2.2 Accept the Acquired Product
SP 2.3 Ensure Transition of Products
V1.3 changes:
- Clarified the applicability of SAM practices.
- Demoted SP 2.2 and SP 2.3 to subpractices of SP 2.1 and renumbered the remaining practices.
- Revised SP 2.3 to allow its applicability when the product or service is delivered directly to the customer or end user from the supplier.

66 New Informative Material
Updated selected process areas to provide interpretation of practices for organizations with respect to the following topics:
- Agile methods
- Quality attributes (i.e., non-functional requirements or "ilities")
- Allocation of product capabilities to release increments
- Product lines
- System of systems
- Architecture-centric development practices
- Technology maturation
- Customer satisfaction

67 Terminology
- Used "team" instead of "integrated team" in most cases when discussing teaming practices.
- Replaced phrases such as "work products, measures, and improvement information" with simpler expressions such as "experiences."
- Revised the terminology in engineering-related material from a strong emphasis on "functionality" to a more balanced "behavior (functionality and quality attributes)" or simply "functionality and quality attributes."
- Clarified throughout the model whether "lifecycle" refers to a project lifecycle, a product lifecycle, or both.
- Involved the CMMI Translation Team during model development work to identify and resolve translation issues.
- Replaced the word "project" with other terms where needed (SVC only).

68 Front Matter
- Clarified that CMMI models are not processes or process descriptions.
- Removed any biases favoring maturity levels or capability levels.
- Explained that core process areas appear in all CMMI models and can have different expected and informative material. For example, PP can have an SP in ACQ that is absent from DEV's PP.
- Added information on selecting the right CMMI model for use.

69 Glossary -1
- Differentiated between definitions and usage notes for each glossary entry.
- Removed terms from the glossary, including: adequate, alternative practice, amplifications, appropriate, appraisal team leader, as needed, assessment, assignable cause of process variation, capability evaluation, discipline, evidence, functional configuration audit, goal, integrated product and process development, objective, objective evidence, observation, organizational unit, physical configuration audit, profile, program, rating, reference, root cause, and test procedure.
- Revised the definitions of "quality" and "corrective action" to be more consistent with ISO definitions.
- Revised the terms "process," "development," "supplier," and "team" to be more broadly applicable.

70 Glossary -2
- Revised the definition of "supplier agreement" to include agreements within an organization.
- Revised the following definitions related to high maturity practices: causal analysis, natural bounds, optimizing process, process performance model, quality and process performance objectives, stable process, statistical techniques, and subprocess.
- Added the following terms related to high maturity: quantitative management, statistical and other quantitative techniques.
- Deleted the following terms related to high maturity: quantitatively managed process, statistically managed process, and statistical predictability.

71 V1.3 Changes to High Maturity PAs
Many of the most significant changes to CMMI models in Version 1.3 are the changes to the high maturity process areas (CAR, OPM, OPP, and QPM). These process areas are core process areas, but we've focused on these four over the others because of their significance in this release.

72 High Maturity Changes for V1.3
Terminology confusion:
- Common cause (statistical versus quantitative techniques)
- Process models and process modeling
- Business objectives
- Subprocesses
Requirements implied versus explicit; explanations not central or consistent:
- Model, audit criteria, presentations (healthy ingredients), UCHMP
Perceptions:
- Customers: ML 5 is expensive and no better than ML 3
- Industry: ML 5 is NOT RIGHT for every business
High maturity in ALL constellations:
- Examples are focused on development

73 High Maturity Restructuring for V1.3
- Insufficient link among process improvement, business objectives, and performance
- Clarify the distinction between ML4 and ML5
- Eliminate GG4 and GG5
- Make CAR more relevant for organizational benefit

74 Combined OID and OPM into One PA
[Diagram: flows among the high maturity process areas. The organization and customer supply quality and process performance objectives; Organizational Performance Management exchanges improvement proposals, selected outcomes, performance issues, and root cause solutions with Causal Analysis and Resolution; Organizational Process Performance supplies measures, baselines, and models to Quantitative Project Management, which returns updated measures, baselines, and models reflecting actual performance.]

75 Causal Analysis and Resolution
SG 1: Determine Causes of Selected Outcomes
SP 1.1 Select Outcomes for Analysis
SP 1.2 Analyze Causes
SG 2: Address Causes of Selected Outcomes
SP 2.1 Implement Action Proposals
SP 2.2 Evaluate the Effect of Implemented Actions
SP 2.3 Record Causal Analysis Data
V1.3 changes:
- Used "outcomes" instead of "defects and problems."
- Added examples for service organizations and for selecting outcomes for analysis.
- Added subpractices in SP 1.1 for defining the problem and in SP 2.2 for following up when expected results did not occur.
- Added more information about how PPMs can be used.
- Added emphasis on prevention and reducing recurrence.
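CMMI does not prescribe any particular technique for SP 1.1, but Pareto analysis is one common way to decide which outcomes merit the cost of causal analysis. A minimal sketch; the defect log, category names, and 80% threshold are hypothetical, not from the model:

```python
from collections import Counter

def pareto_select(outcomes, threshold=0.8):
    """Return the smallest set of outcome categories that together
    account for at least `threshold` of all observed outcomes."""
    counts = Counter(outcomes)
    total = sum(counts.values())
    selected, covered = [], 0
    for category, n in counts.most_common():
        selected.append(category)
        covered += n
        if covered / total >= threshold:
            break
    return selected

# Hypothetical defect log: one category per logged escaped defect
log = ["interface"] * 9 + ["requirements"] * 6 + ["timing"] * 3 + ["docs"] * 2
print(pareto_select(log))  # → ['interface', 'requirements', 'timing']
```

The few categories returned are the "selected outcomes" a causal analysis meeting would then take up.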

76 Organizational Performance Management
Renamed the PA Organizational Performance Management (OPM).
SG 1: Manage Business Performance
SP 1.1 Maintain Business Objectives
SP 1.2 Analyze Process Performance Data
SP 1.3 Identify Potential Areas for Improvement
SG 2: Select Improvements
SP 2.1 Elicit Suggested Improvements
SP 2.2 Analyze Suggested Improvements
SP 2.3 Validate Improvements
SP 2.4 Select and Implement Improvements for Deployment
SG 3: Deploy Improvements
SP 3.1 Plan the Deployment
SP 3.2 Manage the Deployment
SP 3.3 Evaluate Improvement Effects
V1.3 changes:
- Added a new goal about managing business performance using statistical and other quantitative techniques.
- Provided more information about how improvements can be selected for deployment.
- More explicitly described and discussed using process performance models.
- Clarified that not all improvement validations include piloting.

77 Organizational Process Performance
SG 1: Establish Performance Baselines and Models
SP 1.1 Establish Quality and Process Performance Objectives
SP 1.2 Select Processes
SP 1.3 Establish Process Performance Measures
SP 1.4 Analyze Process Performance and Establish Process Performance Baselines
SP 1.5 Establish Process Performance Models
V1.3 changes:
- Reordered the SPs, moving the old SP 1.3 (Establish Quality and Process Performance Objectives) to SP 1.1.
- Revised SP 1.4 to include process performance analysis and assessment of subprocess stability.
- Revised SP 1.5 to note that under certain circumstances, projects may need to create their own process performance models.
- Clarified the relationship of OPP to the other high maturity process areas.
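As an illustration only (OPP mandates no particular formula), a process performance baseline for one subprocess measure is often summarized as a central tendency plus spread computed from historical data. All values here are hypothetical:

```python
import statistics

# Hypothetical historical observations of one subprocess measure:
# defect density (defects/KSLOC) found in peer reviews, per project.
history = [3.1, 2.8, 3.5, 2.9, 3.3, 3.0, 2.7, 3.4]

mean = statistics.mean(history)
sigma = statistics.stdev(history)

# A baseline gives projects a reference range to compare against.
baseline = {
    "mean": round(mean, 2),
    "sigma": round(sigma, 2),
    "3-sigma range": (round(mean - 3 * sigma, 2), round(mean + 3 * sigma, 2)),
}
print(baseline)
```

A project whose observed defect density falls outside the baseline range has a signal worth investigating; a process performance model would go further and predict the measure from upstream factors.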

78 Quantitative Project Management
Restructured QPM so that SG1 focuses on preparation and SG2 focuses on managing the project.
SG 1: Prepare for Quantitative Management
SP 1.1 Establish the Project's Objectives
SP 1.2 Compose the Defined Process
SP 1.3 Select Subprocesses and Attributes
SP 1.4 Select Measures and Analytic Techniques
SG 2: Quantitatively Manage the Project
SP 2.1 Monitor the Performance of Selected Subprocesses
SP 2.2 Manage Project Performance
SP 2.3 Perform Root Cause Analysis
V1.3 changes:
- Added guidance about using process performance baselines and process performance models.
- Defined quantitative management in the glossary to include statistical management, and used that definition throughout QPM.
- Removed the practice about applying statistical methods to understand variation, reducing the over-emphasis on control charts.
- Added new practices about managing performance and performing root cause analysis.
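Although V1.3 de-emphasizes control charts, they remain one of the statistical techniques a project might apply when monitoring subprocess performance. A sketch of an individuals (XmR) chart, using the standard 2.66 moving-range factor and hypothetical data:

```python
import statistics

def xmr_limits(samples):
    """Natural process limits for an individuals (XmR) chart,
    derived from the average moving range (center ± 2.66 * mR-bar)."""
    center = statistics.mean(samples)
    moving_ranges = [abs(b - a) for a, b in zip(samples, samples[1:])]
    mr_bar = statistics.mean(moving_ranges)
    return center - 2.66 * mr_bar, center, center + 2.66 * mr_bar

# Hypothetical weekly peer-review effort (hours) for one subprocess
data = [5.2, 4.8, 5.5, 5.1, 4.9, 5.3, 5.0]
lower, center, upper = xmr_limits(data)

# Points outside the natural limits signal assignable-cause variation.
outliers = [x for x in data if not lower <= x <= upper]
print(outliers)  # [] here: the subprocess appears stable
```

Whether this or another quantitative technique is appropriate is exactly the selection QPM SP 1.4 asks the project to make.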

79 CMMI Adoption

80 Number of Appraisals Reported to the SEI by Continent
[Chart: number of appraisals by continent]

81 Countries Where Appraisals Have Been Performed and Reported to the SEI
* Red/Italic country name: New additions with this reporting since March 2010

82 CMMI Adoption Has Been Broad
34 countries with more than 10 appraisals (as of Sept 2010): USA, China, India, Japan, Spain, France, Korea (ROK), Brazil, Taiwan, U.K., Mexico, Germany, Argentina, Malaysia, Canada, Egypt, Italy, Colombia, Chile, Thailand, and also Australia, Pakistan, Philippines, Singapore, Israel, Hong Kong, Viet Nam, Turkey, Netherlands, Portugal, Sri Lanka, Ireland, Peru, and Russia.
SCAMPI A appraisals were reported from 71 countries.
Approximately 75% of adopters are commercial organizations (Services 71.1%, Manufacturing 16.3%).
Approximately 2/3 of adopters in the US are contractors for military/government or are government.
An estimated 1.2 million people work in organizations that have had at least one SCAMPI A appraisal since April 2002.

83 CMMI Transition Status Reported to the SEI as of 10-31-10
Training:
- Introduction to CMMI V1.2: 120,838
- Intermediate Concepts of CMMI: 3,238
- Understanding CMMI High Maturity Practices: 636
- Introduction to CMMI V1.2 Supplement for ACQ: 1,325
- Introduction to CMMI V1.2 Supplement for SVC: 2,361
- Introduction to CMMI for Services V1.2: 314
Certifications:
- Introduction to CMMI V1.2 Instructors: 408
- CMMI-ACQ V1.2 Supplement Instructors: 66
- CMMI-SVC V1.2 Supplement Instructors: 131
- Introduction to CMMI for Services V1.2 Instructors: 23
- SCAMPI V1.2 Lead Appraisers: 466
- SCAMPI V1.2 High Maturity Lead Appraisers: 142
- CMMI-ACQ V1.2 Lead Appraisers: 72
- CMMI-SVC V1.2 Lead Appraisers: 147

84 CMMI V1.2 Foreign Language Translation Status Reported to the SEI as of 10-31-2010
CMMI-DEV V1.2:
- Japanese: completed August; Intro course translated October 2007
- Chinese (traditional): completed December 2007
- French: completed August 2008
- German: completed April; Intro course translated October 2009
- Spanish: completed June 2009
- Portuguese: completed May 2010
CMMI-ACQ V1.2:
- Chinese (traditional): completed April 2009
CMMI-SVC V1.2:
- Chinese (traditional): completed July 2010
- Arabic: to start, pending agreement
CMMI-DEV V1.3:
- Dutch: underway

85 Number of Appraisals Conducted by Year Reported as of 10-31-10
[Chart: appraisals conducted per year]

86 Continuous Improvement
Certification of our process improvement professionals is appropriate and achievable:
- Lead Appraisers first
- Instructors next
- CMMI consultants/practitioners in the future
The improved architecture allows process improvement model expansion. Extensions of the lifecycle:
- allow coverage of more of the enterprise or potential partnering organizations
- adapt model features to fit non-developmental efforts

87 Transition…
- We are providing an on-line upgrade course for V1.3, as we did for V1.2. Users make the transition by taking the upgrade course.
- For a period of one year, organizations may use either V1.2 or V1.3 models for their appraisals.
- Appraisals using V1.2 models will be valid for the full 3 years.

