Process Assessment: The SEI Capability Maturity Model


Process Assessment
- Motivation
- SEI Capability Maturity Model
- Maturity assessment process
- Case studies
- Critique
4/21/2017 © 2007, Spencer Rugaber

Motivation
- Risk reduction
- Quality improvement
- Productivity increase

Capability Maturity Model (CMM)
- Developed by the Software Engineering Institute (SEI) with Department of Defense (DoD) funding
- Designed for large organizations doing routine development
- Assessment and evaluation: what is the difference?
- Five levels of maturity
- Key processes

CMM, CMMI, and CMMI for Development
- CMM: the original maturity model
- CMMI (CMM Integration): generalization of CMM to different kinds of products and activities (software, services, acquisitions)
- CMMI for Development: evolution of the original CMM with respect to software

The SEI Process Maturity Model (level: characteristic; key problem areas)
- Optimizing: process improvement; automation
- Managed (quantitative): process measured; changing technology, problem analysis, problem prevention
- Defined (qualitative): process defined and institutionalized; process measurement, process analysis, quantitative quality plans
- Repeatable (intuitive): process dependent on individuals; training, technical practice reviews, testing, process focus, standards, process groups
- Initial (ad hoc / chaotic): project management, project planning, configuration management, software quality assurance
Results: productivity, quality, risk
(source: SEI)

Levels
- Level 1 (Initial): unstable; dependent on individuals
- Level 2 (Repeatable): policies; use of experience in planning; discipline
- Level 3 (Defined): documented process; process group; readiness and completion criteria
- Level 4 (Managed): quantitative goals; data collection
- Level 5 (Optimized): continuous process improvement

Level 1: Initial Process
- Ill-defined inputs; cost and schedule overruns
- Undefined process; no repeatability
- Simple metrics of size and staff effort
- Baseline for later comparison

Level 2: Repeatable Process
- Identified process inputs, outputs, and constraints
- No knowledge of how outputs are produced
- Measures of size: lines of code (LOC), function points, object and method counts; requirements volatility
- Extent of personnel experience determines success: domain/applications, development architecture, tools/methods, overall years of experience, turnover
- Key areas: requirements management, project planning, project tracking, subcontract management, QA, change management

Level 3: Defined Process
- Activities with definitions and entry/exit criteria
- Measures of requirements complexity, design modules, code complexity, test paths, pages of documentation
- Software Engineering Process Groups (SEPGs)
- Quality metrics: defects discovered, error density for each activity area
- Key areas: organizational process definition, training program, integrated management, product engineering, intergroup coordination, peer reviews
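The error-density metric mentioned above is simple arithmetic: defects discovered divided by the size of the artifact inspected. A minimal sketch, with defect counts and code sizes invented purely for illustration:

```python
# Sketch of the Level 3 "error density" quality metric:
# defects discovered per KLOC, computed for each activity area.
# All counts below are hypothetical, not data from the slides.

def error_density(defects, kloc):
    """Defects found per thousand lines of code (KLOC)."""
    return defects / kloc

activities = {
    "design review": (24, 12.0),   # (defects discovered, KLOC inspected)
    "code review":   (58, 12.0),
    "unit test":     (31, 12.0),
}
for area, (defects, kloc) in activities.items():
    print(f"{area}: {error_density(defects, kloc):.1f} defects/KLOC")
```

Tracking this number per activity area is what lets a Level 3 organization compare, say, the yield of design reviews against code reviews.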

Level 4: Managed Process
- Feedback from early activities is used to set priorities for later stages
- Data collected: process type; extent of reuse (production and consumption); when defects are detected; testing completion criteria; use of configuration management; change control; traceability links; module completion rate
- Key areas: process measurement and analysis, quality management

Level 5: Optimizing Process
- Measures of activities are used to change the process
- Analogy with Statistical Process Control (SPC)
- Key areas: defect prevention, technology innovation, process change management
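The SPC analogy can be made concrete: a process measure (for example, defects found per review) is considered "in control" while it stays within three standard deviations of its historical baseline, and excursions trigger process investigation. A sketch of that idea, with an invented baseline:

```python
# Sketch of the Statistical Process Control analogy: a process measure
# is "in control" while it stays within mean +/- 3 standard deviations
# of its historical baseline. The baseline data here is hypothetical.
from statistics import mean, stdev

def control_limits(baseline):
    m, s = mean(baseline), stdev(baseline)
    return m - 3 * s, m + 3 * s

def out_of_control(baseline, observations):
    """Return the observations falling outside the control limits."""
    lo, hi = control_limits(baseline)
    return [x for x in observations if not (lo <= x <= hi)]

baseline = [4, 5, 6, 5, 4, 6, 5, 5]          # defects per review, historical
print(out_of_control(baseline, [5, 6, 14]))  # -> [14]
```

At Level 5, an excursion like the `14` above would prompt causal analysis and, potentially, a process change.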

Assessment Process
- Selection of assessment team
- Management commitment: assessment agreement
- Preparation: training; survey questionnaire of key practices
- Assessment: questionnaire analysis; discussions with projects and functional-area representatives; findings; feedback; presentation
- Report
- Follow-up: action plan; reassessment after 18 months

Assessment Details - 1
- Assessment team has 6-8 members, some internal and some external (from either SEI or a vendor)
- Team members have more than 10 years of experience; the team leader has more than 20
- The assessment itself takes 3-5 days
- 78 yes/no questions
- Hurdle (binary) scoring
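Hurdle scoring means a level is awarded only if every question associated with it (and with all lower levels) is answered "yes"; a single "no" caps the rating. A minimal sketch of that rule, where the grouping of questions by level is hypothetical rather than the actual 78-question SEI instrument:

```python
# Sketch of hurdle ("binary") scoring. Each yes/no question is assumed
# to be tagged with the maturity level it supports; the mapping and the
# example answers are hypothetical, not the real SEI questionnaire.

def maturity_level(answers_by_level):
    """answers_by_level maps level (2..5) to a list of yes/no booleans.
    An organization reaches a level only if it clears every question
    (the "hurdle") at that level and at all levels below it."""
    level = 1  # every organization is at least Level 1
    for lvl in sorted(answers_by_level):
        if all(answers_by_level[lvl]):
            level = lvl
        else:
            break  # one "no" stops the climb; higher levels don't count
    return level

# Example: all Level 2 questions pass, but one Level 3 question fails.
answers = {2: [True, True, True], 3: [True, False], 4: [True]}
print(maturity_level(answers))  # -> 2
```

Note how the `break` embodies the hurdle: passing Level 4 questions is irrelevant while a Level 3 question fails.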

Assessment Details - 2
- Four or five projects are examined per organization
- Interviews with 8-10 functional area representatives (FARs) from each area: QA, integration testing, coding and unit test, requirements and design
- The implementation process takes 12-18 months, with a follow-up at the end of this time

CMM Key Practices
- Level 5: process change management; technology innovation; defect prevention
- Level 4: quality management; process measurement and analysis
- Level 3: peer reviews; intergroup coordination; software product engineering; integrated software management; training program; organization process definition; organization process focus
- Level 2: software configuration management; software quality assurance; software subcontract management; software project tracking and oversight; software project planning; requirements management
- Level 1: no key practices

CMMI Process Areas
- Causal Analysis and Resolution (CAR)
- Configuration Management (CM)
- Decision Analysis and Resolution (DAR)
- Integrated Project Management +IPPD (IPM+IPPD)
- Measurement and Analysis (MA)
- Organizational Innovation and Deployment (OID)
- Organizational Process Definition +IPPD (OPD+IPPD)
- Organizational Process Focus (OPF)
- Organizational Process Performance (OPP)
- Organizational Training (OT)
- Product Integration (PI)
- Project Monitoring and Control (PMC)
- Project Planning (PP)
- Process and Product Quality Assurance (PPQA)
- Quantitative Project Management (QPM)
- Requirements Development (RD)
- Requirements Management (REQM)
- Risk Management (RSKM)
- Supplier Agreement Management (SAM)
- Technical Solution (TS)
- Validation (VAL)
- Verification (VER)

Example Questionnaire Area
2.3 Data Management and Analysis
Data management deals with the gathering and retention of process metrics. It requires standardized data definitions, data management facilities, and a staff to ensure that data is promptly obtained, properly checked, accurately entered into the database, and effectively managed. Analysis deals with the subsequent manipulation of the process data to answer questions such as, "Is there a relatively high correlation between error densities found in test and those found in use?" Other types of analysis can assist in determining the optimum use of reviews and resources, the tools most needed, testing priorities, and needed education.

Example Questions
2.3.1 Has a managed and controlled process database been established for process metrics data across all projects?
2.3.2 Are the review data gathered during design reviews analyzed?
2.3.3 Is the error data from code reviews and tests analyzed to determine the likely distribution and characteristics of the errors remaining in the product?
2.3.4 Are analyses of errors conducted to determine their process-related causes?
2.3.5 Is a mechanism used for error cause analysis?
2.3.6 Are the error causes reviewed to determine the process changes required to prevent them?
2.3.7 Is a mechanism used for initiating error prevention actions?
2.3.8 Is review efficiency analyzed for each project?
2.3.9 Is software productivity analyzed for major process steps?

Case Study: Hughes Aircraft - 1
- 500 people in part of one division
- 1987: Level 2; 1990: Level 3
- $45K assessment cost; $400K improvement cost; $2M savings
- 2% increased overhead; 18-month implementation (78 staff-months); 5x improvement in expenditure estimation
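Taken at face value, the Hughes figures imply a payback of roughly 4.5 to 1 on the combined assessment and improvement spend. The arithmetic:

```python
# Rough return-on-investment arithmetic for the Hughes figures above,
# treating the reported numbers at face value.
assessment_cost  = 45_000
improvement_cost = 400_000
savings          = 2_000_000

total_cost = assessment_cost + improvement_cost
print(f"total cost:   ${total_cost:,}")              # -> total cost:   $445,000
print(f"savings/cost: {savings / total_cost:.1f}x")  # -> savings/cost: 4.5x
```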

Case Study: Hughes Aircraft - 2
- Major 1987 recommendations: central data repository; process group; more involvement in the requirements process; technology transition organization
- Major 1990 recommendations: more division-wide data analysis; opportunities for automation

Early Results
- Problem areas: error projection; test and review coverage; process metrics database; design and code reviews; software engineering training; software engineering process group; project planning; change control and CM; regression testing
- Software process maturity distribution (in quartiles): 2%, 12%, 28%, 28%, 21%, 9% (Source: IEEE Software)

More Recent Results
CMMI maturity profile, 2005

CMM Benefits
- Level 2 leads to superior product quality
- CMM encapsulates industry best practices
- DoD sponsorship has enforced process improvement throughout the defense community
- The quality movement has led to CMM being quite widely used in other sectors
- Enhanced understanding of the development process
- Increased control and risk reduction

Benefits - 2
- Migration path to a more mature process
- More accurate cost estimation and scheduling
- Objective evaluation of changes in tools and techniques
- Standardized training
- Marketing

CMM Criticisms
- Lots of room for interpretation of the assessment rules
- Purpose and potential misuse of the model: originally for self-assessment and organizational learning; increasingly used by DoD for contractor evaluation and qualification
- Tends to ignore the different needs of different development environments: emphasis on DoD contractual development; emphasis on big, mission-critical projects; de-emphasis of design risk; de-emphasis of satisfying customer requirements