Cmpe 589 Spring 2006

PROCESS ASSESSMENT
- Definitions and motivation
- Software process models
- SEI Capability Maturity Model
- Maturity assessment process
- Experience

MOTIVATION
- Risk reduction
- Quality improvement
- Productivity increase

Capability Maturity Model (CMM)
- Developed by the Software Engineering Institute (SEI) with DoD funding
- Designed for large organizations doing routine development
- Assessment AND evaluation: what's the difference?
- Five levels of maturity
- Key processes

THE SEI PROCESS MATURITY MODEL
Level | Characteristic | Key Problem Areas
Optimizing | Process improvement | Automation
Managed (quantitative) | Process measured | Changing technology; problem analysis; problem prevention
Defined (qualitative) | Process defined and institutionalized | Process measurement; process analysis; quantitative quality plans
Repeatable (intuitive) | Process dependent on individuals | Training; technical practices (reviews, testing); process focus (standards, process groups)
Initial (ad hoc / chaotic) | Ad hoc, chaotic process | Project management; project planning; configuration management; software quality assurance
Results of climbing the levels: productivity and quality rise, risk falls

Levels
- Level 1: Initial - unstable; dependent on individuals
- Level 2: Repeatable - policies; use of experience in planning; discipline
- Level 3: Defined - documented process; process group; readiness and completion criteria
- Level 4: Managed - quantitative goals; data collection
- Level 5: Optimizing - continuous process improvement

LEVEL 1: Initial Process
- Ill-defined inputs; cost and schedule overruns
- Undefined process; no repeatability
- Simple metrics of size, staff effort
- Baseline for later comparison

LEVEL 2: Repeatable Process
- Identified process inputs, outputs, and constraints; no knowledge of how outputs are produced
- Measures of size: lines of code (LOC), function points, object and method counts
- Requirements volatility
- Extent of personnel experience determines success: domain / applications, development architecture, tools / methods, overall years of experience, turnover
- Key areas: requirements management, project planning, project tracking, subcontract management, QA, CM

LEVEL 3: Defined Process
- Activities with definitions and entry / exit criteria
- Measures of requirements complexity, design modules, code complexity, test paths, pages of documentation
- Software Engineering Process Groups (SEPGs)
- Quality metrics: failures discovered, failure density for each activity area
- Key areas: organizational process definition, training program, integrated management, product engineering, intergroup coordination, peer reviews

LEVEL 4: Managed Process
- Feedback from early activities is used to set priorities for later stages
- Data collected: process type, extent of reuse (production and consumption), when defects are detected, testing completion criteria, use of configuration management, change control, traceability links, module completion rate
- Key areas: process measurement and analysis, quality management

LEVEL 5: Optimizing Process
- Measures of activities are used to change the process
- Analogy with Statistical Process Control (SPC); see the sketch below
- Key areas: defect prevention, technology innovation, process change management
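To make the SPC analogy concrete, here is a minimal sketch (all defect-density numbers are invented for illustration) that derives mean and three-sigma control limits from a historical baseline and flags later releases that fall outside them:

    # SPC-style sketch with invented data: establish control limits from a
    # historical baseline, then flag later releases that fall outside them.
    from statistics import mean, stdev

    baseline = [5.1, 4.8, 5.6, 5.0, 4.9, 5.3]   # defects/KLOC for earlier releases (assumed)
    recent = {"R7": 5.2, "R8": 8.4}              # releases to check (assumed)

    m, s = mean(baseline), stdev(baseline)
    upper, lower = m + 3 * s, max(m - 3 * s, 0.0)
    print(f"Baseline mean {m:.2f} defects/KLOC, control limits [{lower:.2f}, {upper:.2f}]")

    for release, density in recent.items():
        in_control = lower <= density <= upper
        status = "in control" if in_control else "out of control; investigate the process"
        print(f"{release}: {density} defects/KLOC ({status})")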

Assessment
- Selection of assessment team; management commitment; assessment agreement
- Preparation: training, survey questionnaire
- Assessment: questionnaire analysis; discussions with projects and functional area representatives; findings; feedback; presentation
- Report
- Follow-up: action plan, reassessment after 18 months

Assessment Details - 1
- Assessment team has 6-8 members, some internal, some external (either SEI or a vendor)
- Team members have > 10 years' experience; team leader has > 20 years' experience
- Assessment itself takes 3-5 days
- 78 YES / NO questions

Assessment Details - 2
- Four or five projects are examined per organization
- Interviews with 8-10 functional area representatives (FARs) from each area: QA, integration testing, coding and unit test, requirements and design
- Implementation process takes 12-18 months, with follow-up at the end of this time

Questionnaire - 1
2.3 Data Management and Analysis
Data management deals with the gathering and retention of process metrics. It requires standardized data definitions, data management facilities, and a staff to ensure that data is promptly obtained, properly checked, accurately entered into the database, and effectively managed. Analysis deals with the subsequent manipulation of the process data to answer questions such as, "Is there a relatively high correlation between error densities found in test and those found in use?" Other analyses can help determine the optimum use of reviews and resources, the tools most needed, testing priorities, and needed education.
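As an illustration of the kind of analysis the question above asks for, the following sketch (module names and counts are made up; requires Python 3.10+ for statistics.correlation) correlates defect densities found in test with those found in use:

    # Illustrative only: correlate test-phase defect density with field defect
    # density across modules (all names and counts below are invented).
    from statistics import correlation  # Python 3.10+

    # (module, defects found in test, defects found in use, KLOC)
    modules = [
        ("parser",     14, 5, 3.2),
        ("scheduler",   9, 2, 2.1),
        ("ui",         22, 9, 6.4),
        ("reports",     6, 1, 1.8),
        ("network",    17, 7, 4.0),
    ]

    test_density = [test / kloc for _, test, _, kloc in modules]
    use_density = [use / kloc for _, _, use, kloc in modules]

    r = correlation(test_density, use_density)
    print(f"Correlation between test and field error densities: r = {r:.2f}")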

Questionnaire - 2
2.3.1 Has a managed and controlled process database been established for process metrics data across all projects?
2.3.2 Are the review data gathered during design reviews analyzed?
2.3.3 Is the error data from code reviews and tests analyzed to determine the likely distribution and characteristics of the errors remaining in the product?
2.3.4 Are analyses of errors conducted to determine their process-related causes?
2.3.5 Is a mechanism used for error cause analysis?
2.3.6 Are the error causes reviewed to determine the process changes required to prevent them?
2.3.7 Is a mechanism used for initiating error prevention actions?
2.3.8 Is review efficiency analyzed for each project?
2.3.9 Is software productivity analyzed for major process steps?

Case Study: Hughes Aircraft - 1
- 1987: Level 2; 1990: Level 3
- $45K assessment cost; $400K improvement cost; $2M savings
- 2% increased overhead; 18 months' implementation (78 staff-months)
- 5x improvement in expenditure estimation
- 500 people in part of one division

Case Study: Hughes Aircraft - 2
- Major 1987 recommendations: central data repository, process group, more involvement in the requirements process, technology transition organization
- Major 1990 recommendations: more division-wide data analysis; opportunities for automation

CMM Benefits
- Level 2 leads to superior product quality
- CMM encapsulates industry best practices
- DoD sponsorship has enforced process improvement throughout the defense community
- The quality movement has led to CMM being quite widely used in other sectors
- Enhanced understanding of the development process
- Increased control and risk reduction
- Migration path to a more mature process
- More accurate cost estimation and scheduling
- Objective evaluations of changes in tools and techniques
- Standardized training
- Marketing

CMM Criticisms
- Lots of room for interpretation of assessment rules
- Purpose and potential misuse of the model: originally for self-assessment and organizational learning, increasingly used by DoD for contractor evaluation and qualification
- Tends to ignore the different needs of different development environments: emphasis on DoD contractual development; emphasis on big, mission-critical projects
- Deemphasis of design risk
- Deemphasis of satisfaction of customer requirements

SEI SURVEY (1990: 113 projects)
- 85% at Level 1
- 14% at Level 2
- 1% at Level 3

Process Improvement

Process Mistakes (McConnell)
- Overly optimistic schedules
- Insufficient risk management
- Contractor failure
- Insufficient planning
- Abandonment of planning under pressure
- Wasted time during the fuzzy front end
- Shortchanged upstream activities

More Process Mistakes
- Inadequate design
- Shortchanged quality assurance
- Insufficient management controls
- Premature or overly frequent convergence
- Omitting necessary tasks from estimates
- Planning to catch up later
- "Code-like-hell" programming

McConnell Fixes
- Identify and fix classic mistakes
- Miniature milestones; milestone-based schedule
- Track the schedule meticulously; record reasons for missed milestones
- Recalibrate early and often
- Don't commit to a new schedule until you can do so meaningfully
- Manage risks painstakingly

Process Improvement (Shewhart Cycle: Plan, Do, Check, Act)
- Plan: What and why?
- Do: How, when, and how much?
- Check: How will you know it worked?
- Act: How do you plan to fully adopt?

The Dilbert Cycle (ACT, ACT, ACT)
- Make a wild guess at what is wrong
- Adopt an unproven process or technology
- Blame someone else for the catastrophe

Plan
- Identify problem or opportunity
- Vision: Where do you want to be?
- Describe the current process: Is the process in control? Is it repeatable? Identify possible weaknesses
- Strategic impact: cost / benefit analysis, how the effort supports core competence, risk of not doing it

Do
- Determine what changes might help
- Develop or purchase an evolution plan
- Implementation tactics: major deliverables, organization maturity considerations, key role of infrastructure, coordination with other groups, timing to match user projects, hand-off criteria
- Select and implement a change on a pilot project
- People and resources: sponsor and champion roles, user partners, people and expertise needed, hardware, software, space
- Schedule risks, contingency plans

Check
- Measure the effect of the change: baseline measurement, prechange environmental characteristics, expected effect
- Postproject review, degree of adoption: percentage of engineers using the improvement, range of uses, maturity of usage

Act
- Adopt across the organization
- Support strategy: infrastructure changes, documentation, training, consulting, packaging, maintenance, feedback
- Continue the cycle

Metrics and Measurement
"Get the data first, then distort it with your judgment." --Mark Twain
"You can’t control what you can’t measure." --Tom DeMarco
"If you don’t know where you are going, any map will do. If you don’t know where you are, a map won’t help." --Watts Humphrey

Getting the data
- How do we decide what data to collect?
- Goal, Question, Metric (GQM) (Basili)
  - Goal, e.g., reduce defects to 6/KLOC in code
  - Question, e.g., what is the defect rate now?
  - Metric, e.g., defect count per KLOC (see the sketch below)
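A minimal sketch of the metric step, using invented counts, is shown below; it simply computes defects per KLOC and compares the result against the stated goal of 6 defects/KLOC:

    # GQM sketch (numbers are invented).
    # Goal:     reduce defects to 6/KLOC in code
    # Question: what is the defect rate now?
    # Metric:   defect count per KLOC
    GOAL_DEFECTS_PER_KLOC = 6.0

    defects_found = 57        # defects logged against the code base (assumed)
    lines_of_code = 8_400     # size of the code base in source lines (assumed)

    density = defects_found / (lines_of_code / 1000)
    print(f"Current defect density: {density:.1f} defects/KLOC (goal: {GOAL_DEFECTS_PER_KLOC})")
    if density > GOAL_DEFECTS_PER_KLOC:
        print("Goal not yet met; continue collecting and analyzing defect data.")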

Key Metrics
- Lines of code
- Staff months
- Calendar months
- Defects

Defect Analysis
- Origin (where): requirements / specification, design, code, environment support, documentation
- Type (what): see the defect types on the next slide
- Mode (why): missing, unclear, wrong, changed, better way

Defect Types
- Requirements / analysis: functionality
- Specification / design: HW / SW / user interfaces, functional description
- Design: communication, data definition, module design, logic description, error checking, standards
- Code: logic, computation, data handling, module interface implementation, standards
- Environment: test SW, test HW, development tools, integration SW
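One way to make the origin / type / mode scheme above concrete is a small defect record like the following sketch (the field values and sample defects are invented for illustration, not taken from the course material):

    # Illustrative defect record for the origin / type / mode classification.
    from collections import Counter
    from dataclasses import dataclass

    @dataclass
    class Defect:
        origin: str  # where: requirements, design, code, environment, documentation
        dtype: str   # what:  e.g. logic, data definition, module interface
        mode: str    # why:   missing, unclear, wrong, changed, better way

    # A few invented defects for the example.
    defects = [
        Defect("code", "logic", "wrong"),
        Defect("design", "data definition", "missing"),
        Defect("requirements", "functionality", "unclear"),
        Defect("code", "data handling", "wrong"),
    ]

    # Tally by origin to see where defects are being injected.
    for origin, count in Counter(d.origin for d in defects).most_common():
        print(f"{origin}: {count}")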

Root Cause Analysis
- Determination of the underlying process deficiency that causes a class of product defects
- Collect defect data
- Determine causes
- Organizational buy-in (tie to business goals)
- Assign responsibility
- Variations: one-shot, postproject, continuous

One-Shot Root Cause Analysis (requires prior defect data collection but no prior causal analysis)
- Train on defect analysis; indicate goals
- Select 50-75 defects randomly; classify; pie chart (see the sketch below)
- Pick top two defect types; fishbone diagram
- Improvement recommendations
- Present results; assign responsibility; implement change
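The selection and classification steps can be sketched as below; the defect log and category labels are simulated stand-ins, so this only illustrates the mechanics of sampling 50-75 defects and picking the top two types:

    # Sketch of the one-shot selection step (simulated defect log).
    import random
    from collections import Counter

    random.seed(1)
    categories = ["logic", "interface", "data handling", "requirements", "build"]
    defect_log = [random.choice(categories) for _ in range(400)]  # stand-in for a real log

    sample = random.sample(defect_log, k=60)  # 50-75 defects, per the slide
    counts = Counter(sample)

    print("Classification of sampled defects (for the pie chart):")
    for category, count in counts.most_common():
        print(f"  {category}: {count}")

    top_two = [category for category, _ in counts.most_common(2)]
    print("Candidates for fishbone analysis:", top_two)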

Postproject Root Cause Analysis (prior understanding of defect patterns)
- Premeeting: identify the primary business goal
- Data analysis (champion and facilitator); select two defect types for brainstorming
- Invitations to engineers: bring examples of their faults (in the selected categories), guess at the root cause, suggest a prevention mechanism

Meeting
- State meeting goal and rules
- Select issues
- Review defects / causes / solutions brought to the meeting
- Analyze defects (fishbone diagram)
- Brainstorm solutions
- Assure organizational commitment
- Plan for change (responsibilities and dates)

Postmeeting
- Champion and facilitator review the meeting
- Prepare meeting summary
- Champion captures process baseline data: before and after process description and data

Fishbone Diagram (Cause and Effect Diagram)
- Write suggested causes of a class of defects on note cards
- Organize them into affinity groups
- Use lines to indicate "is a cause of" or "is a prerequisite of"
- Spine is the major defect class; major branches are the affinity groups (see the text-only sketch below)
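The structure described above can be captured as a simple mapping from affinity groups (the major branches) to suggested causes; the sketch below uses made-up causes for a hypothetical "module interface defects" spine and prints an indented outline rather than drawing the diagram:

    # Text-only fishbone sketch: spine = defect class, branches = affinity groups,
    # leaves = suggested causes (all example content is invented).
    fishbone = {
        "spine": "Module interface defects",
        "branches": {
            "Process": ["No interface review step", "Late integration"],
            "Documentation": ["Interface specs out of date"],
            "People": ["New staff unfamiliar with conventions"],
            "Tools": ["No automated interface checking"],
        },
    }

    print(f"Defect class: {fishbone['spine']}")
    for group, causes in fishbone["branches"].items():
        print(f"  {group}")
        for cause in causes:
            print(f"    - {cause}")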

Example

Third-Party Audits / Assessments
- CMM: maturity model
- ISO 9001: international standard
- SQPA: business practices; functional model
- QMS: business model
- Baldrige: Department of Commerce; weighted criteria

CMM Capability Levels (figure: level, title, characterized by, achieved when, primary concept)
- V: Continuously Improving - quantitative strategic goals; processes improved; improvement based on data
- IV: Quantitatively Controlled - definition of quantitative goals; process metrics captured; managing processes by data
- III: Well Defined - development of an organizational standard process; projects use the organizational standard process; sharing organizational learning
- II: Planned & Tracked - work is planned and managed; projects use a defined process; controlling local chaos
- I: Performed Informally - essential elements performed; individual heroics
- Not Performed - SE process area not being done; organizational starting point

ISO 9000 Series
- Actually a series of specifications that allow companies to be “certified”
- Doesn't focus on improvement, just the current state
- Addresses minimum criteria for a quality system
- Strongly paper-trail oriented; checklist assessment
- Generic, not software-specific
- Principle: every important process should be documented

Guidelines for Process Improvement
- Incremental rather than big-bang implementation (contrast with “Paradigm Shift”, “BPR”)
- Use data
- Treat root causes, not symptoms
- Workers know best how to do a process
- Avoid consultants

More Guidelines
- Today’s problems come from yesterday’s solutions
- The harder you push, the harder the system pushes back
- The easy way out usually leads back in
- The cure can be worse than the disease

Still More Guidelines
- Faster is slower
- Cause and effect are not closely related in time and space
- Small changes can produce big results, but the areas of highest leverage are the least obvious
- You can have your cake and eat it too, but not at the same time
- Dividing an elephant in half does not produce two small elephants
- "Fix the problem, not the blame"