
1 / x CMMI Measurement & Analysis Pieter Cailliau Stijn De Vos Measurement & Analysis

2 / x Measurement & Analysis Purpose  CMMI for Development v1.2: The purpose of Measurement and Analysis (MA) is to develop and sustain a measurement capability that is used to support management information needs. In other words:  you need all kinds of information in the other process areas,  and you work out what you can measure and analyze in order to obtain that information.

3 / x Measurement & Analysis Specific Goals  SG 1 Align Measurement and Analysis Activities n SP1.1 Establish measurement objectives that are derived from identified information needs and objectives. Sources of information needs/objectives: project plans (e.g. architecture, scenarios, …), review meetings (feedback), experiences of other projects or organizational entities. Objectives: “Why are we measuring?” Reduce time to delivery Reduce total lifecycle cost Deliver specified functionality Improve prior customer satisfaction ratings

4 / x Measurement & Analysis Specific Goals  SG 1 Align Measurement and Analysis Activities n SP1.1 Establish measurement objectives (ctd.) Only one group has documented measurement objectives on their wiki, but all groups do have “unwritten” objectives: 4/4 timesheet objectives 4/4 testing objectives 2/4 code readability [17] TultiMouch MA objectives [10] Hadra code readability guidelines

5 / x Measurement & Analysis Specific Goals  SG 1 Align Measurement and Analysis Activities n SP1.2 Specify measures to address the measurement objectives Timesheets: each task is measured in hours/minutes Testing: class coverage (#unit_tests/#classes) method coverage (#test_methods/#methods) line coverage (#test_lines/#code_lines) Code readability: count of coding warnings/errors [1] WAFL Timesheet [2] Hadra Tickets on trac
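The three coverage measures listed above are simple quotients of project counts. A minimal sketch of how they could be computed; the counts used in the example are hypothetical and not taken from any group's actual project:

```python
def coverage_ratios(n_unit_tests, n_classes,
                    n_test_methods, n_methods,
                    n_test_lines, n_code_lines):
    """Compute the three testing measures from the slide:
    class, method, and line coverage."""
    return {
        "class_coverage": n_unit_tests / n_classes,
        "method_coverage": n_test_methods / n_methods,
        "line_coverage": n_test_lines / n_code_lines,
    }

# Hypothetical counts, for illustration only.
ratios = coverage_ratios(40, 50, 120, 200, 800, 2000)
print(ratios)  # class 0.8, method 0.6, line 0.4
```

In practice the groups obtained these numbers from their tooling (PHPUnderControl/CruiseControl, rcov) rather than computing them by hand.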

6 / x Measurement & Analysis Specific Goals  SG 1 Align Measurement and Analysis Activities n SP1.3 Specify Data Collection and Storage Procedures Timesheets: only one group has specified a procedure for billing their work; all groups have “unwritten” procedures  1 group uses txt files (individually updated) in a dropbox  2 groups store their tickets on a trac  1 group uses txt files on their SVN Testing: all groups have “unwritten” procedures  all groups write unit tests individually  PHPUnderControl/CruiseControl, Hudson, rcov  all groups: other tests by the Test Manager Code readability: one group automated this with PHP Code Sniffer [14] TultiMouch Billing procedure [16] Mashedup List of JUnits
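One possible shape for the "txt files in a dropbox/on SVN" timesheet procedure is a line-per-entry text format that can be summed per task. The `task; hours` layout below is an assumption for illustration only; the groups' actual timesheet templates differ (see assets [1] and [3]):

```python
from collections import defaultdict

def parse_timesheet(lines):
    """Sum hours per task from lines of the assumed form 'task; hours'.
    Blank lines and '#' comments are skipped."""
    totals = defaultdict(float)
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        task, hours = line.rsplit(";", 1)
        totals[task.strip()] += float(hours)
    return dict(totals)

sheet = [
    "# week 12",
    "unit tests; 3.5",
    "review meeting; 1",
    "unit tests; 2",
]
print(parse_timesheet(sheet))  # {'unit tests': 5.5, 'review meeting': 1.0}
```

A script like this is one way a group could turn individually updated text files into aggregate workload data for the review meetings.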

7 / x Measurement & Analysis Specific Goals  SG 1 Align Measurement and Analysis Activities n SP1.4 Specify Analysis Procedures All groups have test managers, but no formal analysis procedure; test analyses are usually generated automatically (trac, rcov, CruiseControl)

8 / x Measurement & Analysis Specific Goals  SG 2 Provide Measurement Results n SP2.1 Collect Measurement Data Timesheets: filled in individually Testing: test code committed individually; measurement data generated by tools (PHPUnderControl/CruiseControl, rcov, …) Code readability: automated (PHP CodeSniffer, …) [4] WAFL Coverage single unit test [5] WAFL Coverage unittests; different categories [6] Hadra Coverage unittests [7] Hadra Coverage functionals [8] WAFL Overview of unittests [9] WAFL PHP CodeSniffer results

9 / x Measurement & Analysis Specific Goals  SG 2 Provide Measurement Results n SP2.2 Analyze Measurement Data Timesheets: analysis and interpretation of the workload is straightforward. The duration of a task can be linked to a risk:  longer than expected -> underestimated risk  much effort spent on a task -> may be justified by its high risk Missed deadlines: e.g. a task took much longer than expected, which caused a deadline to be missed. Testing: results are interpreted by the test managers. Code readability: results are interpreted by the test managers; it is up to them to decide what is acceptable and what is not.
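The "longer than expected -> underestimated risk" interpretation above can be expressed as a simple check over the timesheet data. A sketch; the 1.5x tolerance ratio is an assumed threshold, not a value any group formalized:

```python
def flag_underestimated(tasks, ratio=1.5):
    """Return the tasks whose actual duration exceeds the estimate
    by more than `ratio`, signalling a possibly underestimated risk.

    `tasks` maps task name -> (estimated_hours, actual_hours).
    """
    return [name for name, (est, act) in tasks.items()
            if act > ratio * est]

tasks = {
    "write parser": (4, 10),  # 2.5x the estimate -> flagged
    "fix CSS": (2, 2.5),      # within tolerance
}
print(flag_underestimated(tasks))  # ['write parser']
```

The flagged tasks are exactly the ones the slide suggests discussing: either the risk was underestimated, or the extra effort was justified by a known high risk.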

10 / x Measurement & Analysis Specific Goals  SG 2 Provide Measurement Results n SP2.3 Store Data and Results Timesheets: 1 group stores timesheets in a dropbox; the other groups use trac Testing: 1 group uses PHPUnderControl/CruiseControl; the other groups use trac Code readability: PHPUnderControl/CruiseControl [3] WAFL Timesheets in dropbox [2] Hadra Tickets on trac [4] WAFL Coverage single unit test [5] WAFL Coverage unittests; different categories [6] Hadra Coverage unittests [7] Hadra Coverage functionals [8] WAFL Overview of unittests

11 / x Measurement & Analysis Specific Goals  SG 2 Provide Measurement Results n SP2.4 Communicate Results Discussed at review meetings Most analyses are available to all team members on trac and PHPUnderControl/CruiseControl [12] TultiMouch Work pressure on trac [13] TultiMouch Statistics of tickets


13 / x Measurement & Analysis Generic Practices GG 2  GP2.1 Establish an Organizational Policy n One group: the process manager, in cooperation with the test/project managers, generates the processes n Other groups: agreements made during the meetings; these can be found in the meeting reports, in the list of key decisions, or on the wiki  GP2.2 Plan the Process n Most groups have wiki pages with processes/tutorials

14 / x Measurement & Analysis Generic Practices GG 2 (2)  GP2.3 Provide Resources n All groups have tools installed on their server (rcov, …)  GP2.4 Assign Responsibility n All groups count on individual responsibility and social control n Every group has Test Managers  GP2.5 Train People n Tutorials on the wiki [11] WAFL Wiki page with training tutorial [15] TultiMouch Manual for stress test

15 / x Measurement & Analysis Generic Practices GG 2 (3)  GP2.6 Manage Configurations n Testing and code readability: mainly managed by the PHPUnderControl/CruiseControl or trac server, which collects/generates the data and presents the results  GP2.7 Identify and Involve Relevant Stakeholders n Whole group, clients, professor  GP2.8 Monitor and Control the Process n One group: process descriptions on trac, evaluated by the PM n All groups: testing by the Test Managers

16 / x Measurement & Analysis Generic Practices GG 2 (4)  GP2.9 Objectively Evaluate Adherence n Not performed  GP2.10 Review Status with Higher Level Management n Not performed; perhaps indirectly at review meetings

17 / x Measurement & Analysis Generic Practices GG 3 GG 3 Institutionalize a Defined Process The process is institutionalized as a defined process. GP 3.1 Establish a Defined Process Establish and maintain the description of a defined measurement and analysis process. [WAFL]: e.g. all PHP projects should use PHPUnit, PHPUnderControl/CruiseControl and the PEAR coding standard. [TultiMouch]: e.g. all work done should be ticketed on the trac immediately. GP 3.2 Collect Improvement Information Collect work products, measures, measurement results, and improvement information derived from planning and performing the measurement and analysis process to support the future use and improvement of the organization’s processes and process assets. Examples: bad experience with certain tools should lead to looking for better tools; e.g. WAFL should replace the timesheets in the dropbox with the built-in timesheet functionality of trac. WAFL also has a separate wiki, which could be replaced by the wiki in trac.

18 / x Measurement & Analysis Process Assets  [1] Example of a timesheet template, WAFL

19 / x Measurement & Analysis Process Assets  [2] Time spent per ticket in trac, Hadra

20 / x Measurement & Analysis Process Assets  [3] Example of the timesheets in the dropbox, WAFL

21 / x Measurement & Analysis Process Assets  [4] Coverage of a single unit test, WAFL

22 / x Measurement & Analysis Process Assets  [5] Coverage of the unittests for the different categories, WAFL

23 / x Measurement & Analysis Process Assets  [6] Coverage of the unittests, Hadra

24 / x Measurement & Analysis Process Assets  [7] Coverage of the functionals, Hadra

25 / x Measurement & Analysis Process Assets  [8] Overview of the unittests: class, method name, succeeded or not, elapsed time, WAFL

26 / x Measurement & Analysis Process Assets  [9] Results published by PHP CodeSniffer on the PHPUnderControl/CruiseControl server, WAFL

27 / x Measurement & Analysis Process Assets  [10] Code readability guidelines (without metrics), Hadra

28 / x Measurement & Analysis Process Assets  [11] Example of training: wiki page that explains how to write PHP unit tests that will automatically be used for MA purposes, WAFL

29 / x Measurement & Analysis Process Assets  [12] Work pressure balance of all the team members, TultiMouch

30 / x Measurement & Analysis Process Assets  [13] Statistics showing which part of the milestone has been reached, how many tickets were closed each day, …, TultiMouch

31 / x Measurement & Analysis Process Assets  [14] Manual describing the process of how to ticket work, TultiMouch

32 / x Measurement & Analysis Process Assets  [15] Manual for the stress test, TultiMouch

33 / x Measurement & Analysis Process Assets  [16] List of source files with JUnit tests, Mashedup

34 / x Measurement & Analysis Process Assets  [17] MA objectives, TultiMouch

35 / x Measurement & Analysis Findings  Strengths n Data collection is performed well (integrated tools)  Opportunities for Improvement n Most of the measurement is performed, but almost nothing (objectives, processes, …) is written down n Analysis is hardly performed formally  Proposed Actions n Perform analysis and act on the results! n Write the objectives down

36 / x Measurement & Analysis References  CMMI for Development version  Software Quality Assurance within the CMMI framework quality-assurance.org/cmmi-measurement-and-analysis.htmlhttp:// quality-assurance.org/cmmi-measurement-and-analysis.html  CMMI browser Measurement & Analysis