Slide 1: CMMI Measurement & Analysis
Pieter Cailliau, Stijn De Vos

Slide 2: Purpose
- CMMI for Development v1.2: "The purpose of Measurement and Analysis (MA) is to develop and sustain a measurement capability that is used to support management information needs."
- In other words: the other process areas need all kinds of information, and MA determines what you can measure and analyse in order to obtain that information.

Slide 3: Specific Goals — SG 1 Align Measurement and Analysis Activities
- SP 1.1 Establish measurement objectives that are derived from identified information needs and objectives.
  - Sources of information needs/objectives:
    - Project plans, e.g. architecture, scenarios, ...
    - Feedback from review meetings
    - Experience of other projects or organizational entities
  - Objectives ("Why are we measuring?"):
    - Reduce time to delivery
    - Reduce total lifecycle cost
    - Deliver specified functionality
    - Improve prior customer satisfaction ratings

Slide 4: SG 1 — SP 1.2 Specify measures to address the measurement objectives (ctd)
- Only one group has documented measurement objectives on their wiki, but all groups have "unwritten" objectives:
  - 4/4 groups: timesheet objectives
  - 4/4 groups: testing objectives
  - 2/4 groups: code readability objectives
References: [17] TultiMouch MA objectives; [10] Hadra code readability guidelines

Slide 5: SG 1 — SP 1.2 Specify measures to address the measurement objectives
- Timesheets: each task is measured in hours/minutes
- Testing:
  - class coverage (#unit_tests / #classes)
  - method coverage (#test_methods / #methods)
  - line coverage (#test_lines / #code_lines)
- Code readability: count of coding warnings/errors
References: [1] WAFL timesheet; [2] Hadra tickets on trac
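The three coverage measures above are simple quotients of counts. As a minimal sketch of how a group could compute them (the counts below are made-up illustration, not data from any of the projects):

```python
def coverage(tested: int, total: int) -> float:
    """Return a coverage ratio in [0, 1]; 0.0 when there is nothing to cover."""
    return tested / total if total else 0.0

# Hypothetical counts for one project snapshot.
counts = {
    "class":  (42, 60),       # (#unit_tests, #classes)
    "method": (310, 500),     # (#test_methods, #methods)
    "line":   (7200, 12000),  # (#test_lines, #code_lines)
}

for name, (tested, total) in counts.items():
    print(f"{name} coverage: {coverage(tested, total):.0%}")
```

In practice the groups let their tools (rcov, phpUnderControl/CruiseControl) produce these numbers rather than counting by hand.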

Slide 6: SG 1 — SP 1.3 Specify data collection and storage procedures
- Timesheets: only one group has specified a procedure for billing their work; all groups have "unwritten" procedures:
  - 1 group uses txt files (individually updated) in a Dropbox
  - 2 groups store their tickets on a trac
  - 1 group uses txt files on their SVN
- Testing: all groups have "unwritten" procedures:
  - All groups: unit tests are written individually
  - Tools: phpUnderControl/CruiseControl, Hudson, rcov
  - All groups: other tests are run by the Test Manager
- Code readability: one group has automated this with PHP CodeSniffer
References: [14] TultiMouch billing procedure; [16] Mashedup list of JUnit tests
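The txt-file timesheet procedure could, for instance, be turned into a small collection script. The file layout assumed here (one file per member, one "task;hours" line per entry) is a hypothetical format, not the actual WAFL layout:

```python
from pathlib import Path


def collect_hours(timesheet_dir: str) -> dict[str, float]:
    """Sum the booked hours per team member from per-member timesheet files.

    Assumes one file per member (alice.txt, bob.txt, ...) where each
    line reads 'task description;hours' -- a made-up format for illustration.
    """
    totals: dict[str, float] = {}
    for sheet in Path(timesheet_dir).glob("*.txt"):
        hours = 0.0
        for line in sheet.read_text().splitlines():
            if ";" in line:
                hours += float(line.rsplit(";", 1)[1])
        totals[sheet.stem] = hours
    return totals
```

A script like this would make the "unwritten" procedure explicit and repeatable, which is exactly what SP 1.3 asks for.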

Slide 7: SG 1 — SP 1.4 Specify analysis procedures
- All groups have test managers but no formal analysis procedure; testing analyses are usually generated automatically (trac, rcov, CruiseControl).

Slide 8: SG 2 Provide Measurement Results — SP 2.1 Collect measurement data
- Timesheets: collected individually
- Testing: test code is committed individually; measurement data is generated by tools (phpUnderControl/CruiseControl, rcov, ...)
- Code readability: automated (PHP CodeSniffer, ...)
References: [4] WAFL coverage of a single unit test; [5] WAFL coverage of unit tests per category; [6] Hadra coverage of unit tests; [7] Hadra coverage of functionals; [8] WAFL overview of unit tests; [9] WAFL PHP CodeSniffer results

Slide 9: SG 2 — SP 2.2 Analyse measurement data
- Timesheets: analysis and interpretation of the workload is straightforward. The duration of a task can be linked to a risk:
  - A task that takes longer than expected points to an underestimated risk.
  - Putting much effort into a task is justified by its high risk.
  - Missed deadlines: e.g. a task took much longer than expected, which caused the deadline to slip.
- Testing: results are interpreted by the test managers.
- Code readability: results are interpreted by the test managers; it is up to them to decide what is acceptable and what is not.
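The timesheet analysis sketched above — flagging tasks whose actual duration exceeds the estimate as underestimated risks — could look like this (the task names, hours, and the 1.5x overrun threshold are illustrative assumptions, not values from the projects):

```python
def underestimated_tasks(tasks: dict, overrun_factor: float = 1.5) -> list:
    """Return the tasks whose actual hours exceed the estimate by more than
    the given factor, i.e. candidates for an underestimated risk."""
    return [
        name
        for name, (estimated, actual) in tasks.items()
        if actual > overrun_factor * estimated
    ]

# Hypothetical data: task -> (estimated_hours, actual_hours).
tasks = {
    "implement login": (4, 10),   # large overrun: underestimated risk
    "write unit tests": (6, 7),   # within tolerance
    "stress test setup": (2, 5),  # overrun
}
print(underestimated_tasks(tasks))
```

Connecting a flag like this back to the project's risk list would turn the raw timesheet data into the kind of analysis SP 2.2 expects.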

Slide 10: SG 2 — SP 2.3 Store data and results
- Timesheets: 1 group stores timesheets in a Dropbox; the other groups use trac
- Testing: 1 group uses phpUnderControl/CruiseControl; the other groups use trac
- Code readability: phpUnderControl/CruiseControl
References: [3] WAFL timesheets in Dropbox; [2] Hadra tickets on trac; [4] WAFL coverage of a single unit test; [5] WAFL coverage of unit tests per category; [6] Hadra coverage of unit tests; [7] Hadra coverage of functionals; [8] WAFL overview of unit tests

Slide 11: SG 2 — SP 2.4 Communicate results
- Results are discussed at review meetings
- Most analyses are available to all team members on trac or phpUnderControl/CruiseControl
References: [12] TultiMouch work pressure on trac; [13] TultiMouch statistics of tickets


Slide 13: Generic Practices GG 2
- GP 2.1 Establish an Organizational Policy
  - One group: the process manager generates the processes in cooperation with the test/project managers
  - Other groups: agreements made during the meetings; these can be found in the meeting reports, in the list of key decisions, or on the wiki
- GP 2.2 Plan the Process
  - Most groups have wiki pages with processes/tutorials

Slide 14: Generic Practices GG 2 (2)
- GP 2.3 Provide Resources
  - All groups have tools set up on their server (rcov, ...)
- GP 2.4 Assign Responsibility
  - All groups rely on individual responsibility and social control
  - Every group has test managers
- GP 2.5 Train People
  - Tutorials on the wiki
References: [11] WAFL wiki page with training tutorial; [15] TultiMouch manual for stress test

Slide 15: Generic Practices GG 2 (3)
- GP 2.6 Manage Configurations
  - For testing and code readability this is mainly managed by the phpUnderControl/CruiseControl or trac server, which collects/generates the data and presents the results
- GP 2.7 Identify and Involve Relevant Stakeholders
  - The whole group, the clients, and the professor
- GP 2.8 Monitor and Control the Process
  - One group: process descriptions on trac, evaluated by the process manager
  - All groups: testing by the test managers

Slide 16: Generic Practices GG 2 (4)
- GP 2.9 Objectively Evaluate Adherence
  - Not performed
- GP 2.10 Review Status with Higher Level Management
  - Not performed; perhaps indirectly at the review meetings

Slide 17: Generic Practices GG 3 — Institutionalize a Defined Process
The process is institutionalized as a defined process.
- GP 3.1 Establish a Defined Process: establish and maintain the description of a defined measurement and analysis process.
  - WAFL: e.g. all PHP projects should use PHPUnit, phpUnderControl/CruiseControl, and the PEAR coding standard
  - TultiMouch: e.g. all work done should be ticketed on the trac immediately
- GP 3.2 Collect Improvement Information: collect work products, measures, measurement results, and improvement information derived from planning and performing the measurement and analysis process to support the future use and improvement of the organization's processes and process assets.
  - Example: bad experience with certain tools should trigger a search for better tools. In WAFL, the timesheets in the Dropbox should be replaced by the built-in function of trac; WAFL's separate wiki could likewise be replaced by the wiki in trac.

Slides 18-34: Process Assets
- [1] Example of a timesheet template, WAFL
- [2] Time spent in trac, for each ticket, Hadra
- [3] Example of the timesheets in the Dropbox, WAFL
- [4] Coverage of a single unit test, WAFL
- [5] Coverage of the unit tests for the different categories, WAFL
- [6] Coverage of the unit tests, Hadra
- [7] Coverage of the functionals, Hadra
- [8] Overview of the unit tests (class, method name, pass/fail, elapsed time), WAFL
- [9] Results published by PHP CodeSniffer on the phpUnderControl/CruiseControl server, WAFL
- [10] Code readability guidelines (but no metrics), Hadra
- [11] Example of training: wiki page explaining how to write PHP unit tests that are automatically used for MA purposes, WAFL
- [12] Work pressure balance of all the team members, TultiMouch
- [13] Statistics on which part of the milestone has been reached, how many tickets were closed each day, ..., TultiMouch
- [14] Manual/process for how to ticket work, TultiMouch
- [15] Manual for the stress test, TultiMouch
- [16] List of source files with JUnit tests, Mashedup
- [17] MA objectives, TultiMouch

Slide 35: Findings
- Strengths
  - Data collection is performed well (integrated tools)
- Opportunities for Improvement
  - Most of the measurement is performed, but almost nothing (objectives, processes, ...) is written down
  - Analysis is hardly performed formally
- Proposed Actions
  - Perform analysis and act on the results!
  - Write the objectives down

Slide 36: References
- CMMI for Development, version 1.2: http://www.sei.cmu.edu/publications/documents/06.reports/06tr008.html
- Software Quality Assurance within the CMMI framework: http://www.software-quality-assurance.org/cmmi-measurement-and-analysis.html
- CMMI browser: http://www.cmmi.de/cmmi_v1.2/browser.html

