Prof. Mohamed Batouche: “If you can’t measure it, you can’t manage it” (Tom DeMarco, 1982)


1 Prof. Mohamed Batouche batouche@ksu.edu.sa

2 “If you can’t measure it, you can’t manage it” (Tom DeMarco, 1982)

3 Process – measure the efficacy of processes: what works, what doesn't.
Project – assess the status of projects: track risk, identify problem areas, adjust the work flow.
Product – measure predefined product attributes (generally related to the ISO 9126 software quality characteristics).

4 Other classifications

5 Classification by phases of the software system:
Process metrics – metrics related to the software development process.
Product metrics – metrics related to software maintenance.
Classification by subjects of measurement:
Quality
Timetable
Effectiveness (of error removal and maintenance services)
Productivity

6 1. Facilitate management control, planning and managerial intervention, based on:
· deviations of actual performance from planned performance;
· deviations of actual timetable and budget performance from planned.
2. Identify situations for development or maintenance process improvement (preventive or corrective actions), based on:
· accumulation of metrics information regarding the performance of teams, units, etc.

7 KLOC – a classic metric that measures the size of software in thousands of lines of code.
Number of function points (NFP) – a measure of the development resources (human resources) required to develop a program, based on the functionality specified for the software system.

8 Calculation of NCE and WCE:

Error severity class | Number of errors (b) | Relative weight (c) | Weighted errors (d = b x c)
low severity         | 42                   | 1                   | 42
medium severity      | 17                   | 3                   | 51
high severity        | 11                   | 9                   | 99
Total                | 70                   | ---                 | 192

NCE = 70 (unweighted count of errors), WCE = 192 (weighted count of errors).
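The NCE/WCE calculation in the table above can be sketched in a few lines; the severity counts and weights are the ones used in the table.

```python
# NCE (number of code errors) and WCE (weighted code errors) from
# per-severity error counts, using the weights from the table (1, 3, 9).
errors_by_severity = {"low": 42, "medium": 17, "high": 11}
severity_weights = {"low": 1, "medium": 3, "high": 9}

NCE = sum(errors_by_severity.values())  # plain, unweighted count
WCE = sum(count * severity_weights[sev]
          for sev, count in errors_by_severity.items())

print(NCE)  # 70
print(WCE)  # 42*1 + 17*3 + 11*9 = 192
```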

9 Process Metrics

10 Software process quality metrics:
◦ error density metrics
◦ error severity metrics
Software process timetable metrics
Software process error removal effectiveness metrics
Software process productivity metrics

11 Error density metrics:
CED (Code Error Density) = NCE / KLOC
DED (Development Error Density) = NDE / KLOC
WCED (Weighted Code Error Density) = WCE / KLOC
WDED (Weighted Development Error Density) = WDE / KLOC
WCEF (Weighted Code Errors per Function point) = WCE / NFP
WDEF (Weighted Development Errors per Function point) = WDE / NFP

NCE = number of code errors detected by code inspections and testing.
NDE = total number of development (design and code) errors detected in the development process.
WCE = weighted total of code errors detected by code inspections and testing.
WDE = weighted total of development (design and code) errors detected in the development process.

12 A software development department may apply two alternative metrics for the calculation of code error density: CED and WCED. The unit then has to determine indicators of unacceptable software quality, e.g. CED > 2 and WCED > 4.
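The density metrics and the acceptance check above can be sketched as follows. The NCE/WCE values are the ones computed on slide 8; the KLOC figure is an illustrative assumption.

```python
# Code error density (CED) and weighted code error density (WCED),
# checked against the unacceptable-quality indicators CED > 2, WCED > 4.
KLOC = 50.0      # assumed system size: 50,000 lines of code
NCE = 70         # code errors (slide 8)
WCE = 192        # severity-weighted code errors (slide 8)

CED = NCE / KLOC       # errors per thousand lines
WCED = WCE / KLOC      # weighted errors per thousand lines
acceptable = CED <= 2 and WCED <= 4

print(CED, WCED, acceptable)  # 1.4 3.84 True
```

With these figures both indicators pass, but only barely on WCED, which is how the weighted metric surfaces a concentration of severe errors that the plain count hides.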

13 Error severity metrics:
ASCE (Average Severity of Code Errors) = WCE / NCE
ASDE (Average Severity of Development Errors) = WDE / NDE

NCE = number of code errors detected by code inspections and testing.
NDE = total number of development (design and code) errors detected in the development process.
WCE = weighted total of code errors detected by code inspections and testing.
WDE = weighted total of development (design and code) errors detected in the development process.

14 Timetable metrics:
TTO (Time Table Observance) = MSOT / MS
ADMC (Average Delay of Milestone Completion) = TCDAM / MS

MSOT = number of milestones completed on time.
MS = total number of milestones.
TCDAM = total completion delay (days, weeks, etc.) for all milestones.

15 Error removal effectiveness metrics:
DERE (Development Errors Removal Effectiveness) = NDE / (NDE + NYF)
DWERE (Development Weighted Errors Removal Effectiveness) = WDE / (WDE + WYF)

NDE = total number of development (design and code) errors detected in the development process.
WDE = weighted total of development (design and code) errors detected in the development process.
NYF = number of software failures detected during a year of maintenance service.
WYF = weighted number of software failures detected during a year of maintenance service.
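The removal effectiveness metrics above can be sketched like this; all the counts are invented for illustration.

```python
# Fraction of errors caught before release: errors found in development
# divided by all errors (development errors + first-year failures).
NDE, WDE = 95, 260   # development errors: plain and severity-weighted
NYF, WYF = 5, 20     # failures in a year of maintenance: plain and weighted

DERE = NDE / (NDE + NYF)     # development errors removal effectiveness
DWERE = WDE / (WDE + WYF)    # same, weighted by severity

print(round(DERE, 3))  # 0.95, i.e. 95% of errors removed before release
```

A lower DWERE than DERE would mean the failures that escaped to the field were disproportionately severe.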

16 Productivity metrics:
DevP (Development Productivity) = DevH / KLOC
FDevP (Function point Development Productivity) = DevH / NFP
CRe (Code Reuse) = ReKLOC / KLOC
DocRe (Documentation Reuse) = ReDoc / NDoc

DevH = total working hours invested in the development of the software system.
ReKLOC = number of thousands of reused lines of code.
ReDoc = number of reused pages of documentation.
NDoc = total number of pages of documentation.
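The productivity and reuse metrics above can be sketched with invented project figures:

```python
# Development productivity (hours per unit of output, so lower is better)
# and reuse fractions. All inputs are illustrative assumptions.
DevH = 8000        # total development hours
KLOC = 50.0        # thousands of lines of code
NFP = 310          # number of function points
ReKLOC = 10.0      # thousands of reused lines of code
NDoc, ReDoc = 500, 120   # documentation pages: total / reused

DevP = DevH / KLOC     # hours per KLOC
FDevP = DevH / NFP     # hours per function point
CRe = ReKLOC / KLOC    # fraction of the code that is reused
DocRe = ReDoc / NDoc   # fraction of the documentation that is reused

print(DevP, CRe)  # 160.0 0.2
```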

17 Product Metrics

18 Help desk (HD) quality metrics:
* HD calls density metrics – measured by the number of calls.
* HD calls severity metrics – the severity of the HD issues raised.
* HD success metrics – the level of success in responding to HD calls.
* HD productivity metrics.
* HD effectiveness metrics.
Corrective maintenance quality metrics:
* Software system failures density metrics.
* Software system failures severity metrics.
* Failures of maintenance services metrics.
* Software system availability metrics.
* Corrective maintenance productivity and effectiveness metrics.

19 HD calls density metrics:
HDD (HD calls density) = NHYC / KLMC
WHDD (Weighted HD calls density) = WHYC / KLMC
WHDF (Weighted HD calls per function point) = WHYC / NMFP

NHYC = number of HD calls during a year of service.
KLMC = thousands of lines of maintained software code.
WHYC = weighted number of HD calls received during one year of service.
NMFP = number of function points to be maintained.
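The HD call density metrics above can be sketched as follows; the service figures are invented.

```python
# Help desk call density, normalized by maintained code size (KLMC)
# and by maintained function points (NMFP). Inputs are illustrative.
NHYC = 380       # HD calls in a year of service
WHYC = 700       # severity-weighted HD calls in the same year
KLMC = 200.0     # thousands of lines of maintained code
NMFP = 1200      # maintained function points

HDD = NHYC / KLMC     # HD calls per thousand maintained lines
WHDD = WHYC / KLMC    # weighted HD calls per thousand maintained lines
WHDF = WHYC / NMFP    # weighted HD calls per function point

print(HDD, WHDD)  # 1.9 3.5
```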

20 HD calls severity metric:
ASHC (Average Severity of HD Calls) = WHYC / NHYC

NHYC = number of HD calls during a year of service.
WHYC = weighted number of HD calls received during one year of service.

21 HD success metric:
HDS (HD Service success) = NHYOT / NHYC

NHYOT = number of HD calls completed on time during one year of service.
NHYC = number of HD calls during a year of service.

22 HD productivity and effectiveness metrics:
HDP (HD Productivity) = HDYH / KLMC
FHDP (Function point HD Productivity) = HDYH / NMFP
HDE (HD Effectiveness) = HDYH / NHYC

HDYH = total yearly working hours invested in HD servicing of the software system.
KLMC = thousands of lines of maintained software code.
NMFP = number of function points to be maintained.
NHYC = number of HD calls during a year of service.

23 Software system failure density metrics:
SSFD (Software System Failure Density) = NYF / KLMC
WSSFD (Weighted Software System Failure Density) = WYF / KLMC
WSSFF (Weighted Software System Failures per Function point) = WYF / NMFP

NYF = number of software failures detected during a year of maintenance service.
WYF = weighted number of software failures detected during a year of maintenance service.
NMFP = number of function points designated for the maintained software.
KLMC = thousands of lines of maintained software code.

24 Software system failure severity metric:
ASSSF (Average Severity of Software System Failures) = WYF / NYF

NYF = number of software failures detected during a year of maintenance service.
WYF = weighted number of software failures detected during a year of maintenance service.

25 Failures of maintenance services metric:
MRepF (Maintenance Repeated repair Failure metric) = RepYF / NYF

NYF = number of software failures detected during a year of maintenance service.
RepYF = number of repeated software failure calls (service failures).

26 Software system availability metrics:
FA (Full Availability) = (NYSerH - NYFH) / NYSerH
VitA (Vital Availability) = (NYSerH - NYVitFH) / NYSerH
TUA (Total Unavailability) = NYTFH / NYSerH

NYSerH = number of hours the software system is in service during one year.
NYFH = number of hours in which at least one function is unavailable (failed) during one year, including total failure of the software system.
NYVitFH = number of hours in which at least one vital function is unavailable (failed) during one year, including total failure of the software system.
NYTFH = number of hours of total failure (all system functions failed) during one year.
Note that NYFH ≥ NYVitFH ≥ NYTFH, and therefore 1 - TUA ≥ VitA ≥ FA.
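The availability metrics above can be sketched as follows. The hour counts are invented, but they respect the definitional ordering NYFH ≥ NYVitFH ≥ NYTFH (total failures count toward every category).

```python
# Full, vital, and total (un)availability over one year of service.
NYSerH = 8760    # service hours in a year (24/7 operation assumed)
NYFH = 100       # hours with at least one function down
NYVitFH = 40     # hours with at least one vital function down
NYTFH = 10       # hours of total failure

FA = (NYSerH - NYFH) / NYSerH        # full availability
VitA = (NYSerH - NYVitFH) / NYSerH   # vital availability
TUA = NYTFH / NYSerH                 # total unavailability

# The ordering noted on the slide follows from NYFH >= NYVitFH >= NYTFH:
assert FA <= VitA <= 1 - TUA
print(round(FA, 4), round(VitA, 4))  # 0.9886 0.9954
```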

27 Corrective maintenance productivity and effectiveness metrics:
CMaiP (Corrective Maintenance Productivity) = CMaiYH / KLMC
FCMP (Function point Corrective Maintenance Productivity) = CMaiYH / NMFP
CMaiE (Corrective Maintenance Effectiveness) = CMaiYH / NYF

CMaiYH = total yearly working hours invested in the corrective maintenance of the software system.
NYF = number of software failures detected during a year of maintenance service.
NMFP = number of function points designated for the maintained software.
KLMC = thousands of lines of maintained software code.

28 Defining new Software Quality Metrics


30 Limitations of Software Quality Metrics

31 * Budget constraints in allocating the necessary resources.
* Human factors, especially opposition of employees to the evaluation of their activities.
* Uncertainty regarding the validity of the data, owing to partial and biased reporting.

32 * Parameters used in development process metrics: KLOC, NDE, NCE.
* Parameters used in product (maintenance) metrics: KLMC, NHYC, NYF.

33 Factors affecting the development process parameters:
a. Programming style (KLOC).
b. Volume of documentation comments (KLOC).
c. Software complexity (KLOC, NCE).
d. Percentage of reused code (NDE, NCE).
e. Professionalism and thoroughness of design review and software testing teams, which affect the number of defects detected (NCE).
f. Reporting style of the review and testing results: concise reports vs. comprehensive reports (NDE, NCE).

34 Factors affecting the product (maintenance) parameters:
a. Quality of the installed software and its documentation (NYF, NHYC).
b. Programming style and volume of documentation comments included in the code to be maintained (KLMC).
c. Software complexity (NYF).
d. Percentage of reused code (NYF).
e. Number of installations, size of the user population and level of applications in use (NHYC, NYF).

35 The Goal-Question-Metric (GQM) Approach

36 The GQM approach is based on the idea that, in order to measure in a meaningful way, we must measure that which will help us to assess and meet our organizational goals.
– A Goal is defined for an object, for a variety of reasons, with respect to various models of quality, from various points of view, relative to a particular environment. The objects of measurement are products, processes and resources.
– Questions are used to characterize the way the assessment or achievement of a specific goal is going to be performed, based on some characterizing model. Questions try to characterize the object of measurement with respect to a selected quality issue and to determine its quality from the selected viewpoint. Questions must be answerable in a quantitative manner.
– Metrics are associated with every question in order to answer it in a quantitative way.

37 – Goal: customer satisfaction.
– Question: Are customers dissatisfied when problems occur?
– Metric: number of customer defect reports.

38 – Goal: to improve the outcome of design inspections from the quality manager's point of view.
– Question: What is the current yield of design inspections?
– Metric: inspection yield = number of defects found / estimated total defects.
– Question: What is the current inspection rate?
– Metrics: individual inspection rate, inspection period, overall inspection rate.
– Question: Is design inspection yield improving?
– Metric: (current design inspection yield / baseline design inspection yield) * 100.
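The yield metrics in the GQM example above can be sketched as follows; the defect counts and baseline figure are invented.

```python
# Design inspection yield and its improvement over a baseline,
# following the metric definitions in the GQM example. Inputs are assumed.
defects_found = 45
estimated_total_defects = 60
baseline_yield = 0.60    # yield measured before the improvement effort

inspection_yield = defects_found / estimated_total_defects
improvement_pct = inspection_yield / baseline_yield * 100

print(round(inspection_yield, 2))  # 0.75
print(round(improvement_pct, 1))   # 125.0, i.e. yield is up 25% vs. baseline
```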

39 Suppose one of the organization's goals is to evaluate the effectiveness of the coding standard. Some of the associated questions might be:
◦ Who is using the standard?
◦ What is coder productivity?
◦ What is the code quality?
For these questions, the corresponding metrics might be:
◦ the proportion of coders using the standard;
◦ the change in the number of LOC or FPs generated per day per coder;
◦ the change in appropriate measures of code quality (e.g., errors found per LOC, or cyclomatic complexity).

40 References:
William E. Lewis, “Software Testing and Continuous Quality Improvement”, Third Edition, CRC Press, 2009.
K. Naik and P. Tripathy, “Software Testing and Quality Assurance”, Wiley, 2008.
Ian Sommerville, “Software Engineering”, 8th Edition, 2006.
Aditya P. Mathur, “Foundations of Software Testing”, Pearson Education, 2009.
D. Galin, “Software Quality Assurance: From Theory to Implementation”, Pearson Education, 2004.
David Gustafson, “Theory and Problems of Software Engineering”, Schaum’s Outline Series, McGraw-Hill, 2002.


