



Presentation on theme: "Measurement and quality assessment Framework for product metrics – Measure, measurement, and metrics – Formulation, collection, analysis, interpretation,"— Presentation transcript:

1

2 Measurement and quality assessment
Framework for product metrics
– Measure, measurement, and metrics
– Formulation, collection, analysis, interpretation, feedback
– Principles for metrics characterization and validation
Metrics for the requirements model
– Function-based metrics
– Metrics for specification quality
Metrics for the design model
– Architectural design metrics
– Metrics for object-oriented design

3 Weighted methods per class (WMC)
– n methods of complexity c1, c2, …, cn for a class C
– WMC = Σ ci for i = 1 to n
– As complexity increases, more effort is required and reuse is limited
– Counting methods seems straightforward, but a consistent counting approach is required
Depth of the inheritance tree (DIT)
– Maximum length from the node to the root of the tree
– As DIT grows, lower-level classes inherit many methods
– Many methods may be reused, but design complexity also grows
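The WMC sum and DIT depth above can be sketched in a few lines of Python. This is a minimal illustration, not a measurement tool: the per-method complexities (e.g., cyclomatic complexity of each method) are assumed to be known already, and the class and method names are hypothetical.

```python
# Hypothetical class with known per-method complexities c_i.
method_complexities = {"deposit": 3, "withdraw": 4, "get_balance": 1}

# WMC = sum of c_i for i = 1..n
wmc = sum(method_complexities.values())
print(wmc)  # 8

def dit(cls):
    """Depth of the inheritance tree: maximum path length from the
    class to a root class (ignoring Python's implicit `object`)."""
    bases = [b for b in cls.__bases__ if b is not object]
    return 0 if not bases else 1 + max(dit(b) for b in bases)

class Account: pass
class SavingsAccount(Account): pass

print(dit(Account))         # 0 (a root class)
print(dit(SavingsAccount))  # 1
```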

4 Number of children (NOC)
– Number of immediate subordinate classes
– As NOC grows:
  Reuse increases
  Abstraction of the parent class may be diluted
  Testing effort increases
Coupling between object classes (CBO)
– As coupling increases:
  Reusability decreases
  Testing and modification become more complicated
– Keep CBO as low as is reasonable

5 Response for a class (RFC)
– Set of methods that can potentially be executed in response to a message received by an object of the class
– As RFC increases, design complexity and testing effort increase
Lack of cohesion in methods (LCOM)
– Number of methods that access one or more of the same attributes
– If no methods access the same attribute, LCOM is zero
– If LCOM is high, the complexity of the design increases
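The LCOM variant described above (counting methods that touch an attribute some other method also touches) can be sketched as follows. This is one reading of the slide's informal definition, not the only LCOM formulation in the literature; the attribute sets are hypothetical.

```python
from itertools import combinations

def lcom(method_attrs):
    """Slide's LCOM variant: number of methods that access one or more
    of the same attributes as some other method of the class.
    If no two methods share an attribute, the result is zero.
    method_attrs: mapping of method name -> set of attributes accessed."""
    sharing = set()
    for (m1, a1), (m2, a2) in combinations(method_attrs.items(), 2):
        if a1 & a2:                  # the two methods touch a common attribute
            sharing.update((m1, m2))
    return len(sharing)

# Two methods share `balance`; the logger is isolated.
print(lcom({"deposit": {"balance"},
            "withdraw": {"balance", "limit"},
            "log": {"history"}}))        # 2
print(lcom({"a": {"x"}, "b": {"y"}}))    # 0: no shared attributes
```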

6 Metrics for conventional components focus on the internal characteristics of a component
Cohesion metrics
– Data slice: a backward walk through a module that looks for data values
– Data tokens: the variables defined for a module
– Glue tokens: data tokens that lie on one or more data slices
– Superglue tokens: data tokens common to every data slice
– Stickiness: the relative stickiness of a glue token is directly proportional to the number of data slices that it binds

7 Coupling metrics
– Data and control flow coupling
  di = number of input data parameters
  ci = number of input control parameters
  do = number of output data parameters
  co = number of output control parameters
– Global coupling
  gd = number of global variables used as data
  gc = number of global variables used as control
– Environmental coupling
  w = number of modules called
  r = number of modules calling the module
– Module coupling: mc = k / M, where k is a proportionality constant and
  M = di + (a × ci) + do + (b × co) + gd + (c × gc) + w + r
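The module coupling indicator mc = k / M can be computed directly from the counts defined above. In this sketch the control-parameter weights a, b, c and the proportionality constant k default to 1, which is an illustrative assumption rather than a prescribed value.

```python
def module_coupling(di, ci, do, co, gd, gc, w, r, a=1, b=1, c=1, k=1):
    """Module coupling indicator m_c = k / M, using the parameter counts
    defined above. The weights a, b, c and the constant k default to 1
    here as an illustrative assumption."""
    M = di + (a * ci) + do + (b * co) + gd + (c * gc) + w + r
    return k / M

# One data parameter in, one out, one module called: M = 3, m_c = 1/3.
# A module with more connections has a larger M and therefore a smaller
# m_c, i.e. the indicator drops as coupling grows.
print(module_coupling(di=1, ci=0, do=1, co=0, gd=0, gc=0, w=1, r=0))
```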

8 Complexity metrics
– Cyclomatic complexity: the number of independent logical paths through a program
– Many variations of cyclomatic complexity exist
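A crude way to estimate cyclomatic complexity is to count decision points and add one. The sketch below does this textually over Python source with a regular expression; a production tool would build the control-flow graph or walk the AST instead, so treat this as an approximation only.

```python
import re

def cyclomatic_complexity(source):
    """Rough sketch: V(G) = number of decision points + 1.
    Counts Python branching keywords textually; a real tool would use
    the control-flow graph (V(G) = edges - nodes + 2) or the AST."""
    decisions = re.findall(r"\b(?:if|elif|for|while|and|or|except)\b", source)
    return len(decisions) + 1

snippet = """
def clamp(x, lo, hi):
    if x < lo:
        return lo
    if x > hi:
        return hi
    return x
"""
print(cyclomatic_complexity(snippet))  # 3: two decision points + 1
```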

9 Average operation size (OSavg)
– Measured as the number of lines of code or the number of messages sent by the operation
– If the number of messages sent increases, responsibilities are most probably not well allocated within the class
Operation complexity (OC)
– Computed using complexity metrics for conventional software
– OC should be kept as low as possible
Average number of parameters per operation (NPavg)
– The larger the number of parameters, the more complex the collaboration between objects
– NPavg should be kept as low as possible

10 Layout complexity
– Number of distinct regions defined for an interface
Layout region complexity
– Average number of distinct links per region
Recognition complexity
– Average number of distinct items the user must look at before making a navigation or data-input decision
Recognition time
– Average time (in seconds) it takes a user to select the appropriate action for a given task
Typing effort
– Average number of keystrokes required for a specific function

11 Mouse pick effort
– Average number of mouse picks per function
Selection complexity
– Average number of links that can be selected per page
Content acquisition time
– Average number of words of text per web page
Memory load
– Average number of distinct data items that the user must remember to achieve a specific objective

12 Word count
– Total number of words that appear on a page
Body text percentage
– Percentage of words that are body text versus display text (i.e., headers)
Emphasized body text percentage
– Portion of body text that is emphasized (e.g., bold, capitalized)
Text cluster count
– Number of text areas highlighted with color, bordered regions, rules, or lists
Link count
– Total number of links on a page

13 Page size
– Total bytes for the page, including elements, graphics, and style sheets
Graphic percentage
– Percentage of page bytes that are used for graphics
Graphics count
– Total number of graphics on a page (not including graphics specified in scripts, applets, and objects)
Color count
– Total number of colors employed
Font count
– Total number of fonts employed (i.e., face + size + bold + italic)

14 Page wait
– Average time required for a page to download at different connection speeds
Page complexity
– Average number of different types of media used on a page, not including text
Graphic complexity
– Average number of graphics media per page
Audio complexity
– Average number of audio media per page
Video complexity
– Average number of video media per page
Animation complexity
– Average number of animations per page
Scanned image complexity
– Average number of scanned images per page

15 For static pages
Page-linking complexity
– Number of links per page
Connectivity
– Total number of internal links, not including dynamically generated links
Connectivity density
– Connectivity divided by page count

16 Halstead metrics
n1 = number of distinct operators that appear in a program
n2 = number of distinct operands that appear in a program
N1 = total number of operator occurrences
N2 = total number of operand occurrences
Overall program length (N) and volume (V):
N = n1 log2 n1 + n2 log2 n2
V = N log2 (n1 + n2)
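The length and volume formulas above compute directly once the distinct-operator and distinct-operand counts are known. The counts below are illustrative, chosen so the logarithms come out exact; note that this N is Halstead's estimated length, while the actual length is the total occurrence count N1 + N2.

```python
import math

def halstead_length_volume(n1, n2):
    """Program length N and volume V from the slide's formulas:
    N = n1*log2(n1) + n2*log2(n2),  V = N * log2(n1 + n2).
    (This N is Halstead's estimated length; the measured length
    would be the total occurrence count N1 + N2.)"""
    N = n1 * math.log2(n1) + n2 * math.log2(n2)
    V = N * math.log2(n1 + n2)
    return N, V

# Illustrative counts: 16 distinct operators, 16 distinct operands.
N, V = halstead_length_volume(16, 16)
print(N, V)  # 128.0 640.0
```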

17 Lack of cohesion in methods (LCOM)
– If LCOM is high, more states must be tested
Percent public and protected (PAP)
– Percentage of class attributes that are public or protected
– A high value for PAP increases the likelihood of side effects among classes
Public access to data members (PAD)
– Number of classes (or methods) that access another class’s attributes
– Indicates a violation of encapsulation

18 Number of root classes (NOR)
– Number of distinct class hierarchies
– Tests should be developed for each root class and its corresponding hierarchy
– As NOR increases, testing effort also increases
Fan-in (FIN)
– An indication of multiple inheritance
– FIN > 1 should be avoided
Number of children (NOC) and depth of the inheritance tree (DIT)

19 Software maturity index (SMI)
MT = number of modules in the current release
Fc = number of modules in the current release that have been changed
Fa = number of modules in the current release that have been added
Fd = number of modules from the preceding release that were deleted in the current release
SMI = (MT - (Fa + Fc + Fd)) / MT
As SMI approaches 1.0, the product begins to stabilize
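The SMI formula is a one-liner once the module counts are collected; the release figures below are hypothetical.

```python
def software_maturity_index(mt, fc, fa, fd):
    """SMI = (M_T - (F_a + F_c + F_d)) / M_T.
    Approaches 1.0 as fewer modules are changed, added, or deleted
    between releases, i.e. as the product stabilizes."""
    return (mt - (fa + fc + fd)) / mt

# Hypothetical release: 100 modules, 5 changed, 3 added, 2 deleted.
print(software_maturity_index(mt=100, fc=5, fa=3, fd=2))  # 0.9
```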

20 Summary
Class-oriented metrics
– Weighted methods per class, depth of the inheritance tree, number of children, coupling between object classes, response for a class, lack of cohesion in methods
Component-level design metrics
– Cohesion, coupling, and complexity
Operation-oriented metrics
– Average operation size, operation complexity, average number of parameters per operation
Design metrics for WebApps
Metrics for source code
Metrics for object-oriented testing
Metrics for maintenance

