Software Engineering Metrics James Gain


Software Engineering Metrics James Gain (jgain@cs.uct.ac.za) http://people.cs.uct.ac.za/~jgain/courses/SoftEng/

Objectives
- To motivate the need for software metrics
- To differentiate between process, project and product metrics
- To compare and contrast Lines-of-Code (LOC) and Function Point (FP) methods of metric normalization
- To introduce product and project metrics suitable for:
  - OO Planning
  - OO Analysis and Design
  - OO Testing
  - Quality

What are Metrics?
- An attempt to objectively quantify Software Engineering
- Can be direct (immediately measurable) or indirect (not immediately quantifiable)
  - Direct: LOC, speed, defects reported
  - Indirect: quality, reliability
- Measure (raw data): a quantitative indication of the extent, amount, dimension, or size of some SE attribute; a single data point
- Metric: the degree to which a system, component, or process possesses a given attribute; synthesizes several data points
- Indicator: a combination of metrics that provides insight into the software process, project or product

Do We Need Metrics?
- Against: collecting metrics is too hard, too time-consuming and too political; metrics can be used against individuals and don't actually prove anything
- For: in order to characterize, evaluate, predict and improve the process and product, a metric baseline is essential
- "Anything that you need to quantify can be measured in some way that is superior to not measuring it at all" - Tom Gilb

Purpose of Metrics
- Process: measure and correct the software process applied by your organization over several projects
- Project: create reliable estimates of time and cost; track and control the project schedule
- Product: monitor and adjust aspects of the software as it is created
- But metrics are like dynamite: powerful if used carefully, but dangerous if used to threaten or attack individuals
- "Not everything that can be counted counts, and not everything that counts can be counted." - Einstein

Process Metrics
- Focus on quality achieved as a consequence of a repeatable or managed process; strategic and long term
- Statistical Software Process Improvement (SSPI) - error categorization and analysis (see the sketch after this list):
  - All errors and defects are categorized by origin
  - The cost to correct each error and defect is recorded
  - The number of errors and defects in each category is computed
  - The data is analyzed to find the categories that result in the highest cost to the organization
  - Plans are developed to modify the process
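A minimal sketch of that categorization and analysis, assuming defect records are kept as simple dictionaries (the field names and figures are hypothetical):

    from collections import defaultdict

    # Hypothetical defect log: origin category and cost (person-hours) to correct.
    defects = [
        {"origin": "specification", "cost": 12.0},
        {"origin": "logic", "cost": 3.5},
        {"origin": "specification", "cost": 8.0},
        {"origin": "interface", "cost": 5.0},
    ]

    count_by_origin = defaultdict(int)
    cost_by_origin = defaultdict(float)
    for d in defects:
        count_by_origin[d["origin"]] += 1
        cost_by_origin[d["origin"]] += d["cost"]

    # Rank categories by total correction cost to decide where to modify the process.
    for origin, cost in sorted(cost_by_origin.items(), key=lambda kv: -kv[1]):
        print(f"{origin}: {count_by_origin[origin]} defect(s), {cost:.1f} person-hours to correct")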

Project Metrics
- Used by a project manager and software team to adapt project workflow and technical activities; tactical and short term
- Purpose: minimize the development schedule by making the adjustments necessary to avoid delays and mitigate problems; assess product quality on an ongoing basis
- Metrics: effort or time per SE task, errors uncovered per review hour, scheduled vs. actual milestone dates, number of changes and their characteristics, distribution of effort across SE tasks

Product Metrics
- Measure aspects of the product being engineered
- Fed into the project to provide time, cost and resource metrics
- Combined across several projects to produce process metrics
- Types of product metrics:
  - Analysis and design: do the product specifications exhibit the properties of good analysis and design?
  - Testability: how easily can the product be tested? How thoroughly has the product been tested?
  - Quality: is there evidence of quality in the final product, and also as it is created?

Process Metrics: Normalization
- How does an organization combine metrics that come from different individuals or projects? Metrics will depend on the size and complexity of the project
- Normalization: compensate for complexity aspects of a particular product by dividing metrics by a normalization factor
- Approaches:
  - LOC (Lines of Code): size-oriented
  - FP (Function Points): function-oriented
  - Person-months: time- and effort-oriented

Typical Normalized Metrics
- Size-oriented: errors per KLOC (thousand lines of code), defects per KLOC, R per LOC, pages of documentation per KLOC, errors per person-month, LOC per person-month, R per page of documentation
- Function-oriented: errors per FP, defects per FP, R per FP, pages of documentation per FP, FP per person-month

Project   LOC     FP    Effort (P/M)   R(000)   Pp. doc   Errors   Defects   People
alpha     12100   189   24             168      365       134      29        3
beta      27200   388   62             440      1224      321      86        5
gamma     20200   631   43             314      1050      256      64        6
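For example, the table's figures for project alpha normalize as follows (a small sketch; the variable names are my own):

    # Normalized metrics for project "alpha" from the table above.
    loc, fp, effort_pm, errors = 12100, 189, 24, 134

    errors_per_kloc = errors / (loc / 1000)  # size-oriented: about 11.1 errors per KLOC
    errors_per_fp = errors / fp              # function-oriented: about 0.71 errors per FP
    loc_per_pm = loc / effort_pm             # productivity: about 504 LOC per person-month
    fp_per_pm = fp / effort_pm               # productivity: about 7.9 FP per person-month

    print(errors_per_kloc, errors_per_fp, loc_per_pm, fp_per_pm)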

Why Opt for FP Measures?
- For:
  - Independent of programming language (some programming languages are more compact, e.g. C++ vs. assembler)
  - Uses readily countable characteristics of the "information domain" of the problem
  - Does not "penalize" inventive implementations that require fewer LOC than others
  - Makes it easier to accommodate reuse and object-oriented approaches
- Against:
  - The original FP approach is good for typical information-systems applications (interaction complexity); variants (Extended FP and 3D FP) are more suitable for real-time and scientific software
  - Can suffer from a lack of objectivity and consistency; an independent third party may not reproduce the same FP count

Computing Function Points
- Analyze the information domain of the application and develop counts: establish a count for the input domain and system interfaces
- Weight each count by assessing complexity: assign a level of complexity (simple, average, complex), i.e. a weight, to each count
- Assess the influence of global factors that affect the application: grade the significance of external factors F_i such as reuse, concurrency, OS, ...
- Compute function points (see the sketch below):
  FP = SUM(count x weight) x C
  where the complexity multiplier C = 0.65 + 0.01 x N and the degree of influence N = SUM(F_i)
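A sketch of this calculation in code, with made-up counts; the simple/average/complex weight values are the standard ones from the FP weighting table (an assumption, since the slide does not list them), and the function name is my own:

    # Standard simple/average/complex weights per information-domain parameter (assumed).
    WEIGHTS = {
        "inputs":     (3, 4, 6),
        "outputs":    (4, 5, 7),
        "inquiries":  (3, 4, 6),
        "files":      (7, 10, 15),
        "interfaces": (5, 7, 10),
    }

    def function_points(counts, complexity, adjustment_factors):
        """counts: parameter -> count; complexity: parameter -> 0/1/2 (simple/average/complex);
        adjustment_factors: the 14 ratings F_i, each 0..5."""
        count_total = sum(counts[p] * WEIGHTS[p][complexity[p]] for p in counts)
        c = 0.65 + 0.01 * sum(adjustment_factors)  # complexity multiplier C
        return count_total * c

    # All-simple example with a degree of influence N = 20:
    fp = function_points(
        counts={"inputs": 10, "outputs": 5, "inquiries": 4, "files": 2, "interfaces": 1},
        complexity={p: 0 for p in WEIGHTS},
        adjustment_factors=[2] * 10 + [0] * 4,
    )
    print(fp)  # count-total 81 x (0.65 + 0.20) = 68.85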

Analyzing the Information Domain

Taking Complexity into Account
Complexity adjustment values (F_i) are rated on a scale of 0 (not important) to 5 (very important):
1. Does the system require reliable backup and recovery?
2. Are data communications required?
3. Are there distributed processing functions?
4. Is performance critical?
5. Will the system run in an existing, heavily utilized environment?
6. Does the system require on-line data entry?
7. Does on-line entry require input over multiple screens or operations?
8. Are the master files updated on-line?
9. Are the inputs, outputs, files, or inquiries complex?
10. Is the internal processing complex?
11. Is the code designed to be reusable?
12. Are conversion and installation included in the design?
13. Are multiple installations in different organizations planned?
14. Is the application designed to facilitate change and ease of use?

Exercise: Function Points
Compute the function point value for a project with the following information domain characteristics:
- Number of user inputs: 32
- Number of user outputs: 60
- Number of user enquiries: 24
- Number of files: 8
- Number of external interfaces: 2
Assume that weights are average and the external complexity adjustment values are all "not important".
Answer:
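One worked answer, assuming the standard average weights (inputs 4, outputs 5, enquiries 4, files 10, external interfaces 7) and a rating of 0 for every F_i:
count-total = 32 x 4 + 60 x 5 + 24 x 4 + 8 x 10 + 2 x 7 = 128 + 300 + 96 + 80 + 14 = 618
FP = 618 x (0.65 + 0.01 x 0) = 618 x 0.65 = 401.7, i.e. roughly 402 function points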

Example: SafeHome Functionality
[Context diagram: the User interacts with the SafeHome System via Password, Panic Button and (De)activate commands and makes Zone and Sensor Inquiries; the system produces Messages, Zone Settings and Sensor Status, tests Sensors via Test Sensor, stores System Config Data (Password, Sensors, etc.), and sends Alarm Alerts to a Monitor and Response System.]

Example: SafeHome FP Calc

measurement parameter       count   weighting factor (simple / avg. / complex)   subtotal
number of user inputs         3     3 (simple) / 4 / 6                               9
number of user outputs        2     4 (simple) / 5 / 7                               8
number of user inquiries      2     3 (simple) / 4 / 6                               6
number of files               1     7 (simple) / 10 / 15                             7
number of ext. interfaces     4     5 (simple) / 7 / 10                             22
count-total                                                                         52
complexity multiplier                                                             1.11
function points                                                                     58

Exercise: Function Points
Compute the function point total for your project.
Hint: the complexity adjustment values should be low ( ). Some appropriate complexity factors are (each scored 0-5):
- Is performance critical?
- Does the system require on-line data entry?
- Does on-line entry require input over multiple screens or operations?
- Are the inputs, outputs, files, or inquiries complex?
- Is the internal processing complex?
- Is the code designed to be reusable?
- Is the application designed to facilitate change and ease of use?

OO Metrics: Distinguishing Characteristics
The following characteristics require that special OO metrics be developed:
- Encapsulation: the packaging of data and processing. Conventional paradigms organize programs around data or around function; OO does both, so metrics concentrate on classes rather than functions
- Information hiding: the way in which information about operational details is hidden behind a secure interface. An information-hiding metric provides an indication of quality
- Inheritance: the manner in which the responsibilities of one class are propagated to another. A pivotal indication of complexity
- Abstraction: the mechanism that allows a design to focus on essential details. Metrics need to measure a class at different levels of abstraction and from different viewpoints
Conclusion: the class is the fundamental unit of measurement.

OO Project Metrics
- Number of Scenario Scripts (use cases): the number of use cases is directly proportional to the number of classes needed to meet requirements; a strong indicator of program size
- Number of Key Classes (class diagram): a key class focuses directly on the problem domain and is NOT likely to be implemented via reuse; typically 20-40% of all classes are key, and the rest support infrastructure (e.g. GUI, communications, databases)
- Number of Subsystems (package diagram): provides insight into resource allocation, scheduling for parallel development and overall integration effort

OO Analysis and Design Metrics
Related to analysis and design principles. Complexity metrics (see the sketch below):
- Weighted Methods per Class (WMC): assume that n methods with cyclomatic complexities c_1, ..., c_n are defined for a class C; then WMC = c_1 + c_2 + ... + c_n
- Depth of the Inheritance Tree (DIT): the maximum length from the class to the root of the tree. A large DIT leads to greater design complexity but promotes reuse
- Number of Children (NOC): the number of immediate children of each class. A large NOC may dilute abstraction and increase testing effort
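A small sketch of these three measures over a toy Python class hierarchy, with hand-supplied method complexities (a real tool would extract cyclomatic complexities from source; all names here are illustrative):

    # Toy hierarchy (illustrative names only).
    class Sensor: pass
    class MotionSensor(Sensor): pass
    class WindowSensor(Sensor): pass
    class InfraredMotionSensor(MotionSensor): pass

    def dit(cls):
        """Depth of Inheritance Tree: longest path from cls up to the root (object)."""
        return max((dit(base) for base in cls.__bases__), default=-1) + 1

    def noc(cls):
        """Number of Children: immediate subclasses only."""
        return len(cls.__subclasses__())

    def wmc(method_complexities):
        """Weighted Methods per Class: WMC = c_1 + c_2 + ... + c_n."""
        return sum(method_complexities)

    print(dit(InfraredMotionSensor))  # 3: object -> Sensor -> MotionSensor -> InfraredMotionSensor
    print(noc(Sensor))                # 2: MotionSensor and WindowSensor
    print(wmc([1, 3, 2]))             # 6: a class with three methods of complexity 1, 3 and 2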

Further OOA&D Metrics
- Coupling:
  - Coupling Between Object classes (CBO): the total number of collaborations listed for each class on its CRC card. Keep CBO low, because high values complicate modification and testing
  - Response For a Class (RFC): the set of methods potentially executed in response to a message received by an object of the class. A high RFC implies test and design complexity
- Cohesion:
  - Lack of Cohesion in Methods (LCOM): the number of methods in a class that access one or more of the same attributes (see the sketch below). A high LCOM means the methods are tightly coupled to one another through the class's attributes
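A sketch of the slide's reading of LCOM, assuming we already know which attributes each method touches (in practice that mapping would come from static analysis; the class and names are made up):

    from itertools import combinations

    # Hypothetical class: which attributes each method accesses.
    method_attrs = {
        "arm":       {"armed", "zones"},
        "disarm":    {"armed"},
        "poll":      {"sensors"},
        "configure": {"zones", "sensors"},
    }

    # Count method pairs that share at least one attribute; the more such pairs,
    # the more the methods are intertwined through the class's data.
    sharing_pairs = sum(1 for a, b in combinations(method_attrs.values(), 2) if a & b)
    print(sharing_pairs)  # 3: arm/disarm, arm/configure and poll/configure share attributes

Note that the classical Chidamber-Kemerer LCOM is defined slightly differently (non-sharing method pairs minus sharing pairs, floored at zero); the sketch follows the simpler definition on the slide.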

OO Testability Metrics
- Encapsulation:
  - Lack of Cohesion in Methods (LCOM): the higher the value of LCOM, the more states must be tested to ensure that methods do not generate side effects
  - Percent Public and Protected (PAP): the percentage of attributes that are public or protected. Such attributes can be inherited and accessed externally, so a high PAP means more potential side effects
  - Public Access to Data members (PAD): the number of classes that access another class's attributes; this violates encapsulation
- Inheritance:
  - Number of Root Classes (NRC): the count of distinct class hierarchies; each must be tested separately
  - Fan In (FIN): the number of superclasses associated with a class. FIN > 1 indicates multiple inheritance, which should be avoided
  - Number of Children (NOC) and Depth of Inheritance Tree (DIT): superclasses need to be retested for each subclass
- Success of testing:
  - Defect Removal Efficiency (DRE): relates pre-release errors (E) to post-release defects (D): DRE = E / (E + D). The ideal is DRE = 1 (see the example below)
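A one-line example of DRE with illustrative numbers:

    def dre(errors_before_release, defects_after_release):
        """Defect Removal Efficiency: DRE = E / (E + D)."""
        return errors_before_release / (errors_before_release + defects_after_release)

    print(dre(90, 10))  # 0.9: 90% of all problems were caught before release (ideal is 1.0)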

Quality Metrics
- Measure conformance to explicit requirements, adherence to specified standards, and satisfaction of implicit requirements
- Software quality can be difficult to measure and is often highly subjective
- Correctness: the degree to which a program operates according to its specification. Metric: defects per FP
- Maintainability: the degree to which a program is amenable to change. Metric: Mean Time to Change, the average time taken to analyze, design, implement and distribute a change

Quality Metrics: Further Measures
- Integrity: the degree to which a program is impervious to outside attack.
  Integrity = SUM over all attack types i of [1 - t_i x (1 - s_i)]
  where t_i = threat (the probability that an attack of type i will occur within a given time) and s_i = security (the probability that an attack of type i will be repelled)
- Usability: the degree to which a program is easy to use. Metrics: (1) the skill required to learn the system, (2) the time required to become moderately proficient, (3) the net increase in productivity, (4) an assessment of the users' attitude to the system. Covered in the HCI course
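A worked example of the integrity measure for a single attack type, with illustrative probabilities (not from the slides):

    def integrity(attacks):
        """attacks: list of (threat, security) pairs, one per attack type."""
        return sum(1 - t * (1 - s) for t, s in attacks)

    # One attack type: it occurs with probability 0.25 and is repelled with probability 0.95.
    print(integrity([(0.25, 0.95)]))  # 0.9875, close to the ideal of 1.0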

Quality Metrics: McCall's Approach
McCall's Triangle of Quality groups the quality factors into three categories:
- Product Revision: Maintainability, Flexibility, Testability
- Product Transition: Portability, Reusability, Interoperability
- Product Operation: Correctness, Usability, Efficiency, Reliability, Integrity

Quality Metrics: Deriving McCall's Quality Metrics
- Assess a set of quality factors on a scale of 0 (low) to 10 (high)
- Each of McCall's quality metrics is a weighted sum of different quality factors, with the weighting determined by product requirements (see the sketch below)
- Example: Correctness = w1 x Completeness + w2 x Consistency + w3 x Traceability, where:
  - Completeness is the degree to which full implementation of the required function has been achieved
  - Consistency is the use of uniform design and documentation techniques
  - Traceability is the ability to trace program components back to analysis
- This technique depends on good, objective evaluators, because quality factor scores can be subjective
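A sketch of the weighted-sum derivation; the weights and factor scores below are invented for illustration:

    def quality_metric(weights, scores):
        """Weighted sum of quality-factor scores, each on the 0 (low) to 10 (high) scale."""
        return sum(weights[f] * scores[f] for f in weights)

    correctness = quality_metric(
        weights={"completeness": 0.5, "consistency": 0.3, "traceability": 0.2},
        scores={"completeness": 8, "consistency": 6, "traceability": 9},
    )
    print(correctness)  # 7.6 on the 0-10 scale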