CS427: Software Engineering I

CS427: Software Engineering I Grigore Rosu

Course Logistics
- Wiki: http://wiki.cites.illinois.edu/wiki/display/cs427fa17
- Piazza: https://piazza.com/class#fall2017/cs427
- Compass: https://compass2g.illinois.edu
- All class-wide announcements through Compass or my.cs
- DONE with the MPs!
- Lectures taught by Profs. Marinov and Xie will be on exams!
- 4th credit: books on wiki page, to read and review
- Office hours by request (Piazza)
  - Qing Ye offers regular office hours Tu/Th 3-4pm, SC4105
  - Also used for makeup quizzes: Mon quiz ~> Tue; Wed quiz ~> Thu
- Reminder: MPs (30%), final exam (25%), project (40%), quizzes (5%)
  - Bonus up to 3% for active participation (lectures, Piazza, …)

Metrics

Today’s goals
- How do we measure product and project progress?

Metrics
What can we measure?
- Process
  - Man hours
  - Bugs reported
  - Stories implemented
- Product
  - Measures of the product are called “technical metrics”

Non-technical metrics
- Number of people on project
- Time taken, money spent
- Bugs found/reported
  - By testers/developers
  - By users
- Bugs fixed, features added

Technical metrics
- Size of code
  - Number of files
  - Number of classes
  - Number of processes
- Complexity of code
  - Dependencies / coupling / cohesion
  - Depth of nesting
  - Cyclomatic complexity

Technical or non-technical?
- Number of tests
- Number of failing tests
- Number of classes in design model
- Number of relations per class
- Size of user manual
- Time taken by average transaction
What metrics did we have in the MPs?

Measuring size of system
- Lines of code (Source Lines of Code, SLOC)
- Number of classes, functions, files, etc.
- Function points
Are they repeatable? Do they work on the analysis model, the design model, or the code?

Lines of code (1)
- Easy to measure
- All projects produce it
- Correlates with time to build
- Not a very good standard
Bill Gates: “Measuring software productivity by lines of code is like measuring progress on an airplane by how much it weighs.”

Lines of code (2)
Blanks? Comments? The same function can be one line or many:

    copy(char *p, char *q) {while(*p) *q++ = *p++;}

    copy(char *p, char *q)
    {
        while (*p) {
            *q++ = *p++;
        }
    }
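
How blanks and comments are treated is exactly what makes SLOC slippery. A minimal Python sketch of a naive counter that skips blank lines and full-line comments (it deliberately ignores block comments and trailing comments, which a real tool would have to handle):

```python
def count_sloc(source):
    """Count source lines of code, skipping blank lines and
    full-line comments (C-style '//' and script-style '#')."""
    count = 0
    for line in source.splitlines():
        stripped = line.strip()
        if not stripped:
            continue  # blank line
        if stripped.startswith("//") or stripped.startswith("#"):
            continue  # full-line comment
        count += 1
    return count

one_liner = 'copy(char *p, char *q) {while(*p) *q++ = *p++;}'
multi_line = """copy(char *p, char *q)
{
    // copy until the terminating NUL
    while (*p) {
        *q++ = *p++;
    }
}"""
print(count_sloc(one_liner))   # 1
print(count_sloc(multi_line))  # 6 -- same function, six times "bigger"
```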

Lines of code: language matters
- Assembly code may be 2-3X longer than C code
- C code may be 2-3X longer than Java code
- Java code may be 2-3X longer than …

Lines of code (3)
Lines of code is a valid metric when:
- Same language
- Standard formatting
- Code has been reviewed

Complexity
Complex systems are:
- Hard to understand
- Hard to change
- Hard to reuse
Some metrics for complexity:
- Cyclomatic complexity (Thomas J. McCabe, Sr., 1976)
- Function points
- Coupling and cohesion
- Many OO-specific metrics (may cover later)

Cyclomatic Complexity (1)
- A measure of logical complexity
- Tells how many tests are needed to execute every statement of the program
- Number of branches (if, while, for) + 1

Cyclomatic Complexity (2)

    void processInterest() {
        for (acc : accounts) {
            if (hasInterest(acc)) {
                acc.balance = acc.balance + acc.balance * acc.interest;
            }
        }
    }

Cyclomatic Complexity (3)
[Flow graph for processInterest: entry → for → if → acc.balance = ... → back to for; for → done]

Cyclomatic Complexity (4)
Equivalent definitions:
- Number of predicates + 1
- Number of edges - number of nodes + 2
- Number of regions of the flow graph
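
The definitions all give the same number. A sketch computing V(G) = E - N + 2, using a hypothetical encoding of the processInterest flow graph from the earlier slide (node names are made up; 'for' and 'if' are the two decision nodes):

```python
def cyclomatic_complexity(edges):
    """V(G) = E - N + 2 for a single connected control-flow graph."""
    nodes = {n for edge in edges for n in edge}
    return len(edges) - len(nodes) + 2

edges = [
    ("for", "if"),       # loop body entered
    ("for", "done"),     # loop finished
    ("if", "assign"),    # account has interest
    ("if", "for"),       # no interest, next account
    ("assign", "for"),   # back to the loop header
]
print(cyclomatic_complexity(edges))  # 5 edges - 4 nodes + 2 = 3
# Same answer as "number of predicates + 1": 2 predicates (for, if) + 1 = 3
```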

Cyclomatic Complexity (5)
Testing view:
- Cyclomatic complexity is the number of independent paths through the procedure
- Gives an upper bound on the number of tests necessary to execute every edge of the control graph

Cyclomatic Complexity (6)
Metrics view:
- McCabe found that modules with cyclomatic complexity greater than 10 were hard to test and error-prone
- Look for procedures with high cyclomatic complexity and rewrite them, focus testing on them, or focus reviewing on them
- Good targets for refactoring, reverse engineering, reengineering

Function points (1)
- A way of measuring the “functionality” of a system
- A measure of how big a system ought to be
- Used to predict size
- Several methods of computing function points, all complicated
- Most are proprietary

Function points (2)
- Count the number of inputs, number of outputs, number of algorithms, number of tables in the database
- “Function points” is a function of the above, plus a fudge factor for complexity and developer expertise
- You need training to measure function points
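
As a rough illustration only: an “unadjusted” count using average weights commonly cited for the IFPUG method. Real function-point counting has detailed rules and certified counters; the categories and weights here are simplified, and the example system is made up:

```python
# Commonly cited IFPUG "average" weights per category (simplified).
WEIGHTS = {"inputs": 4, "outputs": 5, "inquiries": 4,
           "internal_files": 10, "external_interfaces": 7}

def unadjusted_function_points(counts):
    """Weighted sum of counted system features."""
    return sum(WEIGHTS[kind] * n for kind, n in counts.items())

counts = {"inputs": 3, "outputs": 2, "inquiries": 1,
          "internal_files": 1, "external_interfaces": 0}
ufp = unadjusted_function_points(counts)
print(ufp)  # 3*4 + 2*5 + 1*4 + 1*10 + 0*7 = 36
# The slide's "fudge factor" would then scale this, e.g. ufp * 1.1
```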

Coupling and cohesion (1)
- Coupling: dependences among modules
- Cohesion: dependences within modules
- Dependences: call methods, refer to a class, share a variable
- Coupling is bad; cohesion is good

Coupling and cohesion (2)
- Number and complexity of shared variables
  - Functions in a module should share variables
  - Functions in different modules should not
- Number and complexity of parameters
- Number of functions/modules that are called
- Number of functions/modules that call me

Coupling and Cohesion in OO Languages

Dhama’s Coupling Metric
Module coupling = 1 / (number of input parameters + number of output parameters + number of global variables used + number of modules called + number of modules calling)
0.5 is low coupling; 0.001 is high coupling
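
The slide’s formula transcribed directly; the example interface counts are made up:

```python
def dhama_coupling(inputs, outputs, globals_used,
                   modules_called, modules_calling):
    """Dhama's module coupling: 1 divided by the sum of the module's
    interface counts. Higher values mean *lower* coupling."""
    return 1 / (inputs + outputs + globals_used
                + modules_called + modules_calling)

# Best case on the slide's scale: one input parameter, one caller,
# nothing else -> 1 / 2 = 0.5 (low coupling)
print(dhama_coupling(1, 0, 0, 0, 1))      # 0.5
# A heavily entangled module trends toward 0:
print(dhama_coupling(10, 5, 20, 30, 35))  # 0.01 (high coupling)
```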

Martin’s Coupling Metric
- Ca (afferent coupling): the number of classes outside this module that depend on classes inside this module
- Ce (efferent coupling): the number of classes inside this module that depend on classes outside this module
- Instability = Ce / (Ca + Ce)

Abstractness
- Abstractness = number of abstract classes in module / number of classes in module
- Main sequence: the “right” proportion of concrete and abstract classes relative to its dependencies
[Graph: Abstractness on the vertical axis, Instability on the horizontal axis; the main sequence is the line from (0,1) to (1,0)]
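
Putting Martin’s numbers together. The distance D = |A + I - 1| measures how far a module sits from the main sequence; D is Martin’s companion metric, not stated on the slide, and the module’s counts below are hypothetical:

```python
def instability(ca, ce):
    """I = Ce / (Ca + Ce); 0 = maximally stable, 1 = maximally unstable."""
    return ce / (ca + ce)

def abstractness(abstract_classes, total_classes):
    """A = abstract classes / total classes in the module."""
    return abstract_classes / total_classes

def distance_from_main_sequence(a, i):
    """D = |A + I - 1|: 0 means the module sits on the main sequence."""
    return abs(a + i - 1)

# Hypothetical module: 8 classes, 2 abstract; 3 outside classes depend
# on it (Ca) and it depends on 9 outside classes (Ce).
i = instability(ca=3, ce=9)               # 9/12 = 0.75
a = abstractness(2, 8)                    # 0.25
print(distance_from_main_sequence(a, i))  # |0.25 + 0.75 - 1| = 0.0
```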

Technical OO Metrics Paper (not required for the exam, but you can read it if you are interested in this topic and the slides are not sufficient):
Shyam R. Chidamber and Chris F. Kemerer, “A Metrics Suite for Object Oriented Design”, IEEE Transactions on Software Engineering, June 1994, pp. 476-493

List of metrics
- Weighted Methods per Class (WMC)
- Depth of Inheritance Tree (DIT)
- Number of Children (NOC)
- Coupling Between Object Classes (CBO)
- Response for a Class (RFC)
- Lack of Cohesion in Methods (LCOM)

Weighted Methods Per Class
- WMC for a class is the sum of the complexities of the methods in the class
- Possible method complexities:
  - 1 (so WMC = number of methods)
  - Lines of code
  - Number of method calls
  - Cyclomatic complexity
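
The sum is easy to mechanize once a per-method complexity is chosen. A minimal sketch; the class, its methods, and their complexities are made up for illustration:

```python
def wmc(method_complexities):
    """Weighted Methods per Class: sum of per-method complexities."""
    return sum(method_complexities.values())

# Hypothetical class, with cyclomatic complexity chosen as the weight:
account_methods = {"deposit": 1, "withdraw": 3, "applyInterest": 2}
print(wmc(account_methods))  # 6
# With weight 1 per method, WMC degenerates to the method count:
print(wmc({m: 1 for m in account_methods}))  # 3
```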

Weighted Methods Per Class
- The number of methods and the complexity of methods predict the time and effort required to develop and maintain a class
- The larger the number of methods in a class, the greater the potential impact on children
- Classes with large numbers of methods are more likely to be application-specific and less reusable

Depth of Inheritance Tree
- Maximum length from a class to the root of the tree

Depth of Inheritance Tree
- The deeper a class is in the hierarchy, the more methods it inherits, so it is harder to predict its behavior
- The deeper a class is in the hierarchy, the more methods it reuses
- Deeper trees are more complex
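
DIT can be computed by walking a class’s superclass chain. A Python sketch assuming single inheritance (multiple inheritance would need the maximum path length; the account hierarchy is hypothetical):

```python
def dit(cls):
    """DIT: length of the inheritance path from cls up to the root
    (object). Assumes a single superclass chain."""
    depth = 0
    while cls is not object:
        cls = cls.__bases__[0]  # step to the (only) superclass
        depth += 1
    return depth

class Account: pass
class SavingsAccount(Account): pass
class TaxFreeSavings(SavingsAccount): pass

print(dit(Account))         # 1
print(dit(TaxFreeSavings))  # 3 -- deepest, hardest to predict
```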

Number of Children
- Number of immediate subclasses
- More children means more reuse
- A class might have a lot of children because of misuse of subclassing
- A class with a large number of children is probably very important and needs a lot of testing

Number of Children
- Almost all classes have 0 children
- Only a handful of classes will have more than five children

Coupling Between Object Classes
- Number of other classes to which a class is coupled
- Class A is coupled to class B if there is a method in A that invokes a method of B
- Want to be coupled only with abstract classes high in the inheritance hierarchy
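
A sketch using the slide’s one-directional reading (A invokes B); the full CBO definition also counts field accesses and coupling in both directions. The class names and call pairs are made up:

```python
def cbo(cls_name, calls):
    """Simplified CBO: number of distinct *other* classes that
    cls_name invokes. `calls` is a list of
    (caller_class, callee_class) method-call pairs."""
    return len({callee for caller, callee in calls
                if caller == cls_name and callee != cls_name})

calls = [
    ("Order", "Customer"), ("Order", "Inventory"),
    ("Order", "Inventory"),   # repeat calls count once
    ("Order", "Order"),       # self-calls don't count
    ("Customer", "Address"),
]
print(cbo("Order", calls))  # 2 (Customer and Inventory)
```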

Coupling Between Object Classes
- Coupling makes designs hard to change
- Coupling makes classes hard to reuse
- Coupling is a measure of how hard a class is to test

Coupling Between Object Classes
- C++ project: median 0, max 84
- Smalltalk project: median 9, max 234

Response for a class
- Number of methods in a class or called by a class
- The response set of a class is the set of methods that can potentially be executed in response to a message received by an object of that class

Response for a class
- If a large number of methods can be invoked in response to a message, testing becomes more complicated
- The more methods that can be invoked from a class, the greater the complexity of the class

Response for a class
- C++: median 6, max 120
- Smalltalk: median 29, max 422

Lack of Cohesion in Methods
- Number of pairs of methods that don’t share instance variables minus number of pairs of methods that share instance variables
- Cohesiveness of methods is a sign of encapsulation
- Lack of cohesion implies classes should be split
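
A sketch of the slide’s definition. Chidamber and Kemerer additionally report LCOM as 0 when the sharing pairs outnumber the non-sharing ones (that clamp is from the paper, not the slide); the class and its methods are hypothetical:

```python
from itertools import combinations

def lcom(method_vars):
    """LCOM: P (method pairs sharing no instance variables) minus
    Q (pairs sharing at least one), clamped to 0 when Q >= P.
    `method_vars` maps method name -> set of instance variables used."""
    p = q = 0
    for a, b in combinations(method_vars.values(), 2):
        if a & b:
            q += 1  # the pair shares an instance variable
        else:
            p += 1  # the pair shares nothing
    return max(p - q, 0)

methods = {
    "deposit":  {"balance"},
    "withdraw": {"balance"},
    "printTo":  {"printer"},   # suggests the class should be split
}
print(lcom(methods))  # P=2, Q=1 -> 1
```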

Lack of cohesion of methods
- C++: median 0, max 200
- Smalltalk: median 2, max 17
- The Smalltalk system had only a few zeros

What is best?
- WMC, or a list of long methods?
- DIT, or a list of classes with depth over 6?
- NOC, or a list of classes with more than 6 children?
- CBO, or a list of classes with high coupling?

One way to use metrics
- Measure the amount of code produced each month by each programmer
- Give high producers a big raise
Is this good or bad?

Another way to use metrics
- Measure the complexity of modules
- Pick the most complex and rewrite it
Is this good or bad?

Yet another way to use metrics
- Use function points to determine how many lines of code a system will require
- When you write that many lines of code, stop and deliver the system
Is this good or bad?

Yet^2 another way to use metrics
Track progress:
- Manager review
- Information radiator
- Code reviews
- Planning
Is this good or bad?

Manager review
Manager periodically makes a report:
- Growth in SLOC
- Bugs reported and fixed
- SLOC per function point (user story), SLOC per programmer (user stories per programmer)

Information radiator
A way of letting the entire group know the state of the project:
- Green/red test runner (green/red lava lamp or green/red traffic light)
- Wall chart of predicted/actual stories
- Wall chart of predicted/actual SLOC
- Web page showing daily change in metrics (computed by the daily build)

Code reviews
Look at metrics before a code review:
- Coverage: untested code usually has bugs
- Large methods/classes
- Code with many reported bugs

Planning
Need to predict the amount of effort, to:
- Make sure you want to do it
- Hit delivery dates
- Hire enough people
How:
- Predict components and SLOC
- Predict stories & months (XP)
- Opinion of developers/tech lead/manager

Summary: Metrics (in General)
- Non-technical: about the process
- Technical: about the product
  - Size, complexity (cyclomatic, function points)
- How to use metrics:
  - Prioritize work: compare modules in a system version, compare system versions over time
  - Measure programmer productivity
  - Make plans based on predicted effort