CS427: Software Engineering I

Presentation on theme: "CS427: Software Engineering I" — Presentation transcript:

1 CS427: Software Engineering I
Grigore Rosu

2 Course Logistics
Wiki: Piazza: Compass:
All class-wide announcements through Compass or my.cs
DONE with the MPs!
Lectures taught by Profs. Marinov and Xie will be on exams!
4th credit: books on wiki page, to read and review
Office hours by request (Piazza)
Qing Ye offers regular office hours on Tu/Th 3-4pm, SC4105
Used also for makeup quizzes: Mon quiz ~> Tu; Wed quiz ~> Th
Reminder: MPs (30%), final exam (25%), project (40%), quizzes (5%)
Bonus up to 3% for active participation (lectures, Piazza, …)

3 Metrics

4 Today’s goals
How do we measure product and project progress?

5 Metrics
What can we measure?
Process: man hours, bugs reported, stories implemented
Product: measures of the product are called “technical metrics”

6 Non-technical metrics
Number of people on project
Time taken, money spent
Bugs found/reported: by testers/developers, by users
Bugs fixed, features added

7 Technical metrics
Size of code: number of files, number of classes, number of processes
Complexity of code: dependencies / coupling / cohesion, depth of nesting, cyclomatic complexity

8 Technical or non-technical?
Number of tests
Number of failing tests
Number of classes in design model
Number of relations per class
Size of user manual
Time taken by average transaction
What metrics did we have in MPs?

9 Measuring size of system
Lines of code (Source Lines of Code - SLOC)
Number of classes, functions, files, etc.
Function points
Are they repeatable? Do they work on the analysis model, design model, or code?

10 Lines of code (1)
Easy to measure
All projects produce it
Correlates to time to build
Not a very good standard
Bill Gates: “Measuring software productivity by lines of code is like measuring progress on an airplane by how much it weighs.”

11 Lines of code (2)
Blanks? Comments?
void copy(char *p, char *q) { while (*p) *q++ = *p++; }
void copy(char *p, char *q) {
  while (*p) {
    *q++ = *p++;
  }
}
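Whether blanks and comments count is a policy decision; as a minimal illustration (Python, not from the lecture, with a hypothetical whole-line comment convention), a counter can make the policy explicit:

```python
def count_sloc(source, count_blanks=False, count_comments=False):
    """Count source lines of code; blank/comment policy is configurable."""
    total = 0
    for line in source.splitlines():
        stripped = line.strip()
        if not stripped:
            # blank line: counted only under a count-blanks policy
            if count_blanks:
                total += 1
        elif stripped.startswith("//"):
            # whole-line comment: counted only under a count-comments policy
            if count_comments:
                total += 1
        else:
            # anything else is code
            total += 1
    return total
```

Under a count-everything policy, the one-line and five-line copy() variants above differ 5x while implementing identical logic.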

12 Lines of code: language matters
Assembly code may be 2-3X longer than C code C code may be 2-3X longer than Java code Java code may be 2-3X longer than …

13 Lines of code (3)
Lines of code is a valid metric when:
Same language
Standard formatting
Code has been reviewed

14 Complexity
Complex systems are:
Hard to understand
Hard to change
Hard to reuse
Some metrics for complexity:
Cyclomatic complexity (Thomas J. McCabe, Sr., 1976)
Function points
Coupling and cohesion
Many OO-specific metrics (may cover later)

15 Cyclomatic Complexity (1)
A measure of logical complexity
Tells how many tests are needed to execute every statement of the program
Number of branches (if, while, for) + 1

16 Cyclomatic Complexity (2)
void processInterest() {
  for (Account acc : accounts) {
    if (hasInterest(acc)) {
      acc.balance = acc.balance + acc.balance * acc.interest;
    }
  }
}

17 Cyclomatic Complexity (3)
[Flow graph of processInterest: entry → for → if → acc.balance = … → done]

18 Cyclomatic Complexity (4)
Number of predicates + 1
Number of edges - number of nodes + 2
Number of regions of the flow graph
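As an illustration (not from the slides), the edges-minus-nodes formula can be checked on a small hand-built flow graph shaped like the processInterest example: a loop containing one if, so two predicates:

```python
def cyclomatic_complexity(edges, nodes):
    """V(G) = E - N + 2 for a connected control-flow graph."""
    return len(edges) - len(nodes) + 2

# Hypothetical flow graph of a loop containing one if:
nodes = ["entry", "for", "if", "assign", "exit"]
edges = [
    ("entry", "for"),   # fall into the loop test
    ("for", "if"),      # loop body taken
    ("for", "exit"),    # loop finished
    ("if", "assign"),   # condition true
    ("if", "for"),      # condition false, next iteration
    ("assign", "for"),  # back edge after the assignment
]
```

Here E = 6 and N = 5, giving V(G) = 3, which agrees with counting the two predicates (for, if) and adding 1.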

19 Cyclomatic Complexity (5) Testing view
Cyclomatic complexity is the number of independent paths through the procedure
Gives an upper bound on the number of tests necessary to execute every edge of the control graph

20 Cyclomatic Complexity (6) Metrics view
McCabe found that modules with cyclomatic complexity greater than 10 were hard to test and error prone
Look for procedures with high cyclomatic complexity and rewrite them, focus testing on them, or focus reviewing on them
Good target for refactoring, reverse engineering, reengineering

21 Function points (1)
Way of measuring “functionality” of a system
Measure of how big a system ought to be
Used to predict size
Several methods of computing function points, all complicated; most are proprietary

22 Function points (2)
Count the number of inputs, number of outputs, number of algorithms, number of tables in the database
“Function points” is a function of the above, plus a fudge factor for complexity and developer expertise
You need training to measure function points
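As a hedged sketch of the counting idea (Python, not from the lecture): the weights below are the average-complexity values from the IFPUG method, used here only for illustration; real counting distinguishes low/average/high items per category and then applies the adjustment factor the slide calls a fudge factor:

```python
# Illustrative average-complexity weights (IFPUG); real counts use
# low/average/high weights per item and a value adjustment factor.
WEIGHTS = {
    "inputs": 4,               # external inputs
    "outputs": 5,              # external outputs
    "inquiries": 4,            # external inquiries
    "internal_files": 10,      # internal logical files (database tables)
    "external_interfaces": 7,  # external interface files
}

def unadjusted_function_points(counts):
    """Weighted sum of category counts; the complexity fudge factor is omitted."""
    return sum(WEIGHTS[k] * counts.get(k, 0) for k in WEIGHTS)
```

For example, a system with 3 inputs, 2 outputs, and 1 database table scores 3*4 + 2*5 + 10 = 32 unadjusted function points.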

23 Coupling and cohesion (1)
Coupling - dependences among modules
Cohesion - dependences within modules
Dependences: call methods, refer to a class, share a variable
Coupling - bad; cohesion - good

24 Coupling and cohesion (2)
Number and complexity of shared variables
Functions in a module should share variables; functions in different modules should not
Number and complexity of parameters
Number of functions/modules that are called
Number of functions/modules that call me

25 Coupling and Cohesion in OO Languages

26 Dhama’s Coupling Metric
Module coupling = 1 / (number of input parameters + number of output parameters + number of global variables used + number of modules called + number of modules calling)
0.5 indicates low coupling; values closer to 0 indicate high coupling
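A direct transcription of the formula (Python, illustrative only):

```python
def dhama_coupling(inputs, outputs, globals_used, fan_out, fan_in):
    """Dhama's module coupling: 1 / (sum of interface connections).
    Larger values mean lower coupling; values near 0 mean high coupling."""
    return 1.0 / (inputs + outputs + globals_used + fan_out + fan_in)
```

A module with one input parameter and one caller scores 1/2 = 0.5 (low coupling); a module with many parameters, globals, callers, and callees scores close to 0.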

27 Martin’s Coupling Metric
Ca: afferent coupling - the number of classes outside this module that depend on classes inside this module
Ce: efferent coupling - the number of classes inside this module that depend on classes outside this module
Instability = Ce / (Ca + Ce)
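The instability formula is simple to transcribe (Python, illustrative; returning 0 for an isolated module is a choice made here, not part of the definition):

```python
def instability(ca, ce):
    """Martin's instability I = Ce / (Ca + Ce).
    0 = maximally stable (only depended upon), 1 = maximally unstable."""
    return ce / (ca + ce) if (ca + ce) else 0.0
```

A module with many dependents and few dependencies (e.g. Ca = 3, Ce = 1) is stable at I = 0.25; one that depends on others but has no dependents (Ca = 0) is maximally unstable at I = 1.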

28 Abstractness
Abstractness = number of abstract classes in module / number of classes in module
Main sequence: the “right” proportion of concrete and abstract classes for a module’s dependencies - the line from (Instability 0, Abstractness 1) to (Instability 1, Abstractness 0)
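As an illustrative sketch (Python): abstractness as defined above, plus Martin's usual normalized distance from the main sequence, D = |A + I - 1|, which the (0,1)-to-(1,0) line suggests but the slide does not name:

```python
def abstractness(num_abstract, num_total):
    """A = abstract classes / total classes in the module."""
    return num_abstract / num_total

def distance_from_main_sequence(a, i):
    """Martin's D = |A + I - 1|: distance from the line A + I = 1.
    0 means the module balances abstractness against instability."""
    return abs(a + i - 1)
```

A half-abstract, half-unstable module (A = 0.5, I = 0.5) sits exactly on the main sequence; a fully concrete, fully stable module (A = 0, I = 0) is at maximal distance 1.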

29 Technical OO Metrics
Paper (not required for the exam, but you can read it if you are interested in this topic and the slides are not sufficient):
“A Metrics Suite for Object Oriented Design”, Shyam R. Chidamber and Chris F. Kemerer, IEEE Transactions on Software Engineering, June 1994

30 List of metrics
Weighted Methods Per Class (WMC)
Depth of Inheritance Tree (DIT)
Number of Children (NOC)
Coupling between Object Classes (CBO)
Response for a Class (RFC)
Lack of Cohesion in Methods (LCOM)

31 Weighted Methods Per Class
WMC for a class is the sum of the complexities of the methods in the class
Possible method complexities:
1 (WMC is then just the number of methods)
Lines of code
Number of method calls
Cyclomatic complexity
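Since WMC is just a sum over a chosen per-method complexity, a sketch is trivial (Python, illustrative):

```python
def wmc(method_complexities):
    """Weighted Methods per Class: sum of per-method complexities.
    With every method weighted 1, this is just the method count."""
    return sum(method_complexities)
```

With unit weights, a class with five methods has WMC 5; with cyclomatic complexity as the weight, methods of complexity 2, 5, and 1 give WMC 8.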

32 Weighted Methods Per Class
The number of methods and the complexity of methods predict the time and effort required to develop and maintain a class
The larger the number of methods in a class, the greater the potential impact on children
Classes with large numbers of methods are more likely to be application specific and less reusable

33 Weighted Methods Per Class

34 Depth of Inheritance Tree
Maximum length from a class to the root of the tree
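A minimal sketch (Python; it assumes single inheritance and a hypothetical parent map, so the "maximum length" from the definition collapses to the single path to the root):

```python
def dit(cls, parent):
    """Depth of Inheritance Tree: edges from cls up to the root.
    `parent` maps each class name to its superclass, or None for the root."""
    depth = 0
    while parent[cls] is not None:
        cls = parent[cls]
        depth += 1
    return depth

# Hypothetical hierarchy: Object <- Shape <- Circle
parent = {"Object": None, "Shape": "Object", "Circle": "Shape"}
```

Here Circle has DIT 2 and the root Object has DIT 0.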

35 Depth of Inheritance Tree
The deeper a class is in the hierarchy, the more methods it inherits, so it is harder to predict its behavior
The deeper a class is in the hierarchy, the more methods it reuses
Deeper trees are more complex

36 Depth of Inheritance Tree

37 Number of Children
Number of immediate subclasses
More children means more reuse
A class might have a lot of children because of misuse of subclassing
A class with a large number of children is probably very important and needs a lot of testing

38 Number of Children
Almost all classes have 0 children
Only a handful of classes will have more than five children

39 Coupling Between Object Classes
Number of other classes to which a class is coupled
Class A is coupled to class B if there is a method in A that invokes a method of B
Want to be coupled only with abstract classes high in the inheritance hierarchy

40 Coupling Between Object Classes
Coupling makes designs hard to change
Coupling makes classes hard to reuse
Coupling is a measure of how hard a class is to test

41 Coupling Between Object Classes
C++ project: median 0, max 84
Smalltalk project: median 9, max 234

42 Response for a class
Number of methods in a class or called by a class
The response set of a class is a set of methods that can potentially be executed in response to a message received by an object of that class
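A set-based sketch of a simplified, one-level response set (Python; the method names are hypothetical):

```python
def rfc(own_methods, calls):
    """Response For a Class: the class's own methods plus the methods
    they directly invoke, counted as a set (no double counting)."""
    response_set = set(own_methods)
    for m in own_methods:
        response_set.update(calls.get(m, []))
    return len(response_set)

# Hypothetical class with two methods and their outgoing calls:
calls = {"deposit": ["log", "notify"], "withdraw": ["log", "checkBalance"]}
```

For this example the response set is {deposit, withdraw, log, notify, checkBalance}, so RFC = 5; note that log is shared and counted once.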

43 Response for a class
If a large number of methods can be invoked in response to a message, testing becomes more complicated
The more methods that can be invoked from a class, the greater the complexity of the class

44 Response for a class
C++: median 6, max 120
Smalltalk: median 29, max 422

45 Lack of Cohesion in Methods
Number of pairs of methods that don’t share instance variables minus number of pairs of methods that share instance variables
Cohesiveness of methods is a sign of encapsulation
Lack of cohesion implies classes should be split
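A sketch of the Chidamber-Kemerer definition (Python, illustrative; the published metric floors the result at zero, a detail the slide omits):

```python
from itertools import combinations

def lcom(uses):
    """LCOM: method pairs sharing no instance variable (P) minus pairs
    sharing at least one (Q), floored at 0. `uses` maps each method
    to the instance variables it reads or writes."""
    p = q = 0
    for a, b in combinations(list(uses), 2):
        if set(uses[a]) & set(uses[b]):
            q += 1  # pair shares an instance variable
        else:
            p += 1  # pair is disjoint
    return max(p - q, 0)
```

For a hypothetical class where m1 and m2 both use x but m3 only uses y, P = 2 and Q = 1, so LCOM = 1, hinting that m3 might belong in a separate class.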

46 Lack of cohesion of methods
C++: median 0, max 200
Smalltalk: median 2, max 17
The Smalltalk system had only a few classes with LCOM of zero

47 What is best?
WMC or list of long methods?
DIT or list of classes with depth over 6?
NOC or list of classes with more than 6 children?
CBO or list of classes with high coupling?

48 One way to use metrics
Measure the amount of code produced each month by each programmer
Give high producers a big raise
Is this good or bad?

49 Another way to use metrics
Measure complexity of modules
Pick the most complex and rewrite it
Is this good or bad?

50 Yet another way to use metrics
Use function points to determine how many lines of code a system will require
When you write that many lines of code, stop and deliver the system
Is this good or bad?

51 Yet^2 another way to use metrics
Track progress:
Manager review
Information radiator
Code reviews
Planning
Is this good or bad?

52 Manager review
Manager periodically makes a report:
Growth in SLOC
Bugs reported and fixed
SLOC per function point (user story), SLOC per programmer (user stories per programmer)

53 Information radiator
Way of letting the entire group know the state of the project
Green/red test runner (green/red lava lamp or green/red traffic light)
Wall chart of predicted/actual stories
Wall chart of predicted/actual SLOC
Web page showing daily change in metrics (computed by the daily build)

54 Code reviews
Look at metrics before a code review:
Coverage - untested code usually has bugs
Large methods/classes
Code with many reported bugs

55 Planning
Need to predict the amount of effort:
Make sure you want to do it
Hit delivery dates
Hire enough people
How to predict:
Predict components and SLOC
Predict stories & months (XP)
Opinion of developers/tech lead/manager

56 Summary: Metrics (in General)
Non-technical: about process
Technical: about product - size, complexity (cyclomatic, function points)
How to use metrics:
Prioritize work - compare modules in a system version, compare system versions over time
Measure programmer productivity
Make plans based on predicted effort

