Slide 1: Management Overview
McCabe & Associates, 9861 Broken Land Parkway, Fourth Floor, Columbia, Maryland 21046
800-638-6316 | www.mccabe.com | support@mccabe.com | 1-800-634-0150
Slide 2: Agenda
• McCabe IQ Overview
• Software Measurement Issues
• McCabe Concepts
• Software Quality Metrics
• Software Testing
• Questions and Answers
Slide 3: About McCabe & Associates
• 20 years of expertise
• Global presence
• Analyzed over 25 billion lines of code
Slide 4: McCabe IQ Process Flow
Source code enters McCabe IQ on the analysis platform, which produces instrumented source code. That code is compiled and run on the target platform, and the resulting execution log is fed back into McCabe IQ to support effective testing and quality management.
Slide 5: McCabe IQ and Configuration Management
McCabe IQ integrates with configuration management tools such as Merant PVCS, Rational ClearCase, and CA Endevor to monitor quality as software changes and to manage the test environment; execution logs from the test environment drive effective testing and quality management.
Slide 6: McCabe IQ and Test Automation
McCabe IQ integrates with Mercury Interactive's TestDirector (test management) and WinRunner (GUI test automation), as well as non-GUI test automation. Source code analysis plus the execution log from the test executable support risk-driven test management and effective, automated testing.
Slide 7: McCabe IQ Components
• Testing: McCabe Test, McCabe TestCompress, McCabe Slice, McCabe ReTest
• Quality assurance: McCabe QA, McCabe Data, McCabe Compare, McCabe Change
• McCabe IQ Framework (metrics, data, visualization, testing, API)
• Source code parsing technology (C, C++, Java, Visual Basic, COBOL, Fortran, Ada)
Slide 8: McCabe QA
McCabe QA measures software quality with industry-standard metrics:
– Manage technical risk factors as software is developed and changed
– Improve software quality using detailed reports and visualization
– Shorten the time between releases
– Develop contingency plans to address unavoidable risks
Slide 9: McCabe Data
McCabe Data pinpoints the impact of data variable modifications:
– Identify usage of key data elements and data types
– Relate data variable changes to impacted logic
– Focus testing resources on the usage of selected data
Slide 10: McCabe Compare
McCabe Compare identifies reusable and redundant code:
– Simplify maintenance and re-engineering of applications through the consolidation of similar code modules
– Search for software defects in similar code modules, to make sure they're fixed consistently throughout the software
Slide 11: McCabe Change
McCabe Change identifies new and changed modules:
– Manage change with more precision than the file-level information from CM tools
– Work with a complete technical risk profile: is the module complex? poorly tested? new or changed?
– Focus review and test efforts
Slide 12: McCabe Test
McCabe Test maximizes testing effectiveness:
– Focus testing on high-risk areas
– Objectively measure testing effectiveness
– Increase the failure detection rate during internal testing
– Assess the time and resources needed to ensure a well-tested application
– Know when to stop testing
Slide 13: McCabe Slice
McCabe Slice traces functionality to implementation:
– Identifies code that implements specific functional transactions
– Isolates code that is unique to the implementation of specific functional transactions
– Helps extract business rules for application redesign
Slide 14: McCabe IQ Components Summary
• McCabe QA: improve quality with metrics
• McCabe Data: analyze data impact
• McCabe Compare: eliminate duplicate code
• McCabe Change: focus on changed software
• McCabe Test: increase test effectiveness
• McCabe TestCompress: increase test efficiency
• McCabe Slice: trace functionality to code
• McCabe ReTest: automate regression testing
Slide 15: Software Measurement Issues
• Risk management
• Software metrics
• Complexity metrics
• Complexity metric evaluation
• Benefits of complexity measurement
Slide 16: Software Risk Management
• Software risk falls into two major categories
– Non-technical risk: how important is the system? (usually known early)
– Technical risk: how likely is the system to fail? (often known too late)
• Complexity analysis quantifies technical risk
– Helps quantify reliability and maintainability, which aids prioritization, resource allocation, contingency planning, etc.
– Guides testing: focuses effort on mitigating the greatest risks and helps deploy testing resources efficiently
Slide 17: Software Metrics Overview
• Metrics are quantitative measures
– Operational: cost, failure rate, change effort, ...
– Intrinsic: size, complexity, ...
• Most operational metrics are known too late: cost and failure rate are only known after deployment, so they aren't suitable for risk management
• Complexity metrics are available immediately, calculated from source code
• Complexity predicts operational metrics: it correlates with defects, maintenance costs, ...
Slide 18: Complexity Metric Evaluation
• Good complexity metrics have three properties
– Descriptive: objectively measure something
– Predictive: correlate with something interesting
– Prescriptive: guide risk reduction
• Consider lines of code
– Descriptive: yes, measures software size
– Predictive, prescriptive: no
• Consider cyclomatic complexity
– Descriptive: yes, measures decision logic
– Predictive: yes, predicts errors and maintenance effort
– Prescriptive: yes, guides testing and improvement
Slide 19: Benefits of Complexity Measurement
• Complexity metrics are available from code, and can even be estimated from a design
• They provide continuous feedback and can identify high-risk software as soon as it is written or changed
• They pinpoint areas of potential instability, focusing resources for reviews, testing, and code improvement
• They help predict eventual operational metrics: systems with similar complexity metric profiles tend to have similar test effort, cost, error frequency, ...
Slide 20: McCabe Concepts
Definition: in C and C++, a module is a function or subroutine with a single entry point and a single exit point. A module is represented by a rectangular box on the Battlemap. The example Battlemap shows modules main, a, c, d, and printf, shaded to distinguish a difficult-to-maintain module, a difficult-to-test module, a well-designed testable module, and a library module.
Slide 21: Analyzing a Module
For each module, an annotated source listing and flowgraph are generated. A flowgraph is an architectural diagram of a software module's logic.

Stmt  Code
1     main()
2     {
3     printf("example");
4     if (y > 10)
5         b();
6     else
7         c();
8     printf("end");
9     }

In the flowgraph, a node is a statement or block of sequential statements (here the blocks 1-3 and 8-9, the condition at 4, and the branches at 5 and 7), and an edge is the flow of control between nodes. The Battlemap shows main calling b, c, and printf.
Slide 22: Flowgraph Notation (C)
Each C decision construct has a characteristic flowgraph shape:
• if (i) ;
• if (i) ; else ;
• if (i || j) ;
• if (i && j) ;
• while (i) ;
• do ; while (i);
• switch (i) { case 0: break; ... }
Slide 23: Flowgraph and Its Annotated Source Listing
The annotated listing numbers each flowgraph node (0 through 9, with asterisks marking decision nodes such as 1*, 4*, and 6*) and ties the graph to the source: it shows origin information, node correspondence, metric information, and each decision construct.
Slide 24: Low-Complexity Software
• Reliable
– Simple logic (low cyclomatic complexity)
– Not error-prone
– Easy to test
• Maintainable
– Good structure (low essential complexity)
– Easy to understand
– Easy to modify
Slide 25: Moderately Complex Software
• Unreliable
– Complicated logic (high cyclomatic complexity)
– Error-prone
– Hard to test
• Maintainable
– Can be understood
– Can be modified
– Can be improved
Slide 26: Highly Complex Software
• Unreliable
– Error-prone
– Very hard to test
• Unmaintainable
– Poor structure (high essential complexity)
– Hard to understand
– Hard to modify
– Hard to improve
Slide 27: Would You Buy a Used Car from This Software?
• Problem: there are size and complexity boundaries beyond which software becomes hopeless
– Too error-prone to use
– Too complex to fix
– Too large to redevelop
• Solution: control complexity during development and maintenance, and stay away from the boundary
Slide 28: Important Complexity Measures
• Cyclomatic complexity, v(G): amount of decision logic
• Essential complexity, ev(G): amount of poorly structured logic
• Module design complexity, iv(G): amount of logic involved with subroutine calls
• Data complexity, sdv: amount of logic involved with selected data references
Slide 29: Cyclomatic Complexity
• The most famous complexity metric
• Measures the amount of decision logic
• Identifies unreliable, hard-to-test software
• The related test thoroughness metric, actual complexity, measures testing progress
Slide 30: Cyclomatic Complexity
Cyclomatic complexity, v: a measure of the decision logic of a software module.
– Applies to decision logic embedded within written code.
– Is derived from the predicates in the decision logic.
– Is calculated for each module in the Battlemap.
– Grows from 1 to a high (but finite) number with the amount of decision logic.
– Correlates with software quality and testing effort: modules with higher v (v > 10) are less reliable and require more testing.
Slide 31: Cyclomatic Complexity
Three equivalent ways to calculate v for the example flowgraph:
• Region method: count the regions of the planar graph (beware of crossing lines); the example has regions R1 through R11, so v = 11.
• Edge-and-node method: v = e − n + 2; with e = 24 edges and n = 15 nodes, v = 24 − 15 + 2 = 11.
• Predicate method: v = (sum of predicate counts) + 1; the example's predicates sum to 10, so v = 10 + 1 = 11.
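The edge-and-node arithmetic is simple enough to check by hand or in code. A minimal sketch using the slide's numbers (the helper function is hypothetical, not part of McCabe IQ):

#include <stdio.h>

/* v(G) = e - n + 2p, where p is the number of connected
   components; p = 1 for a single module's flowgraph. */
static int cyclomatic(int edges, int nodes, int components)
{
    return edges - nodes + 2 * components;
}

int main(void)
{
    printf("v = %d\n", cyclomatic(24, 15, 1));  /* prints: v = 11 */
    return 0;
}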
Slide 32: Vital Signs and High v's
The chart tracks v rising over time (from 5 past 10 and 15 toward 20). Risks of increasing v:
• Higher risk of failures
• Difficult to understand
• Unpredictable expected results
• Complicated test environments, including more test drivers
• Knowledge-transfer constraints for new staff
Slide 33: Essential Complexity
• Measures the amount of poorly structured logic
• Remove all well-structured logic, then take the cyclomatic complexity of what's left
• Identifies unmaintainable software
• The pathological complexity metric is similar: it identifies extremely unmaintainable software
Slide 34: Essential Complexity
Essential complexity, ev: a measure of the "structuredness" of a software module's decision logic.
– Applies to decision logic embedded within written code.
– Is calculated for each module in the Battlemap.
– Grows from 1 to v with the amount of unstructured decision logic.
– Is associated with the ability to modularize complex modules.
– If ev increases, the coder is not using structured programming constructs.
Slide 35: Essential Complexity - Unstructured Logic
Four patterns of unstructured logic:
• Branching out of a loop
• Branching into a loop
• Branching out of a decision
• Branching into a decision
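A hedged illustration of the first pattern (hypothetical code, not from the slides): the goto below branches out of the loop, so the loop is no longer a single-entry, single-exit construct and survives flowgraph reduction, driving ev up. The second version expresses the same search with purely structured constructs, so it reduces completely and ev = 1.

/* Unstructured: branches out of the loop */
int find_unstructured(const int *a, int n, int key)
{
    int i;
    for (i = 0; i < n; i++) {
        if (a[i] == key)
            goto found;          /* branch out of the loop */
    }
    return -1;
found:
    return i;
}

/* Structured: single-entry, single-exit throughout */
int find_structured(const int *a, int n, int key)
{
    int i = 0, result = -1;
    while (i < n && result < 0) {   /* loop exits only via its condition */
        if (a[i] == key)
            result = i;
        else
            i++;
    }
    return result;
}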
Slide 36: Essential Complexity - Flowgraph Reduction
Essential complexity, ev, is calculated by reducing the module flowgraph: decisions that conform to single-entry, single-exit constructs are removed. In the example, a flowgraph with cyclomatic complexity 4 reduces completely, so its essential complexity is 1.
Slide 37: Essential Complexity
The example shows a flowgraph (v = 5) and the reduced flowgraph after its structured constructs have been removed, revealing the unstructured decisions. The reduced flowgraph has v = 3, so the ev of the original flowgraph is 3 (shown as an essential flowgraph superimposed on the original).
Slide 38: Essential Complexity
Essential complexity helps detect unstructured code, and good designs can quickly deteriorate: the example contrasts a module with v = 10 and ev = 1 against one with v = 11 and ev = 10.
Slide 39: Vital Signs and High ev's
The chart tracks ev rising over time (from 1 past 3 and 6 toward 10). Risks of increasing ev:
• Intricate logic
• Conflicting decisions
• Unrealizable test paths
• Constraints on architectural improvement
• Difficult knowledge transfer to new staff
Slide 40: How to Manage and Reduce v and ev
Decreasing and managing v and ev over time relies on:
• Emphasis on design architecture and methodology
• Development and coding standards
• QA procedures and reviews
• Peer evaluations
• Automated tools
• Application portfolio management
• Modularization
Slide 41: Module Design Complexity
How much supervising is done?
Slide 42: Module Design Complexity
• Measures the amount of decision logic involved with subroutine calls
• Identifies "managerial" modules
• Indicates design reliability and integration testability
• The related test thoroughness metric, tested design complexity, measures integration testing progress
Slide 43: Module Design Complexity
Module design complexity, iv: a measure of the decision logic that controls calls to subroutines.
– Applies to decision logic embedded within written code.
– Is derived from the predicates in decision logic associated with calls.
– Is calculated for each module in the Battlemap.
– Grows from 1 to v with the complexity of the logic governing the calls.
– Is related to the degree of "integratedness" between a calling module and its called modules.
Slide 44: Module Design Complexity
Module design complexity, iv, is calculated by reducing the module flowgraph: decisions and nodes that do not impact the calling control over the module's immediate subordinates are removed.
Slide 45: Module Design Complexity
Example: main (v = 5) calls progd and proge.

main()
{
    if (a == b)
        progd();
    if (m == n)
        proge();
    switch (expression) {            /* does not impact calls */
    case value_1: statement1; break;
    case value_2: statement2; break;
    case value_3: statement3;
    }
}

The switch contains no subroutine calls, so it is removed during reduction. The reduced flowgraph, which keeps only the decisions controlling the calls to progd() and proge(), has v = 3; therefore the iv of the original flowgraph is 3.
Slide 46: Data Complexity
• Actually a family of metrics: global data complexity (global and parameter), specified data complexity, and date complexity
• Measures the amount of decision logic involved with selected data references
• Indicates data impact and data testability
• The related test thoroughness metric, tested data complexity, measures data testing progress
Slide 47: Data Complexity Calculation
The example module M has v = 6, with conditions C1-C5; the selected data A is referenced at the starred nodes (4* and 9*). Reducing M to the logic that reaches those references yields:

Path                        Conditions
P1: 1-2-3-4-9-3-4-9-12      C1 = T, C2 = T, then C2 = F
P2: 1-2-12                  C1 = F
P3: 1-2-3-4-9-12            C1 = T, C2 = F

Data complexity of M with respect to A = 3.
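A hypothetical sketch of the idea (my code, not the slide's example): to measure the logic tied to one variable, keep only the decisions that can reach a reference to it, then take v of the reduced graph.

int m(int c1, int c2, int a)
{
    int total = 0;
    if (c1) {                  /* kept: guards the logic that uses 'a' */
        while (c2 > 0) {       /* kept: controls the reference to 'a'  */
            total += a;        /* selected data reference              */
            c2--;
        }
    }
    if (total > 100)           /* dropped in reduction: no reference   */
        total = 100;           /* to 'a' is reachable from here        */
    return total;
}

/* Full module: v = 4.  Reduced with respect to 'a': the final
   decision drops out, so this sketch gives a data complexity of 3. */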
Slide 48: Module Metrics Report
For each module, the report lists v (the number of unit test paths) and iv (the number of integration tests). It also totals the test paths across all modules and gives the average number of test paths per module.
Slide 49: Common Testing Challenges
• Deriving tests: creating a "good" set of tests
• Verifying tests: confirming that enough testing was performed, and providing evidence that the testing was good enough
• Knowing when to stop testing
• Prioritizing tests: ensuring that critical or modified code is tested first
• Reducing test duplication: identifying similar tests that add little value, and removing them
Slide 50: An Improved Testing Process
Requirements drive black-box test scenarios against the implementation; sub-system or system analysis adds white-box testing through static identification of test paths.
Slide 51: What Is McCabe Test?
Source code is parsed into the analysis database, and an instrumented copy of the source is built into a test executable. Executing the code produces trace information, which is imported as coverage. The database then supports requirements tracing, test coverage reports, untested-path reports, and export.
Slide 52: Coverage Mode
A color scheme represents coverage. Before a trace file is imported, no coverage is displayed.
Slide 53: Coverage Results
Once a trace file is imported, colors show "testedness": each module appears as untested, partially tested, or tested (the example shows a module, My_Function, at 67%). Lines show execution between modules. The color scheme can reflect branches, paths, or lines of code.
Slide 54: Coverage Results at the Unit Level
Module -> Slice displays coverage results at the unit level.
Slide 55: Deriving Functional Tests
• Examine partially tested modules
• Visualize untested modules
• Module names provide insight into additional tests (the example highlights an untested module named 'search')
Slide 56: Deriving Tests at the Unit Level
The example module has 10^18 statistical paths: far too many theoretical tests. Zero tests are too few; 10^18 are too many. What is the minimum number of tests, and what is a "good" number, for minimum yet effective testing?
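To see how the path count explodes while the basis stays linear, consider a back-of-envelope example (mine, not the slide's): a module with 60 independent if statements in sequence has 2^60 ≈ 1.15 × 10^18 distinct execution paths, yet its cyclomatic complexity is only v = 60 + 1 = 61. Exhaustive path testing is hopeless, while a basis set of 61 tests is achievable and still exercises each decision independently.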
Slide 57: Code Coverage
Which function is more complex, example 'A' or example 'B'?
Slide 58: Using Code Coverage
Both example 'A' and example 'B' need only 2 tests for full code coverage: code coverage is not proportional to complexity.
Slide 59: McCabe's Cyclomatic Complexity
v(G) is the number of linearly independent paths. In the example, one additional path beyond those needed for coverage is required to establish the independence of the two decisions.
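A hedged sketch of that extra path (hypothetical code, not the slides' examples 'A' and 'B'): the module below has v = 3, so its basis holds three paths. Two tests, (a=1, b=1) and (a=0, b=0), already achieve full statement and branch coverage, but they never separate the two decisions; the third basis path, (a=0, b=1), is exactly the one where the second decision reads a value the first decision never set.

void f(int a, int b, int *x, int *y)
{
    if (a)
        *x = 1;          /* first decision writes *x      */
    if (b)
        *y = *x + 1;     /* second decision depends on *x */
}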
Slide 60: Deriving Tests at the Unit Level
For a module with complexity = 10, a minimum of 10 tests will both ensure code coverage and test the independence of decisions.
Slide 61: Unit-Level Test Paths - Baseline Method
The baseline method is a technique for locating distinct paths within a flowgraph; the size of the basis set equals v(G). For the example flowgraph (v = 5, nodes A-G, with decisions M=N, O=P, S=T, and X=Y):

Basis set of paths    Path conditions
P1: ABCBDEF           M=N, O=P, S=T, then O≠P
P2: AGDEF             M≠N, X=Y
P3: ABDEF             M=N, O≠P
P4: ABCF              M=N, O=P, S≠T
P5: AGEF              M≠N, X≠Y
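In practice (a hedged restatement of the method as commonly described, not a quote from the slide): choose a baseline path through the module, typically the most important functional path, here P1. Then generate each remaining basis path by flipping exactly one decision encountered along an already-generated path while rejoining it as soon as possible; P2 through P4 each flip one decision of P1, P5 retraces P2 but flips X=Y, and the process stops after v − 1 = 4 flips.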
Slide 62: Structured Testing Coverage
1. Structured testing generates independent tests. Basis set (nodes A-P, regions R1-R5):
P1: ACDEGHIKLMOP
P2: ABD...
P3: ACDEFH...
P4: ACDEGHIJL...
P5: ACDEGHIKLMNP
2. Code coverage, as frequency of execution per node:
Node:  A B C D E F G H I J K L M N O P
Count: 5 1 4 5 5 1 4 5 5 1 4 5 5 1 4 5
Slide 63: Other Baselines - Different Coverage
1. A different baseline also generates independent tests. Basis set:
P1: ABDEFHIJLMNP
P2: ACD...
P3: ABDEGH...
P4: ABDEGHIKL...
P5: ABDEGHIKLMOP
2. Code coverage, as frequency of execution per node:
Node:  A B C D E F G H I J K L M N O P
Count: 5 4 1 5 5 4 1 5 5 4 1 5 5 4 1 5
The previous baseline's coverage was:
Node:  A B C D E F G H I J K L M N O P
Count: 5 1 4 5 5 1 4 5 5 1 4 5 5 1 4 5
Same number of tests; which coverage is more effective?
Slide 64: Untested Paths at the Unit Level
• Cyclomatic test paths: Module -> Test Paths (reports complete test paths by default)
• Configurable reports: Preferences -> Testing, then modify the list of graph/test-path flowgraphs
• The report shows the remaining untested test paths
Slide 65: Untested Branches at the Unit Level
In Preferences -> Testing, add the 'Tested Branches' flowgraph to the list; Module -> Test Paths then shows the number of executions for each decision and the untested branches.
Slide 66: Untested Paths at a Higher Level
System-level integration paths (the example has S1 = 6):
– Based on S1
– End-to-end execution
– Includes all iv(G) paths
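As a hedged aside on the arithmetic (the formula is the standard design-complexity one; the module counts are my illustration): integration complexity is computed as S1 = S0 − n + 1, where S0 is the sum of iv(G) over the design's n modules. A hypothetical five-module design whose iv values sum to 10 would give S1 = 10 − 5 + 1 = 6 end-to-end integration tests, matching the example's S1 = 6.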
Slide 67: Untested Paths at a Higher Level
System-level integration paths (S1 = 6):
– Displayed graphically or as a textual report
– Theoretical execution paths
– Can be filtered to show only untested paths
Slide 68: Untested Paths at a Higher Level
The textual report of end-to-end decisions lists decision values with their line/node numbers and the module calling list.
Slide 69: Verifying Tests
• Use coverage to verify tests
• Store coverage results in the repository
• Use execution flowgraphs to verify tests
Slide 70: Verifying Tests Using Coverage
Four major coverage techniques, each reported as a percentage from 0% to 100%:
– Code coverage
– Branch coverage
– Path coverage
– Boolean coverage (MC/DC)
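A hedged sketch of how the techniques differ in strength, on a hypothetical compound condition (the fire() stub is mine):

static void fire(void) { /* action under guard */ }

static void guard(int a, int b)
{
    if (a && b)          /* compound condition under test */
        fire();
}

/* Coverage required for guard():
   Code coverage:    guard(1,1)                 -- every line executes
   Branch coverage:  guard(1,1), guard(0,0)     -- both decision outcomes
   MC/DC:            guard(1,1), guard(0,1), guard(1,0)
                     -- each operand is shown to independently flip
                        the decision
   Path coverage:    matches branch coverage here, but grows
                     exponentially once decisions are chained */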
Slide 71: When to Stop Testing
• Use coverage to assess testing completeness (branch coverage reports)
• Track coverage increments: how much new coverage does each new test or set of tests add?
Slide 72: When to Stop Testing
• Is all of the system equally important?
• Is all code in an application used equally?
• Typically 10% of the code is used 90% of the time, and the remaining 90% only 10% of the time
• Where do we need to test most?
Slide 73: When to Stop Testing / Prioritizing Tests
• Locate "critical" code: important functions, modified functions, problem functions
• Mark those modules and create a new "critical" group
• Import coverage
• Assess coverage for the critical code: run the coverage report for the critical group and examine untested branches (the example shows Runproc at 67%, Search at 52%, and My_Function untested)
Slide 74: Criticality Coverage
• Optionally use several "critical" groups with increasing criticality levels
• Determine coverage for each group; in the example the groups measure 90%, 70%, 50%, 30%, and 25%, and the low numbers flag insufficient testing
• Focus testing effort on critical code
• Useful as a management technique
Slide 75: When to Stop Testing
Via Testing -> Load/Save Testing Data:
• Store coverage in the repository, with name and author
• Load coverage: multiple selections, shared between users, or imported between analyses with common code
Slide 76: Testing the Changes
Import previous coverage results into a new analysis: version 1.0's coverage results carry into the version 1.1 analysis, the parser detects changed code, and coverage is removed for modified or new code.
Slide 77: Testing the Changes
• Store coverage for each version
• Use metrics trending to show coverage increments
• The objective is to increase coverage between releases
Slide 78: McCabe Change
• Marks changed code: reports show change status, and coverage reports can be limited to changed modules
• Configurable change detection: standard metrics or string comparison
Slide 79: Manipulating Coverage
The technique: addition and subtraction of slices.
Slide 80: Slice Manipulation
• Manipulate slices using set theory (slice operations)
• Export a slice to a file as a list of executed lines
• Must be in slice mode
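A small worked illustration of the set arithmetic (hypothetical line numbers): if the slice for test T1 executed lines {10, 11, 12, 20, 21} and the slice for test T2 executed lines {10, 11, 12, 30}, then the union T1 + T2 = {10, 11, 12, 20, 21, 30} gives the combined coverage, while the subtraction T1 − T2 = {20, 21} isolates the code unique to T1's transaction; this is how slice manipulation pinpoints the implementation of a single piece of functionality.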
Slide 81: Review
• McCabe IQ products
• Metrics: cyclomatic complexity (v), essential complexity (ev), module design complexity (iv)
• Testing: deriving tests, verifying tests, prioritizing tests, when is testing complete?
• Managing change