
Software Metrics Unification and Productivity Domain Workshop Summary
Brad Clark, Ray Madachy, Barry Boehm, Vu Nguyen
University of Southern California, Center for Systems and Software Engineering
24th International Forum on COCOMO and Systems/Software Cost Modeling, November 5, 2009

Workshop Participants
Tony Abolfotouh, Jill Allen, Redge Bartholomew, Barry Boehm*, Winsor Brown, Brad Clark*, Linda Esker, Dennis Goldenson, Jared Fortune, John Gaffney, Gary Hafen, Barbara Hitchings, Qi Li, Dan Ligett, Ray Madachy*, Ali Malik, Vu Nguyen, Wilson Rosa, Rick Selby, Susan Stefanec, Tom Tan, Ye Yang
* Workshop moderators

Topics Covered
- Source Lines of Code (SLOC) Types
- SLOC Conversion Ratios Experiment
- Requirements Volatility Measurements
- Operational Environment to Operational Context Mapping
- Productivity Domains to Application Difficulty Mapping
- Insights on Software Maintenance

Source Lines of Code (SLOC) Types
Ray Madachy, moderator

Software Size Categories Overview
- Reviewed core software size type definitions and adaptation parameters
- Participants shared experiences with size categories and domain-specific practices for reused/modified/converted/translated software
- Walked through modified-code exercises

University of Southern California Center for Systems and Software Engineering Core Software Size Types

Software Size Type Results
- Discussions forced clarification of categories and crisper definitions
- Practical sizing guidance captured in adaptation parameter ranges
  – e.g., the range tops are set by the maximum values beyond which adapted code would instead be replaced with new software
- Created model-agnostic AAF weight ranges (see the sketch below)
- Added sub-categories for generated, converted, and translated code to distinguish what is counted when applying equivalent size
  – Generator statements vs. generated code
  – Translated as-is vs. optimized
  – Converted as-is vs. optimized
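To make the adaptation discussion concrete, the sketch below computes an Adaptation Adjustment Factor (AAF) and the resulting equivalent size. It assumes the classic COCOMO II weights (0.4 DM, 0.3 CM, 0.3 IM) as placeholders; the model-agnostic weight ranges produced in the workshop are not reproduced here, and the 50% cutoff follows the exercise footnote.

```python
# Minimal sketch of an Adaptation Adjustment Factor (AAF) calculation for
# equivalent size. Weights default to the classic COCOMO II values
# (0.4 / 0.3 / 0.3); treat them as placeholders, not the workshop's ranges.

def equivalent_sloc(adapted_sloc, dm, cm, im,
                    w_dm=0.4, w_cm=0.3, w_im=0.3):
    """Return equivalent new SLOC for adapted code.

    dm, cm, im -- percent design modified, code modified, and
                  integration/test effort, each on a 0-100 scale.
    """
    if dm > 50 or cm > 50:
        # Workshop footnote: beyond ~50% modification, the adapted code is
        # effectively replaced, so count it as new code instead.
        return adapted_sloc

    aaf = (w_dm * dm + w_cm * cm + w_im * im) / 100.0
    return adapted_sloc * aaf


if __name__ == "__main__":
    # Example: 10,000 adapted SLOC with 20% DM, 30% CM, 40% IM.
    print(round(equivalent_sloc(10_000, dm=20, cm=30, im=40)))  # -> 2900
```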

Software Size Type Results (cont.)
- Category additions will affect SLOC inclusion rules
- Practical guidance and updated adaptation parameter ranges to be included in the AFCAA Software Cost Estimation Metrics Manual
- Change request for CodeCount to flag and count moved code

Modified Code Exercise Results
[Exercise results not transcribed]
* If DM or CM is greater than 50%, start over with new code
** IM could be driven by safety-critical applications and environments with high reliability requirements

Next Steps
- Revisit size types after updating the AFCAA manual definitions
- Create worked-out exercises
- Analyze existing data to find empirical value ranges of the reuse parameters for each size type

SLOC Conversion Ratios Experiment
Brad Clark, moderator

Purpose
- SLOC is the most significant size input for SLOC-based cost models
- For analysis, the definition of a source line of code needs to be as consistent as possible to eliminate noise in the data
  – Logical source lines of code have been selected as the baseline SLOC definition
- If source lines of code were counted as either Physical lines or Non-Commented Source Statements (NCSS), those counts need to be converted to logical SLOC (see the sketch below)
  – Physical: a line in a file, delimited by line endings (carriage returns)
  – NCSS: a line in a file that is not a blank or comment line
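As a rough illustration of the Physical vs. NCSS distinction, the sketch below counts both for a C-family source file using line-comment detection only. It is not the UCC/CodeCount algorithm, and logical SLOC (statement counting) requires a language-aware parser, which is not attempted here.

```python
# Illustrative sketch only: counts physical lines and a crude NCSS for one
# file. Block comments (/* ... */) and continuation lines are not handled.

def physical_and_ncss(path, comment_prefix="//"):
    physical = ncss = 0
    with open(path, encoding="utf-8", errors="replace") as f:
        for line in f:
            physical += 1                      # every line in the file counts
            stripped = line.strip()
            if stripped and not stripped.startswith(comment_prefix):
                ncss += 1                      # not blank, not a comment line
    return physical, ncss
```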

Logical Source Statements

Experiment
USC used the CodeCount tool to count the source code of 21 applications in different languages:
- C#: 4 apps: Rainbow, Cuyahoga, SharpMap & Kryo
- C++: 3 apps: Agar, Better String Library & Crypto++ Library
- Ada: 3 apps: TASKIT (Tasking Ada Simulation Kit), TEDIT & GNAVI v0.98 (Ada Visual Interface)
- Perl: 3 apps: Sun Automated Diagnostic Environment (StorADE), Enodes & OpenKore
- Java: 4 apps: Robocode, Dozer, Jstock & dLog4j
- PHP: 4 apps: Helix, Zcore, p4a & FreeWebsho
Physical, NCSS, and Logical counts were collected for each file

Cautions
- This is the beginning of an investigation
- We need more data to produce conclusive results
- Data call: CSSE Affiliates are asked to run the CodeCount tool over several large applications
  – We only need the results, not the added/modified/deleted counts for each file
- The analysis presented here may surprise you
- Do not start using these results until more data is collected

[Slides 16-21: result charts, no transcribed text]

Summary Data
[Table: for each language (Ada, C#, Perl, PHP, C, Java) and Overall, the sample Count and the Physical-to-Logical and NCSS-to-Logical ratios; numeric values not transcribed]

Definitions used in the charts:
- Physical count = total lines, including comments, blanks, and code
- Physical-NCB (no comments, no blanks) = NCSS
- Logical = count of statements
- The charted ratio is between physical-NCB and logical
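Once ratios like those in the summary table are established, converting a physical or NCSS count to an estimated logical count is a single division. The ratio value in the sketch below is a placeholder, not a measured result from the workshop data.

```python
# Hypothetical example of applying a conversion ratio. The 2.5 below is a
# placeholder value, not one of the ratios reported in the summary table.

PHYSICAL_TO_LOGICAL = {"C++": 2.5}   # placeholder ratio: physical / logical

def estimate_logical(physical_count, language):
    return physical_count / PHYSICAL_TO_LOGICAL[language]

print(round(estimate_logical(25_000, "C++")))  # -> 10000
```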

Conclusions
- Results are more severe than expected
- More data needs to be collected
  – Physical count
  – NCSS count
  – Logical count
  – Language
Next Steps
- Request help from CSSE Affiliates already using the CodeCount tool to submit 4 to 5 counts of complete applications

Requirements Volatility Measurements
Barry Boehm, moderator

Requirements Volatility
The final Software Resources Data Report (SRDR) requires the reporting of Requirements Volatility, defined as follows:
"As part of the final DD Form report, indicate the amount of requirements volatility using a qualitative scale (very low, low, nominal, high, very high) relative to similar systems of the same type. This should be a relative measure rather than an absolute one in order to understand how initial expectations were or were not met during the course of the software development."
Form excerpt (3. Product Size Reporting, Provide Actuals at Final Delivery):
- Number of Software Requirements, not including External Interface Requirements (unless noted in associated Data Dictionary)
- Number of External Interface Requirements (i.e., not under project control)
- Amount of Requirements Volatility encountered during development (1 = Very Low .. 5 = Very High): 4

Discussion -1
What are alternative questions to be asked about Requirements and Volatility?
- Average rates of change (% per month) of Software Requirements and External Interface Requirements during reporting periods (see the sketch below)
- Level of detail of requirements definition (see referenced examples)
  1. Mission / Business Objectives
  2. Mission / Business Events
  3. Use Cases
  4. Use Case Analysis
  5. Functional Details
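A sketch of the rate-of-change alternative: requirements churn (added + modified + deleted) as a percent of the baseline per month over a reporting period. The field names and example numbers below are illustrative only, not from SRDR data.

```python
# Illustrative calculation of average requirements volatility as percent of
# the baseline per month. Inputs and example values are made up.

def volatility_percent_per_month(baseline, added, modified, deleted, months):
    """Average requirements churn as % of the baseline per month."""
    if baseline == 0 or months == 0:
        raise ValueError("baseline and months must be non-zero")
    churn = added + modified + deleted
    return 100.0 * churn / baseline / months

# Example: 400 baseline requirements; 20 added, 30 modified, 10 deleted
# over a 6-month reporting period -> 2.5% per month.
print(volatility_percent_per_month(400, 20, 30, 10, 6))
```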

Discussion -2
- Code counts and requirements counts (added, modified, deleted)
  – Start with requirements counts before Use Case counts
- SRDR data has beginning and ending requirements counts
  – Volatility is measured against all "shalls"
- Counting "modified" requirements presents challenges
  – An editorial change would modify the requirement text without changing its meaning
  – Changing one character may change the whole specification, e.g., a 4-second vs. 1-second response time
  – Develop a requirements counting tool
- "Level of requirements" would make a difference between requirements counts of different systems

Discussion -3
- Cross-check: ask for the original estimate of effort and the final actual effort
  – Attribute the difference to requirements volatility, bad estimation, etc.
- A 10% change in requirements may not result in any change in product size (SLOC)
  – Periodic collection of requirements changes would provide more insight
- Ask for a count of the code that was developed for delivery but not delivered
  – Dead code elimination would impact results
  – Debug code removal would impact results

Operational Environment to Operational Context Mapping
Brad Clark, moderator

Background
- Operational Environment: the operating environment, platform, or target host in which the software application will operate. There are many of these environments.
- Operational Context: a continuous range of operating constraints indexed at five points. The range of constraints considers different dimensions of an operational context, including electrical power, computing capacity, storage capacity, repair capability, platform volatility, and physical environment accessibility.
- This part of the workshop was an exercise that mapped multiple Operational Environments onto an Operational Context with five scale points
- The exercise is based on the presentation given by Thomas Tan earlier in the forum: "A Tractable Approach to Handling Software Productivity Domains"

University of Southern California Center for Systems and Software Engineering Very Unconstrained Fixed Ground Unconstrained Mobile Ground Shipboard Medical Testing Devices Constrained Avionics – Usual Avionics – Rugged PC Unmanned Airborne Vehicle – Large Missile – Man-in-the-loop Very Constrained Avionics – Sensors Missile – Usual Pace-Maker Manned Vehicle – TASER Unmanned Airborne Vehicle – Usual Unmanned Ground Vehicle – Armed Unmanned Space Submarine – Usual Highly Constrained Unmanned Vehicle – Micro-level Missile – ICBM Manned Space Submarine – Nuclear Operational ContextOperational Environment

Operational Environment by Operational Context (Very Unconstrained / Unconstrained / Constrained / Very Constrained / Highly Constrained):
- Fixed Ground: Usual = Very Unconstrained
- Mobile Ground: Usual = Unconstrained
- Shipboard: Usual = Unconstrained
- Avionics: Usual, Rugged PC = Constrained; Sensors = Very Constrained
- Unmanned Airborne Vehicle: Large = Constrained; Usual = Very Constrained; Micro-level = Highly Constrained
- Missile: Man-in-the-loop = Constrained; Usual = Very Constrained; ICBM = Highly Constrained
- Manned Space: Usual = Highly Constrained
- Unmanned Space: Usual = Very Constrained
Newly added environments:
- Submarine: Usual = Very Constrained; Nuclear = Highly Constrained
- Unmanned Ground Vehicle: Armed = Very Constrained
- Manned Vehicle: TASER = Very Constrained
- Pace-Maker: Usual = Very Constrained
- Medical Testing Devices: Usual = Unconstrained
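For analysis purposes, the mapping above can be treated as a simple lookup table. The sketch below encodes only a subset of the entries transcribed from the slides; it is illustrative, not a complete or authoritative mapping.

```python
# Lookup-table sketch of the Operational Environment -> Operational Context
# mapping. Entries are a partial transcription of the workshop slides.

OPERATIONAL_CONTEXT = {
    ("Fixed Ground", "Usual"): "Very Unconstrained",
    ("Mobile Ground", "Usual"): "Unconstrained",
    ("Shipboard", "Usual"): "Unconstrained",
    ("Avionics", "Usual"): "Constrained",
    ("Avionics", "Rugged PC"): "Constrained",
    ("Avionics", "Sensors"): "Very Constrained",
    ("Missile", "Man-in-the-loop"): "Constrained",
    ("Missile", "Usual"): "Very Constrained",
    ("Missile", "ICBM"): "Highly Constrained",
}

def context_for(environment, variant="Usual"):
    # Returns "Unmapped" for environments not yet rated in this subset.
    return OPERATIONAL_CONTEXT.get((environment, variant), "Unmapped")

print(context_for("Avionics", "Sensors"))   # -> Very Constrained
```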

Productivity Domains to Application Difficulty Mapping
Brad Clark, moderator

Background
- Application Domains: historical terms used to express the realization that different types of systems exhibit different productivities when developing software. Many, many terms are used to express Application Domains.
- Application Difficulty: the degree of difficulty in specifying, developing, and testing a software application, expressed as a continuous range of "difficulty" indexed at five points.
- This part of the workshop was an exercise that mapped multiple Application Domains onto an Application Difficulty with five scale points
- The exercise is based on the presentation given by Thomas Tan earlier in the forum: "A Tractable Approach to Handling Software Productivity Domains"

Application Difficulties
Newly added application types:
- Simple spreadsheet programs
- Weather forecasting
- Semi-automated forces

Application Difficulties – Very Easy
- Simple Spreadsheet Program
- Internet – Simple Web Pages
- Training – Set of screens (similar to spreadsheet)
- Control and Displays – GUI Builders
- Maintenance and Diagnostics – Fault Detection

Application Difficulties – Easy
- Command and Control – Taxi Cab Dispatch
- Middleware – TCP/IP Implementation
- Mobile Command Center – Truck
- Test and Evaluation
- Simulation and Modeling – Low fidelity simulators
- Scientific Systems – Offline data reduction
- Internet – Web Applications (Shopping)
- Business Systems – Large
Nominal
- Tools – Verification
- Sensor Control and Processing
- Process Control
- Mission Planning

Application Difficulties – Challenging
- Tools – Safety critical development tools
- Spacecraft Bus
- Mission Management
- Communications – Handles noise, anomalies
- Infrastructure or Middleware – System of Systems (SOSCOE)
- Sensor Control and Processing – Data Fusion
- Weather forecasting
- Scientific Systems – Large Data Sets
- Control and Displays – Voice and image recognition
- Maintenance and Diagnostic – Fault Isolation and Prognostic
- Executive (EAF Level 4)
- Weapons Delivery and Control – Space
- Internet – Mega Web Applications
- Semi-automated forces
- Training – Simulation-Network (distributed users)
- Test – Distributed debugging
- Simulation and Modeling – Physical phenomenon

Application Difficulties – Very Challenging
- Spacecraft Payload (F6)
- Business Systems – Trillion-Dollar/Day Transaction System
- Mission Management – Multi-level Security (Satisfy Orange Book)
- Mission Management – Safety (Developed to a Standard)
- Weapon Delivery and Control – Safety
- Command and Control – System of Systems (C4ISR)
- Control and Displays – Advanced Human Prosthetics
- Communication – Radio Safety/Security/Frequency-hopping-anti-jam
- Executive – Security Certification (EAF Level 7)

Application Difficulty by Application Domain (Very Easy / Easy / Nominal / Challenging / Very Challenging):
- Business Systems: Easy = Large biz system; Very Challenging = Trillion $/day transaction
- Internet: Very Easy = Simple web pages; Easy = Web application (shopping); Challenging = Mega-web application
- Tools and Tool Systems: Nominal = Verification tools; Challenging = Safety critical
- Scientific Systems: Easy = Offline data reduction; Challenging = Large dataset
- Simulation and Modeling: Easy = Low fidelity simulator; Challenging = Physical phenomenon
- Test and Evaluation: Easy = Usual; Challenging = Distributed debugging
- Training: Very Easy = Set of screens; Challenging = Simulation network
- Command and Control: Easy = Taxi-cab dispatch; Very Challenging = SOS (C4ISR)
- Mission Management: Challenging = Usual; Very Challenging = Multi-level security and safety
- Weapon Delivery and Control: Challenging = Weapon space; Very Challenging = Safety
- Communications: Challenging = Noise, anomalies handling; Very Challenging = Radio Safety/Security Frequency-hopping

Application Difficulty by Application Domain (continued):
- Control and Displays: Very Easy = GUI builders; Challenging = Voice and image recognition; Very Challenging = Advanced human prosthetics
- Infrastructure or Middleware: Easy = TCP/IP; Challenging = SOS (SOSCOE)
- Executive: Challenging = EAF level 4+; Very Challenging = Security certification (EAF Level 7)
- Information Assurance: (no entries)
- Maintenance and Diagnostics: Very Easy = Fault detection; Challenging = Fault isolation and prognostics
- Mission Planning: Nominal = Usual
- Process Control: Nominal = Usual
- Sensor Control and Processing: Nominal = Usual; Challenging = Data fusion
- Spacecraft Bus: Challenging = Usual
- Spacecraft Payload: Very Challenging = (F6)
Newly proposed types:
- Simple Spreadsheet Program: Very Easy
- Weather Forecasting: Challenging = Usual
- Semi-automated Forces: Challenging = Usual
- Mobile Command Center: Easy = Truck

Next Steps
- Repeat this exercise at the Annual Research Review
- Begin analyzing DoD SRDR data for productivities by Operational Context and Application Difficulty (productivity groups)
  – Rigorous "scrub" of the data (size, effort, schedule, context & difficulty)
  – Divide data into productivity groups
  – Determine simple cost estimating relationships (see the sketch below)
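A simple cost estimating relationship of the form Effort = a x Size^b can be fit within a productivity group by ordinary least squares in log-log space. The sketch below is illustrative only; the data values are made up, not SRDR data.

```python
# Sketch of deriving a simple CER (Effort = a * Size^b) for one productivity
# group via a log-log least-squares fit. Data values are made up.

import numpy as np

def fit_cer(size_ksloc, effort_pm):
    """Return (a, b) for Effort = a * Size^b."""
    x = np.log(np.asarray(size_ksloc, dtype=float))
    y = np.log(np.asarray(effort_pm, dtype=float))
    b, log_a = np.polyfit(x, y, 1)          # slope, intercept in log space
    return np.exp(log_a), b

a, b = fit_cer([10, 25, 50, 120], [35, 90, 210, 560])
print(f"Effort ~= {a:.2f} * KSLOC^{b:.2f}")
```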

Insights on Software Maintenance
Vu Nguyen, moderator

Maintenance
- Various activities not covered by cost estimation models
- Important constraints and attributes
  – Availability of test code and supporting materials from the development team
  – Transfer from development to maintenance organizations
  – Re-verification
  – Entry barriers
  – Skill retention
  – Fiscal funding constraints
  – Team size and skill distribution
  – Small changes may require much testing
- Important drivers
  – CPLX
  – Defect level
  – Tools support
- Productivity ranges
  – Understanding underlying reasons for the differences between new development and maintenance
- Information assurance

Next Steps
- Understand the underlying reasons for the differences between new development and maintenance
- Explore drivers for different maintenance scenarios
- Review drivers for entry barriers
- Analyze data for possibly combining cost drivers for maintenance (e.g., stepwise regression: incrementally add variables, observe the effects, check correlations; see the sketch below)
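A rough sketch of the stepwise idea: forward selection that adds one candidate maintenance driver at a time and keeps it only if the fit improves. This is illustrative only, with made-up driver names and data; it is not the analysis procedure used by the workshop.

```python
# Forward stepwise selection over candidate maintenance cost drivers,
# scored by R^2 of a least-squares fit. Illustrative sketch only.

import numpy as np

def r_squared(X, y):
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    X1 = np.column_stack([np.ones(len(y)), X])        # add intercept column
    coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ coef
    return 1.0 - resid.var() / y.var()

def forward_select(drivers, effort, threshold=0.01):
    """drivers: dict of name -> 1-D array of ratings; effort: 1-D array."""
    chosen, best = [], 0.0
    improved = True
    while improved:
        improved, best_name = False, None
        for name in sorted(set(drivers) - set(chosen)):
            cols = [drivers[c] for c in chosen + [name]]
            score = r_squared(np.column_stack(cols), effort)
            if score > best + threshold:                # keep only if it helps
                best, best_name, improved = score, name, True
        if improved:
            chosen.append(best_name)
    return chosen, best

# Example with made-up ratings for three hypothetical candidate drivers.
rng = np.random.default_rng(0)
drivers = {"CPLX": rng.normal(size=30), "DEFECTS": rng.normal(size=30),
           "TOOLS": rng.normal(size=30)}
effort = 2.0 * drivers["CPLX"] + rng.normal(scale=0.5, size=30)
print(forward_select(drivers, effort))
```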