
1 University of Southern California Center for Systems and Software Engineering Software Metrics Unification and Productivity Domain Workshop Summary Brad Clark Ray Madachy Barry Boehm Vu Nguyen 24th International Forum on COCOMO and Systems/Software Cost Modeling November 5, 2009

2 University of Southern California Center for Systems and Software Engineering Workshop Participants Tony Abolfotouh Jill Allen Redge Bartholomew Barry Boehm* Winsor Brown Brad Clark* Linda Esker Dennis Goldenson Jared Fortune John Gaffney Gary Hafen Barbara Hitchings 24th International Forum on COCOMO and Systems/Software Cost Modeling2 Qi Li Dan Ligett Ray Madachy* Ali Malik Vu Nguyen Wilson Rosa Rick Selby Susan Stefanec Tom Tan Ye Yang * Workshop moderators

3 University of Southern California Center for Systems and Software Engineering Topics Covered Source Lines of Code (SLOC) Types SLOC Conversion Ratios Experiment Requirements Volatility Measurements Operational Environment to Operational Context Mapping Productivity Domains to Application Difficulty Mapping Insights on Software Maintenance 24th International Forum on COCOMO and Systems/Software Cost Modeling3

4 University of Southern California Center for Systems and Software Engineering Source Lines of Code (SLOC) Types Ray Madachy, moderator 24th International Forum on COCOMO and Systems/Software Cost Modeling4

5 University of Southern California Center for Systems and Software Engineering Software Size Categories Overview Reviewed core software size type definitions and adaptation parameters Participants shared experiences with size categories and domain-specific practices for reused/modified/converted/translated software Walked through modified-code exercises 24th International Forum on COCOMO and Systems/Software Cost Modeling 5

6 University of Southern California Center for Systems and Software Engineering Core Software Size Types

7 University of Southern California Center for Systems and Software Engineering Software Size Type Results
Discussions forced clarification of categories and crisper definitions
Practical sizing guidance captured in adaptation parameter ranges
–E.g., the maximum values beyond which adapted code should instead be replaced with new software mark the tops of the ranges
Created model-agnostic AAF weight ranges
Added sub-categories for generated, converted, and translated code to distinguish what is counted when applying equivalent size
–Generator statements vs. generated code
–Translated as-is vs. optimized
–Converted as-is vs. optimized
24th International Forum on COCOMO and Systems/Software Cost Modeling 7

8 University of Southern California Center for Systems and Software Engineering Software Size Type Results (cont.) Category additions will affect SLOC inclusion rules Practical guidance and updated adaptation parameter ranges to be included in the AFCAA Software Cost Estimation Metrics Manual Change request for CodeCount to flag and count moved code 24th International Forum on COCOMO and Systems/Software Cost Modeling 8

9 University of Southern California Center for Systems and Software Engineering Modified Code Exercise Results 24th International Forum on COCOMO and Systems/Software Cost Modeling 9 * If DM or CM is greater than 50%, start over with new code ** IM could be driven by safety-critical applications and environments with high reliability requirements
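
As an illustration of how the DM, CM, and IM parameters above combine into equivalent size, here is a minimal sketch; it assumes the standard COCOMO II AAF weights (0.4/0.3/0.3) rather than the model-agnostic weight ranges created in the workshop, and it omits the assessment, understanding, and unfamiliarity adjustments of the full model:

```python
def equivalent_sloc(adapted_sloc, dm, cm, im, w_dm=0.4, w_cm=0.3, w_im=0.3):
    """Equivalent new SLOC for adapted code.

    dm, cm, im: percent (0-100) of design modified, code modified, and
    integration/test effort required. Default weights are the standard
    COCOMO II AAF weights; the workshop's model-agnostic ranges would
    simply be substituted here.
    """
    aaf = (w_dm * dm + w_cm * cm + w_im * im) / 100.0
    return adapted_sloc * aaf

# Example: 10,000 adapted SLOC, 20% design modified, 30% code modified,
# 40% integration and test effort required.
print(equivalent_sloc(10_000, dm=20, cm=30, im=40))  # 2900.0
```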

10 University of Southern California Center for Systems and Software Engineering Next Steps Revisit size types after updating the AFCAA manual definitions Create worked-out exercises Analyze existing data to find empirical value ranges of the reuse parameters for each size type 24th International Forum on COCOMO and Systems/Software Cost Modeling 10

11 University of Southern California Center for Systems and Software Engineering SLOC Conversion Ratios Experiment Brad Clark, moderator 24th International Forum on COCOMO and Systems/Software Cost Modeling11

12 University of Southern California Center for Systems and Software Engineering Purpose
Size in SLOC is the most significant input to SLOC-based models
For analysis, the definition of a source line of code needs to be as consistent as possible to eliminate noise in the data
–A logical source line of code has been selected as the baseline SLOC definition
If source lines of code have been counted as either Physical lines or Non-Commented Source Statements (NCSS), those counts need to be converted to logical SLOC
–Physical: any line in a file, i.e., each carriage return
–NCSS: a line in a file that is not a blank or comment line
24th International Forum on COCOMO and Systems/Software Cost Modeling 12
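
To make the Physical and NCSS definitions concrete, a sketch of a counter for a single file follows; it is purely illustrative (not how the CodeCount tool works), handles only whole-line comments with a single prefix, and leaves logical counting to a real language-aware parser:

```python
def physical_and_ncss(path, comment_prefix="//"):
    """Count Physical and NCSS lines in one source file.

    Physical: every line in the file.
    NCSS: every line that is neither blank nor a whole-line comment.
    Logical SLOC (statements) needs language-aware parsing, so it is
    not attempted here; block comments are also ignored.
    """
    physical = ncss = 0
    with open(path, encoding="utf-8", errors="replace") as f:
        for line in f:
            physical += 1
            stripped = line.strip()
            if stripped and not stripped.startswith(comment_prefix):
                ncss += 1
    return physical, ncss
```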

13 University of Southern California Center for Systems and Software Engineering Logical Source Statements 24th International Forum on COCOMO and Systems/Software Cost Modeling13

14 University of Southern California Center for Systems and Software Engineering Experiment USC used the CodeCount tool to count the source code files of 21 applications in different languages:
–C#: 4 apps: Rainbow, Cuyahoga, SharpMap & Kryo
–C++: 3 apps: Agar, Better String Library & Crypto++ Library
–Ada: 3 apps: TASKIT (Tasking Ada Simulation Kit), TEDIT & GNAVI v0.98 (Ada Visual Interface)
–Perl: 3 apps: Sun Automated Diagnostic Environment (StorADE), Enodes & OpenKore
–Java: 4 apps: Robocode, Dozer, Jstock & dLog4j
–PHP: 4 apps: Helix, Zcore, p4a-3.4.1 & FreeWebsho
Physical, NCSS, and Logical counts were collected for each file
24th International Forum on COCOMO and Systems/Software Cost Modeling 14

15 University of Southern California Center for Systems and Software Engineering Cautions
This is the beginning of an investigation
We need more data to produce conclusive results
Data call: CSSE Affiliates are asked to run the CodeCount tool over several large applications
–We only need the results, not the added, modified, and deleted counts for each file
The analysis presented here may surprise you
Do not start using these results until more data is collected
24th International Forum on COCOMO and Systems/Software Cost Modeling 15

16–21 University of Southern California Center for Systems and Software Engineering [Slides 16–21: results charts; no text captured in the transcript] 24th International Forum on COCOMO and Systems/Software Cost Modeling

22 University of Southern California Center for Systems and Software Engineering Summary Data
Language   Count   Physical to Logical   NCSS to Logical
Ada        3       0.26                  0.51
C#         4       0.43                  0.69
Perl       3       0.43                  0.70
PHP        4       0.44                  0.76
C++        3       0.54                  0.68
Java       4       0.53                  0.74
Overall    21      0.42                  0.71
24th International Forum on COCOMO and Systems/Software Cost Modeling 22
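
Purely to show how such ratios would eventually be applied once they are trustworthy (slide 15 cautions against using these preliminary numbers), a small sketch converting a Physical or NCSS count to an approximate logical count:

```python
# Preliminary ratios from the 21-application sample above; the workshop
# cautions that they should not be used until more data is collected.
PHYSICAL_TO_LOGICAL = {"Ada": 0.26, "C#": 0.43, "Perl": 0.43,
                       "PHP": 0.44, "C++": 0.54, "Java": 0.53}
NCSS_TO_LOGICAL = {"Ada": 0.51, "C#": 0.69, "Perl": 0.70,
                   "PHP": 0.76, "C++": 0.68, "Java": 0.74}

def approx_logical(count, language, count_type="physical"):
    """Approximate a logical SLOC count from a Physical or NCSS count."""
    table = PHYSICAL_TO_LOGICAL if count_type == "physical" else NCSS_TO_LOGICAL
    return count * table[language]

print(approx_logical(50_000, "Java", "ncss"))  # 37000.0
```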

23 The ratio between physical-NCB (no comment, no blank) counts and logical counts
–Physical count: total lines, including comments, blanks, and code
–Logical count: number of statements

24 University of Southern California Center for Systems and Software Engineering Conclusions
Results are more severe than expected
More data needs to be collected
–Physical count
–NCSS count
–Logical count
–Language
Next Steps
Request help from CSSE Affiliates who are already using the CodeCount tool to submit 4 to 5 counts of complete applications
24th International Forum on COCOMO and Systems/Software Cost Modeling 24

25 University of Southern California Center for Systems and Software Engineering Requirements Volatility Measurements Barry Boehm, moderator 24th International Forum on COCOMO and Systems/Software Cost Modeling25

26 University of Southern California Center for Systems and Software Engineering Requirements Volatility
The final Software Resources Data Report (SRDR) requires the reporting of Requirements Volatility
Requirements Volatility is defined as follows:
–“As part of the final DD Form 2630-3 report, indicate the amount of requirements volatility using a qualitative scale (very low, low, nominal, high, very high) relative to similar systems of the same type. This should be a relative measure rather than an absolute one in order to understand how initial expectations were or were not met during the course of the software development.”
Example SRDR entry — 3. Product Size Reporting (Provide Actuals at Final Delivery):
1. Number of Software Requirements, not including External Interface Requirements (unless noted in associated Data Dictionary): 636
2. Number of External Interface Requirements (i.e., not under project control): 1325
3. Amount of Requirements Volatility encountered during development (1=Very Low .. 5=Very High): 4
24th International Forum on COCOMO and Systems/Software Cost Modeling

27 University of Southern California Center for Systems and Software Engineering Discussion -1
What are alternative questions to be asked about Requirements and Volatility?
–Average rates of change (% per month) of Software Requirements and External Interface Requirements during reporting periods
–Level of detail of requirements definition (see referenced examples)
1. Mission / Business Objectives
2. Mission / Business Events
3. Use Cases
4. Use Case Analysis
5. Functional Details
24th International Forum on COCOMO and Systems/Software Cost Modeling
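
The first alternative above, an average rate of change per month, reduces to simple arithmetic; the sketch below is a hypothetical illustration (the argument names are not SRDR fields):

```python
def avg_monthly_change_rate(baseline_reqs, added, modified, deleted, months):
    """Average requirements change rate in percent per month.

    baseline_reqs: requirements count at the start of the reporting period.
    added/modified/deleted: change totals over the period.
    months: length of the reporting period in months.
    """
    total_changes = added + modified + deleted
    return 100.0 * total_changes / baseline_reqs / months

# Example: 636 baseline requirements with 40 added, 55 modified, and
# 12 deleted over a 12-month reporting period -> about 1.4% per month.
print(round(avg_monthly_change_rate(636, 40, 55, 12, 12), 2))
```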

28 University of Southern California Center for Systems and Software Engineering Discussion -2
Code count and Requirements count (added, modified, deleted)
–Start with requirements counts before Use Case counts
SRDR data has beginning and ending requirements counts
–Volatility is measured against all “Shalls”
Counting “modified” requirements presents challenges
–An editorial change would modify the requirement text without changing the requirement itself
–Changing one character may change the whole specification, e.g., a 4-second versus 1-second response time
–Develop a requirements counting tool
“Level of requirements” would make a difference between the requirements counts of different systems
24th International Forum on COCOMO and Systems/Software Cost Modeling
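
Since the discussion mentions measuring volatility against all “Shalls” and developing a requirements counting tool, here is a deliberately naive sketch of the counting idea; it only illustrates the concept and would miss tables, compound requirements, and many other real-world cases:

```python
import re

def count_shalls(text):
    """Count candidate requirement statements by counting 'shall'.

    Naive: assumes one requirement per sentence containing 'shall'
    and ignores tables, 'shall not', compound requirements, etc.
    """
    sentences = re.split(r"(?<=[.;])\s+", text)
    return sum(1 for s in sentences
               if re.search(r"\bshall\b", s, re.IGNORECASE))

sample = ("The system shall log all transactions. "
          "Response time shall be under 1 second; "
          "the operator may override alarms.")
print(count_shalls(sample))  # 2
```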

29 University of Southern California Center for Systems and Software Engineering Discussion -3
Cross-check: ask for the original estimate of effort and the final actual effort
–Attribute the difference to requirements volatility, bad estimation, etc.
A 10% change in requirements may not result in any change in product size (SLOC)
–Periodic collection of requirements changes would provide more insight
Ask for a count of the code that was developed for delivery but not delivered
–Dead code elimination would impact results
–Debug code removal would impact results
24th International Forum on COCOMO and Systems/Software Cost Modeling

30 University of Southern California Center for Systems and Software Engineering Operational Environment to Operational Context Mapping Brad Clark, moderator 24th International Forum on COCOMO and Systems/Software Cost Modeling30

31 University of Southern California Center for Systems and Software Engineering Background
Operational Environment: the operating environment, platform, or target host in which the software application will operate. There are many of these environments.
Operational Context: a continuous range of operating constraints indexed at five points. The range of constraints considers different dimensions of an operational context, including electrical power, computing capacity, storage capacity, repair capability, platform volatility, and physical environment accessibility.
This part of the workshop was an exercise that mapped multiple Operational Environments onto an Operational Context with a five-point scale
This exercise was based on the presentation given by Thomas Tan earlier in the forum: “A Tractable Approach to Handling Software Productivity Domains”
24th International Forum on COCOMO and Systems/Software Cost Modeling 31

University of Southern California Center for Systems and Software Engineering Operational Context → Operational Environment
Very Unconstrained: Fixed Ground
Unconstrained: Mobile Ground; Shipboard; Medical Testing Devices
Constrained: Avionics – Usual; Avionics – Rugged PC; Unmanned Airborne Vehicle – Large; Missile – Man-in-the-loop
Very Constrained: Avionics – Sensors; Missile – Usual; Pace-Maker; Manned Vehicle – TASER; Unmanned Airborne Vehicle – Usual; Unmanned Ground Vehicle – Armed; Unmanned Space; Submarine – Usual
Highly Constrained: Unmanned Vehicle – Micro-level; Missile – ICBM; Manned Space; Submarine – Nuclear

33 Operational Environment mapped to Operational Context (Very Unconstrained / Unconstrained / Constrained / Very Constrained / Highly Constrained)
Fixed Ground: Usual – Very Unconstrained
Mobile Ground: Usual – Unconstrained
Shipboard: Usual – Unconstrained
Avionics: Usual, Rugged PC – Constrained; Sensors – Very Constrained
Unmanned Airborne Vehicle: Large Vehicle – Constrained; Usual – Very Constrained; Micro-level – Highly Constrained
Missile: Man-in-the-loop – Constrained; Usual – Very Constrained; ICBM – Highly Constrained
Manned Space: Usual – Highly Constrained
Unmanned Space: Usual – Very Constrained
Newly added environments:
Submarine: Usual – Very Constrained; Nuclear – Highly Constrained
Unmanned Ground Vehicle: Armed – Very Constrained
Manned Vehicle: TASER – Very Constrained
Pace-Maker: Usual – Very Constrained
Medical Testing Devices: Usual – Unconstrained
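
One way to put the mapping on slides 32–33 to work is a simple lookup table; the sketch below encodes only a subset of the entries, and the (environment, variant) key structure is an assumed convenience, not a workshop artifact:

```python
# Subset of the Operational Environment -> Operational Context mapping
# from slides 32-33; keys are (environment, variant) pairs.
OPERATIONAL_CONTEXT = {
    ("Fixed Ground", "Usual"): "Very Unconstrained",
    ("Mobile Ground", "Usual"): "Unconstrained",
    ("Shipboard", "Usual"): "Unconstrained",
    ("Avionics", "Usual"): "Constrained",
    ("Avionics", "Sensors"): "Very Constrained",
    ("Missile", "Man-in-the-loop"): "Constrained",
    ("Missile", "ICBM"): "Highly Constrained",
    ("Submarine", "Nuclear"): "Highly Constrained",
}

print(OPERATIONAL_CONTEXT[("Avionics", "Sensors")])  # Very Constrained
```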

34 University of Southern California Center for Systems and Software Engineering Productivity Domains to Application Difficulty Mapping Brad Clark, moderator 24th International Forum on COCOMO and Systems/Software Cost Modeling34

35 University of Southern California Center for Systems and Software Engineering Background
Application Domains: historical terms used to express the realization that different systems exhibit different productivities when developing software. Many, many terms are used to express Application Domains.
Application Difficulty: the degree of difficulty in specifying, developing, and testing a software application, expressed as a continuous range of “difficulty” indexed at five points.
This part of the workshop was an exercise that mapped multiple Application Domains onto an Application Difficulty with a five-point scale
This exercise was based on the presentation given by Thomas Tan earlier in the forum: “A Tractable Approach to Handling Software Productivity Domains”
24th International Forum on COCOMO and Systems/Software Cost Modeling 35

36 University of Southern California Center for Systems and Software Engineering Application Difficulties Newly added application types: –Simple spreadsheet programs –Weather forecasting –Semi-automated forces

37 University of Southern California Center for Systems and Software Engineering Application Difficulties
Very Easy
Simple Spreadsheet Program
Internet – Simple Web Pages
Training – Set of screens (similar to spreadsheet)
Control and Displays – GUI Builders
Maintenance and Diagnostics – Fault Detection

38 University of Southern California Center for Systems and Software Engineering Application Difficulties
Easy
Command and Control – Taxi Cab Dispatch
Middleware – TCP/IP Implementation
Mobile Command Center – Truck
Test and Evaluation
Simulation and Modeling – Low fidelity simulators
Scientific Systems – Offline data reduction
Internet – Web Applications (Shopping)
Business Systems – Large
Nominal
Tools – Verification
Sensor Control and Processing
Process Control
Mission Planning

39 University of Southern California Center for Systems and Software Engineering Application Difficulties
Challenging
Tools – Safety critical development tools
Spacecraft Bus
Mission Management
Communications – Handles noise, anomalies
Infrastructure or Middleware – System of Systems (SOSCOE)
Sensor Control and Processing – Data Fusion
Weather forecasting
Scientific Systems – Large Data Sets
Control and Displays – Voice and image recognition
Maintenance and Diagnostic – Fault Isolation and Prognostic
Executive (EAF Level 4)
Weapons Delivery and Control – Space
Internet – Mega Web Applications
Semi-automated forces
Training – Simulation-Network (distributed users)
Test – Distributed debugging
Simulation and Modeling – Physical phenomenon

40 University of Southern California Center for Systems and Software Engineering Application Difficulties
Very Challenging
Spacecraft Payload (F6)
Business Systems – Trillion-Dollar/Day Transaction System
Mission Management – Multi-level Security (Satisfy Orange Book)
Mission Management – Safety (Developed to a Standard)
Weapon Delivery and Control – Safety
Command and Control – System of Systems (C4ISR)
Control and Displays – Advance Human Prosthetics
Communication – Radio Safety/Security/Frequency-hopping-anti-jam
Executive – Security Certification (EAF Level 7)

41 Application Domains mapped to Application Difficulty (Very Easy / Easy / Nominal / Challenging / Very Challenging)
Business Systems: Large biz system – Easy; Trillion $/day transaction – Very Challenging
Internet: Simple web pages – Very Easy; Web application (shopping) – Easy; Mega-web application – Challenging
Tools and Tool Systems: Verification tools – Nominal; Safety critical – Challenging
Scientific Systems: Offline data reduction – Easy; Large dataset – Challenging
Simulation and Modeling: Low fidelity simulator – Easy; Physical phenomenon – Challenging
Test and Evaluation: Usual – Easy; Distributed debugging – Challenging
Training: Set of screens – Very Easy; Simulation network – Challenging
Command and Control: Taxi-cab dispatch – Easy; SOS (C4ISR) – Very Challenging
Mission Management: Usual – Challenging; Multi-level security and safety – Very Challenging
Weapon Delivery and Control: Weapon space – Challenging; Safety – Very Challenging
Communications: Noise, anomalies handling – Challenging; Radio Safety/Security Frequency-hopping – Very Challenging

42 Application Domains mapped to Application Difficulty (Very Easy / Easy / Nominal / Challenging / Very Challenging), continued
Control and Displays: GUI builders – Very Easy; Voice and image recognition – Challenging; Advance human prosthetics – Very Challenging
Infrastructure or Middleware: TCP/IP – Easy; SOS (SOSCOE) – Challenging
Executive: EAF level 4+ – Challenging; Security certification (EAF Level 7) – Very Challenging
Information Assurance
Maintenance and Diagnostics: Fault detection – Very Easy; Fault isolation and prognostics – Challenging
Mission Planning: Usual – Nominal
Process Control: Usual – Nominal
Sensor Control and Processing: Usual – Nominal; Data fusion – Challenging
Spacecraft Bus: Usual – Challenging
Spacecraft Payload: (F6) – Very Challenging
Newly proposed types:
Simple Spreadsheet Program: Very Easy
Weather Forecasting: Usual – Challenging
Semi-automated Forces: Usual – Challenging
Mobile Command Center: Truck – Easy

43 University of Southern California Center for Systems and Software Engineering Next Steps
Repeat this exercise at the Annual Research Review
Begin analyzing DOD SRDR data for productivities by Operational Context and Application Difficulty (Productivity Groups)
–Rigorous “scrub” of the data (size, effort, schedule, context & difficulty)
–Divide the data into productivity groups
–Determine simple cost estimating relationships
43
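
For the final bullet, a simple cost estimating relationship is commonly of the form Effort = a * Size^b, fit by least squares in log-log space within each productivity group; the data below is made up purely for illustration and is not SRDR data:

```python
import numpy as np

# Illustrative data for one productivity group: size in KSLOC and
# effort in person-months (made-up values, not SRDR data).
size = np.array([12, 25, 40, 80, 150, 300], dtype=float)
effort = np.array([55, 110, 190, 400, 700, 1500], dtype=float)

# Fit Effort = a * Size^b  <=>  ln(Effort) = ln(a) + b * ln(Size)
b, ln_a = np.polyfit(np.log(size), np.log(effort), 1)
a = np.exp(ln_a)
print(f"Effort ~= {a:.2f} * KSLOC^{b:.2f}")
```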

44 University of Southern California Center for Systems and Software Engineering Insights on Software Maintenance Vu Nguyen, moderator 24th International Forum on COCOMO and Systems/Software Cost Modeling44

45 University of Southern California Center for Systems and Software Engineering Maintenance
–Various activities not covered by cost estimation models
–Important constraints and attributes
Availability of test code and supporting materials from development team
Transfer from development to maintenance organizations
Re-verification
Entry barriers
Skill retention
Fiscal funding constraints
Team size and skill distribution
Small changes may require much testing
–Important drivers
CPLX
Defect level
Tools support
–Productivity ranges
Understanding underlying reasons for the differences between new development and maintenance
–Information assurance

46 University of Southern California Center for Systems and Software Engineering Next Steps
Understand the underlying reasons for the differences between new development and maintenance
Explore drivers for different maintenance scenarios
Review drivers for entry barriers
Analyze data for possibly combining cost drivers for maintenance (e.g., stepwise regression: incrementally adding variables and observing the effects, correlations)
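
As a rough sketch of the “incrementally adding variables and observing the effects” idea, a minimal forward-selection loop over candidate maintenance drivers follows; the driver names and data are placeholders, there is no stopping rule or significance test, and a real analysis would use a statistics package:

```python
import numpy as np

def r_squared(columns, y):
    """R^2 of an ordinary least-squares fit of y on the given columns (plus intercept)."""
    A = np.column_stack([np.ones(len(y))] + columns)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return 1.0 - resid.var() / y.var()

def forward_select(drivers, y):
    """Greedily add the driver that yields the highest R^2 at each step.

    drivers: dict mapping a driver name (e.g. 'CPLX') to a 1-D array of
    ratings; y: effort or productivity values. Prints the R^2 progression;
    a real analysis would stop when the improvement becomes negligible.
    """
    chosen, remaining = [], set(drivers)
    while remaining:
        best = max(remaining, key=lambda n: r_squared(
            [drivers[c] for c in chosen] + [drivers[n]], y))
        chosen.append(best)
        remaining.remove(best)
        print(f"added {best}: R^2 = "
              f"{r_squared([drivers[c] for c in chosen], y):.3f}")
    return chosen
```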

