Incremental Development Productivity Decline (IDPD) in Software Development
Center for Software and Systems Engineering, Annual Research Review, March 2012
Ramin Moazeni, Ph.D. Student
Computer Science Department, Viterbi School of Engineering, University of Southern California

Outline
- Incremental Development Productivity Decline (IDPD) Overview
- Relevant Examples
- Cost Drivers
- Effect on Number of Increments

Incremental Development Productivity Decline (IDPD) Overview
- The Incremental Development Productivity Decline (IDPD) factor represents the percentage decline in software productivity from one increment to the next.
- The decline is due to factors such as previous-increment breakage and usage feedback, and increased integration and testing effort.
- Another source of productivity decline is that maintenance of reused previous-build software is based not on the equivalent lines of code credited during the previous build, but on the full amount of reused software.
  - Build 1: 200 KSLOC new plus 200 KSLOC reused at 20% yields a 240 KSLOC equivalent (ESLOC) "count" for estimation models.
  - Build 2: there are 400 KSLOC of Build 1 to maintain and integrate.
- Such phenomena may cause the IDPD factor to be higher for some builds and lower for others.
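The 240 KSLOC figure above follows the usual equivalent-size convention, where reused code is weighted by a reuse factor before being added to new code. A minimal sketch of that arithmetic (the function name and the flat 20% weight are illustrative simplifications, not the full COCOMO II reuse model with DM/CM/IM, SU, and UNFM):

```python
def equivalent_ksloc(new_ksloc, reused_ksloc, reuse_weight):
    """Equivalent size = new code plus reused code scaled by a reuse weight.

    A simplified stand-in for the COCOMO II reuse model, which derives the
    weight from DM/CM/IM (and SU/UNFM) rather than taking it as a constant.
    """
    return new_ksloc + reused_ksloc * reuse_weight

# Build 1 from the slide: 200 KSLOC new, 200 KSLOC reused at 20%
print(equivalent_ksloc(200, 200, 0.20))  # -> 240.0 equivalent KSLOC

# Build 2, however, must maintain and integrate the full 400 KSLOC of
# Build 1 code, not just the 240 KSLOC credited for estimation in Build 1.
```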

Incremental Development Productivity Decline (IDPD) Example: Site Defense BMD Software
- 5 builds, 7 years, $100M
- Build 1 productivity: over 300 SLOC/person-month
- Build 5 productivity: under 150 SLOC/person-month, including Build 1-4 breakage, integration, and rework
- 318% change in requirements across all builds
- A factor-of-2 decrease across 4 builds corresponds to an average build-to-build IDPD factor of roughly 20% productivity decrease per build
- Similar IDPD factors have been found for large commercial software, as reflected in the multi-year slippage in delivery of MS Word for Windows and Windows Vista.
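One way to back out an average per-build decline from the endpoint productivities above is a constant compound-decline assumption; the sketch below uses that convention, while the slide quotes ~20% as a rough build-to-build figure.

```python
# Back out an average build-to-build productivity decline from the
# Site Defense endpoints quoted above. Assumes a constant compound
# decline across the 4 build-to-build transitions (Build 1 -> Build 5);
# the numbers are the slide's, the convention is an assumption.
start_productivity = 300.0   # SLOC/person-month, Build 1
end_productivity = 150.0     # SLOC/person-month, Build 5
transitions = 4

ratio_per_build = (end_productivity / start_productivity) ** (1 / transitions)
avg_decline = 1 - ratio_per_build
print(f"average decline per build: {avg_decline:.1%}")
# -> ~15.9% per build, the same order as the ~20% IDPD factor cited on the slide
```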

Quality Management Platform (QMP) Project
QMP Project Information:
- Web-based application
- The system facilitates process improvement initiatives in many small and medium software organizations
- 6 builds, 6 years, differing increment durations
- Size after 6th build: 548 KSLOC, mostly in Java
- Average staff on project: ~20

Quality Management Platform (QMP) Project Data Collection
- Most data come from release documentation, build reports, and project member interviews/surveys
- Data include product size, effort by engineering phase, effort by engineering activity, defects by phase, requirements changes, project schedule, and COCOMO II driver ratings (rated by project developers and organization experts)
- Data collection challenges:
  - Incomplete and inconsistent data
  - Differing data formats, depending on who filled in the data report
  - No system architecture documents available

Quality Management Platform (QMP) Project Data Analysis - Productivity Trends

Table 1. Productivity by build
Build   Size (KSLOC)   Effort (PH)   Productivity (SLOC/PH)   Productivity Variance
1       69.12          7195          9.61                      0.0%
2       64             9080          7.05                     -26.6%
3       19             6647          2.86                     -59.4%
4       39.2           8787.5        4.46                      56.1%
5       207            31684.5       6.53                      46.5%
6       65             16215.1       4.01                     -38.6%

- The slope of the trend line is -0.76 SLOC/PH per build.
- Across the five build-to-build transitions, this corresponds to a 14% average decline in productivity per build.
- This is smaller than the 20% Incremental Development Productivity Decline (IDPD) factor observed for the large defense program, most likely because the project is considerably smaller in system size and complexity.
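The productivity and variance columns in Table 1 follow directly from size and effort, and the trend line is a fit of productivity against build number. A minimal sketch reproducing those derived figures (variable names are illustrative; the slide's -0.76 slope presumably comes from a slightly different fit than the plain least-squares one used here):

```python
# Reproduce Table 1's derived columns and a trend-line slope.
# Sizes are KSLOC, effort is person-hours (PH); values taken from the slide.
sizes_ksloc = [69.12, 64, 19, 39.2, 207, 65]
effort_ph = [7195, 9080, 6647, 8787.5, 31684.5, 16215.1]

productivity = [1000 * s / e for s, e in zip(sizes_ksloc, effort_ph)]  # SLOC/PH
variance = [0.0] + [
    (productivity[i] - productivity[i - 1]) / productivity[i - 1]
    for i in range(1, len(productivity))
]

# Ordinary least-squares slope of productivity vs. build number (1..6)
builds = list(range(1, len(productivity) + 1))
mean_x = sum(builds) / len(builds)
mean_y = sum(productivity) / len(productivity)
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(builds, productivity)) / \
        sum((x - mean_x) ** 2 for x in builds)

print([round(p, 2) for p in productivity])  # [9.61, 7.05, 2.86, 4.46, 6.53, 4.01]
print([f"{v:.1%}" for v in variance])       # ['0.0%', '-26.6%', '-59.4%', '56.1%', '46.5%', '-38.6%']
print(round(slope, 2))                      # ~ -0.80 with this OLS fit; the slide quotes -0.76
```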

Causes of IDPD Factor
From explorations of the IDPD factor on several projects, the following sources of variation were identified:

Higher IDPD (Less Productive):
- Effort to maintain the previous increment: bug fixing, COTS upgrades, interface changes
- Next increment requires previous-increment modification
- Next increment spun out to more platforms
- Next increment has more previous increments to integrate/interact with
- Staff turnover reduces experience
- Next increment's software is more complex
- Previous increment incompletely developed, tested, integrated

Lower IDPD (More Productive):
- Next increment touches less of the previous increment
- Current staff more experienced, productive
- Next increment's software is less complex
- Previous increment did much of the next increment's requirements and architecture

Software Categories & IDPD Factor Categorization of different type of software with different productivity curves through multiple build cycles, require different IDPD factors applicable to each class of software. Software Category Impact on IDPD Factor Non-Deployable Throw-away code. Low Build-Build integration. High reuse. IDPD factor lowest than any type of deployable/operational software Infrastructure Software Often the most difficult software. Developed early on in the program. IDPD factor likely to be the highest. Application Software Builds upon Infrastructure software. Productivity can be increasing, or at least “flat” Platform Software Developed by HW manufacturers. Single vendor, experienced staff in a single controlled environment. Integration effort is primarily with HW. IDPD will be lower due to the benefits mentioned above. Firmware (Single Build) IDPD factor not applicable. Single build increment.

IDPD Model & Data Definition Based on analysis of the IDPD effects, the next increment’s ESLOC, IDPD factor, and software effort can be estimated KSLOC (I) in the existing increments 1 through I The fraction F of these are going to be modified The previous increments (I) would then be considered as reused software and their equivalent lines of code calculated as EAF (I+1): EAF factor for the next increment’s development EAF (I+1) would be the EAF (I) adjusted by the product of Delphi-TBD effort multipliers for the following IDPD factors:  IDPD Drivers Description RCPLX Complexity of Increment I+1 relative to complexity of existing software IPCON Staff turnover from previous increment’s staff PRELY Previous increments’ required reliability IRESL Architecture and risk resolution (RESL) rating of I+1 relative to RESL of 1..I

IDPD Model & Data Definition - 2 The total EKSLOC(I+1) = EKSLOC (I) + EKSLOC (I+1 Added) and the EAF (I+1) factor would be used in the standard COCOMO II equations to compute PM (I+1). The standalone PM for Increment I+1, SPM (I+1), would be calculated from EKSLOC (I+1 Added) and EAF (I), using the standard COCOMO II equations. The IDPD (I+1) factor would then be   EKSLOC (I) = KSLOC (I) * (0.4*DM + 0.3*CM + 0.3*IM) IDPD(I+1) = SPM (I+1) / PM (I+1)

Data Collection
Collect data on completed incremental projects from industry.
Inclusion criteria:
- Starting and ending dates are clear
- Has at least two increments of significant capability that have been integrated with other software (from other sources) and shown to work in operationally relevant situations
- Has well-substantiated sizing and effort data by increment
- Less than an order of magnitude difference in size or effort per increment
- Realistic reuse factors for COTS and modified code
- Uniformly code-counted source code
- Effort pertaining just to increment deliverables

Data Collection - 2: Metrics

Metric         Description
Product        Name of the software product
Build          Build number; a software product may have multiple builds (increments)
Effort         Total time in person-hours
New SLOC       SLOC count of new modules
Reused SLOC    SLOC count of reused modules
COTS SLOC      SLOC count of COTS modules
CM             Percentage of code modified
DM             Percentage of design modified
IM             Percentage of implementation and test needed for the reused/COTS modules
SU             Software Understanding
UNFM           Programmer Unfamiliarity
Cost drivers   Rating levels for the 22 cost drivers

Preliminary Results - 1
Delphi Results: 2 rounds of Delphi at CSSE and ISCAS

IDPD Factor   Productivity Range   Variance
RCPLX         1.79                 0.098
IPCON         1.77                 0.066
PRELY         1.49                 0.029
IRESL         1.53

Preliminary Results - 2
IDPD driver ratings (rating levels VL, L, N, H, VH):
- PRELY: 0.972, 0.986, 1, 1.008, 1.025
- RCPLX: 0.985, 0.904, 0.783, 0.732
- IPCON: 0.841, 0.900, 0.951, 1.052
- IRESL: 1.065, 1.046, 0.988, 0.933
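As described on the "IDPD Model & Data Definition" slide, these multipliers scale the previous EAF into EAF(I+1). A minimal sketch of the rating lookup and product, assuming the PRELY values above align to VL through VH (the other drivers' tables would be filled in the same way from the full Delphi results):

```python
# EAF adjustment for increment I+1: multiply the previous EAF by the
# Delphi-derived multiplier for each IDPD driver at its rated level.
# Only the PRELY row is shown fully aligned to VL..VH on the slide.
PRELY_TABLE = {"VL": 0.972, "L": 0.986, "N": 1.0, "H": 1.008, "VH": 1.025}

def adjust_eaf(eaf_prev, driver_tables, ratings):
    """EAF(I+1) = EAF(I) * product of IDPD driver multipliers at their ratings."""
    eaf = eaf_prev
    for driver, rating in ratings.items():
        eaf *= driver_tables[driver][rating]
    return eaf

# Example with a single driver rated High (values from the PRELY row above)
print(adjust_eaf(1.0, {"PRELY": PRELY_TABLE}, {"PRELY": "H"}))  # -> 1.008
```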

Preliminary Results - 3
Relative Impact of Cost Drivers on Effort: Productivity Ranges

Questions?