COCOMO II Calibration Status, USC-CSE Annual Research Review, March 2004

Presentation transcript:

COCOMO II Calibration Status
USC-CSE Annual Research Review, March 2004

A Little History

- Calibration effort started in January 2002
- Confusion:
  – Repository in an inconsistent state
  – "Uncharacterized" data from many sources
  – Process for duplicating the 2000 calibration results
  – Schedule compression rating was inconsistent
- Expectation:
  – New data had a lot of variation, but...
  – Affiliates (and the user population in general) want an "accurate" and up-to-date model, not just one that explains variation
  – PRED(.25) versus R² (both sketched after this list)
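For readers outside the cost-modeling community, a brief sketch of the two accuracy measures being contrasted, in the standard notation of the cost-estimation literature (y_i is a project's actual value, ŷ_i the model's estimate):

```latex
% Magnitude of relative error for project i:
\[
  \mathrm{MRE}_i = \frac{\lvert y_i - \hat{y}_i \rvert}{y_i}
\]
% PRED(L): fraction of the N projects estimated to within L of actuals:
\[
  \mathrm{PRED}(L) = \frac{1}{N} \sum_{i=1}^{N} \mathbf{1}\left[\mathrm{MRE}_i \le L\right]
\]
```

R² rewards a regression that explains the variation in a dataset, while PRED(.25) rewards a model whose individual estimates land within 25% of actuals; a model can do well on one and poorly on the other, which is the tension the slide points at.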

Change in Approach

- Removed pre-1990 data from the dataset used in calibration
  – This removed a lot of "converted" data
- Removed "bad" data (a screening sketch follows this list)
  – Incomplete: no duration data, estimated rather than actual effort, no valid SLOC size
- Still use the Bayesian calibration approach developed by Chulani
- Changed to a holistic analysis approach: considered effort and duration together
  – Identified data that needed review
  – Schedule compression was set automatically
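A minimal sketch of this kind of screening, assuming a hypothetical projects table; the column names (year, duration_months, effort_is_actual, sloc) are illustrative, not the actual repository schema:

```python
import pandas as pd

def screen_calibration_data(projects: pd.DataFrame) -> pd.DataFrame:
    """Drop pre-1990 records and records missing fields the calibration needs."""
    kept = projects[projects["year"] >= 1990]        # removes most "converted" data
    kept = kept[kept["duration_months"].notna()]     # must have duration data
    kept = kept[kept["effort_is_actual"]]            # keep actual, not estimated, effort
    kept = kept[kept["sloc"].notna() & (kept["sloc"] > 0)]  # must have a valid SLOC size
    return kept
```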

Effort–Duration Error
(chart not captured in the transcript)

Effort–Duration Error Interpretation

- Effort under-estimated, duration under-estimated:
  – Actual size data is too small due to reuse modeling
  – Actual effort and duration included lifecycle phases not in the model
  – Difficult, low-productivity projects
- Effort under-estimated, duration over-estimated:
  – Schedule compression required
- Effort over-estimated, duration under-estimated:
  – Fixed staffing levels, project slow-down, schedule stretch-out
- Effort over-estimated, duration over-estimated:
  – Actual size data is too large due to physical SLOC counting or reuse modeling
  – Actual effort and duration cover fewer lifecycle phases than estimated
  – Easy, high-productivity projects
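This lookup is mechanical once the error signs are known; a sketch, assuming hypothetical signed relative errors where positive means the model under-estimated:

```python
def interpret_errors(effort_error: float, duration_error: float) -> str:
    """Map signed estimation errors, (actual - estimate) / actual, to the
    data-validation interpretations in the table above."""
    effort_under = effort_error > 0
    duration_under = duration_error > 0
    if effort_under and duration_under:
        return "check reuse/size modeling, phase coverage, or low productivity"
    if effort_under:
        return "schedule compression required"
    if duration_under:
        return "fixed staffing, project slow-down, or schedule stretch-out"
    return "check SLOC counting/reuse modeling, phase coverage, or high productivity"
```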

Effort Estimate Error Compared to Size
(chart not captured in the transcript)

Duration Estimate Error Compared to Size
(chart not captured in the transcript)

Preliminary Results

- 89 project data points
- Effort estimation accuracy (computation sketched after this list):
  – PRED(.30) = 92.1% (was 75% for the 2000 calibration)
  – PRED(.25) = 88.8% (was 68%)
  – PRED(.20) = 84.3% (was 63%)
- Duration estimation accuracy:
  – PRED(.30) = 82.0% (was 64% for the 2000 calibration)
  – PRED(.25) = 70.8% (was 55%)
  – PRED(.20) = 64.0% (was 50%)
- 65 more project data points to review
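A minimal sketch of how a PRED score is computed from paired actuals and estimates; the effort numbers below are made up for illustration, not drawn from the calibration dataset:

```python
def pred(actuals, estimates, level):
    """Fraction of projects whose magnitude of relative error is within `level`."""
    within = [abs(a - e) / a <= level for a, e in zip(actuals, estimates)]
    return sum(within) / len(within)

# Illustrative effort numbers in person-months:
actual = [120.0, 45.0, 300.0, 80.0]
estimate = [110.0, 58.0, 225.0, 81.0]
for level in (0.30, 0.25, 0.20):
    print(f"PRED({level:.2f}) = {pred(actual, estimate, level):.1%}")
# -> PRED(0.30) = 100.0%, PRED(0.25) = 75.0%, PRED(0.20) = 50.0%
```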

Next Steps in Calibration

- Review and incorporate the remaining outstanding data
- Beta-test the new driver values with some of the Affiliates
- Assess the need to elaborate (and possibly expand) the definition of Schedule Compression
- Attempt to develop "pre-sets" for different types of applications

How Should the Model Handle Level-Loaded Staffing?

Are Compression / Stretch-Out Handled Adequately?

- Should there be an Extra Low rating?
- Should there be ratings for stretch-out other than 1.0?
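For context, a sketch of where an Extra Low rating would sit, using a simplified banding of the published COCOMO II SCED anchor points rather than the model's exact definition:

```python
def sced_rating(fraction_of_nominal: float) -> str:
    """Band a required schedule (as a fraction of nominal) into SCED ratings.

    Simplified banding of the published anchor points: Very Low = 75%,
    Low = 85%, Nominal = 100%, High = 130%, Very High = 160% of nominal.
    """
    if fraction_of_nominal < 0.75:
        return "below Very Low (the proposed Extra Low territory)"
    if fraction_of_nominal < 0.85:
        return "Very Low"
    if fraction_of_nominal < 1.00:
        return "Low"
    if fraction_of_nominal <= 1.00:
        return "Nominal"
    if fraction_of_nominal <= 1.30:
        return "High"
    return "Very High"
```

In COCOMO II.2000 the stretch-out ratings (High and Very High) already carry effort multipliers of 1.0, which is what the second question revisits.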

COCOMO II Driver Elaboration

Motivation

- Recent experience has shown that engineers have difficulty rating some of the COCOMO II drivers
- Researchers have documented the impact of the drivers' subjectivity
- Calibration data appears to show differences in how COCOMO II driver ratings are interpreted
- Our goal is to address these problems

Examples of Questions

DATA (DB bytes / Pgm SLOC):
  Low: D/P < 10 | Nominal: 10 <= D/P < 100 | High: 100 <= D/P < 1000 | Very High: D/P >= 1000

RELY:
  Very Low: slight inconvenience | Low: low, easily recoverable losses | Nominal: moderate, easily recoverable losses | High: high financial loss | Very High: risk to human life

- Programmer Capability: what is the 75th percentile?
- Required Software Reliability: what if my application is not financial?
- Database Size: are you mixing two different units of measure? (see the sketch below)
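The Database Size question can be made concrete; a sketch of the rating as a function of the two measures, using the standard COCOMO II thresholds shown above (D = testing database size in bytes, P = program size in SLOC):

```python
def data_rating(db_bytes: float, program_sloc: float) -> str:
    """Rate the COCOMO II DATA driver from the D/P ratio."""
    ratio = db_bytes / program_sloc
    if ratio < 10:
        return "Low"
    if ratio < 100:
        return "Nominal"
    if ratio < 1000:
        return "High"
    return "Very High"

# e.g. a 2 GB test database against a 150 KSLOC program:
print(data_rating(2e9, 150_000))  # -> "Very High"
```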

Elaborating Cost Driver Workshop Goals

- No new math (elaboration, not re-definition)
- No scare factor (not too many inputs)
  – Something behind the curtain
  – Gradual unfolding
- More comprehensible vocabulary
  – Consider a wider range of application users
  – Applicable to business and military
- Make it easier to use
  – Eye-ball average, optional weighted scoring (illustrated after this list)
- Make it less subjective
  – Crisper definitions
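A hypothetical illustration of the "optional weighted scoring" idea: roll several more-objective sub-questions up into a single driver rating. The sub-questions and weights here are invented for the sketch, not taken from the workshop:

```python
def weighted_driver_score(sub_scores: dict[str, float],
                          weights: dict[str, float]) -> float:
    """Combine per-question scores (1 = Very Low .. 5 = Very High) into one rating."""
    total_weight = sum(weights[name] for name in sub_scores)
    return sum(sub_scores[name] * weights[name] for name in sub_scores) / total_weight

# Invented sub-questions for a capability-style driver:
weights = {"design_skill": 0.4, "communication": 0.3, "thoroughness": 0.3}
scores = {"design_skill": 4.0, "communication": 3.0, "thoroughness": 5.0}
print(f"driver rating = {weighted_driver_score(scores, weights):.2f}")  # -> 4.00
```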

Driver "Subjectivity" Ranking

Each driver was rated (1 to 10) for its level of subjectivity by workshop participants. Ratings: 1 = Very Objective, 10 = Very Subjective.

Status

- Held a workshop in October 2003 on identifying drivers that need further elaboration
- Lack of progress
  – Need an effective collaboration technique that is not moderator-intensive
  – Calibration work has taken precedence
- A possible collaboration solution is to use the same technology that USC Software Engineering classes use to submit their work
- Calibration work is winding down (I think...)
- Expect the Driver Elaboration working group to start up soon

For More Information, Requests or Questions

Brad Clark, Software Metrics, Inc.
Ye Yang, USC-CSE