University of Southern California Center for Systems and Software Engineering
AFCAA Database and Metrics Manual
Ray Madachy, Brad Clark, Barry Boehm, Thomas Tan
Wilson Rosa, Sponsor
USC CSSE Annual Research Review, March 8, 2011

Agenda
Introduction
SRDR Overview
–Review of Accepted Changes
–SRDR Baseline Issues
Data Analysis
Manual Reviewers Website

Project Overview
Goal is to improve the quality and consistency of estimating methods across cost agencies and program offices through guidance, standardization, and knowledge sharing.
Project led by the Air Force Cost Analysis Agency (AFCAA), working with the service cost agencies and assisted by the University of Southern California and the Naval Postgraduate School.
The Metrics Manual will present data analysis from existing final SRDR data.
Additional information is crucial for improving data quality
–Data homogeneity is important

Software Cost Databases
Purpose: to derive estimating relationships and benchmarks for size, cost, productivity, and quality.
Previous and current efforts to collect data from multiple projects and organizations:
–Data & Analysis Center for Software (DACS)
–Software Engineering Information Repository (SEIR)
–International Software Benchmarking Standards Group (ISBSG)
–Large aerospace mergers (attempts to create company-wide databases)
–USAF Mosemann Initiative (Lloyd Mosemann, Asst. Sec. USAF)
–USC CSSE COCOMO II repository
–DoD Software Resources Data Report (SRDR)
All have faced common challenges such as data definitions, completeness, and integrity.

Data Analysis: Research Objectives
Using SRDR data, improve the quality and consistency of estimating methods across cost agencies and program offices through guidance, standardization, and knowledge sharing.
–Characterize different Application Domains and Operating Environments within DoD
–Analyze collected data for simple Cost Estimating Relationships (CERs) within each domain
–Develop rules of thumb for missing data
Make collected data useful to oversight and management entities.
[Slide graphic: data records feeding CERs of the form Cost = a * X^b]

Agenda
Introduction
SRDR Overview
–Review of Accepted Changes
–SRDR Baseline Issues
Data Analysis
Manual Reviewers Website

Software Resources Data Report
The Software Resources Data Report (SRDR) is used to obtain both the estimated and actual characteristics of new software developments or upgrades. Both the Government program office and, later, after contract award, the software contractor submit this report.
For contractors, this report constitutes a contract data deliverable that formalizes the reporting of software metric and resource data. All contractors developing or producing any software development element with a projected software effort greater than $20M (then-year dollars) on major contracts and subcontracts within ACAT I and ACAT IA programs, regardless of contract type, must submit SRDRs.
The data collection and reporting apply to developments and upgrades whether performed under a commercial contract or internally by a government Central Design Activity (CDA) under the terms of a memorandum of understanding (MOU).
Reports are mandated for Initial Government, Initial Developer, and Final Developer submissions.

Submittal Process
Data is accessed through the Defense Cost and Resource Center (DCARC). The DCARC's Defense Automated Cost Information Management System (DACIMS) is the database of current and historical cost and software resource data used for independent, substantiated estimates.
[Slide graphic: SRDR submittal flow; labels include "Government Program Office"]

Agenda
Introduction
SRDR Overview
–Review of Accepted Changes
–SRDR Baseline Issues
Data Analysis
Manual Reviewers Website

Proposed SRDR Modifications
Data analysis problems with project types, size and effort normalization, and multiple builds motivated some of these modifications:
[Slide table of proposed modifications not captured in the transcript]

Proposed SRDR Modifications (continued)

Proposed SRDR Modifications (continued)

Proposed SRDR Modifications (continued)

Modified Final Developer Form (1/3)

Modified Final Developer Form (2/3)

Modified Final Developer Form (3/3)

Product and Development Description
Our recommendation for one operating environment type and one application domain was not incorporated.

Staffing
Our expanded number of experience levels was adopted for Personnel Experience.

Size (1/2)
These recommendations were accepted:
–Requirements volatility changed from a relative scale to a percentage
–Reused code with modifications is our Modified code (report DM, CM, and IM)
–Reused code without modifications is our Reused code (DM = CM = 0; report IM)

Size (2/2)
Our recommendation for deleted code was accepted.

Resource and Schedule
We recommended the breakout of QA, CM, and PM effort, which was previously reported under “Other”.

Quality
We recommended more common defect metrics: Number of Defects Discovered and Number of Defects Removed. Gone are the Mean Time to Serious or Critical Defect (MTTD) and Computed Reliability.

Agenda
Introduction
SRDR Overview
–Review of Accepted Changes
–SRDR Baseline Issues
Data Analysis
Manual Reviewers Website

SRDR Revisions
Two-year update cycle
We submitted our recommendations in Spring 2010
Received a draft of the updated SRDR DID from the committee in Fall 2010, reflecting our changes and more
Adoption schedule unclear; the DCARC website still shows the 2007 version posted
Issue: which version(s) to cover in the manual for collection and analysis
–E.g., we previously had guidance to mitigate shortcomings of the 2007 version

Agenda
Introduction
SRDR Overview
–Review of Accepted Changes
–SRDR Baseline Issues
Data Analysis
Manual Reviewers Website

SRDR Raw Data (520 observations)
[Scatter chart of the 520 raw observations with fitted relationship PM = 1.67 * KSLOC]
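A minimal sketch of how a power-law cost estimating relationship of this kind (PM = a * KSLOC^b) can be fit to size/effort pairs by least squares in log-log space; the data points below are invented for illustration, and numpy is assumed to be available.

```python
# Illustrative only: fit PM = a * KSLOC^b by ordinary least squares on
# log-transformed data. The sample points are made up, not SRDR data.
import numpy as np

ksloc = np.array([12.0, 25.0, 40.0, 75.0, 120.0, 300.0])   # hypothetical sizes
pm    = np.array([30.0, 55.0, 80.0, 160.0, 270.0, 700.0])  # hypothetical person-months

# log(PM) = log(a) + b * log(KSLOC)
b, log_a = np.polyfit(np.log(ksloc), np.log(pm), deg=1)
a = np.exp(log_a)
print(f"PM = {a:.2f} * KSLOC^{b:.2f}")
```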

Data Conditioning
Segregate data
Normalize sizing data
Map effort distribution

SRDR Data Segmentation

SRDR Data Fields
Used:
–Contractor (assigned an OID)
–Component description (sanitized)
–Development process
–Percentage personnel experience
–Peak staffing
–Amount of relative requirements volatility
–Primary and secondary language
–Months of duration by activity
–Hours of effort by activity, and comments
–Lines of code: new, modified, unmodified, auto-generated
Missing:
–Our Application and Environment classification
–Build information
–Effort in person-months
–Adapted code parameters: DM, CM, and IM
–Equivalent KSLOC
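For concreteness, a minimal sketch of how one of these records might be represented during analysis; the field names are our own shorthand, not the official SRDR data item description elements.

```python
# Illustrative record layout for the fields listed above; names and types are
# our own shorthand, not the official SRDR DID element names.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SrdrRecord:
    contractor_oid: str            # contractor, anonymized to an OID
    component: str                 # sanitized component description
    dev_process: str
    pct_experience: dict           # e.g. {"high": 30, "nominal": 50, "entry": 20}
    peak_staff: float
    req_volatility: Optional[float]
    languages: tuple               # (primary, secondary)
    duration_months: dict          # months by activity
    effort_hours: dict             # hours by activity
    sloc_new: int
    sloc_modified: int
    sloc_unmodified: int
    sloc_autogen: int
    # Missing in the raw submissions; derived or imputed later:
    domain: Optional[str] = None   # application/environment classification
    dm: Optional[float] = None     # % design modified
    cm: Optional[float] = None     # % code modified
    im: Optional[float] = None     # % integration and test redone
    eksloc: Optional[float] = None # equivalent KSLOC
```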

Derive Equivalent Size
Normalize the SLOC counting method to logical SLOC:
–Physical SLOC counts converted to logical SLOC counts by programming language
–Non-comment SLOC counts converted to logical SLOC counts by programming language
Convert auto-generated SLOC to equivalent SLOC (ESLOC):
–Use the AAF formula: (DM% * 0.4) + (CM% * 0.3) + (IM% * 0.3)
–DM = CM = 0; IM = 100
Convert reused SLOC to ESLOC with the AAF formula:
–DM = CM = 0; IM = 50
Convert modified SLOC to ESLOC:
–Use the AAF formula: (DM% * 0.4) + (CM% * 0.3) + (IM% * 0.3)
–Default values: Low – Mean – High, based on a 90% confidence interval
Create the equivalent SLOC count and scale to thousands (K) to derive EKSLOC:
(New + Auto-Generated + Reused + Modified) / 1000 = EKSLOC
Remove all records with an EKSLOC below the cutoff (a sketch of this conversion follows below).
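A minimal sketch of the size normalization above; the AAF weights and the fixed DM/CM/IM assumptions for auto-generated and reused code come from the slide, while the function names and the example values are ours.

```python
# Illustrative sketch of deriving equivalent size (EKSLOC) as described above.
def aaf(dm_pct: float, cm_pct: float, im_pct: float) -> float:
    """Adaptation Adjustment Factor as a fraction of equivalent new code."""
    return (dm_pct * 0.4 + cm_pct * 0.3 + im_pct * 0.3) / 100.0

def eksloc(new: int, autogen: int, reused: int, modified: int,
           mod_dm: float, mod_cm: float, mod_im: float) -> float:
    """Equivalent KSLOC from the four SRDR size categories."""
    equivalent = (
        new
        + autogen * aaf(0, 0, 100)      # auto-generated: DM = CM = 0, IM = 100
        + reused * aaf(0, 0, 50)        # reused: DM = CM = 0, IM = 50
        + modified * aaf(mod_dm, mod_cm, mod_im)
    )
    return equivalent / 1000.0

# Example: 40,000 new, 10,000 auto-generated, 20,000 reused, and
# 15,000 modified SLOC reported with DM = 20%, CM = 30%, IM = 50%.
print(round(eksloc(40_000, 10_000, 20_000, 15_000, 20, 30, 50), 1))  # 50.8 EKSLOC
```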

Convert Modified Size to ESLOC
Use the AAF formula: (DM% * 0.4) + (CM% * 0.3) + (IM% * 0.3)
Problem: DM, CM, and IM are missing from much of the SRDR data
Program interviews provided the parameters for some records
For the rest, use the records that have values in all fields to derive recommended values for the missing data (see the sketch below)
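A minimal sketch of that backfill step; the slide does not say which statistic is used to derive the recommended values, so the median here is an assumption.

```python
# Illustrative sketch: derive recommended DM/CM/IM values from the records
# that report all three, then use them to fill records with missing values.
# Using the median is our assumption, not necessarily the study's choice.
import statistics

ADAPT_KEYS = ("dm", "cm", "im")

def recommended_params(records):
    complete = [r for r in records
                if all(r.get(k) is not None for k in ADAPT_KEYS)]
    return {k: statistics.median(r[k] for r in complete) for k in ADAPT_KEYS}

def fill_missing(records):
    defaults = recommended_params(records)
    for r in records:
        for k in ADAPT_KEYS:
            if r.get(k) is None:
                r[k] = defaults[k]
    return records

data = [{"dm": 20, "cm": 30, "im": 50},
        {"dm": 10, "cm": 25, "im": 40},
        {"dm": None, "cm": None, "im": None}]   # missing parameters
print(fill_missing(data)[-1])                   # {'dm': 15, 'cm': 27.5, 'im': 45}
```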

Map Effort Distribution
Labor hours are reported for 7 activities:
–Software Requirements
–Software Architecture (including Detailed Design)
–Software Code (including Unit Testing)
–Software Integration and Test
–Software Qualification Test
–Software Developmental Test & Evaluation
–Other (Mgt, QA, CM, PI, etc.)
Create effort distribution percentages for records that have hours in the requirements, architecture, code, integration, and qualification test activities (developmental test & evaluation and Other may or may not be blank)
Using the activity distribution to backfill missing activities makes results worse
We currently do not use the Software Requirements and Developmental Test hours (a sketch of the percentage calculation follows below)
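A minimal sketch of that mapping; which activities enter the percentage base is our reading of the slide (requirements and developmental test hours are excluded), and the record format and sample numbers are invented.

```python
# Illustrative sketch: build effort-distribution percentages from reported
# activity hours. A record qualifies only if it has hours in requirements,
# architecture, code, integration, and qualification test; per the slide,
# requirements and developmental test hours are not used, so percentages
# are taken over the remaining four activities (our reading of the slide).
REQUIRED = ("requirements", "architecture", "code", "integration", "qualification")
USED = ("architecture", "code", "integration", "qualification")

def effort_distribution(hours):
    """Return {activity: percent} over USED, or None if the record does not
    have positive hours in every REQUIRED activity."""
    if any(hours.get(a, 0) <= 0 for a in REQUIRED):
        return None
    total = sum(hours[a] for a in USED)
    return {a: round(100.0 * hours[a] / total, 1) for a in USED}

print(effort_distribution({
    "requirements": 800, "architecture": 1200, "code": 2500,
    "integration": 1500, "qualification": 900, "other": 600,
}))
```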

Team Experience
SRDR data definition: report the percentage of project personnel in each category
–Highly experienced in the domain (three or more years of experience)
–Nominally experienced in the project domain (one to three years of experience)
–Entry-level experienced (zero to one year of experience)
Team Experience (TXP) needs to be included in CERs to estimate cost
After analyzing the data, the following quantitative values were assigned:
–Highly experienced: 0.60
–Nominally experienced: 1.00
–Entry-level experienced: [value missing in transcript]
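A minimal sketch of how such category values might be combined into a single TXP multiplier from the reported personnel percentages; the weighted-average combination rule and the entry-level value are assumptions, since neither survived in the transcript.

```python
# Illustrative only: combine the per-category TXP values with the reported
# personnel percentages via a weighted average. The combination rule and the
# ENTRY_LEVEL value are assumptions; 0.60 and 1.00 are from the slide.
HIGH = 0.60          # highly experienced (from the slide)
NOMINAL = 1.00       # nominally experienced (from the slide)
ENTRY_LEVEL = 1.20   # placeholder value, not in the transcript

def team_experience(pct_high, pct_nominal, pct_entry):
    """Weighted TXP multiplier from the reported personnel percentages."""
    total = pct_high + pct_nominal + pct_entry
    return (pct_high * HIGH + pct_nominal * NOMINAL + pct_entry * ENTRY_LEVEL) / total

# Example: 30% highly experienced, 50% nominal, 20% entry-level
print(round(team_experience(30, 50, 20), 2))   # 0.92
```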

Analysis Steps

Derive EKSLOC

Size - Effort
Effort for small software configuration items (CIs) appears high. For larger CIs it appears more normal.

Productivity Conundrum
We can confirm the unusually high effort by comparing productivity to size. Productivity is lower for smaller sizes (5 to 40 EKSLOC) than for larger sizes. This is counter to what we believe to be true: productivity should be lower for larger software CIs, all other factors being held constant.
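A minimal sketch of the productivity comparison, computing equivalent SLOC per person-month and splitting records at the size range the slide mentions; the sample records are invented.

```python
# Illustrative only: compare productivity (ESLOC per person-month) between
# smaller and larger configuration items. The records below are made up.
from statistics import mean

records = [
    {"eksloc": 8,   "pm": 60},    # hypothetical smaller CIs
    {"eksloc": 25,  "pm": 150},
    {"eksloc": 35,  "pm": 190},
    {"eksloc": 80,  "pm": 260},   # hypothetical larger CIs
    {"eksloc": 150, "pm": 430},
]

def productivity(r):
    return 1000.0 * r["eksloc"] / r["pm"]   # equivalent SLOC per person-month

small = [productivity(r) for r in records if 5 <= r["eksloc"] <= 40]
large = [productivity(r) for r in records if r["eksloc"] > 40]
print(f"mean productivity, 5-40 EKSLOC: {mean(small):.0f} ESLOC/PM")
print(f"mean productivity, > 40 EKSLOC: {mean(large):.0f} ESLOC/PM")
```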

Large Project CER
Modeling this domain above 50 EKSLOC produces a model showing that more effort is required for larger CIs, which is what we expect. This model can be used in the analysis to help quantify the amount of “extra” effort in projects below 50 EKSLOC.

Overhead Function
If we apply the model from the previous slide to the data below 50 EKSLOC, we can express the difference between what we would expect (the model) and what we observe (the data) as a function of size. Yet another model can be created to express this decreasing difference with increasing size. We call this model the “Overhead Function” (a sketch follows below).
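A minimal sketch of the mechanics: fit a power-law CER on records above 50 EKSLOC, measure the gap between observed and predicted effort below 50 EKSLOC, fit that gap as a function of size, and subtract it from the observed effort. All data and functional forms are invented to show the idea, not the study's actual models; numpy is assumed available.

```python
# Illustrative only: large-project CER, overhead function, and the adjusted
# effort obtained by subtracting the overhead. All numbers are made up.
import numpy as np

def fit_power_law(size, effort):
    """Return (a, b) for effort = a * size^b via log-log least squares."""
    b, log_a = np.polyfit(np.log(size), np.log(effort), deg=1)
    return np.exp(log_a), b

size_large = np.array([60, 90, 150, 300]); pm_large = np.array([180, 260, 420, 800])
size_small = np.array([5, 10, 20, 40]);    pm_small = np.array([70, 75, 95, 140])

a, b = fit_power_law(size_large, pm_large)       # large-project CER
expected_small = a * size_small ** b             # what the CER predicts below 50
overhead = pm_small - expected_small             # observed "extra" effort

oa, ob = fit_power_law(size_small, overhead)     # overhead as a function of size
adjusted = pm_small - oa * size_small ** ob      # observed effort minus overhead
print(f"overhead function: {oa:.1f} * EKSLOC^{ob:.2f}")
print("adjusted small-project effort:", np.round(adjusted, 1))
```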

Derive Effort

Communications Domain
After applying the overhead function to the observed effort (by subtracting the overhead-function effort from the observed effort), we get an overall Cost Estimating Relationship that seems reasonable.

Command & Control Domain

Agenda
Introduction
SRDR Overview
–Review of Accepted Changes
–SRDR Baseline Issues
Data Analysis
Manual Reviewers Website

Manual Reviewers Website
Review and Discussion

Questions?
For more information, contact:
Ray Madachy
Brad Clark
Wilson Rosa