
1 AFCAA Database and Metrics Manual
Ray Madachy, Brad Clark, Barry Boehm, Thomas Tan
Wilson Rosa, Sponsor
University of Southern California, Center for Systems and Software Engineering
USC CSSE Annual Research Review, March 8, 2011

2 Agenda: Introduction; SRDR Overview (Review of Accepted Changes, SRDR Baseline Issues); Data Analysis; Manual Reviewers Website

3 Project Overview
Goal: improve the quality and consistency of estimating methods across cost agencies and program offices through guidance, standardization, and knowledge sharing.
The project is led by the Air Force Cost Analysis Agency (AFCAA), working with the service cost agencies and assisted by the University of Southern California and the Naval Postgraduate School.
The Metrics Manual will present data analysis from existing final SRDR data.
Additional information is crucial for improving data quality
– Data homogeneity is important

4 Software Cost Databases
Purpose: derive estimating relationships and benchmarks for size, cost, productivity, and quality.
Previous and current efforts to collect data from multiple projects and organizations:
– Data Analysis Center for Software (DACS)
– Software Engineering Information Repository (SEIR)
– International Software Benchmarking Standards Group (ISBSG)
– Large aerospace mergers (attempts to create company-wide databases)
– USAF Mosemann Initiative (Lloyd Mosemann, Asst. Sec. USAF)
– USC CSSE COCOMO II repository
– DoD Software Resources Data Report (SRDR)
All have faced common challenges such as data definitions, completeness, and integrity.

5 Research Objectives
Using SRDR data, improve the quality and consistency of estimating methods across cost agencies and program offices through guidance, standardization, and knowledge sharing.
– Characterize different application domains and operating environments within DoD
– Analyze collected data for simple Cost Estimating Relationships (CERs) within each domain
– Develop rules of thumb for missing data
Make collected data useful to oversight and management entities.
(Figure: data analysis of Data Records into CERs of the form Cost = a * X^b.)
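As a rough illustration of the CER form shown in the figure, the coefficients a and b can be estimated by ordinary least squares on log-transformed data. The sketch below uses made-up sample values, not SRDR data, and is not the study's actual fitting procedure.

```python
import numpy as np

# Hypothetical (size, effort) observations: KSLOC vs. person-months.
# These values are illustrative only, not SRDR data.
size_ksloc = np.array([12.0, 25.0, 40.0, 80.0, 150.0, 300.0])
effort_pm = np.array([9.0, 16.0, 23.0, 38.0, 60.0, 95.0])

# Fit Cost = a * X^b by linear regression in log-log space:
# log(PM) = log(a) + b * log(KSLOC)
b, log_a = np.polyfit(np.log(size_ksloc), np.log(effort_pm), 1)
a = np.exp(log_a)

print(f"CER: PM = {a:.2f} * KSLOC^{b:.2f}")
```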

6 Agenda: Introduction; SRDR Overview (Review of Accepted Changes, SRDR Baseline Issues); Data Analysis; Manual Reviewers Website

7 Software Resources Data Report
The Software Resources Data Report (SRDR) is used to obtain both the estimated and actual characteristics of new software developments or upgrades. Both the government program office and, after contract award, the software contractor submit this report. For contractors, the report is a contract data deliverable that formalizes the reporting of software metric and resource data.
All contractors developing or producing any software development element with a projected software effort greater than $20M (then-year dollars) on major contracts and subcontracts within ACAT I and ACAT IA programs, regardless of contract type, must submit SRDRs. The data collection and reporting apply to developments and upgrades whether performed under a commercial contract or internally by a government Central Design Activity (CDA) under the terms of a memorandum of understanding (MOU).
Reports are mandated for Initial Government, Initial Developer, and Final Developer submissions.

8 Submittal Process
Data are accessed through the Defense Cost and Resource Center (DCARC), http://dcarc.pae.osd.mil. The DCARC's Defense Automated Cost Information Management System (DACIMS) is the database of current and historical cost and software resource data used for independent, substantiated estimates.
(Figure: submittal flow involving the Government Program Office.)

9 Agenda: Introduction; SRDR Overview (Review of Accepted Changes, SRDR Baseline Issues); Data Analysis; Manual Reviewers Website

10 Proposed SRDR Modifications
Data analysis problems with project types, size and effort normalization, and multiple builds motivated some of these modifications:

11 Proposed SRDR Modifications

12 Proposed SRDR Modifications …

13 Proposed SRDR Modifications …

14 Modified Final Developer Form (1/3)

15 Modified Final Developer Form (2/3)

16 Modified Final Developer Form (3/3)

17 Product and Development Description
Our recommendation for one operating environment type and one application domain was not incorporated.

18 Staffing
Our expanded number of experience levels was adopted for Personnel Experience.

19 Size (1/2)
These recommendations were accepted:
– Requirements volatility changed from a relative scale to a percentage
– Reused code with modifications is our Modified code (report DM, CM, and IM)
– Reused code without modifications is our Reused code (DM = CM = 0; report IM)

20 Size (2/2)
Our recommendation for deleted code was accepted.

21 Resource and Schedule
We recommended breaking out QA, CM, and PM, which were previously reported under "Other".

22 Quality
We recommended more common defect metrics: Number of Defects Discovered and Number of Defects Removed. Mean Time to Serious or Critical Defect (MTTD) and Computed Reliability were dropped.

23 Agenda: Introduction; SRDR Overview (Review of Accepted Changes, SRDR Baseline Issues); Data Analysis; Manual Reviewers Website

24 SRDR Revisions
The SRDR is on a two-year update cycle. We submitted our recommendations in Spring 2010 and received a draft of the updated SRDR DID from the committee in Fall 2010, reflecting our changes and more. The adoption schedule is unclear; the DCARC website still shows the 2007 version posted at http://dcarc.pae.osd.mil/Policy/CSDR/csdrReporting.aspx.
Issue: which version(s) to cover in the manual for collection and analysis
– E.g., we previously had guidance to mitigate shortcomings of the 2007 version

25 Agenda: Introduction; SRDR Overview (Review of Accepted Changes, SRDR Baseline Issues); Data Analysis; Manual Reviewers Website

26 SRDR Raw Data (520 observations): PM = 1.67 * KSLOC^0.66
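As a quick sanity check on this raw-data relationship, evaluating it for a hypothetical 100 KSLOC component gives roughly 35 person-months; a minimal sketch:

```python
def srdr_raw_cer(ksloc: float) -> float:
    """Raw-data fit from slide 26: PM = 1.67 * KSLOC^0.66."""
    return 1.67 * ksloc ** 0.66

# Illustrative evaluation for a hypothetical 100 KSLOC component.
print(f"{srdr_raw_cer(100):.1f} PM")  # ~34.9 person-months
```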

27 Data Conditioning: segregate data, normalize sizing data, map effort distribution

28 SRDR Data Segmentation

29 SRDR Data Fields
Used:
– Contractor (assigned OID)
– Component description (sanitized)
– Development process
– Percentage personnel experience
– Peak staffing
– Amount of relative requirements volatility
– Primary and secondary language
– Months of duration by activity
– Hours of effort by activity, and comments
– Lines of code: new, modified, unmodified, auto-generated
Missing:
– Our application and environment classification
– Build information
– Effort in person-months
– Adapted code parameters: DM, CM, and IM
– Equivalent KSLOC

30 Derive Equivalent Size
Normalize the SLOC counting method to logical SLOC:
– Physical SLOC counts converted to logical SLOC counts by programming language
– Non-comment SLOC counts converted to logical SLOC counts by programming language
Convert auto-generated SLOC to equivalent SLOC (ESLOC):
– Use the AAF formula: AAF = (DM% * 0.4) + (CM% * 0.3) + (IM% * 0.3)
– DM = CM = 0; IM = 100
Convert reused SLOC to ESLOC with the AAF formula:
– DM = CM = 0; IM = 50
Convert modified SLOC to ESLOC:
– Use the AAF formula: AAF = (DM% * 0.4) + (CM% * 0.3) + (IM% * 0.3)
– Default values: Low / Mean / High based on a 90% confidence interval
Create the equivalent SLOC count and scale to thousands (K) to derive EKSLOC:
EKSLOC = (New + Auto-Generated + Reused + Modified) / 1000
Remove all records with an EKSLOC below 1.0
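A minimal sketch of the sizing rules above, assuming the AAF weights shown on the slide and percentage-valued DM/CM/IM; the function and parameter names are illustrative, not the SRDR schema.

```python
def aaf(dm_pct: float, cm_pct: float, im_pct: float) -> float:
    # Adaptation Adjustment Factor as a fraction of new-code cost:
    # AAF = (DM% * 0.4 + CM% * 0.3 + IM% * 0.3) / 100
    return (0.4 * dm_pct + 0.3 * cm_pct + 0.3 * im_pct) / 100.0

def equivalent_ksloc(new_sloc: float, auto_gen_sloc: float,
                     reused_sloc: float, modified_sloc: float,
                     mod_dm: float, mod_cm: float, mod_im: float) -> float:
    """Equivalent KSLOC per the sizing rules on the slide (illustrative names)."""
    esloc = (
        new_sloc
        + auto_gen_sloc * aaf(0, 0, 100)              # auto-generated: DM = CM = 0, IM = 100
        + reused_sloc * aaf(0, 0, 50)                 # reused: DM = CM = 0, IM = 50
        + modified_sloc * aaf(mod_dm, mod_cm, mod_im) # modified: reported DM/CM/IM
    )
    return esloc / 1000.0  # scale to thousands

# Hypothetical record; drop it if EKSLOC < 1.0 as the slide prescribes.
size = equivalent_ksloc(20_000, 5_000, 30_000, 10_000, mod_dm=25, mod_cm=30, mod_im=60)
print(f"EKSLOC = {size:.1f}", "keep" if size >= 1.0 else "drop")
```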

31 Convert Modified Size to ESLOC
Use the AAF formula: AAF = (DM% * 0.4) + (CM% * 0.3) + (IM% * 0.3)
Problems with missing DM, CM, and IM in the SRDR data:
– Program interviews provided the parameters for some records
– For missing data, use records that have values in all fields to derive recommended default values
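One simple way to implement that last step is to derive defaults from the complete records and substitute them into the incomplete ones. This sketch uses pandas with made-up column names; the manual's actual recommended values may be derived differently.

```python
import pandas as pd

# Hypothetical SRDR-like records; None marks missing adaptation parameters.
df = pd.DataFrame({
    "dm_pct": [10.0, None, 30.0, None],
    "cm_pct": [20.0, 25.0, None, None],
    "im_pct": [80.0, 90.0, 70.0, None],
})

# Derive recommended values from records that report all three fields...
complete = df.dropna(subset=["dm_pct", "cm_pct", "im_pct"])
defaults = complete[["dm_pct", "cm_pct", "im_pct"]].mean()

# ...and use them wherever a record is missing a value.
filled = df.fillna(defaults)
print(filled)
```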

32 Map Effort Distribution
Labor hours are reported for 7 activities:
– Software Requirements
– Software Architecture (including Detailed Design)
– Software Code (including Unit Testing)
– Software Integration and Test
– Software Qualification Test
– Software Developmental Test & Evaluation
– Other (Mgt, QA, CM, PI, etc.)
Create effort distribution percentages for records that have hours in the requirements, architecture, code, integration, and qualification test phases (the developmental test & evaluation and other phases may or may not be blank).
Using the activity distribution to backfill missing activities makes results worse.
Currently we don't use the Software Requirements and Developmental Test & Evaluation hours.
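A minimal sketch of building effort-distribution percentages for records that report the five required activities; the activity keys and hour values are illustrative, not the SRDR field names.

```python
# Activities that must be reported for a record to get a distribution.
REQUIRED = ["requirements", "architecture", "code",
            "integration_test", "qualification_test"]

def effort_distribution(hours: dict) -> dict | None:
    """Return percentage shares of the required activities, or None to skip the record."""
    if any(not hours.get(act) for act in REQUIRED):
        return None  # a required activity is missing or zero; skip the record
    total = sum(hours[act] for act in REQUIRED)
    return {act: round(100.0 * hours[act] / total, 1) for act in REQUIRED}

record = {"requirements": 800, "architecture": 1200, "code": 2400,
          "integration_test": 900, "qualification_test": 500, "other": 700}
print(effort_distribution(record))
```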

33 Team Experience
SRDR data definition: report the percentage of project personnel in each category
– Highly experienced in the domain (three or more years of experience)
– Nominally experienced in the project domain (one to three years of experience)
– Entry-level experienced (zero to one year of experience)
Team Experience (TXP) needs to be included in the CERs to estimate cost. After analyzing the data, the following quantitative values are assigned:
– Highly experienced: 0.60
– Nominally experienced: 1.00
– Entry-level experienced: 1.30
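One plausible way to combine these category values with the reported personnel percentages is a weighted average; the composite below is an illustrative assumption, not necessarily how the manual computes TXP.

```python
# Effort multipliers from the slide, keyed by experience category.
TXP_VALUES = {"high": 0.60, "nominal": 1.00, "entry": 1.30}

def team_experience(pct_high: float, pct_nominal: float, pct_entry: float) -> float:
    """Weighted-average TXP multiplier from reported personnel percentages.
    The weighting scheme is an illustrative assumption."""
    total = pct_high + pct_nominal + pct_entry
    return (pct_high * TXP_VALUES["high"]
            + pct_nominal * TXP_VALUES["nominal"]
            + pct_entry * TXP_VALUES["entry"]) / total

# Example: a team reported as 50% highly, 30% nominally, 20% entry-level experienced.
print(f"TXP = {team_experience(50, 30, 20):.2f}")  # ~0.86
```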

34 Analysis Steps

35 Derive EKSLOC

36 Size vs. Effort
Effort for small software configuration items (CIs) appears high. For larger CIs, it appears more normal.

37 Productivity Conundrum
We can confirm the unusually high effort by comparing productivity to size. Productivity is lower for smaller sizes (5 to 40 EKSLOC) than for larger sizes. This is counter to what we believe to be true: productivity should be lower for larger software CIs (all other factors held constant).
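A quick way to see this effect is to compute productivity per record and compare medians across size bins. The sketch below assumes productivity is defined as EKSLOC per person-month and uses synthetic data, not the SRDR records.

```python
import numpy as np
import pandas as pd

# Synthetic (EKSLOC, person-month) records, for illustration only.
rng = np.random.default_rng(0)
eksloc = rng.uniform(2, 300, 200)
pm = 1.67 * eksloc ** 0.66 * rng.lognormal(0, 0.3, 200)

# Productivity = size delivered per person-month, summarized by size bin.
df = pd.DataFrame({"eksloc": eksloc, "productivity": eksloc / pm})
df["size_bin"] = pd.cut(df["eksloc"], [0, 40, 100, 300])
print(df.groupby("size_bin", observed=True)["productivity"].median())
```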

38 Large Project CER
We can also see that modeling this domain above 50 EKSLOC produces a model in which more effort is required for larger CIs, which is what we expect. This model can be used in the analysis to help quantify the amount of "extra" effort in projects below 50 EKSLOC.

39 Overhead Function
If we apply the model from the previous slide to the data below 50 EKSLOC, we can express the difference between "what we would expect" (the model) and "what we observe" (the data) as a function of size. Yet another model can be created to express this decreasing difference with increasing size. Call this model the "Overhead Function".
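A sketch of how such an overhead function could be derived, assuming the large-project CER has already been fit and that the overhead itself follows a power law of size; all coefficients and data points here are made up for illustration.

```python
import numpy as np

def large_project_cer(eksloc):
    # Hypothetical large-project CER fit on records above 50 EKSLOC
    # (illustrative coefficients, not the manual's values).
    return 2.1 * eksloc ** 0.9

# Hypothetical observed records below 50 EKSLOC.
size = np.array([3.0, 6.0, 12.0, 20.0, 35.0, 48.0])
observed_pm = np.array([40.0, 35.0, 37.0, 44.0, 62.0, 77.0])

# Overhead = observed effort minus what the large-project model predicts.
overhead = observed_pm - large_project_cer(size)

# Fit the overhead itself as a power law of size (the "Overhead Function").
b, log_a = np.polyfit(np.log(size), np.log(overhead), 1)
print(f"Overhead(EKSLOC) = {np.exp(log_a):.2f} * EKSLOC^{b:.2f}")
```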

40 Derive Effort

41 Communications Domain
After applying the overhead function to the observed effort (i.e., subtracting the overhead-function effort from the observed effort), we get an overall Cost Estimating Relationship that seems reasonable.

42 Command & Control Domain

43 Agenda: Introduction; SRDR Overview (Review of Accepted Changes, SRDR Baseline Issues); Data Analysis; Manual Reviewers Website

44 Manual Reviewers Website
http://csse.usc.edu/afcaa/manual_draft/
Review and discussion

45 Questions?
For more information, contact:
Ray Madachy, rjmadach@nps.edu
Brad Clark, bkclark@csse.usc.edu
Wilson Rosa, Wilson.Rosa@pentagon.af.mil

