COSYSMO: COnstructive SYStems Engineering Cost MOdel
Ricardo Valerdi
USC Annual Research Review
March 11, 2002
Outline
- Background on COSYSMO
- EIA 632 Approach
- Delphi Survey
- Delphi Round 1 Results
- Analysis/Conclusions
- Lessons Learned/Improvements
- The Next Step
- Q & A
“All models are wrong, but some of them are useful” – George E. P. Box
What is it?
The purpose of the COSYSMO project is to enhance the current capability of the COCOMO II model by accounting for costs outside the realm of software engineering, through the introduction of systems engineering cost drivers.
The Challenge
To develop a preliminary model for estimating the cost impact of front-end systems engineering tasks in the design of software-intensive systems. These include system definition, integration, and test activities as defined in standard EIA 632.
Approach
- Begin with front-end costs of information systems engineering
- Follow 7-step modeling methodology
  – Steps 5, 6, & 7 (gather data and refine loop)
- Use model parameters compiled by TRW, SAIC, Raytheon, and USC/CSE
  – System size (requirements, TPMs, I/F)
  – Effort drivers (maturity, cohesion, stability)
COCOMO Suite
COCOMO II, COQUALMO, COPSEMO, CORADMO, COSYSMO, COPROMO, COCOTS
COSYSMO Operational Concept
Size Drivers: # Requirements, # Interfaces, # TPMs, # Scenarios, # Modes, # Platforms, # Algorithms
Cost Drivers: 7 application factors, 8 team factors
Outputs: Effort and Duration, with Calibration
WBS guided by EIA 632; COCOMO II-based model
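Since COSYSMO is described as a COCOMO II-based model, its operational concept can be sketched with the COCOMO-family effort form: effort grows with size raised to a scale exponent, adjusted by multiplicative cost-driver ratings. The constants A and E and the sample ratings below are hypothetical placeholders, not calibrated COSYSMO values.

```python
# Illustrative COCOMO II-style effort equation:
#   person-months = A * size^E * (product of effort-multiplier ratings)
# A, E, and the ratings here are hypothetical, not calibrated values.

def estimate_effort(size, effort_multipliers, A=1.0, E=1.06):
    """Estimate effort from a size input and cost-driver ratings."""
    product = 1.0
    for em in effort_multipliers:
        product *= em  # nominal rating is 1.0; >1.0 adds effort
    return A * size ** E * product

# Example: 120 weighted size units, two drivers rated above nominal
print(round(estimate_effort(120, [1.12, 1.05]), 1))
```

The multiplicative driver form is what makes the EMR (ratio of a driver's highest to lowest rating) a direct measure of how much one driver can swing the estimate.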
EIA 632 activities organized into 5 groups:
1. Acquisition and supply
2. Technical management
   – Planning process
   – Assessment process
   – Control process
3. System design
   – Requirements definition process
   – Solution definition process
4. Product realization
5. Technical evaluation
Delphi Survey
3 sections: Scope, Size, Cost
- Used to determine the range for size driver and effort multiplier ratings
- Identify the cost drivers to which effort is most sensitive
- Reach consensus among systems engineering experts
Size Drivers
1. Number of System Requirements
2. Number of Major Interfaces
3. Number of Technical Performance Measures
4. Number of Operational Scenarios
5. Number of Modes of Operation
6. Number of Different Platforms
7. Number of Unique Algorithms
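A common way to turn counts like these into a single size input for the effort equation is a weighted sum, so that (say) one unique algorithm contributes more size than one requirement. A minimal sketch under that assumption; the weights are hypothetical placeholders, not calibrated COSYSMO values.

```python
# Hedged sketch: aggregate the seven size-driver counts into one size
# input as a weighted sum. Weights are hypothetical, not calibrated.

WEIGHTS = {
    "requirements": 1.0,
    "interfaces":   3.0,
    "tpms":         2.5,
    "scenarios":    2.0,
    "modes":        1.5,
    "platforms":    2.0,
    "algorithms":   3.5,
}

def aggregate_size(counts):
    """Weighted sum of size-driver counts (drivers absent from counts add 0)."""
    return sum(WEIGHTS[driver] * n for driver, n in counts.items())

print(aggregate_size({"requirements": 100, "interfaces": 5, "algorithms": 3}))
# → 125.5
```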
Cost Drivers – Application Factors (7)
1. Requirements understanding
2. Architecture understanding
3. Level of service requirements, criticality, difficulty
4. Legacy transition complexity
5. COTS assessment complexity
6. Platform difficulty
7. Required business process reengineering
Cost Drivers (cont.) – Team Factors (8)
1. Number and diversity of stakeholder communities
2. Stakeholder team cohesion
3. Personnel capability
4. Personnel experience/continuity
5. Process maturity
6. Multisite coordination
7. Formality of deliverables
8. Tool support
Delphi Round 1
23 surveys returned:
  Aerospace – 2
  Galorath – 1
  Lockheed Martin – 8
  Raytheon – 4
  SAIC – 6
  TRW – 1
  USC – 1
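The round-1 results on the following slides are reported as means and standard deviations across the 23 respondents. A minimal sketch of that aggregation step; the response values are made up for illustration, not actual Delphi data.

```python
# Sketch of the Delphi aggregation: summarize one driver's ratings
# across respondents by mean and sample standard deviation.
# The response values below are hypothetical, not survey results.
import statistics

responses = [1.5, 1.7, 1.6, 2.0, 1.4, 1.8]  # hypothetical ratings for one driver

mean = statistics.mean(responses)
stdev = statistics.stdev(responses)  # sample standard deviation
print(f"mean={mean:.2f}, stdev={stdev:.2f}")
```

A large standard deviation flags a driver on which the experts disagree, which is exactly what a later Delphi round would revisit.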
Systems Engineering Effort per EIA 632 Stage
  Supplier Performance – 5.2%
  Technical Management – 13.1%
  Requirements Definition – 16.6%
  Solution Definition – 18.1%
  Systems Analysis – 19.2%
  Requirements Validation – 11.3%
  Design Solution Verification – 10.5%
  End Products Validation – 6.6%
(Delphi means shown; the chart's suggested values of 5%, 15%, 20%, 15%, and 5% and the standard deviations are not recoverable per stage.)
Delphi Round 1 Highlights (cont.)
Range of sensitivity (relative effort) for size drivers: # Algorithms, # Requirements, # Interfaces, # TPMs, # Scenarios, # Modes, # Platforms (maximum relative effort shown: 5.57)
Two Most Sensitive Size Drivers: # Interfaces and # Algorithms
(Chart compared the suggested relative effort with the Delphi respondents' relative effort and standard deviation.)
Delphi Round 1 Highlights (cont.)
Range of sensitivity (EMR) for cost drivers – application factors: Requirements understanding, Architecture understanding, Level of service requirements, Legacy transition, COTS, Platform difficulty, Business process reengineering
Delphi Round 1 Highlights (cont.)
Range of sensitivity (EMR) for cost drivers – team factors: Tool support, Stakeholder communities, Stakeholder cohesion, Personnel capability, Personnel experience, Process maturity, Multisite coordination, Formality of deliverables
Four Most Sensitive Cost Drivers: Architecture understanding, Requirements understanding, Personnel capability, Level of service requirements
(Chart compared the suggested EMR with the Delphi respondents' mean EMR and standard deviation.)
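The EMR (effort multiplier ratio) used to rank these drivers is conventionally a driver's highest rating divided by its lowest, i.e. the maximum swing that driver can put on the estimate. A minimal sketch under that assumption; the rating values are hypothetical, not the Delphi results.

```python
# Hedged sketch: EMR as the ratio of a cost driver's highest to lowest
# effort-multiplier rating. Rating values are hypothetical placeholders.

def emr(ratings):
    """Effort multiplier ratio: highest rating over lowest rating."""
    return max(ratings) / min(ratings)

drivers = {
    "Requirements understanding": [0.77, 0.88, 1.00, 1.15, 1.32],
    "Tool support":               [0.90, 0.95, 1.00, 1.05, 1.10],
}

# Rank drivers by sensitivity, largest EMR first
for name, ratings in sorted(drivers.items(), key=lambda kv: -emr(kv[1])):
    print(f"{name}: EMR = {emr(ratings):.2f}")
```

A driver with EMR near 1.0 barely moves the estimate regardless of its rating, which is why the survey focuses attention on the drivers with the widest ranges.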
Conclusions
Not only do we need to better manage requirements, we also need to manage:
1) # of Interfaces
2) # of Algorithms
3) Personnel Capability
4) Level of service requirements, criticality, difficulty
5) Level of understanding
“Control the controllables”
Lessons Learned/Improvements
Lesson 1 – There are many people and groups interested in more precisely estimating system costs, and they are willing to help do it for free.
Lesson 2 – Currently, systems engineering effort is estimated using activity-based costing heuristics.
Lesson 3 – When mounting a Delphi, clearly identify what you are trying to do; otherwise, systems engineers will attack you with a shotgun.
Lesson 4 – We could use help in developing a better-designed template for Delphi instruments.
The Next Steps
- Incorporate suggestions from Delphi 1
- Write report
- Get Master's degree
- Data from completed systems will then be used to statistically confirm or deny the initial ratings
- Round 2 of Delphi …in the future
Special Thanks to:
Advisors: Dr. Boehm, Dr. Axelband, Don Reifer
Affiliates: Gary Hafen, Tony Jordano, Chris Miller, Karen Owens, Don Reifer, Garry Roedler, Evin Stump, Gary Thomas, Marilee Wheaton
Where can I get more info?
COSYSMO Working Group meeting, Thursday, March 14th, 12:00–5:00
valerdi.com/cosysmo
Delphi Round 1 Participants