
ESD.36, 11/27/07, Ricardo Valerdi, PhD




Slide 1: ESD.36 11/27/07 Ricardo Valerdi, PhD [rvalerdi@mit.edu]

Slide 2: Roadmap
1. Why estimate systems engineering?
2. Explanation of COSYSMO
3. Limitations
4. Recent developments / next steps

Slide 3: "All models are wrong… but some of them are useful."

Slide 4: Systems Engineering Knowledge Hierarchy
- Systems Architecting heuristics (Rechtin 1991)
- Vee Model (Forsberg & Mooz 1995)
- SE Standards (ANSI/EIA 1999, ISO/IEC 2002)
- Maturity Models (CMMI 2002)
- COSYSMO (Valerdi et al. 2003)
- DoD Architecture Framework (DoDAF 2004)
- GUTSE [1] (Friedman 2004)
- Ontologies (Honour & Valerdi 2006)
- VBSSE [2] (Jain & Boehm 2006)

Progression: Observation → Classification → Abstraction → Quantification & Measurement → Symbolic Representation → Symbolic Manipulation → Prediction

[1] Grand Unified Theory of Systems Engineering
[2] Value-Based Systems & Software Engineering

Dixit, I., Valerdi, R., "Challenges in the Development of Systems Engineering as a Profession," INCOSE Symposium, San Diego, CA, June 2007.

Slide 5: Why measure systems engineering?
Figure: Cost Overrun as a Function of SE Effort, NASA data (Honour 2004)

Slide 6: Historical Overview of the COCOMO Suite of Models
- Software cost models: COCOMO 81 (1981), COCOMO II (2000), DBA COCOMO (2004), COINCOMO (2004)
- Software extensions: COQUALMO (1998), COPROMO (1998), COPSEMO (1998), CORADMO (1999), COCOTS (2000), iDAVE (2003), COPLIMO (2003), Security Extension / Costing Secure Systems (2004)
- Other independent estimation models: COSYSMO (2002), COSoSIMO (2004)

Legend (from the original chart): each model is marked as calibrated with historical project data and expert (Delphi) data, calibrated with expert (Delphi) data only, or derived from COCOMO II. Dates indicate when the first paper on the model was published.

Slide 7: State of the Practice
- Capability to measure systems engineering is limited in current cost models
- Possible approaches:
  - Heuristics / rules of thumb (Honour)
  - Analogy
  - % of SW or HW effort (COCOMO II, PRICE-H)
  - % of total effort (Honour)
  - A function of complexity (Ernstoff)
- Systems engineering is evolving:
  - INCOSE (est. 1992)
  - Standards (EIA/ANSI 632, EIA/ANSI 731, ISO/IEC 15288)
  - Academic degrees
- We can start where COCOMO left off…

Slide 8: Key Definitions & Concepts
- Calibration: the tuning of parameters based on project data
- CER: a model that represents the cost estimating relationships of factors
- Cost Estimation: prediction of both the person-effort and elapsed time of a project
- Driver: a factor that is highly correlated with the amount of systems engineering effort
- Parametric: an equation or model that is approximated by a set of parameters
- Rating Scale: a range of values and definitions for a particular driver
- Understanding: an individual's subjective judgment of their level of comprehension

Slide 9: COSYSMO Scope
- Addresses the first four phases of the system engineering life cycle (per ISO/IEC 15288): Conceptualize; Develop; Operational Test & Evaluation; Transition to Operation (the remaining phases are Operate, Maintain, or Enhance; Replace or Dismantle)
- Considers standard systems engineering Work Breakdown Structure tasks (per EIA/ANSI 632)

Slide 10: COSYSMO Operational Concept
- Inputs, size drivers: # Requirements, # Interfaces, # Scenarios, # Algorithms, plus 3 volatility factors
- Inputs, effort multipliers: application factors (8), team factors (6), schedule driver
- Output: effort estimate
- Calibration feeds back into the model

Slide 11: Model Form

PM_NS = A * [ Σ_k Σ_x (w_{k,x} * Φ_{k,x}) ]^E * Π_{j=1..14} EM_j

where:
- PM_NS = effort in person-months (nominal schedule)
- A = calibration constant derived from historical project data
- k = {REQ, IF, ALG, SCN}, the four size drivers
- x = {easy, nominal, difficult}
- w_{k,x} = weight for an "easy", "nominal", or "difficult" size driver
- Φ_{k,x} = quantity of size driver k at difficulty x
- E = exponent representing the diseconomy of scale
- EM_j = effort multiplier for the j-th cost driver; the geometric product of the EM_j is the overall effort adjustment factor applied to the nominal effort
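The model form can be sketched in a few lines of Python. This is illustrative only, not the calibrated model: the constants A and E, the driver counts, and the effort multipliers below are made-up values; only the size-driver weights are taken from the weights table later in the deck.

```python
# Sketch of the COSYSMO effort equation:
#   PM_NS = A * (sum of weighted size driver counts)^E * product(EM_j)
from math import prod

def cosysmo_effort(A, E, sizes, weights, effort_multipliers):
    """Nominal-schedule effort in person-months.

    `sizes` and `weights` map each driver k in {REQ, IF, ALG, SCN}
    to (easy, nominal, difficult) counts and weights.
    """
    size = sum(
        q * w
        for k in sizes
        for q, w in zip(sizes[k], weights[k])
    )
    return A * size ** E * prod(effort_multipliers)

# Hypothetical project; tuples are (easy, nominal, difficult) counts.
sizes = {"REQ": (20, 50, 10), "IF": (4, 6, 2),
         "ALG": (0, 5, 1), "SCN": (2, 3, 0)}
# Weights from the Size Driver Weights table (slide 27).
weights = {"REQ": (0.5, 1.0, 5.0), "IF": (1.7, 4.3, 9.8),
           "ALG": (3.4, 6.5, 18.2), "SCN": (9.8, 22.8, 47.4)}
# A, E, and the multipliers here are illustrative, not calibrated values.
pm = cosysmo_effort(A=0.25, E=1.06, sizes=sizes, weights=weights,
                    effort_multipliers=[1.37, 0.81, 1.0])
```

Note how the size drivers enter additively inside the bracket while the effort multipliers enter multiplicatively outside it; a later slide explains why.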

Slide 12: How is Systems Engineering Defined?
What is included from EIA/ANSI 632, "Processes for Engineering a System":
- Acquisition and Supply: Supply Process; Acquisition Process
- Technical Management: Planning Process; Assessment Process; Control Process
- System Design: Requirements Definition Process; Solution Definition Process
- Product Realization: Implementation Process; Transition to Use Process
- Technical Evaluation: Systems Analysis Process; Requirements Validation Process; System Verification Process; End Products Validation Process

Slide 13: COSYSMO Data Sources
- Boeing: Integrated Defense Systems (Seal Beach, CA)
- Raytheon: Intelligence & Information Systems (Garland, TX)
- Northrop Grumman: Mission Systems (Redondo Beach, CA)
- Lockheed Martin: Transportation & Security Solutions (Rockville, MD); Integrated Systems & Solutions (Valley Forge, PA); Systems Integration (Owego, NY); Aeronautics (Marietta, GA); Maritime Systems & Sensors (Manassas, VA; Baltimore, MD; Syracuse, NY)
- General Dynamics: Maritime Digital Systems/AIS (Pittsfield, MA); Surveillance & Reconnaissance Systems/AIS (Bloomington, MN)
- BAE Systems: National Security Solutions/ISS (San Diego, CA); Information & Electronic Warfare Systems (Nashua, NH)
- SAIC: Army Transformation (Orlando, FL); Integrated Data Solutions & Analysis (McLean, VA)
- L-3 Communications: Greenville, TX

Slide 14: 4 Size Drivers
1. Number of System Requirements
2. Number of System Interfaces
3. Number of System-Specific Algorithms
4. Number of Operational Scenarios
Each weighted by complexity, volatility, and degree of reuse.

Slide 15: Counting Rules Example
COSYSMO example for the sky, kite, sea, and underwater levels (Cockburn 2001):
- Sky level: build an SE cost model
- Kite level: adopt EIA 632 as the WBS and ISO 15288 as the life cycle standard
- Sea level: utilize size and cost drivers, definitions, and counting rules
- Underwater level: perform statistical analysis of data with software tools and implement the model in Excel

Slide 16: 14 Cost Drivers: Application Factors (8)
1. Requirements understanding
2. Architecture understanding
3. Level of service requirements
4. Migration complexity
5. Technology risk
6. Documentation match to life cycle needs
7. # and diversity of installations/platforms
8. # of recursive levels in the design

Slide 17: 14 Cost Drivers (cont.): Team Factors (6)
1. Stakeholder team cohesion
2. Personnel/team capability
3. Personnel experience/continuity
4. Process capability
5. Multisite coordination
6. Tool support

Slide 18: Effort Profiling
Maps SE effort across the ISO/IEC 15288 life cycle phases (Conceptualize; Develop; Operational Test & Evaluation; Transition to Operation; Operate, Maintain, or Enhance; Replace or Dismantle) against the EIA/ANSI 632 process groups (Acquisition & Supply; Technical Management; System Design; Product Realization; Technical Evaluation).

Slide 19: Limitations of the Model
1. Mostly qualitative drivers
2. Variance of Delphi responses
3. Small sample size
4. Aerospace-heavy data set
5. Calibration is biased toward successful projects: successful projects share their data, troubled ones don't
6. The model will not work outside of its calibrated range
7. A fool with a tool is still a fool

Slide 20: Implementations
Academic prototype, commercial implementations, and proprietary implementations, including COSYSMO-R, SECOST, and SEEMaP.

Slide 21: COSYSMO is…
- An evolving hypothesis
- A community of practice: over 200 practitioners on the COSYSMO distribution list; three workshops per year
- A collection of knowledge from over 500 years of combined experience (expert opinion) and 62 completed programs (historical data)
- A model, a tool, and an upcoming book

Slide 22: Latest COSYSMO Innovations
- Risk (with Lockheed Martin): a risk extension adds probability distributions for the size and cost drivers. Gaffney, J., Valerdi, R., "Reducing Risk and Uncertainty in COSYSMO Size and Cost Drivers: Some Techniques for Enhancing Accuracy," Conference on Systems Engineering Research, Hoboken, NJ, April 2006.
- Reuse (with BAE Systems, Lockheed Martin, Raytheon): motivated by feedback from Bradley Fleming et al. (LMCO Syracuse); addresses new, modified, reused, and deleted requirements. Valerdi, R., Gaffney, J., "Extending COSYSMO to Accommodate Reuse," 21st COCOMO Forum, Herndon, VA, November 2006.
- Adoption (with Systems & Software Consortium): provides a 10-step process for adoption. Miller, C., Valerdi, R., "COSYSMO Adoption Process," 21st COCOMO Forum, Herndon, VA, November 2006.

Slide 23: Next Steps
- Integration between systems and software cost estimation (COSYSMO/COCOMO II overlap)
- Impact of diseconomies of scale
- Estimation of SE effort in the Operation & Maintenance phases
- Local calibration for specific domains (e.g., space systems)

Slide 24: COSYSMO Development Support
- MIT Lean Advancement Initiative corporate affiliates
- USC Center for Systems & Software Engineering corporate affiliates
- Air Force Space & Missile Systems Center
- INCOSE: Measurement Working Group; Corporate Advisory Board

Slide 25: Contact
Ricardo Valerdi, MIT
rvalerdi@mit.edu
(617) 253-8583
www.valerdi.com/cosysmo

Slide 26: Backup slides

Slide 27: Size Driver Weights

Size driver                   Easy   Nominal  Difficult
# of System Requirements      0.5    1.00     5.0
# of Interfaces               1.7    4.3      9.8
# of Critical Algorithms      3.4    6.5      18.2
# of Operational Scenarios    9.8    22.8     47.4

Slide 28: Cost Driver Rating Scales

Cost driver                                 Very Low  Low   Nominal  High  Very High  Extra High  EMR
Requirements understanding                  1.87      1.37  1.00     0.77  0.60                   3.12
Architecture understanding                  1.64      1.28  1.00     0.81  0.65                   2.52
Level of service requirements               0.62      0.79  1.00     1.36  1.85                   2.98
Migration complexity                                        1.00     1.25  1.55       1.93        1.93
Technology risk                             0.67      0.82  1.00     1.32  1.75                   2.61
Documentation                               0.78      0.88  1.00     1.13  1.28                   1.64
# and diversity of installations/platforms                  1.00     1.23  1.52       1.87        1.87
# of recursive levels in the design         0.76      0.87  1.00     1.21  1.47                   1.93
Stakeholder team cohesion                   1.50      1.22  1.00     0.81  0.65                   2.31
Personnel/team capability                   1.50      1.22  1.00     0.81  0.65                   2.31
Personnel experience/continuity             1.48      1.22  1.00     0.82  0.67                   2.21
Process capability                          1.47      1.21  1.00     0.88  0.77       0.68        2.16
Multisite coordination                      1.39      1.18  1.00     0.90  0.80       0.72        1.93
Tool support                                1.39      1.18  1.00     0.85  0.72                   1.93
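The EMR (Effort Multiplier Ratio) column is the ratio of a driver's highest multiplier to its lowest, i.e. the total effort swing that one driver can cause. A quick check for the Requirements Understanding row:

```python
# EMR = max rating value / min rating value for one cost driver.
# Values below are the Requirements Understanding row of the table.
rating_scale = {
    "Very Low": 1.87, "Low": 1.37, "Nominal": 1.00,
    "High": 0.77, "Very High": 0.60,
}
emr = max(rating_scale.values()) / min(rating_scale.values())
# 1.87 / 0.60 reproduces the table's EMR of 3.12
assert round(emr, 2) == 3.12
```

The same calculation reproduces the other EMR values in the table, which is a useful sanity check when transcribing driver data.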


Slide 30: 7-Step Modeling Methodology
1. Analyze existing literature
2. Perform behavioral analysis
3. Identify relative significance
4. Perform expert-judgement (Delphi) assessment
5. Gather project data (determine statistical significance)
6. Determine Bayesian a-posteriori update
7. Gather more data; refine model
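Step 6 combines the expert (Delphi) estimates with the project data. One standard way to do this, a precision-weighted Bayesian update for normally distributed estimates, can be sketched as follows; the means and variances below are hypothetical, not COSYSMO calibration values.

```python
# Precision-weighted Bayesian combination of expert opinion and data:
# each source is weighted by its precision (1 / variance), so the
# more certain source pulls the posterior toward itself.
def bayesian_update(expert_mean, expert_var, data_mean, data_var):
    w_expert = 1.0 / expert_var
    w_data = 1.0 / data_var
    return (w_expert * expert_mean + w_data * data_mean) / (w_expert + w_data)

# Hypothetical driver weight: experts say 4.3 (variance 0.5),
# the regression on project data says 5.1 (variance 1.0).
posterior = bayesian_update(4.3, 0.5, 5.1, 1.0)
```

Because the expert estimate here has the smaller variance, the posterior lands closer to 4.3 than to 5.1.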

Slide 31: Size Drivers vs. Effort Multipliers
- Size drivers are additive and incremental: the impact of adding a new item is inversely proportional to current size (10 → 11 requirements is a 10% increase; 100 → 101 requirements is a 1% increase)
- Effort multipliers are multiplicative and system-wide: the impact of adding a new item is independent of current size (10 requirements + high security = 40% increase; 100 requirements + high security = 40% increase)
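The contrast on this slide can be checked numerically. The 1.4x security multiplier below is an illustrative value matching the slide's "40% increase" example, not a calibrated COSYSMO rating.

```python
# Additive (size driver) vs. multiplicative (effort multiplier) impact.
def pct_increase(old, new):
    return 100.0 * (new - old) / old

# Size drivers: one extra requirement matters less as the system grows.
small = pct_increase(10, 11)    # 10 -> 11 requirements
large = pct_increase(100, 101)  # 100 -> 101 requirements

# Effort multipliers: a 1.4x multiplier (e.g. a demanding security
# requirement) raises effort 40% regardless of how big the system is.
em_small = pct_increase(10, 10 * 1.4)
em_large = pct_increase(100, 100 * 1.4)
```

The first pair shrinks from 10% to 1% as size grows; the second pair is 40% in both cases, which is exactly the distinction the slide draws.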

Slide 32: Risk Conditions (figure)

Slide 33: Reuse Terminology
- New: items that are completely new
- Managed: items that are incorporated and require no added SE effort other than technical management
- Adopted: items that are incorporated unmodified but require verification and validation
- Modified: items that are incorporated but require tailoring or interface changes, plus verification and validation
- Deleted: items that are removed from a legacy system, requiring design analysis, tailoring or interface changes, and verification and validation
Notes: New items are generally unprecedented. Items that are inherited but require architecture or implementation changes should be counted as New.

Slide 34: Modified vs. New Threshold (Reuse Continuum)
Reuse weights on a scale from 1.0 (all-new) down to 0:
- New: 1.0
- Modified: 0.65
- Deleted: 0.51
- Adopted: 0.43
- Managed: 0.15
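Using the reuse weights from slide 34, a raw requirements count can be converted into an equivalent "all-new" count before it enters the size calculation. The category-to-weight mapping and the project counts below are illustrative assumptions, not data from the COSYSMO calibration set.

```python
# Reuse-weighted equivalent requirement count (weights from slide 34).
REUSE_WEIGHT = {"New": 1.0, "Modified": 0.65, "Deleted": 0.51,
                "Adopted": 0.43, "Managed": 0.15}

def equivalent_requirements(counts):
    """Collapse per-category requirement counts into one equivalent count."""
    return sum(REUSE_WEIGHT[cat] * n for cat, n in counts.items())

# Hypothetical program: 100 new, 40 modified, 20 adopted, 10 managed.
eq = equivalent_requirements({"New": 100, "Modified": 40,
                              "Adopted": 20, "Managed": 10})
```

The 170 raw requirements collapse to well under 150 equivalent requirements, which is how the reuse extension prevents inherited items from inflating the size estimate.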

Slide 35: Data Before Reuse Application (figure)

Slide 36: Same Data After Reuse Application (figure)

Slide 37: Improved Correlation Achieved for Similar Programs (figure showing PRED(30), PRED(25), PRED(20))
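PRED(L) is the standard accuracy metric for parametric cost models: the fraction of projects whose estimate falls within L% of the actual effort. A minimal sketch with hypothetical numbers (none of these values come from the COSYSMO data set):

```python
# PRED(L): fraction of estimates within L percent of the actuals.
def pred(level, estimates, actuals):
    within = sum(
        abs(est - act) / act <= level / 100.0
        for est, act in zip(estimates, actuals)
    )
    return within / len(actuals)

# Hypothetical estimated vs. actual SE effort (person-months).
actuals = [100, 80, 120, 60, 150]
estimates = [110, 70, 200, 58, 130]
pred30 = pred(30, estimates, actuals)  # 4 of 5 within 30%
```

A rising PRED(30) after applying the reuse weights is exactly the improvement slide 37 reports for similar programs.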

Slide 38: SE Cost Estimation Life Cycle using COSYSMO
Each stage is a V&V opportunity:
- Historical data collection: call for participation; check relevance / informal mapping
- Training: champion training for users; understand inputs and identify pilot programs
- Piloting: informal mapping at the WBS level; test run of the industry-calibrated model
- Local calibration: tailor COSYSMO to the organization
- Institutionalization / adoption: large-scale rollout to other projects


