
Estimating System-of-System Architecture Definition and Integration Effort (COSOSIMO)
Jo Ann Lane
University of Southern California Center for Software Engineering
PSM, July 2004

Goals of This Presentation
- Provide an overview of:
  - System-of-systems concepts
  - The system-of-systems activities to be covered by the cost model
  - The cost model approach, concepts, and definitions
  - Current issues/questions under investigation
- Present an example using the current investigational version of the cost model
- Solicit data for further model investigation and calibration
- Obtain feedback/suggestions on the approach and data collection

System of Systems (SoS) Concept
[Figure: hierarchy diagram showing the SoS composed of subsystems S1, S2, ..., Sm, each in turn composed of components Si1, Si2, ..., Sin.]

High-Level Partitioning of Cost Models
[Figure: layered diagram mapping life-cycle activities to the three cost models. COSOSIMO covers SoS-level architecting and SoS integration/test; COSYSMO covers system-level requirements analysis, architecting, and system integration/test; COCOMO II covers software preliminary design, detailed design, coding, unit test, integration, and software acceptance test.]

Constructive System-of-System Integration Cost Model (COSOSIMO)
- Parametric model to estimate the effort associated with the definition and integration of software-intensive "system of systems" components
- Includes at least one size driver and six exponential scale factors related to effort
- Targets input parameters that can be determined in early phases
- Goal is to have zero overlap with COCOMO II and COSYSMO

Key Activities Covered by Each Cost Model

COSOSIMO
- SoS architecture definition
- SoS integration activities
- Development of the SoS integration lab
- Development of SoS-level test plans and procedures
- Execution of SoS test procedures
- High-level isolation of problems detected during integration

COSYSMO
- System/sub-system definition (operational concepts, operational scenarios)
- System/sub-system elaboration
- System integration and test
- Resolution of system-level errors detected during test activities
- Deployment
- Maintenance

COCOMO II
- Application and system software development (elaboration, construction)
- Development of test tools and simulators (estimated as a separate set of software)
- Resolution of software errors detected during test activities

Model Differences

COSOSIMO
- System-of-systems architecture definition and integration (pre- and post-COCOMO II effort)
- Very new; only expert validation
- 6 exponential scale factors
- Candidate size drivers: effective KSLOC (eKSLOC), logical interfaces at the SoS level

COSYSMO
- Systems engineering over the entire life cycle
- 3 years old; 11 calibration points
- 18 drivers
- Size is driven by requirements, interfaces, algorithms, and operational scenarios

COCOMO II
- Software development (development phases)
- 20+ years old; 161 calibration points
- 23 drivers
- Size is driven by effective SLOC (eSLOC)

COSOSIMO Operational Concept
[Figure: the size drivers (interface-related eKSLOC, number of logical interfaces at the SoS level) and the six exponential scale factors (integration simplicity, integration risk resolution, integration stability, component readiness, integration capability, integration processes), together with calibration data, feed the COSOSIMO model, which produces the SoS definition and integration effort estimate.]
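To make the shape of these inputs concrete, here is a minimal Python sketch of a container for the size drivers and scale-factor ratings named above; the class and field names are illustrative assumptions, not model terminology.

```python
# Illustrative container for the COSOSIMO inputs shown in the operational concept.
# Class and field names are assumptions for readability, not official model terms.
from dataclasses import dataclass, field
from typing import Dict


@dataclass
class CososimoInputs:
    # Size drivers
    interface_related_eksloc: float = 0.0        # interface-related effective KSLOC
    logical_interfaces_at_sos_level: int = 0     # logical interfaces at the SoS level
    # The six exponential scale factors, each rated Very Low .. Extra High
    scale_factor_ratings: Dict[str, str] = field(default_factory=lambda: {
        "Integration simplicity": "Nominal",
        "Integration risk resolution": "Nominal",
        "Integration stability": "Nominal",
        "Component readiness": "Nominal",
        "Integration capability": "Nominal",
        "Integration processes": "Nominal",
    })


inputs = CososimoInputs(interface_related_eksloc=1200.0, logical_interfaces_at_sos_level=14)
print(inputs.scale_factor_ratings["Component readiness"])  # Nominal
```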

COSOSIMO Model Equations

Two-level model:
- Level 1 first determines the integration effort for each first-level subsystem:
  $IPM(S_i) = A_i \cdot \sum_{j=1}^{n_i} \mathrm{Size}(S_{ij})^{B_i}$
- Level 0 then uses the subsystem integration efforts and SoS characteristics to determine the SoS integration effort:
  $IPM(SoS) = A_0 \cdot \sum_{i=1}^{m} IPM(S_i)^{B_0}$

[Figure: two-level hierarchy with the SoS at Level 0 and subsystems S1 ... Sm, each with components Si1 ... Sin, at Level 1.]

COSOSIMO Model Parameters
- IPM: integration effort in person-months
- S_i: the ith subsystem within the SoS
- A: constant derived from historical project data
- Size: determined by computing the weighted average of the size driver(s)
- n_i: number of level-2 subsystem components comprising the ith subsystem
- m: number of level-1 subsystem components comprising the SoS
- B_i: effort exponent for the ith subsystem, based on the subsystem's six exponential scale factors; the sum of the scale factors results in an overall exponential adjustment to the nominal effort
- B_0: effort exponent for the SoS, based on the SoS's six exponential scale factors; the sum of the scale factors results in an overall exponential adjustment to the nominal effort
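The sketch below (Python, purely illustrative) shows how the two levels compose under the reconstructed equations above; the A and B constants and the example sizes are placeholder assumptions, not calibrated COSOSIMO values.

```python
# Minimal sketch of the two-level COSOSIMO roll-up. The A/B constants and the
# example component sizes are placeholders, not calibrated model values.

def level1_effort(component_sizes, a_i, b_i):
    """IPM(S_i) = A_i * sum_j Size(S_ij)^B_i: integration effort for one subsystem."""
    return a_i * sum(size ** b_i for size in component_sizes)


def level0_effort(subsystem_efforts, a_0, b_0):
    """IPM(SoS) = A_0 * sum_i IPM(S_i)^B_0: SoS-level integration effort."""
    return a_0 * sum(ipm ** b_0 for ipm in subsystem_efforts)


# Hypothetical SoS with two subsystems; component sizes are in eKSLOC.
subsystems = [[120.0, 80.0], [200.0, 50.0, 30.0]]
a_i, b_i = 1.0, 1.05   # assumed nominal Level 1 calibration
a_0, b_0 = 1.0, 1.05   # assumed nominal Level 0 calibration

level1 = [level1_effort(sizes, a_i, b_i) for sizes in subsystems]
print("Level 1 efforts (PM):", [round(e, 1) for e in level1])
print("Level 0 SoS effort (PM):", round(level0_effort(level1, a_0, b_0), 1))
```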

Current Level 1 Size Driver
- Subsystem development size, measured in effective KSLOC (eKSLOC)
- eKSLOC can be calculated using COCOMO II
- Size is weighted by:
  - Complexity
  - Volatility
  - Degree of COTS/reuse
[Figure: subsystems S1 through S4, each contributing a weighted size.]
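The weighting formula is not given on the slide; as a hedged illustration of the idea, the following assumes simple multiplicative adjustment factors for complexity, volatility, and COTS/reuse.

```python
# Hedged sketch of weighting a subsystem's eKSLOC by complexity, volatility, and
# degree of COTS/reuse. The multiplicative scheme and the factor values are
# illustrative assumptions; the slide names the dimensions but not the formula.

def weighted_eksloc(raw_eksloc, complexity=1.0, volatility=1.0, reuse_factor=1.0):
    """Adjusted size: factors > 1.0 inflate size, factors < 1.0 (e.g. heavy reuse) deflate it."""
    return raw_eksloc * complexity * volatility * reuse_factor


# Example: a 500 eKSLOC subsystem with difficult algorithms, moderate volatility,
# and substantial COTS/reuse.
print(round(weighted_eksloc(500, complexity=1.3, volatility=1.1, reuse_factor=0.8), 1))  # 572.0
```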

Additional Proposed Size Drivers
- Number of major interfaces
- Number of operational scenarios
Each weighted by:
- Complexity
- Volatility
- Degree of COTS/reuse
[Figure: subsystems S1 through S4.]

Proposed Size Driver Definitions

Subsystem Software Size
This driver represents the software subsystem size. It is measured in effective thousand lines of code (eKSLOC). eKSLOC can be calculated using COCOMO II or a comparable estimation model or technique.
- Easy: simple algorithms; basic math; straightforward structure; simple data; timing not an issue
- Nominal: straightforward but non-trivial algorithms; algebraic by nature; nested structure with decision logic; relational data; timing a constraint
- Difficult: complex algorithms; difficult math (calculus); recursive structure with distributed control; persistent data; dynamic, with timing issues

Number of Major Interfaces
This driver represents the number of shared major physical and logical boundaries between subsystem components or functions (internal interfaces) and those external to the subsystem (external interfaces). These interfaces can typically be quantified by counting the interfaces identified in the subsystem's context diagram and/or the significant interfaces in all applicable Interface Control Documents.
- Easy: well defined; uncoupled; cohesive; well behaved
- Nominal: loosely defined; loosely coupled; moderate cohesion; predictable behavior
- Difficult: ill defined; highly coupled; low cohesion; poorly behaved

Number of Operational Scenarios
This driver represents the number of operational scenarios that a system must satisfy. Such threads typically result in end-to-end test scenarios developed to validate the system and satisfy all of its requirements. The number of scenarios can typically be quantified by counting the unique end-to-end tests used to validate system functionality and performance, or by counting the high-level use cases developed as part of the operational architecture.
- Easy: well defined; loosely coupled; timelines not an issue
- Nominal: loosely defined; moderately coupled; timelines a constraint
- Difficult: ill defined; tightly coupled or many dependencies/conflicting requirements; tight timelines through the scenario network

Proposed Exponential Scale Factor Definitions

Integration Simplicity (ISMPL)
Represents a parameter that covers system component coupling, processing criticality, scope of key performance parameters, and system precedentedness.
- Very Low: very strong coupling; very strong criticality; cross-cutting key performance parameters; highly unprecedented
- Low: strong coupling; strong criticality; mostly unprecedented
- Nominal: both strong and weak coupling; mixed criticality; partly unprecedented
- High: moderate coupling; moderate criticality; some new aspects
- Very High: weak coupling; low criticality; few new aspects
- Extra High: very weak coupling; no cross-cutting key performance parameters; no new aspects

Integration Risk Resolution (IRESL)
Represents a multi-attribute parameter that covers the number of integration risk items, the risk management/mitigation plan, compatible schedules and budgets, expert availability, tool support, and the level of uncertainty in integration risk areas. IRESL is the subjective weighted average of the following characteristics:
- Number and criticality of risk items: > 10 critical (Very Low); 5-10 critical (Low); 2-4 critical (Nominal); 1 critical (High); < 10 non-critical (Very High)
- Risk mitigation activities: none (Very Low); little (Low); some (Nominal); risks generally covered (High); risks fully covered (Very High)
- Schedule, budget, and internal milestones compatible with the Risk Management Plan and integration scope: generally (High); mostly (Very High)
- % of top software system integrators available to the project: 20% (Very Low); 40% (Low); 60% (Nominal); 80% (High); 100% (Very High)
- Tool support available for tracking issues: good (High); strong (Very High)
- Level of uncertainty in the integration risk area: extreme (Very Low); significant (Low); considerable (Nominal)

Integration Stability (ISBLY)
Indicates the anticipated change in integration components during system-of-systems integration activities.
- Very Low: 10% change during the integration period
- Low: 7% change
- Nominal: 4% change
- High: 2% change
- Very High: 1% change
- Extra High: no change during the integration period

Component Readiness (CREDY)
Indicates the readiness of a component (sub-component) for integration. Includes the level of verification and validation (V&V) performed prior to integration and the level of subsystem integration activities performed prior to integration into the SOSIL.
- Very Low: minimally V&V'd; no pre-integration
- Low: some V&V; minimal pre-integration
- Nominal: moderate V&V; some pre-integration
- High: considerable V&V; moderate pre-integration
- Very High: extensive V&V; considerable pre-integration
- Extra High: thoroughly V&V'd; extensive pre-integration

Integration Capability (ICAPY)
Represents a multi-attribute parameter that covers integration team cooperation and cohesion (ITEAM), integration personnel capability and continuity (IPERS), and integration personnel experience with the application, language, tools, and platform (IPREX). ICAPY is the subjective weighted average of these factors.
- Very Low: very difficult team interactions; 35th percentile; 30% turnover rate; 5 months or less of experience with applications, languages, tools, and platforms
- Low: some difficult team interactions; 45th percentile; 20% turnover rate; 9 months of experience
- Nominal: basically cooperative teams; 55th percentile; 12% turnover rate; 1 year of experience
- High: largely cooperative teams; 65th percentile; 9% turnover rate; 2 years of experience
- Very High: highly cooperative teams; 75th percentile; 6% turnover rate; 4 years of experience
- Extra High: seamless team interactions; 85th percentile; 4% turnover rate; 6 years of experience

Integration Processes (IPROC)
Represents a parameter that rates the maturity level and completeness of an integration team's integration processes, plans, and the SoS integration lab (SOSIL). IPROC is the subjective weighted average of the listed characteristics.
- Very Low: ad-hoc integration process; minimal SOSIL; CMMI Level 1 (lower half)
- Low: minimal integration plans; immature core SOSIL; CMMI Level 1 (upper half)
- Nominal: some integration plans; partly mature core SOSIL; CMMI Level 2
- High: moderate plans; mature core SOSIL; CMMI Level 3
- Very High: considerable plans; partly mature extended SOSIL; CMMI Level 4
- Extra High: extensive plans; fully mature extended SOSIL; CMMI Level 5
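The model parameter definitions state that the sum of the six scale factors forms the exponential adjustment to nominal effort, but the slides do not give numeric values for the ratings. The sketch below uses hypothetical per-rating contributions purely to show the mechanics of turning ratings into an exponent B.

```python
# Hedged sketch: turning the six scale-factor ratings into an effort exponent B.
# The per-rating contributions and the base exponent are illustrative assumptions;
# the slides define the factors and rating scales but not their calibration.

RATING_VALUES = {   # hypothetical contribution of each rating to the exponent
    "Very Low": 0.05, "Low": 0.04, "Nominal": 0.03,
    "High": 0.02, "Very High": 0.01, "Extra High": 0.00,
}
BASE_EXPONENT = 1.00  # assumed base value; not from the slides


def effort_exponent(ratings):
    """B = base + sum of the six scale-factor contributions (ISMPL, IRESL, ISBLY, CREDY, ICAPY, IPROC)."""
    return BASE_EXPONENT + sum(RATING_VALUES[r] for r in ratings.values())


ratings = {
    "ISMPL": "Nominal", "IRESL": "High", "ISBLY": "Nominal",
    "CREDY": "Low", "ICAPY": "Very High", "IPROC": "Nominal",
}
print(round(effort_exponent(ratings), 3))  # 1.16 with the assumed values
```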

Current Issues

Issues and Questions Currently Under Investigation
- What is the best size driver?
  - If software size is used:
    - Should it be limited to the software performing interface operations?
    - How should COTS product interfaces be accounted for?
  - If the number of logical interfaces is used:
    - Which ones should be included?
    - At what level should they be counted?
    - How should the complexities associated with various interfaces be specified?
  - Do user scenarios and user interfaces capture additional size information needed to better estimate the level of effort?
  - If multiple size drivers are used, what is the relative weight of each?
- Model outputs:
  - What is the desired granularity of effort estimates?
  - Should an associated schedule be produced?

Issues and Questions Currently Under Investigation (continued)
- How to ensure no overlap with the COSYSMO or COCOMO II models?
- Are the current scale factors relevant and sufficient?
- Are the current scale factor values and ranges of values appropriate?
- How well do the various model variations track with respect to:
  - Expert judgment?
  - Actual experiences/projects?

SoS Estimation Example Using Only Software Size as the Size Driver
Hazardous Materials Response System of Systems

System-of-Systems Architecture Example: Hazardous Materials Response SoS
- Command and Control: 3000 eKSLOC
- Network Comms: 800 eKSLOC
- Sensor Data Processing: 500 eKSLOC
- HazMat Identification: 900 eKSLOC

SoS Integration Calculations with Nominal Level 1 and Level 0 Drivers
Total SoS integration effort: approximately 9,310 person-months (9,310 / 12 ≈ 775.8 person-years)
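For readers who want to trace the example, here is a hedged Python sketch that runs the four subsystem sizes from the architecture slide through the two-level equations. The A and B constants are placeholder assumptions (the deck does not publish its nominal calibration), so the output will not reproduce the slide's ~9,310 person-month figure; it only illustrates the calculation flow.

```python
# Hedged walk-through of the HazMat Response example using the two-level model.
# A/B constants are placeholder assumptions, so this does not reproduce the
# slide's ~9,310 PM result; it only shows how the numbers flow through the model.

SUBSYSTEM_EKSLOC = {
    "Command and Control": 3000,
    "Network Comms": 800,
    "Sensor Data Processing": 500,
    "HazMat Identification": 900,
}
A1, B1 = 1.0, 1.10   # assumed nominal Level 1 constants
A0, B0 = 1.0, 1.05   # assumed nominal Level 0 constants

# Level 1: each subsystem is treated as a single component here, so the inner
# sum collapses to Size^B1 for that subsystem.
level1 = {name: A1 * size ** B1 for name, size in SUBSYSTEM_EKSLOC.items()}

# Level 0: roll the subsystem integration efforts up to the SoS.
sos_effort_pm = A0 * sum(ipm ** B0 for ipm in level1.values())

for name, pm in level1.items():
    print(f"{name}: {pm:,.0f} PM")
print(f"SoS integration effort: {sos_effort_pm:,.0f} PM ({sos_effort_pm / 12:,.1f} PY)")
```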

Potential Range of Values for the Example Case
(integration effort in person-months, and as a percentage of the total estimated nominal development effort)
- Nominal: 9,310 PM (29%)
- Best: 6,477 PM (20%)
- Worst: 11,907 PM (37%)

Parametric Cost Model Critical Path Tasks and Status
- Converge on preliminary cost drivers and WBS
- Converge on detailed definitions and rating scales
- Obtain initial exploratory dataset
- Refine model based on data collection and analysis experience
- Obtain IOC calibration dataset
- Refine IOC model and tool

Upcoming Calendar of Events: 2004/2005
[Timeline figure spanning July 2004 through June 2005, showing:]
- Practical Software & Systems Measurement Workshop (Keystone, CO)
- Proposed first working group meeting
- USC CSE Annual Research Review (Los Angeles, CA)
- COCOMO Forum (Los Angeles, CA)

Next Steps
- Refine the model based on Delphi inputs and actual data
- Working group meeting at the October 2004 COCOMO II Workshop
- We would appreciate your help!

Questions or Comments?
Contact: Jo Ann Lane, jalane@tns.net

Websites
- http://cse.usc.edu (COSOSIMO web site coming soon…)

Books
- Boehm, B., et al., Software Cost Estimation with COCOMO II, 1st ed., Prentice Hall, 2000
- Boehm, B., Software Engineering Economics, 1st ed., Prentice Hall, 1981

Articles
- Boehm, B., et al., "Future Trends, Implications in Cost Estimation Models," CrossTalk, April 2000
- Gilligan, John M., Department of the Air Force, "Military-Use Software: Challenges and Opportunities," CrossTalk, January 2004