
1 University of Southern California Center for Systems and Software Engineering Dr. Mauricio Peña January 28, 2013

2 University of Southern California Center for Systems and Software Engineering 2 Agenda
– Motivation
– Introduction and key definitions
– Description of the model development process
– Explanation of its size and cost drivers
– Limitations
– Data sources and estimation accuracy
– COSYSMO 2.0 – reuse extension
– Requirements volatility extension
– Take-aways

3 University of Southern California Center for Systems and Software Engineering 3 Thoughts on Cost Models
Cost models serve as valuable tools for engineers and project managers to estimate engineering effort, but…
– A model is not reality
– All models are wrong, but some of them are useful
– People are generally optimistic (and so are models)
– Beware of longevity bias: if something has been around longer, it must be better (right?)
Parametric models rely on historical data – assumes "past performance is the best indication of future performance"
"If you are going to sin, sin consistently"
Sources: Valerdi, R. (2010) Heuristics for Systems Engineering Cost Estimation, 14th Annual PSM Users' Group Conference; Koehler, D. J., Harvey, N. (1997). Confidence judgments by actors and observers. Journal of Behavioral Decision Making, 10, 221-242; Eidelman, S., Pattershall, J., & Crandall, C.S. (in press). Longer is better. Journal of Experimental Social Psychology.

4 University of Southern California Center for Systems and Software Engineering 4 Motivation
Most widely used estimation tools treat Systems Engineering as a subset of S/W or H/W effort
Complex systems are not dominated by either H/W or S/W but by the allocation of functions and interactions of system elements – the realm of SE
Consequently, SE resources should be forecasted based on the tasks that Systems Engineering performs and not as an arbitrary percentage of another effort
The Systems Engineering community has a need for better estimation capability
Valerdi, R. (2005). The constructive systems engineering cost model (COSYSMO). Doctoral Dissertation. University of Southern California

5 University of Southern California Center for Systems and Software Engineering 5 When is that Systems Engineer going to tell me the S/W requirements?

6 University of Southern California Center for Systems and Software Engineering 6 Cost Estimation Methods
Parametric cost estimation; bottom up & activity-based; top down & design-to-cost; case studies and analogy; expert opinion; heuristics & rules of thumb – arranged along an axis of maturity and sophistication

7 University of Southern California Center for Systems and Software Engineering 7 Heuristics & Rules of Thumb
A heuristic is a rule of thumb, educated guess, simplification, intuitive judgment, or common sense
– Heuristics are effective tools for making decisions when little information is available
– However, they lack fidelity and are based on experience/context that may not apply to your system
Examples (see the sketch below):
– The Systems Engineering % of the total budget should be: 6-15% for recurring systems; 15-20% for new developments
– Other rules: If the system is unprecedented, then raise the budget by 50%; if the system faces an extreme requirement (safety, performance, etc.), then raise the budget by 25%
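A minimal sketch of how these rules of thumb might be encoded. The percentages come from the slide; the function name, the use of range midpoints, and the multiplicative application of the adjustments are illustrative assumptions, not part of the source.

```python
def se_budget_heuristic(total_budget, new_development=False,
                        unprecedented=False, extreme_requirement=False):
    """Rough SE budget from the rules of thumb above (illustrative only)."""
    # 6-15% for recurring systems, 15-20% for new developments; midpoints assumed
    base_fraction = 0.175 if new_development else 0.105
    estimate = total_budget * base_fraction
    if unprecedented:
        estimate *= 1.50   # unprecedented system: raise budget by 50%
    if extreme_requirement:
        estimate *= 1.25   # extreme safety/performance requirement: +25%
    return estimate

# Example: $10M new development facing an extreme performance requirement
print(se_budget_heuristic(10_000_000, new_development=True,
                          extreme_requirement=True))  # ~2.19M
```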

8 University of Southern California Center for Systems and Software Engineering 8 Expert Opinion
Informal approach that takes the subjective opinion of domain experts as an input
– Simple and useful in the absence of empirical data
– The estimate is only as good as the expert's opinion – even the most highly competent experts can be wrong
The Delphi and Wideband Delphi methods are common techniques used to capture expert opinion
[Image: The Oracle of Delphi]

9 University of Southern California Center for Systems and Software Engineering 9 Design-to-Cost or CAIV 1
[Chart: cost vs. performance trade-off region, bounded by the performance objective (goal) and threshold (required) and by the cost objective (target) and threshold (no greater than)]
1 CAIV = Cost As an Independent Variable

10 University of Southern California Center for Systems and Software Engineering 10 Bottom Up and Activity-Based
[Diagram: a system decomposed into Level 1, Level 2, and Level 3 elements]
Source: ISO/IEC 15288

11 University of Southern California Center for Systems and Software Engineering 11 CSSE Parametric Cost Models
The Constructive Systems Engineering Cost Model (COSYSMO) was developed by the USC Center for Systems and Software Engineering (CSSE) in collaboration with INCOSE and Industry Affiliates
COSYSMO is the first generally available parametric cost model designed to estimate Systems Engineering effort
Built on experience from COCOMO 1981 and COCOMO II
– Among the most widely used software cost models worldwide
– Developed with Affiliate funding, expertise, and data support
Source: 7th Annual Practical Software and Systems Measurement Conference. COSYSMO Workshop, Boehm

12 University of Southern California Center for Systems and Software Engineering 12 Five Fundamental Processes for Engineering a System
Source: EIA/ANSI 632, Processes for Engineering a System (1999)

13 University of Southern California Center for Systems and Software Engineering 13 Systems Engineering Effort vs. Program Cost
[Chart: NASA data (Honour 2002)]

14 University of Southern California Center for Systems and Software Engineering 14 Key Definitions and Concepts
CER: a model that represents the cost estimating relationships between factors
Cost estimation: prediction of both the person-effort and elapsed time of a project
Driver: a factor that drives the amount of Systems Engineering effort
Parametric: an equation or model that is approximated by a set of parameters
Rating scale: a range of values and definitions for a particular driver
Boehm, B., Reifer, D., and Valerdi, R. (2003) 7th Annual Practical Software and Systems Measurement Conference. COSYSMO Workshop

15 University of Southern California Center for Systems and Software Engineering 15 COSYSMO Scope
Addresses the first four phases of the systems engineering lifecycle (per ISO/IEC 15288)
Considers standard Systems Engineering Work Breakdown Structure tasks (per EIA/ANSI 632)
Life cycle phases: Conceptualize → Develop → Operational Test & Eval → Transition to Operation → Operate, Maintain, or Enhance → Replace or Dismantle

16 University of Southern California Center for Systems and Software Engineering 16 Life Cycle Phase Definitions
The Conceptualize phase focuses on identifying stakeholder needs, exploring different solution concepts, and proposing candidate solutions.
The Develop phase involves refining the system requirements, creating a solution description, and building a system.
The Operational Test & Evaluation phase involves verifying/validating the system and performing the appropriate inspections before it is delivered to the user.
The Transition to Operation phase involves transitioning the system into use to satisfy the users' needs.
Valerdi, R. (2005). The constructive systems engineering cost model (COSYSMO). Doctoral Dissertation. University of Southern California

17 University of Southern California Center for Systems and Software Engineering 17 COSYSMO Operational Concept
[Diagram: four size drivers (# Requirements, # Interfaces, # Scenarios, # Algorithms) and the effort multipliers (8 application factors, 6 team factors) feed COSYSMO, which produces an effort estimate; the model is calibrated against historical data, with the WBS guided by EIA/ANSI 632]
Valerdi, R. (2005). The constructive systems engineering cost model (COSYSMO). Doctoral Dissertation. University of Southern California

18 University of Southern California Center for Systems and Software Engineering 18 Parametric Cost Estimating Relationship

$$PM_{NS} = A \cdot (\text{Size})^{E} \cdot \prod_{i=1}^{n} EM_{i}$$

Where:
PM_NS = effort in Person-Months (Nominal Schedule)
A = constant derived from historical project data
Size = determined by computing the weighted average of the (4) size drivers
E = represents diseconomy of scale
n = number of cost drivers (14)
EM_i = effort multiplier for the i-th cost driver; the geometric product results in an overall effort adjustment factor to the nominal effort
Valerdi, R. (2005). The constructive systems engineering cost model (COSYSMO). Doctoral Dissertation. University of Southern California

19 University of Southern California Center for Systems and Software Engineering 19 Model Differences

Feature | COCOMO II | COSYSMO
Estimates | Software development | Systems engineering
Life cycle phases | MBASE 1 /RUP phases: 1. Inception, 2. Elaboration, 3. Construction, 4. Transition | ISO/IEC 15288 phases: 1. Conceptualize, 2. Develop, 3. Operational Test & Evaluation, 4. Transition to Operation
Form of the model | 1 size factor (SLOC), 5 scale factors, 18 effort multipliers | 4 size factors, 1 scale factor, 14 effort multipliers
Represents diseconomy of scale through | Five scale factors | One exponential system factor

1 Model-Based (System) Architecting and Software Engineering
Valerdi, R. (2005). The constructive systems engineering cost model (COSYSMO). Doctoral Dissertation. University of Southern California

20 University of Southern California Center for Systems and Software Engineering 20 Size Drivers vs. Effort Multipliers
Size drivers: additive, incremental
– Impact of adding a new item is inversely proportional to current size
– 10 → 11 rqts = 10% increase
– 100 → 101 rqts = 1% increase
Effort multipliers: multiplicative, system-wide
– Impact of adding a new item is independent of current size
– 10 rqts + high security = 40% increase
– 100 rqts + high security = 40% increase
Boehm, B., Reifer, D., and Valerdi, R. (2003) 7th Annual Practical Software and Systems Measurement Conference. COSYSMO Workshop
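To make the contrast concrete, a tiny Python illustration: the 40% security multiplier is the slide's own example, while the function is just the arithmetic the bullets describe.

```python
# Additive size driver vs. multiplicative effort multiplier (demo values).
def relative_size_increase(current_requirements, added=1):
    """Adding one requirement matters less as the baseline grows."""
    return added / current_requirements

HIGH_SECURITY_EM = 1.40  # system-wide multiplier, applies regardless of size

for n in (10, 100):
    print(f"{n} -> {n + 1} rqts: +{relative_size_increase(n):.0%} size, "
          f"high security still x{HIGH_SECURITY_EM} effort")
```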

21 University of Southern California Center for Systems and Software Engineering 21 Counting Rules Framework
Sky level – top-level objective; summary use case
Kite level – additional information as to how the sky-level objective will be satisfied
Sea level – user-level task; the environment in which the developer interacts with the stakeholder
Underwater level – detailed design and implementation
Sources: Cockburn (2001); Valerdi (2005)

22 University of Southern California Center for Systems and Software Engineering 22 Requirements Counting Example
Objective: Broadcast television signals over the Continental United States (CONUS) compatible with 18-inch receive antennas
1. The system shall be able to receive a Ku-band signal from the customer ground station and downlink the signal to CONUS coverage with a minimum Equivalent Isotropically Radiated Power (EIRP) of 30 dBm
1.1 The system shall have a pointing accuracy of 0.5 degrees
1.2 The system shall radiate a minimum of 3000 W of RF power
1.2.1 The system shall be able to produce 4000 W of DC power
1.2.2 The system shall be able to store 500 W-Hr of energy
1.2.3 The system shall be able to dissipate 2000 W of heat
1.2.3.1 The radiator surface area shall be greater than…
1.2.3.2 The radiator surface properties shall be…
1.2.3.3 Metal surfaces shall be blanketed…
Too high? Too low? Just right – requirements should be counted at the appropriate level of the hierarchy

23 University of Southern California Center for Systems and Software Engineering 23 Size Driver Weights

Size Driver | Easy | Nominal | Difficult
# of System Requirements | 0.5 | 1.00 | 5.0
# of Interfaces | 1.7 | 4.3 | 9.8
# of Critical Algorithms | 3.4 | 6.5 | 18.2
# of Operational Scenarios | 9.8 | 22.8 | 47.4

Valerdi, R. (2005). The constructive systems engineering cost model (COSYSMO). Doctoral Dissertation. University of Southern California
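A small Python sketch of turning these weights into a single weighted size figure. The weights are taken from the table above; the project counts are hypothetical.

```python
# Size driver weights from the table above: (easy, nominal, difficult)
WEIGHTS = {
    "requirements": (0.5, 1.00, 5.0),
    "interfaces":   (1.7, 4.3, 9.8),
    "algorithms":   (3.4, 6.5, 18.2),
    "scenarios":    (9.8, 22.8, 47.4),
}

def weighted_size(counts):
    """counts maps driver name -> (easy, nominal, difficult) quantities."""
    return sum(w * q
               for driver, weights in WEIGHTS.items()
               for w, q in zip(weights, counts[driver]))

# Hypothetical project: 80 easy / 20 difficult requirements, a few of the rest
size = weighted_size({"requirements": (80, 0, 20), "interfaces": (0, 4, 0),
                      "algorithms": (0, 2, 0), "scenarios": (0, 2, 0)})
print(size)  # 80*0.5 + 20*5.0 + 4*4.3 + 2*6.5 + 2*22.8 = 215.8
```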

24 University of Southern California Center for Systems and Software Engineering 24 Cost Driver Clusters
Understanding factors: requirements understanding; architecture understanding; stakeholder team cohesion; personnel experience/continuity
Complexity factors: level of service requirements; technology risk; # of recursive levels in the design; documentation match to life cycle needs
Operations factors: # and diversity of installations/platforms; migration complexity
People factors: personnel/team capability; process capability
Environment factors: multisite coordination; tool support
Valerdi, R. (2005). The constructive systems engineering cost model (COSYSMO). Doctoral Dissertation. University of Southern California

25 University of Southern California Center for Systems and Software Engineering 25 Cost Driver Rating Scales

Cost Driver | Very Low | Low | Nominal | High | Very High | Extra High | EMR
Requirements understanding | 1.87 | 1.37 | 1.00 | 0.77 | 0.60 | – | 3.12
Architecture understanding | 1.64 | 1.28 | 1.00 | 0.81 | 0.65 | – | 2.52
Level of service requirements | 0.62 | 0.79 | 1.00 | 1.36 | 1.85 | – | 2.98
Migration complexity | – | – | 1.00 | 1.25 | 1.55 | 1.93 | 1.93
Technology risk | 0.67 | 0.82 | 1.00 | 1.32 | 1.75 | – | 2.61
Documentation | 0.78 | 0.88 | 1.00 | 1.13 | 1.28 | – | 1.64
# and diversity of installations/platforms | – | – | 1.00 | 1.23 | 1.52 | 1.87 | 1.87
# of recursive levels in the design | 0.76 | 0.87 | 1.00 | 1.21 | 1.47 | – | 1.93
Stakeholder team cohesion | 1.50 | 1.22 | 1.00 | 0.81 | 0.65 | – | 2.31
Personnel/team capability | 1.50 | 1.22 | 1.00 | 0.81 | 0.65 | – | 2.31
Personnel experience/continuity | 1.48 | 1.22 | 1.00 | 0.82 | 0.67 | – | 2.21
Process capability | 1.47 | 1.21 | 1.00 | 0.88 | 0.77 | 0.68 | 2.16
Multisite coordination | 1.39 | 1.18 | 1.00 | 0.90 | 0.80 | 0.72 | 1.93
Tool support | 1.39 | 1.18 | 1.00 | 0.85 | 0.72 | – | 1.93

EMR = effort multiplier ratio (highest ÷ lowest value for the driver)
Valerdi, R. (2005). The constructive systems engineering cost model (COSYSMO). Doctoral Dissertation. University of Southern California
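The rated drivers compose by multiplication into the overall effort adjustment factor. A minimal sketch using values from the table above; the choice of which drivers to rate, and at which levels, is hypothetical.

```python
# Effort adjustment from the rating table: pick one value per rated driver
# and multiply; unrated drivers default to nominal (1.00).
RATINGS = {
    "requirements_understanding": 1.37,  # Low
    "technology_risk": 1.32,             # High
    "tool_support": 0.85,                # High
}

effort_adjustment = 1.0
for driver, em in RATINGS.items():
    effort_adjustment *= em

print(round(effort_adjustment, 3))  # 1.37 * 1.32 * 0.85 ≈ 1.537
```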

26 University of Southern California Center for Systems and Software Engineering 26 Effort Profiling
ISO/IEC 15288 – common framework for describing the life cycle of systems
EIA/ANSI 632 – integrated set of fundamental systems engineering processes
Source: Draft Report, ISO Study Group, May 2, 2000

27 University of Southern California Center for Systems and Software Engineering 27 Limitations of the Model
1. Mostly qualitative drivers
2. Variance of Delphi responses
3. Small sample size
4. Aerospace-heavy calibration data
5. Calibration is biased toward successful projects, because successful projects share data and bad ones don't
6. The model will not work outside of its calibrated range
7. A fool with a tool is still a fool

28 University of Southern California Center for Systems and Software Engineering 28

29 University of Southern California Center for Systems and Software Engineering 29 COSYSMO 1.0 Model Form

$$PM_{NS} = A \cdot \left[\sum_{k}\left(w_{e}\,\Phi_{e,k} + w_{n}\,\Phi_{n,k} + w_{d}\,\Phi_{d,k}\right)\right]^{E} \cdot \prod_{j=1}^{14} EM_{j}$$

Where:
PM_NS = effort in Person-Months (Nominal Schedule)
A = calibration constant derived from historical project data
k = {Requirements, Interfaces, Algorithms, Scenarios}
w_x = weight for "Easy", "Nominal", or "Difficult" size driver (x ∈ {e, n, d})
Φ = quantity of the k-th size driver
E = represents (dis)economies of scale
EM_j = effort multiplier for the j-th cost driver
Valerdi, R. (2005). The constructive systems engineering cost model (COSYSMO). Doctoral Dissertation. University of Southern California
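Putting the size weights and effort multipliers together, a compact end-to-end sketch of this model form. Note that A and E here are illustrative placeholders, not published calibration values, and the project inputs reuse the hypothetical examples above.

```python
import math

# Size driver weights from the earlier table: (easy, nominal, difficult)
WEIGHTS = {"requirements": (0.5, 1.00, 5.0), "interfaces": (1.7, 4.3, 9.8),
           "algorithms": (3.4, 6.5, 18.2), "scenarios": (9.8, 22.8, 47.4)}

def cosysmo_pm_ns(counts, effort_multipliers, A=0.25, E=1.06):
    """PM_NS = A * [sum_k sum_x w_x * Phi_x,k]^E * prod_j EM_j.
    A and E are placeholders here, not calibrated values."""
    size = sum(w * q for k, weights in WEIGHTS.items()
               for w, q in zip(weights, counts[k]))
    return A * size ** E * math.prod(effort_multipliers)

counts = {"requirements": (80, 0, 20), "interfaces": (0, 4, 0),
          "algorithms": (0, 2, 0), "scenarios": (0, 2, 0)}
print(cosysmo_pm_ns(counts, [1.37, 1.32, 0.85]))  # person-months (illustrative)
```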

30 University of Southern California Center for Systems and Software Engineering COSYSMO 1.0 Data Sources
Raytheon: Intelligence & Information Systems (Garland, TX)
Northrop Grumman: Mission Systems (Redondo Beach, CA)
Lockheed Martin: Transportation & Security Solutions (Rockville, MD); Integrated Systems & Solutions (Valley Forge, PA); Systems Integration (Owego, NY); Aeronautics (Marietta, GA); Maritime Systems & Sensors (Manassas, VA)
General Dynamics: Maritime Digital Systems/AIS (Pittsfield, MA); Surveillance & Reconnaissance Systems/AIS (Bloomington, MN)
BAE Systems: National Security Solutions/ISS (San Diego, CA); Information & Electronic Warfare Systems (Nashua, NH)
SAIC: Army Transformation (Orlando, FL); Integrated Data Solutions & Analysis (McLean, VA)
Source: Valerdi, R. (2005) USC CSSE Annual Research Review. COSYSMO Working Group

31 University of Southern California Center for Systems and Software Engineering 31 COSYSMO Estimation Accuracy
COSYSMO is capable of estimating systems engineering effort within 30% of actuals, 50% of the time, or PRED(.30) = 50%.
R² = coefficient of determination; provides a measure of how well future outcomes are likely to be predicted by the model.

Organization | N | R-squared | PRED(30)
1 | 10 | 0.94 | 70%
2 | 7 | 0.59 | 43%
3 | 10 | 0.62 | 50%
Total | 27 | – | 56%

Valerdi, R. (2005). The constructive systems engineering cost model (COSYSMO). Doctoral Dissertation. University of Southern California
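A small sketch of how PRED(L) is typically computed from paired actual and estimated efforts; the data below are invented for illustration.

```python
def pred(actuals, estimates, level=0.30):
    """Fraction of estimates with relative error |est - act| / act <= level."""
    hits = sum(abs(e - a) / a <= level for a, e in zip(actuals, estimates))
    return hits / len(actuals)

# Made-up effort data, in person-months
actuals   = [100, 250, 80, 400, 60]
estimates = [120, 230, 130, 390, 45]
print(f"PRED(.30) = {pred(actuals, estimates):.0%}")  # 4 of 5 within 30% -> 80%
```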

32 University of Southern California Center for Systems and Software Engineering 32 COSYSMO 2.0 Operational Concept Based on 2009 dissertation by Dr. Jared Fortune

33 University of Southern California Center for Systems and Software Engineering 33 COSYSMO 2.0 Model Form Based on 2009 dissertation by Dr. Jared Fortune

34 University of Southern California Center for Systems and Software Engineering 34 Reuse Category Weights Based on 2009 dissertation by Dr. Jared Fortune

35 University of Southern California Center for Systems and Software Engineering 35 COSYSMO 2.0 Implementation Results
Across 44 projects at 1 diversified organization
Using COSYSMO: PRED(.30) = 14%; PRED(.40) = 20%; PRED(.50) = 20%; R² = 0.50
Using COSYSMO 2.0: PRED(.30) = 34%; PRED(.40) = 50%; PRED(.50) = 57%; R² = 0.72
Result: 36 of 44 (82%) estimates improved
Based on the 2009 dissertation by Dr. Jared Fortune

36 University of Southern California Center for Systems and Software Engineering 36 Requirements Volatility: A Sizing and Cost Estimation Challenge
Our dynamic competitive environment leads to rapid changes in objectives, constraints, and priorities
Requirements are often emergent instead of pre-specifiable
Requirements changes can be costly, particularly in the later stages of the lifecycle process
These trends have led to the increasing use of incremental and evolutionary development strategies
Requirements volatility should be anticipated, managed, and accounted for in cost estimates
Source: Kotonya and Sommerville (1995)

37 University of Southern California Center for Systems and Software Engineering 37 Requirements Volatility Extension
[Diagram: the COSYSMO operational concept extended with reuse categories applied to the size drivers (# Requirements, # Interfaces, # Scenarios, # Algorithms) and a volatility factor applied to the effort estimate; the 8 application factors, 6 team factors, and calibration remain as before]

38 University of Southern California Center for Systems and Software Engineering Organizations that Participated in the Research
The Aerospace Corporation; Northrop Grumman Corporation; The Boeing Company; Raytheon; United Launch Alliance; BAE; TI Metricas Ltda.; IBM Distributed Management; MIT; USC; Lockheed Martin; Ericsson España; Samsung SDS; Rolls Royce; Softstar; Texas Tech; the US Army; the US Navy; the US Air Force; the Australian Department of Defence

39 University of Southern California Center for Systems and Software Engineering 39 Requirements Volatility Metrics

40 University of Southern California Center for Systems and Software Engineering 40 Causes of Requirements Volatility
External factors:
– Shifting customer priorities and needs
– Changes in the political and business environment
– Addition or change of stakeholders
– Development of new technologies
– Changes in co-dependent systems
Internal factors:
– Deficient requirements development processes
– Lack of experienced systems engineering resources applied to requirements analysis
– Poor initial understanding or interpretation of customer needs by the development team
– Changes in organizational structure and policies
Sources: Kotonya and Sommerville (1995); Houston (2000); Zowghi and Nurmuliani (2002); Kulk and Verhoef (2008); Ferreira et al. (2009)

41 University of Southern California Center for Systems and Software Engineering 41 Requirements Volatility Causal Model Diagram
[Diagram: requirements volatility driven by contextual/environmental changes, experienced staff, changes in COTS products, changes in organization/structure/process, poor understanding of the system and customer needs, customer-driven scope change, changes in co-dependent systems, requirements process maturity, and technology maturity]
Sources: Kotonya and Sommerville (1995); Hammer et al. (1998); Malaiya and Denton (1999); Stark et al. (1999); Houston (2000); Zowghi and Nurmuliani (2002); Kulk and Verhoef (2008); Ferreira et al. (2009)

42 University of Southern California Center for Systems and Software Engineering 42 Impacts of Requirements Volatility
Several research studies have found that requirements volatility is correlated with an increase in:
– the functional size of the project (discarded or additional requirements)
– engineering effort
– cost and schedule duration
Changes in requirements result in additional rework and increased defect density
The later a requirements change occurs in the project lifecycle, the greater its impact
Removing a requirement may not necessarily result in a net decrease in engineering effort and cost
Sources: Kotonya and Sommerville (1995); Hammer et al. (1998); Malaiya and Denton (1999); Stark et al. (1999); Houston (2000); Zowghi and Nurmuliani (2002); Kulk and Verhoef (2008); Ferreira et al. (2009)

43 University of Southern California Center for Systems and Software Engineering 43 Impacts of Volatility Causal Model Diagram
[Diagram: requirements volatility drives rework/defects and the number of system requirements, which in turn affect (+/-) project effort, project cost, project schedule, quality, and customer satisfaction]
Sources: Kotonya and Sommerville (1995); Hammer et al. (1998); Malaiya and Denton (1999); Stark et al. (1999); Houston (2000); Zowghi and Nurmuliani (2002); Kulk and Verhoef (2008); Ferreira et al. (2009)

44 University of Southern California Center for Systems and Software Engineering Incorporating Volatility Effects Incorporated volatility effects through a scale factor (SF) added to the diseconomies of scale exponent (E) Similar approach used to model volatility effects in Ada COCOMO Prior research points to the compounding or exponential effect of requirements volatility 44 Source: Boehm, B. and Royce, W. (1989); Kulk and Verhoef (2008); Wang, G. et al., (2008)
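In equation form, the modification described above presumably takes the COSYSMO effort equation and adds SF to the exponent, along these lines (a sketch consistent with the slide's description; the exact published form may differ):

$$PM_{NS} = A \cdot (\text{Size})^{E + SF} \cdot \prod_{j=1}^{14} EM_{j}$$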

45 University of Southern California Center for Systems and Software Engineering 45 Volatility Scale Factor
Where:
– REVL = the % of the baseline requirements that is expected to change over the system lifecycle
– w_vl = aggregate life cycle phase volatility weighting factor
And:
– w_x = weighting factor for each life cycle phase
– Θ_x = % of total requirements changes per life cycle phase
Expected REVL is rated as Very Low, Low, Moderate, High, or Very High
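The transcript drops this slide's equations. From the definitions above, the aggregate weighting factor is presumably the phase-weighted sum of the change distribution, with the scale factor driven by REVL scaled by that aggregate — a sketch, not the published formula:

$$w_{vl} = \sum_{x} w_{x}\,\Theta_{x}, \qquad SF \propto REVL \cdot w_{vl}$$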

46 University of Southern California Center for Systems and Software Engineering 46 REVL Levels – Cumulative Volatility

Life cycle phase | Conceptualize | Development | Operational Test & Eval | Transition to Operations
REVL (cumulative % of baseline requirements changed) | 6% | 22% | 38% | 54%

47 University of Southern California Center for Systems and Software Engineering 47 Model Performance Comparison
Model performance was evaluated for both the baseline model and a locally calibrated model
Including the requirements volatility factor improves the performance of COSYSMO

48 University of Southern California Center for Systems and Software Engineering 48 Take-aways
Parametric cost models provide good estimates for engineering work and can be used to analyze investments, tradeoffs, and risk
Understand a model's sensitivity and limitations:
– Compare with other estimates (expert-based, analogy, activity-based, etc.)
– Apply the model within the calibrated range and application domain
– Use consistent definitions and metrics
Reuse and requirements volatility are meaningful factors in cost estimation

49 University of Southern California Center for Systems and Software Engineering 49 For more information visit: http://cosysmo.mit.edu/ Or contact mauricip@usc.edu

50 University of Southern California Center for Systems and Software Engineering 50 References
Boehm, B., Abts, C., Brown, A.W., Chulani, S., Clark, B., Horowitz, E., Madachy, R., Reifer, D.J., and Steece, B. (2000). Software Cost Estimation with COCOMO II. Prentice Hall.
Department of Defense (2010). Quadrennial Defense Review Report.
Eidelman, S., Pattershall, J., and Crandall, C.S. (in press). Longer is better. Journal of Experimental Social Psychology.
Ferreira, S., Collofello, J., Shunk, D., and Mackulak, G. (2009). "Understanding the effects of requirements volatility in software engineering by using analytical modeling and software process simulation." The Journal of Systems and Software, Vol. 82, pp. 1568-1577.
General Accounting Office (2004). Stronger Management Practices are Needed to Improve DOD's Software-intensive Weapon Acquisitions (GAO-04-393). Defense Acquisitions.
Hammer, T., Huffman, L., and Rosenberg, L. (1998). "Doing requirements right the first time." Crosstalk, the Journal of Defense Software Engineering, pp. 20-25.
Houston, D. X. (2000). A Software Project Simulation Model for Risk Management. Ph.D. Dissertation, Arizona State University.
ISO/IEC (2008). ISO/IEC 15288:2008 (E) Systems Engineering – System Life Cycle Processes.
Koehler, D. J., and Harvey, N. (1997). Confidence judgments by actors and observers. Journal of Behavioral Decision Making, 10, 221-242.
Kotonya, G., and Sommerville, I. (1998). Requirements Engineering: Processes and Techniques. John Wiley and Sons, Ltd.
Malaiya, Y., and Denton, J. (1999). "Requirements Volatility and Defect Density." Proceedings of the International Symposium on Software Reliability Engineering.
MIL-STD-498 (1994). Software Development and Documentation. U.S. Department of Defense.
Nguyen, V., and Boehm, B. (2010). A COCOMO Extension for Software Maintenance. 25th International Forum on COCOMO and Systems/Software Cost Modeling.
Valerdi, R. (2005). The constructive systems engineering cost model (COSYSMO). Doctoral Dissertation. University of Southern California, Industrial and Systems Engineering Department.
Valerdi, R. (2010). Heuristics for Systems Engineering Cost Estimation. 14th Annual PSM Users' Group Conference.
Wang, G., Boehm, B., Valerdi, R., and Shernoff, A. (2008). "Proposed Modification to COSYSMO Estimating Relationship." Technical Report. University of Southern California, Center for Systems and Software Engineering.
Zowghi, D., and Nurmuliani, N. (2002). A Study of the Impact of Requirements Volatility on Software Project Performance. Proceedings of the Ninth Asia-Pacific Software Engineering Conference.

51 University of Southern California Center for Systems and Software Engineering Back-up

52 University of Southern California Center for Systems and Software Engineering 52 Number of System Requirements
This driver represents the number of requirements for the system-of-interest at a specific level of design. The quantity of requirements includes those related to the effort involved in systems engineering the system interfaces, system-specific algorithms, and operational scenarios. Requirements may be functional, performance, feature, or service-oriented in nature depending on the methodology used for specification. They may also be defined by the customer or the contractor. Each requirement may have effort associated with it, such as verification and validation, functional decomposition, and functional allocation. System requirements can typically be quantified by counting the number of applicable shalls/wills/shoulds/mays in the system or marketing specification. Note: some work is involved in decomposing requirements so that they may be counted at the appropriate level for the system-of-interest.
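A toy sketch of the "count the shalls" heuristic described above. Real counting first requires decomposing requirements to the appropriate level, and the sample specification text here is invented.

```python
import re

def count_requirement_statements(spec_text):
    """Count shall/will/should/may statements as a rough requirements tally."""
    return len(re.findall(r"\b(?:shall|will|should|may)\b", spec_text,
                          flags=re.IGNORECASE))

spec = """The system shall receive a Ku-band signal.
The system shall radiate a minimum of 3000 W of RF power.
Operators may override automatic pointing."""
print(count_requirement_statements(spec))  # 3
```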

53 University of Southern California Center for Systems and Software Engineering 53 Number of System Interfaces
This driver represents the number of shared physical and logical boundaries between system components or functions (internal interfaces) and those external to the system (external interfaces). These interfaces can typically be quantified by counting the number of external and internal system interfaces among ISO/IEC 15288-defined system elements.

54 University of Southern California Center for Systems and Software Engineering 54 Number of System Algorithms
This driver represents the number of newly defined or significantly altered functions that require unique mathematical algorithms to be derived in order to achieve the system performance requirements. As an example, this could include a complex aircraft tracking algorithm, such as a Kalman filter, being derived using existing experience as the basis for an all-aspect search function. Another example could be a brand-new discrimination algorithm being derived for an identify-friend-or-foe function in space-based applications. The number can be quantified by counting the number of unique algorithms needed to realize the requirements specified in the system specification or mode description document.

55 University of Southern California Center for Systems and Software Engineering 55 Number of Operational Scenarios
This driver represents the number of operational scenarios that a system must satisfy. Such scenarios include both the nominal stimulus-response thread plus all of the off-nominal threads resulting from bad or missing data, unavailable processes, network connections, or other exception-handling cases. The number of scenarios can typically be quantified by counting the number of system test thread packages or unique end-to-end tests used to validate the system functionality and performance, or by counting the number of use cases, including off-nominal extensions, developed as part of the operational architecture.

56 University of Southern California Center for Systems and Software Engineering 56 Systems Engineering WBS per ANSI/EIA 632

