
Slide 1: Competing on Schedule, Cost, and Quality: The Role of Software Models
Barry Boehm, USC (boehm@sunset.usc.edu, http://sunset.usc.edu)
University of Southern California Center for Software Engineering (USC-CSE)
COCOMO/SCM Forum #16, October 24, 2001

Slide 2: Outline
- Traditional and e-Services Development: USC research model perspectives
- Software Schedule-Cost-Quality Tradeoffs: risk exposure; development and ownership costs
- The SAIV/CAIV/SCQAIV Process Models
- Conclusions and References

Slide 3: Traditional and e-Services Development
Traditional Development:
- Standalone systems
- Stable requirements
- Requirements determine capabilities
- Control over evolution
- Enough time to keep stable
- Repeatability-oriented process and maturity models
e-Services Development:
- Everything connected (maybe)
- Rapid requirements change
- COTS capabilities determine requirements
- No control over COTS evolution
- Ever-decreasing cycle times
- Adaptive process models

Slide 4: USC Model Integration Research — MBASE, CeBASE
[Diagram: integration of Success, Product, Property, and Process models; representative elements include Win-Win, Business Case Analysis, Results Chains, Risk, Software Warranties, Correctness, RAD, Six Sigma, Stories, Award Fees, Agility, JAD, QFD, Golden Rule, Waterfall, Spiral, RUP, XP, SAIV/CAIV/SCQAIV, Risk Management, Business Process Reengineering, CMMs, Peopleware, IPTs, Agile Development, Groupware, Easy WinWin, Experience Factory, GQM, UML, XML, CORBA, COM, Architectures, Product Lines, OO Analysis & Design, Requirements, Operational Concepts, Domain Ontologies, COTS, GOTS, COCOMO II, COCOTS, CORADMO, COQUALMO, System Dynamics, Metrics, -ilities, Simulation and Modeling.]

Slide 5: Outline
- Traditional and e-Services Development: USC research model perspectives
- Software Schedule-Cost-Quality Tradeoffs: risk exposure; development and ownership costs
- The SAIV/CAIV/SCQAIV Process Models
- Conclusions and References

Slide 6: Competing on Schedule and Quality — A Risk Analysis Approach
Risk exposure: RE = Prob(Loss) × Size(Loss)
- "Loss": financial; reputation; future prospects; ...
For multiple sources of loss:
RE = Σ over sources of [Prob(Loss) × Size(Loss)]
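
As a concrete illustration of the summation, here is a minimal Python sketch; the loss sources, probabilities, and sizes are invented for the example:

```python
# Risk exposure summed over independent loss sources (slide 6's formula).
# The sources and numbers below are hypothetical.

def risk_exposure(sources):
    """Total RE = sum over sources of Prob(Loss) * Size(Loss)."""
    return sum(p_loss * s_loss for p_loss, s_loss in sources)

# Each tuple: (probability of loss, size of loss in $K) for one source.
sources = [
    (0.30, 500.0),   # unacceptable dependability: critical defect in the field
    (0.10, 2000.0),  # market-share erosion: a rival ships first
    (0.05, 300.0),   # reputation damage
]
print(f"Total risk exposure: ${risk_exposure(sources):,.0f}K")
```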

Slide 7: Example RE Profile: Time to Ship — Loss Due to Unacceptable Dependability
[Chart: RE = P(L) × S(L) versus time to ship (amount of testing). Shipping early: many defects (high P(L)), critical defects (high S(L)). Shipping late: few defects (low P(L)), minor defects (low S(L)). The dependability-risk curve falls as testing time grows.]

Slide 8: Example RE Profile: Time to Ship — Adding Loss Due to Market Share Erosion
[Chart: a second curve rises with time to ship. Shipping early: few rivals (low P(L)), weak rivals (low S(L)). Shipping late: many rivals (high P(L)), strong rivals (high S(L)).]

Slide 9: Example RE Profile: Time to Ship — Sum of Risk Exposures
[Chart: the falling dependability-risk curve plus the rising market-erosion curve yield a total-RE curve with an interior minimum — the "sweet spot" amount of testing.]
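
The sweet spot can be read off as the minimum of the summed curves. The sketch below reproduces the qualitative picture with assumed curve shapes (an exponentially falling dependability RE and a linearly rising market-erosion RE); the constants are illustrative, not calibrated:

```python
import math

def defect_re(t):
    """Dependability RE: P(L) and S(L) shrink as testing time t (months) grows."""
    return 10.0 * math.exp(-0.8 * t)

def market_re(t):
    """Market-erosion RE: more and stronger rivals the later you ship."""
    return 0.4 * t

def total_re(t):
    return defect_re(t) + market_re(t)

# Scan candidate ship dates and report the minimum-RE "sweet spot".
times = [t / 10 for t in range(0, 121)]  # 0.0 .. 12.0 months
sweet = min(times, key=total_re)
print(f"Sweet spot: ship after {sweet:.1f} months of testing "
      f"(total RE = {total_re(sweet):.2f})")
```

Raising the defect loss size (slide 10) shifts this minimum right; raising the delay loss size (slide 11) shifts it left.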

Slide 10: Comparative RE Profile: Safety-Critical System
[Chart: a higher S(L) for defects raises the dependability-risk curve, moving the minimum to the right of the mainstream sweet spot — a "High-Q" sweet spot with more testing before shipping.]

Slide 11: Comparative RE Profile: Internet Startup
[Chart: a higher S(L) for delays raises the market-erosion curve, moving the minimum to the left of the mainstream sweet spot — a "Low-TTM" (time-to-market) sweet spot with less testing and earlier shipping.]

Slide 12: Conclusions So Far
- It is unwise to try to compete on both cost/schedule and quality
  - Some exceptions: a major technology or marketplace edge
- There are no one-size-fits-all cost/schedule/quality strategies
- Risk analysis helps determine how much testing (prototyping, formal verification, etc.) is enough: buying information to reduce risk
- It is often difficult to determine parameter values; some COCOMO II values are discussed next

Slide 13: Software Development Cost/Quality Tradeoff — COCOMO II Calibration to 161 Projects
[Chart: relative cost per source instruction (0.8 to 1.3) versus RELY rating, with the defect risk and rough MTBF (mean time between failures) at each rating.]
- Very Low: slight inconvenience (MTBF ~ 1 hour)
- Low: low, easily recoverable loss (MTBF ~ 1 day)
- Nominal: moderate, recoverable loss (MTBF ~ 1 month); in-house support software at relative cost 1.0
- High: high financial loss (MTBF ~ 2 years)
- Very High: loss of human life (MTBF ~ 100 years)

Slide 14: Software Development Cost/Quality Tradeoff (continued)
[Same chart, adding two market positions: commercial quality leader at 1.10 (High RELY) and commercial cost leader at 0.92 (Low RELY), versus in-house support software at 1.0 (Nominal).]

Slide 15: Software Development Cost/Quality Tradeoff (completed)
[Same chart, completing the curve of relative cost per source instruction by RELY rating:]
- Very Low (startup demo): 0.82
- Low (commercial cost leader): 0.92
- Nominal (in-house support software): 1.0
- High (commercial quality leader): 1.10
- Very High (safety-critical): 1.26
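
Treating the chart as a lookup table gives a quick what-if calculator. Note this is a simplification: in COCOMO II the RELY multiplier is applied alongside many other effort multipliers, and the nominal estimate below is hypothetical:

```python
# Relative development cost per source instruction by RELY rating, read off
# the chart above.
RELY_DEV_COST = {
    "Very Low":  0.82,  # startup demo (defect MTBF ~ 1 hour)
    "Low":       0.92,  # commercial cost leader (~ 1 day)
    "Nominal":   1.00,  # in-house support software (~ 1 month)
    "High":      1.10,  # commercial quality leader (~ 2 years)
    "Very High": 1.26,  # safety-critical (~ 100 years)
}

nominal_effort_pm = 100.0  # hypothetical nominal-RELY estimate (person-months)
for rating, mult in RELY_DEV_COST.items():
    print(f"{rating:>9}: {nominal_effort_pm * mult:6.1f} PM")
```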

Slide 16: COCOMO II RELY Factor Dispersion
[Chart: distribution of RELY ratings, Very Low through Very High, across the calibration projects (axis 0 to 100); the RELY coefficient's t-statistic is 2.6, where t > 1.9 is statistically significant.]

Slide 17: COCOMO II RELY Factor Phenomenology
Rqts. and Product Design:
- RELY = Very Low: little detail; many TBDs; little verification; minimal QA, CM, standards, draft user manual, test plans; minimal PDR
- RELY = Very High: detailed verification, QA, CM, standards, PDR, documentation; IV&V interface; very detailed test plans and procedures
Detailed Design:
- Very Low: basic design information; minimal QA, CM, standards, draft user manual, test plans; informal design inspections
- Very High: detailed verification, QA, CM, standards, CDR, documentation; very thorough design inspections; very detailed test plans and procedures; less rqts. rework
Code and Unit Test:
- Very Low: no test procedures; minimal path testing and standards checking; minimal QA, CM; minimal I/O and off-nominal tests; minimal user manual
- Very High: detailed test procedures, QA, CM, documentation; very thorough code inspections; very extensive off-nominal tests; IV&V interface; less rqts. and design rework
Integration and Test:
- Very Low: no test procedures; many requirements untested; minimal QA, CM; minimal stress and off-nominal tests; minimal as-built documentation
- Very High: very detailed test procedures, QA, CM, documentation; very extensive stress and off-nominal tests; IV&V interface; less rqts., design, and code rework

Slide 18: "Quality is Free" — Did Philip Crosby's Book Get It All Wrong?
Investments in dependable systems:
- Cost extra for simple, short-life systems
- Pay off for high-value, long-life systems

Slide 19: Software Life-Cycle Cost vs. Dependability
[Chart: relative cost to develop (0.8 to 1.4) versus COCOMO II RELY rating: Very Low 0.82, Low 0.92, Nominal 1.0, High 1.10, Very High 1.26.]

Slide 20: Software Life-Cycle Cost vs. Dependability (develop and maintain)
[Chart adds relative cost to develop and maintain, read as: Very Low 1.23, Low 1.10, Nominal 1.0, High 0.99, Very High 1.07 (development-only values: 0.82, 0.92, 1.0, 1.10, 1.26). Maintenance more than erases the development savings of low dependability.]

Slide 21: Software Life-Cycle Cost vs. Dependability (70% maintenance)
[Chart adds a case where maintenance is 70% of life-cycle cost; additional plotted values include 1.11, 1.05, 1.07, and 1.20. Takeaway on the slide: low dependability is inadvisable for evolving systems.]

Slide 22: Software Ownership Cost vs. Dependability
[Chart: relative cost to develop, maintain, own, and operate versus RELY rating. With operational-defect cost = 0, the curve is the develop-and-maintain curve of the previous slides. When operational-defect cost at Nominal dependability equals the software life-cycle cost, ownership cost reads: Very Low = 2.55, Low = 1.52, falling to about 0.76 and 0.69 at High and Very High — the ownership sweet spot moves to high dependability.]
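
The ownership-cost argument can be sketched numerically. The development multipliers below are the chart's values; the maintenance premiums and operational-defect losses are illustrative assumptions chosen only to show how the minimum-cost rating shifts once post-delivery costs are counted:

```python
# Development multipliers are from the RELY chart; the maintenance and
# operational-defect figures are assumptions for illustration only.
ratings     = ["Very Low", "Low", "Nominal", "High", "Very High"]
develop     = [0.82, 0.92, 1.00, 1.10, 1.26]   # from the chart
maintain    = [0.60, 0.35, 0.25, 0.18, 0.15]   # assumed: low RELY costs more to maintain
defect_loss = [0.90, 0.45, 0.22, 0.11, 0.06]   # assumed operational-defect losses

print(f"{'RELY':>9}  develop  +maintain  +operate")
for r, d, m, o in zip(ratings, develop, maintain, defect_loss):
    print(f"{r:>9}  {d:7.2f}  {d + m:9.2f}  {d + m + o:8.2f}")
# The development-cost minimum sits at Very Low, but the ownership-cost
# minimum shifts to High once maintenance and defect losses are included.
```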

Slide 23: Conclusions So Far - 2
- Quality is better than free for high-value, long-life systems
- There is no universal dependability sweet spot
  - Yours will be determined by your value model and the relative contributions of dependability techniques
- Risk analysis helps answer "How much is enough?"
- COCOMO II provides a schedule-cost-quality tradeoff analysis framework

Slide 24: Outline
- Traditional and e-Services Development: USC research model perspectives
- Software Schedule-Cost-Quality Tradeoffs: risk exposure; development and ownership costs
- The COCOMO Suite of Tradeoff Models: COCOMO II, CORADMO, COQUALMO-ODC
- The SAIV/CAIV/SCQAIV Process Models
- Conclusions and References

Slide 25: COCOMO II Book Table of Contents
Boehm, Abts, Brown, Chulani, Clark, Horowitz, Madachy, Reifer, and Steece, Software Cost Estimation with COCOMO II, Prentice Hall, 2000.
1. Introduction
2. Model Definition
3. Application Examples
4. Calibration
5. Emerging Extensions
6. Future Trends
Appendices: Assumptions, Data Forms, User's Manual, CD Content
CD: video tutorials, USC COCOMO II.2000, commercial tool demos, manuals, data forms, web site links, Affiliate forms

Slide 26: Purpose of COCOMO II
To help people reason about the cost and schedule implications of their software decisions.

Slide 27: Major Decision Situations Helped by COCOMO II
- Software investment decisions: when to develop, reuse, or purchase; what legacy software to modify or phase out
- Setting project budgets and schedules
- Negotiating cost/schedule/performance tradeoffs
- Making software risk management decisions
- Making software improvement decisions: reuse, tools, process maturity, outsourcing

Slide 28: Need to Reengineer COCOMO 81
- New software processes
- New sizing phenomena
- New reuse phenomena
- Need to make decisions based on incomplete information

Slide 29: COCOMO II Model Stages
[Diagram; the Early Design and Post-Architecture stages are detailed on the next slide.]

Slide 30: Early Design and Post-Architecture Model
Effort = A × Size^E × Π(effort multipliers), where the exponent E grows with the process scale factors; schedule is a function of effort and the same scale factors.
- Environment (effort multipliers): product, platform, people, and project factors
- Size: nonlinear reuse and volatility effects
- Process (scale factors): constraint, risk/architecture, team, and maturity factors
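
For reference, here is a minimal sketch of the published COCOMO II.2000 Post-Architecture equations (constants A = 2.94, B = 0.91, C = 3.67, D = 0.28); the example project's inputs are hypothetical, with all effort multipliers set to nominal:

```python
from math import prod

A, B, C, D = 2.94, 0.91, 3.67, 0.28  # COCOMO II.2000 calibration constants

def cocomo2(size_ksloc, scale_factors, effort_multipliers):
    """Post-Architecture effort (person-months) and schedule (months)."""
    e = B + 0.01 * sum(scale_factors)               # process scale exponent
    pm = A * size_ksloc ** e * prod(effort_multipliers)
    tdev = C * pm ** (D + 0.2 * (e - B))            # development schedule
    return pm, tdev

# Hypothetical project: 30 KSLOC, the five scale factors at their nominal
# ratings, all 17 effort multipliers at nominal (1.0).
pm, months = cocomo2(30, [3.72, 3.04, 4.24, 3.29, 4.68], [1.0] * 17)
print(f"Effort ~ {pm:.0f} PM, schedule ~ {months:.1f} months")
```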

Slide 31: Nonlinear Reuse Effects
[Chart: relative cost versus amount of software modified, from data on 2,954 NASA modules (Selby, 1988). The usual linear assumption runs from (0, 0) to (1.0, 1.0); the observed curve starts near 0.046 at zero modification and rises steeply — modifying 25% of the code costs about 55% of new development, and modifying 50% costs about 70%.]
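
A piecewise-linear interpolation through the chart's points makes the contrast with the linear assumption concrete; interpolating between the plotted points is an approximation:

```python
# Points read from the slide's chart: (fraction modified, relative cost).
POINTS = [(0.0, 0.046), (0.25, 0.55), (0.50, 0.70), (1.00, 1.00)]

def relative_cost(frac_modified):
    """Piecewise-linear interpolation of relative cost vs. all-new code."""
    for (x0, y0), (x1, y1) in zip(POINTS, POINTS[1:]):
        if frac_modified <= x1:
            return y0 + (y1 - y0) * (frac_modified - x0) / (x1 - x0)
    return 1.0

for frac in (0.05, 0.10, 0.25, 0.50):
    print(f"modify {frac:4.0%} of the code -> {relative_cost(frac):5.1%} of "
          f"new-development cost (linear assumption: {frac:.0%})")
```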

Slide 32: COCOMO II.2000 Productivity Ranges
[Chart: productivity range (1.0 to 2.4) of each cost driver — the effort ratio between its best and worst ratings; scale factor ranges computed at 10, 100, and 1000 KSLOC. Drivers in decreasing order of productivity range:]
- Product Complexity (CPLX)
- Analyst Capability (ACAP)
- Programmer Capability (PCAP)
- Time Constraint (TIME)
- Personnel Continuity (PCON)
- Required Software Reliability (RELY)
- Documentation Match to Life Cycle Needs (DOCU)
- Multi-Site Development (SITE)
- Applications Experience (AEXP)
- Platform Volatility (PVOL)
- Use of Software Tools (TOOL)
- Storage Constraint (STOR)
- Process Maturity (PMAT)
- Language and Tools Experience (LTEX)
- Required Development Schedule (SCED)
- Data Base Size (DATA)
- Platform Experience (PEXP)
- Architecture and Risk Resolution (RESL)
- Precedentedness (PREC)
- Develop for Reuse (RUSE)
- Team Cohesion (TEAM)
- Development Flexibility (FLEX)

Slide 33: COCOMO II Estimation Accuracy
Percentage of sample projects within 30% of actuals, without / with calibration to data source:

Model (projects)         Effort       Schedule
COCOMO 81 (63)           81%          61%
COCOMO II.1997 (83)      52% / 64%    62% / 72%
COCOMO II.2000 (161)     75% / 80%    65% / 81%
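
The accuracy measure here is PRED(0.30): the fraction of projects whose estimate lands within 30% of the actual. A minimal computation, with made-up project data:

```python
def pred(estimates, actuals, tolerance=0.30):
    """Fraction of projects with |estimate - actual| / actual <= tolerance."""
    hits = sum(1 for est, act in zip(estimates, actuals)
               if abs(est - act) / act <= tolerance)
    return hits / len(actuals)

actuals   = [100, 250, 80, 400, 60]   # person-months (hypothetical)
estimates = [ 90, 310, 75, 560, 58]
print(f"PRED(.30) = {pred(estimates, actuals):.0%}")   # -> 80%
```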

Slide 34: COCOMO II RAD Extension (CORADMO)
[Diagram: COCOMO II cost drivers (except SCED), plus language level and experience, produce a baseline effort and schedule; the COCOMO II phase distribution model (COPSEMO) spreads effort and schedule by stage; the RAD extension then adjusts effort and schedule by phase using the factors RVHL, DPRS, CLAB, RESL, PPOS, and RCAP.]

Slide 35: Effect of RCAP on Cost, Schedule
[Chart.]

Slide 36: Current COQUALMO System
[Diagram: COQUALMO wraps COCOMO II with two submodels.]
- Defect Introduction Model — inputs: software size estimate; software platform, project, product, and personnel attributes
- Defect Removal Model — inputs: defect removal profile levels (automation, reviews, testing)
- Outputs: software development effort, cost, and schedule estimate; number of residual defects; defect density per unit of size
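
Structurally, COQUALMO composes the two submodels: defects are introduced in proportion to size, and each removal activity independently removes a fraction of them. The sketch below uses invented introduction rates and removal efficiencies (the calibrated COQUALMO values differ), and the multiplied (1 − efficiency) terms encode exactly the independence assumption that slide 42 flags as unvalidated:

```python
SIZE_KSLOC = 30

# Assumed defect-introduction rates per KSLOC by artifact type.
intro_rates = {"requirements": 2.0, "design": 4.0, "code": 9.0}

# Assumed removal efficiency of each defect-removal activity.
removal_eff = {"automated analysis": 0.30,
               "peer reviews":       0.55,
               "execution testing":  0.65}

introduced = sum(rate * SIZE_KSLOC for rate in intro_rates.values())

# Each activity removes its fraction independently (the unvalidated
# independence hypothesis noted on slide 42).
residual_fraction = 1.0
for eff in removal_eff.values():
    residual_fraction *= (1.0 - eff)

remaining = introduced * residual_fraction
print(f"Introduced: {introduced:.0f} defects; delivered: {remaining:.1f} "
      f"({remaining / SIZE_KSLOC:.2f} per KSLOC)")
```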

Slide 37: Defect Removal Rating Scales (COCOMO II, p. 263)
Automated Analysis:
- Very Low: simple compiler syntax checking
- Low: basic compiler capabilities
- Nominal: compiler extensions; basic requirements and design consistency
- High: intermediate-level module checking; simple requirements/design checking
- Very High: more elaborate requirements/design checking; basic distributed processing
- Extra High: formalized specification and verification; advanced distributed processing
Peer Reviews:
- Very Low: no peer review
- Low: ad-hoc informal walk-throughs
- Nominal: well-defined preparation, review, minimal follow-up
- High: formal review roles, well-trained people, and basic checklists
- Very High: root cause analysis; formal follow-up using historical data
- Extra High: extensive review checklists; statistical control
Execution Testing and Tools:
- Very Low: no testing
- Low: ad-hoc testing and debugging
- Nominal: basic testing; test criteria based on checklists
- High: well-defined test sequences and a basic test coverage tool system
- Very High: more advanced test tools and test preparation; distributed monitoring
- Extra High: highly advanced tools; model-based testing

Slide 38: Defect Removal Estimates — Nominal Defect Introduction Rates
[Chart: delivered defects per KSLOC versus composite defect removal rating.]

Slide 39: COQUALMO-ODC Model Objectives
- Support cost-schedule-quality tradeoff analysis
- Provide a reference for defect monitoring and control
- Evolve to cover all major classes of project, with different defect distributions (e.g., COTS-based)
- Start simple; grow opportunistically

Slide 40: Example of Desired Model - I
[Chart: a cost-schedule-quality tradeoff surface. Axes: effort in person-months (30, 60, 90) with cost at $20K/PM ($600K, $1200K, $1800K); time in months (6, 9, 12, 15); size in KSLOC (1, 10, 100). Quality is indexed today by RELY rating (VL, L, N, H, VH); the desired model would index it by defect MTBF in hours (0.01 up to 10^6, with anchors at 1, 24, 720, 17,000, and 10^6 — roughly an hour to a century). Details for any given point: next slide.]

Slide 41: Example of Desired Model - II
30 KSLOC; RELY = VH; 75 PM; 12 months; delivered defects = 0.3

Phase:               Rqts.      Design       Code & Unit Test   Integration & Test
Effort (PM):         --         12.5         43.5               19
Cost ($K):           --         250          870                380
Schedule (mo.):      --         3.1          5.8                3.1

Defects introduced/removed/left per phase, by type:
- Total:       60/50/10    130/116/24    264/234/30    8/37.7/0.3
- Rqts.:       60/50/10    10/16/4       2/5/1         1/2/0
- Design:      --          120/100/20    10/25/5       2/6.9/0.1
- Timing:      --          12/6/6        2/4/4         1/4.9/0.1
- Interface:   --          30/25/5       5/9/1         0/1/0
- ... (further defect types elided on the slide)
(Removed counts include defects carried in from earlier phases, so a phase can remove more defects than it introduces.)
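
The table's defect-flow arithmetic is simple to verify: each phase's "left" count is the previous phase's residue plus newly introduced defects minus defects removed. The requirements-defect row, checked in a few lines of Python:

```python
# Requirements-defect row of the table: introduced/removed/left per phase.
phases     = ["Rqts.", "Design", "Code & Unit Test", "Integration & Test"]
introduced = [60, 10, 2, 1]   # new requirements defects entering each phase
removed    = [50, 16, 5, 2]   # removals, including defects carried forward

left = 0
for phase, new, out in zip(phases, introduced, removed):
    left = left + new - out   # residue carried into the next phase
    print(f"{phase:>20}: +{new} introduced, -{out} removed -> {left} left")
# Final residue is 0 requirements defects, matching the table.
```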

Slide 42: Current COQUALMO Shortfalls
- One-size-fits-all model: may not fit COTS/Web or embedded applications
- Defect uniformity and independence assumptions: unvalidated hypotheses
- COCOMO II cost-schedule-quality tradeoffs reach only to RELY levels, not to delivered defect density or MTBF
- Need for more calibration data: ODC data could extend and strengthen the model

Slide 43: Outline
- Traditional and e-Services Development: USC research model perspectives
- Software Schedule-Cost-Quality Tradeoffs: risk exposure; development and ownership costs
- The COCOMO Suite of Tradeoff Models: COCOMO II, CORADMO, COQUALMO-ODC
- The SAIV/CAIV/SCQAIV Process Models
- Conclusions and References


Slide 47: The SAIV Process Model
(SAIV: Schedule As Independent Variable)
1. Shared vision and expectations management
2. Feature prioritization
3. Schedule range estimation
4. Architecture and core capabilities determination
5. Incremental development
6. Change and progress monitoring and control

Slide 48: Shared Vision, Expectations Management, and Feature Prioritization
- Use the stakeholder win-win approach
- Developer win condition: don't overrun the fixed 9-month schedule
- Clients' win conditions: 24 months' worth of features
- Win-Win negotiation: Which features are most critical? COCOMO II: how many features can be built within the 9-month schedule?
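
The negotiation on this slide amounts to fitting prioritized features under a fixed schedule. A toy sketch, with an invented feature list and an assumed COCOMO-like cube-root schedule curve standing in for a real COCOMO II run:

```python
FIXED_SCHEDULE_MONTHS = 9.0

def schedule_months(total_pm):
    """Assumed COCOMO-like schedule curve (cube-root rule of thumb)."""
    return 3.0 * total_pm ** (1 / 3)

# Features in client priority order, with estimated effort in person-months.
features = [("core order entry", 8), ("billing", 6), ("reporting", 5),
            ("admin console", 4), ("analytics dashboard", 7)]

selected, total_pm = [], 0.0
for name, pm in features:
    if schedule_months(total_pm + pm) <= FIXED_SCHEDULE_MONTHS:
        selected.append(name)
        total_pm += pm

print(f"Fits the {FIXED_SCHEDULE_MONTHS:.0f}-month schedule: {selected}")
print(f"Total: {total_pm:.0f} PM, ~{schedule_months(total_pm):.1f} months; "
      f"lower-priority features wait for later increments")
```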

Slide 49: COCOMO II Estimate Ranges
[Chart.]

Slide 50: Software Product Production Function
[Chart: value of the software product to the organization versus investment, in three segments, with curves showing availability of delivery by time T (90% for T = 12 months, 50% for T = 6 months).]
- Investment segment: operating system, data management, basic application functions
- High-payoff segment: main application functions, humanized I/O
- Diminishing-returns segment: secondary application functions, animated graphics, tertiary application functions, natural speech input

Slide 51: Core Capability, Incremental Development, and Coping with Rapid Change
- The core capability is not just the top-priority features:
  - A useful end-to-end capability
  - Architected for ease of adding and dropping marginal features
- Worst case: deliver the core capability in 9 months, with some extra effort
- Most likely case: finish the core capability in 6-7 months, then add next-priority features
- Cope with change by monitoring progress and renegotiating plans as appropriate

Slide 52: SAIV Experience I: USC Digital Library Projects
- Life Cycle Architecture package in a fixed 12 weeks: compatible operational concept, prototypes, requirements, architecture, plans, and feasibility rationale
- Initial Operational Capability in 12 weeks, including a 2-week cold-turkey transition
- Successful on 24 of 26 projects
  - Failure 1: too-ambitious core capability (covering 3 image repositories at once)
  - Failure 2: team disbanded (graduation, summer job pressures)
- 92% success rate, versus the industry's 16% in the Standish Report

Slide 53: Rapid Value(TM) Project Approach (LA SPIN; copyright 2001 C-bridge)
[Diagram: a 16-24 week fixed-schedule process in four phases, with "lines of readiness" gates asking "Are we ready for the next step?" and iteration across scope, listening, and delivery focus.]
- Define: identify system actors; document business processes; generate use cases; define basic development strategies
- Design: object domain modeling; detailed object design and logical data model; object interactions and system services; polish design and build plan
- Develop: Build 1; Build 2; stabilization build; release to test
- Deploy: beta program; pilot program; production

Slide 54: Conclusions: SAIV Critical Success Factors
- Working with stakeholders in advance to achieve a shared product vision and realistic expectations
- Getting clients to develop and maintain prioritized requirements
- Scoping the core capability to fit within the high-payoff segment of the application's production function for the given schedule
- Architecting the system for ease of adding and dropping features
- Disciplined progress monitoring and corrective action to counter schedule threats
The approach also works for Cost as Independent Variable (CAIV), and for "Cost, Schedule, Quality: Pick All Three" (SCQAIV).

Slide 55: Conclusions
- Future IT systems require new model perspectives: product, process, property, and success models
- The USC COCOMO and MBASE model families help with tradeoff and decision analysis, integrated product and process development, rapid application development, and process management and improvement
- The USC-IBM COQUALMO-ODC model is a valuable next step

Slide 56: Further Information
- V. Basili, G. Caldiera, and H. Rombach, "The Experience Factory" and "The Goal Question Metric Approach," in J. Marciniak (ed.), Encyclopedia of Software Engineering, Wiley, 1994.
- B. Boehm, C. Abts, A.W. Brown, S. Chulani, B. Clark, E. Horowitz, R. Madachy, D. Reifer, and B. Steece, Software Cost Estimation with COCOMO II, Prentice Hall, 2000.
- B. Boehm and D. Port, "Escaping the Software Tar Pit: Model Clashes and How to Avoid Them," ACM Software Engineering Notes, January 1999.
- B. Boehm et al., "Using the WinWin Spiral Model: A Case Study," IEEE Computer, July 1998, pp. 33-44.
- R. van Solingen and E. Berghout, The Goal/Question/Metric Method, McGraw-Hill, 1999.
- COCOMO II and MBASE items: http://sunset.usc.edu
- CeBASE items: http://www.cebase.org

