
1 University of Southern California Center for Software Engineering (USC-CSE) — 3/22/01 ©USC-CSE
Developing and Integrating Software/System Product, Process, Property, and Success Models
Barry Boehm, USC-CSE
TAMU, SMU Presentations, March 22-23, 2001
Boehm@sunset.usc.edu
http://sunset.usc.edu

2 Outline
Developing Software/System Models at USC-CSE
–Process Models
–Product Models
–Property Models
–Success Models
Integrating Software/System Models
Conclusions
Further information

3 Process Model Research at USC-CSE
Spiral Model Extensions
–WinWin spiral model
–Life cycle anchor points
WinWin Negotiation Model
–IPT, JAD guidelines
–Groupware support tools
Object-oriented processes with COTS, reuse
RAD-SAIV process model
System dynamics models

4 Spiral Model Experience
Where do objectives, constraints, alternatives come from?
–WinWin extensions
Lack of intermediate milestones
–Anchor Points: LCO, LCA, IOC
–Concurrent-engineering spirals between anchor points
The WinWin Spiral Model (steps 1-3 are the WinWin extensions; 4-7 follow the original spiral):
1. Identify next-level stakeholders
2. Identify stakeholders' win conditions
3. Reconcile win conditions; establish next-level objectives, constraints, alternatives
4. Evaluate product and process alternatives; resolve risks
5. Define next level of product and process, including partitions
6. Validate product and process definitions
7. Review, commitment

5 Life Cycle Anchor Points
Common system/software stakeholder commitment points
–Defined in concert with Government, industry affiliates
–Coordinated with Rational's Unified Software Development Process
Life Cycle Objectives (LCO)
–Stakeholders' commitment to support system architecting
–Like getting engaged
Life Cycle Architecture (LCA)
–Stakeholders' commitment to support full life cycle
–Like getting married
Initial Operational Capability (IOC)
–Stakeholders' commitment to support operations
–Like having your first child

6 Win Win Spiral Anchor Points
(Risk-driven level of detail for each element)
*WWWWWHH: Why, What, When, Who, Where, How, How Much

Definition of Operational Concept
–LCO: Top-level system objectives and scope (system boundary; environment parameters and assumptions; evolution parameters); operational concept (operations and maintenance scenarios and parameters; organizational life-cycle responsibilities of stakeholders)
–LCA: Elaboration of system objectives and scope by increment; elaboration of operational concept by increment

Definition of System Requirements
–LCO: Top-level functions, interfaces, quality attribute levels, including growth vectors and priorities; prototypes; stakeholders' concurrence on essentials
–LCA: Elaboration of functions, interfaces, quality attributes, and prototypes by increment; identification of TBDs (to-be-determined items); stakeholders' concurrence on their priority concerns

Definition of System and Software Architecture
–LCO: Top-level definition of at least one feasible architecture (physical and logical elements and relationships; choices of COTS and reusable software elements); identification of infeasible architecture options
–LCA: Choice of architecture and elaboration by increment (physical and logical components, connectors, configurations, constraints; COTS, reuse choices; domain-architecture and architectural style choices); architecture evolution parameters

Definition of Life-Cycle Plan
–LCO: Identification of life-cycle stakeholders (users, customers, developers, maintainers, interoperators, general public, others); identification of life-cycle process model (top-level stages, increments); top-level WWWWWHH* by stage
–LCA: Elaboration of WWWWWHH* for Initial Operational Capability (IOC); partial elaboration, identification of key TBDs for later increments

Feasibility Rationale
–LCO: Assurance of consistency among elements above via analysis, measurement, prototyping, simulation, etc.; business case analysis for requirements, feasible architectures
–LCA: Assurance of consistency among elements above; all major risks resolved or covered by risk management plan

System Prototype(s)
–LCO: Exercise key usage scenarios; resolve critical risks
–LCA: Exercise range of usage scenarios; resolve major outstanding risks

7 Anchor Points and Rational USDP Phases
(Diagram, Rational Software Corporation: LCO falls at the Inception/Elaboration boundary, LCA at the Elaboration/Construction boundary, and IOC at the Construction/Transition boundary. The Engineering stage spans Inception and Elaboration (feasibility and architecture iterations); the Manufacturing stage spans Construction and Transition (usable iterations and product releases), with requirements, design, implementation, and deployment activities managed across all four phases.)

8 WinWin Negotiation Model
Artifacts (each with rationale and attachments): Win Condition, Issue, Option, Agreement
Relationships: an Issue involves Win Conditions; an Option addresses an Issue; an Agreement adopts an Option and covers Win Conditions
Win-Win equilibrium
–All Win Conditions covered by Agreements
–No outstanding Issues
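The equilibrium condition on the slide is mechanical enough to state as code. Below is a minimal sketch of the WinWin artifact types and the equilibrium test; the class and attribute names are assumptions for illustration, not the API of any published WinWin tool.

```python
# Minimal sketch of the WinWin negotiation artifacts and the
# equilibrium test: all Win Conditions covered by Agreements,
# and no outstanding Issues. Names are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class WinCondition:
    name: str

@dataclass
class Issue:
    name: str
    resolved: bool = False   # resolved when an adopted Option addresses it

@dataclass
class Agreement:
    name: str
    covers: list = field(default_factory=list)   # WinConditions it covers

def in_equilibrium(win_conditions, issues, agreements):
    """Win-Win equilibrium: every Win Condition is covered by some
    Agreement, and there are no outstanding (unresolved) Issues."""
    covered = {wc.name for a in agreements for wc in a.covers}
    all_covered = all(wc.name in covered for wc in win_conditions)
    no_open_issues = all(i.resolved for i in issues)
    return all_covered and no_open_issues
```

For example, a single covered win condition with all issues resolved is in equilibrium; adding one unresolved issue breaks it.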

9 EasyWinWin Activities
Now a commercial product from GroupSystems.com

10 Converge on Win Conditions (FastFocus screen shot)

11 Product Model Research at USC-CSE
Guidelines for LCO, LCA product deliverables
–Operational Concept Description: shared vision, domain model, use case scenarios
–Requirements Description: priorities, evolution requirements, WinWin traceability
–Architecture Description: extended UML, COTS integration
–Feasibility Rationale: assurance of consistency, feasibility, risk resolution
Architecture research

12 Architecture Research
SAAGE (Software Architecture Analysis, Generation, and Evolution)
Architecture style and view integration
–UML view integration

13 Software Architecture, Analysis, Generation, and Evolution (SAAGE)
Research focus:
–style-based application design and implementation
–component- and connector-based architectural composition
–architecture modeling and analysis
–architecture-based software evolution
–consistent refinement of architecture into design
–system generation and configuration management
Supported by an integrated toolset composed of in-house and third-party tools
Experimental application to mobile wireless networks

14 SAAGE Overview
Integrated environment for transforming C2-style architectures into UML

15 View Integration Model
The UML/Analyzer tool:
–implements a generic view integration model
–describes and identifies causes of architectural and design mismatches across UML views
–is integrated with Rational Rose®
–complements pure view comparison with transformation (abstraction, translation, and unification) and mapping techniques
–currently supports class, object, sequence, collaboration, state, and C2 diagrams
–uses mismatch rules and transformation rules

16 Consistency between Architecture and Design
(Diagram: a C2 architecture represented in UML and a class design in a UML class diagram, with views automatically derived between them.)

17 Property Model Research at USC-CSE
Constructive Cost Model (COCOMO) II
COCOMO II extensions
–COCOTS: COTS integration
–COQUALMO: delivered defect density
–CORADMO: rapid development cost and schedule
–COPSEMO: phase/activity distribution
–COPROMO: productivity improvement analysis
–COSYMO: system engineering
Property tradeoff assistance: QARCC, S-COST

18 COCOMO II Model Stages

19 Early Design and Post-Architecture Models

Effort = A × Size^E × Π(Effort Multipliers), where the environment sets the multipliers and the process scale factors set the exponent E
Schedule = C × Effort^F, with the exponent F likewise driven by the process scale factors

Environment: Product, Platform, People, Project factors
Size: nonlinear reuse and volatility effects
Process: Constraint, Risk/Architecture, Team, Maturity factors
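The effort and schedule forms above can be sketched numerically. The constants A, B, C, D below are the published COCOMO II.2000 calibration values; the scale-factor and effort-multiplier inputs in the example are made-up ratings for illustration, not data from a real project.

```python
# Sketch of the COCOMO II Post-Architecture effort and schedule
# equations, using the published COCOMO II.2000 constants.
A, B = 2.94, 0.91     # effort equation constants
C, D = 3.67, 0.28     # schedule equation constants

def cocomo2(size_ksloc, scale_factors, effort_multipliers):
    """Return (effort in person-months, schedule in months)."""
    E = B + 0.01 * sum(scale_factors)        # process scale exponent
    effort_pm = A * size_ksloc ** E          # nominal effort
    for em in effort_multipliers:            # environment multipliers
        effort_pm *= em
    F = D + 0.2 * (E - B)                    # schedule exponent
    schedule_months = C * effort_pm ** F
    return effort_pm, schedule_months
```

With 100 KSLOC, roughly nominal scale factors, and all multipliers at 1.0, this yields on the order of 450-470 person-months over roughly two years, which is in the range the model's published examples produce.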

20 COCOMO II.2000 Productivity Ranges
(Bar chart of productivity ranges, roughly 1.0 to 2.4, in decreasing order:)
Product Complexity (CPLX), Analyst Capability (ACAP), Programmer Capability (PCAP), Time Constraint (TIME), Personnel Continuity (PCON), Required Software Reliability (RELY), Documentation Match to Life Cycle Needs (DOCU), Multi-Site Development (SITE), Applications Experience (AEXP), Platform Volatility (PVOL), Use of Software Tools (TOOL), Storage Constraint (STOR), Process Maturity (PMAT), Language and Tools Experience (LTEX), Required Development Schedule (SCED), Data Base Size (DATA), Platform Experience (PEXP), Architecture and Risk Resolution (RESL), Precedentedness (PREC), Develop for Reuse (RUSE), Team Cohesion (TEAM), Development Flexibility (FLEX)
Scale factor ranges computed at 10, 100, and 1000 KSLOC

21 COCOMO II Estimation Accuracy
Percentage of sample projects within 30% of actuals, without / with calibration to data source:

COCOMO81 (63 projects): Effort 81%; Schedule 61%
COCOMOII.1997 (83 projects): Effort 52% / 64%; Schedule 62% / 65%
COCOMOII.2000 (161 projects): Effort 75% / 80%; Schedule 72% / 81%
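The accuracy measure behind the table above is PRED(.30): the fraction of projects whose estimate lands within 30% of the actual value. A minimal sketch, with invented estimate/actual pairs:

```python
# PRED(level): fraction of projects whose estimate is within
# `level` (e.g. 30%) of the actual value. Data below is invented.
def pred(estimates, actuals, level=0.30):
    within = sum(
        1 for est, act in zip(estimates, actuals)
        if abs(est - act) / act <= level
    )
    return within / len(actuals)
```

For example, with estimates [90, 160, 300] against actuals [100, 100, 290], two of the three fall within 30%, giving PRED(.30) = 2/3.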

22 Process Maturity (PMAT) Effects
–Effort reduction per maturity level, 100 KDSI project
–Normalized for effects of other variables
Clark Ph.D. dissertation (112 projects)
–Research model: 12-23% per level
–COCOMO II subsets: 9-29% per level
COCOMO II.1999 (161 projects)
–4-11% per level
The positive PMAT contribution is statistically significant
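Because PMAT is a scale factor, its effect enters the effort equation through the exponent, so the percentage savings per maturity level grows with project size. A rough illustration, assuming the COCOMO II.2000 spacing of 1.56 between adjacent PMAT rating levels:

```python
# Rough illustration of the PMAT effect: one maturity level shifts
# the scale-factor sum by ~1.56 (COCOMO II.2000 spacing), which
# changes the exponent of the effort equation by 0.01 * 1.56.
PMAT_STEP = 1.56   # change in scale-factor sum per maturity level

def effort_reduction_per_level(size_ksloc):
    """Percent effort saved by improving PMAT one level, other factors fixed."""
    ratio = size_ksloc ** (0.01 * PMAT_STEP)   # effort(old) / effort(new)
    return (1 - 1 / ratio) * 100
```

At 100 KSLOC this gives roughly 7% per level, consistent with the 4-11% range reported above; at 1000 KSLOC the effect is larger.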

23 COCOMO vs. COCOTS Cost Sources
(Diagram comparing staffing over time for COCOMO and COCOTS cost sources.)

24 Integrated COQUALMO
Inputs: software size estimate; software product, process, computer and personnel attributes; defect removal capability levels
COCOMO II produces the software development effort, cost and schedule estimate
The COQUALMO defect introduction model produces defect density per unit of size; the defect removal model then yields the number of residual defects
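The introduction-then-removal pipeline on this slide can be sketched as below. The introduction rate and removal fractions are invented inputs for illustration, not calibrated COQUALMO values.

```python
# Sketch of the COQUALMO structure: a defect introduction model
# feeds a defect removal model, yielding residual defects.
# Numbers used with this function are illustrative, not calibrated.
def residual_defects(size_ksloc, intro_per_ksloc, removal_fractions):
    """Return (defects introduced, residual defects)."""
    introduced = size_ksloc * intro_per_ksloc      # defect introduction model
    remaining = introduced
    for frac in removal_fractions:                 # e.g. reviews, analysis, testing
        remaining *= (1 - frac)                    # defect removal model
    return introduced, remaining
```

For example, 10 KSLOC at a hypothetical 60 introduced defects/KSLOC, passed through three removal activities that each catch 50%, 40%, and 60% of what reaches them, leaves 72 residual defects of 600 introduced.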

25 COCOMO II RAD Extension (CORADMO)
Inputs: COCOMO II cost drivers (except SCED); language level, experience, ...
COCOMO II produces baseline effort and schedule; the COCOMO II stage distributions model (COPSEMO) distributes effort and schedule by stage; CORADMO then produces RAD effort and schedule by stage, driven by the factors RVHL, DPRS, CLAB, RESL, PPOS, and RCAP

26 COCOMO II Book Table of Contents
Boehm, Abts, Brown, Chulani, Clark, Horowitz, Madachy, Reifer, Steece, Software Cost Estimation with COCOMO II, Prentice Hall, 2000
1. Introduction
2. Model Definition
3. Application Examples
4. Calibration
5. Emerging Extensions
6. Future Trends
Appendices
–Assumptions, Data Forms, User's Manual, CD Content
CD: video tutorials, USC COCOMO II 2000, commercial tool demos, manuals, data forms, web site links, Affiliate forms

27 Success Model Research at USC-CSE
Stakeholder Win-Win (Theory W)
–WinWin negotiation model and tools
–Two cultures and expectations management
Success model profiles and model clashes

28 Unmet Expectations Problems
LCO success condition
–Describes at least one feasible architecture
–Satisfying requirements within cost/schedule/resource constraints
–Viable, cost-effective business case
–Stakeholder concurrence on key system parameters
Projects that failed LCO criteria
–1996: 4 out of 16 (25%)
–1997: 4 out of 15 (27%)
Why?

29 Requirements and Expectations: Domain Model Clashes
Easy/hard things for software people
–"If you can do queries with all those ands, ors, synonyms, data ranges, etc., it should be easy to do natural language queries."
–"If you can scan the document and digitize the text, it should be easy to digitize the figures too."
Easy/hard things for librarians
–"It was nice that you could add this access feature, but it overly (centralizes, decentralizes) control of our intellectual property rights."
–"It was nice that you could extend the system to serve the medical people, but they haven't agreed to live with our usage guidelines."

30 1998 Simplifier/Complicator Experiment
Identify application simplifiers and complicators
–For each digital library sub-domain
–For both developers and clients
Provide them, with explanations, to developers and clients
–Highlight relation to risk management
Homework exercise to analyze simplifiers and complicators
–For two of the upcoming digital library projects
Evaluate effect on LCO review failure rate

31 Example Simplifiers and Complicators (S&C's)
Type of application: Multimedia Archive (simple block diagram: queries and updates against a catalog of multimedia asset information, with update notifications, backed by a multimedia archive)
Examples: rapid access to large archives; access to heterogeneous media collections
Simplifiers: use standard query languages; use standard or COTS search engine; uniform media formats
Complicators: natural language processing; automated cataloging or indexing; digitizing large archives; digitizing complex or fragile artifacts; automated annotation/description of, or attaching meanings to, digital assets; integration of legacy systems
(Applies to projects 1, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 20, 31, 32, 35, 36, 37, 39)

32 The Results
Projects that failed LCO criteria
–1996: 4 out of 16 (25%)
–1997: 4 out of 15 (27%)
–1998: 1 out of 19 (5%)
–1999: 1 out of 22 (5%)
–2000: 2 out of 20 (10%)
Post-1997 failures due to non-S&C causes
–Team cohesion, client outages

33 The Model-Clash Spider Web: MasterNet

34 Outline
Developing Software/System Models at USC-CSE
–Process, Product, Property, and Success Models
Integrating Software/System Models
–Motivation: Model Clashes
–Model-Based (System) Architecting and Software Engineering (MBASE)
Examples of Successful MBASE Use
–Over 50 Digital Library projects
–TRW CCPDS-R project
Early Adopters
Conclusions
Further information

35 Understanding the Tar Pit: Model Clashes
Model Clash: an incompatibility among the underlying assumptions of a set of models
–Often unrecognized
–Produces conflicts, confusion, mistrust, frustration, rework, throwaway systems

36 Examples of Model Clashes
Design-to-schedule process, unprioritized requirements, and tightly coupled architecture
COTS-driven product and waterfall process
Risk-based process and spec-based progress payments
Evolutionary development without life-cycle architecture
Golden Rule and stakeholder win-win
Spec-based process and IKIWISI success model
–IKIWISI: "I'll know it when I see it"

37 MBASE Integration Framework
Process models: life cycle anchor points; risk management; key practices
Success models: business case; IKIWISI; stakeholder win-win
Property models: cost; schedule; performance; reliability
Product models: domain model; requirements; architecture; code; documentation
Connections among them: planning and control; milestone content; evaluation and analysis; process entry/exit criteria; product evaluation criteria

38 MBASE Process Framework
(Diagram: in the WinWin spiral process, stakeholders' success models identify and prioritize the other models; success and property models set context for and impose constraints on process models; domain/environment models provide parameters for the product and property models; conceptual product models are refined (reified) through intermediate product models IPM_1 ... IPM_n into reified product models, culminating in the Life Cycle Architecture package and its plan; property models provide evaluations for the reified products and enable satisficing among success models.)

39 Success Models Drive Other Model Choices

Success model: demo agent-based e-commerce system at COMDEX in 9 months
–Key stakeholders: entrepreneurs, venture capitalists, customers
–Key property models: schedule estimation
–Process model: design-to-schedule
–Product model: domain constrained by schedule; architected for ease in dropping features to meet schedule

Success model: safe air traffic control system
–Key stakeholders: controllers, government agencies, developers
–Key property models: safety models
–Process model: initial spiral to risk-manage COTS, etc.; final waterfall to verify safety provisions
–Product model: architected for fault tolerance, ease of safety verification

40 MBASE Electronic Process Guide (1)

41 MBASE Electronic Process Guide (2)

42 Outline
Developing Software/System Models at USC-CSE
–Process, Product, Property, and Success Models
Integrating Software/System Models
–Motivation: Model Clashes
–Model-Based (System) Architecting and Software Engineering (MBASE)
Examples of Successful MBASE Use
–Over 100 Digital Library projects
–TRW CCPDS-R project
MBASE, CeBASE, and CMMI
Conclusions
Further information

43 The Challenge
15 digital library applications
–2-sentence problem statements
–Librarian clients
86 graduate students
–30% with industry experience
–Largely unfamiliar with each other and with library operations
*Develop LCA packages in 11 weeks
Re-form teams from 30 continuing students
*Develop IOC packages in 12 more weeks
–Including 2-week beta test and transition

44 Problem Statement #4: Medieval Manuscripts
Ruth Wallach, Reference Center, Doheny Memorial Library
I am interested in the problem of scanning medieval manuscripts in such a way that a researcher would be able both to read the content and to study the scribe's hand, special markings, etc. A related issue is that of transmitting such images over the network.

45 MBASE Model Integration: LCO Stage
(Diagram: the domain model determines the WinWin taxonomy, the basic concept of operation, frequent risks and architecture options, and stakeholders' primary win conditions; the WinWin negotiation model, IKIWISI prototypes, property models, process framework, and environment models produce WinWin agreements, viable architecture options, an updated concept of operation, life cycle plan elements, and outstanding LCO risks; these are iterated to feasibility and consistency and assembled, with the requirements description and LCO rationale, into the Life Cycle Objectives (LCO) anchor point package, which serves as the table of contents for, and determines the exit criteria of, the LCO stage.)

46 (Screen shot slide)

47 MBASE Laboratory
15 software engineering projects/year
–5-person USC digital library applications
Rapidly developing successful applications
–Multimedia, virtual assistants, data acquisition
Integrating models and tools
–DARPA-EDCS architecture and WinWin tools
–Rational Rose, Unified Modeling Language
Rapidly improving artifact integration
–1996 integrated specs, plans: 160 pages
–1997 integrated specs, plans: 110 pages
Results transitioning to early adopters
Ultimate goal: model-integrated software engineering agents

48 Case Study: CCPDS-R Project Overview (Rational Software Corporation)
–Domain: ground-based C3 development
–Size/language: 1.15M SLOC Ada
–Average number of people: 75
–Schedule: 75 months
–Process/standards: DOD-STD-2167A, iterative development
–Environment: Rational host, DEC host, DEC VMS targets
–Contractor: TRW
–Customer: USAF
–Current status: delivered on-budget, on-schedule

49 CCPDS-R MBASE Models
Success models
–Reinterpreted DOD-STD-2167A; users involved
–Award fee flowdown to performers
Product models
–Domain model and architecture
–Message-passing middleware (UNAS)
Process models
–Ada process model and toolset
–Incremental builds; early delivery
Property models
–COCOMO cost and schedule
–UNAS-based performance modeling
–Extensive progress and quality metrics tools

50 Common Subsystem Macroprocess
(Development life cycle diagram, Rational Software Corporation, months 0-25: a competitive design phase of planning, requirements analysis, and architectural prototypes runs through Inception to contract award; architecture iterations in Elaboration lead through SSR and IPDR to PDR, with the architecture baseline under change control; release iterations in Construction lead through CDR to early delivery of "alpha" capability to the user. The LCO and LCA anchor points are marked at the corresponding milestones.)

51 MBASE, CeBASE, and CMMI
Model-Based (System) Architecting and Software Engineering (MBASE)
–Extension of WinWin spiral model
–Avoids process/product/property/success model clashes
–Provides project-oriented guidelines
Center for Empirically-Based Software Engineering (CeBASE)
–Led by USC, U. Maryland
–Sponsored by NSF, others
–Empirical software data collection and analysis
–Integrates MBASE, Experience Factory, and the Goal-Model-Question-Metric (GMQM) method into the CeBASE Method
–Integrated organization/portfolio/project guidelines
CeBASE Method implements the Integrated Capability Maturity Model (CMMI) and more
–Parts of People CMM, but light on Acquisition CMM

52 Center for Empirically-Based Software Engineering (CeBASE) Strategic Vision
Strategic framework: the Experience Factory strategic process, with Goal-Model-Question-Metric tailoring guidelines, and a tactical process of model integration (MBASE) and the WinWin spiral
Empirical methods: quantitative and qualitative; experimental, ethnographic, and observational analysis; surveys and assessments; parametric models; critical success factors; dynamic models; root cause analysis; Pareto 80-20 relationships
Experience base (context; results): project and context attributes; empirical results and references; implications and recommended practices; experience feedback and comments
Initial foci: COTS-based systems; defect reduction

53 Integrated GMQM-MBASE Experience Factory
(Diagram relating the organization/portfolio level (EF-GMQM) to the project level (MBASE). At the organization level, organizational value propositions and stakeholder values, the current situation, improvement goals and priorities, global scope and results chain, and value/business case models feed an organization/portfolio shared vision and strategic plans, with strategy elements, evaluation criteria/questions, improvement plans, progress metrics, and an experience base; organizational monitoring and control monitors the environment, updates models, implements plans, evaluates progress against goals and models, applies corrective actions, and updates the experience base. At the project level, the analogous elements produce a project shared vision and project plans, the LCO/LCA package (operational concept, prototypes, requirements, architecture, life cycle plan, rationale) and the IOC/transition/support package (design, increment plans, quality plans, transition/support plans), with project monitoring and control tracking progress/plan/goal mismatches: shortfalls, opportunities, and risks. Applies to the organization's and projects' people, processes, and products.)
LCO: Life Cycle Objectives; LCA: Life Cycle Architecture; IOC: Initial Operational Capability
GMQM: Goal-Model-Question-Metric paradigm; MBASE: Model-Based (System) Architecting and Software Engineering

54 COCOMO II Experience Factory: IV
(Feedback-loop diagram: system objectives — functionality, performance, quality — and corporate parameters — tools, processes, reuse — feed COCOMO 2.0 estimates of cost, schedule, and risks; if not OK, rescope; otherwise execute the project to the next milestone, compare milestone results against milestone expectations, and revise milestones, plans, and resources as needed; accumulate COCOMO 2.0 calibration data and recalibrate; when done, evaluate corporate software improvement strategies to produce improved corporate parameters and revised expectations.)

55 CeBASE Method Coverage of CMMI - I
Process Management
–Organizational Process Focus: 100+
–Organizational Process Definition: 100+
–Organizational Training: 100-
–Organizational Process Performance: 100-
–Organizational Innovation and Deployment: 100+
Project Management
–Project Planning: 100
–Project Monitoring and Control: 100+
–Supplier Agreement Management: 50-
–Integrated Project Management: 100-
–Risk Management: 100
–Integrated Teaming: 100
–Quantitative Project Management: 70-

56 CeBASE Method Coverage of CMMI - II
Engineering
–Requirements Management: 100
–Requirements Development: 100
–Technical Solution: 60+
–Product Integration: 70+
–Verification: 70-
–Validation: 80+
Support
–Configuration Management: 70-
–Process and Product Quality Assurance: 70-
–Measurement and Analysis: 100-
–Decision Analysis and Resolution: 100-
–Organizational Environment for Integration: 80-
–Causal Analysis and Resolution: 100

57 Conclusions
Research on several classes of models is synergistic
–COCOMO II, WinWin, spiral extensions, and architecture research each helped the others
MBASE is proving successful in avoiding model clashes
USC-CSE Affiliate support has been essential

58 USC-CSE Affiliates (35)
Commercial industry (18)
–Automobile Club of Southern California, C-Bridge, DaimlerChrysler, EDS, Fidelity Group, Galorath, GroupSystems.com, Hughes, IBM, Lucent, Marotz, Microsoft, Motorola, Price Systems, Rational, Sun, Telcordia, Xerox
Aerospace industry (9)
–Boeing, Draper Labs, GDE Systems, Litton, Lockheed Martin, Northrop Grumman, Raytheon, SAIC, TRW
Government (3)
–FAA, US Army Research Labs, US Army TACOM
FFRDCs and consortia (4)
–Aerospace, JPL, SEI, SPC
International (1)
–Chung-Ang U. (Korea)

59 Further Information
V. Basili, G. Caldiera, and H. Rombach, "The Experience Factory" and "The Goal Question Metric Approach," in J. Marciniak (ed.), Encyclopedia of Software Engineering, Wiley, 1994.
B. Boehm, C. Abts, A.W. Brown, S. Chulani, B. Clark, E. Horowitz, R. Madachy, D. Reifer, and B. Steece, Software Cost Estimation with COCOMO II, Prentice Hall, 2000.
B. Boehm and D. Port, "Escaping the Software Tar Pit: Model Clashes and How to Avoid Them," ACM Software Engineering Notes, January 1999.
B. Boehm et al., "Using the WinWin Spiral Model: A Case Study," IEEE Computer, July 1998, pp. 33-44.
R. van Solingen and E. Berghout, The Goal/Question/Metric Method, McGraw-Hill, 1999.
COCOMO II, MBASE items: http://sunset.usc.edu
CeBASE items: http://www.cebase.org

