
Presentation transcript: "The Future of Software Engineering," Barry Boehm, USC. Boeing presentation, May 10, 2002.

Slide 1: The Future of Software Engineering. Barry Boehm, USC. Boeing presentation, May 10, 2002.

Slide 2: Acknowledgments: USC-CSE Affiliates
- Commercial industry (15): DaimlerChrysler, Freshwater Partners, Galorath, GroupSystems.com, Hughes, IBM, Cost Xpert Group, Microsoft, Motorola, Price Systems, Rational, Reuters Consulting, Sun, Telcordia, Xerox
- Aerospace industry (6): Boeing, Lockheed Martin, Northrop Grumman, Raytheon, SAIC, TRW
- Government (8): DARPA, DISA, FAA, NASA-Ames, NSF, OSD/ARA/SIS, US Army Research Labs, US Army TACOM
- FFRDCs and consortia (4): Aerospace, JPL, SEI, SPC
- International (1): Chung-Ang U. (Korea)

Slide 3: Software Engineering Trends
Traditional | Future
Standalone systems | Everything connected (maybe)
Mostly source code | Mostly COTS components
Requirements-driven | Requirements are emergent
Control over evolution | No control over COTS evolution
Focus on software | Focus on systems and software
Stable requirements | Rapid change
Premium on cost | Premium on value, speed, quality
Staffing workable | Scarcity of critical talent

Slide 4: Large-System Example: Future Combat Systems
[Figure: "From this" (manned C2/infantry squad) "to this" (a network-centric force of distributed platforms): small-unit UAVs, other layered sensors, robotic direct fire, robotic NLOS fire, robotic sensors, and distributed fire mechanisms.]
Theme: exploit battlefield non-linearities, using technology to reduce the size of platforms and the force.

Slide 5: Total Collaborative Effort to Support FCS [figure]

Slide 6: FCS Product/Process Interoperability Challenge [figure]

Slide 7: Small-Systems Example: eCommerce (LA SPIN; copyright 2001 C-bridge)
A 16-24 week fixed-schedule process in four phases (Define, Design, Develop, Deploy), with iterations focused on scope, listening, and delivery, separated by "lines of readiness": are we ready for the next step?
- Define: identify system actors; document business processes; generate use cases; define basic development strategies
- Design: object domain modeling; detailed object design and logical data model; object interactions and system services; polish design, build plan
- Develop: Build 1, Build 2, stabilization build, release to test
- Deploy: beta program, pilot program, production

Slide 8: Future Software/Systems Challenges
- Distribution, mobility, autonomy
- Systems of systems, networks of networks, agents of agents
- Group-oriented user interfaces
- Nonstop adaptation and upgrades
- Synchronizing many independently evolving systems
- Hybrid, dynamic, evolving architectures
- Complex adaptive systems behavior

Slide 9: Complex Adaptive Systems Characteristics
- Order is not imposed; it emerges (Kauffman)
- Viability determined by adaptation rules
– Too inflexible: gridlock
– Too flexible: chaos
- Equilibrium determined by the "fitness landscape"
– A function assigning values to system states
- Adaptation yields better outcomes than optimization
– Software approaches: agile methods
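
The adaptation-vs-optimization point can be made concrete with a toy model. The sketch below is entirely illustrative and not from the talk: the random landscape, the ten-period shift cadence, and all constants are assumptions, chosen only to show why a steady adapter can outperform a one-shot optimizer when the fitness landscape keeps shifting.

```python
# Toy illustration: one-shot optimization vs. incremental adaptation
# on a shifting fitness landscape. All parameters are invented.
import random

random.seed(1)
N = 12  # bits per solution; the landscape values each of 2**N states

def make_landscape():
    return {i: random.random() for i in range(2 ** N)}

def neighbors(state):
    return [state ^ (1 << b) for b in range(N)]  # flip one bit

landscape = make_landscape()
start = random.randrange(2 ** N)

# "Optimizer": picks the best state it can see once, then never moves.
optimizer = max([start] + neighbors(start), key=landscape.get)

# "Adapter": keeps hill-climbing one step per period.
adapter = start
opt_total = adapt_total = 0.0
for period in range(50):
    if period % 10 == 0 and period > 0:  # the world changes
        landscape = make_landscape()
    adapter = max([adapter] + neighbors(adapter), key=landscape.get)
    opt_total += landscape[optimizer]
    adapt_total += landscape[adapter]

print(f"one-shot optimizer accumulated value: {opt_total:.1f}")
print(f"incremental adapter accumulated value: {adapt_total:.1f}")
```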

Slide 10: Getting the Best from Agile Methods
- Representative agile methods
- The Agile Manifesto
- The planning spectrum
– Relations to the Software CMM and CMMI
- Agile and plan-driven home grounds
- How much planning is enough?
– Risk-driven hybrid approaches

Slide 11: Leading Agile Methods
- Adaptive Software Development (ASD)
- Agile Modeling
- Crystal methods
- Dynamic Systems Development Method (DSDM)
- eXtreme Programming (XP)
- Feature-Driven Development
- Lean Development
- Scrum

Slide 12: XP: The 12 Practices (used generatively, not imperatively)
- The Planning Game
- Small Releases
- Metaphor
- Simple Design
- Testing
- Refactoring
- Pair Programming
- Collective Ownership
- Continuous Integration
- 40-hour Week
- On-site Customer
- Coding Standards

Slide 13: The Agile Manifesto
"We are uncovering better ways of developing software by doing it and helping others do it. Through this work we have come to value:
- Individuals and interactions over processes and tools
- Working software over comprehensive documentation
- Customer collaboration over contract negotiation
- Responding to change over following a plan
That is, while there is value in the items on the right, we value the items on the left more."

Slide 14: Counterpoint: A Skeptical View (Steven Rakitin, letter to Computer, Dec. 2000)
- "Customer collaboration over contract negotiation." Translation: Let's not spend time haggling over the details; it only interferes with our ability to spend all our time coding. We'll work out the kinks once we deliver something...
- "Responding to change over following a plan." Translation: Following a plan implies we would have to spend time thinking about the problem and how we might actually solve it. Why would we want to do that when we could be coding?

Slide 15: The Planning Spectrum
Hackers -> XP -> Adaptive SW Devel. -> milestone risk-driven models -> ... -> milestone plan-driven models -> inch-pebble ironbound contract (micro-milestones)
Agile methods span the XP-to-risk-driven region; the Software CMM sits toward the plan-driven end; CMMI spans both the risk-driven and plan-driven regions.

Slide 16: Agile and Plan-Driven Home Grounds
Agile Home Ground | Plan-Driven Home Ground
Agile, knowledgeable, collocated, collaborative developers | Plan-oriented developers; mix of skills
Above plus representative, empowered customers | Mix of customer capability levels
Reliance on tacit interpersonal knowledge | Reliance on explicit documented knowledge
Largely emergent requirements, rapid change | Requirements knowable early; largely stable
Architected for current requirements | Architected for current and foreseeable requirements
Refactoring inexpensive | Refactoring expensive
Smaller teams, products | Larger teams, products
Premium on rapid value | Premium on high assurance
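
To make the table operational, here is a minimal sketch of a home-ground checklist. It is my construction, not a published Boehm instrument: the attribute names and the one-point-per-attribute scoring are both assumptions.

```python
# Crude home-ground checklist: each attribute answered True adds a point
# toward the side of the table it belongs to.
AGILE, PLAN = "agile", "plan-driven"

QUESTIONS = {
    "requirements_emergent_and_fast_changing": AGILE,
    "requirements_knowable_early_and_stable": PLAN,
    "refactoring_inexpensive": AGILE,
    "large_team_and_product": PLAN,
    "reliance_on_tacit_interpersonal_knowledge": AGILE,
    "premium_on_high_assurance": PLAN,
    "collocated_collaborative_developers": AGILE,
    "empowered_onsite_customers": AGILE,
}

def home_ground(answers):
    """answers: attribute name -> True/False. Returns leaning and tallies."""
    tally = {AGILE: 0, PLAN: 0}
    for attr, side in QUESTIONS.items():
        if answers.get(attr):
            tally[side] += 1
    return max(tally, key=tally.get), tally

lean, tally = home_ground({
    "requirements_emergent_and_fast_changing": True,
    "refactoring_inexpensive": True,
    "premium_on_high_assurance": True,
})
print(lean, tally)  # agile {'agile': 2, 'plan-driven': 1}
```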

Slide 17: How Much Planning Is Enough? A Risk Analysis Approach
- Risk exposure: RE = Prob(Loss) × Size(Loss)
– "Loss": financial; reputation; future prospects; ...
- For multiple sources of loss: RE = Σ_sources [Prob(Loss) × Size(Loss)]_source

Slide 18: Example RE Profile: Planning Detail (loss due to inadequate plans)
[Graph: RE = P(L) × S(L) vs. time and effort invested in plans]
- Minimal plans: high P(L) (inadequate plans), high S(L) (major problems: oversights, delays, rework)
- Thorough plans: low P(L), low S(L) (minor problems)

Slide 19: Example RE Profile: Planning Detail (adding loss due to market-share erosion)
[Graph: same axes, with a second curve rising as planning investment grows]
- Little planning: low P(L) (few plan delays), low S(L) (early value capture)
- Heavy planning: high P(L) (plan breakage, delay), high S(L) (value-capture delays)
- The inadequate-plans curve is unchanged: high P(L)/S(L) with minimal plans, low with thorough plans

Slide 20: Example RE Profile: Planning Detail (sum of risk exposures)
[Graph: the two RE curves and their sum; the sum has a minimum, the "sweet spot"]
- The total RE curve has a sweet spot: enough planning to avoid major oversights, delays, and rework, but not so much that value capture is delayed
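
Slides 18-20 can be captured numerically. In this sketch the two loss sources are given illustrative curve shapes (an exponentially falling plan-inadequacy term and a linearly rising market-erosion term); both functions and all constants are invented for illustration, and the sweet spot is simply the minimum of their sum.

```python
# Hedged numeric sketch of the "how much planning is enough?" analysis.
def re_inadequate_plans(t):   # P(L)*S(L) falls as plans get more thorough
    return 100 * (0.8 ** t)

def re_market_erosion(t):     # P(L)*S(L) rises as time-to-market slips
    return 4 * t

def total_re(t):
    return re_inadequate_plans(t) + re_market_erosion(t)

candidates = range(0, 21)     # weeks invested in plans
sweet_spot = min(candidates, key=total_re)
print(f"sweet spot: {sweet_spot} weeks, RE = {total_re(sweet_spot):.1f}")
# Agile home ground (easy rework, so smaller S(L) for plan inadequacy)
# shifts the sweet spot left; plan-driven home ground shifts it right.
```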

Slide 21: Comparative RE Profile: Plan-Driven Home Ground
[Graph: higher S(L) from large-system rework shifts the plan-driven sweet spot to the right of the mainstream sweet spot, toward more planning]

Slide 22: Comparative RE Profile: Agile Home Ground
[Graph: lower S(L) from easy rework shifts the agile sweet spot to the left of the mainstream sweet spot, toward less planning]

Slide 23: Conclusions: CMMI and Agile Methods
- Agile and plan-driven methods have best-fit home grounds
– The increasing pace of change requires more agility
- Risk considerations help balance agility and planning
– Risk-driven "how much planning is enough?"
- Risk-driven agile/plan-driven hybrid methods are available
– Adaptive Software Development, RUP, MBASE, the CeBASE Method
– Explored at the USC-CSE Affiliates' Executive Workshop, March 12-14
- CMMI provides enabling criteria for hybrid methods
– Risk Management, Integrated Teaming

Slide 24: Software Engineering Trends (recap of the Slide 3 table)

Slide 25: Getting the Best from COTS
- COTS software: a behavioral definition
- COTS empirical hypotheses: top-10 list
– COTS best-practice implications
- COTS-based systems (CBS) challenges

Slide 26: COTS Software: A Behavioral Definition
- Commercially sold and supported
– Avoids expensive development and support
- No access to source code
– Special case: commercial open-source
- Vendor controls evolution
– Vendor motivation to add, evolve features
- No vendor guarantees
– Of dependability, interoperability, continuity, ...

Slide 27: COTS Top-10 Empirical Hypotheses, I (Basili & Boehm, Computer, May 2001, pp. 91-93)
1. Over 99% of all executing computer instructions come from COTS
– Each has passed a market test for value
2. More than half of large-COTS-product features go unused
3. New COTS release frequency averages 8-9 months
– Average of 3 releases before a release becomes unsupported
4. CBS life-cycle effort can scale as high as N²
– N = number of independent COTS products in a system
5. CBS post-deployment costs exceed development costs
– Except for short-lifetime systems

Slide 28: Usual Hardware-Software Trend Comparison
[Graph: log N vs. time; HW = new transistors in service per year, SW = new source lines of code per year]
Different counting rules. Try counting software as Lines of Code in Service: LOCS = Σ (#platforms) × (#object LOCS per platform).

Slide 29: Lines of Code in Service: U.S. Dept. of Defense [figure]

Slide 30: DoD LOCS: % COTS, 2000 (M = million, T = trillion)
Platform | # Platforms (M) | LOCS/Platform (M) | LOCS (T) | % COTS | Non-COTS LOCS (T)
Mainframe | 0.015 | 200 | 3 | 95 | 0.15
Mini | 0.10 | 100 | 10 | 99 | 0.10
Micro | 2 | 100 | 200 | 99.5 | 1.00
Combat | 2 | 2 | 4 | 80 | 0.80
Total | | | 217 | | 2.05
COTS is often an economic necessity: custom code is under 1% of the total.
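
The table's arithmetic follows directly from the LOCS definition on Slide 28; this snippet just recomputes the totals from the platform figures.

```python
# Recompute the slide-30 totals: LOCS = sum of (#platforms) * (LOCS/platform).
platforms = {  # name: (platforms in millions, object LOCS/platform in millions)
    "Mainframe": (0.015, 200),
    "Mini":      (0.10,  100),
    "Micro":     (2,     100),
    "Combat":    (2,     2),
}
cots_pct = {"Mainframe": 95, "Mini": 99, "Micro": 99.5, "Combat": 80}

total = sum(n * locs for n, locs in platforms.values())          # trillions
non_cots = sum(n * locs * (1 - cots_pct[p] / 100)
               for p, (n, locs) in platforms.items())
print(f"total LOCS: {total:g}T, non-COTS: {non_cots:.2f}T")
# -> total LOCS: 217T, non-COTS: 2.05T (under 1% custom code)
```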

Slide 31: Use of COTS Features: Microsoft Word and PowerPoint (K. Torii Lab, Japan: ISERN 2000 workshop)
- Individuals use 12-16% of features; a 10-person group uses 26-29%
- Extra features cause extra work
- Consider build vs. buy for a small number of features

Slide 32: COTS Release Frequency: Survey Data (Ron Kohl/GSAW surveys, 1999-2002; Ground System Architecture Workshops, Aerospace Corp.)
GSAW Survey | Release Frequency (months)
1999 | 6.3
2000 | 8.5
2001 | 8.75
2002 | 9.6
- Adaptive maintenance is often the biggest CBS life-cycle cost
- Average of 3 releases before a release becomes unsupported
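
A back-of-envelope consequence of these figures, using the 2002 release interval and the three-release support window; the 10-year system life and the 20-product COTS count are assumed purely for illustration.

```python
# Refresh arithmetic implied by hypothesis 3 and the GSAW survey averages.
release_interval_months = 9      # 2002 survey figure, rounded
supported_releases = 3           # average releases before unsupported
system_life_years = 10           # assumed system lifetime
n_cots = 20                      # assumed COTS count for a large CBS

months_until_unsupported = release_interval_months * supported_releases
refreshes_per_product = (system_life_years * 12) // months_until_unsupported
print(f"a release stays supported ~{months_until_unsupported} months")
print(f"~{refreshes_per_product} forced refreshes per product over "
      f"{system_life_years} years; ~{refreshes_per_product * n_cots} "
      f"refresh events across {n_cots} COTS products")
```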

Slide 33: CBS Effort Scaling As High As N²
- Weak COTS coupling (each COTS product interfaces only with the custom software): effort ~ N (here, 3)
- Strong COTS coupling (COTS products also interface with each other): effort ~ N + N(N-1)/2 = N(N+1)/2 (here, 6)
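
The interface counts behind hypothesis 4, computed directly from the slide's two coupling patterns:

```python
# Weak coupling: each COTS product touches only the custom software.
# Strong coupling: every pair of components can interact, giving
# N + N*(N-1)/2 = N*(N+1)/2 integration points.
def weak_coupling_interfaces(n):
    return n

def strong_coupling_interfaces(n):
    return n * (n + 1) // 2

for n in (3, 6, 10):
    print(n, weak_coupling_interfaces(n), strong_coupling_interfaces(n))
# n=3 reproduces the slide: 3 interfaces weakly coupled, 6 strongly coupled.
```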

Slide 34: CBS Effort Scaling, II
[Graph: relative COTS adaptation effort vs. number of COTS products (2-10), for strong vs. weak COTS coupling]
Reduce the number of COTS products or weaken COTS coupling: COTS choices, wrappers, domain architectures, open standards, COTS refresh synchronization.

Slide 35: COTS Top-10 Empirical Hypotheses, II
6. Less than half of CBS development effort comes from glue code
– But glue code costs 3x more per instruction
7. Non-development CBS costs are significant
– Worth addressing, but not over-addressing
8. COTS assessment and tailoring costs vary by COTS product class
9. Personnel factors are the leading CBS effort drivers
– Different skill factors are needed for CBS and custom software
10. CBS project failure rates are higher than for custom software
– But CBS benefit rates are higher also

Slide 36: COCOTS Effort Distribution: 20 Projects
[Bar chart: mean % of total COTS effort by activity, ±1 SD]
- Assessment: mean 20.75% (±28.3)
- Tailoring: mean 21.76% (±29.2)
- Glue code: mean 31.06% (±30.2)
- System volatility: mean 11.31% (±9.0)
Glue code is generally less than half of the total; there is no standard CBS effort distribution profile.

Slide 37: Personnel Factors Are Leading CBS Effort Drivers: Different Skill Factors Needed
Custom Development | CBS Development
Rqts./implementation assessment | Rqts./COTS assessment
Algorithms; data & control structures | COTS mismatches; connector structures
Code reading and writing | COTS mismatches; assessment, tailoring, glue-code development; coding avoidance
Adaptation to rqts. changes | Adaptation to rqts. and COTS changes
White-box/black-box testing | Black-box testing
Rethink recruiting and evaluation criteria.

Slide 38: CBS Projects Tend to Fail Hard
Major sources of CBS project failure (and major topics for COTS risk assessment):
- CBS skill mismatches
- CBS inexperience
- CBS optimism
- Weak life-cycle planning
- CBS product mismatches
- Hasty COTS assessment
- Changing vendor priorities
- New COTS market entries

Slide 39: CBS Challenges
- Process specifics
– Milestones; role of "requirements"
– Multi-COTS refresh frequency and process
- Life-cycle management
– Progress metrics; development/support roles
- Cost-effective COTS assessment and test aids
– COTS mismatch detection and redress
- Increasing CBS controllability
– Technology watch; wrappers; vendor win-win
- Better empirical data and organized experience

Slide 40: Software Engineering Trends (recap of the Slide 3 table)

Slide 41: Integrating Software and Systems Engineering
- The software "separation of concerns" legacy
– Erosion of the underlying modernist philosophy
– Resulting project social structure
- Responsibility for system definition
- The CMMI software paradigm

Slide 42: The "Separation of Concerns" Legacy
- "The notion of 'user' cannot be precisely defined, and therefore has no place in CS or SE." (Edsger Dijkstra, ICSE 4, 1979)
- "Analysis and allocation of the system requirements is not the responsibility of the SE group but is a prerequisite for their work." (Mark Paulk et al., SEI Software CMM v1.1, 1993)

Slide 43: Cosmopolis: The Erosion of Modernist Philosophy
- Dominant since the 17th century
- Formal, reductionist
– Apply the laws of the cosmos to the human polis
- Focus on written vs. oral, universal vs. particular, general vs. local, timeless vs. timely
– One-size-fits-all solutions
- Strong influence on the focus of computer science
- Weak in dealing with human behavior and rapid change

Slide 44: Resulting Project Social Structure [figure]

Slide 45: Responsibility for System Definition
- Every success-critical stakeholder has knowledge of factors that can thwart a project's success
– Software developers, users, customers, ...
- It is irresponsible to underparticipate in system definition
– Govt./TRW large system: 1-second response time
- It is irresponsible to overparticipate in system definition
– Scientific American: focus on the part with a software solution

Slide 46: The CMMI Software Paradigm
- System and software engineering are integrated
– Software has a seat at the center table
- Requirements, architecture, and process are developed concurrently
– Along with prototypes and key capabilities
- Development is done by integrated teams
– Collaborative vs. adversarial process
– Based on a shared vision and negotiated stakeholder concurrence

Slide 47: Systems Engineering Cost Model (COSYSMO). Barry Boehm, Donald Reifer, and Ricardo Valerdi.

Slide 48: Development Approach
- Use the USC seven-step model-building process
- Keep things simple
– Start with the Inception phase, when only large-grain information tends to be available
– Use EIA 632 to bound the tasks that are part of the effort
– Focus on software-intensive subsystems first (see the next chart), then extend to total systems
- Develop the model in three steps
– Step 1: limit the effort to software-intensive systems
– Step 2: tackle the remaining phases of the life cycle
– Step 3: extend the work to a broad range of systems
- Build on previous work

Slide 49: The Form of the Model (COCOMO II-based; WBS guided by EIA 632)
[Diagram: size drivers (# requirements, # interfaces, # scenarios, # algorithms, plus a volatility factor) and effort multipliers (8 application factors, 8 team factors, and a schedule driver) feed COSYSMO, which produces effort and duration estimates; the model is calibrated to data.]

Slide 50: Step 2: Size the Effort Using a Function-Point Analogy
- Weightings reflect the relative work of each factor, using an average requirement as the basis
- To compute size, develop estimates for the drivers and then sum the columns (a sketch follows)
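
A hedged sketch of how Slides 49-50 fit together: size is a weighted sum of driver counts (easy/nominal/difficult, relative to a nominal requirement), and effort takes the COCOMO-style form named on Slide 49. Every weight and constant below is a placeholder, not a COSYSMO calibration.

```python
# Illustrative function-point-style sizing plus COCOMO-form effort.
WEIGHTS = {  # driver: (easy, nominal, difficult); nominal requirement = 1.0
    "requirements": (0.5, 1.0, 3.0),
    "interfaces":   (1.0, 2.0, 5.0),
    "scenarios":    (2.0, 4.0, 8.0),
    "algorithms":   (1.5, 3.0, 6.0),
}

def size(counts):
    """counts: driver -> (n_easy, n_nominal, n_difficult)."""
    return sum(sum(n * w for n, w in zip(counts[d], WEIGHTS[d]))
               for d in counts)

def effort_pm(counts, effort_multipliers, A=0.5, B=1.1):
    em = 1.0
    for m in effort_multipliers:   # the 8 application + 8 team factors
        em *= m
    return A * size(counts) ** B * em

counts = {"requirements": (20, 40, 5), "interfaces": (4, 6, 2),
          "scenarios": (0, 5, 1), "algorithms": (2, 3, 0)}
print(f"size = {size(counts):.0f}, "
      f"effort = {effort_pm(counts, [1.0, 1.2, 0.9]):.0f} person-months")
```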

Slide 51: Basic Effort Multipliers, I
Detailed rating conventions will be developed for these multipliers.
Application factors (8):
– Requirements understanding
– Architecture understanding
– Level of service requirements: criticality, difficulty
– Legacy transition complexity
– COTS assessment complexity
– Platform difficulty
– Required business process reengineering
– Technology maturity

Slide 52: Basic Effort Multipliers, II
Detailed rating conventions will be developed for these multipliers.
Team factors (8):
– Number and diversity of stakeholder communities
– Stakeholder team cohesion
– Personnel capability
– Personnel experience/continuity
– Process maturity
– Multi-site coordination
– Formality of deliverables
– Tool support

Slide 53: Next Steps
- Data collection: gather effort and duration data from actual systems engineering efforts
- Model calibration: calibrate the model first using expert opinion, then refine as actual data is collected (see the sketch after this list)
- Model validation: validate that the model generates answers reflecting a mix of expert opinion and real-world data, moving toward the actual data as it becomes available
- Model publication: publish the model periodically as new versions become available (at least biannually)
- Model extension and refinement: extend the model to increase its scope; refine it to increase its predictive accuracy
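
One concrete way to do the calibration step: fit the constants A and B of effort = A × size^B by ordinary least squares in log space. The four (size, effort) data points below are invented purely to show the mechanics.

```python
# Log-space least-squares calibration of effort = A * size^B.
import math

projects = [(80, 40), (150, 90), (300, 200), (600, 450)]  # (size, PM)

xs = [math.log(s) for s, _ in projects]
ys = [math.log(e) for _, e in projects]
n = len(projects)
mean_x, mean_y = sum(xs) / n, sum(ys) / n
B = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
    / sum((x - mean_x) ** 2 for x in xs)
A = math.exp(mean_y - B * mean_x)
print(f"calibrated: effort = {A:.2f} * size^{B:.2f}")
```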

Slide 54: Software Engineering Trends (recap of the Slide 3 table)

Slide 55: Value-Based Software Engineering (VBSE)
- Essential to the integration of software and systems engineering
– Software increasingly determines system content
– Software architecture determines system adaptability
- 7 key elements of VBSE
- Implementing VBSE
– CeBASE Method and method patterns
– Experience factory, GQM, and MBASE elements
– A means of achieving CMMI Level 5 benefits

Slide 56: 7 Key Elements of VBSE
1. Benefits Realization Analysis
2. Stakeholders' Value Proposition Elicitation and Reconciliation
3. Business Case Analysis
4. Continuous Risk and Opportunity Management
5. Concurrent System and Software Engineering
6. Value-Based Monitoring and Control
7. Change as Opportunity

Slide 57: The Information Paradox (Thorp)
- No correlation between companies' IT investments and their market performance
- "Field of Dreams": build the (field; software) and the great (players; profits) will come
- Need to integrate software and systems initiatives

Slide 58: DMR/BRA Results Chain (DMR Consulting Group's Benefits Realization Approach)
[Diagram:] The initiative "Implement a new order entry system" contributes to "Reduce time to process order," yielding the intermediate outcome "Reduced order processing cycle," which, together with "Reduce time to deliver product," leads to the outcome "Increased sales," under the assumption that order-to-delivery time is an important buying criterion.
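
For readers who want to manipulate results chains programmatically, here is a minimal encoding of this chain. It is my sketch; DMR's BRA prescribes no code form that I know of.

```python
# A tiny results-chain structure: initiatives, outcomes, and assumptions
# linked by "contributes to" edges, with a walker that prints the chain.
from dataclasses import dataclass, field

@dataclass
class Node:
    label: str
    kind: str                      # "initiative", "outcome", "assumption"
    contributes_to: list = field(default_factory=list)

order_entry = Node("Implement a new order entry system", "initiative")
faster_processing = Node("Reduced order processing cycle", "outcome")
more_sales = Node("Increased sales", "outcome")
buying_criterion = Node(
    "Order-to-delivery time is an important buying criterion", "assumption")

order_entry.contributes_to.append(faster_processing)
faster_processing.contributes_to.append(more_sales)
buying_criterion.contributes_to.append(more_sales)  # outcome rests on it

def downstream(node, depth=0):
    print("  " * depth + f"[{node.kind}] {node.label}")
    for nxt in node.contributes_to:
        downstream(nxt, depth + 1)

downstream(order_entry)
```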

Slide 59: The Model-Clash Spider Web: MasterNet [figure]

Slide 60: EasyWinWin OnLine Negotiation Steps [figure]

Slide 61 [EasyWinWin consensus screen]: Red cells indicate lack of consensus. Oral discussion of the cell graph reveals unshared information, unnoticed assumptions, hidden issues, constraints, etc.

Slide 62: Example of Business Case Analysis
ROI = (Benefits - Costs) / Costs
[Graph: return on investment vs. time for Option A, Option B, and Option B-Rapid]
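
Applying the formula to two hypothetical options (all numbers invented) reproduces the shape of the slide's chart: the rapid option's ROI leads early and is overtaken later.

```python
# The slide's ROI formula applied to two made-up investment options.
def roi(benefits, costs):
    return (benefits - costs) / costs

option_a = {"costs": 100, "benefits_by_year": [0, 60, 180, 320]}
option_b_rapid = {"costs": 60, "benefits_by_year": [40, 90, 140, 190]}

for name, opt in [("A", option_a), ("B-Rapid", option_b_rapid)]:
    series = [round(roi(b, opt["costs"]), 2) for b in opt["benefits_by_year"]]
    print(f"Option {name}: yearly ROI {series}")
# B-Rapid dominates in the early years; A's ROI catches up by year 4.
```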

Slide 63: Architecture Choices
[Figure: feature sets F1-F12 mapped onto architectures ranging from rigid, to flexible (intelligent router), to hyper-flexible]

Slide 64: Architecture Flexibility Determination [figure]

Slide 65: Value-Based Software Engineering (VBSE) (recap of Slide 55)

Slide 66: Experience Factory Framework, III
[Flow diagram connecting organizational improvement to project execution:]
- Org. shared vision & improvement strategy: org. improvement goals (with goal-related questions and metrics) and org. improvement strategies (goal-achievement models)
- Org. improvement initiative planning & control: initiative plans (initiative-related questions, metrics); initiative monitoring and control; experience-base analysis
- Experience base: analyzed experience and updated models flow back as achievables and opportunities for the improvement goals
- Project shared vision and strategy, and project planning and control, exchange planning context, models and data, org. goals, project experience, and progress/plan/goal mismatches with the organizational level

Slide 67: VBSE Experience Factory Example
- Shared vision: increased market share and profit via rapid development
- Improvement goal: reduce system development time 50%
- Improvement strategy: reduce all task durations 50%
- Pilot project: requirements, design, and code time reduced 50%; test delayed
– Analysis: test preparation was insufficient, and it was not on the critical path
- Experience base: OK to increase effort on non-critical-path activities if it decreases time on critical-path activities

Slide 68: CeBASE Method Strategic Framework
[Flow diagram spanning two levels, applied to the organization's and projects' people, processes, and products:]
- Organization/portfolio level (Experience Factory, GQM): org. value propositions (stakeholder values), current situation w.r.t. them, improvement goals and priorities, global scope and results chain, and value/business-case models feed an org-portfolio shared vision (strategy elements; evaluation criteria/questions; improvement plans with progress metrics and experience base), which feeds org. strategic plans and org. monitoring & control (monitor the environment and update models; implement plans; evaluate progress w.r.t. goals and models; determine and apply corrective actions; update the experience base)
- Project level (MBASE): project value propositions, current situation, improvement goals and priorities, project scope and results chain, and value/business-case models feed a project shared vision, project plans, and project monitoring & control (the same control loop, evaluated w.r.t. goals, models, and plans)
- The levels exchange planning and scoping context, initiatives, plan/goal mismatches, and progress/plan/goal mismatches (shortfalls, opportunities, risks), along with project experience and progress w.r.t. plans and goals
- Key artifacts: LCO/LCA package (ops concept, prototypes, rqts, architecture, life-cycle plan, rationale); IOC/transition/support package (design, increment plans, quality plans, T/S plans)
Acronyms: LCO = Life Cycle Objectives; LCA = Life Cycle Architecture; IOC = Initial Operational Capability; GMQM = Goal-Model-Question-Metric paradigm; MBASE = Model-Based (System) Architecting and Software Engineering

Slide 69: CeBASE Method Coverage of CMMI, I
Process Management:
– Organizational Process Focus: 100+
– Organizational Process Definition: 100+
– Organizational Training: 100-
– Organizational Process Performance: 100-
– Organizational Innovation and Deployment: 100+
Project Management:
– Project Planning: 100
– Project Monitoring and Control: 100+
– Supplier Agreement Management: 50-
– Integrated Project Management: 100-
– Risk Management: 100
– Integrated Teaming: 100
– Quantitative Project Management: 70-

Slide 70: CeBASE Method Coverage of CMMI, II
Engineering:
– Requirements Management: 100
– Requirements Development: 100
– Technical Solution: 60+
– Product Integration: 70+
– Verification: 70-
– Validation: 80+
Support:
– Configuration Management: 70-
– Process and Product Quality Assurance: 70-
– Measurement and Analysis: 100-
– Decision Analysis and Resolution: 100-
– Organizational Environment for Integration: 80-
– Causal Analysis and Resolution: 100

Slide 71: CeBASE Method Simplified via Method Patterns
- Schedule as Independent Variable (SAIV)
– Also applies for cost and cost/schedule/quality
- Opportunity trees
– Cost, schedule, defect reduction
- Agile COCOMO II
– Analogy-based estimate adjuster
- COTS integration method patterns
- Process model decision table

Slide 72: The SAIV Process Model
1. Shared vision and expectations management
2. Feature prioritization
3. Schedule range estimation
4. Architecture and core capabilities determination
5. Incremental development
6. Change and progress monitoring and control

Slide 73: Shared Vision, Expectations Management, and Feature Prioritization
- Use the stakeholder win-win approach
- Developer win condition: don't overrun the fixed 24-week schedule
- Clients' win conditions: 56 weeks' worth of features
- Win-win negotiation
– Which features are most critical?
– COCOMO II: how many features can be built within a 24-week schedule? (see the sketch below)
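
A sketch of how the COCOMO II question might be answered, using the published COCOMO II.2000 effort and schedule forms. Treating all scale factors and effort multipliers as nominal (so E = 1.10 and unit multipliers) is an assumption, and SCED-driven schedule compression is ignored.

```python
# COCOMO II.2000 forms: PM = 2.94 * KSLOC^E,
# TDEV = 3.67 * PM^(0.28 + 0.2*(E - 0.91)) months.
E = 1.10  # scale exponent at roughly nominal scale-factor ratings

def person_months(ksloc):
    return 2.94 * ksloc ** E           # nominal effort, unit multipliers

def tdev_months(ksloc):
    f = 0.28 + 0.2 * (E - 0.91)        # schedule exponent
    return 3.67 * person_months(ksloc) ** f

budget_months = 24 / 4.33              # the fixed 24-week schedule
for ksloc in (1.0, 1.5, 2.0, 3.0):
    verdict = "fits" if tdev_months(ksloc) <= budget_months else "over"
    print(f"{ksloc:.1f} KSLOC -> {tdev_months(ksloc):.1f} months ({verdict})")
# The negotiation then becomes: which top-priority features fit within
# the size the 24-week schedule can accommodate?
```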

Slide 74: COCOMO II Estimate Ranges [figure]

Slide 75: Core Capability, Incremental Development, and Coping with Rapid Change
- The core capability is not just the top-priority features
– A useful end-to-end capability
– Architected for ease of adding or dropping marginal features
- Worst case: deliver the core capability in 24 weeks, with some extra effort
- Most likely case: finish the core capability in 16-20 weeks
– Add next-priority features
- Cope with change by monitoring progress
– Renegotiate plans as appropriate

Slide 76: SAIV Experience I: USC Digital Library Projects
- Life Cycle Architecture package in a fixed 12 weeks
– Compatible operational concept, prototypes, requirements, architecture, plans, feasibility rationale
- Initial Operational Capability in 12 weeks
– Including a 2-week cold-turkey transition
- Successful on 24 of 26 projects
– Failure 1: too-ambitious core capability (covering 3 image repositories at once)
– Failure 2: team disbanded (graduation and summer-job pressures)

Slide 77: RAD Opportunity Tree
- Eliminating tasks: business process reengineering; reusing assets; applications generation; schedule as independent variable (SAIV)
- Reducing time per task: tools and automation; work streamlining (80-20); increasing parallelism
- Reducing risks of single-point failures: reducing failures; reducing their effects
- Reducing backtracking: early error elimination; process anchor points; improving process maturity
- Activity network streamlining: minimizing task dependencies; avoiding high fan-in, fan-out; reducing task variance; removing tasks from the critical path
- Increasing effective workweek: collaboration technology; 24x7 development; nightly builds and testing; weekend warriors
- Better people and incentives
- Transition to a learning organization

Slide 78: Software Engineering Trends (recap of the Slide 3 table)

Slide 79: Critical Success Factors for Software Careers
- Understanding and dealing with sources of software value
– And associated stakeholders and application domains
- Software/systems architecting and engineering
– Creating and/or integrating software components
– Using risk to balance disciplined and agile processes
- Ability to continuously learn and adapt

Slide 80: Critical Success Factors for Software Careers: USC Software Engineering Certificate Courses
- Understanding and dealing with sources of software value
– And associated stakeholders and application domains
– CS 510: Software Management and Economics
- Software/systems architecting and engineering
– Creating and/or integrating software components
– Using risk to balance disciplined and agile processes
– CS 577ab: Software Engineering Principles and Practice
– CS 578: Software Architecture
- Ability to continuously learn and adapt
– CS 592: Emerging Best Practices in Software Engineering

Slide 81: USC SWE Certificate Program and MS Programs Provide Key Software Talent Strategy Enablers
- Infusion of the latest SWE knowledge and trends
- Tailorable framework of best practices
- Continuing education for existing staff
– Including education in doing their own lifelong learning
- Package of career-enhancing perks for new recruits
– Career reward for earning the Certificate
– Option to continue for an MS degree
– Many CS BA grads want both income and an advanced degree
- Available via the USC Distance Education Network

Slide 82: Conclusions
- World of the future: complex adaptive systems
– Distribution, mobility, rapid change
- Need adaptive vs. optimized processes
– Don't be a process dinosaur
– Or a change-resistant change agent
- Marketplace trends favor value/risk-based processes
– Software a/the major source of product value
– Software the primary enabler of system adaptability
- Individuals and organizations need proactive software talent strategies
– Consider the USC SW Engr. Certificate and MS programs

Slide 83: References
- D. Ahern, A. Clouse, and R. Turner, CMMI Distilled, Addison-Wesley, 2001.
- C. Baldwin and K. Clark, Design Rules: The Power of Modularity, MIT Press, 1999.
- B. Boehm, D. Port, A. Jain, and V. Basili, "Achieving CMMI Level 5 with MBASE and the CeBASE Method," CrossTalk, May 2002.
- B. Boehm and K. Sullivan, "Software Economics: A Roadmap," The Future of Software Engineering, A. Finkelstein (ed.), ACM Press, 2000.
- R. Kaplan and D. Norton, The Balanced Scorecard: Translating Strategy into Action, Harvard Business School Press, 1996.
- D. Reifer, Making the Software Business Case, Addison-Wesley, 2002.
- K. Sullivan, Y. Cai, B. Hallen, and W. Griswold, "The Structure and Value of Modularity in Software Design," Proceedings, ESEC/FSE 2001, ACM Press, pp. 99-108.
- J. Thorp and DMR, The Information Paradox, McGraw-Hill, 1998.
- Software Engineering Certificate website: sunset.usc.edu/SWE-Cert
- CeBASE website: www.cebase.org
- CMMI website: www.sei.cmu.edu/cmmi
- MBASE website: sunset.usc.edu/research/MBASE

