
1 University of Southern California Center for Software Engineering (USC-CSE), 02-06-01, ©USC-CSE
Overview: USC Annual Research Review
Barry Boehm, USC-CSE, February 6, 2001

2 Outline
- USC-CSE Highlights, 2000
- USC-CSE Affiliates and Calendar
- Example Recent Activity: NASA/CMU High Dependability Computing Consortium

3 USC-CSE Highlights, 2000
- New Affiliates: Auto Club of So. Calif., Galorath, GroupSystems.com, IBM, JPL, Marotz
- New Ph.D.'s: Alex Egyed (Teknowledge), Jongmoon Baik (Motorola)
- Boehm: honorary Sc.D. (UMass), INCOSE Fellow, IEEE Mills Award
- COCOMO II book and CD
- Commercially-based EasyWinWin (GroupSystems.com)
- NSF-ITR CeBASE grant with UMaryland, UNebraska, Mississippi State U.

4 USC-CSE Affiliates (33)
- Commercial Industry (16): Automobile Club of Southern California, C-Bridge, EDS, Fidelity Group, Galorath, GroupSystems.com, Hughes, IBM, Lucent, Marotz, Microsoft, Motorola, Rational, Sun, Telcordia, Xerox
- Aerospace Industry (9): Boeing, Draper Labs, GDE Systems, Litton, Lockheed Martin, Northrop Grumman, Raytheon, SAIC, TRW
- Government (3): FAA, US Army Research Labs, US Army TACOM
- FFRDCs and Consortia (4): Aerospace, JPL, SEI, SPC
- International (1): Chung-Ang U. (Korea)

5 USC-CSE Affiliates' Calendar
- June 22, 2000: Easy WinWin Web Seminar
- July 25-26, 2000: Easy WinWin Hands-on Tutorial
- July 27, 2000: Tutorial: Transitioning to the CMMI via MBASE
- August 24-25, 2000: Software Engineering Internship Workshop
- September 13-15, 2000: Workshop: Spiral Development in the DoD (Washington DC; with SEI)
- October 24-27, 2000: COCOMO/Software Cost Modeling Forum and Workshop
- February 6-9, 2001: Annual Research Review, COTS-Based Systems Workshop (with SEI, CeBASE)
- February 21-23, 2001: Ground Systems Architecture Workshop (with Aerospace, SEI)
- February 21, 2001: LA SPIN, Ron Kohl, COTS-Based Systems Processes
- March 28, 2001: LA SPIN, High Dependability Computing
- May 2001: Annual Affiliates' Renewal
- May-June 2001: Rapid Value/RUP/MBASE Seminar (with C-Bridge, Rational)

6 Center for Empirically-Based Software Engineering (CeBASE)
[Slide diagram; recoverable elements:]
- Strategic Vision; Strategic Framework
- Strategic Process: Experience Factory; Tailoring G/L: Goal-Model-Question-Metric
- Tactical Process: Model Integration (MBASE); WinWin Spiral
- Empirical Methods: Quantitative (Experimental, Observational; Surveys, Assessments; Parametric Models; Critical Success Factors; Dynamic Models); Qualitative (Ethnographic; Analysis; Root Cause Analysis; Pareto 80-20 Relationships)
- Experience Base (Context; Results): Project, Context Attributes; Empirical Results; References; Implications and Recommended Practices; Experience Feedback Comments
- Initial foci: COTS-based systems; Defect reduction

7 High Dependability Computing in a Competitive World
Barry Boehm, USC
NASA/CMU HDC Workshop Keynote, February 6, 2001
(boehm@sunset.usc.edu; http://sunset.usc.edu)

8 HDC in a Competitive World
- The economics of IT competition and dependability
- Software Dependability Opportunity Tree
  - Decreasing defects
  - Decreasing defect impact
  - Continuous improvement
  - Attacking the future
- Conclusions and References

9 Competing on Schedule and Quality: A Risk-Analysis Approach
Risk Exposure: RE = Prob(Loss) * Size(Loss)
- "Loss": financial; reputation; future prospects; ...
For multiple sources of loss:
RE = Σ over sources of [Prob(Loss_source) * Size(Loss_source)]
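The summed risk-exposure formula is easy to sketch numerically. A minimal illustration follows; the loss sources, probabilities, and dollar sizes are hypothetical, chosen only to show the computation:

```python
# Risk exposure: RE = Prob(Loss) * Size(Loss), summed over loss sources.
# All figures are hypothetical, for illustration only.
sources = {
    "critical defect in the field": (0.10, 2_000_000),  # (P(L), S(L) in $)
    "reputation damage":            (0.05, 1_000_000),
    "lost future contracts":        (0.02, 5_000_000),
}

def risk_exposure(sources):
    """Total RE = sum over sources of Prob(Loss) * Size(Loss)."""
    return sum(p * s for p, s in sources.values())

print(f"Total risk exposure: ${risk_exposure(sources):,.0f}")
```

Each source contributes independently, so reducing either the probability or the size of any one loss lowers the total linearly.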

10 Example RE Profile: Time to Ship
[Slide plot: RE = P(L) * S(L) for loss due to unacceptable dependability, against time to ship (amount of testing). The curve falls with testing time:]
- Little testing: many defects (high P(L)); critical defects (high S(L))
- Much testing: few defects (low P(L)); minor defects (low S(L))

11 Example RE Profile: Time to Ship
[Slide plot: two RE = P(L) * S(L) curves against time to ship (amount of testing):]
- Loss due to unacceptable dependability: falls with testing, from many/critical defects (high P(L), high S(L)) to few/minor defects (low P(L), low S(L))
- Loss due to market share erosion: rises with testing, from few/weak rivals (low P(L), low S(L)) to many/strong rivals (high P(L), high S(L))

12 Example RE Profile: Time to Ship, Sum of Risk Exposures
[Slide plot: summing the two RE = P(L) * S(L) curves against time to ship (amount of testing); the dependability-loss curve falls with testing while the market-erosion curve rises, so the total has an interior minimum, the Sweet Spot amount of testing.]
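The sweet spot in the summed profile can be illustrated with a toy model: defect-loss RE decays with testing time while market-erosion RE grows, and the sum is minimized in between. The curve shapes and constants below are hypothetical, not from the talk:

```python
# Toy sweet-spot model (all constants hypothetical):
# defect-loss RE decays with testing time t; market-erosion RE grows with t.
import math

def re_defects(t):
    # Fewer, more minor defects as testing time grows -> falling RE.
    return 300.0 * math.exp(-0.5 * t)

def re_market(t):
    # More, stronger rivals ship first as testing time grows -> rising RE.
    return 20.0 * t

def re_total(t):
    return re_defects(t) + re_market(t)

# Grid search over t in [0, 10] for the sweet spot (minimum total RE).
times = [i * 0.01 for i in range(1001)]
sweet = min(times, key=re_total)
print(f"sweet spot at t = {sweet:.2f}, total RE = {re_total(sweet):.1f}")
```

Raising the defect-loss constant (a safety-critical system) pushes the minimum toward more testing; raising the market-erosion slope (an internet startup) pushes it toward less, matching the two comparative profiles that follow.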

13 Comparative RE Profile: Safety-Critical System
[Slide plot: with higher S(L) for defects, the dependability-loss curve shifts upward, so the High-Q Sweet Spot lies at more testing than the Mainstream Sweet Spot.]

14 Comparative RE Profile: Internet Startup
[Slide plot: with higher S(L) for delays, the market-erosion curve shifts upward, so the Low-TTM (Time to Market) Sweet Spot lies at less testing than the Mainstream Sweet Spot.]

15 Conclusions So Far
- Unwise to try to compete on both cost/schedule and quality
  - Some exceptions: major technology or marketplace edge
- There are no one-size-fits-all cost/schedule/quality strategies
- Risk analysis helps determine how much testing (prototyping, formal verification, etc.) is enough
  - Buying information to reduce risk
- Often difficult to determine parameter values
  - Some COCOMO II values discussed next

16 Software Dependability Opportunity Tree
Decrease Defect Risk Exposure:
- Decrease Defect Prob(Loss): Defect Prevention; Defect Detection and Removal
- Decrease Defect Impact, Size(Loss): Value/Risk-Based Defect Reduction; Graceful Degradation
- Continuous Improvement: CI Methods and Metrics; Process, Product, People Technology

17 Software Defect Detection Opportunity Tree
Defect Detection and Removal (rqts., design, code):
- Automated Analysis: completeness checking; consistency checking (views, interfaces, behavior, pre/post conditions); traceability checking; compliance checking (models, assertions, standards)
- Reviewing: peer reviews, inspections; Architecture Review Boards; pair programming; requirements & design reviews
- Testing: structural; operational profile; usage (alpha, beta); regression; value/risk-based; test automation

18 Conclusions
- Future trends intensify competitive HDC challenges
  - Complexity, criticality, decreased control, faster change
- Organizations need tailored, mixed HDC strategies
  - No universal HDC sweet spot
  - Goal/value/risk analysis useful
  - Quantitative data and models becoming available
- HDC Opportunity Tree helps sort out mixed strategies
- Quality is better than free for high-value, long-life systems
- Attractive new HDC technology prospects emerging
  - Architecture- and model-based methods
  - Lightweight formal methods
  - Self-stabilizing software
  - Complementary theory and empirical methods

