
2 Software Engineering Institute
Sponsored by the U.S. Department of Defense
© Carnegie Mellon University, Pittsburgh, PA 15213-3890
Stephen E. Cross, Ph.D., Director and CEO, Software Engineering Institute
sc@sei.cmu.edu | www.sei.cmu.edu | 412-268-7740

3 SEI Overview - page 2 © 2002 by Carnegie Mellon University
This Briefing Refers to the Following Service Marks and Trademarks
® Capability Maturity Model, Capability Maturity Modeling, Capability Maturity Model for Software, CMMI, CERT, and CERT Coordination Center are registered in the U.S. Patent and Trademark Office by Carnegie Mellon University.
SM CMM Integration; IDEAL; Personal Software Process; PSP; SCAMPI; SCAMPI Lead Assessor; SCAMPI Lead Appraiser; Team Software Process; TSP; Architecture Tradeoff Analysis Method; and ATAM are service marks of Carnegie Mellon University.

4 SEI Overview - page 3 © 2002 by Carnegie Mellon University
What I'd Like to Share With You
- Brief overview of Carnegie Mellon and the SEI
- Overview of SEI's body of work
- Some of my own ideas for future research
- Summary

5 SEI Overview - page 4 © 2002 by Carnegie Mellon University
Carnegie Mellon University: Major Units
- Carnegie Institute of Technology
- College of Fine Arts
- College of Humanities and Social Sciences
- Graduate School of Industrial Administration
- H. John Heinz III School of Public Policy and Management
- Mellon College of Science
- School of Computer Science
- Software Engineering Institute
Main campus: Pittsburgh, PA (USA). West coast campus: San Jose, CA (USA).

6 SEI Overview - page 5 © 2002 by Carnegie Mellon University
Software Engineering Institute
- Applied R&D laboratory situated as a college-level unit at Carnegie Mellon University, Pittsburgh, PA (USA)
- Established in 1984
- Additional offices in Arlington, VA and Frankfurt, Germany
- Staff size of 335
- Sponsored by the US Government and industry
- Mission: improve the practice of software engineering
(SEI Overview, 23 Mar 2001 - Page 5)

7 SEI Overview - page 6 © 2002 by Carnegie Mellon University
SEI's Role (Transition): helping others improve their software engineering practices
Diagram: the SEI and its partners sit between the research community and acquirers & developers, moving practices through three stages:
- Mature: identify and mature new practices; analysis of trial use; documentation and packaging
- Outreach: facilitate adoption and use
- Sustain: sustain what is adopted

8 SEI Overview - page 7 © 2002 by Carnegie Mellon University
From a Recent "Top 10 Defects" List
- Finding and fixing a software problem after delivery is 100x more expensive than finding and fixing it during the requirements and design stage.
- Current software projects spend 40 to 50% of their time on avoidable rework.
- Peer reviews catch 60% of defects.
- Disciplined personal practices can reduce defect-introduction rates by up to 75%.
- About 40 to 50% of user programs contain nontrivial defects.
Ref: Boehm, B., and Basili, V. "Software Defect Reduction Top 10 List," Computer, January 2001, pp. 135-137.

9 SEI Overview - page 8 © 2002 by Carnegie Mellon University
State of Practice & the SEI Vision
- Move to the left!
- Reuse everything
- Never make the same mistake twice
Chart: in the software state of practice ("test in" quality), 60-80% of effort and cost goes to integration and system test; world-class developers "design in" quality.
Ref: Standish Group, www.standishgroup.com, 1999

10 SEI Overview - page 9 © 2002 by Carnegie Mellon University
What I'd Like to Share With You
- Brief overview of Carnegie Mellon and the SEI
- Overview of SEI's body of work
- Some of my own ideas for future research
- Summary

11 SEI Overview - page 10 © 2002 by Carnegie Mellon University
SEI Technical Program
Management Practice Initiatives:
- Capability Maturity Model Integration
- Accelerating Software Technology Adoption
- COTS-Based Systems
- Performance Critical Systems
- Architecture Tradeoff Analysis
Technical Practice Initiatives:
- Team Software Process
- Software Engineering Measurement & Analysis
- Survivable Systems
- Product Line Practice
- Predictable Assembly with Certifiable Components
Goals: the right software delivered defect free, on cost, on time, every time; high-confidence, evolvable product lines with predictable and improved cost, schedule, and quality.

12 SEI Overview - page 11 © 2002 by Carnegie Mellon University
Quality Process Models
Quality process models, such as the CMMI® models, support the design and improvement of the key software and system process competencies required to build today's complex systems.

13 SEI Overview - page 12 © 2002 by Carnegie Mellon University
Software Process Improvement
Charts (Boeing data, 1992-1996):
- Software estimates: over/under percentage of effort (labor hours), range ±140%, converging toward 0%
- Post-release defects: average defects/KLOC declining (scale 0-15)
- Productivity: average number of hours reduced (-26%, -38%, -62%, -12%), i.e., increased productivity
- Cycle time: 36% faster with historical data than without
Scott Griffin, Boeing CIO, keynote talk at SEPG 2000

14 SEI Overview - page 13 © 2002 by Carnegie Mellon University
PSP Results
Chart: mean minutes invested per (new and changed) line of code, broken out by phase (Design, Code, Compile, Test), for programs 0-11 (y-axis 0.0-1.4 minutes/LOC).
Ref: W. Hayes, J. Over, Personal Software Process (PSP): An Empirical Study of the Impact of PSP on Individual Engineers (CMU/SEI-97-TR-001). See: http://www.sei.cmu.edu/publications
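The metric charted on this slide, minutes invested per new and changed line of code, comes straight from PSP time logs. A minimal sketch of the computation, with hypothetical time-log numbers (the phase names follow the slide; the figures are invented for illustration):

```python
# Compute minutes spent per (new and changed) LOC for each PSP phase.
# The time-log entries below are hypothetical, for illustration only.
time_log = {  # phase -> minutes spent on one program
    "Design": 95,
    "Code": 180,
    "Compile": 25,
    "Test": 60,
}
new_and_changed_loc = 250  # program size in new/changed LOC

minutes_per_loc = {
    phase: minutes / new_and_changed_loc for phase, minutes in time_log.items()
}
for phase, rate in minutes_per_loc.items():
    print(f"{phase}: {rate:.2f} min/LOC")
```

Plotting this rate per program number, as the referenced PSP study does, shows how an engineer's time distribution shifts across phases as the process is practiced.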

15 SEI Overview - page 14 © 2002 by Carnegie Mellon University
TSP Results
Charts comparing ranges pre-TSP/PSP vs. with TSP/PSP:
- Average schedule deviation (range -20% to 160%)
- Average effort deviation (range -20% to 120%)
- Defects/KLOC in acceptance test (range 0 to 0.9)
- System test duration, days/KLOC (range 0 to 7)
http://www.sei.cmu.edu/publications/documents/00.reports/00tr015.html

16 SEI Overview - page 15 © 2002 by Carnegie Mellon University
Spreading the Architecture Word
Topics: software architecture concepts; software architecture evaluation; architecture reconstruction; architecture documentation
Books: Software Architecture in Practice; Documenting Software Architectures
Courses: Software Architecture Familiarization; ATAM Evaluator Training; Architecture Reconstruction

17 SEI Overview - page 16 © 2002 by Carnegie Mellon University
Product Line Practice
Use of a common asset base in production of a related set of products. Core assets include the architecture, production plan, scope definition, and business case.

18 SEI Overview - page 17 © 2002 by Carnegie Mellon University
SEI Product Line Practice Framework
- Web-based, evolving, community-authored document
- Describes product line essential activities
- Describes essential and proven product line practices in the areas of software engineering, technical management, and organizational management
http://www.sei.cmu.edu/plp/framework.html

19 SEI Overview - page 18 © 2002 by Carnegie Mellon University
Examples of Product Line Practice - 1
- CelsiusTech (onboard ship systems): hardware-to-software cost ratio changed from 35:65 to 80:20
- Motorola, FLEXworks Project (family of one-way pagers): 4x cycle-time improvement; 80% reuse
- Hewlett-Packard (printer systems): 2-7x cycle-time improvement (some 10x). Sample project: shipped 5x the number of products, 4x as complex, with 3x the number of features, at 4x products shipped per person

20 SEI Overview - page 19 © 2002 by Carnegie Mellon University
Examples of Product Line Practice - 2
- Cummins Engine Co. (engine control systems): system build and integration went from roughly 1 year to 1 week; 5.5 years in product line development; more than 20 products successfully launched
- Nokia (mobile phones): went from 4 different phones produced per year to 50 per year

21 SEI Overview - page 20 © 2002 by Carnegie Mellon University
For more information

22 SEI Overview - page 21 © 2002 by Carnegie Mellon University
What I'd Like to Share With You
- Brief overview of Carnegie Mellon and the SEI
- Overview of SEI's body of work
- Some of my own ideas for future research
- Summary

23 SEI Overview - page 22 © 2002 by Carnegie Mellon University
Summary of Today's Trouble Spots
Effort = (Experience × Quality × Size)^Process
- Most individuals and teams lack software experience
- Inadequate processes (e.g., requirements)
- Large projects have more challenges than smaller ones
- Much time and cost is wasted doing rework
Adapted from: Royce, W. Software Project Management: A Unified Framework. New York: Addison-Wesley, p. 23.
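Reading the slide's cost parameterization as Effort = (Experience × Quality × Size)^Process (adapted from Royce), process maturity acts as an exponent, so process improvement pays off disproportionately on large projects. A worked sketch with invented parameter values (the scale, units, and numbers are illustrative only, not Royce's calibration):

```python
def effort(experience: float, quality: float, size: float, process: float) -> float:
    """Royce-style cost parameterization: effort is the product of the
    experience, quality, and size factors raised to a process exponent.
    A lower exponent (a better process) flattens effort growth."""
    return (experience * quality * size) ** process

# Hypothetical project: same team and size, two process maturity levels.
base = dict(experience=1.2, quality=1.1, size=100.0)  # size in KSLOC
immature = effort(**base, process=1.25)
mature = effort(**base, process=1.10)

print(f"immature process: {immature:,.0f} (arbitrary units)")
print(f"mature process:   {mature:,.0f} (arbitrary units)")
print(f"relative savings: {1 - mature / immature:.0%}")
```

With these made-up inputs, lowering the exponent from 1.25 to 1.10 cuts modeled effort by roughly half, illustrating the slide's point that process weaknesses amplify the other trouble spots rather than merely adding to them.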

24 SEI Overview - page 23 © 2002 by Carnegie Mellon University
Enhancing Experience
- Sharing best practices
- Knowledge management
- Context-relevant training (e.g., TSP)
- Living it (e.g., flight simulators)

25 SEI Overview - page 24 © 2002 by Carnegie Mellon University
Organization Simulation (OrgSim) Concept
An immersive environment that:
- Simulates future organizations, including likely cross-organizational interactions
- Enables decision makers to interact within & across organizational cultures
- Synthesizes "people" who behave as if the pickup organization is already deployed
- Provides a compelling feel for "what it will be like"

26 SEI Overview - page 25 © 2002 by Carnegie Mellon University
Overall OrgSim Approach
- Immersing decision makers in possible futures
- Enabling decision makers to act in these possible futures
- Providing participants ("synthespians") that react to decision makers' actions

27 SEI Overview - page 26 © 2002 by Carnegie Mellon University
Central Challenges
- Needs for rapidly responding "pickup" organizations focused on issues and activities that cross jurisdictional boundaries and provide many opportunities for governance conflicts and gaps
- Scenarios that inevitably involve unforeseen threats, events, locations, and needs, requiring, in turn, involvement of organizations not perceived beforehand to be relevant

28 SEI Overview - page 27 © 2002 by Carnegie Mellon University
OrgSim Project Goals
- Facilitate rapid design of pickup organizations, and evaluation of such designs, by making it possible to experience these organizations before fielding them
- Facilitate identifying cross-organizational governance problems prior to depending on cross-organizational functions, permitting rapid redesign and reevaluation

29 SEI Overview - page 28 © 2002 by Carnegie Mellon University
OrgSim Domains
- Software engineering
- Supply chain management
- Emergency response to terrorist events

30 SEI Overview - page 29 © 2002 by Carnegie Mellon University
OrgSim Team
Carnegie Mellon University:
- School of Computer Science
- Heinz School of Public Policy and Management
- College of Humanities and Social Sciences
- Software Engineering Institute
Georgia Tech:
- School of Industrial & Systems Engineering
- College of Computing
- Ivan Allen College of Liberal Arts
- Georgia Tech Research Institute

31 SEI Overview - page 30 © 2002 by Carnegie Mellon University
Diagram: OrgSim system layers
- User Interface (e.g., large screens, voice, gestures)
- Organizational Story (e.g., "active scenario")
- Characters (e.g., user, manager, QA expert)
- World Model (e.g., city, industry, media)
- Distributed Simulation Software
- Hardware (e.g., computers, networks)

32 SEI Overview - page 31 © 2002 by Carnegie Mellon University
Questions? Please contact us
www.sei.cmu.edu
Steve Cross
sc@sei.cmu.edu
412-268-7740
703-908-8230

