
1 Putting the Engineering in Software Engineering: Technology Infrastructure in Process Improvement Dorota Huizinga, ECS

2

3 How many lines of code? 1. In the 2005 model of the auto displayed on the previous slide. 2. In the 2015 model of an equivalent auto.

4 What is Software Engineering? “Software Engineering is the application of a systematic, disciplined, quantifiable approach to the development, operation, and maintenance of software.” [IEEE Std 610.12-1990, IEEE Standard Glossary of Software Engineering Terminology]

5 Software Engineering There have been almost 40 years of software engineering. The term “Software Engineering” was coined at the 1968 NATO conference held in Garmisch, Germany, by its chairman Friedrich Bauer, and it has been widely used ever since.

6 Why do we still not know how to produce quality software?

7 Bill Gates about bugs (1995) “There are no significant bugs in our released software that any significant number of users want fixed”

8 Software Errors Cost U.S. Economy $59.5 Billion Annually [NIST 2002, http://www.nist.gov/public_affairs/releases/n02-10.htm]
Recent example: “Los Angeles School District ERP Payroll Snarls Teacher Pay” [eWeek, October 5, 2007, http://www.eweek.com/article2/0,1895,2192653,00.asp]

9 Why is this? Why can’t we achieve the same level of quality with software that we can achieve with other engineering products? More fundamentally, what are the differences between software engineering and other types of engineering?

10 Is Software Engineering truly Engineering?
Similarity:
- Software developers, like other engineers, transform ideas into usable products.
Differences:
- Software is inherently intangible (bits, not atoms)
- Not completely testable (millions of states to test)
- Transitional (constantly changing)
- Can be substantially modified late in production

11 Impact of Software Characteristics on Software Engineers
- Impact of software intangibility: the “all or nothing” effect of software causes high levels of anxiety [Kahneman and Tversky, “Prospect Theory: An Analysis of Decision under Risk”, 1979]
- Impact of software fluidity: the current status of the product is often unknown and complexity becomes overwhelming, causing additional anxiety

12 Anxiety, Competency & Boredom
Philip G. Armour, “The Business of Software”, Communications of the ACM, June 2006, Vol. 49, No. 6
Mihaly Csikszentmihalyi, “Flow: The Psychology of Optimal Experience”, Harper & Row, 1990

13 How can we help to put more engineering in software engineering?
- By visualizing software development
- By automating repetitive and mundane tasks
- By isolating the project artifacts that directly impact a developer’s work, to reduce the impact of both fluidity and complexity

14 Infrastructure First: People + Technology + Process [ADP Book]

15 Vocoder project David Duc Phung, “Vocoder – TDD: A Software Development Case Study”, MS Project Report, Department of Computer Science, Cal State Fullerton, May 2007

16 Agile Software Development Lifecycle

17 Test Driven Development
1. Understand the assigned work items.
2. Red: create a test and make it fail.
3. Green: implement the production code to make the test pass.
4. Refactor: change the code to remove duplication.
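A minimal sketch of one red/green/refactor pass, written here in Python with the standard unittest module; the Vocoder project’s actual language and test framework are not given in these slides, so the encode/decode names below are hypothetical:

```python
import unittest

# Red: write the test first; it fails until encode() and decode() exist and agree.
class TestCodec(unittest.TestCase):
    def test_encode_then_decode_returns_original_sample(self):
        self.assertEqual(decode(encode(0.5)), 0.5)

# Green: the simplest production code that makes the test pass.
def encode(sample):
    return sample  # placeholder codec; real encoding arrives in later work items

def decode(value):
    return value

# Refactor: with the test green, restructure the code (e.g., extract a shared
# Codec class, remove duplication), re-running the test after each change.

if __name__ == "__main__":
    unittest.main()
```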

18 Infrastructure for Vocoder: People | Technology

19 Initial Infrastructure

20

21 Work items

22 How do we select proper technology?

23

24 Red and Green States

25 Code Coverage

26 Traceability of Vocoder

27 Project Velocity Report

28 The Remaining Work Report

29 Unplanned Work Report

30 Vocoder’s Success (data-flow diagram): Digitized Voice Data Stream → Transceiver Application → Encoding/Decoding Services → Encoded Data Stream → Satellite Link → Encoded Data Stream → Transceiver Application → Encoding/Decoding Services

31 Minimum Open Source Infrastructure Proposed by Karl Fogel
Karl Fogel – author & open source contributor
- Subversion (http://subversion.tigris.org/)
- Wrote the book “Producing Open Source Software: How to Run a Successful Free Software Project”
Minimum open source software project technology infrastructure includes the following:
- Web site
- Mailing lists
- Version control
- Bug tracking
- Real-time chat
[Fogel, Karl, “Producing Open Source Software”. Copyright © 2006 Karl Fogel. O’Reilly Media, Inc.]

32 Alex Obradovic, Open Source Infrastructure for offshore development

33 How do we select proper technology? People, Process, and Technology contribute to successful software projects. [http://www.sei.cmu.edu/cmmi/adoption/pdf/cmmi-overview07.pdf] [Herbert, G., “Selecting Software Development Infrastructure Evaluation Criteria”, MS Project Report, Department of Computer Science, Cal State Fullerton, May 2007]

34 CMMI® for Development v1.2: DAR “The purpose of Decision Analysis and Resolution (DAR) is to analyze possible decisions using a formal evaluation process that evaluates identified alternatives against established criteria.” [Chrissis, Mary Beth, Konrad, Mike, Shrum, Sandy, “CMMI for Development, v1.2”, Addison-Wesley, 2007]

35 Decision Analysis and Resolution
CMMI® DAR has one specific goal:
SG 1 Evaluate Alternatives
- SP 1.1 Establish Guidelines for Decision Analysis
- SP 1.2 Establish Evaluation Criteria
- SP 1.3 Identify Alternative Solutions
- SP 1.4 Select Evaluation Methods
- SP 1.5 Evaluate Alternatives
- SP 1.6 Select Solutions

36 Is technology selection important?
Healthy skepticism regarding software development tools appears in the SEI presentation accessible via this URL: http://www.sei.cmu.edu/intro/documents/intro-slides/process-overview.pdf
Slide 4 of that SEI briefing indicates: “70% of tools purchased by the organizations in the surveys are never used, other than perhaps in initial trial.”

37 Technology requirements
- Identifying criteria that must be met constitutes objective progress towards specifying our technology requirements.
- Specified requirements may be appropriate at a point in time, but both needs and technologies evolve.

38 Technology requirements evolve

39 Selection Criteria and Evaluation Methods
Criteria categories:
- User and use related, operation related, costs, risk/security, installation, adoption
Evaluation methods:
- Scenario Based Evaluation (SBE)
- Goal Question Metric (GQM)
- Survey and comparison – Computer Supported Cooperative Work (CSCW)
SEI / CMMI offerings:
- Organizational Innovation and Deployment (OID)
- Decision Analysis and Resolution (DAR)
- INTRo
- Comparative Evaluation Process (CEP)

40 Technology Selection Criteria I 1. Meets requirements 2. Capabilities / Features 3. User Friendliness 4. Documentation 5. Familiarity with product 6. Experience with product

41 Technology Selection Criteria II 7. Compatibility with existing Infrastructure 8. Interoperability with common technologies 9. Portability across platforms 10. Adaptability for project or product needs 11. Maintainability 12. Support

42 Technology Selection Criteria III & IV 13. Ease of installation 14. Ease of migration 15. Initial costs 16. Recurring costs

43 Technology Selection Criteria V & VI 17. Security 18. Risk 19. Market adoption 20. Adoption by successful projects

44 Scenario Based Evaluation (SBE)
Situating Evaluation in Scenarios of Use: “The essence of scenario-based methods is that system design and evaluation should be grounded in the concrete use scenarios for which it is intended.” [Haynes, Steven R., Purao, Sandeep, and Skattebo, Amie L., “Situating Evaluation in Scenarios of Use”, CSCW’04, November 6-10, 2004, Chicago, Illinois, USA. Copyright © 2004 ACM 1-58113-810-5/04/0011]
- How will the technology be used?
- What are the most important scenarios?

45 Comparative Evaluation Process (CEP) CEP quantifies criteria and accounts for the credibility of the evaluation through weight factors. [Phillips, Barbara Cavanaugh, and Polen, Susan M., “Add Decision Analysis to Your COTS Selection Process”, CrossTalk – The Journal of Defense Software Engineering, April 2002, http://www.stsc.hill.af.mil/crossTalk/2002/04/phillips.html]
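A minimal sketch of a CEP-style weighted comparison in Python; the criteria, weights, tools, and raw scores below are all hypothetical, and evaluator credibility is simply folded into the criterion weights rather than modeled separately:

```python
# Hypothetical weighted-criteria comparison of two candidate tools.
criteria_weights = {"meets requirements": 0.4, "cost": 0.3, "support": 0.3}

raw_scores = {
    "Tool A": {"meets requirements": 8, "cost": 6, "support": 7},
    "Tool B": {"meets requirements": 7, "cost": 9, "support": 5},
}

def weighted_score(scores):
    # Sum of (criterion weight * raw score) over all criteria.
    return sum(criteria_weights[c] * s for c, s in scores.items())

for tool, scores in raw_scores.items():
    print(f"{tool}: {weighted_score(scores):.1f}")
# Tool A: 0.4*8 + 0.3*6 + 0.3*7 = 7.1;  Tool B: 0.4*7 + 0.3*9 + 0.3*5 = 7.0
```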

46 Goal Question Metric (GQM)
A framework for real-world software systems evaluations [Basili, Victor R., Caldiera, Gianluigi, and Rombach, H. Dieter, “The Goal Question Metric Approach”, http://wwwagse.informatik.uni-kl.de/pubs/repository/basili94b/encyclo.gqm.pdf]
- Quantified criteria focused on important goals
- Efficient data collection based on need

47 Goal Question Metric Model (diagram): System Goals → Evaluation Objectives → Metrics (conceptual) → Measures (implementation-specific)

48 Simplified GQM model

49 Goal Question Metric Example
Context: distributed development project
Goal: improve quality results
Questions:
- What is our current defect rate?
- Are we compliant with our standards?
Metrics:
- Defect density – automated test report results
- Adherence to standards – code checker report
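A minimal sketch of how the defect-density metric from this example might be computed from automated test results; the module names, defect counts, and sizes below are hypothetical:

```python
# Hypothetical inputs: defect counts from automated test reports and
# module sizes in KLOC (thousands of lines of code).
defects_per_module = {"encoder": 4, "decoder": 2, "transport": 6}
kloc_per_module = {"encoder": 3.0, "decoder": 2.5, "transport": 4.5}

def defect_density(defects, kloc):
    """Metric answering the question 'What is our current defect rate?'."""
    return sum(defects.values()) / sum(kloc.values())

density = defect_density(defects_per_module, kloc_per_module)
print(f"Defect density: {density:.2f} defects/KLOC")
# (4 + 2 + 6) / (3.0 + 2.5 + 4.5) = 12 / 10.0 = 1.20
```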

50 solutionbliss.com

51
- 600,000 lines of code in the 2005 model to make it “smart” [Automotive Design and Production, 6/1/2005]
- 100,000,000 lines of code in the 2015 model of the Mercedes-Benz S-Class [Automotive Design and Production, 6/1/2005]
(survey)

52 Conclusion: Infrastructure First. Thank you, and Adam goes next…

