

3 What is “quality”?
• IEEE Glossary: the degree to which a system, component, or process meets (1) specified requirements and (2) customer or user needs or expectations.
• ISO: the totality of the features and characteristics of a product or service that bear on its ability to satisfy specified or implied needs.

4 • “Set of systematic activities providing evidence of the ability of the software process to produce a software product that is fit to use.”
◦ G. Schulmeyer and J. McManus, Software Quality Handbook, Prentice Hall, 1998.

5 TThe metric used most often to measure software quality assurance is errors found/KLOC.

6 An alternate view of quality:
◦ is not absolute
◦ is multidimensional and can be difficult to quantify
◦ has aspects that are not easy to measure
◦ assessment is subject to constraints (e.g., cost)
◦ is about acceptable compromises
◦ criteria are not independent and can conflict
In other words, “quality” is not a well-defined term.

7 Quality criteria include:
◦ correctness
◦ efficiency
◦ flexibility
◦ integrity
◦ interoperability
◦ maintainability
◦ portability
◦ reliability
◦ reusability
◦ testability
◦ usability
You can add some of your own criteria:
◦ Cool-ability
◦ Crash-ability
◦ Where-ability (where you got the software)
◦ Who-ability (who you copied it from)

8 Definition: Monitoring processes and products throughout the software development lifecycle to ensure the quality of the delivered product(s)

9 Monitoring the processes
◦ Provides management with objective feedback regarding process compliance to approved plans, procedures, standards, and analyses

10 Monitoring the products
◦ Focus on the quality of the product within each phase of the software development lifecycle (e.g., requirements, test plan, architecture)
◦ Objective: identify and remove defects throughout the lifecycle, as early as possible

11 SQA processes also apply when integrating purchased or customer-supplied software products into the developed product.
Question: how do you determine the “quality” of COTS (commercial off-the-shelf) components?
◦ This is a current research problem.

12 Use of standards and process models has a positive impact on the quality of the software product. Examples include:
• ISO 9001 (www.iso.org)
• CMM (http://www.sei.cmu.edu/cmmi/): from CMU SEI; defines 5 maturity levels
• SPICE (http://www.sqi.gu.edu.au/spice/what.html): a standard for software process assessment being developed by an ISO joint committee (Europe, Australia)
• IEEE 1074 and IEEE 12207 (http://www.techstreet.com/cgi-bin/detail?doc_no=ieee|1074_2006&product_id=1277365)
How can we prove this claim is true?

13 Product assessment can include:
◦ Project management reports, quality assurance reports, training plans, test plan(s)
◦ Requirements, analysis, architecture, and detailed design models; test cases
◦ Issue or problem reports
◦ Metric reports
◦ Traceability reports (http://www.ldra.com/tbreq.asp)
◦ Documentation and coding standards

14 Inspection:
◦ A formal evaluation technique in which an artifact (e.g., software requirements, design, or code) is examined in detail by a person or group other than the originator
◦ Detects faults, violations of development standards, and other problems
◦ Review members are peers (equals) of the designer or programmer
◦ Data is collected during inspections for later analysis and to assist in future inspections
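As an illustration of the data-collection point above, here is a minimal sketch of a defect record an inspection team might log for later analysis (the field names are our own assumptions, not prescribed by any inspection standard):

from dataclasses import dataclass, field
from datetime import date

@dataclass
class InspectionDefect:
    artifact: str                 # e.g., "requirements", "design", "code"
    description: str              # what the inspector observed
    severity: str                 # "major" or "minor"
    standard_violated: str = ""   # development standard breached, if any
    found_on: date = field(default_factory=date.today)

# Example record from a design inspection
d = InspectionDefect("design", "Class diagram omits error-handling path", "major")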

15 Describes the practices and procedures to be followed for reporting, tracking, and resolving problems:
◦ Who can report a problem?
◦ How is it reported?
◦ How is it tracked?
◦ Who determines whether a problem is going to be resolved?
◦ How is it assigned for resolution?
◦ How does the assignee indicate that it has been corrected?
◦ Who reviews it to determine whether it can be closed?
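The questions above imply a lifecycle for each problem report. A minimal state-machine sketch follows; the state names and allowed transitions are our own assumptions for illustration, not part of any standard:

from enum import Enum

class ReportState(Enum):
    REPORTED = 1   # anyone may report a problem
    TRIAGED = 2    # someone has decided whether it will be resolved
    ASSIGNED = 3   # assigned to a developer for resolution
    RESOLVED = 4   # the assignee indicates it has been corrected
    CLOSED = 5     # a reviewer confirms the fix and closes the report

# Allowed transitions; anything else violates the reporting procedure.
TRANSITIONS = {
    ReportState.REPORTED: {ReportState.TRIAGED},
    ReportState.TRIAGED: {ReportState.ASSIGNED, ReportState.CLOSED},
    ReportState.ASSIGNED: {ReportState.RESOLVED},
    ReportState.RESOLVED: {ReportState.CLOSED, ReportState.ASSIGNED},
}

def advance(current: ReportState, nxt: ReportState) -> ReportState:
    if nxt not in TRANSITIONS.get(current, set()):
        raise ValueError(f"illegal transition: {current.name} -> {nxt.name}")
    return nxt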

16 Problems can be product or process related
◦ e.g., an incorrect requirement, an incomplete class definition, a code defect, an ambiguous description in user documentation, a process for reviewing detailed design that is not clearly defined, etc.

17 Metrics for each artifact (e.g., requirements):
◦ Number of requirements
◦ Number of changes per requirement (called the “churn” rate)
◦ Characterization of defects:
  - Not testable, ambiguous, inconsistent, incorrect, incomplete, redundant, infeasible, …
  - Major or minor defect
  - Phase in which the defect was detected (i.e., which phase of the software engineering cycle)
  - Cost to fix
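A minimal sketch of computing the churn rate from a change log; the requirement IDs, counts, and total are invented for illustration:

from collections import Counter

# One entry per recorded change, keyed by requirement ID (hypothetical data).
change_log = ["REQ-1", "REQ-3", "REQ-1", "REQ-7", "REQ-1", "REQ-3"]
total_requirements = 10  # assumed size of the requirements set

changes_per_req = Counter(change_log)
churn_rate = sum(changes_per_req.values()) / total_requirements

print(changes_per_req)  # Counter({'REQ-1': 3, 'REQ-3': 2, 'REQ-7': 1})
print(churn_rate)       # 0.6 changes per requirement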

18 Management indicators (category, the insight it gives management, and its metrics):
◦ Progress: how well the project is performing with respect to its schedule. Metrics: actual vs. planned task completions; actual vs. planned durations.
◦ Effort: visibility into the contribution of staffing to project costs, schedule adherence, and product quality. Metrics: actual vs. planned staffing profiles.
◦ Cost: tracking of actual costs against estimated costs and prediction of future costs. Metrics: actual vs. planned costs; cost and schedule variances.
◦ Review Results: status of action items from life-cycle reviews. Metrics: status of action items.
◦ Trouble Reports: insight into product and process quality and the effectiveness of testing. Metrics: status of trouble reports; number of trouble reports opened, closed, etc. during the reporting period.
◦ Requirements Stability: visibility into the magnitude and impact of requirements changes. Metrics: number of requirements changes/clarifications; distribution of requirements over releases.
◦ Size Stability: insight into the completeness and stability of the requirements, and into the ability of the staff to complete the project within the current budget and schedule. Metrics: size growth; distribution of size over releases.
◦ Computer Resource Utilization: how well the project is meeting its computer resource utilization goals/requirements. Metrics: actual vs. planned profiles of computer resource utilization.
◦ Training: information on the training program and staff skills. Metrics: actual vs. planned number of personnel attending classes.
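For example, the Progress indicator reduces to a simple comparison; the numbers below are invented for illustration:

# Progress: actual vs. planned task completions for a reporting period.
planned_tasks = 40
actual_tasks = 30

completion_ratio = actual_tasks / planned_tasks   # 0.75
task_variance = actual_tasks - planned_tasks      # -10
print(f"{completion_ratio:.0%} of plan ({task_variance:+d} tasks)")
# -> "75% of plan (-10 tasks)"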

19 Media Control involves how you store your artifacts:
• Identify the media for each intermediate and deliverable artifact
• Document what is required to store the media, including the backup and restore process
• Protect computer program physical media from:
◦ unauthorized access
◦ inadvertent damage
◦ degradation
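A minimal sketch of one way to guard archived media against silent degradation (the file path and the stored digest are hypothetical): record a checksum when the artifact is archived, and re-verify it before restoring.

import hashlib
from pathlib import Path

def checksum(path: Path) -> str:
    """SHA-256 digest of an archived artifact."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

archived = Path("archive/design_v2.pdf")  # hypothetical artifact location
recorded_digest = "..."                   # digest stored at archive time

if checksum(archived) != recorded_digest:
    raise RuntimeError(f"{archived} has degraded or been altered since archiving")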


21 Post-Mortem Analysis (PMA)
Definition:
• The process of looking back at a completed project’s design and its development process in order to identify aspects where improvements can be made in future projects
• Post-mortems enable individual learning to be converted into team and organizational learning.

22 Other Names for PMA
PMA, or its main activities, goes by many names, such as:
• blame and flames
• debriefing
• lessons learned
• post-implementation review
• post-project review
• postpartum
• project audit
• project review
• retrospective
• team retrospective

23 Post-Mortem Analysis
• PMA is ideally performed soon after the most important milestones and events, or at the end of a project, in both successful and unsuccessful software development projects.
• The benefit is that post-mortems often reveal findings more frequently, and of a different kind, than project completion reports alone.

24 Benefits
• Helps project team members share and understand each other’s perspectives
• Integrates individual and team learning
• Identifies hidden problems
• Documents good practices and problems (so as not to repeat bad practices)
• Increases job satisfaction by giving people feedback about their work

25 Aim of a PMA
The aim of any post-mortem is to provide answers to the following four key questions:
• What did we do well that does not have to be further discussed (the positive aspects)?
• What did we learn (the negative aspects)?
• What should we do differently the next time (the negative aspects which require improvements)?
• What still puzzles us (share the things that are confusing or do not make sense)?

26 Conducting a PMA
The PMA process for average-sized and large projects is defined in five steps:
1. Plan a project review
2. Gather project data
3. Hold a post-iteration workshop or post-mortem review
4. Analyze the findings and synthesize the lessons learned
5. Publish the results

27 Plan a Project Review
1. A project review is planned in order to identify the most suitable methods and tools to be used in the later steps.
◦ This is where the post-mortem reviews, the reasons for the review, the focus, and the participants are defined.

28 Gather Project Data
2. Both objective and subjective data are collected from all the project participants via pre-defined metrics, surveys, debriefings, etc., to identify the information that will be useful in the following step (the workshop or review).

29 Hold a Post-Mortem Review
3. A “project history day” is the most important step. It combines reflective analysis of project events with the actual project data, and is held after a project’s major milestone (post-iteration) or after the project has finished (post-mortem).
◦ In the case of large projects, only a few key people participate in this session.

30 Analyze and Synthesize
4. The findings are analyzed, prioritized, and synthesized as lessons learned.
◦ This often starts during the project history day, after the positive events and problems have been identified and prioritized.

31 Publish the Results
5. The summary of the findings is published and presented in a way that lets future projects know which processes or tools are worth continuing, and that turns the problems found into improvement activities.

32 Causes of Failure
• We don’t look back until the very end of the project.
• We don’t set a tone of open and honest feedback.
• We don’t look at the whole picture of product and process.
• We don’t actually follow through on the feedback.

33 A Sample Post-Mortem Form
• http://www.klariti.com/technical-writing/Post-Mortem-Documentation.shtml

34 • Fagan, M., “Design and Code Inspections to Reduce Errors in Program Development”, IBM Systems Journal, 15, 3 (1976), pp. 182-211.
• Fagan, M., “Advances in Software Inspections”, IEEE Transactions on Software Engineering, 12, 7 (July 1986), pp. 744-751.
• Schulmeyer, G. and McManus, J., Software Quality Handbook, Prentice Hall, 1998.
• IEEE Std 730-2002, IEEE Standard for Software Quality Assurance Plans, IEEE Computer Society, sponsored by the Software Engineering Standards Committee.
• Rosenberg, L.H. and Gallo, A.M., Jr., “Software quality assurance engineering at NASA”, Proceedings of the IEEE Aerospace Conference, 2002, Vol. 5, pp. 5-2569 to 5-2575.
• “Inspections”, http://www.math.uaa.alaska.edu/~afkjm/cs470/handouts/inspections.pdf

