RIT Software Engineering


Defect Removal

Agenda
- Setting defect removal targets
  - Cost-effectiveness of defect removal
  - Matching to customer and business needs and preferences
- Performing defect removal
  - Techniques/approaches/practices overview
- Performance tracking and feedback: measurements and metrics
  - Defect measurements and classification
  - Measurement sources: inspections, test reports, bug reports
  - Defect density
  - Phase containment effectiveness
  - Cost of quality and cost of poor quality
  - Tracking of bug fixing and fixing effectiveness

Underlying Quality Engineering Model
Optimizing results using feedback:
  Objectives → Perform activity → Outcomes → Measure results, feed back for improvement
In the next few weeks, we take different SE areas (defect removal, product quality, customer satisfaction, project management) and study the quality engineering objectives, practices, and metrics for each area.

Defect Removal Objectives
- Low defect density in the product
  - Different density targets by severity level
  - Actual targets based on the nature of the software: impact of defects, expectations of the customer (discussed in more detail under reliability)
  - Often the idea of "setting a defect rate goal" is treated as undiscussable
    - What about the goal of "no known defects"? Is shipping with known defects acceptable?
- Cost-effective defect removal
  - Quantitative understanding of which approaches are most cost-effective
  - Quantitative understanding of how much effort is worthwhile

Defect Removal Practices 1
(Practices grouped by increasing sophistication of approach)
- Informal defect removal
  - Informal discussion and review of requirements with the customer
  - Sporadic testing prior to release
  - Informal discussions and reviews within the team
  - Informal bug reporting and fixing
- Informal but strong quality focus
  - Extensive testing: creating test cases, writing test code
  - Feature-based testing
  - Need-based inspections and reviews: "This code seems to have problems, let us improve it"

Defect Removal Practices 2
- Test strategy and test planning
  - Informal attempts at coverage
  - Systematic identification of test cases
- Tracking of detected problems to closure, for both tests and reviews
- Systematic customer reviews of generated documents
- Tracking of defect reports from customers
- Possibly some use of test automation
  - Test harnesses that systematically run the software through a series of tests
- Practices that prevent some kinds of defects
  - Training, configuration management, prototyping

Defect Removal Practices 3
- Formal peer reviews of code and documents
- Tracking of review and test results data
- Use of coverage analysis and test generation tools
- Tracking of data on bug-fixing rates
- Use of graphs showing defect rates for informal diagnosis and improvement
- Improved defect prevention using checklists, templates, formal processes

Defect Removal Practices 4
- Use of metrics to:
  - Analyze effectiveness of defect removal
  - Identify problem modules (those with high defect densities)
  - Identify areas where practices need strengthening
  - Identify problems "in-process", i.e. during development itself
  - Set defect density / defect rate objectives (usually actually "baselines": the current capability level)
  - Guide release (release only when defect detection rates fall below a threshold)
  - Plan test effort and number of test cases
  - Optimize quality efforts
- Consistent use of test automation
- Use of formal methods for defect avoidance
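The "guide release" practice above can be sketched in a few lines: ship only once the defect-detection rate has stayed below a threshold for a stable stretch of time. The threshold and window values below are assumptions for illustration, not figures from the slides.

```python
# Hedged sketch of a release-gate check based on defect detection rates.
# threshold and stable_weeks are illustrative assumptions.

def ready_to_release(weekly_detections, threshold=5, stable_weeks=3):
    """True if each of the last `stable_weeks` weeks found fewer than
    `threshold` new defects, suggesting detection has settled down."""
    recent = weekly_detections[-stable_weeks:]
    return len(recent) == stable_weeks and all(n < threshold for n in recent)

print(ready_to_release([40, 22, 12, 4, 3, 2]))  # True: three quiet weeks
print(ready_to_release([40, 22, 12, 9, 3, 2]))  # False: only two quiet weeks
```

In practice the gate would read from the defect-tracking system rather than a hand-built list, and the threshold would be set from the organization's baseline.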

Defect Removal Practices 5
- Continuous improvement cycle:
  - Pareto analysis to discover common sources of problems
  - Causal analysis to identify the roots of frequent problems
  - Use defect elimination tools to prevent these problems
  - Repeat!
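The Pareto step above can be sketched as counting defects by cause and keeping the "vital few" categories that account for roughly 80% of the total. The cause names and counts below are invented for illustration.

```python
# Minimal Pareto-analysis sketch: rank defect causes by frequency and
# find the smallest set of causes covering >= 80% of all defects.
# Cause categories here are hypothetical examples.
from collections import Counter

defect_causes = (["off-by-one"] * 42 + ["missing requirement"] * 31 +
                 ["interface mismatch"] * 14 + ["config error"] * 8 +
                 ["typo"] * 5)

counts = Counter(defect_causes).most_common()  # sorted by frequency
total = sum(n for _, n in counts)

cumulative = 0
vital_few = []
for cause, n in counts:
    cumulative += n
    vital_few.append(cause)
    if cumulative / total >= 0.8:
        break

print(vital_few)  # the few causes worth attacking first
```

Causal analysis would then dig into why the top categories occur, feeding the prevention step of the cycle.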

Value of Early Defect Detection
(Chart omitted from this transcript: cost to fix a defect rises steeply the later in the lifecycle it is found. Note that the y-axis scale is logarithmic, so the actual increase is exponential.)
From http://www.sdtcorp.com/overview/inspections/sld016.htm

How to Detect Defects Early?
- Inspections and reviews
- Prototyping, extensive customer interaction
  - Note that agile development emphasizes these
- Use of analysis techniques
  - Requirements analysis for completeness and consistency
  - Design analysis, e.g. sequence diagrams to analyze functional correctness; attribute analysis
  - Formal specification and analysis of requirements
- Methodologies that increase early-lifecycle effort and depth
  - O-O development increases design effort and detail
  - Test-driven development increases understanding of the relationships between design and requirements
  - Traceability analysis
- Interesting that agile development goes the other way: it decreases the time lag between requirements and release, changing the curve!

Need for Multi-stage Approaches
(Just an illustrative example)
- One phase of defect removal, e.g. testing
  - Assume 95% efficiency (called PCE: phase containment effectiveness)
  - Input 1000 defects → output 50 defects
- Six phases of defect removal, e.g. Req, Design, Impl, UT, IT, ST
  - Assume 100, 300, 600 bugs introduced in Req, Des, Impl respectively
  - Even with much lower per-phase efficiency, we get better results
  - A 10% improvement in PCE produces about 3x better results

Phase                       Req   Des   Impl   UT    IT    ST
60% PCE: defects at entry   100   340   736    295   118    47
60% PCE: defects at exit     40   136   295    118    47    19
70% PCE: defects at entry   100   330   699    210    63    19
70% PCE: defects at exit     30    99   210     63    19     6
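The table above can be reproduced with a short simulation: each phase removes a fixed fraction of the defects present (its PCE), and the escapes flow into the next phase along with newly injected defects. This is a sketch of the slide's arithmetic, not a real estimation tool; small differences from the table are rounding.

```python
# Propagate defects through phases with a uniform phase containment
# effectiveness (PCE). Injection counts follow the slide's example:
# 100, 300, 600 defects introduced in Req, Des, Impl; none in UT/IT/ST.

def defects_per_phase(injected, pce):
    """Return (entry, exit) defect counts for each phase."""
    entries, exits = [], []
    escaped = 0  # defects leaking in from earlier phases
    for new_defects in injected:
        entry = escaped + new_defects
        escaped = round(entry * (1 - pce))  # defects this phase misses
        entries.append(entry)
        exits.append(escaped)
    return entries, exits

injected = [100, 300, 600, 0, 0, 0]  # Req, Des, Impl, UT, IT, ST

_, exits_60 = defects_per_phase(injected, 0.60)
_, exits_70 = defects_per_phase(injected, 0.70)
print(exits_60[-1], exits_70[-1])  # 19 vs 6: 10% better PCE, ~3x fewer escapes
```

Contrast with the single-phase case: one 95% effective phase turns 1000 defects into 50 escapes, while six phases at only 60-70% each do markedly better.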

Sources of Defect Data
- Inspection / review reports contain:
  - Phase of detection (the current phase) and module (the one under review)
  - Defect severity
  - Effort data: review preparation, review meeting, effort to fix problems
  - Number of lines of code reviewed
  - Defect type (see next slide on defect classification)
- Similar data gathered from testing
- User bug reports
  - Manual screening to reject duplicates and non-problems
  - Similar classification and effort data

Defect Classification
- Many organizations have their own defect classification system
  - E.g. logic, requirements, design, testing, configuration management
  - May classify in more detail: initialization, loop bounds, module interface, missed functionality, etc.
  - May also record whether the defect originated from a previous fix
- Helps in Pareto analysis for continuous improvement
- Costs more effort and is less reliable: errors in classification, subjectivity
- There is an established methodology called Orthogonal Defect Classification (ODC)

Processing of Defect Data
- Compute phase containment effectiveness (PCE), based on:
  - Number of defects found in that phase
  - Number of defects from that phase or earlier found subsequently
- Similarly compute test effectiveness
- Fixing effectiveness
- Overall, module-wise, and phase-wise defect densities
- Review rates: number of lines per hour
- Cost of quality: total effort spent on quality activities
- Cost of poor quality: total effort spent on fixes
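The first two computations above reduce to simple ratios. A minimal sketch, with function and field names chosen for illustration (real inputs would come from review reports, test reports, and bug reports):

```python
# Sketch of the core defect-data computations described above.

def phase_containment_effectiveness(found_in_phase, escaped_later):
    """PCE = defects caught in a phase / all defects present in that phase
    (those caught there plus those from that phase or earlier found later)."""
    return found_in_phase / (found_in_phase + escaped_later)

def defect_density(defects, kloc):
    """Defects per thousand lines of code."""
    return defects / kloc

# Example: Design phase caught 136 defects; 204 design-or-earlier defects
# surfaced in later phases. A 20 KLOC module with 50 defects.
print(round(phase_containment_effectiveness(136, 204), 2))  # 0.4
print(defect_density(50, 20))  # 2.5 defects/KLOC
```

Cost of quality and cost of poor quality are analogous sums over effort records rather than defect counts.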

Limitations of Defect Data
- Small sample sizes
  - Smaller projects often have fewer than 100 bugs
  - Classifying by type, phase, etc. further shrinks the "population" from a statistical perspective
  - PCE in particular has severe sample-size problems
  - Organization-wide numbers are often more meaningful
- PCE goes down as more bugs are found!
  - Hard to use as an in-process metric
- Subjectivity of classification
- Developers may suppress defect data to look good
  - Fundamental rule: "Never use metrics to evaluate people"