© USC-CSE, Oct. 24, 2001
Constructive Quality Model – Orthogonal Defect Classification (COQUALMO-ODC) Model
Keun Lee & Sunita Chulani

Outline
Current COQUALMO Approach
- Current shortfalls
COQUALMO-ODC Model Objectives
- Example of desired model
Breakout Group Issues

Current COQUALMO System
[Block diagram] COQUALMO extends COCOMO II with a Defect Introduction Model and a Defect Removal Model.
- Inputs: software size estimate; software platform, project, product, and personnel attributes; defect removal profile levels (automation, reviews, testing)
- Outputs: software development effort, cost, and schedule estimate; number of residual defects; defect density per unit of size, for defects introduced in Rqts., Design, and Code

COQUALMO Defect Removal Estimates
[Chart] Delivered defects / KSLOC versus composite defect removal rating, assuming nominal defect introduction rates.

Multiplicative Defect Removal Model
Example: code defects, High ratings
- Analysis: 0.7 of defects remaining
- Reviews: 0.4 of defects remaining
- Testing: 0.31 of defects remaining
- Together: (0.7)(0.4)(0.31) = 0.09 of defects remaining
How valid is this?
- If all techniques catch the same defects: 0.31 of defects remaining
- If they mostly catch different defects: ~0.01 of defects remaining
Need to determine which techniques find which defect types
- Orthogonal Defect Classification (ODC) data can help
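To make the assumption concrete, here is a minimal sketch of the two bounding cases in Python; the function names are illustrative, not part of COQUALMO:

    # Fractions of code defects remaining after each technique at
    # "High" ratings, from the slide: analysis, reviews, testing.
    HIGH_RATINGS = [0.7, 0.4, 0.31]

    def remaining_if_independent(fractions):
        """Techniques catch independent defect subsets: the
        remaining fractions multiply."""
        result = 1.0
        for f in fractions:
            result *= f
        return result

    def remaining_if_identical(fractions):
        """Every technique catches the same defects: only the most
        effective one matters."""
        return min(fractions)

    print(remaining_if_independent(HIGH_RATINGS))  # ~0.087, i.e. ~0.09
    print(remaining_if_identical(HIGH_RATINGS))    # 0.31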

Orthogonal Defect Classification (ODC)
A technology for software process measurement and analysis, providing a standard of unique, non-redundant attribute definitions for defect classification, as well as analysis metrics and procedures for process evaluations.

Orthogonal Defect Classification (ODC) Attributes
Feedback on the verification process:
- Activity/Trigger: how the defect was detected
- Impact: customer view of the interaction
Feedback on the development process:
- Target/Type: what was fixed
- Source: where the defect was located
- Qualifier: missing, incorrect, ...
- Age: history
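As a rough illustration, these attributes map naturally onto a per-defect record. The field names below follow the slide; the class itself is a hypothetical sketch, not an IBM/ODC artifact:

    from dataclasses import dataclass

    @dataclass
    class ODCDefect:
        # Feedback on the verification process
        activity: str     # when/how the defect was detected, e.g. "function test"
        trigger: str      # condition that surfaced it, e.g. "coverage"
        impact: str       # customer-view impact, e.g. "reliability"
        # Feedback on the development process
        target: str       # what was fixed, e.g. "code", "design"
        defect_type: str  # e.g. "assignment", "interface", "timing"
        source: str       # where the defect was located
        qualifier: str    # "missing", "incorrect", ...
        age: str          # history, e.g. "new", "base", "rewritten"

    d = ODCDefect("function test", "coverage", "reliability",
                  "code", "interface", "parser module", "missing", "new")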

ODC Data Attractive for Extending COQUALMO
[Chart] IBM results (Chillarege, 1996).

Current COQUALMO Shortfalls
- One-size-fits-all model
  - May not fit COTS/Web or embedded applications
- Defect uniformity and independence assumptions
  - Unvalidated hypotheses
- COCOMO II cost-schedule-quality (C-S-Q) tradeoffs tied just to RELY levels
  - Not to delivered defect density or MTBF
- Need for more calibration data
  - ODC data could extend and strengthen the model

Outline
Current COQUALMO Approach
- Current shortfalls
COQUALMO-ODC Model Objectives
- Example of desired model
Breakout Group Issues

COQUALMO-ODC Model Objectives
- Support cost-schedule-quality tradeoff analysis
- Provide a reference for defect monitoring and control
- Evolve to cover all major classes of projects
  - With different defect distributions (e.g., COTS-based)
  - Start simple; grow opportunistically

Example of Desired Model - I
[Chart] Effort (PM, at $20K/PM) and time (Mo.) tradeoffs. Current x-axis: RELY rating (VL, L, N, H, VH). Desired x-axis: delivered defects/KSLOC and MTBF (hr). Details for any given point: next slide.

Example of Desired Model - II
… KSLOC; RELY = VH; 75 PM; 12 Mo.; delivered defects = 0.3

Phase                  Rqts.      Design       Code & Unit Test   Integration & Test
Effort (PM)
Cost ($K)
Schedule (Mo.)
Defects in/out/left    60/50/10   130/116/24   264/234/30         8/37.7/0.3
- Rqts.                60/50/10   10/16/4      2/5/1              1/2/0
- Design                          120/100/20   10/25/5            2/6.9/0.1
- Timing                          12/6/6       2/4/4              1/4.9/0.1
- Interface                       30/25/5      5/9/1              0/1/0
- …
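The in/out/left columns follow a simple conservation rule: defects left at the end of a phase are those carried over from the previous phase, plus those introduced in the phase, minus those removed. A minimal sketch, checked against the requirements-defect row above:

    def defect_flow(in_out_per_phase):
        """Return defects left after each phase, given per-phase
        (introduced, removed) pairs."""
        left, history = 0, []
        for introduced, removed in in_out_per_phase:
            left = left + introduced - removed
            history.append(left)
        return history

    # Requirements-defect row: (in, out) for Rqts., Design,
    # Code & Unit Test, Integration & Test.
    rqts_row = [(60, 50), (10, 16), (2, 5), (1, 2)]
    print(defect_flow(rqts_row))  # [10, 4, 1, 0] -- matches the table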

Breakout Group Issues
- Proposed COQUALMO-ODC approach
- Simplified initial model
- Applicability across project types
- Staged approach
- Data collection
- Alternatives to ODC

Simplified Initial COQUALMO-ODC Model - I
- Estimate defects introduced and removed by phase
  - Rqts., Design, Code & Unit Test, Integration & Test
- Use aggregated defect types (see the sketch after this list)
  - One requirements type
  - Three design and code types:
    - Method: algorithm/method, assignment/initialization, checking
    - Static relationship: relationship, function/class/object, interface, OO messages
    - Dynamic relationship: timing/serialization
- Use simple Defect Estimating Relationships
  - Defects introduced: DER(type, phase)
  - Defects removed: Yield(type, ART, phase)
  - ART: Analysis, Reviews, Testing
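One way to picture the aggregation is as a lookup table; the dict layout is a hypothetical illustration, while the categories come from the slide:

    # Aggregated defect types for the simplified model; keys are the
    # aggregate categories, values the finer-grained types they absorb.
    DEFECT_TYPES = {
        "requirements": ["requirements"],
        "method": ["algorithm/method", "assignment/initialization", "checking"],
        "static relationship": ["relationship", "function/class/object",
                                "interface", "OO messages"],
        "dynamic relationship": ["timing/serialization"],
    }

    # Defect removal activities (ART) used by the Yield relationships.
    ART = ("analysis", "reviews", "testing")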

Simplified Initial COQUALMO-ODC Model - II
For each defect type (1 Rqts., 3 Design, 3 Code) and each phase (Rqts., Design, Code & Unit Test, Integration & Test):
1. Estimate the Defect Density Introduced (DDI):
   DDI(t, p) = RDDI(t, p) × Π_i DM(i)
   where
   - RDDI(t, p) = Reference Defect Density Introduced (from calibration)
   - DM(i) = Defect Multiplier for COCOMO II driver i

Simplified Initial COQUALMO-ODC Model - III
2. Estimate the Delivered Defect Density (DDD):
   DDD(t, p) = DDI(t, p) × Π_ART (1 − Yield(t, ART, p))
   and similarly for Code & Unit Test and Integration & Test.
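A minimal sketch of the two estimating steps, assuming the multiplicative forms above; the RDDI value, driver multipliers, and yields are made-up placeholders, not COQUALMO calibrations:

    from math import prod

    def ddi(rddi, driver_multipliers):
        """Step 1: Defect Density Introduced for one (type, phase)."""
        return rddi * prod(driver_multipliers)

    def ddd(ddi_value, yields):
        """Step 2: Delivered Defect Density -- each ART activity
        (analysis, reviews, testing) removes its yield fraction."""
        return ddi_value * prod(1.0 - y for y in yields)

    # Hypothetical numbers for one defect type in one phase:
    introduced = ddi(rddi=10.0, driver_multipliers=[1.1, 0.9])  # 9.9 per KSLOC
    delivered = ddd(introduced, yields=[0.3, 0.6, 0.69])        # ~0.86 per KSLOC
    print(introduced, delivered)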

Software Project Types
[Matrix] Project patterns versus critical discriminators and method sweet spots (X = applies).
- Project types: 1. Small EUI apps; 2. Large EUI apps; 3. Infrastructure; 4. Device-embedded; 5. Complex systems of systems
- Critical discriminators: can be done by < 15 well-qualified people; business rules critical; EUI critical; dependability critical; performance critical; business rules and architecture generally understood; COTS-driven; adaptability critical; stakeholder value-driven; ODC
- Method sweet spots: RAD/XP variants; open source; business-rules driven; specification driven; method generator (MBASE/CeBASE)
EUI: everyperson user interface (vs. programmer). (X): most of the time. Business rules: application domain ontologies.

Staged COQUALMO-ODC Approach
- Begin with the simplified COQUALMO-ODC model
  - Existing simple requirements defect model
  - Aggregated categories for design and code defects
- Begin with Infrastructure applications
  - Where most of the data and experience is
- Experiment with a small initial data set
  - About 5 Infrastructure data points
  - Plus about 5 USC small-EUI data points
- Refine and extend based on results and further data

Project Data Collection & Further Research
- Use currently-collected ODC types
  - Aggregate later
- Try for multiple sources
  - IBM, USC
  - Exploring Motorola, HP, Telcordia
- Need COCOMO II cost driver ratings for projects
  - Included on forms
  - Defined in the COCOMO II book