6/26/2007 SE 652 2007_6_26_TestResults_PSMp1.ppt — Team Software Project (TSP), June 26, 2007: System Test



Outline
- Remaining session plan & discussion
- System Test Plan discussion
- Mythical Man-Month
- System Test Plan recap
- Metrics presentations
- More on measurement
- Next phases: Cycle 1 test; Cycle 1 post-mortem & presentations; Cycle 2 plan & strategy

Due Today
- Key metrics presentation (10-15 minutes)
- All implementation quality records (LOGD, CCRs, etc.)
- Final code (source & executable)
- Updated products (code components, SRS, HLD, user documentation)
- Intermediate products (e.g. unit test plans)
- Configuration Management Plan
- Release CD: application, user guide, release letter
Note: no class on July 3.

Project Performance Discussion

Remaining Lectures Plan/Discussion
- July 10 – Cycle 1 Test Complete & Post-Mortem: Cycle 1 results presentation & discussion; Cycle 1 reports & post-mortem; measurement; team audit
- July 17 – Cycle 2 Launch: Cycle 2 launch, project & measurement planning; Peopleware topics (management, teams, open kimono, quality, hiring/morale, …)
- July 24 – Cycle 2 Requirements Complete: Cycle 2 requirements; Death March projects
- July 31 – Cycle 2 Implementation Complete: System Test Plan baselined; Cycle 2 design & implementation; process topics (CMMI, TL-9000, ISO)
- August 7 – Cycle 2 Test Complete: Cycle 2 test complete; Cycle 2 post-mortem complete
- August 14 – Course Review: course review; class exercise; final

Remaining Course Topics Discussion

System Test Schedule
Note: assumes the system has already passed integration test.
- Full feature content to system test and to the instructor by COB June 25, including: test environment; executable; user documentation (CCRs can be filed against user documentation); source code
- Tester generates CCRs for all finds and sends LOGTEST to the instructor when generated (see below)
- Development team updates LOGD, referencing CCRs
- Required turn-around times for fixes: 80% within 24 hours; 99% within 48 hours
- Required test coverage, short of blocking issues: 80% first-pass test complete in June; 100% first-pass test complete by July 1; regression test complete by July 3
- Daily test reports to the instructor detailing test cases executed, results & CCRs
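The 80%-within-24-hours and 99%-within-48-hours turn-around targets above are easy to check against logged fix times. A minimal Python sketch (the CCR turn-around values are invented for illustration):

```python
# Hypothetical CCR fix turn-around times, in hours (illustrative data).
turnaround_hours = [5, 12, 20, 22, 23, 30, 8, 16, 40, 18]

def fraction_within(times, limit_hours):
    """Fraction of fixes turned around within the given limit."""
    return sum(t <= limit_hours for t in times) / len(times)

within_24 = fraction_within(turnaround_hours, 24)
within_48 = fraction_within(turnaround_hours, 48)
print(f"within 24h: {within_24:.0%} (target 80%)")
print(f"within 48h: {within_48:.0%} (target 99%)")
```

With this sample data the team just meets the 24-hour target (80%) and exceeds the 48-hour target (100%).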

System Test Plan Recap
Areas to cover:
- Installation
- Start-up
- All required functions available & working as specified
- Diabolical cases (e.g. power failures, corner cases, incorrect handling)
- Performance
- Usability
Includes:
- Test cases you plan to run (numbered/named)
- Expected results
- Ordering of testing & dependencies
- Supporting materials needed
- Traceability to requirements

Release “Letters”
Purpose; what’s in it?
- Version information
- Release contents. Examples: all functionality defined in Change Counter Requirements v0.6 except GUI; Phase 1 features as defined in project plan x.y; Feature 1, Feature 2, Feature 3 as defined by …
- Known problems: change request IDs with a brief customer-oriented description
- Fixed problems
- Upgrade information
- Other?

Implementation Status
- Implementation experience
- Unit/integration test experience
- Problems / rework?
- PIP forms

Implementation & Test Discussion
Sample topics:
- Obstacles to success?
- Things that went well?
- Things to avoid?
- Biggest surprises?
- How did you do vs. plan?
- Crises handled?
- Team dynamics in crisis?

Team Presentation

Project Measurement. Source: Practical Software Measurement, John McGarry et al.

Measurement
“If you can’t measure it, you can’t manage it” – Tom DeMarco

Fundamentals
- Don’t try to measure everything
- Align measures with: project goals & risks (basic survival mode); process improvement areas (continual improvement mode)
- Define the measurement program up front
- Monitor continuously & take action where needed

Applications
- Improve accuracy of size & cost estimates
- Improve quality
- Understand project status
- Produce more predictable schedules
- Improve organizational communication
- Faster, better-informed management decisions
- Improve software processes

Basic In-Process Measurement Examples
- Schedule: earned value vs. planned value; schedule variance
- Development: task completion; actual code completed vs. planned
- Project end game: defect creation vs. closure (variations: by severity)
- System test: % testing complete (variations: passed, failed, blocked); test time / defect; test coverage (vs. requirements, white-box code coverage)
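The earned-value comparison in the first bullet can be sketched in a few lines. This is an illustrative Python fragment (task names and point values are invented); earned value is credited only for completed tasks, and schedule variance is EV minus PV:

```python
# Hypothetical task list: planned value (PV) per task and completion status.
tasks = [
    {"name": "design",    "pv": 30, "done": True},
    {"name": "code",      "pv": 40, "done": True},
    {"name": "unit test", "pv": 20, "done": False},
    {"name": "sys test",  "pv": 10, "done": False},
]

pv = sum(t["pv"] for t in tasks)               # planned value to date
ev = sum(t["pv"] for t in tasks if t["done"])  # earned value
sv = ev - pv                                   # schedule variance (negative = behind)
print(f"PV={pv}, EV={ev}, SV={sv}")
```

Here the project has earned 70 of 100 planned points, so the schedule variance is -30: the team is behind plan.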

Process Improvement Measurement Examples
- Quality: defect density; post-deployment defect density
- Inspection effectiveness: defects / inspection hour
- Estimation accuracy
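The first two examples above are simple ratios. An illustrative computation (all counts are made up):

```python
# Invented sample data for one inspected component.
defects_found = 18
size_kloc = 4.5          # thousand lines of code
inspection_hours = 12.0

defect_density = defects_found / size_kloc          # defects per KLOC
inspection_rate = defects_found / inspection_hours  # defects found per inspection hour
print(f"defect density: {defect_density:.1f} defects/KLOC")
print(f"inspection effectiveness: {inspection_rate:.2f} defects/hour")
```

Tracked over several cycles, a falling defect density with steady inspection effectiveness suggests real quality improvement rather than weaker inspections.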

Why Measure?
- Support short- & long-term decision making
- A mature software organization (CMMI level?) uses measurement to: plan & evaluate proposed projects; objectively track actual performance against plan; guide process improvement decisions; assess business & technical performance
- Organizations need the right kind of information, at the right time, to make the right decisions

Measurement in the Software Lifecycle
- Plan the change
- Do – carry out the change
- Check – observe the effects of the change
- Act – decide on additional areas for improvement
- Repeat
Considerations: cost, schedule, capability, quality

Measurement Psychological Effects
- Measurement used as a measure of individual performance
- Hawthorne effect
- Measurement errors: conscious (rounding, pencil-whipping, i.e. false data entry); unintentional (inadvertent slips, or technique errors, i.e. consistent)

Use of Measures
- Process measures – time-oriented; include defect levels, events & cost elements; used to improve the software development & maintenance process
- Product measures – deliverables & artifacts such as documents; include size, complexity, design features, performance & quality levels
- Project measures – project characteristics and execution; include # of developers, cost, schedule, productivity
- Resource measures – resource utilization; include training, costs, speed & ergonomic data

Measurement Uses
Objective information to help:
- Communicate effectively
- Track specific project objectives
- Identify & correct problems early
- Make key trade-off decisions
- Justify decisions

Glossary
- Entity – object or event (e.g. personnel, materials, tools & methods)
- Attribute – feature of an entity (e.g. # LOC inspected, # defects found, inspection time)
- Measurement – numbers and symbols assigned to attributes to describe them
- Measure – quantitative assessment of a product/process attribute (e.g. defect density, test pass rate, cyclomatic complexity)
- Measurement reliability – consistency of measurements, assuming no change to method or subject
- Software validity – proof that the software is trouble-free & functions correctly (i.e. high quality)
- Predictive validity – accuracy of model estimates
- Measurement errors – systematic (associated with validity) & random (associated with reliability)
- Software metrics – approach to measuring some attribute
- Defect – product anomaly
- Failure – termination of a product’s ability to perform a required function

PSM Measurement Process
- Measurement Plan
  - Information need – e.g.: What is the quality of the product? Are we on schedule? Are we within budget? How productive is the team?
  - Measurable concept – measured entities to satisfy the need (abstract level, e.g. productivity)
  - Measurement construct – what will be measured? How will data be combined? (e.g. size, effort)
  - Measurement procedure – defines the mechanics for collecting and organizing data
- Perform Measurement
- Evaluate Measurement

Measurement Construct (diagram): attributes are quantified by a measurement method into base measures; a measurement function combines two or more base measures into derived measures; an analysis model combines derived measures into an indicator, which is interpreted against decision criteria.

Attributes
- Attribute: distinguishable property or characteristic of a software entity (entities: processes, products, projects and resources)
- Qualitative or quantitative measure

Base Measure
- Measure of an attribute (one-to-one relationship)
- Measurement method: attribute quantification with respect to a scale
- Method type: subjective (e.g. high, medium, low) or objective (e.g. KLOC)
- Scale: ratio, interval, ordinal, nominal
- Unit of measurement: e.g. hours, pages, KLOC

Derived Measure
- Function of 2 or more base measures
- Measurement function: algorithm for deriving data (e.g. productivity = KLOC / developer hours)
Indicator
- Estimate or evaluation
- Analysis model: algorithm/calculation using 2 or more base and/or derived measures
- Decision criteria: numerical thresholds, targets, limits, etc. used to determine the need for action or further investigation

Measurement Construct Examples
Productivity
- Attributes: hours, KLOC
- Base measures: effort (count total hours), size (KLOC counter)
- Derived measure: size / effort = productivity
- Analysis model: compute mean, compute standard deviation
- Indicator: productivity mean with 2σ confidence limits
Quality
- Attributes: defects, KLOC
- Base measures: # defects (count defects), size (KLOC counter)
- Derived measure: # defects / size = defect rate
- Indicator: defect rate control: baseline mean, control limits & measured defect rate
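The productivity construct can be walked end to end in code: base measures in, derived measure, analysis model, indicator out. A Python sketch (component sizes and effort hours are invented for illustration):

```python
from statistics import mean, pstdev

# Base measures per component: size (KLOC counter) and effort (total hours).
components = [  # (size_kloc, effort_hours) -- illustrative data
    (2.0, 40), (1.5, 25), (3.0, 75), (2.5, 50),
]

# Derived measure: size / effort = productivity (KLOC per hour).
productivity = [kloc / hours for kloc, hours in components]

# Analysis model: compute mean and standard deviation.
m = mean(productivity)
s = pstdev(productivity)

# Indicator: productivity mean with 2-sigma confidence limits.
lower, upper = m - 2 * s, m + 2 * s
print(f"mean={m:.3f} KLOC/hr, 2-sigma limits=({lower:.4f}, {upper:.4f})")
```

A component whose productivity falls outside the 2σ limits would trigger the decision criteria: investigate, rather than automatically act.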

More Measurement Construct Examples
Coding
- Base measure: schedule (w.r.t. coded units)
- Derived measures: planned units, actual units
- Analysis model: subtract units completed from planned units
- Indicator: planned versus actual units complete + variance
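The coding construct reduces to one subtraction. A minimal sketch, with invented unit counts:

```python
# Illustrative planned vs. actual coded units at a status date.
planned_units = 24   # units scheduled to be coded by this date
actual_units = 19    # units actually complete

variance = actual_units - planned_units  # negative = behind plan
pct_complete = actual_units / planned_units
print(f"variance: {variance} units ({pct_complete:.0%} of plan)")
```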

Class Measurement Construct Examples (exercise)
Coding
- Base measure:
- Derived measure:
- Analysis model:
- Indicator:

Measurement Planning: Identify Candidate Information Needs
- Project objectives: cost, schedule, quality, capability
- Risks: prioritize. One approach: probability of occurrence × project impact = project exposure (e.g. schedule, budget, reliability, dependencies, product volatility)
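The exposure formula above gives a simple way to rank risks. An illustrative sketch (risk names, probabilities, and impact scores are all invented):

```python
# Hypothetical risk register: (name, probability of occurrence, impact score).
risks = [
    ("schedule slip",      0.6, 8),
    ("budget overrun",     0.3, 9),
    ("reliability defect", 0.2, 10),
    ("requirements churn", 0.5, 5),
]

# Exposure = probability x impact; sort highest exposure first.
exposures = sorted(
    ((name, p * impact) for name, p, impact in risks),
    key=lambda r: r[1], reverse=True,
)
for name, exposure in exposures:
    print(f"{name}: exposure {exposure:.1f}")
```

Note that a modest-probability, high-impact risk ("reliability defect") can rank below a likelier, milder one; some teams track high-impact risks separately regardless of exposure score.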

PSM Common Information Categories
- Schedule & progress
- Resources & cost
- Product size & stability
- Product quality
- Process performance
- Technology effectiveness
- Customer satisfaction

PSM Common Information Categories – Measurement Concepts
- Schedule & progress: milestone dates/completion, EV/PV
- Resources & cost: staff level, effort, budget, expenditures
- Product size & stability: KLOC/FP, # requirements, # interfaces
- Product quality: defects, defect age, MTBF, complexity
- Process performance: productivity, rework effort, yield
- Technology effectiveness: requirements coverage
- Customer satisfaction: customer feedback, satisfaction ratings, support requests, support time, willingness to repurchase

Select & Specify Measures – Considerations
- Utilize existing data collection mechanisms
- Be as invisible as possible
- Limit categories & choices
- Use automated methods over manual
- Beware of accuracy issues (e.g. timecards)
- Frequency needs to be enough to support ongoing decision making (alternative: gate processes)

Measurement Construct (template)
- Information need
- Measurable concept
- Relevant entities
- Attributes
- Base measures: measurement method, type of method, scale, type of scale, unit of measurement
- Derived measures: measurement function
- Indicator: analysis model, decision criteria

Project Measurement Plan Template (from PSM figure 3-10, p. 56)
- Introduction
- Project description
- Measurement roles, responsibilities & communications
- Description of project information needs
- Measurement specifications (i.e. constructs)
- Project aggregation structures
- Reporting mechanisms & periodicity

Team Project Post-Mortem
Why?
- Insanity (repeating the same process and expecting different results)
- Continuous improvement: a mechanism to learn & improve
- Improve by changing processes, or by better following current processes
- Tracking process improvements during the project: Process Improvement Proposals (PIPs)
Post-mortem areas to consider:
- Better personal practices
- Improved tools
- Process changes

Cycle 2 Measurement Plan
- Identify Cycle 2 risks & information needs
- Review & revise measures & create measurement constructs
- Document in a measurement plan

Postmortem Process
- Team discussion of project data
- Review & critique of roles

Postmortem Process
Review process data:
- Review of cycle data, including SUMP & SUMQ forms
- Examine data on team & team-member activities & accomplishments
- Identify where the process worked & where it didn’t
Quality review:
- Analysis of the team’s defect data
- Actual performance vs. plan
- Lessons learned
- Opportunities for improvement
- Problems to be corrected in the future
- PIP forms for all improvement suggestions
Role evaluations:
- What worked? Problems? Improvement areas?
- Improvement goals for the next cycle / project?

Cycle Report
- Table of contents
- Summary
- Role reports:
  - Leadership – leadership perspective: motivational & commitment issues, meeting facilitation, required instructor support
  - Development – effectiveness of development strategy, design & implementation issues
  - Planning – team’s performance vs. plan, improvements to the planning process
  - Quality / Process – process discipline, adherence, documentation, PIPs & analysis, inspections; cross-team system testing planning & execution
  - Support – facilities, CM & change control, change activity data & change handling, ITL
- Engineer reports – individual assessments

Role Evaluations & PEER Forms
- Consider & fill out PEER forms: ratings (1-5) on work, team & project performance, roles & team members
- Additional role-evaluation suggestions: give constructive feedback; discuss behaviors or the product, not the person
- Team leaders fill out the TEAM EVALUATION form

Cycle 1 Project Notebook Update
- Updated requirements & design documents: conceptual design, SRS, SDS, System Test Plan, user documentation
- Updated process descriptions: baseline processes, continuous process improvement, CM
- Tracking forms: ITL, LOGD, inspection forms, LOGTEST
- Planning & actual performance: team task, schedule, SUMP, SUMQ, SUMS, SUMTASK, CCR

Due July 10 Class
- Cycle 1 reports / post-mortem
- Cycle 1 results presentation
- Cycle 2 project plan
- Cycle 2 measurement plan

Cycle 1 Audit