1  6/26/2007  SE 652  2007_6_26_TestResults_PSMp1.ppt
   Team Software Project (TSP)
   June 26, 2006
   System Test

2  Outline
   - Remaining Session Plan & Discussion
   - System Test Plan Discussion
   - Mythical Man Month
   - System Test Plan Recap
   - Metrics Presentations
   - More on Measurement
   - Next Phases
     - Cycle 1 Test
     - Cycle 1 Post-Mortem & Presentations
     - Cycle 2 Plan & Strategy

3  Due Today
   - Key Metrics Presentation (10-15 minutes)
   - All Implementation Quality Records (LOGD, CCRs, etc.)
   - Final code (source & executable)
   - Updated Products (code components, SRS, HLD, User Documentation)
   - Intermediate Products (e.g. Unit Test Plans)
   - Configuration Management Plan
   - Release CD: Application, User Guide, Release Letter
   Note: No class on July 3

4 Project Performance Discussion

5  Remaining Lectures Plan/Discussion
   July 10 – Cycle 1 Test Complete & Post-Mortem
     - Cycle 1 Results Presentation & Discussion
     - Cycle 1 Reports & Post-Mortem
     - Measurement
     - Team audit
   July 17 – Cycle 2 Launch
     - Cycle 2 Launch, Project & Measurement Planning
     - Peopleware topics: Management, Teams, Open Kimono, Quality, Hiring/Morale, …
   July 24 – Cycle 2 Requirements Complete
     - Cycle 2 Requirements
     - Death March Projects
   July 31 – Cycle 2 Implementation Complete
     - System Test Plan Baselined
     - Cycle 2 Design & Implementation
     - Process topics – CMMI, TL-9000, ISO
   August 7 – Cycle 2 Test Complete
     - Cycle 2 Test Complete
     - Cycle 2 Post-Mortem Complete
   August 14 – Course Review
     - Course Review
     - Class exercise
     - Final

6 Remaining Course Topics Discussion

7  System Test Schedule
   Note: Assumes system has already passed Integration Test
   - Full feature to system test and instructor by COB June 25, including:
     - Test environment
     - Executable
     - User documentation (note: CCRs can be filed against user documentation)
     - Source code
   - Tester generates CCRs for all finds & fills out LOGTEST
     - Email to instructor when generated (see below)
   - Development team updates LOGD referencing CCRs
   - Required turn-around times for fixes:
     - 80% within 24 hours
     - 99% within 48 hours
   - Required test coverage short of blocking issues:
     - 80% First Pass Test Complete by June 28
     - 100% First Pass Test Complete by July 1
     - Regression Test Complete by July 3
   - Daily test reports to instructor detailing test cases executed, results & CCRs

8  System Test Plan Recap
   Areas to cover:
   - Installation
   - Start-up
   - All required functions available & working as specified
   - Diabolical cases (e.g. power failures, corner cases, incorrect handling)
   - Performance
   - Usability
   Includes:
   - Test cases you plan to run (numbered/named)
   - Expected results
   - Ordering of testing & dependencies
   - Supporting materials needed
   - Traceability to requirements

9  Release "Letters"
   Purpose
   What's in it?
   - Version Information
   - Release contents. Examples:
     - All functionality defined in Change Counter Requirements v0.6 except GUI
     - Phase 1 features as defined in project plan x.y
     - Feature 1, Feature 2, Feature 3 as defined by …
   - Known Problems
     - Change Request IDs w/ brief customer-oriented description
   - Fixed Problems
   - Upgrade Information
   - Other?

10  Implementation Status
    - Implementation experience
    - Unit/Integration experience
    - Problems / Rework?
    - PIP forms

11  Implementation & Test Discussion
    Sample topics:
    - Obstacles to success?
    - Things that went well?
    - Things to avoid?
    - Biggest surprises?
    - How did you do vs. plan?
    - Crises handled?
    - Team dynamics in crisis?

12 Team Presentation

13  Project Measurement
    Source: Practical Software Measurement, John McGarry et al.

14  Measurement
    "If you can't measure it, you can't manage it" – Tom DeMarco

15  Fundamentals
    - Don't try to measure everything
    - Align measures with:
      - Project goals & risks (basic survival mode)
      - Process improvement areas (continual improvement mode)
    - Define the measurement program up front
    - Monitor continuously & take action where needed

16  Applications
    - Improve accuracy of size & cost estimates
    - Improve quality
    - Understand project status
    - Produce more predictable schedules
    - Improve organizational communication
    - Faster, better-informed management decisions
    - Improve software processes

17  Basic In-Process Measurement Examples
    Schedule
    - Earned Value vs. Planned Value
    - Schedule Variance
    Development
    - Task completion: actual code completed vs. planned
    Project End Game
    - Defect Creation vs. Closure (variations: severity)
    System Test
    - % Testing Complete (variations: passed, failed, blocked)
    - Test Time / Defect
    - Test Coverage (vs. requirements, white-box code coverage)
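The earned-value and schedule-variance measures above can be sketched in a few lines; this is a minimal illustration, and the task values and completion percentages are invented:

```python
# Hypothetical schedule measures: EV, PV, and schedule variance.
tasks = [
    # (planned_value, fraction_complete)
    (10.0, 1.00),  # finished task earns its full planned value
    (20.0, 0.50),
    (10.0, 0.00),
]

planned_value = sum(pv for pv, _ in tasks)           # PV: value planned to date
earned_value = sum(pv * done for pv, done in tasks)  # EV: value actually earned
schedule_variance = earned_value - planned_value     # SV < 0 means behind plan

print(planned_value, earned_value, schedule_variance)  # 40.0 20.0 -20.0
```

A negative schedule variance flags the project as behind plan regardless of calendar dates, which is why it pairs well with the milestone-based measures on the slide.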

18  Process Improvement Measurement Examples
    Quality
    - Defect density
    - Post-deployment defect density
    Inspection Effectiveness
    - Defects / inspection hour
    Estimation Accuracy
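Both quality ratios named above are simple quotients; a quick sketch, with all figures invented for illustration:

```python
def defect_density(defects: int, kloc: float) -> float:
    """Defects per thousand lines of code."""
    return defects / kloc

def inspection_effectiveness(defects_found: int, inspection_hours: float) -> float:
    """Defects found per inspection hour."""
    return defects_found / inspection_hours

print(defect_density(12, 4.0))           # 3.0 defects/KLOC
print(inspection_effectiveness(9, 6.0))  # 1.5 defects/hour
```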

19  Why Measure?
    - Support short & long term decision making
    - A mature software organization (CMMI level?) uses measurement to:
      - Plan & evaluate proposed projects
      - Objectively track actual performance against plan
      - Guide process improvement decisions
      - Assess business & technical performance
    - Organizations need the right kind of information, at the right time, to make the right decisions

20  Measurement in the Software Lifecycle
    - Plan
    - Do – carry out the change
    - Check – observe the effects of the change
    - Act – decide on additional areas for improvement
    - Repeat
    Considerations: cost, schedule, capability, quality

21  Measurement Psychological Effects
    - Measurement as measures of individual performance
    - Hawthorne Effect
    - Measurement Errors
      - Conscious: rounding, pencil whipping (i.e. false data entry)
      - Unintentional: inadvertent, technique (i.e. consistent)

22  Use of Measures
    - Process Measures – time oriented; includes defect levels, events & cost elements
      Used to improve software development & maintenance processes
    - Product Measures – deliverables & artifacts such as documents; includes size, complexity, design features, performance & quality levels
    - Project Measures – project characteristics and execution; includes # of developers, cost, schedule, productivity
    - Resource Measures – resource utilization; includes training, costs, speed & ergonomic data

23  Measurement Uses
    Objective information to help:
    - Communicate effectively
    - Track specific project objectives
    - Identify & correct problems early
    - Make key trade-off decisions
    - Justify decisions

24  Glossary
    - Entity – object or event (e.g. personnel, materials, tools & methods)
    - Attribute – feature of an entity (e.g. # LOC inspected, # defects found, inspection time)
    - Measurement – numbers and symbols assigned to attributes to describe them
    - Measure – quantitative assessment of a product/process attribute (e.g. defect density, test pass rate, cyclomatic complexity)
    - Measurement Reliability – consistency of measurements, assuming no change to method/subject
    - Software validity – proof that the software is trouble free & functions correctly (i.e. high quality)
    - Predictive validity – accuracy of model estimates
    - Measurement errors – systematic (associated with validity) & random (associated w/ reliability)
    - Software Metrics – approach to measuring some attribute
    - Defect – product anomaly
    - Failure – termination of a product's ability to perform a required function

25  PSM Measurement Process
    Measurement Plan
    - Information need – e.g.: What is the quality of the product? Are we on schedule? Are we within budget? How productive is the team?
    - Measurable Concept – measured entities to satisfy the need (abstract level, e.g. productivity)
    - Measurement Construct – what will be measured? How will data be combined? (e.g. size, effort)
    - Measurement Procedure – defines the mechanics for collecting and organizing data
    Perform Measurement
    Evaluate Measurement

26  Measurement Construct
    [Diagram: Attribute –(measurement method)→ Base Measure –(measurement function)→ Derived Measure –(analysis model + decision criteria)→ Indicator]

27  Attributes
    - Attribute: distinguishable property or characteristic of a software entity
      (Entities: processes, products, projects and resources)
    - Qualitative or quantitative measure

28  Base Measure
    - Measure of an attribute (one-to-one relationship)
    - Measurement method: attribute quantification with respect to a scale
      - Method type: subjective (e.g. high, medium, low) or objective (e.g. KLOC)
    - Scale: ratio, interval, ordinal, nominal
    - Unit of measurement: e.g. hours, pages, KLOC

29  Derived Measure
    - Function of 2 or more base measures
    - Measurement Function: algorithm for deriving data (e.g. productivity = KLOC / developer hours)
    Indicator
    - Estimate or evaluation
    - Analysis Model: algorithm/calculation using 2 or more base and/or derived measures, plus
    - Decision Criteria: numerical thresholds, targets, limits, etc. used to determine the need for action or further investigation
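The base → derived → indicator chain above can be illustrated end to end; the size and effort figures, and the 25 LOC/hour threshold used as a decision criterion, are invented for this sketch:

```python
# Base measures (collected via measurement methods)
size_kloc = 12.0      # Size: KLOC counter
effort_hours = 300.0  # Effort: total developer hours

# Derived measure (measurement function)
productivity = size_kloc / effort_hours * 1000  # LOC per developer hour

# Indicator (analysis model + decision criteria)
THRESHOLD_LOC_PER_HOUR = 25.0  # hypothetical decision criterion
needs_investigation = productivity < THRESHOLD_LOC_PER_HOUR

print(f"{productivity:.1f} LOC/hour, investigate: {needs_investigation}")
# 40.0 LOC/hour, investigate: False
```

The decision criterion is what turns a raw derived measure into an indicator: it tells the project when the number calls for action.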

30  Measurement Construct Examples
    Productivity
    - Attributes: hours, KLOC
    - Base Measures: Effort (count total hours), Size (KLOC counter)
    - Derived Measure: Size / Effort = Productivity
    - Analysis Model: compute mean, compute standard deviation
    - Indicator: productivity mean w/ 2σ confidence limits
    Quality
    - Attributes: defects, KLOC
    - Base Measures: # Defects (count defects), Size (KLOC counter)
    - Derived Measures: # Defects / Size = Defect Rate
    - Indicator: defect rate control: baseline mean, control limits & measured defect rate
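The productivity indicator above (mean with 2σ limits) amounts to a simple control check; a sketch using the standard library, with the weekly productivity samples invented:

```python
import statistics

samples = [38.0, 42.0, 40.0, 36.0, 44.0]  # LOC/hour, one per week (invented)

mean = statistics.mean(samples)
sigma = statistics.stdev(samples)  # sample standard deviation

lower, upper = mean - 2 * sigma, mean + 2 * sigma
latest = samples[-1]
in_control = lower <= latest <= upper  # decision criterion

print(f"mean={mean:.1f}, limits=({lower:.1f}, {upper:.1f}), in control: {in_control}")
```

The same pattern serves the quality example: replace the samples with per-cycle defect rates and the limits become the slide's defect-rate control limits around a baseline mean.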

31  More Measurement Construct Examples
    Coding
    - Base Measure: Schedule (w.r.t. coded units)
    - Derived Measures: planned units, actual units
    - Analysis Model: subtract units completed from planned units
    - Indicator: planned versus actual units complete + variance
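The coding-progress construct above reduces to a subtraction; a minimal sketch with invented unit counts:

```python
planned_units = 30  # units planned to be coded by this date (invented)
actual_units = 24   # units actually completed (invented)

variance = planned_units - actual_units  # analysis model: planned minus completed
behind_plan = variance > 0               # hypothetical decision criterion

print(f"planned={planned_units}, actual={actual_units}, variance={variance}")
```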

32  Class Measurement Construct Examples
    Coding
    - Base Measure:
    - Derived Measure:
    - Analysis Model:
    - Indicator:

33  Measurement Planning: Identify Candidate Information Needs
    - Project Objectives: cost, schedule, quality, capability
    - Risks
      - Prioritize. One approach: probability of occurrence × project impact = project exposure
      - e.g. schedule, budget, reliability, dependencies, product volatility
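The exposure formula above gives a direct way to rank risks; in this sketch the risk names, probabilities, and impact values are all invented:

```python
risks = {
    # name: (probability of occurrence, project impact in arbitrary cost units)
    "schedule slip": (0.6, 50),
    "budget overrun": (0.3, 80),
    "reliability shortfall": (0.2, 100),
}

# exposure = probability x impact; sort highest exposure first
exposures = sorted(
    ((name, p * impact) for name, (p, impact) in risks.items()),
    key=lambda item: item[1],
    reverse=True,
)
for name, exposure in exposures:
    print(f"{name}: exposure {exposure:.0f}")
```

Note the ranking can differ from intuition: the highest-impact risk here ranks last because its probability is low, which is the point of multiplying the two factors.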

34  PSM Common Information Categories
    - Schedule & Progress
    - Resources & Cost
    - Product Size & Stability
    - Product Quality
    - Process Performance
    - Technology Effectiveness
    - Customer Satisfaction

35  PSM Common Information Categories – Measurement Concepts
    - Schedule & Progress – milestone dates/completion, EV/PV
    - Resources & Cost – staff level, effort, budget, expenditures
    - Product Size & Stability – KLOC/FP, # requirements, # interfaces
    - Product Quality – defects, defect age, MTBF, complexity
    - Process Performance – productivity, rework effort, yield
    - Technology Effectiveness – requirements coverage
    - Customer Satisfaction – customer feedback, satisfaction ratings, support requests, support time, willingness to repurchase

36  Select & Specify Measures
    Considerations:
    - Utilize existing data collection mechanisms
    - Be as invisible as possible
    - Limit categories & choices
    - Prefer automated methods over manual
    - Beware of accuracy issues (e.g. timecards)
    - Collection frequency must be high enough to support ongoing decision making (alternative: gate processes)

37  Measurement Construct
    - Information Need
    - Measurable Concept
    - Relevant Entities
    - Attributes
    - Base Measures
      - Measurement Method, Type of Method
      - Scale, Type of Scale
      - Unit of Measurement
    - Derived Measures
      - Measurement Function
    - Indicator
      - Analysis Model
      - Decision Criteria

38  Project Measurement Plan Template (from PSM figure 3-10, p. 56)
    - Introduction
    - Project Description
    - Measurement Roles, Responsibilities & Communications
    - Description of Project Information Needs
    - Measurement Specifications (i.e. constructs)
    - Project Aggregation Structures
    - Reporting Mechanisms & Periodicity

39  Team Project Postmortem
    Why
    - Insanity
    - Continuous improvement
    - Mechanism to learn & improve
    - Improve by changing processes or by better following current processes
    Tracking process improvements during the project
    - Process Improvement Proposals (PIPs)
    Post-Mortem areas to consider
    - Better personal practices
    - Improved tools
    - Process changes

40  Cycle 2 Measurement Plan
    - Identify Cycle 2 risks & information needs
    - Review & revise measures & create measurement constructs
    - Document in a measurement plan

41  Postmortem Process
    - Team discussion of project data
    - Review & critique of roles

42  Postmortem Process
    Review Process Data
    - Review of cycle data including SUMP & SUMQ forms
    - Examine data on team & team member activities & accomplishments
    - Identify where the process worked & where it didn't
    Quality Review
    - Analysis of team's defect data
    - Actual performance vs. plan
    - Lessons learned
    - Opportunities for improvement
    - Problems to be corrected in the future
    - PIP forms for all improvement suggestions
    Role Evaluations
    - What worked? Problems? Improvement areas?
    - Improvement goals for next cycle/project?

43  Cycle Report
    - Table of contents
    - Summary
    - Role Reports
      - Leadership – leadership perspective: motivational & commitment issues, meeting facilitation, required instructor support
      - Development – effectiveness of development strategy, design & implementation issues
      - Planning – team's performance vs. plan, improvements to planning process
      - Quality / Process – process discipline, adherence, documentation, PIPs & analysis, inspections; cross-team system testing planning & execution
      - Support – facilities, CM & change control, change activity data & change handling, ITL
    - Engineer Reports – individual assessments

44  Role Evaluations & Peer Forms
    - Consider & fill out PEER forms
      - Ratings (1-5) on work, team & project performance, roles & team members
    - Additional role evaluation suggestions
      - Constructive feedback
      - Discuss behaviors or product, not the person
    - Team leaders fill out the TEAM EVALUATION form

45  Cycle 1 Project Notebook Update
    - Updated Requirements & Design documents
      - Conceptual Design, SRS, SDS, System Test Plan, User Documentation*
    - Updated Process descriptions
      - Baseline processes, continuous process improvement, CM
    - Tracking forms
      - ITL, LOGD, Inspection forms, LOGTEST
    - Planning & actual performance
      - Team Task, Schedule, SUMP, SUMQ, SUMS, SUMTASK, CCR*

46  Due July 10 Class
    - Cycle 1 Reports / Post-Mortem
    - Cycle 1 Results Presentation
    - Cycle 2 Project Plan
    - Cycle 2 Measurement Plan

47 Cycle 1 Audit

