Dashboard Metrics
Presented by: Lita Marishak and Robert Mosser
ASEE Workshop, 2/17/2001
Visual Presentation of Metrics

Communicate large amounts of data within a small space and time frame
+ Quick presentation of data with comparisons
+ Clear and accurate presentation

Allow ease of understanding
+ Common, easy-to-understand format
+ "Everyday" feel to the presentation

Facilitate decision-making
+ Objective data with comparisons
+ Timely and accurate information

Gain economic acceptance for current and future efforts
+ Real dollar impact/benefit presentation
+ An excellent format for check signers, bean counters, and anyone with something to gain from saving or spending money more wisely
A Dashboard Example: Graphics

Graphic types used on the example dashboard:
+ Traffic signal
+ Indicator light
+ Meter/gauge
+ Bar graph
+ Column bar graph
+ Line graph
+ Pie chart
A Dashboard Example: Categories
+ Performance/Status
+ Coverage
+ Completion
+ Productivity
+ Quality
A Dashboard Example: Performance/Status Section

[Sample dashboard for BOGUS PRODUCT 2, Ver. 2.5, reporting period 11/6/2000 to 12/7/2000. Three rows are tracked against expected values, with This Period, Cumulative, Remaining, and Status columns: 1. Defects Found (310% of expected found); 2. Test Cases Successfully Run (100% complete); 3. Elapsed Time in weeks (306.7% of expected time spent).]
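The status percentages on this slide (310% found, 306.7% spent, 100% complete) are all the same actual-versus-expected ratio. A minimal sketch of that calculation, with illustrative numbers rather than the slide's underlying data:

```python
# Hypothetical sketch of the Performance/Status percentages.
# The function name and sample values are illustrative, not taken
# from the original dashboard.

def percent_of_expected(actual, expected):
    """Actual as a percentage of expected.

    310.0 means 3.1x the estimate (e.g. defects found);
    100.0 means exactly on plan (e.g. test cases run).
    """
    return 100.0 * actual / expected

print(percent_of_expected(31, 10))    # defects: 3.1x the estimate
print(percent_of_expected(120, 120))  # test cases: exactly on plan
```

Reporting every row as a percentage of its own estimate is what lets defects, test cases, and elapsed time share one comparable status column.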
Effective Do's and Don'ts

DO
+ Present the data and tell the truth
+ Emphasize substance instead of methodology or graphic design
+ Reveal the data at several levels, from broad overview to fine detail
+ Use accurate comparisons
+ Provide the greatest amount of data with the least amount of ink in the smallest space
+ Represent numbers and the corresponding graphic sizes in the same proportions
+ Show data variation
Effective Do's and Don'ts (cont'd)

DON'T
+ Let the information lie
+ Get graphically complicated
+ Show design variation instead of data variation
+ Forget to label the data right on the graphic
+ Distort the visual representation of the data
+ Forget to provide the actual numbers used to create the graphics
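The "same proportions" and "don't distort" rules above are what Tufte (cited in the references) quantifies as the Lie Factor: the size of the effect shown in the graphic divided by the size of the effect in the data. A small sketch of the check, with illustrative bar heights and data values:

```python
# Sketch of Tufte's Lie Factor, assuming simple before/after values.
# All numbers below are illustrative examples, not data from the slides.

def lie_factor(graphic_before, graphic_after, data_before, data_after):
    """Ratio of the change shown in the graphic to the change in the data.

    Values near 1.0 mean the graphic is honest; values well above or
    below 1.0 mean the visual exaggerates or understates the effect.
    """
    graphic_effect = (graphic_after - graphic_before) / graphic_before
    data_effect = (data_after - data_before) / data_before
    return graphic_effect / data_effect

# A value that doubled (10 -> 20) drawn as a bar that tripled in height
# (40px -> 120px) overstates the effect by a factor of 2.
print(lie_factor(40, 120, 10, 20))  # 2.0
```

Keeping this ratio at 1.0 is exactly the "represent numbers and graphic sizes in the same proportions" rule from the DO list.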
Conclusion
+ Benefits
+ Client Acceptance
+ Next Steps
References
+ Edward R. Tufte, The Visual Display of Quantitative Information, Graphics Press, 1983
+ Edward R. Tufte, Envisioning Information, Graphics Press, 1990
+ Edward R. Tufte, Visual Explanations, Graphics Press, 1997
+ Ben Shneiderman, Designing the User Interface, 3rd Edition, 1998
+ Laura Arlov, GUI Design for Dummies, 1997
+ Darrell Huff, How to Lie with Statistics, 1954
+ Elizabeth Lindholm, "There's More to Graphics than Making Pretty Pictures. It's Called Analysis," Information Center, August 1987
+ Lloyd Dobyns and Clare Crawford-Mason, Thinking About Quality: Progress, Wisdom, and the Deming Philosophy, 1994
+ A. J. Cameron, A Guide to Graphs, 1970
+ William S. Cleveland, The Elements of Graphing Data, 1985
+ Allan C. Haskell, How to Make and Use Graphic Charts, 1920
+ Robert L. Harris, Information Graphics: A Comprehensive Illustrated Reference, 1996
+ Mary Eleanor Spear, Practical Charting Techniques, 1969
+ Jan V. White, Using Charts and Graphs: 1000 Ideas for Visual Persuasion, 1984
Supplementary Materials
+ Related Data Sheets for Example Dashboard
+ Additional Dashboard Example
+ Related Data Sheets
BOGUS PRODUCT 2 DASHBOARD DATA SHEET

[Data sheet behind the example dashboard. Charts and tables include: 1. Defects Found (expected vs. actual); Test Cases Run (expected vs. actual); Time Spent (expected vs. actual); 4./5. Expected and Actual Test Cases by OS (Win NT, Win 98, Win 95); 8./9. Expected and Actual Test Cases by Browser (IE 5.0, Netscape); 11. Test Case Execution Status; 16. Q/A Rework by Caused and Q/A Phases (Rework: 474 hours; Lost Time: 91 hours); Defects by Status and Q/A Phase; Defects by Status and Priority; % Rework and Lost Hours by phase (Requirements, Design, Code, Test) for both Creation and Execution. 1,192 test cases expected to be created.]

Definitions:
+ Total Expected: the number of test cases estimated to be created or executed during this reporting period.
+ Actual: the number of test cases actually created or executed during this reporting period.
+ Total Overdue: the number of test cases expected to be created or executed up to and including this reporting period, less the number actually created or executed up to and including this reporting period.
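The Total Overdue definition above is a cumulative-expected-minus-cumulative-actual calculation. A minimal sketch, with hypothetical per-period counts (the function name and sample data are illustrative):

```python
# Sketch of the data sheet's "Total Overdue" definition: for each reporting
# period, sum of expected test cases to date less actual test cases to date.
# Per-period inputs below are made-up examples, not the product's real data.

def total_overdue(expected_by_period, actual_by_period):
    """Cumulative expected minus cumulative actual, per reporting period."""
    overdue = []
    cum_expected = cum_actual = 0
    for expected, actual in zip(expected_by_period, actual_by_period):
        cum_expected += expected
        cum_actual += actual
        overdue.append(cum_expected - cum_actual)
    return overdue

# Teams fall 2 behind, then 3 behind, then catch up in period 3.
print(total_overdue([10, 10, 10], [8, 9, 13]))  # [2, 3, 0]
```

Because the metric is cumulative, a team that catches up later sees its overdue count return to zero, which is why the dashboard can show 0 overdue even after slow periods.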
BOGUS PRODUCT 2 DASHBOARD DATA SHEET, Page 2

[Charts: 6. Expected Test Cases by Functionality and 7. Actual Test Cases by Functionality (Funct/Struc, MDAC, Environ); QA Creation Statistics; 12. QA Execution Statistics; 15. Expected vs. Found Defects; 18. Defects by Caused Phase & Component.]
BOGUS PRODUCT 1 DASHBOARD DATA SHEET

[Additional dashboard example. Charts and tables include: 1. Defects Found (expected vs. actual); 2. Test Cases Run (expected vs. actual); Time Spent (expected vs. actual); 4./5. Expected and Actual Test Cases by Type; 6./7. Expected and Actual Test Cases by Priority; 8. Test Case Execution Status; 13. Q/A Rework by Caused and Q/A Phases (Rework: 8 hours; Lost Time: 0 hours); Expected vs. Found Defects; 15. Defects by Status and Q/A Phase; Expected Defects by Status & Severity; Percent Rework Hours. 150 test cases expected to be created; 1,637 expected to be executed.]

The Total Expected, Actual, and Total Overdue definitions are the same as on the BOGUS PRODUCT 2 data sheet.
BOGUS PRODUCT 1 DASHBOARD DATA SHEET, Page 2

[Charts: 9. Test Case Creation Statistics; 10. Test Case Execution Statistics; 16. Defects by Caused Phase & Component.]