Common Mistakes in Performance Evaluation
The Art of Computer Systems Performance Analysis, by Raj Jain
Presented by Adel Nadjaran Toosi

Common Mistakes in Performance Evaluation (1)
1. No Goals: Goals ⇒ Techniques, Metrics, Workload
2. Biased Goals (e.g., to show that OUR system is better than THEIRS)
3. Unsystematic Approach
4. Analysis Without Understanding the Problem
5. Incorrect Performance Metrics
6. Unrepresentative Workload
7. Wrong Evaluation Technique

Common Mistakes in Performance Evaluation (2)
8. Overlook Important Parameters
9. Ignore Significant Factors
10. Inappropriate Experimental Design (see the sketch after this slide)
11. Inappropriate Level of Detail
12. No Analysis
13. Erroneous Analysis
14. No Sensitivity Analysis
15. Ignore Errors in Input
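Mistake 10, Inappropriate Experimental Design, often means varying one factor at a time instead of covering the factor combinations systematically. The sketch below enumerates a simple full-factorial design with replications; the factor names and levels are hypothetical placeholders, not taken from the slides.

```python
# Minimal sketch of a full-factorial experimental design with replications.
# Factor names and levels are hypothetical; substitute the real factors of
# the system being evaluated.
from itertools import product

factors = {
    "cpu_count": [1, 2, 4],          # hypothetical hardware factor
    "workload": ["light", "heavy"],  # hypothetical workload factor
    "cache": ["on", "off"],          # hypothetical configuration factor
}
replications = 3  # repeat each design point to estimate variability

design = [dict(zip(factors, levels)) for levels in product(*factors.values())]

for point in design:
    for rep in range(1, replications + 1):
        # run_experiment(point) would execute and measure the system here.
        print(f"replication {rep}: {point}")
```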

Common Mistakes in Performance Evaluation (3)
16. Improper Treatment of Outliers
17. Assuming No Change in the Future
18. Ignoring Variability (see the sketch after this slide)
19. Too Complex Analysis
20. Improper Presentation of Results
21. Ignoring Social Aspects
22. Omitting Assumptions and Limitations
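Mistakes 16 (Improper Treatment of Outliers) and 18 (Ignoring Variability) have a simple antidote: report the spread of the measurements along with the mean, and flag, rather than silently discard, suspicious points. The sketch below does both for a made-up set of response-time samples; the data and the 1.5 × IQR rule are illustrative assumptions.

```python
# Sketch: quantify variability and flag candidate outliers in measured data.
# The response-time samples below are invented for illustration.
import math
import statistics

samples = [12.1, 11.8, 12.4, 12.0, 11.9, 25.3, 12.2, 12.1]  # ms, hypothetical

mean = statistics.mean(samples)
sd = statistics.stdev(samples)
# Approximate 95% confidence interval for the mean (normal approximation).
half_width = 1.96 * sd / math.sqrt(len(samples))
print(f"mean = {mean:.2f} ms, 95% CI ~ ({mean - half_width:.2f}, {mean + half_width:.2f})")

# Flag (do not silently discard) points outside the 1.5 * IQR fences.
q1, _, q3 = statistics.quantiles(samples, n=4)
iqr = q3 - q1
outliers = [x for x in samples if x < q1 - 1.5 * iqr or x > q3 + 1.5 * iqr]
print("candidate outliers to investigate:", outliers)
```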

Checklist for Avoiding Common Mistakes (1)
1. Is the system correctly defined and are the goals clearly stated?
2. Are the goals stated in an unbiased manner?
3. Have all the steps of the analysis been followed systematically?
4. Is the problem clearly understood before analyzing it?
5. Are the performance metrics relevant for this problem?
6. Is the workload correct for this problem?

Checklist for Avoiding Common Mistakes (2)
7. Is the evaluation technique appropriate?
8. Is the list of parameters that affect performance complete?
9. Have all parameters that affect performance been chosen as factors to be varied?
10. Is the experimental design efficient in terms of time and results?
11. Is the level of detail proper?
12. Is the measured data presented with analysis and interpretation?

Checklist for Avoiding Common Mistakes (3)
13. Is the analysis statistically correct?
14. Has a sensitivity analysis been done? (See the sketch after this slide.)
15. Would errors in the input cause an insignificant change in the results?
16. Have the outliers in the input or output been treated properly?
17. Have the future changes in the system and workload been modeled?
18. Has the variance of the input been taken into account?
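Checklist items 14 and 15 can be checked mechanically: perturb an input by a small amount and see how much the output metric moves. The sketch below does this for a stand-in analytical model (an M/M/1 mean response time); the model and the parameter values are assumptions made only for illustration.

```python
# Sketch of a one-parameter sensitivity check against a stand-in model.
def response_time(arrival_rate, service_rate):
    """M/M/1 mean response time, used here only as a placeholder model."""
    return 1.0 / (service_rate - arrival_rate)

arrival_rate, service_rate = 8.0, 10.0  # hypothetical inputs, jobs/second
baseline = response_time(arrival_rate, service_rate)

for error in (-0.05, 0.05):             # +/- 5% error in the measured arrival rate
    perturbed = response_time(arrival_rate * (1 + error), service_rate)
    change = (perturbed - baseline) / baseline
    print(f"{error:+.0%} input error -> {change:+.1%} change in mean response time")
```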

Checklist for Avoiding Common Mistakes (4)
19. Has the variance of the results been analyzed?
20. Is the analysis easy to explain?
21. Is the presentation style suitable for its audience?
22. Have the results been presented graphically as much as possible?
23. Are the assumptions and limitations of the analysis clearly documented?

Systematic Approach to Performance Evaluation
1. State Goals and Define the System
2. List Services and Outcomes
3. Select Appropriate Metrics
4. List the Parameters
5. Select Evaluation Techniques
6. Select Workload
7. Design Experiment(s)
8. Analyze and Interpret Data
9. Present the Results
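The nine steps above can be read as the skeleton of an evaluation harness. The sketch below strings them together in code; every metric, factor, and workload name is a hypothetical placeholder, and the run_once stub stands in for actually measuring the system.

```python
# Skeleton of the systematic approach; all names are hypothetical placeholders.
from itertools import product
from statistics import mean, stdev

# Steps 1-3: goals, system definition, services, and a chosen metric.
METRIC = "response_time_ms"

# Steps 4-6: parameters selected as factors, plus the workloads.
factors = {"threads": [1, 4], "cache": ["on", "off"]}
workloads = ["read_heavy", "write_heavy"]

def run_once(config):
    """Placeholder: exercise the system under `config` and return the metric."""
    return 10.0  # a real harness would return a measured value here

# Step 7: design the experiments as a full factorial over factors and workloads.
experiments = [dict(zip(factors, levels), workload=w)
               for levels in product(*factors.values()) for w in workloads]

# Step 8: run the experiments, then analyze (mean and spread per design point).
for exp in experiments:
    results = [run_once(exp) for _ in range(5)]
    print(exp, f"{METRIC}: mean={mean(results):.1f} sd={stdev(results):.1f}")

# Step 9: presenting the results (tables, graphs) would start from these data.
```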