ITEC 451 Network Design and Analysis

You Will Learn (1):
- Specifying performance requirements
- Evaluating design alternatives
- Comparing two or more systems
- Determining the optimal value of a parameter (system tuning)
- Finding the performance bottleneck (bottleneck identification)

You Will Learn (2):
- Characterizing the load on the system (workload characterization)
- Determining the number and sizes of components (capacity planning)
- Predicting the performance at future loads (forecasting)

Basic Terms (1)
- System: any collection of hardware, software, or both
- Model: a mathematical representation of a concept, phenomenon, or system
- Metrics: the criteria used to evaluate the performance of a system
- Workload: the requests made by the users of a system

Basic Terms (2)
- Parameters: system and workload characteristics that affect system performance
- Factors: parameters that are varied in a study, especially those that depend on users
- Outliers: values (in a set of measurement data) that are too high or too low compared to the majority (a screening sketch follows)
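
As an illustration of the outlier definition above, here is a minimal Python sketch using the common 1.5*IQR screening rule; the rule and the measurement values are my own assumptions for illustration, not part of the course material, and a flagged value still needs judgment before it is discarded.

    # Flag measurements that fall far outside the interquartile range (IQR).
    import statistics

    def iqr_outliers(samples, k=1.5):
        q1, _, q3 = statistics.quantiles(samples, n=4)   # lower and upper quartiles
        iqr = q3 - q1
        low, high = q1 - k * iqr, q3 + k * iqr
        return [x for x in samples if x < low or x > high]

    # Hypothetical response times in milliseconds
    measurements = [12.1, 11.8, 12.4, 12.0, 48.9, 11.9, 12.2, 12.3, 0.3]
    print(iqr_outliers(measurements))   # flags 48.9 and 0.3 for closer inspection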

Common Mistakes in Performance Evaluation (1)
1. No Goals (the goals drive the choice of techniques, metrics, and workload)
2. Biased Goals (e.g., "to show that OUR system is better than THEIRS")
3. Unsystematic Approach
4. Analysis Without Understanding the Problem
5. Incorrect Performance Metrics
6. Unrepresentative Workload
7. Wrong Evaluation Technique

Common Mistakes in Performance Evaluation (2)
8. Overlook Important Parameters
9. Ignore Significant Factors
10. Inappropriate Experimental Design
11. Inappropriate Level of Detail
12. No Analysis
13. Erroneous Analysis
14. No Sensitivity Analysis
15. Ignore Errors in Input

Common Mistakes in Performance Evaluation (3)
16. Improper Treatment of Outliers
17. Assuming No Change in the Future
18. Ignoring Variability
19. Too Complex Analysis
20. Improper Presentation of Results
21. Ignoring Social Aspects
22. Omitting Assumptions and Limitations

Checklist for Avoiding Common Mistakes (1)
1. Is the system correctly defined and are the goals clearly stated?
2. Are the goals stated in an unbiased manner?
3. Have all the steps of the analysis been followed systematically?
4. Is the problem clearly understood before analyzing it?
5. Are the performance metrics relevant for this problem?
6. Is the workload correct for this problem?

Checklist for Avoiding Common Mistakes (2)
7. Is the evaluation technique appropriate?
8. Is the list of parameters that affect performance complete?
9. Have all parameters that affect performance been chosen as factors to be varied?
10. Is the experimental design efficient in terms of time and results?
11. Is the level of detail proper?
12. Is the measured data presented with analysis and interpretation?

Checklist for Avoiding Common Mistakes (3)
13. Is the analysis statistically correct?
14. Has the sensitivity analysis been done?
15. Would errors in the input cause an insignificant change in the results?
16. Have the outliers in the input or output been treated properly?
17. Have the future changes in the system and workload been modeled?
18. Has the variance of the input been taken into account?

Checklist for Avoiding Common Mistakes (4)
19. Has the variance of the results been analyzed?
20. Is the analysis easy to explain?
21. Is the presentation style suitable for its audience?
22. Have the results been presented graphically as much as possible?
23. Are the assumptions and limitations of the analysis clearly documented?

Systematic Approach to Performance Evaluation
1. State Goals and Define the System
2. List Services and Outcomes
3. Select Appropriate Metrics
4. List the Parameters
5. Select Evaluation Techniques
6. Select Workload
7. Design Experiment(s)
8. Analyze and Interpret Data
9. Present the Results

State Goals and Define the System
- Identify the goal of the study
  - Not trivial, but it will affect every decision or choice you make down the road
- Clearly define the system
  - Where you draw the boundary will dictate the choice of model and affect the choice of metrics and workload

List Services and Outcomes
- Identify the services offered by the system
- For each service, identify all possible outcomes
- What's the point? These will help in the selection of appropriate metrics

Select Appropriate Metrics (1)
- These are the criteria for performance evaluation
- Desired properties: Specific, Measurable, Acceptable, Realizable, Thorough (Examples?)
- Prefer metrics that
  - have low variability,
  - are non-redundant, and
  - are complete

Select Appropriate Metrics (2) - Examples
- Successful Service Rate – Throughput
- Frequency of Correct Results – Reliability
- Being Available When Needed – Availability
- Serving Users Fairly – Fairness
- Efficiency of Resource Usage – Utilization
Now, how to measure these? (A small sketch follows.)
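
To make these concrete, here is a hypothetical Python sketch that turns raw observations into the metrics named above; the log format, observation window, and downtime figures are invented placeholders, not data from this course.

    # Compute example metrics from a made-up observation of a system.
    requests = [
        # (completed_successfully, service_time_in_seconds)
        (True, 0.12), (True, 0.10), (False, 0.30), (True, 0.15), (True, 0.11),
    ]
    observation_window = 2.0   # seconds the system was observed (assumed)
    downtime = 0.2             # seconds the system was unavailable (assumed)

    completed = sum(1 for ok, _ in requests if ok)
    busy_time = sum(t for _, t in requests)

    throughput = completed / observation_window        # successful services per second
    reliability = completed / len(requests)            # fraction of correct results
    availability = 1 - downtime / observation_window   # fraction of time usable
    utilization = busy_time / observation_window       # fraction of time the resource was busy

    print(f"throughput={throughput:.2f}/s reliability={reliability:.2f} "
          f"availability={availability:.2f} utilization={utilization:.2f}")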

Select Appropriate Metrics (3) - A Classification
- Higher is Better (Examples?)
- Lower is Better (Examples?)
- Nominal is Best (Examples?)

Select Appropriate Metrics (4) - Criteria for Metric Set Selection
- Low variability
  - Helps reduce the number of runs needed
  - Advice: avoid ratios of two variables
- Non-redundancy
  - Helps make the results less confusing and reduces the effort
  - Try to find a relationship between metrics; if a simple relationship exists, keep only one (a sketch of such a check follows)
- Completeness
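
One rough way to act on the non-redundancy advice is to check whether two candidate metrics are (nearly) linearly related across runs; if so, one of them can probably be dropped. The sketch below assumes Python 3.10+ for statistics.correlation, and the run data and the 0.95 threshold are invented for illustration.

    from statistics import correlation   # Pearson correlation, Python 3.10+

    runs = {
        "throughput":    [95, 180, 260, 330, 390],
        "utilization":   [0.19, 0.36, 0.52, 0.66, 0.78],
        "response_time": [0.11, 0.12, 0.15, 0.22, 0.41],
    }

    names = list(runs)
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            r = correlation(runs[names[i]], runs[names[j]])
            if abs(r) > 0.95:   # arbitrary redundancy threshold
                print(f"{names[i]} and {names[j]} look redundant (r = {r:.3f})")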

Select Appropriate Metrics (5) - Summary
- Chosen metrics should be measurable (you can assign a numerical value to them)
- Acceptable
- Easy to work with (i.e., can be measured easily)
- Avoid redundancy
- Pay attention to the units used
- Sanity check: examine the boundary conditions (e.g., best system, ideal workload) to see if the metric is sensible

Systematic Approach to Performance Evaluation
1. State Goals and Define the System
2. List Services and Outcomes
3. Select Appropriate Metrics
4. List the Parameters
5. Select Evaluation Techniques
6. Select Workload
7. Design Experiment(s)
8. Analyze and Interpret Data
9. Present the Results

List the Parameters
- Identify all system and workload parameters
  - System parameters: characteristics of the system that affect system performance
  - Workload parameters: characteristics of usage (or workload) that affect system performance
- Categorize them according to their effects on system performance
- Determine the range of their variation or expected variation
- Decide on one, or at most a couple, to vary while keeping the others fixed (a sweep sketch follows)
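
A skeletal sketch of the "vary one factor, hold the rest fixed" idea is shown below; the parameter names and the run_experiment() function are placeholders invented for illustration, to be replaced by your own system and measurement harness.

    def run_experiment(buffer_size, arrival_rate, link_capacity):
        # ... measure or simulate the system here and return the chosen metric ...
        return 0.0   # placeholder

    fixed = {"arrival_rate": 100, "link_capacity": 10_000_000}   # parameters held constant
    factor_levels = [16, 32, 64, 128, 256]                       # buffer sizes to try

    results = {level: run_experiment(buffer_size=level, **fixed) for level in factor_levels}
    for level, metric in results.items():
        print(f"buffer_size={level}: metric={metric}")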

Select Evaluation Technique(s) (1)
Three techniques:
- Measurement
- Simulation
- Analytical Modeling

Select Evaluation Technique(s) (2) - Measurement, Simulation, or Analysis?
- It can be a combination of two or all three
- Use the goal of the study to guide your decision
- Remember, each of these techniques has its pros and cons

Select Evaluation Technique(s) (3) - Measurement
(+) Provides realistic data
(+) Can test the limits on load
(-) The system or a prototype must be working
(-) The prototype may not represent the actual system
(-) Not that easy to correlate cause and effect
Challenges:
- Defining appropriate metrics
- Using an appropriate workload
- Applying statistical tools to analyze the data (a small example follows)
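
As one example of such a statistical tool, the sketch below computes a confidence interval for the mean of repeated runs; the run data are invented, and the t-quantile of 2.262 is the 97.5% point for 9 degrees of freedom (i.e., 10 runs).

    import math
    import statistics

    runs = [0.52, 0.47, 0.55, 0.49, 0.51, 0.58, 0.46, 0.50, 0.53, 0.48]  # response times (s)

    mean = statistics.mean(runs)
    sem = statistics.stdev(runs) / math.sqrt(len(runs))   # standard error of the mean
    t_975 = 2.262                                         # t distribution, 9 degrees of freedom
    half_width = t_975 * sem

    print(f"mean response time = {mean:.3f} s +/- {half_width:.3f} s (95% CI)")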

Select Evaluation Technique(s) (4) - Simulation
(+) Less expensive than building a prototype
(+) Can test under more load scenarios
(-) Synthetic, since the model is not the actual system
(-) Cannot be used to make guarantees on expected performance
Challenges:
- Need to be careful about when to use simulation
- Need to get the model right
- Need to represent the results well (graphical tools)
- Need to learn simulation tools
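
To give a taste of what a small simulation looks like, here is a sketch of a single-server (M/M/1) queue; the model, rates, and seed are my own illustrative choices, not ones the course prescribes.

    import random

    def simulate_mm1(lam, mu, n_customers, seed=42):
        """Simulate an M/M/1 queue via Lindley's recurrence and return the
        mean response (waiting + service) time over n_customers."""
        rng = random.Random(seed)
        wait = 0.0            # waiting time of the current customer
        prev_service = None   # service time of the previous customer
        total_response = 0.0
        for _ in range(n_customers):
            service = rng.expovariate(mu)
            if prev_service is not None:
                interarrival = rng.expovariate(lam)
                # Lindley: W[k] = max(0, W[k-1] + S[k-1] - A[k])
                wait = max(0.0, wait + prev_service - interarrival)
            total_response += wait + service
            prev_service = service
        return total_response / n_customers

    print(f"mean response time ~ {simulate_mm1(lam=8.0, mu=10.0, n_customers=200_000):.3f} s")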

Select Evaluation Technique(s) (5) - Analytical Modeling
(+) Can make strong guarantees on expected behavior
(+) Can provide insight into cause and effect
(+) Does not require building a prototype
(-) The performance prediction is only as good as the model
Challenges:
- Significant learning curve
- Mathematically involved
- Choosing the right model (this is where the art lies)
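
To make the contrast with simulation concrete, here is one classical closed-form result an analytical model can provide; the M/M/1 queue (Poisson arrivals at rate λ, exponential service at rate μ) is my illustrative choice, not part of the slides:

    \rho = \frac{\lambda}{\mu} < 1, \qquad
    E[T] = \frac{1}{\mu - \lambda}, \qquad
    E[N] = \lambda \, E[T] = \frac{\rho}{1 - \rho} \quad (\text{Little's law})

With λ = 8 and μ = 10 this predicts a mean response time of 0.5 s, which also serves as a sanity check on the simulation sketch above.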

Select Evaluation Technique(s) (6) - Bottom Line
You can use measurement to demonstrate the feasibility of an approach. You can use measurement or simulation to show evidence that your algorithm or system performs better than competing approaches in certain situations. But if you would like to claim any properties of your algorithm (or system), the only option is to use analysis and mathematically prove your claim.

Select Evaluation Technique(s) (7) - When to Use What?
It is good to be versed in all three.
1. You can start with measurement or simulation to get a feel for the model or the expected behavior
2. Start with a simple model
3. Perform an analysis to predict the performance and prove some behavioral properties
4. Observe the actual performance to determine the validity of your model and your analysis
5. You can use simulation for the previous step if a working system is not available or feasible
6. Go back to revise the model and analysis if significant inconsistency is observed, then repeat from Step 4
7. Finally, use simulation to verify your results for large-scale data or for scenarios that cannot be modeled with the existing expertise and available time

Systematic Approach to Performance Evaluation
1. State Goals and Define the System
2. List Services and Outcomes
3. Select Appropriate Metrics
4. List the Parameters
5. Select Evaluation Techniques
6. Select Workload
7. Design Experiment(s)
8. Analyze and Interpret Data
9. Present the Results

Select Workload
- What is a workload? How do you represent it?
  - Range of values: what should the increment size be?
  - Probability distribution: find a good model that approximates reality (this may require measurement and statistical analysis); in simulation, use an appropriate random number generator to produce values (a generator sketch follows)
  - Trace from an actual system
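
Here is a small sketch of the probability-distribution option: drawing synthetic interarrival times from an exponential distribution with a fixed seed so the workload can be regenerated; the rate of 50 requests per second is an invented example.

    import random

    rng = random.Random(2024)   # seeded so the same workload can be reproduced
    arrival_rate = 50.0         # requests per second (assumed)

    t = 0.0
    arrival_times = []
    for _ in range(10):
        t += rng.expovariate(arrival_rate)   # exponential interarrival gap
        arrival_times.append(round(t, 4))

    print(arrival_times)        # timestamps to drive a simulator or load generator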

Design Experiment(s)
- The aim is to provide maximum information with minimum effort
  - Field experiments can take enormous preparation time
  - Attempt to get several experiments done in one setup
  - Explore whether you can use data collected by someone else
  - Also, explore whether you can use remote labs
  - Finally, explore whether you can use simulation without losing significant validity
  - Modifying simulation code can be time-consuming as well
- In both simulation and measurement, repeat the same experiment (for a fixed workload and fixed parameter values) a sufficient number of times for statistical validity (a replication-count sketch follows)
- Always keep the goal in mind
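
One way to decide how many repetitions are "sufficient" is to run a small pilot and size the experiment from its variability; the sketch below uses a normal approximation, and the pilot numbers, the 5% target precision, and the 1.96 quantile are assumptions for illustration.

    import math
    import statistics

    def replications_needed(pilot, rel_precision=0.05, z=1.96):
        """Estimate how many replications keep the confidence-interval
        half-width within rel_precision of the mean (normal approximation)."""
        mean = statistics.mean(pilot)
        stdev = statistics.stdev(pilot)
        return math.ceil((z * stdev / (rel_precision * mean)) ** 2)

    pilot_runs = [0.52, 0.47, 0.55, 0.49, 0.51, 0.58, 0.46, 0.50]  # hypothetical pilot (s)
    print(replications_needed(pilot_runs))   # runs needed for roughly +/-5% at ~95% confidence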

Analyze and Interpret Data
- In analytical modeling
  - Carry out mathematical derivations that prove the expected system behavior
- In measurement
  - Statistically analyze the collected data
  - Summarize the results by computing statistical measures

Systematic Approach to Performance Evaluation
1. State Goals and Define the System
2. List Services and Outcomes
3. Select Appropriate Metrics
4. List the Parameters
5. Select Evaluation Techniques
6. Select Workload
7. Design Experiment(s)
8. Analyze and Interpret Data
9. Present the Results

Present the Results (1)
In analytical modeling:
- Clear statements of lemmas and theorems
- A description of the algorithm with a proof of its properties
- Present numerical computation results
  - to show how to use the formulae, and
  - to show the effect of varying the parameters
- Perform simulation/measurement to show the validity of the model and the analysis

Present the Results (2)
In simulation and measurement:
- A clear statement of the goals of the experiment
- A list of assumptions
- The experimental setup: platforms, tools, units, and the range of values for parameters
- Graphical presentation of the results (a plotting sketch follows)
  - The simpler the graphs are to understand, the better
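
A minimal plotting sketch is shown below; it assumes matplotlib is installed, and every number in it is a made-up placeholder rather than a measurement from this course.

    import matplotlib.pyplot as plt

    offered_load = [2, 4, 6, 8, 9]                  # hypothetical arrival rates (req/s)
    mean_response = [0.12, 0.17, 0.26, 0.52, 1.05]  # hypothetical mean response times (s)
    ci_half_width = [0.01, 0.02, 0.03, 0.06, 0.15]  # hypothetical 95% CI half-widths

    plt.errorbar(offered_load, mean_response, yerr=ci_half_width, marker="o", capsize=3)
    plt.xlabel("Offered load (requests/second)")
    plt.ylabel("Mean response time (seconds)")
    plt.title("Response time vs. load (illustrative data)")
    plt.savefig("response_vs_load.png", dpi=150)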

Present the Results (3)
In all three cases, after presenting the results:
- Discuss the implications for the users
- Discuss how a user can use the results
- Mention any additional applications that can benefit from your experiment
- Present conclusions: what did you learn (e.g., surprises, new directions)?
- Discuss limitations and future work

Review: Systematic Approach to Performance Evaluation
1. State Goals and Define the System
2. List Services and Outcomes
3. Select Appropriate Metrics
4. List the Parameters
5. Select Evaluation Techniques
6. Select Workload
7. Design Experiment(s)
8. Analyze and Interpret Data
9. Present the Results