CEN 4021 Software Engineering II, 21st Lecture: Monitoring (POMA). Instructor: Masoud Sadjadi

CEN 4021 21st Lecture
CEN 4021 Software Engineering II
Instructor: Masoud Sadjadi
Monitoring (POMA): Analysis and Evaluation of Information

Acknowledgements
• Dr. Onyeka Ezenwoye
• Dr. Peter Clarke

Agenda
• Monitoring (POMA)
  – Analysis and Evaluation of Information

Analysis and Evaluation of Information
• Any data collected must be reliable, accurate, and valid.
• Reliable data – data that are collected and recorded according to the defined rules of the measurement and metric.
• Accurate data – data that are collected and tabulated according to the defined level of precision of the measurement and metric.
• Valid data – data that are collected, tabulated, and applied according to the defined intention of the measurement.

Analysis and Evaluation (cont.)
• The level of data accuracy includes rounding and specifying the number of significant figures.
• Validity addresses the applicability of the data to assess the particular issue or to measure the particular attribute.

Analysis and Evaluation (cont.)
Consider the following computed measurement of average problem level:

  Average problem level = SUM(number of severity-k problems × severity k) / n

where:
• n = total number of problems found
• SUM = the summation function over k
• k = a discrete severity level (e.g., 1 through 4)
A computed value such as 2.7 can be rounded up to a severity level of 3.
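A minimal sketch of this computation in Python, using as illustrative counts the severity figures that appear on a later slide (the dictionary and variable names are assumptions, not from the lecture):

```python
# Average problem level: weighted mean of severity values.
problems_by_severity = {1: 23, 2: 46, 3: 79, 4: 95}   # severity level -> number of problems

n = sum(problems_by_severity.values())                 # total number of problems found
weighted_sum = sum(count * severity for severity, count in problems_by_severity.items())
average_level = weighted_sum / n

print(f"Average problem level: {average_level:.2f}")   # about 3.01 for these counts
print(f"Rounded severity level: {round(average_level)}")
```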

Analysis and Evaluation (cont.)
Is the average valid in this case?
• Care needs to be taken in considering the validity of the data used in the analysis of some attribute.
• Care needs to be taken when using derived information, that is, after applying some computation or transformation to the raw data.

Analysis and Evaluation (cont.)
Distribution of Data
• One of the simplest forms of analysis is to look at the distribution of the collected data.
• Data distribution – a description of a collection of data that shows the spread of the values and the frequency of occurrence of each value.

Analysis and Evaluation (cont.)
• Data distribution examples
  – Skew of the distribution (number of problems by severity level)
    Severity level 1: 23
    Severity level 2: 46
    Severity level 3: 79
    Severity level 4: 95
  – Range of data values (number of severity-level problems by functional area)
    Functional area 1: 2
    Functional area 2: 7
    Functional area 3: 3
    Functional area 4: 8
    Functional area 5: 0
• Do these examples indicate where the product is good or bad?
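A brief sketch of summarizing such a distribution, using the illustrative per-area counts from the slide (the variable names are assumptions):

```python
# Summarize the spread of problem counts across functional areas.
problems_by_area = {1: 2, 2: 7, 3: 3, 4: 8, 5: 0}   # functional area -> number of problems

values = list(problems_by_area.values())
print(f"Range of values: {min(values)} to {max(values)}")       # 0 to 8
print(f"Total problems:  {sum(values)}")                        # 20
worst_area = max(problems_by_area, key=problems_by_area.get)
print(f"Functional area with the most problems: {worst_area}")  # area 4
```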

Analysis and Evaluation (cont.)
• Data distribution examples
  – Data trends (number of problems found per week)
    Week 1: 20
    Week 2: 23
    Week 3: 45
    Week 4: 67
    Week 5: 35
    Week 6: 15
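As a quick numerical look at such a trend (the weekly counts are the slide's illustrative figures):

```python
# Week-over-week change in the number of problems found.
weekly_problems = [20, 23, 45, 67, 35, 15]   # weeks 1..6

for week, (prev, curr) in enumerate(zip(weekly_problems, weekly_problems[1:]), start=2):
    print(f"Week {week}: {curr:3d} problems ({curr - prev:+d} vs. previous week)")
```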

Analysis and Evaluation (cont.)
Centrality and Dispersion
• Provides a way to compare groups of data.
• Provides the manager with a way to characterize a set of related data, whether those data deal with product quality, project productivity, or some other attribute.
• Centrality analysis – an analysis of a data set to find the typical value of that data set.

Analysis and Evaluation (cont.)
Centrality and Dispersion
• Average value
  – Can be affected by one or more extreme data points
• Median value
  – Divides the data into upper and lower halves
  – What about data trends?
  – What about extreme data points?
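A small sketch of how one extreme data point pulls the average but not the median (the numbers are hypothetical):

```python
# Compare the mean and the median of a data set with one extreme value.
from statistics import mean, median

problems_per_module = [4, 5, 5, 6, 7, 6, 48]   # one module is an extreme outlier

print(f"Average: {mean(problems_per_module):.1f}")   # ~11.6, pulled up by the outlier
print(f"Median:  {median(problems_per_module)}")     # 6, largely unaffected
```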

Analysis and Evaluation (cont.)
Centrality and Dispersion
• Standard deviation
  – Sometimes it is good to know how the distribution of data is dispersed from the central value, whether that is the average or the median.
  – It is more difficult to utilize the central value if there is a large dispersion from it.

Analysis and Evaluation (cont.)
• Standard deviation
  – A very common dispersion measurement is the standard deviation (see the textbook for an example).
• Control charts
  – Control chart – a chart used to assess and control the variability of some process or product characteristic. It usually involves establishing upper and lower limits (the control limits) on data variation from the data set's average value. If an observed data value falls outside the control limits, it triggers an evaluation of the characteristic.
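A minimal sketch of computing a standard deviation and simple control limits; the three-sigma limits and the weekly counts are assumptions for illustration, not the textbook's example:

```python
# Standard deviation and simple control limits (mean +/- 3 standard deviations).
from statistics import mean, stdev

weekly_defects = [20, 23, 45, 67, 35, 15, 22, 30]   # hypothetical weekly counts

avg = mean(weekly_defects)
sd = stdev(weekly_defects)                           # sample standard deviation
upper_limit = avg + 3 * sd
lower_limit = avg - 3 * sd

print(f"Average: {avg:.1f}, standard deviation: {sd:.1f}")
print(f"Control limits: [{lower_limit:.1f}, {upper_limit:.1f}]")

# Flag any observation outside the control limits for follow-up evaluation.
for week, value in enumerate(weekly_defects, start=1):
    if not (lower_limit <= value <= upper_limit):
        print(f"Week {week}: {value} is outside the control limits")
```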

Analysis and Evaluation (cont.)
• Data smoothing: moving averages
  – To "smooth" out variations in the data and prevent alarms from being raised by a few spikes, the data from two or three weeks are combined. The resulting combined value is called the moving average.
  – Moving average – a technique for expressing data by computing the average of a fixed grouping (e.g., data for a fixed period) of data values; it is often used to suppress the effect of one extreme data point.
  – Data smoothing – a technique used to decrease the effects of individual, extreme variability in data values.
  – See Table 10.1 in the textbook.
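A short sketch of a three-week moving average (the window size and the weekly counts are illustrative assumptions, not Table 10.1):

```python
# Three-week moving average to smooth a weekly problem-count series.
weekly_problems = [20, 23, 45, 67, 35, 15]   # weeks 1..6
window = 3

moving_averages = [
    sum(weekly_problems[i:i + window]) / window
    for i in range(len(weekly_problems) - window + 1)
]
print(moving_averages)   # [29.33..., 45.0, 49.0, 39.0]
```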

Analysis and Evaluation (cont.)
• Data correlation
  – Correlating attributes is a very useful tool for software project managers, but it must be used carefully. Data correlation speaks only to the potential existence of a relationship between attributes; it does not necessarily imply cause and effect.
  – Data correlation – a technique that analyzes the degree of relationship between sets of data.
  – Linear regression – a technique that estimates the relationship between two sets of data by fitting a straight line to the two sets of data values.
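A minimal sketch of computing a correlation coefficient and fitting a least-squares line to two hypothetical data sets (module size and defect counts); the data values are assumptions, not course data, and the stdlib functions used require Python 3.10 or later:

```python
# Pearson correlation and simple least-squares linear regression.
from statistics import correlation, linear_regression  # Python 3.10+

module_size_kloc = [1.2, 2.5, 3.1, 4.0, 5.5, 6.3]   # hypothetical module sizes
defects_found    = [3, 6, 7, 9, 13, 14]             # hypothetical defect counts

r = correlation(module_size_kloc, defects_found)
fit = linear_regression(module_size_kloc, defects_found)

print(f"Correlation coefficient: {r:.2f}")          # close to 1.0 for these values
print(f"Fitted line: defects = {fit.slope:.2f} * KLOC + {fit.intercept:.2f}")
# A strong correlation suggests a relationship, but it does not prove cause and effect.
```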

Analysis and Evaluation (cont.)
• Normalization of data
  – Normalizing data – a technique used to bring data characterizations to some common or standard level so that comparisons become more meaningful.
  – The size or complexity of a functional area should be taken into account rather than just collecting the raw number of problems in the area; some measurement of the area's size or complexity must be used to normalize the data.
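A brief sketch of normalizing raw defect counts by functional-area size so areas of different sizes can be compared; the sizes and counts are hypothetical:

```python
# Normalize raw defect counts by functional-area size (defects per KLOC).
raw_defects = {"Area 1": 2, "Area 2": 7, "Area 3": 3, "Area 4": 8, "Area 5": 0}
area_size_kloc = {"Area 1": 2.0, "Area 2": 12.0, "Area 3": 2.0, "Area 4": 4.0, "Area 5": 0.5}

for area, defects in raw_defects.items():
    density = defects / area_size_kloc[area]
    print(f"{area}: {defects} defects, {density:.2f} defects/KLOC")
# Area 2 has the most raw defects, but after normalization Area 4 looks worst (2.00 vs. 0.58).
```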