CSE 300: Software Reliability Engineering Topics covered: Software metrics and software reliability

Introduction

 What can be measured?
 Predicted quality attributes:

Static complexity metrics
 Measurements on:
 Obtained earlier in the life cycle

Halstead’s software science metrics
 Primitive metrics:
 Composite, non-primitive metrics:
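
The slide lists the primitive and composite metrics without formulas, so here is a minimal sketch (illustrative values, not from the slides) showing how the standard Halstead composite metrics follow from the four primitive counts: distinct operators n1, distinct operands n2, total operator occurrences N1, and total operand occurrences N2.

```python
import math

def halstead_metrics(n1, n2, N1, N2):
    """Composite Halstead metrics from the four primitive counts."""
    vocabulary = n1 + n2                        # n = n1 + n2
    length = N1 + N2                            # N = N1 + N2
    volume = length * math.log2(vocabulary)     # V = N * log2(n)
    difficulty = (n1 / 2.0) * (N2 / n2)         # D = (n1 / 2) * (N2 / n2)
    effort = difficulty * volume                # E = D * V
    return {"vocabulary": vocabulary, "length": length,
            "volume": volume, "difficulty": difficulty, "effort": effort}

# Hypothetical module: 10 distinct operators, 8 distinct operands,
# 40 operator occurrences, 30 operand occurrences.
print(halstead_metrics(10, 8, 40, 30))
```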

Halstead’s software science metrics (contd.)
 Discussion:

McCabe’s cyclomatic complexity metric
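
As a brief illustration (the function and the example graph below are hypothetical, not from the slides), cyclomatic complexity can be computed from a routine's control-flow graph as V(G) = E - N + 2P:

```python
# McCabe's cyclomatic complexity: V(G) = E - N + 2P, where E = edges,
# N = nodes, and P = connected components of the control-flow graph
# (P = 1 for a single routine).
def cyclomatic_complexity(num_edges: int, num_nodes: int, num_components: int = 1) -> int:
    return num_edges - num_nodes + 2 * num_components

# Example: a routine with two sequential if-statements has a control-flow
# graph with 8 edges and 7 nodes, so V(G) = 8 - 7 + 2 = 3
# (equivalently, number of decision points + 1).
print(cyclomatic_complexity(8, 7))   # -> 3
```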

 Application

Principal Components Analysis
 Transforms a number of possibly correlated variables into a smaller number of uncorrelated variables called principal components.
 The first principal component accounts for as much of the variability in the data as possible.
 Each subsequent component accounts for as much of the remaining variability as possible.
 Principal components represent transformed scores on dimensions that are orthogonal.
 A decomposition technique to detect and analyze relationships among variables.
 Identifies distinct sources of variation underlying the set of variables.

Principal Components Analysis
 Application:
 Many metrics exist to measure the same artifact.
 Metrics are interrelated.
 Reduce the large set of correlated metrics to a small set of uncorrelated variables, which capture the same information.
 Investigate the structure of the underlying common factors or components that make up the raw metrics.
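
As a minimal usage sketch of this application (assuming scikit-learn is available; the metrics matrix below is made-up), a library PCA reduces the correlated raw metrics for each module to a few uncorrelated component scores:

```python
import numpy as np
from sklearn.decomposition import PCA

# Rows = modules, columns = raw metrics
# (hypothetical columns: LOC, cyclomatic complexity, Halstead volume).
X = np.array([[120,  4,  830.0],
              [450, 12, 3100.0],
              [ 80,  2,  510.0],
              [300,  9, 2200.0]])

pca = PCA(n_components=2)                 # keep the 2 highest-variance components
scores = pca.fit_transform(X)             # uncorrelated component scores per module
print(pca.explained_variance_ratio_)      # fraction of variance each component captures
```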

Steps in Principal Components Analysis
 Data: software metrics data
 Step I: Organize the data in the form of an n x m matrix, where n is the number of modules and m is the number of metrics.
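
A small sketch of Step I with hypothetical numbers (the metric choices are placeholders): each of the n modules becomes a row and each of the m metrics a column.

```python
import numpy as np

# Step I: n x m data matrix - one row per module, one column per metric.
# Hypothetical columns: [lines of code, cyclomatic complexity, Halstead volume]
X = np.array([[120,  4,  830.0],
              [450, 12, 3100.0],
              [ 80,  2,  510.0],
              [300,  9, 2200.0]])

n, m = X.shape   # n = 4 modules, m = 3 metrics
```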

Steps in Principal Components Analysis
 Step II: Subtract the mean of each metric from its observations, so that every metric (column) has zero mean.
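
Continuing the sketch (same hypothetical matrix X), Step II centers each metric at zero:

```python
import numpy as np

X = np.array([[120,  4,  830.0], [450, 12, 3100.0],
              [ 80,  2,  510.0], [300,  9, 2200.0]])

# Step II: subtract each metric's mean from its column of observations.
means = X.mean(axis=0)        # one mean per metric (length m)
X_centered = X - means        # every column now has mean zero
```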

Steps in Principal Components Analysis
 Step III: Compute the covariance matrix of the mean-centered data. The covariance matrix will be m x m.
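
Continuing the sketch, Step III forms the m x m sample covariance matrix of the centered data:

```python
import numpy as np

X = np.array([[120,  4,  830.0], [450, 12, 3100.0],
              [ 80,  2,  510.0], [300,  9, 2200.0]])
X_centered = X - X.mean(axis=0)
n = X.shape[0]

# Step III: sample covariance matrix, C = (Xc^T Xc) / (n - 1); C is m x m.
C = (X_centered.T @ X_centered) / (n - 1)
# Equivalently: C = np.cov(X, rowvar=False)
```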

Steps in Principal Components Analysis
 Step IV: Compute the eigenvectors and eigenvalues of the covariance matrix.
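
Step IV in the same sketch; because the covariance matrix is symmetric, numpy's eigh can be used (it returns eigenvalues in ascending order):

```python
import numpy as np

X = np.array([[120,  4,  830.0], [450, 12, 3100.0],
              [ 80,  2,  510.0], [300,  9, 2200.0]])
C = np.cov(X, rowvar=False)          # m x m covariance matrix from Step III

# Step IV: eigenvalues and eigenvectors of the covariance matrix.
eigenvalues, eigenvectors = np.linalg.eigh(C)   # eigh: symmetric matrices
# eigenvectors[:, i] is the unit eigenvector for eigenvalues[i] (ascending order).
```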

Steps in Principal Components Analysis
 Step V: Choosing components and forming a feature vector: order the eigenvectors by decreasing eigenvalue and keep the leading ones as columns of the feature vector.
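
Step V in the sketch: sort the components by eigenvalue and keep the top k as columns of the feature vector (keeping k = 2 here is an arbitrary illustrative choice):

```python
import numpy as np

X = np.array([[120,  4,  830.0], [450, 12, 3100.0],
              [ 80,  2,  510.0], [300,  9, 2200.0]])
eigenvalues, eigenvectors = np.linalg.eigh(np.cov(X, rowvar=False))

# Step V: sort eigenvectors by decreasing eigenvalue and keep the top k.
order = np.argsort(eigenvalues)[::-1]          # indices, largest eigenvalue first
k = 2                                          # number of components to keep
feature_vector = eigenvectors[:, order[:k]]    # m x k matrix of chosen eigenvectors
```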

Steps in Principal Components Analysis
 Step VI: Deriving the new data set: project the mean-centered data onto the chosen components.
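
Step VI in the sketch: the projection yields an n x k matrix of principal-component scores, one row per module:

```python
import numpy as np

X = np.array([[120,  4,  830.0], [450, 12, 3100.0],
              [ 80,  2,  510.0], [300,  9, 2200.0]])
X_centered = X - X.mean(axis=0)
eigenvalues, eigenvectors = np.linalg.eigh(np.cov(X, rowvar=False))
feature_vector = eigenvectors[:, np.argsort(eigenvalues)[::-1][:2]]

# Step VI: new data set = centered data projected onto the principal components.
Y = X_centered @ feature_vector     # n x k matrix of component scores
```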

Steps in Principal Components Analysis
 Step VII: Getting the old data back: map the projected data back through the feature vector and add the means back in; with fewer components than metrics this recovers an approximation of the original data.
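
Step VII in the sketch: the reconstruction is exact when all m components are kept, and an approximation when k < m:

```python
import numpy as np

X = np.array([[120,  4,  830.0], [450, 12, 3100.0],
              [ 80,  2,  510.0], [300,  9, 2200.0]])
means = X.mean(axis=0)
X_centered = X - means
eigenvalues, eigenvectors = np.linalg.eigh(np.cov(X, rowvar=False))
feature_vector = eigenvectors[:, np.argsort(eigenvalues)[::-1][:2]]
Y = X_centered @ feature_vector

# Step VII: reconstruct the (approximate) original data from the scores.
X_reconstructed = Y @ feature_vector.T + means   # exact if all m components kept
```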