Exploratory Factor Analysis: Alpha and Omega. Dominique Zephyr, Applied Statistics Lab, University of Kentucky.

Factor Analysis Factor analysis is a statistical method that allows us to understand a large number of observable variables in terms of a smaller number of unobservable variables. In factor analysis, latent variables represent unobserved constructs, referred to as factors or dimensions.

Two Types of Factor Analysis Exploratory Factor Analysis (EFA): the analysis is exploratory when you do not have a pre-defined idea of the structure, or of how many dimensions underlie a set of variables. Confirmatory Factor Analysis (CFA): the analysis is confirmatory when you want to test specific hypotheses about the structure or the number of dimensions underlying a set of variables. CFA is used to study how well a hypothesized factor model fits a new sample from the same population, or a sample from a different population.

The EFA Model Factor analysis (FA) aims to describe a set of Q variables x_1, x_2, ..., x_Q in terms of a smaller number of m common factors plus residuals e_i (unique factors).

The EFA model is x_i = α_i1 F_1 + α_i2 F_2 + ... + α_im F_m + e_i (i = 1, ..., Q), where the x_i are the original variables, standardized to zero mean and unit variance; F_1, F_2, ..., F_m are m uncorrelated common factors, each with zero mean and unit variance; α_i1, α_i2, ..., α_im are the factor loadings for variable x_i; and the e_i are the Q unique factors, assumed independently and identically distributed with zero mean.

EFA Extraction Methods Methods of extraction available in Stata:
pf: principal factor (the default)
pcf: principal-component factor
ipf: iterated principal factor
ml: maximum-likelihood factor
We will use ml so that the results are comparable with CFA.

EFA results
. factor GSE1-GSE6, ml
[Output: an eigenvalue table for Factor1–Factor3 (eigenvalue, difference, proportion, cumulative), followed by the factor loadings and uniquenesses for items GSE1–GSE6; the numeric values did not survive transcription.]

How many factors to retain?
Kaiser criterion. Drop all factors with eigenvalues below 1.0. The simplest justification is that it makes no sense to retain a factor that explains less variance than a single indicator contains.
Scree plot. This method, proposed by Cattell, plots the successive eigenvalues, which drop sharply and then level off. It suggests retaining the eigenvalues in the sharp descent, before the point where they start to level off.
Variance explained. Some researchers simply keep enough factors to account for 90% (sometimes 80%) of the variation.

Kaiser Criterion
. factor GSE1-GSE6, ml
. screeplot
[Output: the eigenvalue table for Factor1–Factor3 (eigenvalue, difference, proportion, cumulative) and the scree plot; the numeric values did not survive transcription.]
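The Kaiser criterion is easy to sketch in Python. The snippet below is an illustrative stand-in for the Stata output above: the simulated two-factor data are hypothetical, not the GSE items.

```python
import numpy as np

def kaiser_retain(X):
    """Apply the Kaiser criterion to a cases-by-items data matrix.

    Returns the eigenvalues of the item correlation matrix (descending)
    and the number of factors with eigenvalue > 1.0.
    """
    R = np.corrcoef(X, rowvar=False)                 # item correlation matrix
    eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]   # descending eigenvalues
    return eigvals, int(np.sum(eigvals > 1.0))

# Toy data: six items driven by two independent factors (three items each)
rng = np.random.default_rng(0)
f1, f2 = rng.normal(size=(2, 700))
noise = rng.normal(size=(700, 6))
X = np.column_stack([f1, f1, f1, f2, f2, f2]) * 0.8 + noise * 0.6

eigvals, n_retain = kaiser_retain(X)
print(eigvals)    # the eigenvalues of a correlation matrix sum to the number of items
print(n_retain)   # factors with eigenvalue > 1
```

Plotting `eigvals` against their index gives the scree plot that `screeplot` produces in Stata.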

Rotation The goal of rotation strategies is to obtain a clear pattern of loadings. The sum of the eigenvalues is not affected by rotation, but rotating the axes alters the eigenvalues of particular factors and changes the factor loadings. The most common rotation method is varimax.
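Varimax itself can be sketched in a few lines. The following is a minimal NumPy version of the standard SVD-based varimax algorithm (not Stata's internal routine), applied to a small hypothetical loading matrix; note that an orthogonal rotation leaves each variable's communality unchanged.

```python
import numpy as np

def varimax(loadings, tol=1e-8, max_iter=100):
    """Orthogonal varimax rotation of a p-by-k factor-loading matrix.

    Iteratively builds an orthogonal rotation R that maximizes the
    variance of the squared loadings within each factor column.
    """
    L = np.asarray(loadings, dtype=float)
    p, k = L.shape
    R = np.eye(k)
    d = 0.0
    for _ in range(max_iter):
        LR = L @ R
        # Gradient of the varimax criterion at the current rotation
        tmp = LR ** 3 - (1.0 / p) * LR @ np.diag((LR ** 2).sum(axis=0))
        u, s, vt = np.linalg.svd(L.T @ tmp)
        R = u @ vt                      # nearest orthogonal matrix
        d_new = s.sum()
        if d_new < d * (1 + tol):       # criterion stopped improving
            break
        d = d_new
    return L @ R

# A loading matrix with blurred structure is rotated toward simple structure
L = np.array([[0.7, 0.3], [0.8, 0.2], [0.3, 0.7], [0.2, 0.8]])
L_rot = varimax(L)
```

After rotation, each row's sum of squared loadings (the communality) is identical to before, while large loadings grow and cross-loadings shrink.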

Oblimin Rotation Results
. factor GSE1-GSE6, factor(1) ml
. rotate, oblimin
Factor analysis/correlation — Number of obs = 700; Method: maximum likelihood; Retained factors = 1; Rotation: orthogonal oblimin (Kaiser off); Number of params = 6.
[Output: Schwarz's BIC, the log likelihood, Akaike's AIC, and the rotated factor loadings (pattern matrix) and unique variances for GSE1–GSE6; the numeric values did not survive transcription.]

Cronbach's Coefficient Alpha Cronbach's coefficient alpha is the most common estimate of the internal consistency of the items in a scale or survey. How large must α be? Nunnally (1978) suggests 0.70 as an acceptable reliability threshold. Some authors use 0.75 or 0.80 as a cut-off value, while others are as lenient as 0.60.

Alpha results
. alpha GSE1-GSE6, item std detail
[Output: for each item, the number of observations, sign, item-test correlation, item-rest correlation, average interitem correlation, and the scale alpha when the item is dropped, with a final row for the full test scale; the numeric values did not survive transcription.]

Bootstrapped CI for Alpha
. bootstrap r(alpha), reps(1000): alpha GSE1-GSE6, item std detail
. estat bootstrap, all
[Output: the observed coefficient, bias, bootstrap standard error, and 95% confidence intervals; the numeric values did not survive transcription.]
(N) normal confidence interval
(P) percentile confidence interval
(BC) bias-corrected confidence interval
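The same resampling idea can be sketched in Python. The interval below is the percentile interval, corresponding to Stata's (P) line; the simulated one-factor data are a hypothetical stand-in for GSE1–GSE6.

```python
import numpy as np

def cronbach_alpha(X):
    """Raw Cronbach's alpha for a complete cases-by-items matrix."""
    k = X.shape[1]
    return k / (k - 1) * (1 - X.var(axis=0, ddof=1).sum()
                          / X.sum(axis=1).var(ddof=1))

def bootstrap_alpha_ci(X, reps=1000, level=0.95, seed=0):
    """Percentile bootstrap CI for alpha: resample cases (rows) with replacement."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    stats = np.empty(reps)
    for b in range(reps):
        idx = rng.integers(0, n, size=n)   # one bootstrap resample of cases
        stats[b] = cronbach_alpha(X[idx])
    lo, hi = np.percentile(stats, [100 * (1 - level) / 2,
                                   100 * (1 + level) / 2])
    return lo, hi

# Simulated 6-item scale: one common factor plus noise, 700 cases
rng = np.random.default_rng(1)
f = rng.normal(size=(700, 1))
X = 0.7 * f + 0.7 * rng.normal(size=(700, 6))
lo, hi = bootstrap_alpha_ci(X)
print(lo, hi)
```

Stata's (N) and (BC) intervals add a normal approximation and a bias correction on top of the same resampled statistics.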

Omega after EFA
. matrix list c    // c = sum(loadings)^2
. matrix list e    // e = sum(error variances)
. scalar omega = c[1,1]/(c[1,1]+e[1,1])
. scalar list omega
[The computed value of omega did not survive transcription.]
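The omega computation reduces to one formula: omega = (Σλ)² / ((Σλ)² + Σψ), where the λ are the factor loadings and the ψ are the uniquenesses (error variances). A minimal sketch with hypothetical loadings:

```python
def mcdonald_omega(loadings, uniquenesses):
    """McDonald's omega for a one-factor solution:
    omega = (sum of loadings)^2 / ((sum of loadings)^2 + sum of uniquenesses)."""
    c = sum(loadings) ** 2      # squared sum of loadings (common variance)
    e = sum(uniquenesses)       # total unique (error) variance
    return c / (c + e)

# Hypothetical one-factor loadings for six standardized items;
# for standardized items the uniqueness is 1 - loading^2
loads = [0.7, 0.6, 0.8, 0.5, 0.7, 0.6]
uniq = [1 - l ** 2 for l in loads]
omega = mcdonald_omega(loads, uniq)
print(round(omega, 3))   # 0.817
```

This mirrors the Stata matrix computation above: `c` plays the role of `c[1,1]` and `e` of `e[1,1]`.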

Bootstrapped CI for Omega
capture program drop omegaefa
program define omegaefa, rclass
    version 12.0
    set more off
    preserve
    syntax [varlist] [if] [in]
    factor `varlist', factor(1) ml
    rotate, oblimin
    matrix c = e(L)
    matrix e = e(Psi)'
    * sum(loadings)^2
    matrix one = J(rowsof(c),1,1)
    matrix c = one'*c*c'*one
    * sum(error variances)
    matrix e = J(1,rowsof(e),1)*e
    matrix list c
    matrix list e
    return scalar omega = c[1,1]/(c[1,1]+e[1,1])
    restore
end

bootstrap omega=r(omega), reps(1000): omegaefa GSE1-GSE6
estat bootstrap, all

Bootstrapped Results
Bootstrap results — Number of obs = 700; Replications = 1000
command: omegaefa GSE1-GSE6
omega: r(omega)
[Output: the observed coefficient, bias, bootstrap standard error, and 95% confidence intervals for omega; the numeric values did not survive transcription.]
(N) normal confidence interval
(P) percentile confidence interval
(BC) bias-corrected confidence interval