
Why the Items versus Parcels Controversy Needn't Be One
Todd D. Little, University of Kansas
Director, Quantitative Training Program; Director, Center for Research Methods and Data Analysis (crmda.KU.edu); Director, Undergraduate Social and Behavioral Sciences Methodology Minor; Member, Developmental Psychology Training Program
Workshop presented at the University of Turku, based on my Presidential Address presented at the American Psychological Association Meeting in Washington, DC.


Overview
- Learn what parcels are and how to make them
- Learn the reasons for, and the conditions under which, parcels are beneficial
- Learn the conditions under which parcels can be problematic
Disclaimer: This talk reflects my view that parcels per se aren't controversial if done thoughtfully.

Key Sources and Acknowledgements
Special thanks to: Mijke Rhemtulla, Kimberly Gibson, Alex Schoemann, Wil Cunningham, Golan Shahar, John Graham & Keith Widaman.
- Little, T. D., Rhemtulla, M., Gibson, K., & Schoemann, A. M. (in press). Why the items versus parcels controversy needn't be one. Psychological Methods.
- Little, T. D., Cunningham, W. A., Shahar, G., & Widaman, K. F. (2002). To parcel or not to parcel: Exploring the question, weighing the merits. Structural Equation Modeling, 9.
- Little, T. D., Lindenberger, U., & Nesselroade, J. R. (1999). On selecting indicators for multivariate measurement and modeling with latent variables: When "good" indicators are bad and "bad" indicators are good. Psychological Methods, 4.

What is Parceling?
Parceling: averaging (or summing) two or more items to create more reliable indicators of a construct. Roughly, "packaging" items, tying them together. A data pre-processing strategy.
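Mechanically, a parcel is just a column average. As a minimal sketch (the `make_parcels` helper and the 6-item data are hypothetical illustrations, not from the workshop):

```python
import numpy as np

def make_parcels(items, assignment):
    """Average item columns into parcels.

    items: (n_subjects, n_items) array of responses
    assignment: list of lists; each inner list holds the column
        indices of the items averaged into one parcel
    """
    return np.column_stack([items[:, idx].mean(axis=1) for idx in assignment])

# Hypothetical 1-4 Likert responses for six items
rng = np.random.default_rng(1)
items = rng.integers(1, 5, size=(100, 6))

# Two items per parcel, three parcels per construct (as in the example below)
parcels = make_parcels(items, [[0, 1], [2, 3], [4, 5]])
print(parcels.shape)  # (100, 3)
```

The `assignment` argument makes the allocation of items to parcels explicit, which matters later when allocation variability is discussed.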

A CFA of Items
[Path diagram: twelve items load on two correlated factors. Positive: Great, Cheerful, Happy, Good, Glad, Super; Negative: Sad, Down, Unhappy, Blue, Bad, Terrible.]
Model fit: χ²(53, n = 759) = 181.2; RMSEA = .056; NNFI/TLI = .97.

CFA: Using Parcels
Average two items to create three parcels per construct. Positive parcels: Great & Glad, Happy & Super, Cheerful & Good. Negative parcels: Unhappy & Bad, Down & Blue, Terrible & Sad.

CFA: Using Parcels
Model fit: χ²(8, n = 759) = 26.8; RMSEA = .056; NNFI/TLI = .99; CFI = .99.
- Similar solution, similar factor correlation
- Higher loadings: more reliable information
- Good model fit, with improved χ²

Philosophical Issues
To parcel, or not to parcel…?

Pragmatic View (from Little et al., 2002)
"Given that measurement is a strict, rule-bound system that is defined, followed, and reported by the investigator, the level of aggregation used to represent the measurement process is a matter of choice and justification on the part of the investigator."
Preferred terms: remove unwanted, clean, reduce, minimize, strengthen, etc.

Empiricist / Conservative View (from Little et al., 2002)
"Parceling is akin to cheating because modeled data should be as close to the response of the individual as possible in order to avoid the potential imposition, or arbitrary manufacturing, of a false structure."
Preferred terms: mask, conceal, camouflage, hide, disguise, cover up, etc.

Pseudo-Hobbesian View
Parcels should be avoided because researchers are ignorant (perhaps stupid) and prone to mistakes. And, because the unthoughtful or unaware application of parcels by unwitting researchers can lead to bias, they should be avoided.
Preferred terms: most (all) researchers are un___, as in unaware, unable, unwitting, uninformed, unscrupulous, etc.

Other Issues I
- Classical school vs. modeling school: objectivity versus transparency; items vs. indicators; factors vs. constructs
- Self-correcting nature of science
- Suboptimal simulations: don't include population misfit; emphasize 'straw conditions' and prove the obvious; sometimes overgeneralize

Other Issues II
Focus of inquiry:
- Question about the items / scale development? Avoid parcels.
- Question about the constructs? Parcels are warranted, but must be done thoughtfully!
- Question about factorial invariance? Parcels are OK if done thoughtfully.

Measurement
Measurement starts with operationalization: defining a concept with specific observable characteristics [hitting and kicking ~ an operational definition of overt aggression]. It is the process of linking constructs to their manifest indicants (objects/events that can be seen, touched, or otherwise recorded; cf. items vs. indicators), and the rule-bound assignment of numbers to the indicants of that which exists [e.g., Never = 1, Seldom = 2, Often = 3, Always = 4]. Although convention often 'rules', the rules should be chosen and defined by the investigator.
"Whatever exists at all exists in some amount. To know it thoroughly involves knowing its quantity as well as its quality" - E. L. Thorndike (1918)

"Indicators are our worldly window into the latent space" - John R. Nesselroade

Classical Variants
(a) Xᵢ = Tᵢ + Sᵢ + eᵢ
(b) X = T + S + e
(c) X₁ = T₁ + S₁ + e₁
Xᵢ: a person's observed score on an item; Tᵢ: 'true' score (i.e., what we hope to measure); Sᵢ: item-specific, yet reliable, component; eᵢ: random error or noise.
Assume: Sᵢ and eᵢ are normally distributed (with mean of zero) and uncorrelated with each other. Across all items in a domain, the Sᵢ are uncorrelated with each other, as are the eᵢ.
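Because T, S, and e are assumed uncorrelated, the observed-score variance decomposes additively. A minimal simulation sketch (the component variances 1.0, 0.25, 0.25 are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
T = rng.normal(0, 1.0, n)    # 'true' score
S = rng.normal(0, 0.5, n)    # item-specific, yet reliable, component
e = rng.normal(0, 0.5, n)    # random error / noise
X = T + S + e                # observed item score

# Uncorrelated components: Var(X) ≈ Var(T) + Var(S) + Var(e) = 1 + .25 + .25
print(round(X.var(), 2))
```

Only T is variance we want; S and e are the unwanted parts that parceling will later shrink.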

Latent Variable Variants
(a) X₁ = T + S₁ + e₁
(b) X₂ = T + S₂ + e₂
(c) X₃ = T + S₃ + e₃
X₁–X₃: multiple indicators of the same construct; T: common 'true' score across indicators; S₁–S₃: item-specific, yet reliable, components; e₁–e₃: random error or noise.
Assume: the S and e terms are normally distributed (with mean of zero) and uncorrelated with each other. Across all items in a domain, the S terms are uncorrelated with each other, as are the e terms.

Empirical Pros
Psychometric characteristics of parcels (vs. items):
- Higher reliability, communality, and ratio of common-to-unique factor variance
- Lower likelihood of distributional violations
- More, smaller, and more-equal intervals

        Never  Seldom  Often  Always
Happy     1      2       3      4
Glad      1      2       3      4

Averaging (or summing) Happy and Glad yields a parcel score with more, finer-grained response intervals than either 4-point item alone.
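The interval claim for the Happy/Glad pair can be checked by enumeration: the mean of two 4-point items takes seven distinct values, not four.

```python
from itertools import product

# Every possible (Happy, Glad) response pair on the 1-4 scale
pairs = product(range(1, 5), repeat=2)
parcel_scores = sorted({(happy + glad) / 2 for happy, glad in pairs})
print(parcel_scores)  # [1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0]
```

Seven levels with step size 0.5, versus four levels with step size 1 for a single item.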

More Empirical Pros
Model estimation and fit with parcels (vs. items):
- Fewer parameter estimates; lower indicator-to-subject ratio
- Reduces sources of parsimony error (population misfit of a model)
- Lower likelihood of correlated residuals and dual factor loadings
- Reduces sources of sampling error
- Makes large models tractable/estimable
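"Fewer parameter estimates" is visible in the degrees of freedom quoted earlier: χ²(53) for the 12-item model versus χ²(8) for the 6-parcel model. A sketch of the df arithmetic, assuming fixed-factor scaling so the free parameters are the loadings, the residual variances, and the factor correlation:

```python
def cfa_df(n_indicators, n_free_params):
    """Model df = unique observed moments minus free parameters."""
    moments = n_indicators * (n_indicators + 1) // 2
    return moments - n_free_params

# Item-level model: 12 loadings + 12 residuals + 1 factor correlation = 25
print(cfa_df(12, 25))  # 53, matching the item-level chi-square df
# Parcel model: 6 loadings + 6 residuals + 1 factor correlation = 13
print(cfa_df(6, 13))   # 8, matching the parcel-model chi-square df
```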

25 Sources of Variance in Items crmda.KU.edu

Simple Parcel
[Diagram: averaging three items, X = T + (S₁ + S₂ + S₃)/3 + (e₁ + e₂ + e₃)/3.] Each specific (S) and error (e) component enters the parcel at 1/9 of its original size!
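The 1/9 figure follows from squaring the 1/3 averaging weight: each unique term contributes (1/3)² = 1/9 of its variance, so the three error terms together leave σ²/3 in the parcel. A simulation sketch (unit error variances are an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 300_000
e1, e2, e3 = (rng.normal(0, 1.0, n) for _ in range(3))  # three unit-variance errors

# Averaging weights each term by 1/3; each contributes (1/3)**2 = 1/9
# of its variance, so the pooled error variance is 3 * (1/9) = 1/3
parcel_error = (e1 + e2 + e3) / 3
print(round(parcel_error.var(), 3))
```

The common component T, by contrast, survives averaging at full strength, which is why the parcel's ratio of common-to-unique variance rises.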

Correlated Residual
[Diagram: when two of the parceled items share a specific component (Sy), that shared source of variance does not vanish; it is carried into the parcel, though at reduced size.]

Cross-loading: Correlated Factors
[Diagram: when one of the parceled items also taps a second common source (C) associated with a correlated factor (T1 with T2), the parcel carries that cross-loading variance at reduced size.]

Cross-loading: Uncorrelated Factors
[Diagram: the same situation when the two factors are uncorrelated.]

30 Construct = Common Variance of Indicators crmda.KU.edu

31 Construct = Common Variance of Indicators crmda.KU.edu

Empirical Cautions
[Diagram: with only two items per parcel, each specific (S) and error (E) component shrinks only to ¼ of its original size, since the averaging weight is 1/2 and (1/2)² = 1/4.]

33 Construct = Common Variance of Indicators crmda.KU.edu

34 Construct = Common Variance of Indicators crmda.KU.edu

Three is Ideal
[Diagram: the two-construct model with three parcels each. Positive: Great & Glad, Happy & Super, Cheerful & Good; Negative: Unhappy & Bad, Down & Blue, Terrible & Sad.]

Three is Ideal
Matrix algebra formula: Σ = ΛΨΛ′ + Θ
With three indicators per construct, each construct is just-identified. Cross-construct item associations are estimated only via ψ₂₁, the latent constructs' correlation, and degrees of freedom arise only from between-construct relations.
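A numerical sketch of Σ = ΛΨΛ′ + Θ, with hypothetical values (loadings of .8, a factor correlation of -.5, uniquenesses of .36): every cross-construct element of the implied covariance matrix is a product running through ψ₂₁ alone.

```python
import numpy as np

# Hypothetical two-construct model, three parcels per construct
Lam = np.array([[.8, 0], [.8, 0], [.8, 0],
                [0, .8], [0, .8], [0, .8]])   # loadings (Lambda)
Psi = np.array([[1.0, -.5],
                [-.5, 1.0]])                   # factor covariance matrix (Psi)
Theta = np.diag([.36] * 6)                     # unique variances (Theta)

Sigma = Lam @ Psi @ Lam.T + Theta              # model-implied covariance matrix

# A cross-construct covariance: loading * psi_21 * loading = .8 * -.5 * .8
print(round(Sigma[0, 3], 2))  # -0.32
```

Changing ψ₂₁ moves every cross-construct covariance at once, which is exactly why the between-construct relations supply the model's degrees of freedom.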

Empirical Cons
- Multidimensionality: constructs and relationships can be hard to interpret if parceling is done improperly
- Model misspecification: fit can improve regardless of whether the model is correctly specified
- Increased Type II error rate if the question is about the items
- Parcel-allocation variability: solutions depend on the particular allocation of items to parcels (Sterba & MacCallum, 2010; Sterba, in press); applicable when the conditions for sampling error are high
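Parcel-allocation variability can be illustrated by re-running the same analysis under many random item-to-parcel allocations. This sketch (all numbers hypothetical) tracks the correlation between two three-item parcels as a stand-in for a full model result:

```python
import numpy as np

rng = np.random.default_rng(0)
n_subj, n_items = 120, 9
true_score = rng.normal(size=(n_subj, 1))
# Nine congeneric items: a common true score plus item-level noise
items = 0.7 * true_score + rng.normal(scale=0.7, size=(n_subj, n_items))

results = []
for _ in range(200):
    order = rng.permutation(n_items)           # one random allocation
    p1 = items[:, order[:3]].mean(axis=1)
    p2 = items[:, order[3:6]].mean(axis=1)
    results.append(np.corrcoef(p1, p2)[0, 1])  # result under this allocation

# The spread across allocations is the parcel-allocation variability
print(round(min(results), 2), round(max(results), 2))
```

With a modest sample and noisy items the spread is non-trivial; as sample size grows and items become more homogeneous, it shrinks.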

Psychometric Issues
- Principles of aggregation (e.g., Rushton et al.): any one item is less representative than the average of many items (selection rationale); aggregating items yields greater precision
- Law of large numbers: more is better, yielding more precise estimates of parameters (and of a person's true score); normalizing tendency
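A small simulation of the aggregation principle (sample size and noise level are arbitrary choices): the average of nine items tracks the true score more closely than any single item does.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5_000
true_score = rng.normal(size=n)

def aggregate(k):
    """Average of k items, each = true score + unit-variance noise."""
    noise = rng.normal(size=(n, k))
    return (true_score[:, None] + noise).mean(axis=1)

r1 = np.corrcoef(true_score, aggregate(1))[0, 1]  # single item
r9 = np.corrcoef(true_score, aggregate(9))[0, 1]  # nine-item aggregate
print(round(r1, 2), round(r9, 2))
```

In theory the correlation rises from √(1/2) ≈ .71 for one item to √(9/10) ≈ .95 for nine, which the simulation reproduces.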

39 Construct Space with Centroid crmda.KU.edu

40 Potential Indicators of the Construct crmda.KU.edu

41 Selecting Six (Three Pairs) crmda.KU.edu

42 … take the mean crmda.KU.edu

43 … and find the centroid crmda.KU.edu

44 Selecting Six (Three Pairs) crmda.KU.edu

45 … take the mean crmda.KU.edu

46 … find the centroid crmda.KU.edu

47 How about 3 sets of 3? crmda.KU.edu

48 … taking the means crmda.KU.edu

49 … yields more reliable & accurate indicators crmda.KU.edu

Building Parcels
- Theory: know thy S, and the nature of your items
- Random assignment of items to parcels (e.g., fMRI): use Sterba's calculator to gauge allocation variability when sampling error is high
- Balancing technique: combine items with higher loadings with items having smaller loadings [reverse-serpentine pattern]
- Using a priori designs (e.g., CAMI): develop new tests or measures with parcels as the goal for use in research
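A sketch of the reverse-serpentine balancing technique (the function name and example loadings are mine, not from the workshop): deal items out to parcels in descending order of loading, reversing direction at each end so the parcels' total loadings stay balanced.

```python
import numpy as np

def serpentine_parcels(loadings, n_parcels):
    """Assign item indices to parcels in a reverse-serpentine pattern."""
    order = np.argsort(loadings)[::-1]      # item indices, highest loading first
    parcels = [[] for _ in range(n_parcels)]
    pos, step = 0, 1
    for item in order:
        parcels[pos].append(int(item))
        pos += step
        if pos in (-1, n_parcels):          # bounce off either end of the row
            step = -step
            pos += step
    return parcels

# Hypothetical loadings for nine items (item 0 strongest, item 8 weakest)
loadings = [.90, .85, .80, .75, .70, .65, .60, .55, .50]
print(serpentine_parcels(loadings, 3))  # [[0, 5, 6], [1, 4, 7], [2, 3, 8]]
```

The resulting parcels sum to nearly equal total loadings (2.15, 2.10, 2.05 here), which is the point of the balancing.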

Techniques: Multidimensional Case
Example: 'Intelligence' ~ Spatial, Verbal, Numerical.
- Domain-representative parcels: mixed item content from the various dimensions; a parcel consists of 1 Spatial item, 1 Verbal item, and 1 Numerical item
- Facet-representative parcels: internally consistent; each parcel is a 'facet', a singular dimension of the construct; a parcel consists of, e.g., 3 Spatial items. Recommended method.
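The two schemes differ only in which item columns are averaged together. A sketch with hypothetical data (three 3-item facets; the variable names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
# Hypothetical scores on three 3-item facets of 'intelligence'
spatial = rng.normal(size=(n, 3))
verbal = rng.normal(size=(n, 3))
numerical = rng.normal(size=(n, 3))

# Facet-representative: each parcel is one internally consistent facet
facet = np.column_stack([spatial.mean(1), verbal.mean(1), numerical.mean(1)])

# Domain-representative: each parcel mixes one item from every facet
domain = np.column_stack([
    (spatial[:, j] + verbal[:, j] + numerical[:, j]) / 3 for j in range(3)
])
print(facet.shape, domain.shape)  # (50, 3) (50, 3)
```

Facet parcels keep the facets distinguishable at the indicator level; domain parcels homogenize them, which is why the slides ask "which facet is driving the correlation?" for the domain-representative case.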

Domain Representative Parcels
Each of the three parcels combines one Spatial (S), one Verbal (V), and one Numerical (N) item: Parcel #1 = S + V + N; Parcel #2 = S + V + N; Parcel #3 = S + V + N.

53 Domain Representative crmda.KU.edu

54 Domain Representative But which facet is driving the correlation among constructs? Intellective Ability, Spatial Ability, Verbal Ability, Numerical Ability crmda.KU.edu

Facet Representative Parcels
Each parcel is drawn from a single facet: Spatial parcel = S + S + S; Verbal parcel = V + V + V; Numerical parcel = N + N + N.

56 Facet Representative crmda.KU.edu

57 Facet Representative Intellective Ability Diagram depicts smaller communalities (amount of shared variance) crmda.KU.edu

Facet Representative Parcels
A more realistic case with higher communalities: each parcel again combines three items from a single facet.

59 Facet Representative crmda.KU.edu

60 Facet Representative Intellective Ability Parcels have more reliable information crmda.KU.edu

2nd-Order Representation — Capture multiple sources of variance?

2nd-Order Representation — Variance can be partitioned even further.

2nd-Order Representation — Lower-order constructs retain facet-specific variance.

64 Functionally Equivalent Models Explicit Higher-Order Structure Implicit Higher-Order Structure crmda.KU.edu

65 When Facet Representative Is Best crmda.KU.edu

66 When Domain Representative Is Best crmda.KU.edu

Thank You!
Todd D. Little, University of Kansas — crmda.KU.edu
Workshop presented at the University of Turku, based on the Presidential Address presented at the American Psychological Association Meeting in Washington, DC.

Update
Dr. Todd Little is currently at Texas Tech University:
- Director, Institute for Measurement, Methodology, Analysis and Policy (IMMAP) (immap.educ.ttu.edu)
- Director, "Stats Camp" (Statscamp.org)
- Professor, Educational Psychology and Leadership
www.Quant.KU.edu