CHAPTER OVERVIEW
–The Measurement Process
–Levels of Measurement
–Reliability and Validity: Why They Are Very, Very Important
–A Conceptual Definition of Reliability
–Validity
–The Relationship Between Reliability and Validity
–A Closing (Very Important) Thought

THE MEASUREMENT PROCESS
Two definitions
–Stevens: "assignment of numerals to objects or events according to rules"
–"…the assignment of values to outcomes"
Chapter foci
–Levels of measurement
–Reliability and validity

LEVELS OF MEASUREMENT
–Variables are measured at one of four levels
–Qualities of one level are also characteristic of the next level up
–The more precise (higher) the level of measurement, the more accurate the measurement process

Level of Measurement | Example | Quality of Level
Ratio | Rachael is 5' 10" and Gregory is 5' 5" | Absolute zero
Interval | Rachael is 5" taller than Gregory | An inch is an inch is an inch
Ordinal | Rachael is taller than Gregory | Greater than
Nominal | Rachael is tall and Gregory is short | Different from

NOMINAL SCALE
Qualities: Assignment of labels
Examples: Gender (male or female); Preference (like or dislike); Voting record (for or against)
What You Can Say: Each observation belongs in its own category
What You Can't Say: An observation represents "more" or "less" than another observation

ORDINAL SCALE
Qualities: Assignment of values along some underlying dimension
Examples: Rank in college; Order of finishing a race
What You Can Say: One observation is ranked above or below another
What You Can't Say: The amount by which one observation is more or less than another

INTERVAL SCALE
Qualities: Equal distances between points on the scale
Examples: Number of words spelled correctly; Intelligence test scores; Temperature
What You Can Say: One score differs from another on a measure that has equal-appearing intervals
What You Can't Say: That the amount of difference is an exact representation of differences on the underlying variable being studied

RATIO SCALE
Qualities: A meaningful and non-arbitrary zero
Examples: Age; Weight; Time
What You Can Say: One value is twice as much as another, or that none of the variable is present
What You Can't Say: Not much!

WHAT IS ALL THE FUSS?
–Measurement should be as precise as possible
–In psychology, most variables are probably measured at the nominal or ordinal level
–But how a variable is measured can determine the level of precision, as the sketch below illustrates
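To make this concrete, here is a minimal Python sketch (the names and the 68-inch cut-off are illustrative, not from the chapter) showing how the same attribute, height, ends up at a different level of measurement depending on how it is recorded:

```python
# Illustrative only: the same heights recorded at three levels of measurement.
heights_in = {"Rachael": 70.0, "Gregory": 65.0}   # ratio: inches, with a true zero

# Ordinal: keep only the rank order (who is taller), not by how much.
ordinal = sorted(heights_in, key=heights_in.get, reverse=True)   # ['Rachael', 'Gregory']

# Nominal: keep only a category label (the 68-inch cut-off is arbitrary).
nominal = {name: ("tall" if h >= 68 else "short") for name, h in heights_in.items()}

print(heights_in)   # supports interval/ratio statements (5 inches taller, and so on)
print(ordinal)      # supports only "ranked above/below" statements
print(nominal)      # supports only "different from" statements
```

Each recording choice discards information that cannot be recovered later, which is why measuring at the highest level the variable allows is the safer default.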

RELIABILITY AND VALIDITY
–Reliability: the tool is consistent
–Validity: the tool measures what it should
Good assessment tools lead to
–Rejection of the null hypothesis, or
–Acceptance of the research hypothesis

A CONCEPTUAL DEFINITION OF RELIABILITY
Observed Score = True Score + Error Score
Error Score = Method Error + Trait Error

A CONCEPTUAL DEFINITION OF RELIABILITY
Observed score
–The score actually observed
–Consists of two components: a true score and an error score

A CONCEPTUAL DEFINITION OF RELIABILITY
True score
–A perfect reflection of the individual's true value on the variable
–A theoretical score

A CONCEPTUAL DEFINITION OF RELIABILITY
Error score
–The difference between the observed score and the true score

A CONCEPTUAL DEFINITION OF RELIABILITY
–Method error is due to characteristics of the test or the testing situation
–Trait error is due to characteristics of the individual
–The reliability of the observed score increases as error is reduced, as the simulation sketch below shows
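The error model is easy to see in a short simulation. This is a toy sketch (the means and standard deviations are made up, not from the chapter): it generates true scores, adds separate method and trait error, and treats reliability as the ratio of true-score variance to observed-score variance, which rises as either source of error shrinks.

```python
# Toy simulation of Observed Score = True Score + Error Score
# (Error Score = Method Error + Trait Error).
import numpy as np

rng = np.random.default_rng(0)

def simulated_reliability(method_sd, trait_sd, n=10_000):
    true = rng.normal(100, 15, n)                                    # theoretical true scores
    error = rng.normal(0, method_sd, n) + rng.normal(0, trait_sd, n)
    observed = true + error
    # Classical test theory: reliability = true-score variance / observed-score variance
    return true.var() / observed.var()

print(simulated_reliability(method_sd=10, trait_sd=10))   # more error -> roughly 0.53
print(simulated_reliability(method_sd=5, trait_sd=5))     # less error -> roughly 0.82
```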

INCREASING RELIABILITY BY DECREASING ERROR
–Increase sample size
–Eliminate unclear questions
–Standardize testing conditions
–Use both easy and difficult questions
–Minimize the effects of external events
–Standardize instructions
–Maintain consistent scoring procedures

HOW RELIABILITY IS MEASURED
Reliability is measured using a correlation coefficient, r_test1·test2
Reliability coefficients
–Indicate how scores on one test change relative to scores on a second test
–Can range from -1.0 to +1.0 (perfect reliability); a short computation sketch follows
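In code, the reliability coefficient is just the Pearson correlation between two sets of scores from the same participants. A minimal sketch with made-up test and retest scores:

```python
# Hypothetical scores for the same ten participants tested on two occasions.
from scipy.stats import pearsonr

test1 = [12, 15, 11, 18, 20, 14, 16, 13, 19, 17]
test2 = [13, 14, 12, 19, 19, 15, 17, 12, 20, 18]

r, p = pearsonr(test1, test2)   # r_test1·test2, the test-retest reliability coefficient
print(round(r, 2))              # values near +1.0 indicate a highly stable measure
```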

TYPES OF RELIABILITY

Type of Reliability | What It Is | How You Do It | What the Reliability Coefficient Looks Like
Test-Retest | A measure of stability | Administer the same test/measure to the same group of participants at two different times | r_test1·test2
Parallel Forms | A measure of equivalence | Administer two different forms of the same test to the same group of participants | r_form1·form2
Inter-Rater | A measure of agreement | Have two raters rate the same behaviors and then determine the amount of agreement between them | Percentage of agreements
Internal Consistency | A measure of how consistently each item measures the same underlying construct | Correlate performance on each item with overall performance across participants | Cronbach's alpha
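Two of the coefficients in the table are simple enough to compute directly. The sketch below uses made-up ratings and item scores: percent agreement for inter-rater reliability, and Cronbach's alpha for internal consistency computed from a participants-by-items score matrix.

```python
import numpy as np

# Inter-rater reliability: the percentage of observations on which two raters agree.
rater1 = ["on-task", "off-task", "on-task", "on-task", "off-task"]
rater2 = ["on-task", "on-task",  "on-task", "on-task", "off-task"]
agreement = sum(a == b for a, b in zip(rater1, rater2)) / len(rater1)
print(f"Percent agreement: {agreement:.0%}")             # 80%

# Internal consistency: Cronbach's alpha for a participants x items score matrix.
def cronbach_alpha(scores):
    scores = np.asarray(scores, dtype=float)
    n_items = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1).sum()    # sum of the item variances
    total_variance = scores.sum(axis=1).var(ddof=1)      # variance of the total scores
    return (n_items / (n_items - 1)) * (1 - item_variances / total_variance)

items = [[3, 4, 3, 4],   # each row = one participant, each column = one item
         [2, 2, 3, 2],
         [4, 5, 4, 5],
         [1, 2, 1, 2],
         [3, 3, 4, 3]]
print(round(cronbach_alpha(items), 2))                   # close to 1.0 = consistent items
```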

VALIDITY
–A valid test does what it was designed to do
–A valid test measures what it was designed to measure

A CONCEPTUAL DEFINITION OF VALIDITY
–Validity refers to the test's results, not to the test itself
–Validity ranges from low to high; it is not "either/or"
–Validity must be interpreted within the testing context

TYPES OF VALIDITY

Type of Validity | What Is It? | How Do You Establish It?
Content | A measure of how well the items represent the entire universe of possible items | Ask an expert whether the items assess what you want them to assess
Criterion: Concurrent | A measure of how well a test estimates a criterion | Select a criterion and correlate scores on the test with scores on the criterion measured in the present
Criterion: Predictive | A measure of how well a test predicts a criterion | Select a criterion and correlate scores on the test with scores on the criterion measured in the future
Construct | A measure of how well a test assesses some underlying construct | Assess the underlying construct on which the test is based and correlate these scores with the test scores

HOW TO ESTABLISH CONSTRUCT VALIDITY OF A NEW TEST
–Correlate the new test with an established test of the same construct
–Show that people with and without certain traits score differently (a sketch follows)
–Determine whether the tasks required on the test are consistent with the theory guiding test development
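The second strategy, sometimes called a known-groups comparison, can be sketched in a few lines. The group labels and scores below are hypothetical:

```python
# Hypothetical known-groups check: do people already known to have the trait
# score higher on the new test than people known not to have it?
from scipy.stats import ttest_ind

scores_with_trait = [22, 25, 27, 24, 26, 23, 28]
scores_without_trait = [15, 18, 17, 14, 19, 16, 13]

t, p = ttest_ind(scores_with_trait, scores_without_trait)
print(f"t = {t:.2f}, p = {p:.4f}")   # a clear group difference supports construct validity
```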

MULTITRAIT-MULTIMETHOD MATRIX
–Convergent validity: different methods of measuring the same trait yield similar results
–Discriminant validity: measures of different traits yield different results
[Matrix: Trait 1 = Impulsivity, Trait 2 = Activity Level; Method 1 = Paper and Pencil, Method 2 = Activity Level Monitor. Correlations between different methods measuring the same trait are moderate; correlations between different traits are low.]
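The matrix itself is just a table of correlations among every trait-method combination. A minimal sketch with simulated scores for the two traits and two methods named above: same-trait, different-method correlations should come out moderate to high (convergent validity), while different-trait correlations should come out low (discriminant validity).

```python
# Simulated multitrait-multimethod data: columns are trait-method combinations.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 50
impulsivity = rng.normal(size=n)        # underlying trait 1
activity = rng.normal(size=n)           # underlying trait 2

data = pd.DataFrame({
    "impulsivity_paper":   impulsivity + rng.normal(scale=0.5, size=n),
    "impulsivity_monitor": impulsivity + rng.normal(scale=0.5, size=n),
    "activity_paper":      activity + rng.normal(scale=0.5, size=n),
    "activity_monitor":    activity + rng.normal(scale=0.5, size=n),
})

# Same trait, different methods -> moderate/high r (convergent validity);
# different traits -> low r (discriminant validity).
print(data.corr().round(2))
```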

THE RELATIONSHIP BETWEEN RELIABILITY AND VALIDITY
–A valid test must be reliable
–But a reliable test need not be valid