Research: Conceptualization and Measurement


Research: Conceptualization and Measurement
Conceptualization
Steps in measuring a variable
Operational definitions
Confounding
Criteria for measurement quality
Techniques of dealing with problems in measurement
Ensuring reliability
Validity: face validity, content validity, criterion-related validity, construct validity

Conceptualization
We want to speak of abstract things: “intelligence,” “ability to cope with stress,” “life satisfaction,” “happiness.” We cannot research these things until we know exactly what they are. Everyday language often carries vague and unspecified meanings.

Conceptualization
Specify exactly what we mean, and don’t mean, by the terms we use in our research. There are no “true” (final) definitions of “the stuff of life.” Conceptualization is the process of identifying and clarifying concepts: we specify what we mean when using certain terms. Indicators mark the presence or absence of the concept we are studying. Concepts are often multi-dimensional, having more than one specifiable aspect or facet. What do we mean by happiness?

Steps in reaching a measurement of a variable
We all have conceptions of what we understand by compassion, prejudice, poverty, etc., but people do not always agree about the meanings. Begin by asking people to describe what they mean when they use certain terms such as “intelligence.” Consult the experts through a literature review, though even the experts do not agree. Coming to an agreement on what we understand is called conceptualization; the result of this process is a concept, e.g., “prejudice.”

Operational definitions
Specifying exactly what we are going to observe, and how we will do it. An operational definition makes the variable directly measurable: it describes the “operations” used to measure a concept.

Examples
“Socio-economic status” (SES): What was your total family income during the past 12 months? What is the highest level of school you completed? How would you operationalize “success at university”? If you operationalize badly, you end up not studying what you want (an invalid operational definition), e.g., operationalizing “success in career” by looking only at the pay check.
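The SES example above can be sketched in code. This is a minimal illustration, not a standard index: the respondent data, the variable names, and the choice to average z-scored income and schooling are all assumptions made for the sketch.

```python
# Hypothetical sketch: operationalizing "socio-economic status" as the
# mean of standardized (z-scored) income and years of schooling.
# All data and the weighting scheme are invented for illustration.
from statistics import mean, pstdev

def zscores(values):
    """Standardize raw scores to mean 0, SD 1 (population SD)."""
    m, sd = mean(values), pstdev(values)
    return [(v - m) / sd for v in values]

incomes = [20_000, 45_000, 45_000, 80_000]  # total family income, past 12 months
schooling = [10, 12, 16, 18]                # years of schooling completed

# One SES score per respondent: the average of the two standardized indicators
ses = [mean(pair) for pair in zip(zscores(incomes), zscores(schooling))]
print([round(s, 2) for s in ses])
```

Standardizing first keeps the two indicators on a comparable scale; a real study would have to justify the choice of indicators and weights theoretically, which is exactly the operationalization problem the slide describes.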

Confounding
Intelligence tests require knowledge of the language in which they are given, so they also measure acquired language skills. Juvenile delinquency can be defined in terms of convictions in court, but convictions are more frequent when defendants are not legally represented, so the definition also measures economic status. Confounding occurs when operational definitions measure more than one thing.

Criteria for measurement quality
Reliability: does the measure yield the same result every time? Stability over time: if I measure you now and again in half an hour, do I get the same reading? Maximum reliability depends on the construct; some constructs are unstable, e.g., heart rate. Consistency across single observers or raters also matters.

Techniques of dealing with problems in measurement reliability
Test-retest method: make the same measurement more than once (external consistency). Split-half method: divide the instrument into two halves and check that they agree; Cronbach’s alpha generalizes this idea (internal consistency).
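Both techniques reduce to simple computations. The sketch below uses invented questionnaire scores: test-retest reliability is estimated as the Pearson correlation between two administrations, and internal consistency as Cronbach's alpha. The data and scale are assumptions for illustration.

```python
# Minimal sketch of two reliability checks on invented data.
from statistics import mean, pvariance

def pearson(x, y):
    """Pearson correlation between two equal-length score lists."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def cronbach_alpha(items):
    """items: one list of scores per item, same respondents in each list."""
    k = len(items)
    item_vars = sum(pvariance(item) for item in items)
    totals = [sum(scores) for scores in zip(*items)]  # total score per respondent
    return k / (k - 1) * (1 - item_vars / pvariance(totals))

# Test-retest: the same scale administered twice, half an hour apart
time1 = [12, 15, 9, 20, 17]
time2 = [13, 14, 10, 19, 18]
print(f"test-retest r = {pearson(time1, time2):.2f}")

# Internal consistency: rows are items, columns are respondents
items = [[3, 4, 2, 5, 4],
         [2, 4, 2, 5, 3],
         [3, 5, 1, 4, 4],
         [2, 3, 2, 5, 4]]
print(f"Cronbach's alpha = {cronbach_alpha(items):.2f}")
```

A high test-retest correlation indicates stability over time; a high alpha indicates that the items hang together as a single scale.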

Ensuring reliability
Reliability suffers when respondents or researchers have to interpret. Objective scales, which allow little room for interpretation, are more reliable. Using a fixed-response format helps, e.g., multiple-choice or Likert-type response formats: the researcher does not have to interpret what the respondent meant.

Validity
The extent to which an empirical measure adequately reflects the meaning of the concept under investigation. For a scale: the degree to which it measures what it is supposed to measure. Validity is divided into several types: face validity, content validity, criterion-related validity, and construct validity.

Face validity
How well a measure conforms to our common agreements about a concept. To assess it, examine the wording of the items and submit the items to expert judges.

Content validity
How well a measure covers every element of a concept. Example: measuring only the affective aspect of love, but not the behavioral. Experts in a given domain generally judge content validity; for example, the content of the SAT Subject Tests™ is evaluated by committees of experts who ensure that each test covers content matching the relevant subject matter in its academic discipline.

Criterion-related validity
Sometimes called predictive validity: how well a measure predicts performance on an external criterion. E.g., showing that ACT or SAT results predict academic success at university is a way of establishing their validity.
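In its simplest quantitative form, criterion-related validity is assessed by correlating the measure with the external criterion. The admissions-test scores and first-year GPAs below are invented for illustration.

```python
# Sketch: correlating a (hypothetical) admissions-test score with the
# external criterion it is supposed to predict, first-year GPA.
# All numbers are invented for illustration.
from statistics import mean

def pearson(x, y):
    """Pearson correlation between two equal-length score lists."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

test_scores = [21, 28, 24, 33, 18, 30]          # ACT-style composite scores
first_year_gpa = [2.4, 3.1, 2.9, 3.8, 2.1, 3.3]  # criterion: later performance

r = pearson(test_scores, first_year_gpa)
print(f"predictive validity r = {r:.2f}")
```

The closer the correlation is to 1, the stronger the evidence that the test predicts the criterion it claims to predict.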

Construct validity
Closely tied to the theoretical underpinnings of the concept: variables ought to be related, theoretically, to other variables, so this kind of validity is based on logical relationships between variables. Does the instrument actually measure the concept (or construct)? E.g., measuring cranial circumference or brain weight to measure intelligence fails this test. Construct validity is the most difficult to achieve and the most important: measures lacking construct validity are almost useless.

How to check for construct validity
How can you show that a measurement truly measures what it claims to? How would you show that your depression scale has construct validity?

How to check
1. See how the measure relates to similar and dissimilar concepts. Show that your depression scale relates positively to similar concepts, e.g., people who score high on your depression scale will report many sad thoughts.

How to check
2. Show that your depression scale relates negatively to opposite concepts. Examples: people who score high on it will have very low energy levels; husbands who score high on a measure of marital satisfaction have fewer extra-marital affairs.
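Both checks reduce, in their simplest quantitative form, to inspecting correlations: the depression scale should correlate positively with a similar concept (frequency of sad thoughts) and negatively with an opposite one (energy level). The scale and all scores below are invented for illustration.

```python
# Convergent/discriminant sketch for a hypothetical depression scale.
# All scores are invented for illustration.
from statistics import mean

def pearson(x, y):
    """Pearson correlation between two equal-length score lists."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

depression   = [10, 25, 18, 30, 12, 22]  # hypothetical scale scores
sad_thoughts = [4, 9, 7, 12, 5, 8]       # similar concept: expect positive r
energy       = [8, 3, 5, 2, 7, 4]        # opposite concept: expect negative r

print(f"convergent r   = {pearson(depression, sad_thoughts):.2f}")
print(f"discriminant r = {pearson(depression, energy):.2f}")
```

A scale that correlates the right way with both similar and opposite concepts has passed one logical test of construct validity.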

Conclusion
The more a scale or instrument has the qualities of reliability and validity, the better it is. Reliability and validity need to be sorted out before you run the study.

Conclusion
There is a tension between validity and reliability, rooted in the richness of meaning of concepts. Operational definitions and measurements seem to rob concepts of their “richness of meaning,” but the more conceptual variation and richness we allow in a study, the more opportunity there is for disagreement on how a concept applies in a given situation. This is related to the tension between quantitative, structured techniques such as surveys and qualitative, semi-structured methods such as interviews.