Research: Conceptualization and Measurement


1 Research: Conceptualization and Measurement
Conceptualization
Steps in measuring a variable
Operational definitions
Confounding
Criteria for measurement quality
Techniques of dealing with problems in measurement
Ensuring reliability
Validity: face validity, content validity, criterion-related validity, construct validity

2 Conceptualization
We study abstract things: "intelligence," "ability to cope with stress," "life satisfaction," "happiness." We cannot research these things until we know exactly what they are, and everyday language is too vague to tell us.

3 Conceptualization
We must specify exactly what we mean, and do not mean, by the terms used in research; there are no "true" (final) definitions of "the stuff of life." Conceptualization is the process of identifying and clarifying concepts. Indicators mark the presence or absence of a concept, and concepts are often multi-dimensional: what exactly do we mean by happiness?

4 Measuring a Variable
What do we understand by compassion, prejudice, poverty, and the like? There is not always agreement. We can ask people what they mean by "intelligence" and consult the experts through a literature review, yet even the experts do not agree. Coming to an agreement is conceptualization, and the result of that process is a concept.

5 Operational Definitions
An operational definition specifies exactly what we will observe and how we will observe it, making the variable directly measurable. It is a description of the "operations" we will undertake to measure a concept.

6 Examples
"Socio-economic status" might be operationalized with two survey questions: What was your total family income during the past 12 months? What is the highest level of school you completed? How would you operationalize "success at university"? If you operationalize badly, you end up not studying what you want (an invalid operational definition), e.g., operationalizing "success in a career" by looking only at the paycheck. A toy coded version of the socio-economic status example follows.
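
To make this concrete, here is a minimal sketch of the socio-economic status example as code. The weights, caps, and scaling are invented for illustration; this is not a validated index:

```python
def ses_score(family_income: float, years_of_school: int) -> float:
    """Toy operational definition of socio-economic status:
    combines answers to the two survey questions into one number.
    All weights and cut-offs here are arbitrary illustration."""
    income_component = min(family_income / 100_000, 1.0)   # cap at $100k
    education_component = min(years_of_school / 20, 1.0)   # cap at 20 years
    return round(50 * income_component + 50 * education_component, 1)

# A respondent reporting $62,000 family income and 16 years of schooling
print(ses_score(62_000, 16))  # 71.0
```

The point is that once the definition is written down this precisely, two researchers applying it to the same answers must arrive at the same score.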

7 Confounding
Intelligence tests require knowledge of the language in which they are given, so the tests also measure acquired language skills. Juvenile delinquency can be defined as convictions in court, but convictions are more frequent when defendants are not legally represented, so this definition also measures economic status. Confounding occurs when an operational definition measures more than one thing.

8 Criteria for Measurement Quality
Reliability: does the measure yield the same result every time? This is stability over time: if subjects are measured now and again in half an hour, do we get the same results? The maximum achievable reliability depends on the construct and on the raters: some constructs are inherently unstable (e.g., heart rate), and raters vary among themselves.

9 Techniques for Dealing with Problems in Measurement
Reliability can be checked in two main ways: the test-retest method, which makes the same measurement more than once (external consistency), and the split-half method, which divides the instrument into halves and compares them, generalized by Cronbach's alpha (internal consistency). A sketch of both checks appears below.
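
A minimal sketch of both reliability checks, using NumPy; the Likert-style scores are invented for illustration:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal-consistency reliability for a (respondents x items) matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 4-item scale, six respondents, responses on a 1-5 scale
time1 = np.array([[4, 5, 4, 4],
                  [2, 2, 3, 2],
                  [5, 4, 5, 5],
                  [3, 3, 3, 2],
                  [1, 2, 1, 2],
                  [4, 4, 5, 4]])

# The same respondents measured again later (test-retest)
time2 = np.array([[4, 4, 4, 5],
                  [3, 2, 3, 2],
                  [5, 4, 4, 5],
                  [3, 4, 3, 2],
                  [2, 2, 1, 1],
                  [4, 4, 5, 4]])

print(f"Cronbach's alpha (internal): {cronbach_alpha(time1):.2f}")
r = np.corrcoef(time1.sum(axis=1), time2.sum(axis=1))[0, 1]
print(f"Test-retest correlation (external): {r:.2f}")
```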

10 Ensuring Reliability
Reliability suffers when respondents or researchers have to interpret. Objective scales are more reliable because they require little interpretation. Using a fixed-response format helps (e.g., multiple-choice or Likert-type response formats), since the researcher does not have to interpret what the respondent meant; the sketch below shows such scoring in practice.
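
To illustrate why fixed formats remove interpretation, here is a sketch (item wording and coding invented) that scores Likert responses mechanically, including a reverse-worded item:

```python
# Fixed response options: the respondent picks exactly one, so no
# free-text interpretation is ever needed.
LIKERT = {"strongly disagree": 1, "disagree": 2, "neutral": 3,
          "agree": 4, "strongly agree": 5}

def score_item(response: str, reverse: bool = False) -> int:
    """Convert a fixed-choice answer to a number; reverse-worded
    items are flipped so that higher always means the same thing."""
    value = LIKERT[response.lower()]
    return 6 - value if reverse else value

# Hypothetical three-item scale; the second item is reverse-worded
answers = [("agree", False), ("strongly disagree", True), ("neutral", False)]
print(sum(score_item(r, rev) for r, rev in answers))  # 4 + 5 + 3 = 12
```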

11 Validity
Validity is the extent to which an empirical measure adequately reflects the meaning of the concept under investigation; for a scale, it is the degree to which the scale measures what it is supposed to measure. Types of validity: face validity, content validity, criterion-related validity, and construct validity.

12 Face Validity
Whether a measure conforms, on its face, to common agreements about the concept. To assess it, examine the wording of the items and submit the items to expert judges.

13 Content Validity
How fully a measure covers every element of a concept. Example of a shortfall: measuring only the affective aspect of love but not the behavioral. Experts generally judge content validity; for example, the content of the SAT Subject Tests™ is evaluated by committees of experts who ensure that each test covers content matching all relevant subject matter in its academic discipline.

14 Criterion-Related Validity
Sometimes called predictive validity: how well a measure predicts performance on an external criterion. For example, the degree to which ACT or SAT results predict academic success at university is evidence of their validity; the correlation sketch below makes this concrete.
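
A minimal sketch of a criterion-related validity check: correlate the measure with the external criterion it is supposed to predict. The test scores and GPAs below are invented for illustration:

```python
import numpy as np

# Hypothetical admission-test scores and later first-year GPAs
sat = np.array([1050, 1200, 980, 1400, 1130, 1310, 900, 1250])
gpa = np.array([2.8, 3.2, 2.5, 3.8, 3.0, 3.5, 2.3, 3.4])

# The validity coefficient is the correlation between the measure
# and performance on the external criterion.
r = np.corrcoef(sat, gpa)[0, 1]
print(f"criterion-related validity: r = {r:.2f}")
```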

15 Construct Validity
Closely tied to the theoretical underpinnings of the concept: variables ought to be related, theoretically, to other variables, so this kind of validity rests on logical relationships between variables. Does the instrument actually measure the concept (or construct)? Measuring cranial circumference or brain weight as a proxy for intelligence, for example, fails this test. Construct validity is the most difficult to achieve and the most important: measures lacking construct validity are almost useless.

16 How to Check for Construct Validity
How can you show that a measurement truly measures what it claims to? For instance, how would you show that your depression scale has construct validity?

17 How to Check
1. See how the measure relates to similar and dissimilar concepts. Show that your depression scale relates positively to similar concepts: e.g., people who score high on your depression scale will report many sad thoughts.

18 How to Check
2. Show that your depression scale relates negatively to opposite concepts. Examples: people who score high on it will have very low energy levels; husbands who score high on a measure of marital satisfaction have fewer extra-marital affairs. Both checks are sketched in code below.
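
Both construct-validity checks reduce to correlations. In this sketch the data are simulated, and every number (sample size, effect sizes, noise) is an invented assumption chosen only to illustrate the logic:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200
depression = rng.normal(50, 10, n)          # scores on the new scale

# Convergent check: a theoretically similar concept (sad thoughts)
# should correlate positively with the scale.
sad_thoughts = 0.8 * depression + rng.normal(0, 6, n)

# Discriminant check: a theoretically opposite concept (energy level)
# should correlate negatively with the scale.
energy = 100 - 0.7 * depression + rng.normal(0, 6, n)

print("convergent r: ", round(np.corrcoef(depression, sad_thoughts)[0, 1], 2))
print("discriminant r:", round(np.corrcoef(depression, energy)[0, 1], 2))
```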

19 Conclusion
The more a scale or instrument has the qualities of reliability and validity, the better it is. Reliability and validity need to be sorted out before you run the study.

