Measuring Social Life: Qualitative & Quantitative Measurement

1 Measuring Social Life: Qualitative & Quantitative Measurement
Chapter 6

2 Why Measure? Some reasons:
evaluate an explanation
test a hypothesis
provide empirical support for a theory
make a decision about medical treatment
study an applied issue
Measurement is more central to quantitative research than to qualitative research.

3 Qualitative vs. Quantitative Measurement
Timing
Direction
Data Form
Linkages

4 Measurement Process
Conceptualization → Conceptual definition
Operationalization → Operational definition

5 Measurement Process Two Parts of the Measurement Process
Conceptualization = refining an idea by giving it a very clear, explicit definition.
Conceptual definition = defining a variable or concept in theoretical terms, with assumptions and references to other concepts.
Operationalization = the process of linking a conceptual definition with specific measures.
Operational definition = defining a concept as the specific operations or actions you carry out to measure it.

6 Quantitative Measurement

7 Quantitative Measurement
Quantitative Conceptualization and Operationalization
Three-part sequence: Conceptualization → Operationalization → Measurement
Conceptual hypothesis = a hypothesis stated with the variables as abstract concepts.
Empirical hypothesis = the hypothesis stated in terms of specific measures of the variables.

8 Quantitative Measurement

9 A Guide to Quantitative Measurement
Levels of Measurement
Level of measurement = the degree to which a measure is refined or precise.
Continuous and Discrete Variables
Continuous variable = a variable measured with numbers that can be subdivided into ever smaller increments.
Discrete variable = a variable measured with a limited number of fixed categories.

10 Measurement and Scaling Techniques
The most widely used classifications of measurement scales are:
Nominal scale
Assigns number symbols to events in order to label them.
The nominal scale is the least powerful level of measurement.
Ordinal scale
The use of an ordinal scale implies a statement of "greater than" or "less than" (an equality statement is also acceptable).

11 Measurement and Scaling Techniques
The most widely used classifications of measurement scales are:
Interval scale
Interval scales provide more powerful measurement than ordinal scales.
The primary limitation of the interval scale is the lack of a true zero (example: the Fahrenheit scale).
Ratio scale
Ratio scales have an absolute or true zero of measurement.
Examples include measures of physical dimensions such as weight, height, and distance.

12 A Guide to Quantitative Measurement
Level      Different Categories   Ranked   Distance between Categories Measured   True Zero
Nominal    Yes
Ordinal    Yes                    Yes
Interval   Yes                    Yes      Yes
Ratio      Yes                    Yes      Yes                                    Yes
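The table maps directly onto which summary statistics are meaningful at each level. A minimal sketch (in Python, with hypothetical example variables and values) of how the level of measurement limits the statistics you can legitimately report:

```python
# Minimal sketch: which summary statistics are appropriate at each level of
# measurement. The example variables and values are hypothetical.
import statistics

PERMITTED_STATS = {
    "nominal":  {"mode"},
    "ordinal":  {"mode", "median"},
    "interval": {"mode", "median", "mean"},
    "ratio":    {"mode", "median", "mean"},   # plus meaningful ratios, given a true zero
}

def describe(values, level):
    """Report only the statistics appropriate for the variable's level."""
    allowed = PERMITTED_STATS[level]
    report = {}
    if "mode" in allowed:
        report["mode"] = statistics.mode(values)
    if "median" in allowed:
        report["median"] = statistics.median(values)
    if "mean" in allowed:
        report["mean"] = statistics.mean(values)
    return report

# Age in years is a ratio-level variable; religious affiliation coded 1-4 is
# nominal, so only the mode is reported for it.
print(describe([19, 22, 22, 25, 31], "ratio"))
print(describe([1, 2, 2, 3, 4], "nominal"))
```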

13 Level of Measurement

14 Sources of Error in Measurement
Respondent
The respondent may be reluctant to express strong negative feelings, or may have very little knowledge but not admit that ignorance.
Transient factors such as fatigue, boredom, or anxiety also introduce error.
Situation
If anonymity is not assured, the respondent may be reluctant to express certain feelings.

15 Sources of Error in Measurement
Measurer
Errors may also creep in because of incorrect coding, faulty tabulation, and/or faulty statistical calculations, particularly in the data-analysis stage.
Instrument
The use of complex words beyond the comprehension of the respondent, ambiguous meanings, poor printing, inadequate space for replies, and omitted response choices all introduce error.

16 Qualitative Measurement

17 Qualitative Measurement
Qualitative Conceptualization and Operationalization
Ideas are in flux during data collection.
Definitions become clarified during data collection.

18 Qualitative vs. Quantitative Measurement
Quantitative
1. Conceptualize variables by developing a clear, complete written conceptual definition for the core idea of each. You want to build on past theories, consider definitions others have used, and be very logical.
2. Operationalize variables by creating specific activities to measure each. This operational definition should closely match how you have defined the variable in its conceptual definition.
3. Gather empirical data using the specific measurement activities of your operational definition; this links the data to the conceptual definition.
Qualitative
1. Gather empirical data and simultaneously think about concepts to organize and make sense of the data. Develop a clear definition for each idea that you use. These may be ideas you have read about, new ideas you create, or ideas that the people you are studying use.
2. As you gather data, be very aware of the processes you use to make sense of the data and of your own thinking. Reflect on and describe this process of linking ideas to specific observations in the data.
3. Review and refine your definitions and the descriptions of how you gathered data and made sense of it.

19 How to Create Good Measures: Reliability and Validity
Creating a Good Measure
Keep an open mind.
Borrow from others.
Anticipate difficulties.
Do not forget your unit of analysis.

20 How to Create Good Measures: Reliability and Validity
Reliability and Validity in Quantitative Research
Reliability = a feature of measures; the method of measuring is dependable and consistent.
To improve reliability:
Clearly conceptualize.
Increase the level of measurement.
Use multiple indicators. Multiple indicators = having several different specific measures that indicate the same concept.
Use pilot studies and replication.

21 Increasing Reliability → Decreasing Error
Increase sample size
Eliminate unclear questions
Standardize testing conditions
Moderate the degree of difficulty of the tests
Minimize the effects of external events
Standardize instructions
Maintain consistent scoring procedures

22 How Is Reliability Measured?
Reliability is measured using a correlation coefficient, r(test1, test2).
Reliability coefficients indicate how scores on one test change relative to scores on a second test.
They can range from -1.00 to +1.00:
+1.00 = perfect reliability
0.00 = no reliability
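A minimal sketch (Python, with hypothetical scores) of computing a test-retest reliability coefficient as the Pearson correlation between two administrations of the same measure:

```python
# Minimal sketch: test-retest reliability as the Pearson correlation between
# the same measure administered twice to the same respondents (hypothetical scores).
import statistics

def pearson_r(x, y):
    """Pearson correlation coefficient between two lists of scores."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

test1 = [12, 18, 15, 22, 9, 17]   # scores at time 1
test2 = [13, 17, 16, 21, 10, 18]  # scores at time 2, same respondents

print(f"test-retest reliability r = {pearson_r(test1, test2):.2f}")
```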

23 Types of Reliability
Test-Retest: a measure of stability. Administer the same test or measure at two different times to the same group of participants. Coefficient: r(test1, test2).
Parallel Forms: a measure of equivalence. Administer two different forms of the same test to the same group of participants. Coefficient: r(form1, form2).
Inter-Rater: a measure of agreement. Have two raters rate behaviors, then determine the amount of agreement between them. Coefficient: percentage of agreements.
Internal Consistency: a measure of how consistently each item measures the same underlying construct. Correlate performance on each item with overall performance across participants. Coefficient: Cronbach's alpha or Kuder-Richardson.
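The internal-consistency coefficient above can be computed directly. A minimal sketch of Cronbach's alpha on hypothetical item scores (rows are respondents, columns are items):

```python
# Minimal sketch: Cronbach's alpha for internal consistency.
# Rows = respondents, columns = items; all scores are hypothetical.
import statistics

def cronbach_alpha(rows):
    """alpha = (k/(k-1)) * (1 - sum of item variances / variance of total scores)."""
    k = len(rows[0])                           # number of items
    items = list(zip(*rows))                   # one tuple of scores per item
    item_vars = [statistics.variance(item) for item in items]
    totals = [sum(row) for row in rows]
    return (k / (k - 1)) * (1 - sum(item_vars) / statistics.variance(totals))

scores = [
    [4, 5, 4, 4],   # respondent 1's answers to four Likert items
    [2, 3, 2, 2],
    [5, 5, 4, 5],
    [3, 3, 3, 4],
    [1, 2, 2, 1],
]
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")
```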

24 How to Create Good Measures: Reliability and Validity
Reliability and Validity in Quantitative Research
Validity = a feature of measures; the concept of interest closely matches the method used to measure it.
Measurement validity is the fit between the conceptual and operational definitions.
Validity is more difficult to achieve than reliability.

25 How to Create Good Measures: Reliability and Validity

26 Types of Validity
Content: a measure of how well the items represent the entire universe of items. Establish it by asking an expert whether the items assess what you want them to.
Criterion (Concurrent): a measure of how well a test estimates a criterion. Establish it by selecting a criterion and correlating scores on the test with scores on the criterion in the present.
Criterion (Predictive): a measure of how well a test predicts a criterion. Establish it by selecting a criterion and correlating scores on the test with scores on the criterion in the future.
Construct: a measure of how well a test assesses some underlying construct. Establish it by assessing the underlying construct on which the test is based and correlating these scores with the test scores.
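Criterion validity uses the same correlational logic as reliability. A minimal sketch of predictive validity, with hypothetical test and criterion scores (statistics.correlation requires Python 3.10+):

```python
# Minimal sketch: predictive (criterion) validity as the correlation between
# test scores now and a criterion measured later. All data are hypothetical.
from statistics import correlation   # available in Python 3.10+

test_scores = [55, 62, 48, 70, 66, 51]        # e.g., aptitude test scores today
gpa_later   = [2.9, 3.4, 2.5, 3.8, 3.6, 2.7]  # e.g., GPA one year later (the criterion)

print(f"predictive validity r = {correlation(test_scores, gpa_later):.2f}")
```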

27 Reliability and Validity in Quantitative Research
Improve Reliability
Clear conceptualization
Increase the level of measurement
Multiple indicators
Pretests, pilot studies, and replication

28 How to Create Good Measures: Reliability and Validity
Reliability and Validity in Qualitative Research
Measure in a thoughtful and consistent manner, so that the measure is dependable.
Measure in a consistent and self-conscious way.
Measure with authenticity. Authenticity means a fair, honest, and balanced account of social life that captures what is "real" for particular people living in a specific time and place.

29 How to Create Good Measures: Reliability and Validity
Putting Reliability and Validity Together
Reliability is easier to achieve than validity.
Reliability is necessary but not sufficient for validity.
A measure can produce the same result over and over, yet what it measures may not match the definition of the construct (i.e., validity).
You can have a reliable measure that is invalid.

30 Scaling

31 Capturing Intensity: Scale Construction
Scale = a measure that captures the intensity of a person's behaviors or feelings.
Some commonly used scales:
Likert scale
Bogardus social distance scale
Semantic differential
Guttman scaling
Thurstone scaling

32 Likert scales Statements are written indicating an attitude toward a topic Items with clearly positive or negative attitudes are selected Statements are listed with a space for respondent to indicate degree of agreement

33 A Likert Scale
Directions: Indicate to what extent you agree or disagree with the statements listed below by circling one of the following:
SA means that you strongly agree with the statement (value = 5)
A means that you agree with the statement (value = 4)
U means that you are undecided about the statement (value = 3)
D means that you disagree with the statement (value = 2)
SD means that you strongly disagree with the statement (value = 1)
Item: Government has no business funding child care programs.  Rating: SD D U A SA
Item: Child care should be supported by federal, state, and local tax dollars.  Rating: SD D U A SA

34 Scoring Likert Responses: Method of Summated Ratings
Item: Government has no business funding child care programs.  Rating: SD D U A SA
Item: Child care should be supported by federal, state, and local tax dollars.  Rating: SD D U A SA
Items are weighted.
The weights of unfavorable items are reversed.
The average score is computed.
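A minimal sketch of the summated-ratings method, using the two items above with hypothetical responses; the first item is worded against child care funding, so its weight is reversed before averaging:

```python
# Minimal sketch: scoring Likert responses by the method of summated ratings.
# Responses are hypothetical; SA=5 ... SD=1 as on the slide above.
WEIGHTS = {"SD": 1, "D": 2, "U": 3, "A": 4, "SA": 5}

# True marks items worded against the attitude object (unfavorable items),
# whose weights are reversed so a high score always means a pro-child-care attitude.
ITEMS = [
    ("Government has no business funding child care programs.", True),
    ("Child care should be supported by federal, state, and local tax dollars.", False),
]

def score_respondent(answers):
    """Average the (reverse-coded where needed) item weights for one respondent."""
    total = 0
    for (text, reverse), answer in zip(ITEMS, answers):
        weight = WEIGHTS[answer]
        if reverse:
            weight = 6 - weight      # 5 becomes 1, 4 becomes 2, and so on
        total += weight
    return total / len(ITEMS)

# A respondent who strongly disagrees with the first item and agrees with the second.
print(score_respondent(["SD", "A"]))   # -> 4.5, a pro-child-care attitude
```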

35 Thurstone Scales: Method of Equal-Appearing Intervals
Statements are written indicating an attitude toward a topic.
Judges rank the statements from least favorable to most favorable.
Statements receiving consistent ratings are given the average score.
A set of statements is selected that covers the entire range of attitudes.

36 Thurstone Scales: Administration
Respondents check the items with which they agree.
Well-formed attitudes are indicated by consistently checking either high or low items.
Poorly formed or inconsistent attitudes are indicated by inconsistent patterns or by checking many neutral items.
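A minimal sketch pulling the two Thurstone steps together: hypothetical judges' ratings produce scale values, and a respondent's score is the mean scale value of the items they check. The statements, ratings, and consistency cutoff are all illustrative assumptions:

```python
# Minimal sketch of a Thurstone scale (equal-appearing intervals).
# Statements, judge ratings (1 = least favorable, 11 = most favorable), and
# respondent answers are all hypothetical.
from statistics import mean, stdev

# Each statement was rated by several judges on an 11-point favorability scale.
judge_ratings = {
    "Child care funding should be a top national priority.": [10, 11, 10, 9],
    "Some public support for child care is reasonable.":      [7, 6, 7, 8],
    "Child care is a purely private family matter.":          [2, 1, 2, 3],
}

# Keep statements the judges rated consistently (low spread) and assign each
# the average of its ratings as its scale value.
scale_values = {
    stmt: mean(ratings)
    for stmt, ratings in judge_ratings.items()
    if stdev(ratings) < 2.0
}

# Administration: a respondent checks the statements they agree with; their
# attitude score is the mean scale value of the checked statements.
checked = [
    "Child care funding should be a top national priority.",
    "Some public support for child care is reasonable.",
]
score = mean(scale_values[stmt] for stmt in checked)
print(f"respondent's attitude score = {score:.1f}")
```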

37 Thurstone Scale

38 Bogardus Social Distance Scale

39 Semantic Differential

40 Guttman Scaling

41 Capturing Intensity: Scale Construction
Summary Review of Major Scales
Likert (general attitude measure): indicates attitude using ranked answers showing degree of agreement or support.
Semantic Differential (indirect evaluation measure): indicates subjective feelings using the connotations of adjective sets.
Bogardus (social distance measure): indicates acceptance of various levels of social intimacy with out-groups.
Guttman (structure-of-response measure): indicates whether a set of items corresponds to a hierarchical pattern.
Out-group: sociologists define an out-group as a group to which you do not belong and from which you feel separated.
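A minimal sketch of the hierarchical pattern a Guttman scale looks for, with hypothetical social-distance items ordered from least to most intimate:

```python
# Minimal sketch: checking whether responses fit a Guttman (cumulative) pattern.
# Items are ordered from easiest to hardest to endorse; in a perfect Guttman
# scale, agreeing with a harder item implies agreeing with every easier one.
# The items and response patterns are hypothetical.
ITEMS = [
    "Would accept members of the out-group as citizens of my country",
    "Would accept them as co-workers",
    "Would accept them as neighbors",
    "Would accept them as close personal friends",
]

def fits_guttman_pattern(responses):
    """True if the 1s (agreements) form an unbroken run from the easiest item."""
    # A scalable pattern looks like 1,1,...,1,0,...,0 with no 1 after a 0.
    seen_zero = False
    for r in responses:
        if r == 0:
            seen_zero = True
        elif seen_zero:          # a 1 after a 0 breaks the cumulative pattern
            return False
    return True

print(fits_guttman_pattern([1, 1, 1, 0]))  # True: scalable response
print(fits_guttman_pattern([1, 0, 1, 0]))  # False: inconsistent response
```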

