Scales and Indices

While trying to capture the complexity of a phenomenon, we seek multiple indicators, regardless of the methodology we use:
Qualitative: we prepare a sequence of questions and then ask further questions that help us clarify the issue under investigation.
Quantitative: we construct several questionnaire items that together identify the concept.

Composite Measures
In quantitative research, a composite measure is a sequence of items, within the same questionnaire, that target the same issue in order to achieve a fuller representation of the concept under investigation.

Index
Babbie (2004, p. 152): a type of composite measure that summarizes and rank-orders several specific observations and represents some more general dimension.
In other words: it combines several distinct indicators of a construct into a single score, generally the sum of the scores of those indicators.

Index
Example: a) Your first exam contained 67 objective multiple-choice questions. The number of correct answers you received is an index of your understanding of the subject. b) Your first project in this class has a checklist of issues to be addressed while you are working on it. The number of checkmarks you make once the project is completed is your index of how ready it is for submission.
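The unweighted index in example (a) can be sketched in a few lines of Python. The answer key and responses below are invented for illustration:

```python
# Hypothetical sketch of an unweighted index: the exam score is
# simply the count of correct answers. Key and responses are invented.

def exam_index(answers, key):
    """Count correct answers -- a simple unweighted index."""
    return sum(1 for given, correct in zip(answers, key) if given == correct)

answer_key = ["b", "d", "a", "c", "a"]
student    = ["b", "d", "c", "c", "a"]

print(exam_index(student, answer_key))  # 4 of the 5 items are correct
```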

Index
Neuman (2000, p. 177): “Base your answers on your thoughts regarding the following four occupations: long-distance truck driver, medical doctor, accountant, telephone operator. Score each answer 1 for yes and 0 for no:
1. Does it pay a good salary?
2. Is the job secure from lay-offs?
3. Is the work interesting and challenging?
4. Are its working conditions good?
5. Are there opportunities for career advancement?
6. Is it prestigious or looked up to by others?
7. Does it permit freedom in decision-making?”
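Neuman's seven yes/no items translate directly into a 0–7 index per occupation. A minimal sketch, where the ratings assigned to each occupation are invented, not Neuman's data:

```python
# Hedged sketch of Neuman's occupation index: seven yes/no items,
# each scored 1 or 0, summed into a 0-7 index. Ratings are invented.

ITEMS = [
    "good salary", "secure from lay-offs", "interesting and challenging",
    "good working conditions", "career advancement", "prestigious",
    "freedom in decision-making",
]

def occupation_index(answers):
    """Sum seven 1/0 answers into a single index score."""
    if len(answers) != len(ITEMS):
        raise ValueError("expected one answer per item")
    return sum(answers)

medical_doctor = [1, 1, 1, 1, 1, 1, 1]  # hypothetical ratings
truck_driver   = [0, 0, 0, 0, 0, 0, 1]

print(occupation_index(medical_doctor))  # 7
print(occupation_index(truck_driver))    # 1
```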

Index Construction
Establish face validity:
- Do your items pertain to the population?
- Are your items general or specific?
- Do the items provide enough variance?
Examine bivariate relationships (logical consistency between all pairs of items).
Examine multivariate relationships (correspondence between one group of items measuring the same thing and another group of items measuring the same thing).

Index Scoring
What is your measurement range? Is there an adequate number of cases for each index point? Is there a need to assign weights to items?
- If unweighted, each of your items has the same value for the concept, so simply sum the scores.
- Weighting changes the theoretical definition of the construct, as some items matter more than others.
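The unweighted/weighted distinction above can be illustrated in two lines of arithmetic; the item answers and weights below are assumptions for the sketch, not values from the lecture:

```python
# Sketch of unweighted vs. weighted index scoring.
# Item answers and weights are illustrative, not from the lecture.

answers = [1, 0, 1, 1]          # yes/no responses to four items
weights = [2.0, 1.0, 1.0, 0.5]  # hypothetical importance weights

unweighted = sum(answers)                                  # each item counts equally
weighted   = sum(w * a for w, a in zip(weights, answers))  # some items matter more

print(unweighted)  # 3
print(weighted)    # 3.5
```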

Scale
Babbie (2004, p. 152): a type of composite measure composed of several items that have a logical or empirical structure among them.
In other words: a scale allows us to measure the intensity or direction of a construct by aligning the responses on a continuum.

Scale
Scales exist in a variety of types. The five most widely known are:
- Likert scale
- Bogardus Social Distance scale
- Thurstone scale
- Guttman scale
- Semantic Differential scale

Likert Scale Neuman (2000, p. 183)

Semantic Differential Scale Babbie (2004, p. 171)

Bogardus Social Distance Scale This social distance scale was taken from

Guttman Scale Neuman (2002, p. 191)

Thurstone Scale Neuman (2000, p. 187)

Scale Scoring
Response frequencies can be used to identify the intensity (direction, potency, etc.) of a construct. Often, when several scale items are used to identify a construct, the responses are summed and averaged in order to obtain an index.
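The summing and averaging described above can be sketched as follows; the respondent's answers are invented:

```python
# Sketch: collapsing one respondent's five Likert-type answers (1-5)
# into summed and averaged scale scores. Responses are invented.
from statistics import mean

responses = [4, 5, 3, 4, 4]

total   = sum(responses)    # summed score
average = mean(responses)   # averaged score, stays on the 1-5 metric

print(total)    # 20
print(average)  # 4.0
```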

Validation
Internal validation:
Item analysis: an assessment of whether each of the items included in a composite measure makes an independent contribution or merely duplicates the contribution of other items in the measure (Babbie, 2004, p. 164).
It is conducted through a variety of statistical techniques:
- Regression
- Factor analysis

Validation
External validation: the process of testing the validity of a measure by examining its relationship to other presumed indicators of the same variables (Babbie, 2004, p. 165).
It is conducted by:
- trying the measure on a population with apparent traits
- statistical procedures for establishing concurrent and predictive validity (often simple correlations)

Bad Index vs. Bad Validators
Fails the internal validation:
- Item analysis can show the presence of inconsistent relationships between the items.
- Item analysis can show that the contribution of an item is insufficient.
- The overall model is not supported by the data you collected.
Generally this means that you need to either go back and re-think your theory or look for more relationships between the items in your model.

Bad Index vs. Bad Validators
Fails the external validation:
- The index does not adequately measure the variable in question.
- The validation items do not adequately measure the variable and thus do not provide sufficient testing power.
Generally this means that you need to go back and re-examine your measure before blaming it on the validators.

Missing Data
- Try to guess from previous responses what value to insert (not a good idea).
- Substitute the average score for cases where data are present (creates threats to validity).
- Eliminate all cases for which any information is missing (reduces the size of the usable data).
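The second and third strategies can be sketched on a tiny invented dataset, with `None` marking a missing response:

```python
# Sketch of mean substitution and listwise deletion on invented data;
# None marks a missing response.
from statistics import mean

rows = [
    [4, 5, None],
    [3, 3, 2],
    [5, None, 4],
    [2, 2, 3],
]

def mean_substitute(data):
    """Replace each missing value with its column mean (threatens validity)."""
    filled = [row[:] for row in data]
    for j in range(len(data[0])):
        col_mean = mean(r[j] for r in data if r[j] is not None)
        for r in filled:
            if r[j] is None:
                r[j] = col_mean
    return filled

def listwise_delete(data):
    """Drop every case with any missing value (shrinks the usable data)."""
    return [r for r in data if None not in r]

print(mean_substitute(rows)[0])    # first row with the column mean imputed
print(len(listwise_delete(rows)))  # only 2 complete cases remain
```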

Sampling

Non-Probability
We do not know the size of the population from which the sample was drawn. Therefore, we do not know how representative the responses are, controlling for respondents' social-demographic characteristics.

Non-Probability
- Purposive
- Snowball
- Quota
- Selected Informants

Probability
We do know the size of the population from which the sample was drawn, and we do know how representative the responses are, controlling for respondents' social-demographic characteristics.

Probability
- Simple Random
- Systematic
- Stratified
- Multistage
- Probability Proportionate to Size
- Disproportionate with Weighting
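The first two designs can be sketched on a hypothetical sampling frame; the frame of 100 units, the sample size, and the seed are assumptions for the illustration:

```python
# Sketch of simple random and systematic sampling on a hypothetical
# frame of 100 units. Frame size, n, and seed are assumptions.
import random

frame = list(range(100))    # hypothetical sampling frame
n = 10                      # desired sample size

rng = random.Random(42)     # fixed seed so the sketch is reproducible

# Simple random sample: every unit has an equal chance of selection
srs = rng.sample(frame, n)

# Systematic sample: every k-th unit after a random start
k = len(frame) // n
start = rng.randrange(k)
systematic = frame[start::k]

print(sorted(srs))
print(systematic)           # 10 evenly spaced units
```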

Probability
Bias: the effect of theoretically relevant characteristics on responses.
Key terms: Population, Study Population, Sampling Frame, Sampling Unit, Sampling Error.