Presentation on theme: "Designing Research Concepts, Hypotheses, and Measurement"— Presentation transcript:
1. Chapter 3: Designing Research Concepts, Hypotheses, and Measurement
2. Research Design
You must start with a research question.
Research questions are composed of concepts.
From the question, you must create a research design.
3. Stages of Research
Developing concepts
Operationalization
Selection of research method(s)
Sampling strategy
Data collection "plan"
Analyses
Results and writing
Also need to consider budget issues.
4. Operationalization
It is critical in survey research to understand how to go from ideas to concepts to variables: this is operationalization.
5. Concepts
Concept (p. 35): an idea; a general mental formulation summarizing specific occurrences.
A label we put on a phenomenon, a matter, a "thing" that enables us to link separate observations, make generalizations, and communicate and inherit ideas.
Concepts can be concrete or abstract, tangible or intangible.
Concrete: height, major
Abstract: happiness, love
6. Transferring Concepts into Something Measurable
Variable: a representation of a concept in its variation of degree, variety, or occurrence.
A characteristic of a thing that can assume varying degrees or values.
A fixed meaning = a constant.
Most variables are truly variable = they take multiple categories or values.
7. Example: Concept and Variable
Concept: political participation
Variables:
Voted or not
How many times a person has voted
What party a person votes for
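The move from one concept to several measurable variables can be sketched in code. A minimal illustration (the respondent data and field names are invented for this example, not taken from the text):

```python
# Hypothetical operationalization of the concept "political participation".
# One concept, three variables, each with a different level of measurement.
respondent = {
    "voted": True,        # yes/no: a two-category (binary) variable
    "times_voted": 4,     # a count: a discrete quantitative variable
    "party": "Green",     # a label: a qualitative (nominal) variable
}

# The same concept now appears as three separately measurable variables:
for name, value in respondent.items():
    print(f"{name} = {value}")
```

Each variable captures a different facet of the concept; none of them alone is "political participation" itself.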
8. How to Be Measured?
Conceptualization: the process of coming to some agreement about the meaning of a concept.
In practice, you often move back and forth between loose ideas of what you are trying to study and the search for a word that best describes it.
Sometimes you have to "make up" a name to encompass your concept.
9. Conceptualization
As you flesh out the pieces or aspects of a concept, you begin to see its dimensions: the terms that define subgroups of the concept.
For each dimension, you must decide on indicators: signs of the presence or absence of that dimension.
Dimensions are usually concepts themselves.
10. Operationalizing Choices
You must operationalize: the process of converting concepts into measurable terms.
The process of creating a definition (or definitions) for a concept so that it can be observed and measured.
The development of specific research procedures that will result in empirical observations.
Example: SES is defined as a combination of income and education, and I will measure each by…
The development of questions (or characteristics of data, in qualitative work) that will indicate a concept.
11. Variable Attribute Choices
Variable attributes need to be exhaustive and mutually exclusive.
Exhaustive: they represent the full range of possible variation.
Degree of precision: the selection depends on your research interest.
Is it better to include too much or too little?
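The exhaustiveness requirement above can be checked mechanically against collected data. A minimal sketch (the attribute set and responses are invented for illustration):

```python
# Hypothetical attribute set for an "employment status" variable.
# Attributes should also be mutually exclusive: no response should
# fit more than one category (ensured here by the category definitions).
attributes = {"employed", "unemployed", "not in labor force"}

# Invented survey responses; "student" does not fit any attribute.
responses = ["employed", "student", "unemployed"]

# Exhaustive means every response falls into some attribute.
uncovered = [r for r in responses if r not in attributes]
print("uncovered responses:", uncovered)  # non-empty -> not exhaustive
```

A common fix when responses fall outside the scheme is to add a residual attribute such as "other," which restores exhaustiveness without overlapping the existing categories.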
12. Variables
The dependent variable is the variable that the researcher measures; it is called dependent because it depends upon (is caused by) the independent variable.
The independent variable is the one that the researcher manipulates.
Example: if you are studying the effects of a new educational program on student achievement, the program is the independent variable and your measures of achievement are the dependent ones.
13. Variables
Qualitative variable: composed of categories that are not comparable in terms of magnitude.
Quantitative variable: can be ordered with respect to magnitude on some dimension.
Continuous variable: a quantitative variable that can be measured with an arbitrary degree of precision; any two points on its scale have an infinite number of values in between. It is generally measured.
Discrete variable: a quantitative variable whose values can differ only by well-defined steps, with no intermediate values possible. It is generally counted.
15. Nominal Measures
Offer only a name or a label for a variable.
There is no ranking.
They are not numerically related.
Examples: gender, race.
16. Ordinal Measures
Variables with attributes that can be rank ordered.
You can say one response is more or less than another.
The distance between attributes does not have meaning.
Example: lower class, middle class, upper class.
Note: scales and indexes are ordinal measures, but conventions for analysis allow us to assume equidistance between attributes (if it makes logical sense), treat them like "interval" measures, and subject them to statistical tests.
17. Interval Measures
The distance separating attributes has meaning and is standardized (equidistant).
A "0" value does not mean the variable is not present.
Example: scores on an ACT test. A 100 vs. a 50 does not mean one person is twice as smart.
18. Ratio Measures
The attributes of a variable have a "true zero point" that means something.
Examples: waist measurements and bicep measurements.
A true zero allows one to create ratios.
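The four levels of measurement differ in which comparisons are meaningful, and that difference can be made concrete in code. A small sketch (the values are invented; the ACT figures mirror the slide's example):

```python
# Nominal: labels only -- equality is the only meaningful comparison.
gender_a, gender_b = "female", "male"
print(gender_a == gender_b)  # no ordering is implied either way

# Ordinal: rank order is meaningful, but distances between ranks are not.
social_class = ["lower", "middle", "upper"]  # position encodes rank
print(social_class.index("middle") < social_class.index("upper"))

# Interval: differences are meaningful, but there is no true zero,
# so ratios are not (a 100 is not "twice" a 50).
act_a, act_b = 50, 100
print(act_b - act_a)  # the gap itself is interpretable

# Ratio: a true zero point makes ratios meaningful.
waist_a, waist_b = 30.0, 60.0
print(waist_b / waist_a)  # genuinely "twice as large"
```

Each level supports the operations of the levels below it and adds one more: ordinal adds ordering, interval adds differences, ratio adds ratios.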
19. Hypotheses
Hypotheses (p. 36): untested statements that specify a relationship between two or more variables.
Example: milk drinkers make better lovers.
20. Characteristics of a Hypothesis
States a relationship between two or more variables.
Is stated affirmatively (not as a question).
Can be tested with empirical evidence.
Is most useful when it makes a comparison.
States how multiple variables are related.
The theory or underlying logic of the relationship makes sense.
21. Hypotheses should be clearly stated at the beginning of a study. You do not have to have a hypothesis to conduct research; general research questions can suffice.
22. Positive and Negative (Inverse) Relationships
Positive: as the values of the independent variable increase, the values of the dependent variable increase.
Negative: as the values of the independent variable increase, the values of the dependent variable decrease (or vice versa).
23. Two-Directional Hypotheses
A more general expression of a hypothesis.
Usually the default in statistical packages.
Suggests that groups differ or that concepts are related, but without specifying the exact direction of the difference.
Example: men and women trust UK security differently.
24. One-Directional Hypotheses
A more specific expression of a hypothesis.
Specifies the precise direction of the relationship between the dependent and independent variables.
Example: women have greater trust in UK security than men do.
25. Determining Quality of Measurement
Accuracy and consistency in measurement:
Validity is accuracy.
Reliability is consistency.
26. Reliability
Definition: the extent to which the same research technique, applied again to the same object (subject), will give you the same result.
Reliability does not ensure accuracy: a measure can be reliable but inaccurate (invalid) because of bias in the measure or in the data collector/coder.
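The reliable-but-invalid case can be made concrete with a biased instrument. A minimal sketch (the true weight and the bias value are invented for illustration):

```python
# A hypothetical bathroom scale that is consistent but miscalibrated:
# it always reads 5 kg heavier than the true weight.
TRUE_WEIGHT = 70.0
BIAS = 5.0

# Three repeated measurements of the same person.
readings = [TRUE_WEIGHT + BIAS for _ in range(3)]

# Reliable: repeated measurement gives the same result every time.
reliable = len(set(readings)) == 1

# Not valid: every reading misses the true value by the same amount.
error = readings[0] - TRUE_WEIGHT

print(f"readings = {readings}, reliable = {reliable}, error = {error} kg")
```

The scale passes a test-retest check perfectly, yet every reading is wrong: consistency (reliability) without accuracy (validity).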
27. Validity
Definition: the extent to which our measures reflect what we think, or want, them to be measuring.
28. Face Validity
Face validity: the measure seems to be related to what we are interested in finding out, even if it does not fully encompass the concept.
Concept = intellectual capacity
Measure = grades (high face validity)
Measure = number of close friends (low face validity)
29. Criterion Validity
Criterion validity (predictive validity): the measure is predictive of some external criterion.
Criterion = success in college
Measure = ACT scores (high criterion validity?)
30. Construct Validity
Construct validity: the measure is logically related to another variable, as we conceptualized it to be.
Construct = happiness
Measure = financial stability
If financial stability turns out not to be related to happiness, the measure has low construct validity.
31. Content Validity
Content validity: how well a measure covers a range of meanings; did you cover the full range of dimensions related to a concept?
Example: you think you are measuring prejudice, but you only ask questions about race. What about sex, religion, etc.?
32. Methodological Approaches, Reliability, and Validity
Qualitative research methods lend themselves to high validity and lower reliability.
Quantitative research methods lend themselves to lower validity and higher reliability.