Psych 231: Research Methods in Psychology

Review for Exam 2

Exam 2 Topics
Covers chapters 4, 5, 8, 10, 11, 12, and 15:
- APA style: underlying reasons for the organization, parts of a manuscript
- Variables
- Sampling
- Control
- Experimental designs: vocabulary, between vs. within subjects, factorial designs

APA style
Purpose of presenting your research: to get the work out there, to spur further research, replication, and testing/falsification of your theory.
Why the structured format? Clarity: it eases communication of what was done, forces a minimal amount of information to be reported, provides a consistent format within a discipline, and allows readers to cross-reference your sources easily.
Parts: check Chapter 15 of your textbook.

Parts of a research report: title page, abstract, body, references, author notes, footnotes, tables, figure captions, figures.

Title Page
Short title: goes in the header (with the page number) on each page of the manuscript. Running head: will go on each page of the published article; no more than 50 characters. The title should be maximally informative while short (10 to 12 words recommended). Order of authorship sometimes carries meaning. Affiliation: where the bulk of the research was done.

Abstract
A short summary of the entire paper (100 to 120 words): the problem/issue, the method, the results, and the major conclusions.

Body
The body has an hourglass shape:
- Start broad: background, literature review
- Narrow focus: statement of purpose, specific hypotheses (at least at the operational level)
- Most focused: Methods and Results
- Broaden again: Discussion, conclusions, implications

Body
- Introduction: background, literature review, statement of purpose, specific hypotheses
- Methods (in enough detail that the reader can replicate the study): participants, design, apparatus/materials, procedure
- Results (state the results but don't interpret them here): verbal statement of results, references to tables and figures, statistical outcomes
- Discussion (interpret the results): relationship between purpose and results, theoretical (or methodological) contribution, implications

References
Each entry includes the author's name, year, title of work, and publication information (journal, issue, pages). When something odd comes up, don't guess. Look it up!

The rest: author notes, footnotes, tables, figure captions, figures.

Variables
Characteristics of the situation: constants vs. variables.
Conceptual variables (constructs) vs. operationalized variables, and the underlying assumptions linking them.
Types (each with its own levels):
- Independent variables (explanatory)
- Dependent variables (response)
- Extraneous variables: control variables, random variables, confound variables

Independent variables
The variables that are manipulated by the experimenter. Each IV must have at least two levels, and the combination of all the levels of all of the IVs results in the different conditions in an experiment.
Methods of manipulation:
- Straightforward manipulations: stimulus manipulation, instructional manipulation
- Staged manipulations: event manipulation, subject manipulations

Independent variables
Choosing the right range. Things to watch out for: demand characteristics, experimenter bias, reactivity, ceiling and floor effects.

Dependent variables
The variables that are measured by the experimenter. They are "dependent" on the independent variables (if there is a relationship between the IV and DV, as the hypothesis predicts).
How to measure your construct:
- Can the participant provide a self-report? Introspection, rating scales
- Is the dependent variable directly observable? Choice/decision (sometimes timed)
- Is the dependent variable indirectly observable? Physiological measures (e.g., GSR, heart rate), behavioral measures (e.g., speed, accuracy)

Dependent variables
Measuring: scales of measurement (nominal, ordinal, interval, ratio) and measurement errors (validity, reliability).

Extraneous Variables
Types:
- Control variables: holding things constant; controls for excessive random variability
- Random variables: may freely vary, to spread variability equally across all experimental conditions (randomization; see the sketch below)
- Confound variables: other variables that haven't been accounted for (manipulated, measured, randomized, controlled) that can impact changes in the dependent variable(s)
Two things to watch out for: experimenter bias (expectancy effects) and demand characteristics.
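In practice, "randomization" of individual differences usually means randomly assigning participants to conditions. A minimal Python sketch, with made-up participant IDs and condition names:

    import random

    # Hypothetical participant pool and conditions (names are placeholders)
    participants = [f"P{i:02d}" for i in range(1, 21)]
    conditions = ["control", "treatment"]

    random.shuffle(participants)  # scramble any systematic ordering (sign-up time, etc.)

    # Deal shuffled participants out round-robin so group sizes stay equal
    # while individual differences are spread across conditions by chance
    assignment = {cond: [] for cond in conditions}
    for i, person in enumerate(participants):
        assignment[conditions[i % len(conditions)]].append(person)

    for cond, group in assignment.items():
        print(cond, group)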

Reliability & Validity
Reliability = consistency. Validity = measuring what is intended.
[Figure: target diagrams contrasting unreliable vs. reliable and invalid vs. valid measures]

Reliability
Test-retest reliability, internal consistency reliability, inter-rater reliability.

Validity
Does your measure really measure what it is supposed to measure? There are many "kinds" of validity:
- Construct validity
- Face validity
- Internal validity; threats include history, maturation, selection, mortality, testing
- External validity: variable representativeness, subject representativeness, setting representativeness

Sampling
Why? We don't have the resources to test everybody in the population, so we test a sample.
Goals: maximize representativeness (the extent to which the characteristics of those in the sample reflect those in the population) and reduce bias (a systematic difference between those in the sample and those in the population).
Types (see the sketch below):
- Probability sampling: simple random sampling, systematic sampling, stratified sampling
- Non-probability sampling: convenience sampling, quota sampling
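A minimal sketch of the difference between simple random sampling and stratified sampling, assuming an invented population of students with a year-in-school attribute:

    import random
    from collections import Counter

    random.seed(231)

    # Hypothetical population: 1000 students, each with a year-in-school attribute
    years = ["freshman", "sophomore", "junior", "senior"]
    population = [{"id": i, "year": random.choice(years)} for i in range(1000)]

    # Simple random sampling: every member has an equal chance of selection
    simple_sample = random.sample(population, 100)

    # Stratified sampling: sample within each stratum, proportional to its size
    strata = {}
    for person in population:
        strata.setdefault(person["year"], []).append(person)

    stratified_sample = []
    for year, members in strata.items():
        n = round(100 * len(members) / len(population))
        stratified_sample.extend(random.sample(members, n))

    print(Counter(p["year"] for p in simple_sample))      # may be lopsided by chance
    print(Counter(p["year"] for p in stratified_sample))  # matches population proportions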

Control
Sources of total (T) variability: T = NRexp + NRother + R, where NRexp is nonrandom (systematic) variability due to the experimental manipulation, NRother is nonrandom variability from other sources (e.g., confounds), and R is random variability. Our goal is to reduce R and NRother so that we can detect NRexp; that is, so we can see the changes in the DV that are due to the changes in the independent variable(s).
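A tiny simulation can make this decomposition concrete. The sketch below (all numbers invented) builds scores from a treatment effect (standing in for NRexp) plus random noise (R), with no confound (NRother = 0); shrinking the noise makes the same treatment effect easier to detect:

    import random
    import statistics

    random.seed(0)

    # 30 scores per condition: baseline 50, treatment adds 5 points, noise SD = 10
    control = [50 + random.gauss(0, 10) for _ in range(30)]
    treatment = [55 + random.gauss(0, 10) for _ in range(30)]

    effect = statistics.mean(treatment) - statistics.mean(control)  # estimates NRexp
    spread = statistics.stdev(control)                              # reflects R

    print(f"Estimated treatment effect: {effect:.1f}")
    print(f"Within-condition spread (R): {spread:.1f}")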

Control
Methods of control: comparison, production (picking levels), constancy/randomization.
Problems: excessive random variability, confounding, dissimulation.

Experimental designs
Some vocabulary: factors, levels, conditions, within groups, between groups, control group.
Single factor designs. Factorial designs: main effects, interactions.

Single variable, one-factor designs
Advantages: simple, and the results are relatively easy to interpret. Good for asking whether the independent variable is worth studying: if there is no effect, then usually don't bother with a more complex design. Sometimes two levels is all you need, e.g., when one theory predicts one pattern and another predicts a different pattern.
Disadvantages: the "true" shape of the function is hard to see, forcing you to rely on interpolation and extrapolation.

One factor, multilevel experiments
Advantages: you get a better idea of the true shape of the relationship.
Disadvantages: needs more resources (participants and/or stimuli); requires more complex statistical analysis (analysis of variance and pairwise comparisons; see the sketch below).
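As a rough illustration of that analysis step, here is a minimal one-way ANOVA on invented data for a three-level factor, assuming SciPy is available:

    from scipy import stats

    # Hypothetical scores for the three levels of a single factor
    level_1 = [12, 15, 14, 10, 13]
    level_2 = [18, 17, 19, 16, 20]
    level_3 = [11, 12, 10, 13, 12]

    f_stat, p_value = stats.f_oneway(level_1, level_2, level_3)
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
    # A significant F only says the level means differ somewhere; pairwise
    # comparisons (with a correction for multiple tests) identify which levels differ.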

Between versus Within Subjects Designs
Between subjects designs: each participant participates in one and only one condition of the experiment.
Within subjects designs: all participants participate in all of the conditions of the experiment.

Between subjects designs
Advantages:
- Independence of groups (levels of the IV): it is harder to guess what the experiment is about without experiencing the other levels of the IV, and exposure to different levels of the independent variable(s) cannot "contaminate" the dependent variable
- No order effects to worry about, so counterbalancing is not required
- Sometimes this design is a must, because you can't reverse the effects of prior exposure to other levels of the IV
Disadvantages:
- Individual differences between the people in the groups: non-equivalent groups, excessive variability

Within subjects designs
Advantages:
- Don't have to worry about individual differences: the same people are in all the conditions
- Variability between groups is smaller (a statistical advantage)
- Fewer participants are required
Disadvantages:
- Order effects (carry-over effects, progressive error), so counterbalancing is probably necessary (see the sketch below)
- Range effects
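One common counterbalancing scheme is a Latin square, in which each condition appears in each ordinal position exactly once across the set of presentation orders. A minimal sketch with placeholder condition names:

    # Latin square of presentation orders for a 4-condition within-subjects design,
    # built by rotating the condition list one step per row
    conditions = ["A", "B", "C", "D"]

    latin_square = [conditions[i:] + conditions[:i] for i in range(len(conditions))]

    for group, order in enumerate(latin_square, start=1):
        print(f"Participant group {group}: {' -> '.join(order)}")
    # Each condition occupies each serial position once across the four orders,
    # so simple order effects are spread evenly across conditions.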

Factorial experiments
Two or more factors. Factors are independent variables; levels are the levels of your independent variables. A 2 x 4 design means two independent variables, one with 2 levels and one with 4 levels. Calculate the number of conditions by multiplying the levels: a 2 x 4 design has 8 different conditions (see the sketch below).
Main effects: the effects of your independent variables, ignoring (collapsed across) the other independent variables.
Interaction effects: how your independent variables affect each other. Example, for a 2 x 2 design with factors A and B: at A1, B1 is bigger than B2; at A2, B1 and B2 don't differ.
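The multiply-the-levels rule can be checked by enumerating the cells directly; a quick sketch with placeholder level names:

    from itertools import product

    # Hypothetical 2 x 4 design: factor A with 2 levels, factor B with 4 levels
    factor_a = ["A1", "A2"]
    factor_b = ["B1", "B2", "B3", "B4"]

    conditions = list(product(factor_a, factor_b))

    print(len(conditions))  # 8 conditions, i.e., 2 * 4
    for cell in conditions:
        print(cell)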

Factorial experiments
There are lots of different potential outcomes: A = main effect of factor A, B = main effect of factor B, AB = interaction of A and B. With 2 factors there are 8 basic possible patterns of results: (1) no effects at all, (2) A only, (3) B only, (4) AB only, (5) A & B, (6) A & AB, (7) B & AB, (8) A & B & AB.

2 x 2 factorial design
Condition (cell) means and marginal means:

                   A1          A2          Marginal means
    B1             A1B1 mean   A2B1 mean   B1 mean
    B2             A1B2 mean   A2B2 mean   B2 mean
    Marginal means A1 mean     A2 mean

Comparing the A1 and A2 marginal means gives the main effect of A; comparing the B1 and B2 marginal means gives the main effect of B. A sketch of the arithmetic follows.
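A minimal sketch of how the marginal means, main effects, and interaction check fall out of the four cell means; the numbers are invented to match the earlier example (at A1, B1 is bigger than B2; at A2, B1 and B2 don't differ):

    # Hypothetical cell means for a 2 x 2 design
    cells = {("A1", "B1"): 12.0, ("A2", "B1"): 6.0,
             ("A1", "B2"): 6.0,  ("A2", "B2"): 6.0}

    a1_mean = (cells[("A1", "B1")] + cells[("A1", "B2")]) / 2  # marginal mean of A1
    a2_mean = (cells[("A2", "B1")] + cells[("A2", "B2")]) / 2  # marginal mean of A2
    b1_mean = (cells[("A1", "B1")] + cells[("A2", "B1")]) / 2  # marginal mean of B1
    b2_mean = (cells[("A1", "B2")] + cells[("A2", "B2")]) / 2  # marginal mean of B2

    print("Main effect of A:", a1_mean - a2_mean)  # 3.0
    print("Main effect of B:", b1_mean - b2_mean)  # 3.0

    # Interaction: does the effect of B depend on the level of A?
    b_effect_at_a1 = cells[("A1", "B1")] - cells[("A1", "B2")]  # 6.0
    b_effect_at_a2 = cells[("A2", "B1")] - cells[("A2", "B2")]  # 0.0
    print("Interaction (difference of differences):", b_effect_at_a1 - b_effect_at_a2)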

Factorial Designs
Advantages:
- Interaction effects: one should always consider the interaction effects before trying to interpret the main effects
- Adding factors decreases the variability, because you're controlling more of the variables that influence the dependent variable; this increases the statistical power of the statistical tests
- Increases the generalizability of the results, because you have a situation closer to the real world (where all sorts of variables are interacting)
Disadvantages:
- Experiments become very large and unwieldy
- The statistical analyses get much more complex
- Interpretation of the results can get hard, in particular for higher-order interactions (interactions involving more than two factors, e.g., an A x B x C interaction)