Analytic Comparisons & Trend Analyses


Analytic Comparisons
–Simple comparisons
–Complex comparisons
Trend Analyses
Errors & Confusions when interpreting Comparisons
Comparisons using SPSS
Orthogonal Comparisons
Effect sizes for analytic & trend analyses

Analytic Comparisons -- techniques for making specific comparisons among condition means. There are two types…

Simple Analytic Comparisons -- compare the means of two IV conditions at a time.

Rules for assigning weights:
1. Assign a weight of "0" to any condition not involved in the RH
2. Assign weights to reflect the comparison of interest
3. Weights must add up to zero

Example condition means: Tx1 = 10, Tx2 = 40, C = 40
–E.g. #1 RH: Tx1 < C (is 10 < 40?) -- e.g., weights Tx1 = -1, Tx2 = 0, C = +1
–E.g. #2 RH: Tx2 < Tx1 (is 40 < 10?) -- e.g., weights Tx1 = +1, Tx2 = -1, C = 0

How do Simple Analytic Comparisons & Pairwise Comparisons differ? Usually there are only k-1 analytic comparisons (one for each df).
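As a concrete illustration, here is a minimal Python sketch of how a simple comparison can be computed by hand. Only the condition means (10, 40, 40) come from the slide; the raw scores, group size, and variable names are invented for the example, and the contrast formulas are the standard textbook ones, not course code:

    import numpy as np
    from scipy import stats

    # Hypothetical raw scores for the three conditions (made up so the
    # group means equal the slide's 10, 40, 40).
    tx1 = np.array([8., 10., 12., 9., 11.])
    tx2 = np.array([38., 41., 40., 39., 42.])
    c   = np.array([39., 40., 42., 38., 41.])

    groups  = [tx1, tx2, c]
    weights = np.array([-1., 0., 1.])      # simple comparison: Tx1 vs C (Tx2 gets 0)

    means = np.array([g.mean() for g in groups])
    ns    = np.array([len(g) for g in groups])

    # Full-model (pooled) error term: MS_within from the omnibus ANOVA.
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    df_error  = int(ns.sum()) - len(groups)
    ms_within = ss_within / df_error

    psi    = (weights * means).sum()                 # value of the contrast
    ss_psi = psi ** 2 / ((weights ** 2) / ns).sum()  # SS for the contrast (df = 1)
    F      = ss_psi / ms_within
    p      = stats.f.sf(F, 1, df_error)
    print(f"psi = {psi:.2f}, F(1, {df_error}) = {F:.2f}, p = {p:.4f}")

Remember to also check that the difference is in the predicted direction (here, the sign of psi) before claiming support for the RH.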

Complex Analytic Comparisons -- compare two "groups" of IV conditions, where a "group" is sometimes one condition and sometimes 2 or more conditions that are "combined" and represented by their average mean.

Rules for assigning weights:
1. Assign a weight of "0" to any condition not involved in the RH
2. Assign weights to reflect the group comparison of interest
3. Weights must add up to zero

Example condition means (as above): Tx1 = 10, Tx2 = 40, C = 40
RH: Control is higher than the average of the Tx conditions (is 40 > 25?) -- e.g., weights Tx1 = -1, Tx2 = -1, C = +2

Careful!!! Notice the difference between the proper interpretation of this complex comparison and of the corresponding set of simple comparisons. The complex comparison only says that the Control mean exceeds the average of the Tx conditions; it does not say that both Tx conditions are poorer than the Control -- as a set of simple comparisons, that claim would require Tx2 < C (is 40 < 40?) and Tx1 < C (is 10 < 40?), and the first of those fails. The complex comparison & the set of simple comparisons have different interpretations!
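A complex comparison only changes the weight vector. The self-contained sketch below (again hypothetical, not course code) applies the Control-vs-average-of-Tx weights to the running example means; the group size is made up:

    import numpy as np

    # Running example means (Tx1, Tx2, C) and a hypothetical equal group size.
    means = np.array([10., 40., 40.])
    n     = 5
    w     = np.array([-0.5, -0.5, 1.0])    # rescaled (-1, -1, 2) weights; the F is identical

    psi = (w * means).sum()                # 40 - 25 = 15: C minus the average of the Txs
    ss  = psi ** 2 / ((w ** 2) / n).sum()  # 1-df SS; divide by MS_within to get the F
    print(f"psi = {psi:.1f}, SS = {ss:.1f}")
    # A significant result says only that C exceeds the *average* of the Tx
    # conditions; it does not show that C exceeds each Tx condition separately.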

Criticisms of Complex Analytic Comparisons
–Complex comparisons are seldom useful for testing research hypotheses!! (Most RH are addressed by the proper set of simple comparisons!)
–Complex comparisons require assumptions about the comparability of the IV conditions being combined into a "group" -- assumptions that should themselves be treated as research hypotheses!!
–Why would you run two (or more) separate IV conditions, carefully following their different operational definitions, only to "collapse" them together in a complex comparison?
–Complex comparisons are often misinterpreted as if they were a set of simple comparisons.

Trend Analyses -- describe the shape of the IV-DV relationship. Trend analyses can be applied whenever the IV is quantitative. There are three basic types of trend (with two versions of each):
–Linear trends: positive or negative
–Quadratic trends (require at least 3 IV conditions): U-shaped or inverted-U-shaped
–Cubic trends (require at least 4 IV conditions)

Note: Trend analyses are computed the same way as analytic comparisons -- using weights (coefficients) taken from a table of orthogonal polynomial coefficients (those tabled weights apply only for equal n & equally spaced IV conditions).
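Since the slide's weight table did not survive the transcript, here are the standard orthogonal polynomial coefficients for four equally spaced, equal-n conditions, applied in a short sketch; everything other than the weights themselves (the means, group size, names) is hypothetical:

    import numpy as np

    # Hypothetical equally spaced conditions with equal n and these made-up means.
    k_means = np.array([12., 18., 22., 24.])    # rises, then levels off
    n_per   = 10                                # made-up equal group size

    trend_weights = {
        "linear":    np.array([-3., -1.,  1.,  3.]),
        "quadratic": np.array([ 1., -1., -1.,  1.]),
        "cubic":     np.array([-1.,  3., -3.,  1.]),
    }

    for name, w in trend_weights.items():
        psi = (w * k_means).sum()
        ss  = psi ** 2 / ((w ** 2) / n_per).sum()   # SS for this 1-df trend contrast
        print(f"{name:9s}: psi = {psi:6.1f}, SS = {ss:7.1f}")
    # Divide each SS by MS_within from the omnibus ANOVA to get that trend's F.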

Not only is it important to distinguish between the two versions of each basic trend, it is also important to identify shapes that are combinations of trends. Here are two different kinds of "linear + quadratic" that would have very different interpretations:
–positive linear & U-shaped quadratic (an "accelerating returns" curve)
–positive linear & inverted-U-shaped quadratic (a "diminishing returns" curve)
A common combination of positive linear & cubic trends gives the classic "learning curve".
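One way to see a "combination of trends" numerically: a hypothetical "diminishing returns" pattern of means yields both a positive linear contrast and a negative (inverted-U) quadratic contrast when the k = 4 weights above are applied (the means here are invented for illustration):

    import numpy as np

    # Hypothetical "diminishing returns" means: big early gains that level off.
    m    = np.array([10., 30., 38., 40.])
    lin  = np.array([-3., -1.,  1.,  3.])
    quad = np.array([ 1., -1., -1.,  1.])

    print("linear psi    =", (lin  * m).sum())   #  98.0 -> clear positive linear trend
    print("quadratic psi =", (quad * m).sum())   # -18.0 -> inverted-U (concave) component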

"How to mess up interpreting analytic comparisons"

Simple Comparisons:
–ignoring the direction of the simple difference (remember: you must have a difference in the correct direction)

Complex Comparisons:
–ignoring the direction of the difference (again, the difference must be in the correct direction)
–misinterpreting a complex comparison as if it were a set of simple comparisons

Trend Analyses:
–ignoring the specific pattern of the trend (remember: you must have a shape in the correct direction or pattern)
–misinterpreting a trend as if it were a set of simple comparisons
–ignoring combinations of trends (e.g., the RH of a linear trend "really means" that there is a significant linear trend, and no significant quadratic or cubic trend)
–performing trend analyses on non-quantitative IV conditions

Caveats about Analytic Comparisons via SPSS
–The "polynomial" subcommand of ONEWAY and GLM assumes equally spaced IV conditions and equal n (so do the weights given in our text and most tables of polynomial weights -- it is possible to do a trend analysis with unequal IV-condition spacing and/or unequal n, just not using "polynomial" or the weights in the back of the book).
–The "contrast" subcommand of ONEWAY uses a separate error term for each analytic comparison, rather than the full-model error term.
–The "contrast" subcommand of GLM (for within-groups designs) doesn't give the exact set of analytic comparisons you specify; rather, it gives the "closest" set of orthogonal comparisons (see the discussion of orthogonal comparisons below).
–The "polynomial" subcommand of ONEWAY uses a separate error term for each trend, rather than the full-model error term.
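To make the "separate versus full-model error term" distinction concrete, here is a hedged Python sketch (not SPSS output; the raw scores are the same invented data as in the earlier sketch) computing the same simple comparison both ways:

    import numpy as np

    # Hypothetical raw scores for Tx1, Tx2, C (means 10, 40, 40 as before).
    tx1 = np.array([8., 10., 12., 9., 11.])
    tx2 = np.array([38., 41., 40., 39., 42.])
    c   = np.array([39., 40., 42., 38., 41.])
    groups  = [tx1, tx2, c]
    weights = np.array([-1., 0., 1.])                       # Tx1 vs C

    means = np.array([g.mean() for g in groups])
    ns    = np.array([len(g) for g in groups])
    psi   = (weights * means).sum()

    # (a) Full-model error term: pooled MS_within from all three groups.
    ms_within = sum(((g - g.mean()) ** 2).sum() for g in groups) / (ns.sum() - len(groups))
    t_pooled  = psi / np.sqrt(ms_within * ((weights ** 2) / ns).sum())

    # (b) Separate error term: each group contributes its own variance
    #     (roughly what a "does not assume equal variances" contrast test does).
    variances  = np.array([g.var(ddof=1) for g in groups])
    t_separate = psi / np.sqrt(((weights ** 2) * variances / ns).sum())

    print(f"pooled-error t = {t_pooled:.2f}, separate-error t = {t_separate:.2f}")
    # The two t values coincide here only because the made-up groups happen to
    # have equal variances; with heterogeneous variances they diverge.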

One last thing -- orthogonal and non-orthogonal sets of analytic comparisons. Orthogonal means independent or unrelated -- the idea of a set of orthogonal analytic comparisons is that each one provides statistically independent information. To determine whether a pair of comparisons is orthogonal, sum the products of the corresponding weights: if that sum is zero, the pair of comparisons is orthogonal.

E.g., with weights listed over (Tx1, Tx2, C):
–Non-orthogonal pair: (1, -1, 0) & (1, 0, -1) -- sum of products = (1)(1) + (-1)(0) + (0)(-1) = 1
–Orthogonal pair: (1, -1, 0) & (1, 1, -2) -- sum of products = (1)(1) + (-1)(1) + (0)(-2) = 0

For a "set" of comparisons to be orthogonal, each pair in the set must be.
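A quick way to check this (assuming equal n) is the dot product of the weight vectors; a tiny sketch using the example weights above, with hypothetical contrast names:

    import numpy as np

    tx1_vs_tx2 = np.array([ 1., -1.,  0.])   # weights over (Tx1, Tx2, C)
    tx1_vs_c   = np.array([ 1.,  0., -1.])
    txs_vs_c   = np.array([ 1.,  1., -2.])

    def orthogonal(w1, w2):
        """Two contrasts are orthogonal (with equal n) iff the sum of the
        products of corresponding weights is zero."""
        return np.isclose(np.dot(w1, w2), 0.0)

    print(orthogonal(tx1_vs_tx2, tx1_vs_c))   # False: sum of products = 1
    print(orthogonal(tx1_vs_tx2, txs_vs_c))   # True:  sum of products = 0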

Advantages and Disadvantages of Orthogonal Comparison Sets

Advantages:
–Each comparison gives statistically independent information, so the orthogonal set gives the most information possible for that number of comparisons.
–It is a mathematically elegant way of expressing the variation among the IV conditions -- SS_IV is partitioned among the comparisons (a worked check of this appears in the sketch below).

Disadvantages:
–"Separate research questions" often don't translate into "statistically orthogonal comparisons".
–You can only have as many orthogonal comparisons as df_IV.
–The comparisons included in an orthogonal set rarely address the set of research hypotheses one actually has (e.g., sets of orthogonal comparisons usually include one or more complex comparisons).
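As an illustration of the SS-partitioning point, this small sketch (hypothetical equal-n data with the running example means) shows that the 1-df SSs of a full orthogonal set add up exactly to the between-groups SS:

    import numpy as np

    # Hypothetical equal-n data for Tx1, Tx2, C (means 10, 40, 40 as before).
    groups = [np.array([8., 10., 12., 9., 11.]),
              np.array([38., 41., 40., 39., 42.]),
              np.array([39., 40., 42., 38., 41.])]
    means = np.array([g.mean() for g in groups])
    n     = len(groups[0])
    grand = np.concatenate(groups).mean()

    ss_between = (n * (means - grand) ** 2).sum()

    # A full orthogonal set for k = 3 groups has k - 1 = 2 contrasts.
    ortho_set = [np.array([1., -1.,  0.]),     # Tx1 vs Tx2
                 np.array([1.,  1., -2.])]     # Tx1 & Tx2 vs C
    ss_contrasts = sum((w @ means) ** 2 / ((w ** 2) / n).sum() for w in ortho_set)

    print(f"SS_between = {ss_between:.1f}, sum of contrast SSs = {ss_contrasts:.1f}")
    # The two numbers match: the orthogonal set partitions SS_IV exactly.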

Effect sizes for analytic & trend analyses

Most statistical packages present F-test or t-test results for each analytic or trend analysis. Use one of the following formulas to estimate the associated effect size:

r = √[ F / (F + df_error) ]    or    r = √[ t² / (t² + df) ]

Be sure you properly interpret the effect size! Remember the cautions and criticisms of these types of comparisons.
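A minimal sketch of these conversions; the F, t, and df values below are placeholders, not results from any of the course examples:

    import math

    def r_from_F(F, df_error):
        """Effect size r for a 1-df contrast reported as an F-test."""
        return math.sqrt(F / (F + df_error))

    def r_from_t(t, df):
        """Effect size r for a contrast reported as a t-test."""
        return math.sqrt(t ** 2 / (t ** 2 + df))

    print(round(r_from_F(9.0, 36), 3))   # e.g., F(1, 36) = 9.0 -> r ≈ 0.447
    print(round(r_from_t(3.0, 36), 3))   # e.g., t(36) = 3.0    -> r ≈ 0.447 (same, since F = t²)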