Statistical Analysis of the Two-Group Post-Only Randomized Experiment

Analysis Requirements

- Two groups
- A post-only measure
- Two distributions, each with an average and variation
- Want to assess the treatment effect
- Treatment effect = statistical (i.e., non-chance) difference between the groups

Design notation:

    R  X  O
    R     O

Statistical Analysis

Two posttest distributions, one centered on the control group mean and one on the treatment group mean. Is there a difference?

What Does Difference Mean?

Three cases with the same mean difference: medium variability, high variability, and low variability. Which one shows the greatest difference?

- A statistical difference is a function of the difference between means relative to the variability.
- A small difference between means with large variability could be due to chance.
- Like a signal-to-noise ratio.

What Do We Estimate?

    Signal     Difference between group means       X̄T - X̄C
    ------  =  ------------------------------  =  --------------  =  t-value
    Noise         Variability of groups           SE(X̄T - X̄C)

- The t-test, one-way analysis of variance (ANOVA), and a form of regression all test the same thing and can be considered equivalent alternative analyses.
- The regression model is emphasized here because it is the most general.
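The signal-to-noise ratio above can be computed directly from the two groups' posttest scores. A minimal sketch, assuming the classic pooled-variance (equal-variance) standard error; the function name and the scores are made up for illustration:

```python
import math

def two_group_t(treat, ctrl):
    """Pooled-variance t statistic: signal (mean difference) over noise (SE)."""
    n_t, n_c = len(treat), len(ctrl)
    mean_t = sum(treat) / n_t
    mean_c = sum(ctrl) / n_c
    # Pool the within-group sums of squares across both groups
    ss_t = sum((x - mean_t) ** 2 for x in treat)
    ss_c = sum((x - mean_c) ** 2 for x in ctrl)
    pooled_var = (ss_t + ss_c) / (n_t + n_c - 2)
    se = math.sqrt(pooled_var * (1 / n_t + 1 / n_c))  # noise
    return (mean_t - mean_c) / se                     # signal / noise

# Hypothetical posttest scores
treat = [52, 55, 58, 61, 54, 57]
ctrl = [48, 50, 53, 49, 51, 52]
t = two_group_t(treat, ctrl)
```

With these scores the mean difference is about 5.67 and the t-value about 3.75: the signal is large relative to the noise, so the difference is unlikely to be due to chance.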

Regression Model for t-Test or One-Way ANOVA

    yi = β0 + β1 Zi + ei

where:

    yi = outcome score for the ith unit
    β0 = coefficient for the intercept
    β1 = coefficient for the slope
    Zi = 1 if the ith unit is in the treatment group, 0 if the ith unit is in the control group
    ei = residual for the ith unit
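This model can be fit by ordinary least squares. A minimal sketch (function name and scores are made up for illustration) showing that the fitted intercept equals the control-group mean and the fitted slope equals the treatment-control mean difference:

```python
def fit_dummy_regression(y, z):
    """Least-squares fit of y = b0 + b1*z, where z is a 0/1 dummy variable."""
    n = len(y)
    y_bar = sum(y) / n
    z_bar = sum(z) / n
    # Slope = covariance(z, y) / variance(z); intercept from the means
    b1 = (sum((zi - z_bar) * (yi - y_bar) for zi, yi in zip(z, y))
          / sum((zi - z_bar) ** 2 for zi in z))
    b0 = y_bar - b1 * z_bar
    return b0, b1

# Hypothetical posttest scores: first six control (Z=0), last six treatment (Z=1)
y = [48, 50, 53, 49, 51, 52, 52, 55, 58, 61, 54, 57]
z = [0] * 6 + [1] * 6
b0, b1 = fit_dummy_regression(y, z)
# b0 equals the control mean (50.5); b1 equals the mean difference (about 5.67)
```

Because z takes only the values 0 and 1, the fitted line simply passes through the two group means, which is why b1 recovers the treatment effect.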

In Graph Form...

Plot the outcome yi on the vertical axis against the dummy variable Zi on the horizontal axis; Zi takes only two values, 0 (Control) and 1 (Treatment).

- β0 is the intercept: the y-value when z = 0.
- β1 is the slope.

Why Is β1 the Mean Difference?

- Intuitive explanation: the slope is the change in y for a 1-unit change in x (i.e., z).
- Since the 1-unit change in x is the treatment-control difference, the slope is the difference between the posttest means of the two groups.

Why β1 Is the Mean Difference in yi = β0 + β1 Zi + ei

First, determine the expected outcome for each group. (The residual ei averages to 0 across each group.)

For the control group (Zi = 0):

    yC = β0 + β1(0) + 0
    yC = β0

For the treatment group (Zi = 1):

    yT = β0 + β1(1) + 0
    yT = β0 + β1

Why β1 Is the Mean Difference in yi = β0 + β1 Zi + ei (continued)

Then, find the difference between the two groups:

    treatment:  yT = β0 + β1
    control:    yC = β0

    yT - yC = (β0 + β1) - β0
            = β0 + β1 - β0
            = β1

Conclusions

- The t-test, one-way ANOVA, and regression analysis all yield the same results in this case.
- The regression analysis method uses a dummy variable for treatment.
- Regression analysis is the most general model of the three.
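The equivalence claimed above can be checked numerically: for two groups, the one-way ANOVA F statistic equals the square of the pooled-variance t statistic. A minimal sketch with made-up scores (function names are hypothetical):

```python
import math

def pooled_t(a, b):
    """Pooled-variance two-sample t statistic."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    ss = sum((x - ma) ** 2 for x in a) + sum((x - mb) ** 2 for x in b)
    se = math.sqrt(ss / (na + nb - 2) * (1 / na + 1 / nb))
    return (ma - mb) / se

def one_way_f(a, b):
    """One-way ANOVA F statistic for two groups."""
    n = len(a) + len(b)
    grand = (sum(a) + sum(b)) / n
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    # Between-groups df = 1 (two groups); within-groups df = n - 2
    ss_between = len(a) * (ma - grand) ** 2 + len(b) * (mb - grand) ** 2
    ss_within = sum((x - ma) ** 2 for x in a) + sum((x - mb) ** 2 for x in b)
    return ss_between / (ss_within / (n - 2))

treat = [52, 55, 58, 61, 54, 57]
ctrl = [48, 50, 53, 49, 51, 52]
t = pooled_t(treat, ctrl)
f = one_way_f(treat, ctrl)
# For two groups, F = t**2, so the two tests reach identical conclusions
```

The regression slope fit to the same data (with a 0/1 treatment dummy) would likewise reproduce the mean difference in the t-statistic's numerator, which is why all three analyses agree.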