Lecture 22 - Dustin Lueker (STA 291 Summer 2010)

 The sample mean of the difference scores is an estimator for the difference between the population means
 We can now use exactly the same methods as for one sample
◦ Replace X_i by D_i

 Small sample confidence interval for the mean of the difference scores:
◦ x̄_D ± t_(n−1) · s_D/√n
 Note:
◦ When n is large (greater than 30), we can use the z-scores instead of the t-scores
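A minimal sketch of this interval in Python. The before/after scores, variable names, and the 95% confidence level are illustrative assumptions, not values from the lecture:

```python
import numpy as np
from scipy import stats

# Hypothetical before/after scores for the same 8 subjects (made-up data)
before = np.array([72, 65, 80, 75, 68, 90, 62, 78])
after = np.array([78, 70, 84, 75, 74, 93, 65, 85])

d = after - before            # difference scores D_i
n = len(d)
d_bar = d.mean()              # sample mean of the differences
s_d = d.std(ddof=1)           # sample standard deviation of the differences

# Small-sample 95% CI: d_bar +/- t_(n-1) * s_d / sqrt(n)
t_crit = stats.t.ppf(0.975, df=n - 1)
margin = t_crit * s_d / np.sqrt(n)
print(f"95% CI: {d_bar:.2f} +/- {margin:.2f}")
```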

 Small sample test statistic for testing a difference in the population means (H_0: μ_D = 0):
◦ t = x̄_D / (s_D/√n)
◦ For small n, use the t-distribution with df = n−1
◦ For large n, use the normal distribution instead (z value)
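A sketch of the same calculation in Python, using the made-up difference scores from the interval example above:

```python
import numpy as np
from scipy import stats

# Same hypothetical difference scores as above (after - before, made-up data)
d = np.array([6, 5, 4, 0, 6, 3, 3, 7])
n = len(d)

# t = d_bar / (s_d / sqrt(n)), df = n - 1, testing H_0: mu_D = 0
t_stat = d.mean() / (d.std(ddof=1) / np.sqrt(n))
p_value = 2 * stats.t.sf(abs(t_stat), df=n - 1)
print(t_stat, p_value)

# scipy's built-in paired t-test on the raw scores gives the same t and p:
# stats.ttest_rel(after, before)
```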

 Variability in the difference scores may be less than the variability in the original scores
◦ This happens when the scores in the two samples are strongly associated
◦ Subjects who score high before the intensive training also tend to score high after the intensive training
 Thus these consistently high (or low) scores cancel out in the differences and do not add variability the way they do within each individual sample
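A quick simulation illustrating this point (the correlation of 0.9, the score scale, and the sample size are arbitrary assumptions): when the paired scores are strongly associated, the difference scores vary far less than the raw scores.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate strongly associated "before" and "after" scores:
# each has standard deviation 10, and their correlation is 0.9 (arbitrary choices)
mean = [70, 75]
cov = [[100, 90],
       [90, 100]]
before, after = rng.multivariate_normal(mean, cov, size=10_000).T

d = after - before
print(before.std(ddof=1), after.std(ddof=1), d.std(ddof=1))
# The difference scores vary far less (sd near 4.5) than either set of raw scores (sd near 10)
```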

 If we wanted to examine the improvement students made after taking a class, what type of value would we hope to see for the mean of the difference scores? Assume we take X_1 − X_2, with X_1 being the student's first exam score.
1. Positive
2. Negative

 Assume we match people of similar health into 2 groups, give group 1 a cholesterol medication, and measure each group's cholesterol level after 8 weeks. What would we hope the mean difference would be if we are subtracting group 2 from group 1?
1. Positive
2. Negative
3. Zero

 Regression
◦ The process of using sample information about explanatory variables (independent variables) to predict the value of a response variable (dependent variable)
 Many types of regression
◦ From one explanatory variable to many explanatory variables
◦ Linear, quadratic, cubic, logistic, exponential, etc.

 Uses one explanatory variable to predict a response variable
◦ Only type of regression we will look at here
 Model: y = β_0 + β_1·x + ε
 y = Dependent (response) variable
 x = Independent (explanatory) variable
 β_0 = y-intercept
 β_1 = Slope of the line (defined as rise/run)
 ε = Error variable

 Model we will use in problems: ŷ = b_0 + b_1·x
 ŷ (y-hat) = Dependent variable
 x = Independent variable
 b_0 = y-intercept
 b_1 = Slope of the line (defined as rise/run)
 Example:
 Estimating college GPA by ACT score
 College GPA would be our dependent (response) variable
 ACT score would be our independent (explanatory) variable
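A minimal least-squares sketch of the GPA-from-ACT example; the data values below are invented purely for illustration:

```python
import numpy as np

# Hypothetical (made-up) ACT scores and college GPAs for 8 students
act = np.array([18, 21, 24, 26, 28, 30, 32, 34])
gpa = np.array([2.4, 2.7, 2.9, 3.1, 3.2, 3.5, 3.6, 3.8])

# Least-squares estimates: b1 = S_xy / S_xx, b0 = y_bar - b1 * x_bar
b1 = np.sum((act - act.mean()) * (gpa - gpa.mean())) / np.sum((act - act.mean()) ** 2)
b0 = gpa.mean() - b1 * act.mean()

# Fitted model y-hat = b0 + b1*x: predicted GPA for an ACT score of 27
print(b0, b1, b0 + b1 * 27)
```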

 Notice that the equation is for ŷ (y-hat), which is an estimator of y
◦ When using a regression model it is important to remember that it will not exactly predict y, but rather give an estimate of what we would expect y to be
 This is the reason we don't have to include the error term (ε) in the model we use: error is accepted because we are simply estimating what we would expect the value of y to be given x, basically estimating y

 Correlation Coefficient
◦ −1 ≤ R ≤ 1
 Sometimes referred to as a lowercase "r"
 Measures how strong the linear relationship is between the response and explanatory variable, as well as its direction
 The sign (±) indicates a positive or a negative relationship
 Positive means our estimate of y goes up as x goes up
 Negative means our estimate of y goes down as x goes up
 The closer |R| is to one, the stronger the relationship between the response and explanatory variables
 R = 0 indicates no linear relationship
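With the same made-up ACT/GPA data as above, r can be computed directly; np.corrcoef returns the full correlation matrix, so the off-diagonal entry is the coefficient:

```python
import numpy as np

# Same made-up ACT/GPA data as above
act = np.array([18, 21, 24, 26, 28, 30, 32, 34])
gpa = np.array([2.4, 2.7, 2.9, 3.1, 3.2, 3.5, 3.6, 3.8])

r = np.corrcoef(act, gpa)[0, 1]   # correlation coefficient, between -1 and 1
print(r)                          # positive here: estimated GPA rises as ACT rises
```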

 Coefficient of Determination
◦ Denoted by R²
 Calculated by squaring the correlation coefficient
 Interpretation
 The percent of variation in the response variable that is explained by the model
 Simple Linear Regression
 The percent of variation in y that is explained by x
 This is because our model only has one explanatory variable
◦ The higher the R² value the better, because we can explain more of the variation in our response variable, which is the one we want to examine
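For simple linear regression, R² is just the square of r; it can also be computed from the fitted line as the share of variation explained, 1 − SSE/SST. A sketch continuing the hypothetical ACT/GPA example:

```python
import numpy as np

act = np.array([18, 21, 24, 26, 28, 30, 32, 34])
gpa = np.array([2.4, 2.7, 2.9, 3.1, 3.2, 3.5, 3.6, 3.8])

r = np.corrcoef(act, gpa)[0, 1]
r_squared = r ** 2                    # coefficient of determination

# Equivalent: the share of variation in y explained by the fitted line, 1 - SSE/SST
b1, b0 = np.polyfit(act, gpa, 1)      # slope, intercept of the least-squares line
resid = gpa - (b0 + b1 * act)
r_squared_alt = 1 - np.sum(resid ** 2) / np.sum((gpa - gpa.mean()) ** 2)

print(r_squared, r_squared_alt)       # the two values agree
```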

 If the correlation coefficient is −0.7, what would be the coefficient of determination?
 Would larger values for the explanatory variable (x) yield larger or smaller values for the response variable (y)?
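As a quick worked check of the arithmetic behind this question (simply squaring the given correlation):

```python
r = -0.7
print(r ** 2)   # coefficient of determination = 0.49
# A negative r means larger x values go with smaller predicted y values
```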

 If model A has a correlation coefficient of 0.7, what would the correlation coefficient of model B need to be for us to be able to say that B is the better model?

 If the slope of our simple linear regression equation is 13 and the y-intercept is −2, what would ŷ (y-hat) be if x = 3?
 What would y be?
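The prediction itself is straightforward arithmetic; a one-line check (y itself cannot be computed exactly, only estimated by ŷ, since the actual value also includes the error term):

```python
b0, b1, x = -2, 13, 3
y_hat = b0 + b1 * x
print(y_hat)   # 37; the actual y would differ from this by the error term
```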