Inference about the slope parameter and correlation


Introduction The slope $\beta_1$ is the true average change in the dependent variable $y$ associated with a 1-unit increase in $x$. The slope of the least squares line gives an estimate $\hat{\beta}_1$ of the true slope. This estimate depends on the values of $Y$, which are random. If we can determine the sampling distribution of the estimate, we can perform inference for the true slope.

Estimator of slope is a linear combination of normal variables The estimator of the slope is $\hat{\beta}_1 = \sum_i c_i Y_i$, where $c_i = (x_i - \bar{x})/S_{xx}$ and $S_{xx} = \sum_i (x_i - \bar{x})^2$. This is a linear combination of the normal random variables $Y_1, \dots, Y_n$, and thus it has a normal distribution.
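
To make the linear-combination form concrete, here is a minimal NumPy sketch (the x and y vectors are made-up illustrative data, not from the slides) showing that $\sum_i c_i y_i$ agrees with the usual formula $S_{xy}/S_{xx}$:

```python
import numpy as np

# Hypothetical illustrative data
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 2.9, 3.8, 5.2, 5.9, 7.1])

xbar = x.mean()
Sxx = np.sum((x - xbar) ** 2)

# Weights c_i = (x_i - xbar) / Sxx
c = (x - xbar) / Sxx

# Slope written as a linear combination of the responses
beta1_lincomb = np.sum(c * y)

# Slope from the usual least-squares formula Sxy / Sxx
Sxy = np.sum((x - xbar) * (y - y.mean()))
beta1_formula = Sxy / Sxx

print(beta1_lincomb, beta1_formula)   # the two values coincide
```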

Mean and variance of estimator of slope The estimator is unbiased: $E(\hat{\beta}_1) = \beta_1$. Its variance is $V(\hat{\beta}_1) = \sigma^2 / S_{xx}$, so the standard deviation of $\hat{\beta}_1$ is $\sigma_{\hat{\beta}_1} = \sigma / \sqrt{S_{xx}}$.
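
These two facts can be checked empirically with a small simulation sketch; the true parameters $\beta_0$, $\beta_1$, $\sigma$ and the x values below are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
beta0, beta1, sigma = 1.0, 0.8, 0.5        # hypothetical true parameter values
Sxx = np.sum((x - x.mean()) ** 2)

slopes = []
for _ in range(20000):
    y = beta0 + beta1 * x + rng.normal(0.0, sigma, size=x.size)
    Sxy = np.sum((x - x.mean()) * (y - y.mean()))
    slopes.append(Sxy / Sxx)

slopes = np.array(slopes)
print(slopes.mean(), beta1)               # close: the estimator is unbiased
print(slopes.var(), sigma**2 / Sxx)       # close: V(beta1_hat) = sigma^2 / Sxx
```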

Estimated variance of estimator Recall that in simple linear regression we estimate $\sigma^2$ using $s^2 = \mathrm{SSE}/(n-2)$. Then the estimated standard deviation (standard error) of the estimator is $s_{\hat{\beta}_1} = s / \sqrt{S_{xx}}$.

The t statistic The assumptions of the simple linear regression model then imply that the standardized variable $T = (\hat{\beta}_1 - \beta_1)/s_{\hat{\beta}_1}$ has a t distribution with $n-2$ d.f.

Confidence interval for slope Confidence intervals and hypothesis tests for $\beta_1$ are then carried out in the usual manner. A $100(1-\alpha)\%$ confidence interval for $\beta_1$ is $\hat{\beta}_1 \pm t_{\alpha/2,\,n-2}\, s_{\hat{\beta}_1}$.
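
A sketch of the interval computation with NumPy and SciPy, reusing the made-up data from the earlier sketch and a 95% confidence level chosen just for illustration:

```python
import numpy as np
from scipy import stats

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 2.9, 3.8, 5.2, 5.9, 7.1])
n = x.size

Sxx = np.sum((x - x.mean()) ** 2)
Sxy = np.sum((x - x.mean()) * (y - y.mean()))
beta1_hat = Sxy / Sxx
beta0_hat = y.mean() - beta1_hat * x.mean()

# s^2 = SSE / (n - 2) and the estimated standard error of the slope
resid = y - (beta0_hat + beta1_hat * x)
s = np.sqrt(np.sum(resid ** 2) / (n - 2))
se_beta1 = s / np.sqrt(Sxx)

alpha = 0.05
t_crit = stats.t.ppf(1 - alpha / 2, df=n - 2)
lower = beta1_hat - t_crit * se_beta1
upper = beta1_hat + t_crit * se_beta1
print(lower, upper)
```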

Hypothesis test procedures Null hypothesis: $H_0: \beta_1 = \beta_{10}$. Test statistic: $t = (\hat{\beta}_1 - \beta_{10})/s_{\hat{\beta}_1}$. Alternative hypothesis and rejection region: $H_a: \beta_1 > \beta_{10}$, reject if $t \ge t_{\alpha,\,n-2}$; $H_a: \beta_1 < \beta_{10}$, reject if $t \le -t_{\alpha,\,n-2}$; $H_a: \beta_1 \ne \beta_{10}$, reject if $|t| \ge t_{\alpha/2,\,n-2}$. The test of $H_0: \beta_1 = 0$ versus $H_a: \beta_1 \ne 0$ tests the usefulness of the model.
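
For this model utility test, scipy.stats.linregress already reports the slope, its standard error, and the two-sided p-value for $H_0: \beta_1 = 0$; a minimal sketch, again on the made-up data:

```python
import numpy as np
from scipy import stats

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 2.9, 3.8, 5.2, 5.9, 7.1])
n = x.size

res = stats.linregress(x, y)
t_stat = res.slope / res.stderr          # test statistic for H0: beta1 = 0
print(t_stat, res.pvalue)                # p-value for the two-sided alternative

# Equivalent rejection-region check at level alpha = 0.05
alpha = 0.05
print(abs(t_stat) >= stats.t.ppf(1 - alpha / 2, df=n - 2))
```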

Correlation The sample correlation coefficient $r$ gives a measure of the linear relationship between $X$ and $Y$. Whereas for linear regression the $X$ variable is fixed, here it doesn't matter which variable is called $X$ and which is called $Y$. The statistic $r$ is related to the coefficient of determination $r^2$ in simple linear regression, and forms an estimate of the population correlation coefficient $\rho$.

The sample correlation coefficient The sample correlation coefficient for the $n$ pairs $(x_1, y_1), \dots, (x_n, y_n)$ is $r = S_{xy}/\sqrt{S_{xx} S_{yy}}$, where $S_{xy} = \sum_i (x_i - \bar{x})(y_i - \bar{y})$ and $S_{yy} = \sum_i (y_i - \bar{y})^2$. Recall that $\hat{\beta}_1 = S_{xy}/S_{xx}$, so that the estimated slope and $r$ have the same sign.
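
A short sketch computing $r$ from the sums of squares and cross-products and checking it against np.corrcoef (same made-up data as before):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 2.9, 3.8, 5.2, 5.9, 7.1])

Sxx = np.sum((x - x.mean()) ** 2)
Syy = np.sum((y - y.mean()) ** 2)
Sxy = np.sum((x - x.mean()) * (y - y.mean()))

r = Sxy / np.sqrt(Sxx * Syy)
print(r, np.corrcoef(x, y)[0, 1])              # same value

# The estimated slope Sxy / Sxx has the same sign as r, since Sxx, Syy > 0
print(np.sign(Sxy / Sxx) == np.sign(r))
```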

Properties of r The value of $r$ is independent of the units in which $x$ and $y$ are measured. $r$ lies in the interval $[-1, 1]$. $r = 1$ if and only if all pairs lie on a straight line with positive slope, and $r = -1$ if and only if all pairs lie on a straight line with negative slope. The square of the correlation coefficient gives the value of the coefficient of determination from fitting the simple linear regression model. $r$ measures only the degree of the linear relationship between $x$ and $y$.

When is the correlation strong? A common rule of thumb calls the correlation weak if $|r| \le 0.5$, moderate if $0.5 < |r| < 0.8$, and strong if $|r| \ge 0.8$. The rationale for calling correlations weak even when they are as large in absolute value as 0.5 is that even in that case $r^2 = 0.25$, so that the linear model explains at most 25% of the observed variation, which is not very impressive.

Inferences about the population correlation coefficient We can think of the pairs $(X_i, Y_i)$ as having been drawn from a bivariate population of pairs, with some joint pmf or pdf, and correlation $\rho$. When the joint pdf is bivariate normal, one can carry out inference for $\rho$. Let $(X, Y)$ be bivariate normal with respective means $\mu_1, \mu_2$, variances $\sigma_1^2, \sigma_2^2$, and correlation coefficient $\rho$.

Inferences about the population correlation coefficient (continued) If $X = x$, it can be shown that the (conditional) distribution of $Y$ is normal with mean $\mu_{Y \mid x} = \mu_2 + \rho \frac{\sigma_2}{\sigma_1}(x - \mu_1)$ and variance $\sigma_2^2(1 - \rho^2)$. This fits the simple linear regression model with $\beta_0 = \mu_2 - \rho(\sigma_2/\sigma_1)\mu_1$, $\beta_1 = \rho\,\sigma_2/\sigma_1$, and $\sigma^2 = \sigma_2^2(1 - \rho^2)$.

Inferences about the population correlation coefficient (continued) The implication is that if the observed pairs are actually drawn from a bivariate normal distribution, then the simple linear regression model is an appropriate way of studying the behavior of $Y$ given $X = x$. If $\rho = 0$, then $\mu_{Y \mid x} = \mu_2$, independent of $x$.

Testing for the absence of correlation When $H_0: \rho = 0$ is true, the test statistic $T = \frac{r\sqrt{n-2}}{\sqrt{1 - r^2}}$ has a t distribution with $n-2$ d.f. Alternative hypothesis and rejection region: $H_a: \rho > 0$, reject if $t \ge t_{\alpha,\,n-2}$; $H_a: \rho < 0$, reject if $t \le -t_{\alpha,\,n-2}$; $H_a: \rho \ne 0$, reject if $|t| \ge t_{\alpha/2,\,n-2}$.
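
A sketch of the test of $H_0: \rho = 0$ against the two-sided alternative, with scipy.stats.pearsonr shown as a cross-check since it carries out the same test (made-up data as above):

```python
import numpy as np
from scipy import stats

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 2.9, 3.8, 5.2, 5.9, 7.1])
n = x.size

r = np.corrcoef(x, y)[0, 1]

# Test statistic and two-sided p-value for H0: rho = 0
t_stat = r * np.sqrt(n - 2) / np.sqrt(1 - r ** 2)
p_value = 2 * stats.t.sf(abs(t_stat), df=n - 2)
print(t_stat, p_value)

# Cross-check: pearsonr returns r and the same two-sided p-value
print(stats.pearsonr(x, y))
```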

Testing for absence of correlation (continued) The null hypothesis $H_0: \rho = 0$ states that there is no linear relationship between $X$ and $Y$ in the population. In the context of regression analysis, we used the model utility test to test for the absence of a linear relationship ($H_0: \beta_1 = 0$). Since $\hat{\beta}_1 / s_{\hat{\beta}_1} = \frac{r\sqrt{n-2}}{\sqrt{1 - r^2}}$, the tests are equivalent.
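
A quick numeric check of this equivalence on the same made-up data, comparing the slope t statistic with the correlation t statistic:

```python
import numpy as np
from scipy import stats

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 2.9, 3.8, 5.2, 5.9, 7.1])
n = x.size

res = stats.linregress(x, y)
t_slope = res.slope / res.stderr                     # t for H0: beta1 = 0

r = res.rvalue
t_corr = r * np.sqrt(n - 2) / np.sqrt(1 - r ** 2)    # t for H0: rho = 0

print(np.isclose(t_slope, t_corr))                   # True: the two tests agree
```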