STA 291 Spring 2008 Lecture 23 Dustin Lueker.


Comparing Dependent Samples
- The sample mean of the difference scores is an estimator for the difference between the population means.
- We can now use exactly the same methods as for one sample: replace X_i with D_i.
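A minimal sketch of this idea in Python, using invented before/after scores (neither the data nor the code come from the lecture): the difference scores are formed once and then treated exactly like a single sample.

```python
# Forming difference scores D_i = X1_i - X2_i so the usual one-sample
# methods (mean, standard deviation, t procedures) can be applied to them.
import numpy as np

exam1 = np.array([72, 65, 80, 58, 77, 69, 74, 61])  # hypothetical first scores (X1)
exam2 = np.array([75, 70, 82, 63, 79, 74, 78, 66])  # hypothetical second scores (X2)

d = exam1 - exam2                 # difference scores D_i
print(d.mean(), d.std(ddof=1))    # sample mean and sd of the differences
```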

Comparing Dependent Samples
- Small-sample confidence interval for the mean difference: $\bar{D} \pm t_{n-1,\,\alpha/2}\,\dfrac{s_D}{\sqrt{n}}$
- Note: when n is large (greater than 30), we can use the z-scores instead of the t-scores.
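A sketch of this interval in Python, again with made-up paired scores (the data and the 95% level are illustrative assumptions, not from the slides):

```python
# 95% small-sample confidence interval for the mean of the difference scores,
# using the t-distribution with df = n - 1.
import numpy as np
from scipy import stats

before = np.array([72, 65, 80, 58, 77, 69, 74, 61])  # hypothetical scores
after  = np.array([75, 70, 82, 63, 79, 74, 78, 66])

d = after - before
n = len(d)
d_bar = d.mean()
s_d = d.std(ddof=1)

t_crit = stats.t.ppf(0.975, df=n - 1)                 # two-sided 95% critical value
half_width = t_crit * s_d / np.sqrt(n)
print(f"95% CI: ({d_bar - half_width:.2f}, {d_bar + half_width:.2f})")
```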

Comparing Dependent Samples
- Small-sample test statistic for testing the difference in the population means: $t = \dfrac{\bar{D}}{s_D/\sqrt{n}}$
- For small n, use the t-distribution with df = n - 1.
- For large n, use the normal distribution (z value) instead.
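A sketch of the test statistic in Python (same invented scores; scipy's built-in paired test is shown only as a cross-check and is not part of the lecture):

```python
# Paired t test statistic computed by hand and with scipy.stats.ttest_rel.
import numpy as np
from scipy import stats

before = np.array([72, 65, 80, 58, 77, 69, 74, 61])  # hypothetical scores
after  = np.array([75, 70, 82, 63, 79, 74, 78, 66])

d = after - before
n = len(d)
t_by_hand = d.mean() / (d.std(ddof=1) / np.sqrt(n))  # t = D-bar / (s_D / sqrt(n))

t_scipy, p_value = stats.ttest_rel(after, before)    # equivalent built-in paired test
print(t_by_hand, t_scipy, p_value)                    # the two t values agree
```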

Reducing Variability
- Variability in the difference scores may be less than the variability in the original scores.
- This happens when the scores in the two samples are strongly associated.
- Subjects who score high before the intensive training also tend to score high after the intensive training.
- Thus the spread those high (and low) scorers create within each individual sample largely cancels out of the difference scores.
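The reason can be stated with the standard variance identity for a difference (this formula is not on the slide; it is added here for clarity):

$$
\operatorname{Var}(D) \;=\; \operatorname{Var}(X_1 - X_2) \;=\; \operatorname{Var}(X_1) + \operatorname{Var}(X_2) - 2\,\operatorname{Cov}(X_1, X_2)
$$

When the two sets of scores are strongly positively associated, the covariance term is large, so the difference scores are less variable than the original scores.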

Example
If we wanted to examine the improvement students made after taking a class, what type of value would we hope to see for the mean difference score, assuming we take X_1 - X_2 with X_1 being the student's first exam score?
- Positive
- Negative

Example
Assuming we match people of similar health into 2 groups, give group 1 a cholesterol medication, and measure each group's cholesterol level after 8 weeks, what would we hope the mean difference would be if we are subtracting group 2 from group 1?
- Positive
- Negative
- Zero

An Introduction to Regression
- Regression is the process of using sample information about explanatory variables (independent variables) to predict the value of a response variable (dependent variable).
- There are many types of regression: from one explanatory variable to many, and linear, quadratic, cubic, logistic, exponential, etc.

Simple Linear Regression
- Uses one explanatory variable to predict a response variable.
- This is the only type of regression we will look at here.
- Model: $y = \beta_0 + \beta_1 x + \varepsilon$
  - y = dependent (response) variable
  - x = independent (explanatory) variable
  - $\beta_0$ = y-intercept
  - $\beta_1$ = slope of the line (defined as rise/run)
  - $\varepsilon$ = error variable

Simple Linear Regression
- Model we will use in problems: $\hat{y} = b_0 + b_1 x$
  - $\hat{y}$ = dependent variable (the estimate of y)
  - x = independent variable
  - $b_0$ = y-intercept
  - $b_1$ = slope of the line (defined as rise/run)
- Example: estimating college GPA from ACT score
  - College GPA would be our dependent (response) variable.
  - ACT score would be our independent (explanatory) variable.
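A sketch of this model fit by least squares in Python; the ACT/GPA numbers below are invented for illustration and are not data from the course:

```python
# Fit y-hat = b0 + b1*x by least squares and use it to estimate GPA from ACT.
import numpy as np

act = np.array([18, 21, 24, 26, 28, 30, 32])          # hypothetical ACT scores (x)
gpa = np.array([2.4, 2.7, 2.9, 3.1, 3.3, 3.5, 3.8])   # hypothetical college GPAs (y)

b1 = np.sum((act - act.mean()) * (gpa - gpa.mean())) / np.sum((act - act.mean()) ** 2)
b0 = gpa.mean() - b1 * act.mean()

y_hat = b0 + b1 * 27                                   # estimated GPA at an ACT score of 27
print(f"b0 = {b0:.3f}, b1 = {b1:.3f}, estimated GPA = {y_hat:.2f}")
```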

Simple Linear Regression
- Notice that the equation is for $\hat{y}$, which is an estimator of y.
- When using a regression model, it is important to remember that it will not exactly predict y, but rather give an estimate of what we would expect y to be.
- This is why we don't need the error term (ε) in the model we use: the error is accepted, since we are simply estimating what we would expect the value of y to be given x, basically estimating y.

Analyzing the Model
- Correlation coefficient R, with -1 ≤ R ≤ 1 (sometimes written as a lower-case r)
- Measures how strong the linear relationship is between the response and explanatory variables, as well as its direction
  - The sign (±) indicates a positive or a negative relationship
    - Positive means our estimate of y goes up as x goes up
    - Negative means our estimate of y goes down as x goes up
- The closer |R| is to one, the stronger the relationship between the response and explanatory variables
- R = 0 indicates no linear relationship
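A short sketch of computing R in Python, with invented data (NumPy's corrcoef is a convenience here, not something used in the lecture):

```python
# Correlation coefficient between x and y: the sign gives direction, |R| gives strength.
import numpy as np

x = np.array([18, 21, 24, 26, 28, 30, 32])
y = np.array([2.4, 2.7, 2.9, 3.1, 3.3, 3.5, 3.8])

r = np.corrcoef(x, y)[0, 1]      # off-diagonal entry of the 2x2 correlation matrix
print(f"R = {r:.3f}")            # close to +1 here: a strong positive linear relationship
```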

Analyzing the Model
- Coefficient of determination, denoted R²
  - Calculated by squaring the correlation coefficient
- Interpretation: the percent of variation in the response variable that is explained by the model
  - For simple linear regression, this is the percent of variation in y that is explained by x, since our model only has one explanatory variable
- The higher the R² value the better, because we can explain more of the variation in our response variable, which is the one we want to examine
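A sketch (same invented data as above) showing that for simple linear regression R² equals the squared correlation and also equals the share of the variation in y explained by the fitted line:

```python
# R-squared two ways: as r**2 and as 1 - SSE/SST from the least-squares fit.
import numpy as np

x = np.array([18, 21, 24, 26, 28, 30, 32])
y = np.array([2.4, 2.7, 2.9, 3.1, 3.3, 3.5, 3.8])

r = np.corrcoef(x, y)[0, 1]
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x

sse = np.sum((y - y_hat) ** 2)       # variation left unexplained by the line
sst = np.sum((y - y.mean()) ** 2)    # total variation in y
print(r ** 2, 1 - sse / sst)         # the two values agree
```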

Example
If the correlation coefficient is -.7, what would be the coefficient of determination?
- .7
- -.7
- .49
- -.49
Would larger values for the explanatory variable (x) yield larger or smaller values for the response variable (y)?

Example
If model A has a correlation coefficient of .7, what would the correlation coefficient of model B need to be for us to be able to say that B is the better model?
- .35
- -.6
- -.9

Example
If the slope of our simple linear regression equation is 13 and the y-intercept is -2, what would $\hat{y}$ be if x = 3?
- 39
- 41
- 37
- -23
What would y be?