Kin 304 Regression: Linear Regression, Least Sum of Squares

Kin 304 Regression
- Linear Regression
- Least Sum of Squares
- Assumptions about the relationship between Y and X
- Standard Error of Estimate
- Multiple Regression
- Standardized Regression

Prediction
Can we predict one variable from another?
Linear Regression Analysis: Y = mX + c, where m = slope and c = intercept.
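The slope and intercept of Y = mX + c can be found by least squares. A minimal sketch (the x and y values below are made up, for illustration only):

```python
def fit_line(x, y):
    """Least-squares slope m and intercept c for Y = mX + c."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    # m = sum((x - mean_x)(y - mean_y)) / sum((x - mean_x)^2)
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    m = sxy / sxx
    c = mean_y - m * mean_x  # the fitted line passes through the means
    return m, c

x = [1, 2, 3, 4, 5]
y = [2.1, 4.0, 6.2, 7.9, 10.1]
m, c = fit_line(x, y)
```

For these data the fit recovers a slope near 2, matching the visible trend in the pairs.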

Linear Regression
- Correlation Coefficient (r): how well the line fits.
- Standard Error of Estimate (S.E.E.): how well the line predicts.
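A sketch of the correlation coefficient r, the fit measure named above (the paired data are hypothetical):

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two paired score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

r = pearson_r([1, 2, 3, 4, 5], [2.1, 4.0, 6.2, 7.9, 10.1])
```

An r near +1 or -1 means the points lie close to a straight line; an r near 0 means the line fits poorly.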

Least Sum of Squares
Curve fitting chooses the line that minimizes the sum of squared residuals, where a residual is the observed Y minus the Y predicted by the line.
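A small sketch of the criterion: the least-squares line has a smaller sum of squared residuals than any competing line. The data and the alternative line below are hypothetical.

```python
def sum_sq_residuals(x, y, m, c):
    """Sum of squared residuals (observed Y minus predicted Y) for line Y = mX + c."""
    return sum((yi - (m * xi + c)) ** 2 for xi, yi in zip(x, y))

x = [1, 2, 3, 4, 5]
y = [2.1, 4.0, 6.2, 7.9, 10.1]
ls = sum_sq_residuals(x, y, 1.99, 0.09)    # least-squares line for these data
other = sum_sq_residuals(x, y, 2.2, -0.5)  # any other candidate line does worse
```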

Assumptions about the relationship between Y and X
- For each value of X there is a normal distribution of Y from which the sample value of Y is drawn.
- The population of values of Y corresponding to a selected X has a mean that lies on the straight line.
- In each population, the standard deviation of Y about its mean has the same value (homoscedasticity).

Standard Error of Estimate
- A measure of how well the equation predicts Y; it has the units of Y.
- The true score lies within plus or minus 1 S.E.E. of the predicted score 68.26% of the time.
- It is the standard deviation of the normal distribution of residuals.
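The S.E.E. can be sketched as the standard deviation of the residuals, computed with n - 2 degrees of freedom because both the slope and the intercept are estimated from the sample. The data and fitted line below are hypothetical.

```python
def standard_error_of_estimate(x, y, m, c):
    """S.E.E. for the line Y = mX + c: sqrt(sum of squared residuals / (n - 2))."""
    n = len(x)
    ss_res = sum((yi - (m * xi + c)) ** 2 for xi, yi in zip(x, y))
    return (ss_res / (n - 2)) ** 0.5

see = standard_error_of_estimate([1, 2, 3, 4, 5],
                                 [2.1, 4.0, 6.2, 7.9, 10.1],
                                 1.99, 0.09)
```

Because the S.E.E. is in the units of Y, it can be read directly as a typical prediction error.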

Example: Right Hand L. = 0.99 × Left Hand L. + 0.254, with r = 0.94 and S.E.E. = 0.38 cm (Kin 304, Spring 2009).
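Using the slide's equation to make a prediction and band it with the S.E.E. (the 19.0 cm input value is hypothetical):

```python
# Slide's equation: Right Hand L. = 0.99 * Left Hand L. + 0.254, S.E.E. = 0.38 cm
left_cm = 19.0                                  # hypothetical left-hand length
predicted_right = 0.99 * left_cm + 0.254
see = 0.38
# the true score falls within +/- 1 S.E.E. of the prediction ~68.26% of the time
low, high = predicted_right - see, predicted_right + see
```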

How good is my equation?
Regression equations are sample specific.
- Cross-validation studies: test your equation on a different sample.
- Split-sample studies: develop your equation on a 50% random sample, then test it on the other 50% of the sample.
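A sketch of the split-sample procedure: fit the equation on a random half of the data, then measure its error on the held-out half. The data, noise level, and seed below are all hypothetical.

```python
import random

def fit_line(x, y):
    """Least-squares slope and intercept for Y = mX + c."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    m = (sum((a - mx) * (b - my) for a, b in zip(x, y))
         / sum((a - mx) ** 2 for a in x))
    return m, my - m * mx

random.seed(304)                    # hypothetical seed, for repeatability
# simulated pairs with true slope 2 plus noise
pairs = [(i, 2.0 * i + random.gauss(0, 0.5)) for i in range(20)]
random.shuffle(pairs)
half = len(pairs) // 2
train, holdout = pairs[:half], pairs[half:]
m, c = fit_line([p[0] for p in train], [p[1] for p in train])
# cross-validation error on the half the equation has never seen
ss = sum((y - (m * x + c)) ** 2 for x, y in holdout)
see_cv = (ss / (len(holdout) - 2)) ** 0.5
```

If the cross-validation S.E.E. is much larger than the S.E.E. from the development sample, the equation does not generalize beyond its own sample.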

Multiple Regression
- More than one independent variable: Y = m1X1 + m2X2 + m3X3 ... + c.
- r and S.E.E. have the same meaning; there are simply more measures used to predict Y.
- Stepwise regression: variables are entered into the equation based upon their relative importance.
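A sketch of multiple regression with two independent variables, solved by least squares (numpy's `lstsq`); the data are hypothetical and constructed to lie exactly on the plane Y = 3·X1 + 0.5·X2 + 1, so the fit recovers those coefficients.

```python
import numpy as np

x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
y = 3.0 * x1 + 0.5 * x2 + 1.0            # exact plane, no noise

# design matrix: one column per predictor plus a column of ones for c
A = np.column_stack([x1, x2, np.ones_like(x1)])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
m1, m2, c = coef
```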

Building a multiple regression equation
X1 has the highest correlation with Y, so it is the first variable included in the equation. X3 has a higher correlation with Y than X2; however, X2 is the better choice to include with X1 to predict Y, because X2 has a low correlation with X1 and explains some of the variance in Y that X1 does not.
(Venn diagram of shared variance among Y, X1, X2 and X3.)
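This selection logic can be sketched with simulated data: X3 correlates more strongly with Y than X2 does, yet adding X2 to X1 raises R-squared more than adding X3, because X3 largely duplicates X1. All values below are simulated and hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)             # hypothetical seed
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)                    # nearly uncorrelated with x1
x3 = x1 + 0.1 * rng.normal(size=n)         # almost a copy of x1
y = x1 + 0.5 * x2 + 0.5 * rng.normal(size=n)

def r_squared(cols, y):
    """R^2 of a least-squares fit of y on the given predictor columns."""
    A = np.column_stack(cols + [np.ones(len(y))])
    fit = A @ np.linalg.lstsq(A, y, rcond=None)[0]
    return 1.0 - np.sum((y - fit) ** 2) / np.sum((y - y.mean()) ** 2)

r2_with_x2 = r_squared([x1, x2], y)        # x1 plus the non-overlapping predictor
r2_with_x3 = r_squared([x1, x3], y)        # x1 plus the redundant predictor
```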

Standardized Regression
- The numerical value of each coefficient mn depends on the scale of its independent variable: Y = m1X1 + m2X2 + m3X3 ... + c.
- If the variables are transformed into standard scores before the regression analysis, the mean and standard deviation of every independent variable become 0 and 1 respectively.
- The numerical value of the standardized coefficient zmn then represents the relative importance of that independent variable to the prediction: Y = zm1X1 + zm2X2 + zm3X3 ... + c.
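A sketch of why standardizing helps: below, X1 and X2 contribute equally to Y but sit on very different scales, so their raw slopes differ about 100-fold, while the beta weights from the z-score fit come out similar. All data are simulated and hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)            # hypothetical seed
n = 100
x1 = rng.normal(0.0, 1.0, n)              # small-scale variable
x2 = rng.normal(0.0, 100.0, n)            # large-scale variable
# equal real contributions: sd of 2*x1 and of 0.02*x2 are both 2
y = 2.0 * x1 + 0.02 * x2 + rng.normal(0.0, 0.5, n)

def z(v):
    """Transform a variable to standard scores (mean 0, sd 1)."""
    return (v - v.mean()) / v.std()

# refit on z-scores: the coefficients are now the beta weights
A = np.column_stack([z(x1), z(x2), np.ones(n)])
beta, *_ = np.linalg.lstsq(A, z(y), rcond=None)
```

Comparing beta[0] and beta[1] directly shows the two predictors matter about equally, which the raw slopes (2.0 versus 0.02) hide.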