Linear Regression
James H. Steiger

Regression – The General Setup
You have a set of data on two variables, X and Y, represented in a scatter plot. You wish to find a simple, convenient mathematical function that comes close to most of the points, thereby describing succinctly the relationship between X and Y.

Linear Regression
The straight line is a particularly simple function. When we fit a straight line to data, we are performing linear regression analysis.

Linear Regression – The Goal
Find the “best fitting” straight line for a set of data. Since every straight line satisfies an equation of the form $Y = bX + c$, with slope $b$ and Y-intercept $c$, our task is to find the $b$ and $c$ that produce the best fit. There are many ways of operationalizing the notion of a “best fitting straight line.” A very popular choice is the “Least Squares Criterion.”

The Least Squares Criterion
For every data point $(X_i, Y_i)$, define the “predicted score” $\hat{Y}_i = bX_i + c$ to be the value of the straight line evaluated at $X_i$. The “regression residual,” or “error of prediction,” $E_i = Y_i - \hat{Y}_i$, is the distance from the straight line to the data point in the up-down (vertical) direction.
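
As a concrete illustration, here is a minimal Python sketch that computes predicted scores and residuals for a candidate line; the data, slope, and intercept values are made up for the example.

```python
import numpy as np

# Hypothetical data and candidate line; values are illustrative only.
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 2.9, 4.2, 4.8, 6.1])
b, c = 1.0, 1.0  # slope and Y-intercept of a candidate line

Y_hat = b * X + c  # predicted scores: the line evaluated at each X_i
E = Y - Y_hat      # residuals: vertical distance from each point to the line

print(Y_hat)
print(E)
```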

The Least Squares Criterion

The least squares criterion operationalizes the notion of the best fitting line as the choice of $b$ and $c$ that minimizes the sum of squared “residuals,” or errors, $\sum_i E_i^2 = \sum_i (Y_i - bX_i - c)^2$. Solving this minimization is an elementary problem in differential calculus. The method of solution need not concern us here, but the result is famous, important, and should be memorized.
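
The presentation skips the calculus, but as a brief sketch, setting the partial derivatives of the criterion with respect to $c$ and $b$ equal to zero gives the so-called normal equations:

$$\frac{\partial}{\partial c}\sum_i (Y_i - bX_i - c)^2 = -2\sum_i (Y_i - bX_i - c) = 0,$$
$$\frac{\partial}{\partial b}\sum_i (Y_i - bX_i - c)^2 = -2\sum_i X_i\,(Y_i - bX_i - c) = 0.$$

Solving these two equations simultaneously for $b$ and $c$ yields the solution on the next slide.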

The Least Squares Solution
The values of $b$ and $c$ that minimize the sum of squared residuals are
$$b = \frac{\sum_i (X_i - \bar{X})(Y_i - \bar{Y})}{\sum_i (X_i - \bar{X})^2} = r_{XY}\,\frac{s_Y}{s_X}, \qquad c = \bar{Y} - b\,\bar{X},$$
where $r_{XY}$ is the Pearson correlation between X and Y, and $s_X$ and $s_Y$ are the standard deviations of X and Y.
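
As a quick check on these formulas, here is a minimal sketch (reusing the made-up data from the earlier example) that computes $b$ and $c$ directly and compares them with numpy's built-in degree-1 least squares fit.

```python
import numpy as np

# Illustrative data (same made-up values as before).
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 2.9, 4.2, 4.8, 6.1])

# Closed-form least squares solution.
b = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
c = Y.mean() - b * X.mean()

# Cross-check against numpy's degree-1 polynomial fit.
b_np, c_np = np.polyfit(X, Y, 1)

print(b, c)        # slope and intercept from the formulas
print(b_np, c_np)  # should agree up to rounding
```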

The Standardized Least Squares Solution
Suppose X and Y are both in Z score form. What will the equations for the least squares coefficients b and c reduce to? (C.P.)
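
For readers who want to check their answer numerically, here is a small sketch (using the same illustrative data) that converts both variables to Z scores, refits the line, and prints the Pearson correlation alongside the fitted coefficients.

```python
import numpy as np

X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 2.9, 4.2, 4.8, 6.1])

# Convert both variables to Z scores (mean 0, standard deviation 1).
zX = (X - X.mean()) / X.std(ddof=1)
zY = (Y - Y.mean()) / Y.std(ddof=1)

b_z, c_z = np.polyfit(zX, zY, 1)  # least squares fit on the standardized data
r = np.corrcoef(X, Y)[0, 1]       # Pearson correlation of the original data

print(b_z, c_z, r)  # compare the standardized slope and intercept with r
```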

Variance of Predicted and Error Scores
When you compute a regression line, you can “plug in” all the X scores into the equation for the regression line, and obtain predicted and error scores for each person. If you do this, it is well known that the variances of the predicted and error scores are $s_{\hat{Y}}^2 = r_{XY}^2\, s_Y^2$ and $s_E^2 = (1 - r_{XY}^2)\, s_Y^2$. Moreover, the predicted and error scores are always precisely uncorrelated, that is, $r_{\hat{Y}E} = 0$.
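
A minimal numerical check of these identities, again on the made-up example data:

```python
import numpy as np

X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 2.9, 4.2, 4.8, 6.1])

b, c = np.polyfit(X, Y, 1)  # least squares slope and intercept
Y_hat = b * X + c           # predicted scores
E = Y - Y_hat               # error (residual) scores

r = np.corrcoef(X, Y)[0, 1]

print(np.var(Y_hat, ddof=1), r**2 * np.var(Y, ddof=1))    # variance of predictions vs r^2 * s_Y^2
print(np.var(E, ddof=1), (1 - r**2) * np.var(Y, ddof=1))  # variance of errors vs (1 - r^2) * s_Y^2
print(np.corrcoef(Y_hat, E)[0, 1])                        # correlation of predicted and error scores (about 0)
```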

Variance of Predicted and Error Scores
Can you prove the results on the preceding page, using linear transformation and linear combination theory?
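
One route the proof can take, sketched here with standard rules for variances of linear transformations and linear combinations (writing $\hat{Y} = bX + c$ with $b = r_{XY}\, s_Y / s_X$):

$$s_{\hat{Y}}^2 = \operatorname{Var}(bX + c) = b^2 s_X^2 = \left(r_{XY}\frac{s_Y}{s_X}\right)^{\!2} s_X^2 = r_{XY}^2\, s_Y^2,$$
$$\operatorname{Cov}(\hat{Y}, E) = \operatorname{Cov}(\hat{Y},\, Y - \hat{Y}) = b\operatorname{Cov}(X, Y) - s_{\hat{Y}}^2 = r_{XY}^2\, s_Y^2 - r_{XY}^2\, s_Y^2 = 0,$$
$$s_E^2 = \operatorname{Var}(Y - \hat{Y}) = s_Y^2 + s_{\hat{Y}}^2 - 2\operatorname{Cov}(Y, \hat{Y}) = s_Y^2 + r_{XY}^2\, s_Y^2 - 2\, r_{XY}^2\, s_Y^2 = (1 - r_{XY}^2)\, s_Y^2.$$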