Presentation transcript:

Definition

Regression Model: $Y_i = \beta_0 + \beta_1 X_i + \varepsilon_i$

Regression Equation: $\hat{Y}_i = \hat{\beta}_0 + \hat{\beta}_1 X_i$

Given a collection of paired data, the regression equation algebraically describes the relationship between the two variables.

Notation (population parameter / estimate):

- $\beta_0$ / $\hat{\beta}_0$ — y-intercept of the regression equation
- $\beta_1$ / $\hat{\beta}_1$ — slope of the regression equation
- $Y_i$ / $\hat{Y}_i$ — dependent (response) variable / its predicted value
- $X_i$ — independent (explanatory) variable
- $\varepsilon$ — residual (error)

Definition

Regression Equation: given a collection of paired data, the regression equation $\hat{Y}_i = \hat{\beta}_0 + \hat{\beta}_1 X_i$ algebraically describes the relationship between the two variables.

Regression Line (line of best fit, or least-squares line): the graph of the regression equation.

Definitions

Residual (error): for a sample of paired $(x, y)$ data, the difference $(y - \hat{y})$ between an observed sample $y$-value and $\hat{y}$, the value of $y$ predicted by the regression equation.

Least-Squares Property: a straight line satisfies this property if the sum of the squares of the residuals is the smallest sum possible.
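The least-squares property can be sketched numerically: among all candidate lines, the least-squares line gives the smallest sum of squared residuals. The data and both candidate lines below are illustrative assumptions, not values from the slides.

```python
# Illustrative paired data (assumed, not from the slides)
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 4.3, 5.9, 8.2, 9.8]

def sum_squared_residuals(b0, b1, x, y):
    """Sum of (y - yhat)^2 for the candidate line yhat = b0 + b1*x."""
    return sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))

# An arbitrary guessed line vs. the least-squares line for this data
ssr_guess = sum_squared_residuals(0.0, 2.5, x, y)   # 13.74
ssr_fit = sum_squared_residuals(0.27, 1.93, x, y)   # 0.123
print(ssr_fit < ssr_guess)  # → True: the least-squares line wins
```

Any other line through this data yields a sum of squared residuals larger than that of the least-squares line.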

[Figure: scatter plot of the paired data, with axes $x$ and $y$]

[Figure: scatter plot with a fitted regression line $\hat{y}$; four residuals are marked as vertical distances from the observed points to the line: $7$, $-13$, $-5$, and $11$]

Total Deviation: for a particular point $(x, y)$, the vertical distance $y - \bar{y}$, which is the distance between the point $(x, y)$ and the horizontal line passing through the sample mean $\bar{y}$.

Explained Deviation: the vertical distance $\hat{y} - \bar{y}$, which is the distance between the predicted $y$-value and the horizontal line passing through the sample mean $\bar{y}$.

Unexplained Deviation: the vertical distance $y - \hat{y}$, which is the vertical distance between the point $(x, y)$ and the regression line. (The distance $y - \hat{y}$ is also called a residual.)

[Figure: at $x = 5$, the observed point is $(5, 32)$, the predicted point on the regression line $\hat{y}$ is $(5, 25)$, and the mean line is $\bar{y} = 17$; the total deviation $y - \bar{y} = 15$ splits into the explained deviation $\hat{y} - \bar{y} = 8$ and the unexplained deviation $y - \hat{y} = 7$]

$(y - \bar{y}) = (\hat{y} - \bar{y}) + (y - \hat{y})$

(total deviation) = (explained deviation) + (unexplained deviation)

$\sum (y - \bar{y})^2 = \sum (\hat{y} - \bar{y})^2 + \sum (y - \hat{y})^2$

(total variation) = (explained variation) + (unexplained variation)

$\text{SST} = \text{SSR} + \text{SSE}$
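The decomposition SST = SSR + SSE holds exactly when $\hat{y}$ comes from the least-squares fit with an intercept, and it is easy to verify numerically. The data below is an illustrative assumption:

```python
import numpy as np

# Illustrative paired data (assumed)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 5.0, 4.0, 8.0, 9.0])

# Least-squares fit; np.polyfit returns [slope, intercept] for degree 1
b1, b0 = np.polyfit(x, y, 1)
y_hat = b0 + b1 * x
y_bar = y.mean()

sst = np.sum((y - y_bar) ** 2)      # total variation
ssr = np.sum((y_hat - y_bar) ** 2)  # explained variation
sse = np.sum((y - y_hat) ** 2)      # unexplained variation

print(np.isclose(sst, ssr + sse))  # → True
```

Note that for a line other than the least-squares fit, the cross term in the expansion does not vanish and the equality fails.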

$Q = \text{SSE} = \sum \varepsilon^2 = \sum (y - \hat{y})^2 = \sum (y - \hat{\beta}_0 - \hat{\beta}_1 x)^2$

Minimize $Q$ with respect to $\hat{\beta}_0$ and $\hat{\beta}_1$.
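As one intermediate step in this standard derivation, setting the partial derivatives of $Q$ to zero gives the normal equations:

```latex
\frac{\partial Q}{\partial \hat{\beta}_0}
  = -2 \sum \left( y - \hat{\beta}_0 - \hat{\beta}_1 x \right) = 0
\qquad
\frac{\partial Q}{\partial \hat{\beta}_1}
  = -2 \sum x \left( y - \hat{\beta}_0 - \hat{\beta}_1 x \right) = 0
```

These rearrange to $\sum y = n\hat{\beta}_0 + \hat{\beta}_1 \sum x$ and $\sum xy = \hat{\beta}_0 \sum x + \hat{\beta}_1 \sum x^2$, a $2 \times 2$ linear system whose solution is the pair of closed-form estimates that follow.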

 0 = (  y) (  x 2 ) - (  x) (  xy) n(  xy) - (  x) (  y) n(  x 2 ) - (  x) 2  1 = n(  x 2 ) - (  x) 2 ^ ^ ^ ^

Multiple Regression Model: $Y_k = \beta_0 + \beta_1 X_{1k} + \cdots + \beta_n X_{nk} + \varepsilon_k$

Polynomial Model: $Y_k = \beta_0 + \beta_1 X + \beta_2 X^2 + \cdots + \beta_k X^k + \varepsilon_k$

Multiple Regression Model (no intercept): $Y_k = \beta_1 X_{1k} + \cdots + \beta_n X_{nk} + \varepsilon_k$
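The polynomial model is itself a linear least-squares problem: the powers of $x$ become columns of a design matrix. A minimal sketch, using assumed data generated from $y = 1 + x^2$ so the recovered coefficients are known:

```python
import numpy as np

# Assumed data following y = 1 + x^2 exactly (no noise, for illustration)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 1.0 + x ** 2

# Columns [1, x, x^2] form the design matrix for a degree-2 polynomial model
X = np.vander(x, 3, increasing=True)

# Least-squares solve for the coefficients [b0, b1, b2]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(coef, 6))  # → approximately [1. 0. 1.]
```

Because the model is linear in the coefficients (even though it is nonlinear in $x$), no special machinery beyond ordinary least squares is needed.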

$Y = XB + e$

- $Y$ is the $n \times 1$ response vector
- $X$ is the $n \times (k + 1)$ design matrix
- $B$ is the $(k + 1) \times 1$ regression coefficients vector
- $e$ is the $n \times 1$ error (residual) vector
- $0 \le k \le n - 1$
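The matrix form maps directly onto array code: stack a column of ones with the predictor columns to get the $n \times (k+1)$ design matrix, then solve for $B$. The two predictors below are assumed, with $Y$ built from known coefficients so the solve can be checked:

```python
import numpy as np

# Two assumed predictors (k = 2), n = 5 observations
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 5.0])
y = 1.0 + 2.0 * x1 + 3.0 * x2   # exact, so the error vector e is zero

n = len(y)
X = np.column_stack([np.ones(n), x1, x2])   # n x (k + 1) design matrix
B, *_ = np.linalg.lstsq(X, y, rcond=None)   # (k + 1) x 1 coefficient vector

print(np.round(B, 6))  # → approximately [1. 2. 3.]
```

`lstsq` minimizes $\|Y - XB\|^2$, which is exactly the least-squares criterion $Q$ from the earlier slides written in matrix form.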

Exercise: describe $Y$, $X$, $B$, and $e$ for the model $Y_k = \beta_0 + \beta_1 X_{1k} + \beta_2 X_{2k} + \varepsilon_k$.
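A sketch of one possible answer, assuming $n$ observations and reading $X_{1k}$ as predictor 1 at observation $k$:

```latex
\underbrace{\begin{pmatrix} Y_1 \\ Y_2 \\ \vdots \\ Y_n \end{pmatrix}}_{Y:\; n \times 1}
=
\underbrace{\begin{pmatrix}
  1 & X_{11} & X_{21} \\
  1 & X_{12} & X_{22} \\
  \vdots & \vdots & \vdots \\
  1 & X_{1n} & X_{2n}
\end{pmatrix}}_{X:\; n \times 3}
\underbrace{\begin{pmatrix} \beta_0 \\ \beta_1 \\ \beta_2 \end{pmatrix}}_{B:\; 3 \times 1}
+
\underbrace{\begin{pmatrix} \varepsilon_1 \\ \varepsilon_2 \\ \vdots \\ \varepsilon_n \end{pmatrix}}_{e:\; n \times 1}
```

Here $k = 2$ predictors, so the design matrix has $k + 1 = 3$ columns: a column of ones for the intercept, then one column per predictor.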