FIN357 Li1 The Simple Regression Model: y = β₀ + β₁x + u

FIN357 Li2 Some Terminology. In the simple linear regression model y = β₀ + β₁x + u, we typically refer to y as the Dependent Variable, or Left-Hand Side Variable, or Explained Variable, or Regressand.

FIN357 Li3 Some Terminology. We typically refer to x as the Independent Variable, or Right-Hand Side Variable, or Explanatory Variable, or Regressor, or Control Variable.

FIN357 Li4 Some Assumptions. The average value of u, the error term, in the population is 0; that is, E(u) = 0. We further assume that u is mean-independent of x, so E(u|x) = 0. Taking expectations of the model conditional on x then gives E(y|x) = β₀ + β₁x.

FIN357 Li5 [Figure: E(y|x) = β₀ + β₁x as a linear function of x; for any value of x (e.g. x₁, x₂), the distribution of y, f(y), is centered about E(y|x).]

FIN357 Li6 Ordinary Least Squares (OLS). Let {(xᵢ, yᵢ): i = 1, …, n} denote a random sample of size n from the population. For each observation in this sample, it will be the case that yᵢ = β₀ + β₁xᵢ + uᵢ.
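To make the sampling setup concrete, here is a minimal Python sketch (my own, not from the slides) that simulates a random sample {(xᵢ, yᵢ)} from the population model; the parameter values β₀ = 1 and β₁ = 0.5 and the distributions of x and u are purely illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Hypothetical population parameters -- illustrative values only, not from the slides
beta0, beta1 = 1.0, 0.5
n = 100  # sample size

# Draw x, draw the error term u with E(u) = 0, then generate y from the model
x = rng.uniform(0.0, 10.0, size=n)
u = rng.normal(0.0, 1.0, size=n)
y = beta0 + beta1 * x + u
```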

FIN357 Li7 [Figure: Population regression line E(y|x) = β₀ + β₁x, the sample data points (x₁, y₁), …, (x₄, y₄), and the associated error terms u₁, …, u₄.]

FIN357 Li8 The basic idea of regression is to estimate the population parameters from a sample. Intuitively, OLS fits a line through the sample points such that the sum of squared residuals is as small as possible. The residual, û, is an estimate of the error term, u: it is the difference between the sample point and the fitted line (the sample regression function).
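As a rough numerical illustration of "smallest sum of squared residuals" (my own sketch, using simulated data with illustrative parameter values), we can compare the SSR of a few candidate lines; OLS picks the intercept and slope that make this quantity as small as possible.

```python
import numpy as np

rng = np.random.default_rng(seed=0)
x = rng.uniform(0.0, 10.0, size=100)
y = 1.0 + 0.5 * x + rng.normal(0.0, 1.0, size=100)  # illustrative data

def ssr(b0, b1):
    """Sum of squared residuals for the candidate line b0 + b1 * x."""
    residuals = y - (b0 + b1 * x)
    return np.sum(residuals ** 2)

# Lines far from the data produce a much larger SSR than the line near the truth.
for b0, b1 in [(0.0, 0.0), (2.0, 0.3), (1.0, 0.5)]:
    print(f"b0 = {b0}, b1 = {b1}: SSR = {ssr(b0, b1):.1f}")
```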

FIN357 Li9 [Figure: Sample regression line, the sample data points (x₁, y₁), …, (x₄, y₄), and the associated estimated error terms (residuals) û₁, …, û₄.]

FIN357 Li10 It can be shown that the estimated coefficients are β̂₁ = Σᵢ (xᵢ − x̄)(yᵢ − ȳ) / Σᵢ (xᵢ − x̄)² and β̂₀ = ȳ − β̂₁x̄.
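A minimal sketch of these closed-form formulas in Python (assuming x and y are NumPy arrays and x has some variation, so the denominator is not zero):

```python
import numpy as np

def ols_simple(x, y):
    """Closed-form OLS estimates for the simple regression y = b0 + b1 * x + u."""
    x_bar, y_bar = x.mean(), y.mean()
    b1_hat = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
    b0_hat = y_bar - b1_hat * x_bar
    return b0_hat, b1_hat

# Example with simulated data (illustrative parameter values only)
rng = np.random.default_rng(seed=0)
x = rng.uniform(0.0, 10.0, size=100)
y = 1.0 + 0.5 * x + rng.normal(0.0, 1.0, size=100)
print(ols_simple(x, y))  # estimates should be close to (1.0, 0.5)
```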

FIN357 Li11 OLS regressions. Now that we've derived the formulas for calculating the OLS estimates of our parameters (also called coefficients), you'll be happy to know you don't have to compute them by hand. Regressions in GRETL are very simple. Have you installed the software yet?
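The course software is GRETL; as an aside (my own sketch, not part of the slides), the same regression can be run in Python with the statsmodels package, and the resulting coefficients should match the closed-form formulas above.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(seed=0)
x = rng.uniform(0.0, 10.0, size=100)
y = 1.0 + 0.5 * x + rng.normal(0.0, 1.0, size=100)  # illustrative data

X = sm.add_constant(x)         # adds the intercept column
results = sm.OLS(y, X).fit()   # ordinary least squares fit
print(results.params)          # [b0_hat, b1_hat] -- same as the closed-form formulas
print(results.rsquared)        # R-squared of the regression
```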

FIN357 Li12 Some Properties of OLS. If x and y are positively correlated, the estimated slope will be positive. If x and y are negatively correlated, the estimated slope will be negative. The sum of the OLS residuals is zero. The OLS regression line always passes through the point of sample means, (x̄, ȳ).
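These last two properties are easy to check numerically; a quick sketch (my own, using the closed-form estimates and simulated data as above):

```python
import numpy as np

rng = np.random.default_rng(seed=0)
x = rng.uniform(0.0, 10.0, size=100)
y = 1.0 + 0.5 * x + rng.normal(0.0, 1.0, size=100)  # illustrative data

x_bar, y_bar = x.mean(), y.mean()
b1_hat = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
b0_hat = y_bar - b1_hat * x_bar

residuals = y - (b0_hat + b1_hat * x)
print(np.isclose(residuals.sum(), 0.0))            # residuals sum to zero
print(np.isclose(y_bar, b0_hat + b1_hat * x_bar))  # line passes through (x_bar, y_bar)
```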

FIN357 Li13 More terminology. The total sum of squares is SST = Σᵢ (yᵢ − ȳ)², the explained sum of squares is SSE = Σᵢ (ŷᵢ − ȳ)², and the residual sum of squares is SSR = Σᵢ ûᵢ². It can be shown that SST = SSE + SSR.

FIN357 Li14 Goodness-of-Fit. How do we think about how well our sample regression line fits our sample data? We can compute the fraction of the total sum of squares (SST) that is explained by the model; call this the R-squared of the regression: R² = SSE/SST = 1 − SSR/SST.
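A short sketch of this decomposition (my own illustration, same simulated data as before), verifying that SST = SSE + SSR and that both expressions give the same R²:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
x = rng.uniform(0.0, 10.0, size=100)
y = 1.0 + 0.5 * x + rng.normal(0.0, 1.0, size=100)  # illustrative data

x_bar, y_bar = x.mean(), y.mean()
b1_hat = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
b0_hat = y_bar - b1_hat * x_bar
y_hat = b0_hat + b1_hat * x

sst = np.sum((y - y_bar) ** 2)      # total sum of squares
sse = np.sum((y_hat - y_bar) ** 2)  # explained sum of squares
ssr = np.sum((y - y_hat) ** 2)      # residual sum of squares

print(np.isclose(sst, sse + ssr))   # SST = SSE + SSR
print(sse / sst, 1.0 - ssr / sst)   # both expressions give the same R-squared
```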

FIN357 Li15 Under some conditions, the OLS estimated coefficients are unbiased. Unbiasedness is a description of the estimator "on average": in a given sample, our estimate may be "near" or "far" from the true parameter.
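One way to see "unbiased on average, but near or far in any given sample" is a small Monte Carlo sketch (my own, with illustrative parameter values): repeatedly draw samples from a known population, estimate β₁ in each one, and compare the individual estimates with their average.

```python
import numpy as np

rng = np.random.default_rng(seed=0)
beta0, beta1 = 1.0, 0.5   # true (hypothetical) population parameters
n, n_sims = 100, 5000

b1_hats = np.empty(n_sims)
for s in range(n_sims):
    x = rng.uniform(0.0, 10.0, size=n)
    y = beta0 + beta1 * x + rng.normal(0.0, 1.0, size=n)
    x_bar = x.mean()
    b1_hats[s] = np.sum((x - x_bar) * (y - y.mean())) / np.sum((x - x_bar) ** 2)

print(b1_hats[:5])      # individual samples: estimates scatter around 0.5
print(b1_hats.mean())   # average over many samples: very close to the true beta1
```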