Simple Linear Regression

Deriving Estimators Method of Moments Estimation Ordinary Least Squares Estimation

Method of Moments
The Idea: Make assumptions about two “Moments” of the population distribution of u:
(1) E(u) = 0
(2) Cov(x, u) = 0, which together with (1) is equivalent to E(xu) = 0
Write these assumptions in terms of the PRF, y = β0 + β1x + u:
(1) E(y − β0 − β1x) = 0
(2) E[x(y − β0 − β1x)] = 0

Impose the assumptions on the SRF, replacing population moments with their sample analogues:
(1) (1/n) Σ (yi − β̂0 − β̂1xi) = 0
(2) (1/n) Σ xi(yi − β̂0 − β̂1xi) = 0
Solve these two equations for the two unknowns β̂0 and β̂1.

What have we found? The method of moments estimators:
β̂1 = Σ (xi − x̄)(yi − ȳ) / Σ (xi − x̄)²
β̂0 = ȳ − β̂1x̄
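As a minimal numerical sketch of the two estimators above (the data-generating values beta0 = 1.0, beta1 = 0.5 and the sample are hypothetical, chosen only for illustration):

```python
import numpy as np

# Hypothetical sample drawn from a known PRF: y = 1.0 + 0.5*x + u
rng = np.random.default_rng(0)
x = rng.normal(5.0, 2.0, size=200)
u = rng.normal(0.0, 1.0, size=200)
y = 1.0 + 0.5 * x + u

# Sample analogues of the two moment conditions solve to:
#   b1_hat = sum((x_i - xbar)(y_i - ybar)) / sum((x_i - xbar)^2)
#   b0_hat = ybar - b1_hat * xbar
b1_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0_hat = y.mean() - b1_hat * x.mean()

print(b0_hat, b1_hat)  # should land near the true values 1.0 and 0.5
```

With 200 observations the estimates should sit close to, but not exactly at, the true coefficients.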

ORDINARY LEAST SQUARES ESTIMATION
The Idea: Find estimators of the PRF regression coefficients that minimize the LOSS associated with making a mistake: the SUM OF SQUARED RESIDUALS (also called the sum of squared errors).
The residual: ûi = yi − ŷi = yi − β̂0 − β̂1xi
The Sum of Squared Residuals: SSR = Σ ûi² = Σ (yi − β̂0 − β̂1xi)²
How do we find β̂0 and β̂1? We need some calculus.

OLS: Minimize the Sum of Squared Residuals (SSR)
Find β̂0 and β̂1 that minimize Σ (yi − β̂0 − β̂1xi)².
Setting the two partial derivatives equal to zero generates the NORMAL EQUATIONS:
(1) Σ (yi − β̂0 − β̂1xi) = 0
(2) Σ xi(yi − β̂0 − β̂1xi) = 0
Solve these for the OLS estimators:
β̂1 = Σ (xi − x̄)(yi − ȳ) / Σ (xi − x̄)²
β̂0 = ȳ − β̂1x̄
These are the same estimators the method of moments delivered.
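A quick check that the closed-form solution really minimizes the SSR: perturbing the OLS coefficients in any direction must raise the sum of squared residuals. The sample below is hypothetical, for illustration only.

```python
import numpy as np

# Hypothetical sample
rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 100)
y = 2.0 - 0.3 * x + rng.normal(0, 1, 100)

def ssr(b0, b1):
    """Sum of squared residuals for candidate intercept b0 and slope b1."""
    return np.sum((y - b0 - b1 * x) ** 2)

# Closed-form OLS solution, i.e. the solution to the normal equations
b1_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0_hat = y.mean() - b1_hat * x.mean()
ssr_min = ssr(b0_hat, b1_hat)

# The SSR is strictly convex in (b0, b1) when x varies in the sample,
# so any perturbation of the OLS coefficients strictly raises it
print(ssr_min < ssr(b0_hat + 0.1, b1_hat))   # prints True
print(ssr_min < ssr(b0_hat, b1_hat + 0.05))  # prints True
```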

Properties of OLS Estimators
A. Sample Properties
Algebraic properties (these hold by virtue of OLS construction: the NORMAL EQUATIONS).
From NORMAL EQUATION 1: Σ ûi = 0, so the mean of the residuals is zero.
From NORMAL EQUATION 2: Σ xiûi = 0, so the residuals are uncorrelated with x.
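Both algebraic properties can be verified numerically on any sample; the data below are hypothetical, used only to exercise the two normal equations:

```python
import numpy as np

# Hypothetical sample
rng = np.random.default_rng(2)
x = rng.normal(size=50)
y = 3.0 + 1.5 * x + rng.normal(size=50)

b1_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0_hat = y.mean() - b1_hat * x.mean()
resid = y - b0_hat - b1_hat * x

# Normal equation 1: residuals sum (and hence average) to zero
print(np.isclose(resid.sum(), 0.0))      # prints True
# Normal equation 2: residuals are orthogonal to x
print(np.isclose(np.sum(x * resid), 0.0))  # prints True
```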

Total Sum of Squares = Explained Sum of Squares + Residual Sum of Squares
or SST = SSE + SSR
What does this mean?
SST = Σ (yi − ȳ)²  (total sample variation in y)
SSE = Σ (ŷi − ȳ)²  (variation in the fitted values)
SSR = Σ ûi²        (variation in the residuals)

Goodness of Fit: R²
How well does the SRF DESCRIBE the SAMPLE?
R-SQUARED = SSE/SST = 1 − SSR/SST
R² is the fraction of the sample variation in y explained by x, and it lies between 0 and 1.
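The decomposition SST = SSE + SSR and the two equivalent formulas for R² can be checked on a hypothetical sample:

```python
import numpy as np

# Hypothetical sample
rng = np.random.default_rng(3)
x = rng.normal(size=100)
y = 0.5 + 2.0 * x + rng.normal(size=100)

b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
y_fit = b0 + b1 * x

sst = np.sum((y - y.mean()) ** 2)      # total sum of squares
sse = np.sum((y_fit - y.mean()) ** 2)  # explained sum of squares
ssr = np.sum((y - y_fit) ** 2)         # residual sum of squares

print(np.isclose(sst, sse + ssr))      # decomposition holds: True
r2 = sse / sst
print(np.isclose(r2, 1 - ssr / sst))   # the two R-squared formulas agree: True
```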

Properties of OLS Estimators
B. POPULATION PROPERTIES
That is, properties of the sampling distributions of the OLS estimators β̂0 and β̂1: “What are the mean and variance of the sampling distribution of each OLS estimator?”

The mean of OLS estimators:
The Classical Linear Model (CLM) assumptions concerning the PRF.
SLR.1 (Linear in the Parameters): y = β0 + β1x + u
SLR.2 (Random Sample of Size n): {(xi, yi): i = 1, …, n}
SLR.3 (Sample Variation in X): the xi are not all the same value
SLR.4 (Zero Conditional Mean of the random variable u): E(u|x) = 0

Using these 4 SLR assumptions we can show that the OLS estimators are UNBIASED.
That is, we can show E(β̂0) = β0 and E(β̂1) = β1. How?
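Unbiasedness is a statement about the sampling distribution, so it can be illustrated by Monte Carlo: redraw the errors many times and average the resulting slope estimates. The true coefficients and the design are hypothetical choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(4)
beta0, beta1 = 1.0, 0.5       # true PRF coefficients (chosen for illustration)
x = rng.uniform(0, 10, 100)   # x held fixed across replications
sst_x = np.sum((x - x.mean()) ** 2)

estimates = []
for _ in range(5000):
    u = rng.normal(0, 1, 100)           # satisfies SLR.4: E(u|x) = 0
    y = beta0 + beta1 * x + u
    b1 = np.sum((x - x.mean()) * (y - y.mean())) / sst_x
    estimates.append(b1)

# The average of b1_hat over many samples should be close to beta1
print(np.mean(estimates))
```

Each individual estimate misses β1, but the average across 5,000 replications sits very close to it, which is what E(β̂1) = β1 predicts.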

The Variance of OLS Estimators
We need one more CLM assumption.
SLR.5 (Homoskedasticity): Var(u|x) = σ²

Under these 5 SLR assumptions we can show (conditional on the sample values of x)
Var(β̂1) = σ² / Σ (xi − x̄)²
Var(β̂0) = σ² (1/n) Σ xi² / Σ (xi − x̄)²
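The slope-variance formula can also be checked by simulation: holding x fixed, the variance of β̂1 across replications should match σ²/SSTx. All numbers below are hypothetical choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(5)
sigma = 2.0
x = rng.uniform(0, 10, 100)        # x held fixed across replications
sst_x = np.sum((x - x.mean()) ** 2)
var_theory = sigma ** 2 / sst_x    # Var(b1_hat | x) = sigma^2 / SST_x

estimates = []
for _ in range(10000):
    u = rng.normal(0, sigma, 100)  # homoskedastic errors (SLR.5)
    y = 1.0 + 0.5 * x + u
    b1 = np.sum((x - x.mean()) * (y - y.mean())) / sst_x
    estimates.append(b1)

var_sim = np.var(estimates)
print(var_sim, var_theory)         # the two should be close
```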

GAUSS-MARKOV THEOREM Under assumptions SLR.1 through SLR.5, the OLS estimators β̂0 and β̂1 are the Best Linear Unbiased Estimators (BLUE) of the PRF coefficients. That is, the OLS estimators have the minimum variance of all possible linear unbiased estimators.

G-M Thm Proof We know that the OLS estimator β̂1 is
1. a linear function of the y variables: β̂1 = Σ wiyi, a weighted average of the y variables where the weight is wi = (xi − x̄) / Σ (xj − x̄)².
2. an unbiased estimator of β1.
Let’s consider any linear estimator, impose the SLR assumptions on that estimator, and see what properties that estimator has.

Let the estimator be β̃1 = Σ aiyi, where the ai are arbitrary weights. Now let’s impose on this estimator condition 2, unbiasedness. Substituting yi = β0 + β1xi + ui gives E(β̃1) = β0 Σ ai + β1 Σ aixi. For our estimator to be unbiased, we must impose the following 2 conditions: Σ ai = 0 and Σ aixi = 1. Define di to be the difference between the weights ai and wi. That is, di = ai − wi and ai = wi + di. Because the OLS weights already satisfy Σ wi = 0 and Σ wixi = 1, unbiasedness requires Σ di = 0 and Σ dixi = 0.

Now consider Var(β̃1). Recall that two of the SLR assumptions for the Gauss-Markov theorem are independent random variables (random sampling) and homoskedasticity. This means:
Var(β̃1) = σ² Σ ai² = σ² Σ (wi + di)² = σ² Σ wi² + σ² Σ di² + 2σ² Σ widi
Consider the last term above. The definition of wi and the property of unbiasedness together mean Σ widi = 0. This means
Var(β̃1) = σ² Σ wi² + σ² Σ di² = Var(β̂1) + σ² Σ di²

Recall our task is to find the estimator β̃1 that is linear, unbiased, and has the smallest variance among all linear unbiased estimators. Remember the weight ai is any weight. We have shown
Var(β̃1) = Var(β̂1) + σ² Σ di²
We already know what wi is. Then for minimum variance it must be that Σ di² = 0.

Because this term is a sum of squares, the only way this sum can be zero is if each of the di terms is zero. If each di is zero, then the weight ai that satisfies all three conditions (linear, unbiased, and minimum variance) is ai = wi + di = wi. That is, the only weight satisfying these three conditions is ai = wi. Recall that wi is the weight for the OLS estimator. Therefore the OLS estimator has the minimum variance among all linear unbiased estimators. It is BLUE. QED
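The proof can be made concrete numerically: build a competing weight vector ai = wi + di with di orthogonal to both the constant and x (so the competitor is still linear and unbiased), and confirm its variance exceeds that of OLS. The construction of di via a least-squares residual is one illustrative choice, not part of the original proof.

```python
import numpy as np

rng = np.random.default_rng(6)
sigma = 1.0
x = rng.uniform(0, 10, 100)
sst_x = np.sum((x - x.mean()) ** 2)
w = (x - x.mean()) / sst_x     # OLS weights: b1_hat = sum(w_i * y_i)

# Competing linear unbiased estimator: a_i = w_i + d_i, where d_i is the
# residual of an arbitrary vector z on (1, x), so sum(d_i) = 0 and
# sum(d_i * x_i) = 0, preserving unbiasedness.
z = rng.normal(size=100)       # arbitrary direction (illustrative)
X = np.column_stack([np.ones(100), x])
d = z - X @ np.linalg.lstsq(X, z, rcond=None)[0]
d = d / 1000.0                 # keep the perturbation small
a = w + d

# Both unbiasedness conditions hold for the competing weights
print(np.isclose(a.sum(), 0.0), np.isclose(np.sum(a * x), 1.0))

# Variance of a linear estimator sum(a_i * y_i) is sigma^2 * sum(a_i^2)
var_ols = sigma ** 2 * np.sum(w ** 2)
var_alt = sigma ** 2 * np.sum(a ** 2)
print(var_alt > var_ols)       # OLS has the smaller variance: True
```

Because d is orthogonal to w, the cross term drops out and var_alt = var_ols + σ² Σ di², exactly as in the proof.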