5 ORDINARY LEAST SQUARES ESTIMATION
The Idea: Find estimators of the PRF regression coefficients that minimize the LOSS associated with making a mistake: minimize the SUM OF SQUARED ERRORS (or sum of squared residuals).
The residual (or error): \(\hat u_i = y_i - \hat y_i = y_i - \hat\beta_0 - \hat\beta_1 x_i\)
The Sum of Squared Residuals: \(SSR = \sum_{i=1}^n \hat u_i^2\)
How do we find \(\hat\beta_0\) and \(\hat\beta_1\)? We need some calculus.
7 OLS: Minimize the Sum of Squared Residuals (SSR)
Find \(\hat\beta_0\) and \(\hat\beta_1\) that minimize \(\sum_{i=1}^n (y_i - \hat\beta_0 - \hat\beta_1 x_i)^2\).
This generates the NORMAL EQUATIONS:
\(\sum_{i=1}^n (y_i - \hat\beta_0 - \hat\beta_1 x_i) = 0\) and \(\sum_{i=1}^n x_i (y_i - \hat\beta_0 - \hat\beta_1 x_i) = 0\)
Solve these for the OLS estimators:
\(\hat\beta_1 = \dfrac{\sum_i (x_i - \bar x)(y_i - \bar y)}{\sum_i (x_i - \bar x)^2}\) and \(\hat\beta_0 = \bar y - \hat\beta_1 \bar x\)
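The closed-form solutions above can be computed directly. A minimal sketch in Python (the data and variable names are illustrative, not from the slides):

```python
# Sketch: OLS estimators from the closed-form solutions to the normal equations.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 2.0 + 3.0 * x + rng.normal(size=100)   # simulated PRF: beta0 = 2, beta1 = 3

# beta1_hat = sum of cross-products / sum of squares of x
beta1_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
# beta0_hat puts the fitted line through the point of means
beta0_hat = y.mean() - beta1_hat * x.mean()

print(beta0_hat, beta1_hat)   # close to the true values 2 and 3
```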
8 Properties of OLS Estimators
A. Sample Properties
Algebraic properties (by virtue of OLS construction: the NORMAL EQUATIONS):
From NORMAL EQUATION 1: \(\sum_i \hat u_i = 0\). The mean of the residuals is zero.
From NORMAL EQUATION 2: \(\sum_i x_i \hat u_i = 0\). The residuals are uncorrelated with x.
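These algebraic properties hold in any sample by construction, which a quick numerical check confirms (simulated data; names are illustrative):

```python
# Sketch: the normal equations force the residuals to sum to zero and to be
# orthogonal to x, whatever the data.
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, size=50)
y = 1.0 + 0.5 * x + rng.normal(size=50)

b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
u_hat = y - (b0 + b1 * x)            # residuals

print(np.sum(u_hat))                 # ~ 0 (normal equation 1)
print(np.sum(x * u_hat))             # ~ 0 (normal equation 2)
```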
9 Total Sum of Squares = Explained Sum of Squares + Residual Sum of Squares, or SST = SSE + SSR
What does this mean?
\(SST = \sum_i (y_i - \bar y)^2\), \(SSE = \sum_i (\hat y_i - \bar y)^2\), \(SSR = \sum_i \hat u_i^2\)
10 Goodness of Fit: R²
How well does the SRF DESCRIBE the SAMPLE?
R-SQUARED: \(R^2 = SSE/SST = 1 - SSR/SST\)
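The decomposition and the two equivalent expressions for R² can be verified on simulated data (a sketch; names are illustrative):

```python
# Sketch: checking SST = SSE + SSR and R^2 = SSE/SST = 1 - SSR/SST.
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=80)
y = -1.0 + 2.0 * x + rng.normal(size=80)

b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x
u_hat = y - y_hat

SST = np.sum((y - y.mean()) ** 2)    # total variation in y
SSE = np.sum((y_hat - y.mean()) ** 2)  # variation explained by the SRF
SSR = np.sum(u_hat ** 2)             # unexplained (residual) variation

print(SST, SSE + SSR)                # equal up to rounding
print(SSE / SST, 1 - SSR / SST)      # both equal R-squared
```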
11 Properties of OLS Estimators
B. POPULATION PROPERTIES
That is, properties of the sampling distributions of the OLS estimators \(\hat\beta_0\) and \(\hat\beta_1\).
That is, "What are the mean and variance of the sampling distribution of each OLS estimator?"
12 The Mean of the OLS Estimators: The Classical Linear Model (CLM) Assumptions concerning the PRF
SLR.1 (Linear in the Parameters): \(y = \beta_0 + \beta_1 x + u\)
SLR.2 (Random Sample of Size n)
SLR.3 (Sample Variation in x)
SLR.4 (Zero Conditional Mean of the random variable u): \(E(u|x) = 0\)
13 Using these 4 SLR assumptions we can show that the OLS estimators are UNBIASED. That is, we can show \(E(\hat\beta_0) = \beta_0\) and \(E(\hat\beta_1) = \beta_1\). How?
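A sketch of the standard argument for \(\hat\beta_1\) (the textbook derivation, not reproduced verbatim from these slides):

```latex
\hat\beta_1
  = \frac{\sum_i (x_i - \bar x)\, y_i}{\sum_i (x_i - \bar x)^2}
  = \beta_1 + \frac{\sum_i (x_i - \bar x)\, u_i}{\sum_i (x_i - \bar x)^2}
\quad \text{(substituting } y_i = \beta_0 + \beta_1 x_i + u_i \text{ by SLR.1)}
\]
\[
E(\hat\beta_1 \mid x)
  = \beta_1 + \frac{\sum_i (x_i - \bar x)\, E(u_i \mid x)}{\sum_i (x_i - \bar x)^2}
  = \beta_1
\quad \text{by SLR.2 and SLR.4, } E(u_i \mid x) = 0.
```

SLR.3 guarantees the denominator \(\sum_i (x_i - \bar x)^2 > 0\), so the estimator is well defined. Unbiasedness of \(\hat\beta_0\) then follows from \(\hat\beta_0 = \bar y - \hat\beta_1 \bar x\).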
14 The Variance of the OLS Estimators
We need one more CLM assumption.
SLR.5 (Homoskedasticity): \(Var(u|x) = \sigma^2\)
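Under SLR.1 through SLR.5, the standard textbook result is \(Var(\hat\beta_1 \mid x) = \sigma^2 / \sum_i (x_i - \bar x)^2\). A Monte Carlo sketch checking this formula (the simulation design is illustrative, not from the slides):

```python
# Sketch: simulate the sampling distribution of beta1_hat with x held fixed,
# and compare its variance to sigma^2 / sum((x_i - xbar)^2).
import numpy as np

rng = np.random.default_rng(3)
n, sigma = 30, 2.0
x = rng.uniform(0, 5, size=n)          # x fixed across replications
sst_x = np.sum((x - x.mean()) ** 2)

b1_draws = []
for _ in range(5000):
    u = rng.normal(0, sigma, size=n)   # homoskedastic errors (SLR.5)
    y = 1.0 + 0.7 * x + u              # true beta1 = 0.7
    b1 = np.sum((x - x.mean()) * (y - y.mean())) / sst_x
    b1_draws.append(b1)

b1_draws = np.array(b1_draws)
print(b1_draws.mean())                   # close to 0.7 (unbiasedness)
print(b1_draws.var(), sigma**2 / sst_x)  # simulated vs. theoretical variance
```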
16 GAUSS-MARKOV THEOREM
Under assumptions SLR.1 through SLR.5, the OLS estimators \(\hat\beta_0\) and \(\hat\beta_1\) are the Best Linear Unbiased Estimators (BLUE) of the PRF coefficients \(\beta_0\) and \(\beta_1\). That is, the OLS estimators have the minimum variance of all possible linear unbiased estimators.
17 G-M Thm Proof
We know that the OLS estimator \(\hat\beta_1\) is:
1. a linear function of the y variables, \(\hat\beta_1 = \sum_i w_i y_i\), a weighted average of the y variables where the weight is \(w_i = \dfrac{x_i - \bar x}{\sum_j (x_j - \bar x)^2}\);
2. an unbiased estimator of \(\beta_1\).
Let's consider any linear estimator, impose the SLR assumptions on that estimator, and see what properties that estimator has.
18 Let the estimator be \(\tilde\beta_1 = \sum_i a_i y_i\).
Now let's impose on this estimator condition 2, unbiasedness:
\(E(\tilde\beta_1) = \beta_0 \sum_i a_i + \beta_1 \sum_i a_i x_i\)
For our estimator to be unbiased, we must impose the following 2 conditions: \(\sum_i a_i = 0\) and \(\sum_i a_i x_i = 1\).
Define \(d_i\) to be the difference between the weights \(a_i\) and \(w_i\). That is, \(d_i = a_i - w_i\) and \(a_i = w_i + d_i\). Since the OLS weights satisfy \(\sum_i w_i = 0\) and \(\sum_i w_i x_i = 1\), the unbiasedness conditions are equivalent to \(\sum_i d_i = 0\) and \(\sum_i d_i x_i = 0\).
19 Now consider \(Var(\tilde\beta_1)\). Recall that two of the SLR assumptions behind the Gauss-Markov theorem are independent observations (random sampling) and homoskedasticity. This means:
\(Var(\tilde\beta_1) = \sigma^2 \sum_i a_i^2 = \sigma^2 \sum_i (w_i + d_i)^2 = \sigma^2 \sum_i w_i^2 + \sigma^2 \sum_i d_i^2 + 2\sigma^2 \sum_i w_i d_i\)
Consider the last term above. The definition of \(w_i\) and the unbiasedness conditions together mean
\(\sum_i w_i d_i = \dfrac{\sum_i d_i x_i - \bar x \sum_i d_i}{\sum_j (x_j - \bar x)^2} = 0\)
This means \(Var(\tilde\beta_1) = \sigma^2 \sum_i w_i^2 + \sigma^2 \sum_i d_i^2 = Var(\hat\beta_1) + \sigma^2 \sum_i d_i^2\).
20 Recall our task is to find the estimator \(\tilde\beta_1\) that is linear, unbiased, and has the smallest variance among all linear unbiased estimators. Remember the weight \(a_i\) is any weight. We have shown \(Var(\tilde\beta_1) = Var(\hat\beta_1) + \sigma^2 \sum_i d_i^2\). We already know what \(w_i\) is. Then for minimum variance it must be that \(\sum_i d_i^2 = 0\).
21 Because this term is a sum of squares, the only way it can be zero is if each of the \(d_i\) terms is zero. If each \(d_i\) is zero, then each weight \(a_i\) that satisfies all three conditions (linear, unbiased, and minimum variance) is \(a_i = w_i + d_i = w_i\). That is, the only weight satisfying these three conditions is \(a_i = w_i\). Recall that \(w_i\) is the weight for the OLS estimator. Therefore the OLS estimator has the minimum variance among all linear unbiased estimators. It is BLUE. QED
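The proof can be illustrated numerically: any other weights satisfying the unbiasedness conditions have a larger sum of squares than the OLS weights, hence a larger variance. As an example we use a hypothetical "two-point" estimator built from only the smallest and largest x values (a sketch; data and names are illustrative):

```python
# Sketch: OLS weights w_i vs. an alternative set of linear unbiased weights a_i.
import numpy as np

rng = np.random.default_rng(4)
x = rng.uniform(0, 10, size=40)
sst_x = np.sum((x - x.mean()) ** 2)

w = (x - x.mean()) / sst_x              # OLS weights

a = np.zeros_like(x)                    # alternative weights: two-point estimator
i_min, i_max = np.argmin(x), np.argmax(x)
a[i_max] = 1.0 / (x[i_max] - x[i_min])
a[i_min] = -a[i_max]

# Both weight vectors satisfy the unbiasedness conditions:
print(np.sum(a), np.sum(a * x))         # 0 and 1
print(np.sum(w), np.sum(w * x))         # 0 and 1

# ...but the OLS weights have the smaller sum of squares, so sigma^2 * sum(w^2)
# is the smaller variance, as the Gauss-Markov theorem says:
print(np.sum(w ** 2) <= np.sum(a ** 2))  # True
```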