Multiple Regression Analysis: OLS Asymptotics


1 Multiple Regression Analysis: OLS Asymptotics
Dr. Woraphon Yamaka

2 Multiple Regression Analysis: Inference
Assumption MLR.6 (Normality of error terms): u ~ Normal(0, σ²), independently of the explanatory variables x1, …, xk. It is assumed that the unobserved factors are normally distributed around the population regression function. The form and the variance of the distribution do not depend on any of the explanatory variables. It follows that y | x ~ Normal(β0 + β1x1 + … + βkxk, σ²).

3 Multiple Regression Analysis: Inference
Discussion of the normality assumption
- The error term is the sum of "many" different unobserved factors
- Sums of independent factors are normally distributed (CLT)
- Problems: How many different factors are there? Is the number large enough? The individual factors may have very heterogeneous distributions. How independent are the different factors?
- The normality of the error term is an empirical question
- At least the error distribution should be "close" to normal
- In many cases, normality is questionable or impossible by definition

4 Multiple Regression Analysis: Inference
Discussion of the normality assumption (cont.)
- Examples where normality cannot hold: wages (nonnegative; also: minimum wage), number of arrests (takes on a small number of integer values), unemployment (indicator variable, takes on only 1 or 0)
- In some cases, normality can be achieved through transformations of the dependent variable (e.g. use log(wage) instead of wage)
- Under normality, OLS is the best (even among nonlinear) unbiased estimator
- Important: for the purposes of statistical inference, the assumption of normality can be replaced by a large sample size

5 What is OLS? It is an estimator, not a model!
In this class we will learn a simple application of OLS in the linear regression model. The linear regression model is y = β0 + β1x + u.

6 Let us start with the simple linear regression (with intercept)
[Scatter plot of y against x with the fitted regression line]

7 Graph of the relationship between X and Y (without intercept)
[Scatter plot of Y against X with a regression line through the origin]

8 From a mathematical point of view
"Least squares" means: Least = minimize, Squares = the sum of squared residuals Σ(yi − ŷi)². So we now work on a minimization problem.

9 How do we solve the minimization problem?
The method of Lagrange multipliers (named after Joseph-Louis Lagrange) is a strategy for finding the local maxima and minima of a function subject to equality constraints. In OLS we do not have a constraint, so our problem becomes unconstrained minimization of the objective function S(β) = Σ(yi − xi'β)².

10 FOC, SOC
FOC: the first-order condition sets the gradient of the objective function to zero. SOC: the second-order condition checks that the objective is convex, so the solution is indeed a minimum. Let us prove it together.

11 Solution
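The solution on this slide can be written out in matrix form. A standard derivation, assuming the model y = Xβ + u:

```latex
\min_{\beta}\; S(\beta) = (y - X\beta)'(y - X\beta)

\text{FOC: } \frac{\partial S}{\partial \beta} = -2X'(y - X\beta) = 0
\quad\Rightarrow\quad \hat{\beta} = (X'X)^{-1}X'y

\text{SOC: } \frac{\partial^2 S}{\partial \beta \,\partial \beta'} = 2X'X,
\text{ which is positive definite when } X \text{ has full column rank}
```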

12 Example
Suppose we have two data sets:
X = (0, 2, 3, 1, 2, 4, 2, 5, 7, 5)
Y = (2, 4, 6, 2, 4, 8, 4, 10, 14, 10)
X1 = (1, 3, 4, 2, 3, 5, 3, 6, 8, 6)
Y1 = (2, 4, 6, 2, 4, 8, 4, 10, 14, 10)

13 Example 1.1
X = (0, 2, 3, 1, 2, 4, 2, 5, 7, 5)
Y = (0, 4, 6, 2, 4, 8, 4, 10, 14, 10)
Ans: β̂ = (X'X)⁻¹X'y
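A quick numeric check of Example 1.1; a sketch in NumPy, with variable names of my own choosing:

```python
import numpy as np

# Data from Example 1.1 (regression through the origin, no intercept)
x = np.array([0, 2, 3, 1, 2, 4, 2, 5, 7, 5], dtype=float)
y = np.array([0, 4, 6, 2, 4, 8, 4, 10, 14, 10], dtype=float)

# With a single regressor and no intercept, (X'X)^{-1} X'y reduces to a scalar ratio
beta_hat = (x @ y) / (x @ x)  # 274 / 137 = 2.0
print(beta_hat)
```

Here y = 2x holds exactly in the data, so the estimate recovers the slope 2 with zero residuals.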

14 Example 1.2 (the case where the model is misspecified)
X = (0, 2, 3, 1, 2, 4, 2, 5, 7, 5)
Y1 = (1, 5, 7, 3, 5, 9, 5, 11, 15, 11)
Ans: β̂ = (X'X)⁻¹X'y
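In Example 1.2 the data satisfy Y1 = 1 + 2X, but the model is fitted without an intercept. A sketch of what that misspecification does to the estimate:

```python
import numpy as np

# Data from Example 1.2: Y1 = 1 + 2X, but we (wrongly) fit a model without intercept
x = np.array([0, 2, 3, 1, 2, 4, 2, 5, 7, 5], dtype=float)
y1 = np.array([1, 5, 7, 3, 5, 9, 5, 11, 15, 11], dtype=float)

beta_hat = (x @ y1) / (x @ x)  # 305 / 137, roughly 2.23 instead of the true slope 2
print(beta_hat)
```

The omitted intercept leaks into the slope estimate, which is the point of this example.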

15 Example 1.3
X = (0, 2, 3, 1, 2, 4, 2, 5, 7, 5)
Y1 = (1, 5, 7, 3, 5, 9, 5, 11, 15, 11)
Ans: β̂ = (X'X)⁻¹X'y
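The transcript does not preserve the model for Example 1.3, but since the data of Example 1.2 reappear, a plausible reading is that they are refitted with an intercept; a sketch under that assumption:

```python
import numpy as np

# Same data as Example 1.2, now fitted WITH an intercept column in X
x = np.array([0, 2, 3, 1, 2, 4, 2, 5, 7, 5], dtype=float)
y1 = np.array([1, 5, 7, 3, 5, 9, 5, 11, 15, 11], dtype=float)
X = np.column_stack([np.ones_like(x), x])

# Solve the normal equations (X'X) b = X'y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y1)  # intercept 1, slope 2 (perfect fit)
print(beta_hat)
```

With the intercept included, the fit is exact and the true coefficients (1, 2) are recovered, in contrast to Example 1.2.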

16 Multiple Regression Analysis: OLS Asymptotics
So far we focused on properties of OLS that hold for any sample.
Properties of OLS that hold for any sample/sample size:
- Expected values/unbiasedness under MLR.1 – MLR.4
- Variance formulas under MLR.1 – MLR.5
- Gauss-Markov Theorem under MLR.1 – MLR.5
- Exact sampling distributions/tests under MLR.1 – MLR.6
Properties of OLS that hold in large samples:
- Consistency under MLR.1 – MLR.4
- Asymptotic normality/tests under MLR.1 – MLR.5 (without assuming normality of the error term!)

17 Multiple Regression Analysis: OLS Asymptotics
Consistency
Interpretation: consistency means that the probability that the estimate is arbitrarily close to the true population value can be made arbitrarily high by increasing the sample size. Consistency is a minimum requirement for sensible estimators. An estimator θ̂n is consistent for a population parameter θ if P(|θ̂n − θ| > ε) → 0 as n → ∞, for arbitrary ε > 0. Alternative notation: plim θ̂n = θ. The estimate converges in probability to the true population value.
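Consistency can be illustrated by simulation; a minimal sketch (my own setup, not from the slides) showing that the average estimation error of the OLS slope shrinks as n grows:

```python
import numpy as np

rng = np.random.default_rng(0)
beta = 2.0

def slope_estimate(n):
    # Simple regression through the origin with exogenous x (MLR.4')
    x = rng.normal(size=n)
    u = rng.normal(size=n)  # error independent of x
    y = beta * x + u
    return (x @ y) / (x @ x)

# Average absolute estimation error over 200 replications, small n vs large n
err_small = np.mean([abs(slope_estimate(50) - beta) for _ in range(200)])
err_large = np.mean([abs(slope_estimate(5000) - beta) for _ in range(200)])
print(err_small, err_large)
```

With n = 5000 the estimates cluster much more tightly around the true value than with n = 50, which is what "arbitrarily close with arbitrarily high probability" looks like in practice.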

18 Multiple Regression Analysis: OLS Asymptotics
Theorem 5.1 (Consistency of OLS): under assumptions MLR.1 – MLR.4, plim β̂j = βj. Special case of the simple regression model: plim β̂1 = β1 + Cov(x1, u)/Var(x1). Assumption MLR.4': E(u) = 0 and Cov(xj, u) = 0 for all j. One can see that the slope estimate is consistent if the explanatory variable is exogenous, i.e. uncorrelated with the error term. All explanatory variables must be uncorrelated with the error term. This assumption is weaker than the zero conditional mean assumption MLR.4.

19 Multiple Regression Analysis: OLS Asymptotics
For consistency of OLS, only the weaker MLR.4' is needed.
Asymptotic analog of omitted variable bias:
True model: y = β0 + β1x1 + β2x2 + u
Misspecified model: y = β0 + β1x1 + v, where v = β2x2 + u
Bias: plim β̃1 = β1 + β2δ1, with δ1 = Cov(x1, x2)/Var(x1)
There is no omitted variable bias if the omitted variable is irrelevant (β2 = 0) or uncorrelated with the included variable (δ1 = 0).
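The asymptotic omitted variable bias can also be checked numerically; a sketch with parameter values of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
b1, b2 = 1.0, 3.0

x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)  # Cov(x1, x2)/Var(x1) = delta1 = 0.5
u = rng.normal(size=n)
y = b1 * x1 + b2 * x2 + u

# Misspecified regression that omits x2 (no intercept needed: all means are zero)
b_tilde = (x1 @ y) / (x1 @ x1)

# plim b_tilde = b1 + b2 * delta1 = 1 + 3 * 0.5 = 2.5
print(b_tilde)
```

With a large n, the slope estimate lands near 2.5, not the true β1 = 1, exactly as the bias formula predicts.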

20 Multiple Regression Analysis: OLS Asymptotics
Variances of the OLS estimators
Depending on the sample, the estimates will be nearer to or farther away from the true population values. How far can we expect our estimates to be from the true population values on average (= sampling variability)? Sampling variability is measured by the estimators' variances.
Assumption MLR.5 (Homoskedasticity): Var(u | x1, …, xk) = σ². The values of the explanatory variables must contain no information about the variability of the unobserved factors.

21 Multiple Regression Analysis: OLS Asymptotics
Theorem 2.2 (Variances of the OLS estimators)
Under assumptions SLR.1 – SLR.5: Var(β̂1) = σ² / Σ(xi − x̄)² and Var(β̂0) = σ² (n⁻¹Σxi²) / Σ(xi − x̄)².
Conclusion: the sampling variability of the estimated regression coefficients is higher the larger the variability of the unobserved factors, and lower the higher the variation in the explanatory variable.

22 Multiple Regression Analysis: OLS Asymptotics
Estimating the error variance
The variance of u does not depend on x, i.e. it is equal to the unconditional variance. One could estimate the variance of the errors by calculating the variance of the residuals in the sample; unfortunately this estimate would be biased. An unbiased estimate of the error variance can be obtained by subtracting the number of estimated regression coefficients from the number of observations: σ̂² = SSR / (n − 2) in the simple regression case.
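The degrees-of-freedom correction can be seen on a tiny handmade example (data of my own choosing, not from the slides):

```python
import numpy as np

# Tiny simple-regression example: n = 4 observations, 2 estimated coefficients
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 3.0, 5.0, 6.0])
n = len(x)

X = np.column_stack([np.ones(n), x])
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)  # intercept 0.5, slope 1.4
resid = y - X @ beta_hat
ssr = resid @ resid                 # sum of squared residuals = 0.2

sigma2_biased = ssr / n             # naive residual variance (biased downward)
sigma2_hat = ssr / (n - 2)          # unbiased: subtract the 2 estimated coefficients
print(sigma2_biased, sigma2_hat)
```

Dividing by n − 2 rather than n matters a lot here: 0.1 versus 0.05, a difference that disappears only as n grows.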

23 Multiple Regression Analysis: OLS Asymptotics
Theorem 2.3 (Unbiasedness of the error variance): under SLR.1 – SLR.5, E(σ̂²) = σ².
Calculation of standard errors for regression coefficients: plug in σ̂² for the unknown σ². The estimated standard deviations of the regression coefficients are called "standard errors." They measure how precisely the regression coefficients are estimated.

24 Multiple Regression Analysis: OLS Asymptotics
Asymptotic analysis of the OLS sampling errors (cont.)
The standard errors shrink at the rate 1/√n; this is why large samples are better. Example: in a birth weight equation, using only the first half of the observations inflates the standard errors by roughly a factor of √2.
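The 1/√n rate can be verified on simulated data (a sketch with my own data-generating process, not the birth weight data from the slide):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 8000
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)

def se_slope(x, y):
    # Standard error of the slope in simple regression: sqrt(sigma2_hat / Sxx)
    n = len(x)
    X = np.column_stack([np.ones(n), x])
    b = np.linalg.solve(X.T @ X, X.T @ y)
    resid = y - X @ b
    sigma2_hat = resid @ resid / (n - 2)
    sxx = ((x - x.mean()) ** 2).sum()
    return np.sqrt(sigma2_hat / sxx)

se_full = se_slope(x, y)
se_half = se_slope(x[: n // 2], y[: n // 2])
ratio = se_half / se_full  # close to sqrt(2), since se shrinks at rate 1/sqrt(n)
print(ratio)
```

Halving the sample inflates the standard error by about √2 ≈ 1.41, matching the birth weight example above.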

25 Assignment 3

