Chapter 3: TWO-VARIABLE REGRESSION MODEL: The Problem of Estimation


Basic Econometrics, Chapter 3: Two-Variable Regression Model: The Problem of Estimation. Prof. Himayatullah, May 2004

3-1. The Method of Ordinary Least Squares (OLS)
Least-squares criterion: choose the estimators to minimize the sum of squared residuals
Σûi² = Σ(Yi − Ŷi)² = Σ(Yi − β̂1 − β̂2Xi)²   (3.1.2)
Solving the normal equations for β̂1 and β̂2 yields the least-squares estimators [see (3.1.6) and (3.1.7)].
The numerical and statistical properties of OLS are as follows:
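The least-squares formulas above can be sketched in a few lines of Python. This is a minimal illustration, not the text's numerical example: the data values and variable names (beta1_hat, beta2_hat, etc.) are made up for demonstration.

```python
# OLS for the two-variable model Y_i = beta1 + beta2*X_i + u_i.
# Solving the normal equations gives, with x_i and y_i deviations from the means:
#   beta2_hat = sum(x_i * y_i) / sum(x_i^2)
#   beta1_hat = Ybar - beta2_hat * Xbar

X = [1.0, 2.0, 3.0, 4.0, 5.0]   # illustrative data, not from the text
Y = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(X)
Xbar = sum(X) / n
Ybar = sum(Y) / n

x = [Xi - Xbar for Xi in X]     # deviations from the sample means
y = [Yi - Ybar for Yi in Y]

beta2_hat = sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)
beta1_hat = Ybar - beta2_hat * Xbar

Y_hat = [beta1_hat + beta2_hat * Xi for Xi in X]   # fitted values
u_hat = [Yi - Yh for Yi, Yh in zip(Y, Y_hat)]      # residuals
```

Both estimators are simple functions of the observed X and Y values, which is the first numerical property listed below.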

3-1. The Method of Ordinary Least Squares (OLS)
The OLS estimators are expressed solely in terms of observable quantities; they are point estimators.
The sample regression line passes through the sample means of X and Y.
The mean of the fitted values Ŷi equals the mean of the actual Yi.
The mean of the residuals ûi is zero: Σûi / n = 0.
The residuals ûi are uncorrelated with the fitted Ŷi and with Xi; that is, ΣûiŶi = 0 and ΣûiXi = 0.
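Each of these numerical properties can be checked directly on a fitted line. The sketch below uses made-up data (all names are illustrative) and verifies the properties hold up to floating-point rounding:

```python
# Verify the numerical properties of OLS on a small illustrative data set.
X = [1.0, 2.0, 3.0, 4.0, 5.0]
Y = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(X)
Xbar, Ybar = sum(X) / n, sum(Y) / n
Sxx = sum((Xi - Xbar) ** 2 for Xi in X)
Sxy = sum((Xi - Xbar) * (Yi - Ybar) for Xi, Yi in zip(X, Y))

b2 = Sxy / Sxx                  # slope estimator
b1 = Ybar - b2 * Xbar           # intercept estimator

Y_hat = [b1 + b2 * Xi for Xi in X]
u_hat = [Yi - Yh for Yi, Yh in zip(Y, Y_hat)]

mean_resid = sum(u_hat) / n                                  # ~ 0
resid_dot_X = sum(u * Xi for u, Xi in zip(u_hat, X))         # sum(u_hat_i * X_i) ~ 0
resid_dot_Yhat = sum(u * Yh for u, Yh in zip(u_hat, Y_hat))  # sum(u_hat_i * Yhat_i) ~ 0
line_at_Xbar = b1 + b2 * Xbar                                # equals Ybar: line passes through the means
mean_fitted = sum(Y_hat) / n                                 # equals Ybar
```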

3-2. The Assumptions Underlying the Method of Least Squares
Assumption 1: The regression model is linear (in the parameters).
Assumption 2: X values are fixed in repeated sampling.
Assumption 3: Zero mean value of the disturbance ui: E(ui | Xi) = 0.
Assumption 4: Homoscedasticity, or equal variance of ui: Var(ui | Xi) = σ² [vs. heteroscedasticity].
Assumption 5: No autocorrelation between the disturbances: Cov(ui, uj | Xi, Xj) = 0 for i ≠ j [vs. positive or negative serial correlation].

3-2. The Assumptions Underlying the Method of Least Squares
Assumption 6: Zero covariance between ui and Xi: Cov(ui, Xi) = E(uiXi) = 0.
Assumption 7: The number of observations n must be greater than the number of parameters to be estimated.
Assumption 8: Variability in the X values: they must not all be the same.
Assumption 9: The regression model is correctly specified.
Assumption 10: There is no perfect multicollinearity among the Xs.

3-3. Precision or Standard Errors of the Least-Squares Estimates
In statistics the precision of an estimate is measured by its standard error (SE). With xi = Xi − X̄ denoting deviations from the sample mean:
var(β̂2) = σ² / Σxi²   (3.3.1)
se(β̂2) = √var(β̂2)   (3.3.2)
var(β̂1) = σ² ΣXi² / (n Σxi²)   (3.3.3)
se(β̂1) = √var(β̂1)   (3.3.4)
σ̂² = Σûi² / (n − 2)   (3.3.5)
σ̂ = √σ̂² is the standard error of the estimate.
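Formulas (3.3.1)–(3.3.5) translate directly into code. A minimal sketch, again on made-up data (sigma2_hat and the other names are illustrative):

```python
import math

# Standard errors of the OLS estimates, following (3.3.1)-(3.3.5).
X = [1.0, 2.0, 3.0, 4.0, 5.0]   # illustrative data
Y = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(X)
Xbar, Ybar = sum(X) / n, sum(Y) / n
Sxx = sum((Xi - Xbar) ** 2 for Xi in X)   # sum of squared deviations, sum(x_i^2)

b2 = sum((Xi - Xbar) * (Yi - Ybar) for Xi, Yi in zip(X, Y)) / Sxx
b1 = Ybar - b2 * Xbar

u_hat = [Yi - (b1 + b2 * Xi) for Xi, Yi in zip(X, Y)]
sigma2_hat = sum(u * u for u in u_hat) / (n - 2)            # (3.3.5): RSS / (n - 2)

var_b2 = sigma2_hat / Sxx                                   # (3.3.1)
se_b2 = math.sqrt(var_b2)                                   # (3.3.2)
var_b1 = sigma2_hat * sum(Xi ** 2 for Xi in X) / (n * Sxx)  # (3.3.3)
se_b1 = math.sqrt(var_b1)                                   # (3.3.4)
```

Note that σ² in (3.3.1) and (3.3.3) is unknown in practice, so the unbiased estimate σ̂² from (3.3.5) is plugged in, with n − 2 degrees of freedom because two parameters were estimated.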

3-3. Precision or Standard Errors of the Least-Squares Estimates
Features of the variances:
+ var(β̂2) is proportional to σ² and inversely proportional to Σxi².
+ var(β̂1) is proportional to σ² and to ΣXi², but inversely proportional to Σxi² and the sample size n.
+ cov(β̂1, β̂2) = −X̄ · var(β̂2): since var(β̂2) is always positive, the sign of the covariance depends on the sign of X̄, so β̂1 and β̂2 are correlated unless X̄ = 0.

3-4. Properties of Least-Squares Estimators: The Gauss-Markov Theorem
An OLS estimator is said to be BLUE if:
+ It is linear, that is, a linear function of a random variable, such as the dependent variable Y in the regression model.
+ It is unbiased, that is, its average or expected value E(β̂2) is equal to the true value β2.
+ It has minimum variance in the class of all such linear unbiased estimators.
An unbiased estimator with the least variance is known as an efficient estimator.

3-4. Properties of Least-Squares Estimators: The Gauss-Markov Theorem
Given the assumptions of the classical linear regression model, the least-squares estimators, in the class of linear unbiased estimators, have minimum variance; that is, they are BLUE.
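The unbiasedness part of the theorem can be illustrated with a small simulation in the spirit of the Monte Carlo experiments of section 3-9: draw many samples from a known true model, estimate β2 in each, and check that the estimates average out to the true value. All parameter values here are assumptions chosen for the demonstration.

```python
import random

# Monte Carlo check of unbiasedness: Y_i = beta1 + beta2*X_i + u_i with known
# true parameters; average the OLS slope estimates over many samples.
random.seed(0)                           # fixed seed for reproducibility
beta1, beta2 = 1.0, 2.0                  # true (illustrative) parameter values
X = [1.0, 2.0, 3.0, 4.0, 5.0]            # fixed in repeated sampling (Assumption 2)
Xbar = sum(X) / len(X)
Sxx = sum((Xi - Xbar) ** 2 for Xi in X)

estimates = []
for _ in range(2000):
    # Draw fresh disturbances with zero mean and constant variance.
    Y = [beta1 + beta2 * Xi + random.gauss(0.0, 1.0) for Xi in X]
    Ybar = sum(Y) / len(Y)
    b2 = sum((Xi - Xbar) * (Yi - Ybar) for Xi, Yi in zip(X, Y)) / Sxx
    estimates.append(b2)

avg_b2 = sum(estimates) / len(estimates)  # should be close to the true beta2
```

Individual estimates scatter around 2.0, but their average converges to the true value, which is what unbiasedness means.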

3-5. The Coefficient of Determination r²: A Measure of "Goodness of Fit"
Yi = Ŷi + ûi, or, subtracting Ȳ from both sides (note: the mean of Ŷi equals Ȳ),
yi = ŷi + ûi
Squaring both sides and summing (the cross-product term Σŷiûi vanishes) gives
Σyi² = β̂2² Σxi² + Σûi², or TSS = ESS + RSS.

3-5. The Coefficient of Determination r²: A Measure of "Goodness of Fit"
TSS = Σyi² = Total Sum of Squares
ESS = Σŷi² = β̂2² Σxi² = Explained Sum of Squares
RSS = Σûi² = Residual Sum of Squares
Dividing through by TSS:
1 = ESS/TSS + RSS/TSS, or
1 = r² + RSS/TSS, so r² = 1 − RSS/TSS.
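The decomposition TSS = ESS + RSS and the two equivalent expressions for r² can be verified numerically. A minimal sketch with made-up data (names are illustrative):

```python
# Decompose the total variation in Y into explained and residual parts,
# then form r^2 = ESS/TSS = 1 - RSS/TSS.
X = [1.0, 2.0, 3.0, 4.0, 5.0]   # illustrative data
Y = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(X)
Xbar, Ybar = sum(X) / n, sum(Y) / n
Sxx = sum((Xi - Xbar) ** 2 for Xi in X)
b2 = sum((Xi - Xbar) * (Yi - Ybar) for Xi, Yi in zip(X, Y)) / Sxx
b1 = Ybar - b2 * Xbar

Y_hat = [b1 + b2 * Xi for Xi in X]
TSS = sum((Yi - Ybar) ** 2 for Yi in Y)                 # total sum of squares
ESS = sum((Yh - Ybar) ** 2 for Yh in Y_hat)             # explained; equals b2**2 * Sxx
RSS = sum((Yi - Yh) ** 2 for Yi, Yh in zip(Y, Y_hat))   # residual sum of squares

r2 = ESS / TSS
```

Because the fit is deliberately close, r² here lands near 1; a model explaining none of the variation in Y would give r² near 0.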

3-5. The Coefficient of Determination r²: A Measure of "Goodness of Fit"
r² = ESS/TSS is the coefficient of determination; it measures the proportion or percentage of the total variation in Y explained by the regression model.
0 ≤ r² ≤ 1; r = ±√r² is the sample correlation coefficient.
Some properties of r follow.

3-6. A Numerical Example (pages 80-83)
3-7. Illustrative Examples (pages 83-85)
3-8. Coffee Demand Function
3-9. Monte Carlo Experiments (page 85)
3-10. Summary and Conclusions (pages 86-87)