MODEL DIAGNOSTICS AND SPECIFICATION
Kakhramon Yusupov, June 15th, 2017, 1:30pm – 3:00pm, Session 3

Multicollinearity: reasons
- Data collection process
- Constraints on the model or in the population being sampled
- Model specification
- An over-determined model (more explanatory variables than observations)

Perfect vs. less than perfect
Perfect multicollinearity is the case when two or more independent variables form an exact linear relationship. Less than perfect (near) multicollinearity is the case when two or more independent variables are highly, but not exactly, linearly related.

Practical consequences
- OLS is still BLUE, but the large variances and covariances make precise estimation difficult.
- Large variances produce wide confidence intervals, so hypothesis tests become unreliable.
- t-statistics tend to be insignificant.
- Although the t-stats are low, R-squared might be very high.
- The estimators and their variances are very sensitive to small changes in the dataset.

Ho: All slope coefficients are simultaneously zero
Ha: Not all slope coefficients are simultaneously zero
Due to the low t-stats we cannot reject the individual null that each coefficient is zero, yet due to the high R-squared the F-value will be very high and rejecting Ho will be easy. This conflict between the joint F-test and the individual t-tests is a classic symptom of multicollinearity.

Detection
Multicollinearity is a question of degree: it is a feature of the sample, not of the population. How to detect it:
- High R-squared but low t-stats
- High correlation coefficients among the independent variables
- Auxiliary regressions
- High VIF
- Eigenvalues and the condition index

Auxiliary regression
Ho: the variable Xi is not collinear. Run a regression where one X is the dependent variable and the other X's are the independent variables, and obtain its R-squared, R_i². Then
F = (R_i² / (k − 2)) / ((1 − R_i²) / (n − k + 1)),
with df num = k − 2 and df denom = n − k + 1, where k is the number of explanatory variables including the intercept and n is the sample size. If the F-stat is higher than the F critical value, the variable Xi is collinear.
Rule of thumb: if the R-squared of an auxiliary regression is higher than the overall R-squared, multicollinearity might be troublesome.
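A minimal sketch of this check in Python with statsmodels, assuming X_raw is a hypothetical regressor matrix without a constant column. It computes the auxiliary-regression R_i² by hand and confirms that the VIF is the same quantity, since VIF_i = 1/(1 − R_i²):

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

X = sm.add_constant(X_raw)                       # design matrix with intercept

# Auxiliary regression for one regressor: R_i^2 from regressing X_i on the others.
i = 1                                            # column to test (skip the constant)
others = np.delete(X, i, axis=1)
r2_i = sm.OLS(X[:, i], others).fit().rsquared

# VIF_i = 1 / (1 - R_i^2); values above ~10 are a common warning sign.
print(r2_i, 1.0 / (1.0 - r2_i))
print(variance_inflation_factor(X, i))           # same quantity, computed directly
```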

What to do?
- Do nothing
- Combine cross-sectional and time-series data
- Transform variables (differencing, ratio transformations)
- Collect additional data/observations

Assumption: Homoscedasticity, or equal variance of u_i
[Figure: the conditional density f(u) of the errors has the same spread at every value of X in the regression of Y on X.]

Reasons:
- Error-learning models
- Higher variability in an independent variable may be accompanied by higher variability in the dependent variable
- Spatial correlation
- Data-collection biases
- Presence of extreme observations (outliers)
- Incorrect specification of the model
- Skewness in the distribution of a regressor

OLS estimation under heteroscedasticity
For the simple regression, var(β̂₂) = Σ x_i² σ_i² / (Σ x_i²)², where x_i denotes deviations from the mean. If the variance of the residuals is constant (σ_i² = σ²), this collapses to the original variance formula σ² / Σ x_i².

Consequences:
- The regression coefficients remain unbiased.
- The usual formula for the coefficient variances is wrong.
- The OLS estimator is still linear and unbiased, but it is no longer best (efficient).
- The usual t-tests and F-tests are not valid.

Method of Generalized Least Squares
When var(u_i) = σ_i² is known, divide the whole equation through by σ_i: the transformed error u_i/σ_i has constant (unit) variance, and OLS on the transformed model, i.e. weighted least squares with weights 1/σ_i², is BLUE.
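A minimal WLS sketch in Python, assuming y and X are hypothetical data arrays and that var(u_i) is proportional to z_i² for some observed variable z (an assumption made for illustration):

```python
import statsmodels.api as sm

# WLS with weights 1/z_i^2 is OLS on the model divided through by z_i.
res_wls = sm.WLS(y, sm.add_constant(X), weights=1.0 / z**2).fit()
print(res_wls.summary())
```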

Heteroscedasticity: detection
- Graphical method
- Park test
- Goldfeld-Quandt test
- Breusch-Pagan-Godfrey test
- White's general heteroscedasticity test

Park test
Assume σ_i² = σ² X_i^β e^{v_i}, i.e. ln σ_i² = ln σ² + β ln X_i + v_i. Since σ_i² is not known, we use the squared residuals û_i² in its place and run ln û_i² = α + β ln X_i + v_i. If the coefficient β is statistically different from zero, it indicates that heteroscedasticity is present.
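A sketch of the Park test in Python, assuming y and x are hypothetical 1-D arrays with x > 0:

```python
import numpy as np
import statsmodels.api as sm

res = sm.OLS(y, sm.add_constant(x)).fit()
log_u2 = np.log(res.resid ** 2)                  # ln(u_hat^2) proxies ln(sigma_i^2)
park = sm.OLS(log_u2, sm.add_constant(np.log(x))).fit()
# A statistically significant slope suggests heteroscedasticity.
print(park.params[1], park.pvalues[1])
```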

Goldfeld-Quandt test
1. Order your sample by the suspect X variable, from lowest to highest. Omit the c central observations and divide the remaining n − c observations into two samples of (n − c)/2 each.
2. Run separate regressions on the two samples and obtain RSS1 and RSS2, where RSS1 is the RSS from the small-X sample.
3. Each RSS has (n − c)/2 − k degrees of freedom. Calculate λ = (RSS2/df) / (RSS1/df), which under homoscedasticity follows an F distribution with numerator and denominator df both equal to (n − c)/2 − k.
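statsmodels packages these steps; a sketch, assuming y and X are hypothetical arrays already sorted by the suspect variable, with drop=0.2 (an assumed choice) omitting the central 20% of observations:

```python
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_goldfeldquandt

fstat, pval, _ = het_goldfeldquandt(y, sm.add_constant(X), drop=0.2)
print(fstat, pval)   # small p-value: reject homoscedasticity
```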

Breusch-Pagan-Godfrey test
Ho: homoscedasticity (the error variance does not depend on the Z variables). If you reject the null hypothesis, there is heteroscedasticity.

Breusch-Pagan-Godfrey test
Step 1. Estimate the original regression model and get the residuals û_i.
Step 2. Obtain σ̃² = Σ û_i² / n.
Step 3. Construct p_i = û_i² / σ̃².
Step 4. Estimate the regression p_i = α₁ + α₂ Z_2i + … + α_m Z_mi + v_i, where the Z's are the variables thought to drive the error variance.
Step 5. Obtain Θ = ESS/2, which under the null is distributed χ² with m − 1 degrees of freedom, where m is the number of parameters of the Step 4 regression.
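A sketch in Python, assuming y and X are hypothetical arrays and using the regressors themselves as the Z variables. Note that statsmodels implements the studentized (Koenker) LM variant, nR², rather than the ESS/2 statistic above, which relies on normality:

```python
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan

res = sm.OLS(y, sm.add_constant(X)).fit()
lm_stat, lm_pval, f_stat, f_pval = het_breuschpagan(res.resid, res.model.exog)
print(lm_stat, lm_pval)   # small p-value: reject homoscedasticity
```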

White's general heteroscedasticity test
Step 1. Estimate the original regression model and get the residuals û_i.
Step 2. Estimate the auxiliary regression of û_i² on the original regressors, their squares, and their cross-products. Under the null, n·R² from this regression is distributed χ² with df equal to the number of regressors in the auxiliary regression (excluding the constant). If you reject the null hypothesis, there is heteroscedasticity.
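A sketch in Python, assuming y and X are hypothetical arrays; het_white constructs the auxiliary regression (levels, squares, cross-products) itself:

```python
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_white

res = sm.OLS(y, sm.add_constant(X)).fit()
lm_stat, lm_pval, f_stat, f_pval = het_white(res.resid, res.model.exog)
print(lm_stat, lm_pval)   # small p-value: reject homoscedasticity
```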

Remedial measures
- Weighted least squares
- White's heteroscedasticity-consistent variances and standard errors
- Transformations according to the heteroscedasticity pattern
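White's correction leaves the point estimates unchanged and only fixes the standard errors; a sketch, assuming y and X are hypothetical arrays (HC0–HC3 are small-sample variants of the estimator):

```python
import statsmodels.api as sm

res_robust = sm.OLS(y, sm.add_constant(X)).fit(cov_type='HC1')
print(res_robust.bse)   # heteroscedasticity-consistent standard errors
```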

Heteroscedasticity-robust inference: the robust LM test
To test Ho: β₂ = 0 in y = X₁β₁ + X₂β₂ + u without assuming homoscedasticity:
1. Regress y on X₁ only and collect the restricted residuals ũ.
2. Regress each element of X₂ on all elements of X₁ and collect the residuals in a matrix r̂.
3. Form the element-wise products ũ·r̂.
4. Run the regression of 1 on ũ·r̂ (without an intercept); LM = n − SSR from this regression is distributed χ² with df equal to the number of restrictions.
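A sketch of this procedure in Python, assuming y, X1, and X2 are hypothetical arrays, with X1 containing the restricted regressors (including a constant column) and X2 the q regressors under test:

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

res_r = sm.OLS(y, X1).fit()                      # restricted model
u = res_r.resid                                  # restricted residuals
beta = np.linalg.lstsq(X1, X2, rcond=None)[0]    # partial X1 out of X2
r = X2 - X1 @ beta                               # residualized X2
ur = u[:, None] * r                              # element-wise products u*r
aux = sm.OLS(np.ones(len(u)), ur).fit()          # regress 1 on u*r, no intercept
LM = len(u) - aux.ssr                            # LM = n - SSR ~ chi2(q) under H0
print(LM, stats.chi2.sf(LM, X2.shape[1]))
```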

Autocorrelation: reasons
- Inertia
- Specification bias: omitted relevant variables
- Specification bias: incorrect functional form
- Cobweb phenomenon
- Lags
- Data manipulation
- Data transformation
- Non-stationarity

Consequences:
- The regression coefficients remain unbiased.
- The usual formula for the coefficient variances is wrong.
- The OLS estimator is still linear and unbiased, but it is no longer best (efficient).
- The usual t-tests and F-tests are not valid.

Detection: Durbin-Watson test
d = Σ_{t=2..n} (û_t − û_{t−1})² / Σ_{t=1..n} û_t² ≈ 2(1 − ρ̂)
Assumptions: the regression model includes an intercept; there is no lagged dependent variable among the regressors; the explanatory variables (the X's) are non-stochastic; the residuals follow an AR(1) process; and the residuals are normally distributed.
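A sketch in Python, assuming y and X are hypothetical time-series arrays:

```python
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

res = sm.OLS(y, sm.add_constant(X)).fit()
d = durbin_watson(res.resid)   # d near 2: no AR(1); d << 2: positive autocorrelation
print(d)
```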

Detection: Breusch-Godfrey test
Ho: there is no serial correlation up to order p. Regress the OLS residuals on the original regressors plus p lagged residuals; the test statistic is (n − p)·R² from this regression, distributed χ² with p degrees of freedom, where n is the number of observations and p the number of residual lag variables.
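A sketch in Python, assuming y and X are hypothetical time-series arrays and an assumed lag order of 4:

```python
import statsmodels.api as sm
from statsmodels.stats.diagnostic import acorr_breusch_godfrey

res = sm.OLS(y, sm.add_constant(X)).fit()
lm_stat, lm_pval, f_stat, f_pval = acorr_breusch_godfrey(res, nlags=4)
print(lm_stat, lm_pval)   # small p-value: reject "no serial correlation"
```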

Remedial measures
- GLS
- Newey-West autocorrelation-consistent variances and standard errors
- Including a lagged dependent variable
- Transformations according to the autocorrelation pattern
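Newey-West (HAC) covariance is available directly in statsmodels; a sketch, assuming y and X are hypothetical arrays and an assumed lag truncation of 4:

```python
import statsmodels.api as sm

# HAC covariance is consistent under both autocorrelation and heteroscedasticity.
res_hac = sm.OLS(y, sm.add_constant(X)).fit(cov_type='HAC', cov_kwds={'maxlags': 4})
print(res_hac.bse)
```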

Generalized least squares
If the value of ρ is known, quasi-difference the model: y_t − ρ y_{t−1} = β₁(1 − ρ) + β₂(x_t − ρ x_{t−1}) + ε_t, whose error term is serially uncorrelated. If the value of ρ is not known, estimate it first (e.g. ρ̂ ≈ 1 − d/2 from the Durbin-Watson statistic, or from an AR(1) regression of the residuals) and apply feasible GLS.

Cochrane-Orcutt procedure
1. First estimate the original regression and obtain the residuals.
2. After running the AR(1) regression of the residuals, obtain the value of ρ̂ and run the GLS regression.
3. Using the GLS coefficients, obtain new residuals and a new value of ρ̂.
4. Continue the process until the coefficients converge.
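statsmodels' GLSAR with iterative fitting alternates between estimating β by GLS and re-estimating ρ from the residuals, which is essentially this iteration; a sketch, assuming y and X are hypothetical time-series arrays:

```python
import statsmodels.api as sm

model = sm.GLSAR(y, sm.add_constant(X), rho=1)   # rho=1: one AR coefficient
res_co = model.iterative_fit(maxiter=10)         # iterate beta <-> rho to convergence
print(model.rho, res_co.params)
```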

Representation

Endogeneity
Endogeneity arises when a regressor is correlated with the error term. Cause 1: omission of relevant variables.

What to do?
- If the omitted variable is unrelated to the other included independent variables, the OLS estimator is still BLUE.
- Use proxy variables.
- Use methods other than OLS, such as instrumental variables (see the sketch below).
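A minimal two-stage least squares sketch using the linearmodels package (assumed installed); y, exog, x_endog, and z are hypothetical arrays holding the dependent variable, the exogenous regressors (including a constant), the endogenous regressor, and the instruments:

```python
from linearmodels.iv import IV2SLS

# 2SLS: instrument the endogenous regressor with z, keep exog as its own instrument.
res_iv = IV2SLS(dependent=y, exog=exog, endog=x_endog, instruments=z).fit()
print(res_iv.summary)
```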