Chow test

Introduction Discuss the problems associated with structural breaks in the data. Examine the Chow test for a structural break. Work through an example of the use of the Chow test and ways to deal with structural breaks. Introduce the problem of multicollinearity.

Structural Breaks Structural breaks can occur in time series or cross-sectional data when there is a sudden change in the relationship being examined. Examples include sudden policy changes such as a change in government, a sharp move in asset prices (as in 1987), or a serious international disaster such as a civil war. We then need to decide whether two separate regression lines describe the data better than a single regression.

Structural Break in 1997 [figure: scatter plot of the series with fitted regression lines; a single line fits the data poorly because of the obvious break in 1997]

Structural Break In this example a single regression line is not a good fit of the data, due to the obvious structural break in 1997. We need to test whether a structural break has occurred in 1997; in practice the break is usually not as obvious as this. We will use the Chow test, which is a variant of the F-test for a restriction.

Chow Test (stages in using the test) Run the regression using all the observations, before and after the structural break, and collect the RSS. Run two separate regressions, one before the structural break, giving RSS(1), and one after, giving RSS(2). Calculate the test statistic using the following formula:

Chow Test The test statistic is

$$F = \frac{\bigl(RSS - (RSS_1 + RSS_2)\bigr)/k}{(RSS_1 + RSS_2)/(n_1 + n_2 - 2k)}$$

where RSS is from the pooled regression, RSS_1 and RSS_2 are from the two sub-sample regressions, n_1 and n_2 are the numbers of observations in the two sub-samples, and k is the number of parameters estimated (including the constant). Under the null hypothesis of structural stability the statistic follows an F(k, n_1 + n_2 − 2k) distribution.

Chow Test The final stage of the Chow test is to compare the test statistic with the critical value from the F-tables. The null hypothesis in this case is structural stability; if we reject the null hypothesis, we have a structural break in the data and need to decide how to overcome it.
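A minimal sketch of the whole procedure in Python, assuming a pandas DataFrame df indexed by year with a dependent column "y" and an explanatory column "x"; the function name chow_test, the column names, and the break year are illustrative, not from the slides.

```python
import statsmodels.api as sm
from scipy import stats

def chow_test(df, break_year, y_col="y", x_cols=("x",)):
    """Chow F-statistic and p-value for a structural break at break_year."""
    X = sm.add_constant(df[list(x_cols)])
    y = df[y_col]

    # Stage 1: pooled regression over all observations, collect RSS.
    rss = sm.OLS(y, X).fit().ssr

    # Stage 2: separate regressions before and after the break.
    before = df.index < break_year
    rss1 = sm.OLS(y[before], X[before]).fit().ssr
    rss2 = sm.OLS(y[~before], X[~before]).fit().ssr

    # Stage 3: the F-statistic from the formula above.
    k = X.shape[1]        # parameters per regression, incl. the constant
    n = len(df)
    f_stat = ((rss - (rss1 + rss2)) / k) / ((rss1 + rss2) / (n - 2 * k))
    p_value = stats.f.sf(f_stat, k, n - 2 * k)
    return f_stat, p_value

# Reject structural stability (i.e. conclude there is a break) when the
# statistic exceeds the F(k, n - 2k) critical value, or equivalently when
# the p-value falls below the chosen significance level.
```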

Chow Test If there is evidence of a structural break, we may need to split the data into two samples and run separate regressions. Another way to overcome this problem is to use dummy variables (to be covered later in the term); the benefit of this approach is that we do not lose any degrees of freedom through a loss of observations.

Problems with Chow Test The test may suggest splitting the data, which means fewer degrees of freedom. When should the cut-off point for the test be? There should usually be a theoretical basis for this choice. There is also the potential for structural instability across the whole data range; it is possible to test every observation for a structural break, as in the sketch below.
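A rough sketch of that idea, reusing the hypothetical chow_test function above: compute the Chow statistic at every feasible break point and see where it peaks. Note that scanning all candidates inflates the size of the test, so the ordinary F critical values no longer apply; a sup-F (Quandt-type) critical value would be needed for formal inference.

```python
# Scan every feasible observation as a candidate break point, keeping a
# minimum number of observations on each side so both regressions can run.
candidate_years = df.index[10:-10]
f_stats = {year: chow_test(df, year)[0] for year in candidate_years}
likely_break = max(f_stats, key=f_stats.get)  # where the F-statistic peaks
```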

Multicollinearity Multicollinearity occurs when two explanatory variables are strongly correlated with each other. In all multiple regression models there is some degree of collinearity between the explanatory variables, though usually not enough to cause a serious problem. In some cases, however, the collinearity between variables is so high that it affects the regression, producing coefficients with high standard errors.

Multicollinearity Multicollinearity may not be a problem if the other conditions are favourable: a high number of observations; high sample variance of the explanatory variables; low variance of the residual.

Models which can have multicollinearity Models with large numbers of lags. Models which use asset returns or interest rates, e.g. 3-month and 10-year interest rates (this can be overcome by using a term-structure variable, i.e. one rate minus the other). Demand models which include the prices of different goods.

Measuring Multicollinearity The main way of detecting multicollinearity is to check the t-statistics and the R-squared statistic. If the regression produces a high R-squared statistic (>0.9) but low t-statistics which are not significant, then multicollinearity could be a problem. We could then compute pair-wise correlation coefficients to determine whether the variables suffer from high levels of multicollinearity (see the sketch below). The problem with this approach is deciding when the correlation is so large that multicollinearity is present.
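A small sketch of the pair-wise correlation check, assuming a DataFrame X_vars holding only the explanatory variables; the 0.8 cut-off is an arbitrary illustration of the decision problem just described, not a formal critical value.

```python
# Flag pairs of explanatory variables whose correlation exceeds a chosen
# cut-off; choosing that cut-off is exactly the difficulty noted above.
corr = X_vars.corr()
for i, a in enumerate(corr.columns):
    for b in corr.columns[i + 1:]:
        if abs(corr.loc[a, b]) > 0.8:
            print(f"{a} and {b}: correlation {corr.loc[a, b]:.2f}")
```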

Remedies for Multicollinearity Remove the variable causing the multicollinearity from the regression, or replace it with a variable that is not collinear (this can cause omitted-variable bias). Find data with more observations. Transform the variables, e.g. put the data into ratio form or take logarithms (see the sketch below). Ignore the problem; after all, the estimators are still BLUE.
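A sketch of the transformation remedy, using the term-structure example from the earlier slide; the column names are hypothetical.

```python
import numpy as np

# Replace two collinear interest-rate levels with a single spread variable
# (one rate minus the other), and put a level variable into logarithms.
df["spread"] = df["rate_10y"] - df["rate_3m"]
df["log_x"] = np.log(df["x"])
```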

Increasing Observations To overcome multicollinearity, it may be necessary to increase the number of observations by: extending the data series; increasing the frequency of the data (with financial data it is often possible to get daily data); pooling the data, i.e. combining cross-sectional and time-series data.

Conclusion The F-test can be used to test a specific restriction on a model, such as constant returns to scale. The Chow test is used to determine whether the data are structurally stable. If there is a structural break, we need to split the data or use dummy variables. Multicollinearity occurs when the explanatory variables are closely correlated.