Applied Econometrics:
“As it happens, the econometric modeling was done in the basement of the building and the econometric theory courses were taught on the top floor (third). I was perplexed by the fact that the same language was used in both places. Even more amazing was the transmogrification of particular individuals who wantonly sinned in the basement and metamorphosed into the highest of high priests as they ascended to the third floor.” Leamer (1978, p. vi)
There is a big difference between applied and theoretical econometrics
There is a big communication gap between econometric theorists and applied econometricians. Theorists, when called upon to teach applied econometrics courses, often teach econometric theory in these courses; they give examples, and an applied paper is usually required to justify calling the course an applied econometrics course. In these courses students are taught, in hands-on fashion, how to undertake a wide variety of econometric techniques. Examples at the elementary level: use and interpretation of dummy variables, F and chi-square testing, correcting for non-spherical errors, etc.
Examples at the advanced level: testing for unit roots and cointegration, correcting for sample selection bias, estimating Tobit, Poisson, and ordered probit models, etc. The focus is on the mechanics of estimation and testing rather than on the fundamentals of applied work, such as problem articulation, data cleaning, and model specification. Teaching applied econometrics is difficult because applied work is difficult. Econometrics is much easier without data. Many econometrics instructors teach what they enjoy teaching and what they know how to teach, not what students need.
Students need to learn some standard operating procedures or rules of behavior
Follow these rules and students will avoid elementary mistakes. Most econometrics instructors believe their students don’t need to be taught such elementary rules of behavior, because these rules lack the intellectual rigor prized by econometric theorists.
Tukey (1969, p. 90) expresses the difficulty of the teaching task as follows: “divert them from depending upon the ‘authority’ of standard textbook solutions, but without being able to substitute a second religion for the first. Stimulate intelligent problem formulation, without being able to say quite how this is done. Demand high standards of statistical reasoning, but without specifying a simple model of statistics which might serve as a criterion for quality of reasoning.”
Applied Econometrics Commandments
Rule 1 – Use common sense and economic theory. For example:
– match per capita variables with per capita variables
– use real exchange rates to explain real imports/exports
– use nominal interest rates to explain the real demand for money
– select an appropriate functional form for dependent variables constrained to lie between 0 and 1
– resist trying to explain a trendless variable with a trended variable
– never infer causation from correlation
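The functional-form point can be sketched with simulated data (Python/NumPy here; the DGP and variable names are hypothetical, and the log-odds transform is just one of several ways to respect the bounds):

```python
import numpy as np

rng = np.random.default_rng(9)
n = 500
x = rng.normal(size=n)
# hypothetical DGP: a share strictly between 0 and 1
share = 1 / (1 + np.exp(-(0.5 + 1.0 * x + rng.normal(scale=0.5, size=n))))

# a linear model can predict outside [0, 1]; a log-odds transform cannot
X = np.column_stack([np.ones(n), x])
b_lin, *_ = np.linalg.lstsq(X, share, rcond=None)
b_logit, *_ = np.linalg.lstsq(X, np.log(share / (1 - share)), rcond=None)

x_big = 5.0  # an out-of-sample point
pred_lin = b_lin[0] + b_lin[1] * x_big
pred_logit = 1 / (1 + np.exp(-(b_logit[0] + b_logit[1] * x_big)))
print(f"linear prediction at x={x_big}: {pred_lin:.2f}")   # can exceed 1
print(f"logit prediction at x={x_big}:  {pred_logit:.2f}")  # stays in (0, 1)
```

The linear fit extrapolates past the logical ceiling of 1, while the transformed model respects it by construction.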
Rule 2 – Avoid Type III errors
Type III errors occur when a researcher produces the right answer to the wrong question. An approximate answer to the right question is worth a great deal more than a precise answer to the wrong question. The relevant hypothesis/objective/specification may be completely different from what is initially suggested. It may be that a cumulative change in a variable is what matters, not the most recent change, or that the hypothesis should be that a coefficient is equal to another coefficient rather than equal to zero. The research question needs to be formulated appropriately.
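The coefficient-equality point can be illustrated with a small simulation (hypothetical DGP; the restriction vector r picks out beta1 − beta2):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
# simulated DGP: the two slopes are genuinely equal
y = 2.0 + 1.0 * x1 + 1.0 * x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
cov = (resid @ resid / (n - 3)) * np.linalg.inv(X.T @ X)

# wrong question: is the slope on x1 zero?
t_zero = beta[1] / np.sqrt(cov[1, 1])
# right question: are the two slopes equal?  test r'beta = 0 with r = (0, 1, -1)
r = np.array([0.0, 1.0, -1.0])
t_equal = (r @ beta) / np.sqrt(r @ cov @ r)
print(f"t for beta1 = 0:     {t_zero:.1f}")  # rejects, but answers the wrong question
print(f"t for beta1 = beta2: {t_equal:.1f}")  # the question the researcher cares about
```

The zero restriction is rejected emphatically, yet the substantively interesting restriction (equality) is not.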
Rule 3 – Know the context. The researcher should be intimately familiar with the phenomenon being investigated. Ask questions such as: how closely do the measured variables match their theoretical counterparts?
Rule 4 – Inspect the data. Knowing the context is not enough: the researcher needs to become intimately familiar with the specific data with which she is working. This involves summary statistics, graphs, and data cleaning, both to check and to ‘get a feel for’ the data.
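A minimal sketch of why inspection pays off, using fabricated income data with one planted data-entry error (the variable names and the 3×IQR screen are illustrative choices, not a prescribed rule):

```python
import numpy as np

rng = np.random.default_rng(1)
income = rng.lognormal(mean=10, sigma=0.5, size=1000)
income[42] = 1e9  # a data-entry error hiding in the sample

# basic summary statistics: mean vs median exposes the outlier immediately
print("mean:  ", np.mean(income))
print("median:", np.median(income))
print("max:   ", np.max(income))

# a simple screen: flag observations far outside the interquartile range
q1, q3 = np.percentile(income, [25, 75])
iqr = q3 - q1
flagged = np.where((income < q1 - 3 * iqr) | (income > q3 + 3 * iqr))[0]
# catches the error, plus a few genuine tail observations worth a look
print("flagged indices:", flagged)
```

No estimator run on the raw series would survive that single bad observation; five minutes with summary statistics finds it.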
Rule 5 – Keep it sensibly simple
Begin with a simple model (the bottom-up approach) and combine it with the general-to-specific (top-down) approach to produce a compromise process which, judged by its wide application, is viewed as an acceptable rule of behaviour.
Rule 6 – Use the interocular trauma test
Output from empirical work should be looked at long and hard until the answer hits you between the eyes. Check that the results make sense. Are the signs of the coefficients as expected? Are the important variables statistically significant? Are the implications of the results consistent with theory?
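The sign check can even be automated. A sketch with a simulated demand regression (the DGP, variable names, and expected-sign table are all hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 400
price = rng.uniform(1, 10, size=n)
income = rng.uniform(20, 100, size=n)
# hypothetical DGP: demand falls with price, rises with income
quantity = 50 - 2.0 * price + 0.5 * income + rng.normal(scale=5, size=n)

X = np.column_stack([np.ones(n), price, income])
beta, *_ = np.linalg.lstsq(X, quantity, rcond=None)

# the eyeball test, made explicit: do the signs match the theory?
expected_signs = {"price": -1.0, "income": +1.0}
estimated = {"price": np.sign(beta[1]), "income": np.sign(beta[2])}
for var, sign in expected_signs.items():
    status = "OK" if estimated[var] == sign else "WRONG SIGN -- investigate"
    print(f"{var}: expected {sign:+.0f}, got {estimated[var]:+.0f} ({status})")
```

Writing the expected signs down before estimation keeps the check honest.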
Rule 7 – Understand the cost and benefits of data mining
Data mining is inevitable, but the art of the applied econometrician is to allow for data-driven theory while avoiding the dangers inherent in data mining. Don’t worship R-squared or the 5% significance level.
Rule 8 – Be prepared to compromise
Very seldom does one’s problem even come close to satisfying the assumptions under which econometric theory produces optimal solutions; the researcher may be forced to compromise and adopt sub-optimal solutions. In practice, there are no standard problems, only standard solutions.
Rule 9 – Do not confuse statistical significance with meaningful magnitude
Coefficients of trivial magnitude may test significantly different from zero, creating a misleading impression of what is important. Researchers must always look at the magnitude of coefficient estimates as well as their significance. Sanctification through significance testing should be replaced by a continual search for additional evidence, both corroborating and disconfirming.
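A sketch of how a trivial effect becomes ‘significant’ in a large sample (simulated data; the sample size of one million is chosen precisely to manufacture significance):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000
x = rng.normal(size=n)
# a coefficient of trivial magnitude relative to the noise
y = 0.01 * x + rng.normal(size=n)

X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
s2 = resid @ resid / (n - 2)
se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
t = beta[1] / se
print(f"estimate: {beta[1]:.4f}, t-statistic: {t:.1f}")  # emphatically 'significant'
# ...yet x explains almost none of the variation in y
r2 = 1 - (resid @ resid) / np.sum((y - y.mean()) ** 2)
print(f"R-squared: {r2:.5f}")
```

The t-statistic screams; the magnitude whispers. Report both.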
Rule 10 – Report a sensitivity analysis
Because the true data generating process (DGP) is unknown, it is important to check whether the empirical results are sensitive to the assumptions upon which the estimation has been based. To what extent are the substantive results of the research affected by adopting different specifications about which reasonable people might disagree?
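A minimal mechanical sketch of a sensitivity table: re-estimate the coefficient of interest under several defensible specifications and report all of them (simulated data; the specifications and names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 300
x = rng.normal(size=n)             # variable of interest
z = 0.5 * x + rng.normal(size=n)   # a control some researchers would include
y = 1.0 + 2.0 * x + 0.8 * z + rng.normal(size=n)

def ols_coef(y, cols):
    """OLS via least squares; return the coefficient on the first regressor."""
    X = np.column_stack([np.ones(len(y))] + cols)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

specs = {
    "x only": [x],
    "x + z": [x, z],
    "x + z + x^2": [x, z, x ** 2],
}
results = {name: ols_coef(y, cols) for name, cols in specs.items()}
for name, b in results.items():
    print(f"{name:12s} -> coefficient on x: {b:.3f}")
```

Here the short regression overstates the effect of x because it absorbs z; showing the whole row of estimates lets the reader judge the robustness.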
Getting the wrong sign
Suppose you run an a priori favorite specification and discover a ‘wrong’ sign. First step: check the economic theory. Econometric reasons for a ‘wrong’ sign include:
– omitted variables
– high variances: multicollinearity, small sample size, minimal variation in the explanatory variables
– selection bias
– ceteris paribus confusion
– data definition/measurement
– outliers
– interaction terms
– specification error
– simultaneity / lack of identification
– bad instruments
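The omitted-variable case can be sketched in a few lines (a fabricated training/wage example; the DGP deliberately assigns training to low-ability workers so that omitting ability flips the sign):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1000
ability = rng.normal(size=n)
# hypothetical DGP: training is given mostly to low-ability workers
training = -1.0 * ability + rng.normal(size=n)
wage = 1.0 * training + 3.0 * ability + rng.normal(size=n)

def coef_on_first(y, cols):
    """OLS coefficient on the first (non-constant) regressor."""
    X = np.column_stack([np.ones(len(y))] + cols)
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    return b[1]

b_short = coef_on_first(wage, [training])           # ability omitted
b_long = coef_on_first(wage, [training, ability])   # ability controlled for
print(f"omitting ability:        {b_short:+.2f}")   # 'wrong' sign
print(f"controlling for ability: {b_long:+.2f}")    # sign restored
```

The true effect of training is positive, but the omitted confounder drags the short-regression estimate below zero.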
Common Mistakes
– Interpreting a significant DW or heteroskedasticity test as pointing to a need to change the estimation technique; it could instead be something wrong with the specification.
– Thinking that White heteroskedasticity-consistent coefficient estimates are different from the OLS estimates.
– Forgetting interaction and quadratic terms.
– Using a linear functional form when the dependent variable is a fraction.
– Believing that multicollinearity causes bias or invalidates inference.
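The White point deserves a demonstration: the ‘heteroskedasticity-consistent’ correction changes only the standard errors; the coefficient estimates are OLS either way (simulated data with variance growing in x; the HC0 formula below is the basic White sandwich, computed by hand):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 500
x = rng.uniform(0, 10, size=n)
# heteroskedastic errors: dispersion grows with x
y = 1.0 + 0.5 * x + rng.normal(scale=0.1 * x ** 1.5, size=n)

X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # the only coefficient estimate there is
resid = y - X @ beta

XtX_inv = np.linalg.inv(X.T @ X)
# conventional OLS covariance (assumes a common error variance)
se_ols = np.sqrt(np.diag(resid @ resid / (n - 2) * XtX_inv))
# White (HC0) sandwich covariance: X' diag(e^2) X as the 'meat'
meat = X.T @ (X * (resid ** 2)[:, None])
se_hc0 = np.sqrt(np.diag(XtX_inv @ meat @ XtX_inv))

print("beta:   ", beta)       # identical under both covariance estimators
print("OLS se: ", se_ols)
print("HC0 se: ", se_hc0)     # larger here, since variance rises with x
```

Only the inference changes; anyone reporting ‘White estimates’ of the coefficients is reporting OLS.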
– Using an ordered qualitative variable as a regressor, e.g. education coded 1 for elementary school, 2 for high school, 3 for university, etc. This forces the impact of moving from 1 to 2 to be the same as from 2 to 3.
– Measuring forecast success in logit/probit models by the fraction of outcomes predicted correctly.
– Interpreting the LM test for a non-zero variance of ‘random’ intercepts in panel data as a test of random effects versus fixed effects; it is actually a test of whether the intercepts are all equal. A Hausman test is needed to test the appropriateness of the random effects specification.
– Using Tobit in a context in which it is clear that a separate equation should be used to determine the limit observations.
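The ordered-regressor mistake is easy to see in simulation (fabricated education/wage data in which the 2→3 step is deliberately worth far more than the 1→2 step):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 600
educ = rng.integers(1, 4, size=n)  # 1=elementary, 2=high school, 3=university
# hypothetical DGP: the 2->3 step is worth far more than the 1->2 step
wage = 10 + 1.0 * (educ >= 2) + 5.0 * (educ == 3) + rng.normal(size=n)

# wrong: treats education as cardinal, forcing every step to be equal
X_lin = np.column_stack([np.ones(n), educ]).astype(float)
b_lin, *_ = np.linalg.lstsq(X_lin, wage, rcond=None)

# right: one dummy per category (elementary school as the base)
X_dum = np.column_stack([np.ones(n), educ == 2, educ == 3]).astype(float)
b_dum, *_ = np.linalg.lstsq(X_dum, wage, rcond=None)

print(f"linear coding: every step worth {b_lin[1]:.2f}")
print(f"dummies: 1->2 worth {b_dum[1]:.2f}, 2->3 worth {b_dum[2] - b_dum[1]:.2f}")
```

The linear coding averages the two very different steps into one misleading slope; the dummies recover each step separately.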
– Testing for unit roots without a strategy for determining whether a drift or time trend should be included.
– Not understanding selection bias, particularly self-selection bias.
– Forgetting about possible endogeneity in the empirical specification.
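Self-selection bias in particular can be shown in a few lines (a fabricated labor-supply sketch: wages are observed only for people whose offer exceeds their reservation wage, so the observed mean overstates the population mean):

```python
import numpy as np

rng = np.random.default_rng(8)
n = 10_000
offered_wage = rng.normal(loc=20, scale=5, size=n)
reservation = rng.normal(loc=20, scale=5, size=n)
# self-selection: only people offered more than their reservation wage work,
# so wages are observed only for that self-selected group
works = offered_wage > reservation

mean_all = offered_wage.mean()
mean_observed = offered_wage[works].mean()
print(f"mean offered wage, everyone:      {mean_all:.2f}")
print(f"mean wage, observed workers only: {mean_observed:.2f}")  # biased upward
```

Any regression run on the observed workers inherits this bias; recognizing the selection mechanism is the first step toward correcting for it.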
Be sure to know: instrumental variables, mixed estimation, Box–Cox, non-nested testing, bootstrapping, maximum likelihood (ML), ARIMA, VAR, the Heckman two-stage procedure, identification, panel data, and non-stationarity.