K. Ensor, STAT 421, Spring 2005


Estimation of AR models

Assume for now that the mean is 0. Estimate the parameters of the model, including the noise variance, by one of the following:
– Least squares: set up the regression equation, removing any rows that have zeros (see page 38).
– Yule-Walker estimators: set up the regression equation, substituting zero (the mean) for any missing values.
– Burg estimators: balance the two approaches above.
– Maximum likelihood estimator: write the likelihood in terms of the residual process, based on the one-step-ahead predictors given a finite history.
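The least-squares and Yule-Walker approaches can be sketched in a few lines. The following is a minimal illustration (not code from the course), assuming a simulated mean-zero AR(2) series with known coefficients:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a mean-zero AR(2): x_t = 0.6 x_{t-1} - 0.3 x_{t-2} + a_t
n = 2000
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.6 * x[t-1] - 0.3 * x[t-2] + rng.normal()

p = 2

# Least squares: regress x_t on (x_{t-1}, ..., x_{t-p}), dropping the
# first p rows, which lack a complete set of lagged regressors.
X = np.column_stack([x[p-1-j:n-1-j] for j in range(p)])
y = x[p:]
phi_ls, *_ = np.linalg.lstsq(X, y, rcond=None)

# Yule-Walker: solve R phi = r with sample autocovariances,
# where R[i, j] = gamma(|i-j|) and r = (gamma(1), ..., gamma(p)).
gamma = np.array([x[:n-k] @ x[k:] / n for k in range(p + 1)])
R = np.array([[gamma[abs(i-j)] for j in range(p)] for i in range(p)])
phi_yw = np.linalg.solve(R, gamma[1:])

# Noise variance implied by the Yule-Walker fit
sigma2_hat = gamma[0] - phi_yw @ gamma[1:]
```

Both estimators should recover coefficients near (0.6, -0.3) and a noise variance near 1 for a series this long.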

For MA models

Maximum likelihood estimators are nonlinear functions of the data, so estimation requires a likelihood evaluator and a nonlinear optimizer.

Likelihood evaluator:
– Conditional likelihood: obtain the shocks recursively, starting from assumed initial values.
– Exact likelihood: the initial shocks become parameters in the model and are estimated jointly with the other parameters.
  » Kalman filter: an algorithm for the recursive updating of the residuals and residual variance (the basis for the likelihood).

Nonlinear optimizer, with its typical problems:
– Slow: not so much a problem these days.
– May not converge: always a possibility, but should be o.k. for well-posed models.
– Key: keep the number of parameters low.

Other shortcut estimation procedures are available; see Brockwell and Davis or Shumway and Stoffer.
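The conditional-likelihood idea (recover the shocks recursively from an assumed initial value, then minimize their sum of squares) can be sketched for an MA(1). The grid search below is a deliberately crude stand-in for a real nonlinear optimizer:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a mean-zero MA(1): x_t = a_t + theta * a_{t-1}
n, theta_true = 500, 0.5
a = rng.normal(size=n)
x = a + theta_true * np.concatenate([[0.0], a[:-1]])

def css(theta, x):
    """Conditional sum of squares: recover the shocks recursively,
    conditioning on the initial shock being zero, via
    a_t = x_t - theta * a_{t-1}."""
    a_hat = np.zeros_like(x)
    for t in range(len(x)):
        a_hat[t] = x[t] - theta * (a_hat[t-1] if t > 0 else 0.0)
    return np.sum(a_hat ** 2)

# One-dimensional grid search in place of a nonlinear optimizer.
grid = np.linspace(-0.95, 0.95, 381)
theta_hat = grid[np.argmin([css(th, x) for th in grid])]
```

Minimizing the conditional sum of squares is equivalent to maximizing the conditional Gaussian likelihood over theta, so theta_hat should land near the true value 0.5.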

For ARMA models

Maximum likelihood: the same comments as for MA models apply. Keep the number of parameters low; this is usually the case for an appropriately specified ARMA model. The S-Plus function arima.mle obtains parameter estimates through exact maximum likelihood.
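To make the "keep the number of parameters low" point concrete, here is a sketch of conditional sum-of-squares estimation for an ARMA(1,1) (coefficients and series are simulated, and a coarse grid search again stands in for the optimizer; note that even two parameters already force a two-dimensional search):

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate ARMA(1,1): x_t = 0.7 x_{t-1} + a_t + 0.3 a_{t-1}
n = 500
a = rng.normal(size=n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.7 * x[t-1] + a[t] + 0.3 * a[t-1]

def css(phi, theta, x):
    """Conditional sum of squares: recover the shocks recursively,
    conditioning on x_0 and an initial shock a_0 = 0."""
    a_hat = np.zeros_like(x)
    for t in range(1, len(x)):
        a_hat[t] = x[t] - phi * x[t-1] - theta * a_hat[t-1]
    return np.sum(a_hat[1:] ** 2)

# Coarse two-dimensional grid search over (phi, theta).
grid = np.linspace(-0.9, 0.9, 37)
phi_hat, theta_hat = min(((p, q) for p in grid for q in grid),
                         key=lambda pq: css(pq[0], pq[1], x))
```

The grid has 37^2 = 1369 points; adding a third parameter would cube the cost, which is why practical routines use proper nonlinear optimizers and why a low parameter count keeps them well behaved.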

Forecasting – AR (page 39)

Choose the forecast that minimizes the expected mean squared error. Notation: r_h(l) denotes the l-step-ahead forecast at forecast origin h.

The function that minimizes the expected mean squared error is the conditional expectation of the future value given the observed history. For linear models this is a linear function: the AR equation itself, with future values replaced by their forecasts. See pages 39-41 for the relevant equations.

Note:
– For a stationary AR(p) model, the l-step-ahead forecast converges to the mean as l goes to infinity (mean reversion).
– The variance of the forecast error approaches the unconditional variance of the process.
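The recursive forecast and its mean reversion can be illustrated with a short sketch (the coefficients, mean, and history here are hypothetical, not from the text):

```python
import numpy as np

# Stationary AR(2) with mean mu: iterate the AR equation, replacing
# future values by their own forecasts.
phi = np.array([0.6, -0.3])
mu = 10.0
history = np.array([10.8, 11.5])   # x_{h-1}, x_h (hypothetical data)

def forecast(history, phi, mu, steps):
    vals = list(history - mu)       # work with the mean-centered series
    for _ in range(steps):
        vals.append(phi[0] * vals[-1] + phi[1] * vals[-2])
    return np.array(vals[len(history):]) + mu

f = forecast(history, phi, mu, 50)
# Mean reversion: as the horizon l grows, the forecast converges to mu.
```

The one-step forecast is 10 + 0.6(1.5) - 0.3(0.8) = 10.66, and by l = 50 the forecast is indistinguishable from the mean of 10.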

Forecasting – MA (page 47) and ARMA

For MA models, find the forecasts recursively:
– After lag q, the forecast is simply the unconditional mean, and the forecast error variance is the unconditional variance.
– Prior to lag q, find the forecasts and forecast errors recursively.

For ARMA models, the behavior is a mixture of the AR and MA behavior:
– Find the forecasts recursively, substituting the observed series when available, predicted values when not, and zero when predicted values are unavailable.

Refer to the MA representation of an ARMA model (page 55):
– The l-step-ahead forecast is a linear function of the shocks.
– The forecast error is a linear function of l shocks.
– Thus the variance of the forecast error is given by equation 2.31 on page 55.
– Note what happens as l goes to infinity: the forecast error variance converges to the unconditional variance of the process.
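The forecast-error variance from the MA representation can be sketched for an ARMA(1,1) with hypothetical coefficients (the psi-weight sum below is the generic formula in the spirit of equation 2.31, not a reproduction of it):

```python
# ARMA(1,1): x_t = phi x_{t-1} + a_t + theta a_{t-1}, so its MA
# representation has psi_0 = 1 and psi_j = phi^{j-1} (phi + theta)
# for j >= 1.  The l-step forecast error is sum_{j<l} psi_j a_{h+l-j},
# so its variance is sigma2 * sum_{j=0}^{l-1} psi_j^2.
phi, theta, sigma2 = 0.7, 0.3, 1.0

def psi(j):
    return 1.0 if j == 0 else phi ** (j - 1) * (phi + theta)

def forecast_error_var(l):
    return sigma2 * sum(psi(j) ** 2 for j in range(l))

# Limit as l -> infinity: the unconditional variance of the process,
# sigma2 * (1 + (phi + theta)^2 / (1 - phi^2)).
var_x = sigma2 * (1 + (phi + theta) ** 2 / (1 - phi ** 2))
```

The variance grows with the horizon (each extra step adds one more squared psi-weight) and, for a stationary model, converges to the unconditional variance, matching the note above.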