Statistics 153 Review - Sept 30, 2008


Statistics 153 Review - Sept 30, 2008

Notes re the 153 Midterm on October 2, 2008
1. The class will be split into two groups by first letter of surname. Letters A to L will take the exam in 330 Evans. Letters M to Z will take it in 340 Evans.
2. The exam will cover material through Chapter 4.
3. There will be 2 questions; answer both.
4. The questions will be like the Assignment questions, but the exam will be closed book - no books or notes allowed.
5. No questions to the Proctors about the exam content, please. If unsure, make an interpretation, state it, and answer that.
6. You will have exactly 60 minutes to work.
7. The exam itself will be handed out. There will be space on it to answer the questions.
8. The solutions to Assignment 3 will be posted in the glass case, center corridor, third floor Evans, Tuesday after class, but the papers won't have been graded yet.
9. I will do a review in class September 30. Suggest some topics.

Name: ________________________  October 2, 2008
MIDTERM EXAMINATION
Statistics 153  D. R. Brillinger

Answer both questions in the space provided. Show your work. If you are not sure of the meaning of a question, set down an interpretation and provide a reasonable answer. You have exactly 60 minutes for the exam.

Question 1. Let {Z_t} be a purely random process.

What is a time series? A sequence of numbers, x_t, indexed by time t.

Stat 153 - 11 Sept 2008, D. R. Brillinger
Simple descriptive techniques
Trend: X_t = α + βt + ε_t
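As an illustrative sketch (Python/numpy here, not the course's own code), the trend parameters α and β can be estimated by least squares on simulated data:

```python
import numpy as np

# Simulate X_t = alpha + beta*t + eps_t  (alpha, beta chosen for illustration)
rng = np.random.default_rng(0)
N = 200
t = np.arange(N)
alpha, beta = 2.0, 0.5
x = alpha + beta * t + rng.normal(0.0, 1.0, N)

# Least-squares fit of the linear trend: regress x_t on (1, t)
A = np.column_stack([np.ones(N), t])
(alpha_hat, beta_hat), *_ = np.linalg.lstsq(A, x, rcond=None)

# Subtracting the fitted trend leaves the residual series eps_hat_t
detrended = x - (alpha_hat + beta_hat * t)
```

The detrended residuals are what the later stationary-model machinery (ACF, AR/MA fitting) is applied to.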

Filtering/filters
y_t = Σ_{r=-q}^{s} a_r x_{t+r}
y_t = Σ_k h_k x_{t-k}   (p. 189)
This form carries a stationary series into a stationary series. Filters may be applied in series; stationarity is preserved if each filter is time invariant.
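A minimal sketch of a symmetric moving-average filter (weights a_r = 1/3 for r in {-1, 0, 1}), implemented with convolution:

```python
import numpy as np

x = np.arange(10, dtype=float)  # example input series x_0..x_9

# Symmetric moving-average filter: y_t = (x_{t-1} + x_t + x_{t+1}) / 3
a = np.ones(3) / 3.0
y = np.convolve(x, a, mode="valid")  # "valid" drops the ends where the window overruns
```

Applied to a linear sequence, the filter reproduces the interior values exactly, illustrating that a moving average passes a linear trend through unchanged.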

Differencing
y_t = x_t - x_{t-1} = ∇x_t   "removes" a linear trend
Seasonal variation model: X_t = m_t + S_t + ε_t, with S_t ≈ S_{t-s}
∇_{12} x_t = x_t - x_{t-12}, t in months
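A short sketch of ordinary and seasonal differencing, using a noiseless linear trend so the effect is exact:

```python
import numpy as np

t = np.arange(48, dtype=float)
x = 2.0 + 0.5 * t  # pure linear trend, no noise

d1 = np.diff(x)          # first difference: y_t = x_t - x_{t-1}
d12 = x[12:] - x[:-12]   # seasonal difference at lag 12 (monthly data)
```

The first difference of the linear trend is constant (0.5 everywhere), confirming that ∇ reduces a linear trend to a constant; the lag-12 difference is likewise constant at 12 × 0.5 = 6.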

Stationary case. Autocorrelation estimate at lag k:
r_k = Σ_{t=1}^{N-k} (x_t - x̄)(x_{t+k} - x̄) / Σ_{t=1}^{N} (x_t - x̄)²
Autocovariance estimate at lag k:
c_k = Σ_{t=1}^{N-k} (x_t - x̄)(x_{t+k} - x̄) / N
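These estimates translate directly into code; the sketch below (names are illustrative) computes r_k = c_k / c_0 for a purely random series, whose true autocorrelations are zero at every nonzero lag:

```python
import numpy as np

def sample_acf(x, max_lag):
    """Sample autocorrelations r_k = c_k / c_0 for k = 0..max_lag."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    xc = x - x.mean()
    # c_k = sum_{t=1}^{N-k} (x_t - xbar)(x_{t+k} - xbar) / N
    c = np.array([np.sum(xc[: N - k] * xc[k:]) / N for k in range(max_lag + 1)])
    return c / c[0]

rng = np.random.default_rng(1)
z = rng.normal(size=1000)  # purely random process
r = sample_acf(z, 5)       # r[0] = 1; r[k] should be near 0 for k >= 1
```

For white noise the r_k at nonzero lags should fall mostly inside ±2/√N, which connects to the interpretation bands discussed later.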

Stat 153 - 16 Sept 2008, D. R. Brillinger
Chapter 3: mean function, variance function, autocovariance function

Strictly stationary: all joint distributions are unaffected by a simple time shift.
Second-order stationary: the mean is constant and the autocovariance depends only on the lag.

Properties of the autocovariance function. It does not identify the model uniquely.

Useful models and their acf's. The purely random process: the basic building block.

Random walk: X_t = X_{t-1} + Z_t - not stationary (the variance grows with t).

*

Moving average, MA(q): X_t = Z_t + θ_1 Z_{t-1} + ... + θ_q Z_{t-q}. From *, stationary.
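A sketch checking a standard MA(1) fact numerically: the lag-1 autocorrelation is θ/(1 + θ²) and the acf cuts off (is zero) beyond lag q = 1. The simulation setup is illustrative:

```python
import numpy as np

# Simulate an MA(1): X_t = Z_t + theta * Z_{t-1}
rng = np.random.default_rng(2)
theta = 0.8
z = rng.normal(size=5001)
x = z[1:] + theta * z[:-1]

# Sample autocorrelations at lags 1 and 2
xc = x - x.mean()
denom = np.sum(xc**2)
r1 = np.sum(xc[:-1] * xc[1:]) / denom
r2 = np.sum(xc[:-2] * xc[2:]) / denom

rho1 = theta / (1 + theta**2)  # theoretical lag-1 autocorrelation
```

r1 should sit near ρ_1 ≈ 0.488 while r2 hovers near zero - the cut-off in the SAC that identifies an MA(q).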

Backward shift operator: B X_t = X_{t-1}.
Linear process: X_t = Σ_j ψ_j Z_{t-j}; need a convergence condition (Σ_j ψ_j² < ∞). Stationary.

Autoregressive process, AR(p). First-order, AR(1): Markov. * Linear process. For convergence/stationarity, the roots of φ(z) = 0 must lie in |z| > 1.
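An illustrative AR(1) sketch: with φ = 0.6 the root of φ(z) = 1 - φz is z = 1/φ > 1, so the process is stationary, and the sample lag-1 autocorrelation should come out near φ:

```python
import numpy as np

# Simulate a stationary AR(1): X_t = phi * X_{t-1} + Z_t, |phi| < 1
rng = np.random.default_rng(3)
phi = 0.6
N = 5000
z = rng.normal(size=N)
x = np.zeros(N)
for t in range(1, N):
    x[t] = phi * x[t - 1] + z[t]

# Stationarity check: phi(z) = 1 - phi*z has its root at z = 1/phi
root = 1.0 / phi  # |root| > 1, so the process is stationary

# Sample lag-1 autocorrelation, which estimates rho_1 = phi
xc = x - x.mean()
r1 = np.sum(xc[:-1] * xc[1:]) / np.sum(xc**2)
```

For an AR(1) the theoretical acf is ρ_k = φ^k, decaying geometrically, which is the "spikes clearly decreasing in the SAC" pattern used at the identification step.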

a.c.f.: from *. p.a.c.f.: vanishes for lags k > p.

In the general case, ... Very useful for prediction.

ARMA(p,q). Roots of φ(z) = 0 in |z| > 1 for stationarity. Roots of θ(z) = 0 in |z| > 1 for invertibility.
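A small sketch of checking these root conditions numerically for a hypothetical ARMA(2,1) with φ(z) = 1 - 0.5z - 0.3z² and θ(z) = 1 + 0.4z (coefficients chosen for illustration):

```python
import numpy as np

# np.roots takes polynomial coefficients from highest power down:
# phi(z) = -0.3 z^2 - 0.5 z + 1,  theta(z) = 0.4 z + 1
phi_roots = np.roots([-0.3, -0.5, 1.0])
theta_roots = np.roots([0.4, 1.0])

stationary = bool(np.all(np.abs(phi_roots) > 1))  # all AR roots outside unit circle
invertible = bool(np.all(np.abs(theta_roots) > 1))  # all MA roots outside unit circle
```

Here the AR roots are roughly -2.84 and 1.17 and the MA root is -2.5, so this model is both stationary and invertible.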

ARIMA(p,d,q): the d-th difference ∇^d X_t follows a stationary ARMA(p,q).

Yule-Walker equations for AR(p). Correlate, with X_{t-k}, each side of the defining equation: ρ_k = φ_1 ρ_{k-1} + ... + φ_p ρ_{k-p} for k > 0. For AR(1): ρ_k = φ^k.
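For an AR(2) the first two Yule-Walker equations form a 2×2 linear system in (φ_1, φ_2); replacing the ρ_k by sample r_k and solving gives the Yule-Walker estimates. An illustrative sketch on simulated data:

```python
import numpy as np

# Simulate a stationary AR(2): X_t = 0.5 X_{t-1} + 0.3 X_{t-2} + Z_t
rng = np.random.default_rng(4)
phi1, phi2 = 0.5, 0.3
N = 10000
z = rng.normal(size=N)
x = np.zeros(N)
for t in range(2, N):
    x[t] = phi1 * x[t - 1] + phi2 * x[t - 2] + z[t]

# Sample autocorrelations r1, r2
xc = x - x.mean()
c0 = np.sum(xc * xc)
r1 = np.sum(xc[:-1] * xc[1:]) / c0
r2 = np.sum(xc[:-2] * xc[2:]) / c0

# Yule-Walker system:  [1  r1] [phi1]   [r1]
#                      [r1  1] [phi2] = [r2]
R = np.array([[1.0, r1], [r1, 1.0]])
phi_hat = np.linalg.solve(R, np.array([r1, r2]))
```

With N = 10000 the estimates land close to the true (0.5, 0.3).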

Stat 153 - 23 Sept 2008, D. R. Brillinger
Chapter 4 - Fitting t.s. models in the time domain: the sample autocovariance coefficient. Under stationarity, ...

Estimated autocorrelation coefficient: asymptotically normal. Interpretation: for a purely random series, r_k is approximately N(0, 1/N), so values outside ±2/√N are notable.

Estimating the mean. Var(x̄) can be bigger or smaller than σ²/N.

Fitting an autoregressive model, AR(p): easy. Remember regression and least squares - the normal equations.
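The regression view can be sketched directly: treat x_t as the response and its own lagged values as the regressors, and solve the least-squares problem. An illustrative AR(2) example (coefficients chosen for the simulation):

```python
import numpy as np

# Simulate a stationary AR(2): X_t = 0.4 X_{t-1} - 0.2 X_{t-2} + Z_t
rng = np.random.default_rng(5)
phi1, phi2 = 0.4, -0.2
N = 10000
z = rng.normal(size=N)
x = np.zeros(N)
for t in range(2, N):
    x[t] = phi1 * x[t - 1] + phi2 * x[t - 2] + z[t]

# Least squares: regress x_t on (x_{t-1}, x_{t-2})
y = x[2:]
X = np.column_stack([x[1:-1], x[:-2]])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
```

For large N the least-squares and Yule-Walker estimates are essentially the same; here both should recover (0.4, -0.2) closely.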

AR(1) Cp.

Seasonal ARIMA: seasonal parameter s. SARIMA(p,d,q)×(P,D,Q)_s. Example.

Residual analysis. Paradigm: observation = fitted value + residual. The parametric models have contained Z_t; the residuals estimate the Z_t and should resemble a purely random series if the model fits.

Portmanteau lack-of-fit statistic: Q = N Σ_{k=1}^{K} r_k², where the r_k are residual autocorrelations, referred to a χ² distribution with K - p - q degrees of freedom. Is the ARMA(p,q) appropriate?
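A sketch of the Box-Pierce form of the portmanteau statistic applied to a purely random series (so the null holds and Q should look like a χ²_K draw; the sample size and K here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(6)
N, K = 2000, 10
z = rng.normal(size=N)  # purely random series: the null model is correct

# Residual autocorrelations r_1..r_K
zc = z - z.mean()
c0 = np.sum(zc * zc)
r = np.array([np.sum(zc[:-k] * zc[k:]) / c0 for k in range(1, K + 1)])

# Box-Pierce portmanteau statistic: Q = N * sum of squared autocorrelations
Q = N * np.sum(r**2)
```

Under the null, Q is approximately χ² with K degrees of freedom (K - p - q when computed from the residuals of a fitted ARMA(p,q)); a large Q signals lack of fit.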