Introduction to Time Series Analysis. Lecture 11.


Introduction to Time Series Analysis. Lecture 11. Peter Bartlett

1. Review: Time series modelling and forecasting
2. Parameter estimation
3. Maximum likelihood estimator
4. Yule-Walker estimation
5. Yule-Walker estimation: example

Review (Lecture 1): Time series modelling and forecasting

1. Plot the time series. Look for trends, seasonal components, step changes, outliers.
2. Transform data so that residuals are stationary.
   (a) Remove trend and seasonal components.
   (b) Differencing.
   (c) Nonlinear transformations (log, ...).
3. Fit model to residuals.
4. Forecast time series by forecasting residuals and inverting any transformations.
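As an illustration (not part of the original slides), here is a minimal sketch of this workflow in Python, assuming a pandas Series y of raw observations; the log transform, single difference, and ARMA(1,1) order are placeholder choices, not prescriptions:

import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# y: raw series (assumed). Log-transform to stabilize variance,
# difference once to remove a linear trend.
x = np.log(y).diff().dropna()

# Fit a stationary ARMA model to the transformed series.
model = ARIMA(x, order=(1, 0, 1)).fit()

# Forecast the transformed series, then invert the differencing
# (cumulative sum) and the log transform (exp).
fc = model.forecast(steps=10)
last_log = np.log(y).iloc[-1]
y_fc = np.exp(last_log + fc.cumsum())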

Review: Time series modelling and forecasting

Stationary time series models: ARMA(p,q). p = 0: MA(q); q = 0: AR(p).

We have seen that any causal, invertible linear process has an MA($\infty$) representation (from causality) and an AR($\infty$) representation (from invertibility).

Real data cannot be exactly modelled using a finite number of parameters. We choose p, q to give a simple but accurate model.

Review: Time series modelling and forecasting

How do we use data to decide on p, q?

1. Use sample ACF/PACF to make preliminary choices of model order.
2. Estimate parameters for each of these choices.
3. Compare predictive accuracy/complexity of each (using, e.g., AIC), as in the sketch below.

NB: We need to compute parameter estimates for several different model orders. Thus, recursive algorithms for parameter estimation are important. We'll see that some of these are identical to the recursive algorithms for forecasting.
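A minimal sketch of step 3, assuming statsmodels and a stationary zero-mean series x; the order grid (p, q up to 2) and the AIC criterion are illustrative choices:

import itertools
from statsmodels.tsa.arima.model import ARIMA

best = None
for p, q in itertools.product(range(3), range(3)):
    # trend="n" fits a zero-mean ARMA(p, q) model.
    fit = ARIMA(x, order=(p, 0, q), trend="n").fit()
    if best is None or fit.aic < best[0]:
        best = (fit.aic, p, q)
print("chosen (p, q):", best[1:], "AIC:", best[0])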

Review: Time series modelling and forecasting

Model      | ACF            | PACF
-----------|----------------|----------------
AR(p)      | decays         | zero for h > p
MA(q)      | zero for h > q | decays
ARMA(p,q)  | decays         | decays
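As an aside (not part of the original slide), the sample versions of these diagnostics are easy to compute; a minimal sketch using statsmodels, assuming x holds the observed stationary series:

from statsmodels.tsa.stattools import acf, pacf

sample_acf = acf(x, nlags=20)    # decays for AR and ARMA; cuts off after lag q for MA(q)
sample_pacf = pacf(x, nlags=20)  # cuts off after lag p for AR(p); decays for MA and ARMA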

Introduction to Time Series Analysis. Lecture 11.

1. Review: Time series modelling and forecasting
2. Parameter estimation
3. Maximum likelihood estimator
4. Yule-Walker estimation
5. Yule-Walker estimation: example

Parameter estimation

We want to estimate the parameters of an ARMA(p,q) model. We will assume (for now) that:

1. The model order (p and q) is known, and
2. The data has zero mean.

If (2) is not a reasonable assumption, we can subtract the sample mean $\bar{y}$, fit a zero-mean ARMA model, $\phi(B)X_t = \theta(B)W_t$, to the mean-corrected time series $X_t = Y_t - \bar{y}$, and then use $X_t + \bar{y}$ as the model for $Y_t$.
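A sketch of this mean-correction recipe; fit_zero_mean_arma is a hypothetical placeholder for whatever zero-mean fitting routine is used (for instance, one of the estimators discussed below):

import numpy as np

ybar = np.mean(y)   # sample mean of the observed series Y_t
x = y - ybar        # mean-corrected series X_t = Y_t - ybar

# fit_zero_mean_arma is a hypothetical placeholder, not a real API.
phi, theta, sigma2 = fit_zero_mean_arma(x, p, q)

# The model for Y_t is X_t + ybar, so forecasts of Y are
# forecasts of X with ybar added back.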

Parameter estimation: Maximum likelihood estimator

One approach: Assume that $\{X_t\}$ is Gaussian, that is, $\phi(B)X_t = \theta(B)W_t$, where $W_t$ is i.i.d. Gaussian. Choose $\phi_i$, $\theta_j$ to maximize the likelihood:
$$L(\phi, \theta, \sigma^2) = f_{\phi,\theta,\sigma^2}(X_1, \ldots, X_n),$$
where $f_{\phi,\theta,\sigma^2}$ is the joint (Gaussian) density for the given ARMA model. (c.f. choosing the parameters that maximize the probability of the data.)

Maximum likelihood estimation

Suppose that $X_1, X_2, \ldots, X_n$ is drawn from a zero-mean Gaussian ARMA(p,q) process. The likelihood of parameters $\phi \in \mathbb{R}^p$, $\theta \in \mathbb{R}^q$, $\sigma_w^2 \in \mathbb{R}^+$ is defined as the density of $X = (X_1, X_2, \ldots, X_n)'$ under the Gaussian model with those parameters:
$$L(\phi, \theta, \sigma_w^2) = \frac{1}{(2\pi)^{n/2} |\Gamma_n|^{1/2}} \exp\left(-\frac{1}{2} X' \Gamma_n^{-1} X\right),$$
where $|A|$ denotes the determinant of a matrix $A$, and $\Gamma_n$ is the variance/covariance matrix of $X$ with the given parameter values. The maximum likelihood estimator (MLE) of $\phi, \theta, \sigma_w^2$ maximizes this quantity.
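As a sketch (not from the lecture), this likelihood can be evaluated on the log scale by building $\Gamma_n$ from the model autocovariances; it assumes statsmodels' arma_acovf, whose lag-polynomial arguments include the leading one, with the AR coefficients negated:

import numpy as np
from scipy.linalg import toeplitz
from statsmodels.tsa.arima_process import arma_acovf

def gaussian_loglik(x, phi, theta, sigma2):
    n = len(x)
    # Autocovariances gamma(0), ..., gamma(n-1) of the ARMA model.
    gamma = arma_acovf(np.r_[1, -np.asarray(phi)], np.r_[1, np.asarray(theta)],
                       nobs=n, sigma2=sigma2)
    Gamma_n = toeplitz(gamma)                  # variance/covariance matrix of X
    sign, logdet = np.linalg.slogdet(Gamma_n)  # log |Gamma_n|, numerically stable
    quad = x @ np.linalg.solve(Gamma_n, x)     # X' Gamma_n^{-1} X
    return -0.5 * (n * np.log(2 * np.pi) + logdet + quad)

Maximizing this function over (phi, theta, sigma2), e.g. with a numerical optimizer, gives the MLE.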

Parameter estimation: Maximum likelihood estimator

Advantages of MLE:
- Efficient (low variance estimates).
- Often the Gaussian assumption is reasonable. Even if $\{X_t\}$ is not Gaussian, the asymptotic distribution of the estimates $(\hat\phi, \hat\theta, \hat\sigma^2)$ is the same as in the Gaussian case.

Disadvantages of MLE:
- Difficult optimization problem.
- Need to choose a good starting point (often use other estimators for this).

Preliminary parameter estimates

Yule-Walker for AR(p): Regress $X_t$ onto $X_{t-1}, \ldots, X_{t-p}$. Durbin-Levinson algorithm with $\gamma$ replaced by $\hat\gamma$.

Yule-Walker for ARMA(p,q): Method of moments. Not efficient.

Innovations algorithm for MA(q): with $\gamma$ replaced by $\hat\gamma$.

Hannan-Rissanen algorithm for ARMA(p,q) (see the sketch below):
1. Estimate a high-order AR.
2. Use it to estimate the (unobserved) noise $W_t$.
3. Regress $X_t$ onto $X_{t-1}, \ldots, X_{t-p}, \hat{W}_{t-1}, \ldots, \hat{W}_{t-q}$.
4. Regress again with improved estimates of $W_t$.
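A rough sketch of steps 1-3 of Hannan-Rissanen with ordinary least squares; the high AR order m = 20 is an arbitrary illustrative choice, and the code assumes p, q >= 1, m > p, and n > m + q:

import numpy as np

def hannan_rissanen(x, p, q, m=20):
    n = len(x)
    # Step 1: fit a high-order AR(m) by least squares.
    Z = np.column_stack([x[m - j:n - j] for j in range(1, m + 1)])
    a = np.linalg.lstsq(Z, x[m:], rcond=None)[0]
    # Step 2: use it to estimate the unobserved noise W_t.
    w_hat = x[m:] - Z @ a
    # Step 3: regress X_t on X_{t-1},...,X_{t-p} and W_{t-1},...,W_{t-q}.
    start = m + q
    X_lags = np.column_stack([x[start - j:n - j] for j in range(1, p + 1)])
    W_lags = np.column_stack([w_hat[q - j:len(w_hat) - j] for j in range(1, q + 1)])
    coef = np.linalg.lstsq(np.hstack([X_lags, W_lags]), x[start:], rcond=None)[0]
    return coef[:p], coef[p:]   # (phi_hat, theta_hat)

Step 4 would repeat the regression with noise estimates recomputed from the fitted ARMA model.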

Yule-Walker estimation

For a causal AR(p) model $\phi(B)X_t = W_t$, we have
$$E\left[X_{t-i}\left(X_t - \sum_{j=1}^{p} \phi_j X_{t-j}\right)\right] = E(X_{t-i} W_t) \quad \text{for } i = 0, \ldots, p,$$
which is equivalent to
$$\gamma(0) - \phi' \gamma_p = \sigma^2 \quad \text{and} \quad \gamma_p - \Gamma_p \phi = 0,$$
where $\phi = (\phi_1, \ldots, \phi_p)'$, and we've used the causal representation
$$X_t = W_t + \sum_{j=1}^{\infty} \psi_j W_{t-j}$$
to calculate the values $E(X_{t-i} W_t)$.

Yule-Walker estimation

Method of moments: We choose parameters for which the moments are equal to the empirical moments. In this case, we choose $\phi$ so that $\gamma = \hat\gamma$.

Yule-Walker equations for $\hat\phi$:
$$\hat\Gamma_p \hat\phi = \hat\gamma_p, \qquad \hat\sigma^2 = \hat\gamma(0) - \hat\phi' \hat\gamma_p.$$

These are the forecasting equations. We can use the Durbin-Levinson algorithm.
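A minimal sketch of these equations in Python, using scipy's Toeplitz solver in place of the Durbin-Levinson recursion (x is assumed to be the zero-mean observed series):

import numpy as np
from scipy.linalg import solve_toeplitz

def yule_walker(x, p):
    n = len(x)
    # Sample autocovariances gamma_hat(0), ..., gamma_hat(p).
    gamma = np.array([np.sum(x[:n - h] * x[h:]) / n for h in range(p + 1)])
    # Solve the Toeplitz system Gamma_hat_p * phi_hat = gamma_hat_p.
    phi_hat = solve_toeplitz(gamma[:p], gamma[1:])
    sigma2_hat = gamma[0] - phi_hat @ gamma[1:]
    return gamma, phi_hat, sigma2_hat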

Some facts about Yule-Walker estimation

If $\hat\gamma(0) > 0$, then $\hat\Gamma_m$ is nonsingular. In that case, $\hat\phi = \hat\Gamma_p^{-1} \hat\gamma_p$ defines the causal model
$$X_t - \hat\phi_1 X_{t-1} - \cdots - \hat\phi_p X_{t-p} = W_t, \qquad \{W_t\} \sim WN(0, \hat\sigma^2).$$

If $\{X_t\}$ is an AR(p) process, then
$$\hat\phi \sim AN\left(\phi, \frac{\sigma^2}{n} \Gamma_p^{-1}\right), \qquad \hat\sigma^2 \xrightarrow{P} \sigma^2, \qquad \hat\phi_{hh} \sim AN\left(0, \frac{1}{n}\right) \text{ for } h > p.$$

Thus, we can use the sample PACF to test for AR order, and we can calculate approximate confidence intervals for the parameters $\phi$.
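Since $\hat\phi_{hh} \sim AN(0, 1/n)$ for $h > p$, sample PACF values outside $\pm 1.96/\sqrt{n}$ point to the AR order; a sketch, assuming statsmodels:

import numpy as np
from statsmodels.tsa.stattools import pacf

sample_pacf = pacf(x, nlags=20, method="ywm")  # Yule-Walker estimates of phi_hh
bound = 1.96 / np.sqrt(len(x))                 # approximate 95% bound under AN(0, 1/n)
significant = np.where(np.abs(sample_pacf[1:]) > bound)[0] + 1
print("lags with PACF outside +/-1.96/sqrt(n):", significant)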

Yule-Walker estimation: Confidence intervals

If $\{X_t\}$ is an AR(p) process and $n$ is large, then $\sqrt{n}(\hat\phi_p - \phi_p)$ is approximately $N(0, \hat\sigma^2 \hat\Gamma_p^{-1})$, so with probability $\approx 1-\alpha$, $\phi_{pj}$ is in the interval
$$\hat\phi_{pj} \pm \Phi_{1-\alpha/2} \frac{\hat\sigma}{\sqrt{n}} \left(\hat\Gamma_p^{-1}\right)_{jj}^{1/2},$$
where $\Phi_{1-\alpha/2}$ is the $1-\alpha/2$ quantile of the standard normal distribution.
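A sketch of these intervals, reusing gamma, phi_hat, and sigma2_hat from the yule_walker sketch above; alpha = 0.05 is an illustrative choice:

import numpy as np
from scipy.linalg import toeplitz
from scipy.stats import norm

n = len(x)
Gamma_inv = np.linalg.inv(toeplitz(gamma[:p]))          # Gamma_hat_p^{-1}
se = np.sqrt(sigma2_hat / n) * np.sqrt(np.diag(Gamma_inv))
z = norm.ppf(0.975)                                     # Phi_{1-alpha/2} for alpha = 0.05
ci = np.column_stack([phi_hat - z * se, phi_hat + z * se])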

Yule-Walker estimation: Confidence intervals

With probability $\approx 1-\alpha$, $\phi_p$ is in the ellipsoid
$$\left\{ \phi \in \mathbb{R}^p : (\hat\phi_p - \phi)' \hat\Gamma_p (\hat\phi_p - \phi) \le \frac{\hat\sigma^2}{n} \chi^2_{1-\alpha}(p) \right\},$$
where $\chi^2_{1-\alpha}(p)$ is the $(1-\alpha)$ quantile of the chi-squared distribution with $p$ degrees of freedom.

To see this, notice that
$$\mathrm{Var}\left(\Gamma_p^{1/2}(\hat\phi_p - \phi_p)\right) = \Gamma_p^{1/2}\,\mathrm{Var}(\hat\phi_p - \phi_p)\,\Gamma_p^{1/2} = \frac{\sigma_w^2}{n} I.$$
Thus, $v = \Gamma_p^{1/2}(\hat\phi_p - \phi_p) \sim N\left(0, \frac{\hat\sigma_w^2}{n} I\right)$, and so $\frac{n}{\hat\sigma_w^2} v'v \sim \chi^2(p)$.
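A sketch of checking whether a hypothesized coefficient vector phi0 falls in this joint region, again reusing gamma, phi_hat, and sigma2_hat from the yule_walker sketch:

import numpy as np
from scipy.linalg import toeplitz
from scipy.stats import chi2

Gamma_hat = toeplitz(gamma[:p])                 # Gamma_hat_p
d = phi_hat - phi0
stat = n * (d @ Gamma_hat @ d) / sigma2_hat     # (n/sigma2_hat) * quadratic form
in_region = stat <= chi2.ppf(1 - 0.05, df=p)    # alpha = 0.05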

Introduction to Time Series Analysis. Lecture 11.

1. Review: Time series modelling and forecasting
2. Parameter estimation
3. Maximum likelihood estimator
4. Yule-Walker estimation
5. Yule-Walker estimation: example