Università di Pavia. Forecasting. Eduardo Rossi


Mean Squared Error

Forecast of $Y_{t+1}$ based on a set of variables observed at date $t$, $X_t$: $Y_{t+1|t}$. The loss function is the mean squared error
$$MSE(Y_{t+1|t}) = E[Y_{t+1} - Y_{t+1|t}]^2$$
The forecast with the smallest MSE is the conditional expectation
$$Y_{t+1|t} = E[Y_{t+1}|X_t]$$
Suppose $Y_{t+1|t}$ is restricted to be a linear function of $X_t$: $\hat{Y}_{t+1|t} = \alpha' X_t$. If
$$E[(Y_{t+1} - \alpha' X_t)X_t'] = 0$$
then $\alpha' X_t$ is the linear projection of $Y_{t+1}$ on $X_t$.

Linear Projection

The linear projection $\hat{P}(Y_{t+1}|X_t) = \alpha' X_t$ produces the smallest MSE among the class of linear forecasting rules:
$$MSE[\hat{P}(Y_{t+1}|X_t)] \geq MSE[E(Y_{t+1}|X_t)]$$
The orthogonality condition pins down $\alpha$:
$$E[(Y_{t+1} - \alpha' X_t)X_t'] = 0$$
$$E[Y_{t+1} X_t'] = \alpha' E[X_t X_t']$$
$$\alpha' = E[Y_{t+1} X_t']\, E[X_t X_t']^{-1}$$

Properties of Linear Projection

The MSE associated with a linear projection is
$$E[(Y_{t+1} - \alpha' X_t)^2] = E[(Y_{t+1})^2] - 2E(\alpha' X_t Y_{t+1}) + E(\alpha' X_t X_t' \alpha)$$
Replacing $\alpha$:
$$E[(Y_{t+1} - \alpha' X_t)^2] = E[(Y_{t+1})^2] - 2E(Y_{t+1}X_t')[E(X_t X_t')]^{-1}E(X_t Y_{t+1}) + E(Y_{t+1}X_t')[E(X_t X_t')]^{-1}[E(X_t X_t')][E(X_t X_t')]^{-1}E(X_t Y_{t+1})$$
$$E[(Y_{t+1} - \alpha' X_t)^2] = E[(Y_{t+1})^2] - E(Y_{t+1}X_t')[E(X_t X_t')]^{-1}E(X_t Y_{t+1})$$
If $X_t$ includes a constant term, then
$$\hat{P}[(aY_{t+1}+b)|X_t] = a\,\hat{P}(Y_{t+1}|X_t) + b$$
The forecast error
$$[aY_{t+1}+b] - [a\,\hat{P}(Y_{t+1}|X_t)+b] = a[Y_{t+1} - \hat{P}(Y_{t+1}|X_t)]$$
is uncorrelated with $X_t$, as required of a linear projection.

LP and OLS

The linear projection is closely related to OLS regression. Consider
$$y_{t+1} = \beta' X_t + u_t$$
$$\hat\beta = \left[\frac{1}{T}\sum_{t=1}^{T} X_t X_t'\right]^{-1}\left[\frac{1}{T}\sum_{t=1}^{T} X_t y_{t+1}\right]$$
$\hat\beta$ is constructed from the sample moments, while $\alpha$ is constructed from the population moments. If $\{X_t, Y_{t+1}\}$ is covariance stationary and ergodic for second moments, then the sample moments converge to the population moments as the sample size $T$ goes to infinity:
$$\frac{1}{T}\sum_{t=1}^{T} X_t X_t' \xrightarrow{p} E[X_t X_t']$$
$$\frac{1}{T}\sum_{t=1}^{T} X_t y_{t+1} \xrightarrow{p} E[X_t Y_{t+1}]$$

LP and OLS

This implies $\hat\beta \xrightarrow{p} \alpha$: the OLS estimator $\hat\beta$ is consistent for the linear projection coefficient.
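
As a quick numerical illustration of this consistency result (the simulation setup, function names and parameter values below are mine, not from the slides), one can simulate a stationary AR(1), run OLS of $Y_{t+1}$ on $X_t = (1, Y_t)'$, and compare $\hat\beta$ with the population projection coefficient $\alpha' = E[Y_{t+1}X_t']E[X_t X_t']^{-1} = (\mu(1-\phi),\ \phi)$:

```python
import numpy as np

# Illustrative sketch: the OLS estimate of the projection of Y_{t+1} on (1, Y_t)'
# approaches the population linear projection coefficient alpha as T grows.
rng = np.random.default_rng(0)
T, mu, phi, sigma = 100_000, 2.0, 0.6, 1.0

# Simulate a stationary AR(1): (Y_t - mu) = phi (Y_{t-1} - mu) + eps_t
y = np.empty(T)
y[0] = mu
for t in range(1, T):
    y[t] = mu + phi * (y[t - 1] - mu) + sigma * rng.standard_normal()

# OLS of Y_{t+1} on X_t = (1, Y_t)', built from the sample moments sum X_t X_t', sum X_t y_{t+1}
X = np.column_stack([np.ones(T - 1), y[:-1]])
beta_hat = np.linalg.solve(X.T @ X, X.T @ y[1:])

# Population projection coefficient for the AR(1): alpha = (mu (1 - phi), phi)'
print("OLS estimate :", beta_hat)
print("Population LP:", np.array([mu * (1 - phi), phi]))
```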

Forecast based on an infinite number of observations

Forecasting based on lagged $\epsilon$'s. Infinite MA representation:
$$(Y_t - \mu) = \psi(L)\epsilon_t \qquad \epsilon_t \sim WN(0,\sigma^2)$$
with $\psi_0 = 1$ and $\sum_{j=0}^{\infty}|\psi_j| < \infty$. Suppose we have an infinite number of observations on $\epsilon$ through date $t$, $\{\epsilon_t, \epsilon_{t-1}, \ldots\}$, and we know the values of $\mu$ and $\{\psi_1, \psi_2, \ldots\}$. Then
$$Y_{t+s} = \mu + \epsilon_{t+s} + \psi_1\epsilon_{t+s-1} + \psi_2\epsilon_{t+s-2} + \ldots + \psi_s\epsilon_t + \psi_{s+1}\epsilon_{t-1} + \ldots$$

Forecast based on an infinite number of observations

The optimal linear forecast is
$$\hat{E}[Y_{t+s}|\epsilon_t, \epsilon_{t-1}, \ldots] = \mu + \psi_s\epsilon_t + \psi_{s+1}\epsilon_{t-1} + \ldots$$
where $\hat{E}[Y_{t+s}|X_t]$ denotes the linear projection $\hat{P}(Y_{t+s}|1, X_t)$. The unknown future $\epsilon$'s are set to their expected value of zero. The forecast error is
$$Y_{t+s} - \hat{E}[Y_{t+s}|\epsilon_t, \epsilon_{t-1}, \ldots] = \epsilon_{t+s} + \psi_1\epsilon_{t+s-1} + \psi_2\epsilon_{t+s-2} + \ldots + \psi_{s-1}\epsilon_{t+1}$$

Forecast based on an infinite number of observations

The MSE of this forecast is
$$E\big[(Y_{t+s} - \hat{E}[Y_{t+s}|\epsilon_t,\epsilon_{t-1},\ldots])^2\big] = (1+\psi_1^2+\ldots+\psi_{s-1}^2)\sigma^2$$
When $s \to \infty$ the MSE converges to the unconditional variance $\sigma^2\sum_{j=0}^{\infty}\psi_j^2$.

MA(q): $\psi(L) = 1+\theta_1 L+\ldots+\theta_q L^q$, so
$$Y_{t+s} = \mu + \epsilon_{t+s} + \theta_1\epsilon_{t+s-1} + \ldots + \theta_q\epsilon_{t+s-q}$$
The optimal linear forecast is
$$\hat{E}[Y_{t+s}|\epsilon_t,\epsilon_{t-1},\ldots] = \begin{cases}\mu + \theta_s\epsilon_t + \theta_{s+1}\epsilon_{t-1} + \ldots + \theta_q\epsilon_{t-q+s} & s = 1,\ldots,q\\ \mu & s = q+1,\ldots\end{cases}$$

Forecast based on an infinite number of observations

The MSE is
$$\begin{cases}\sigma^2 & s = 1\\ (1+\theta_1^2+\ldots+\theta_{s-1}^2)\sigma^2 & s = 2,3,\ldots,q\\ (1+\theta_1^2+\ldots+\theta_q^2)\sigma^2 & s = q+1, q+2,\ldots\end{cases}$$
The MSE increases with the forecast horizon up until $s = q$. For $s > q$ the forecast is the unconditional mean and the MSE is the unconditional variance of the series.
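
A small sketch of these formulas (the function name, parameter values and example shocks are illustrative assumptions, not from the slides): given the MA coefficients, the most recent shocks and $\sigma^2$, it returns the $s$-step forecast and its MSE, switching to the unconditional mean and variance once $s > q$.

```python
import numpy as np

def ma_forecast_and_mse(mu, theta, eps_hist, s, sigma2):
    """Optimal s-step forecast of an MA(q) and its MSE, given eps_t, eps_{t-1}, ...
    theta = [theta_1, ..., theta_q]; eps_hist[0] = eps_t, eps_hist[1] = eps_{t-1}, ...
    (illustrative sketch of the formulas on this slide)."""
    q = len(theta)
    if s > q:
        # Beyond the MA order: forecast is the unconditional mean,
        # MSE is the unconditional variance (1 + theta_1^2 + ... + theta_q^2) sigma^2
        return mu, (1.0 + np.sum(np.square(theta))) * sigma2
    # For s <= q: mu + theta_s eps_t + theta_{s+1} eps_{t-1} + ... + theta_q eps_{t-q+s}
    coefs = np.asarray(theta[s - 1:])                # theta_s, ..., theta_q
    forecast = mu + coefs @ np.asarray(eps_hist[:q - s + 1])
    psi = np.r_[1.0, theta[:s - 1]]                  # psi_0, ..., psi_{s-1}
    return forecast, np.sum(np.square(psi)) * sigma2

# Example: MA(2) with theta = (0.4, 0.3) and most recent shocks eps_t = 0.5, eps_{t-1} = -0.2
for s in (1, 2, 3):
    print(s, ma_forecast_and_mse(0.0, [0.4, 0.3], [0.5, -0.2], s, 1.0))
```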

Forecast based on an infinite number of observations

Compact lag operator notation:
$$\frac{\psi(L)}{L^s} = L^{-s} + \psi_1 L^{1-s} + \psi_2 L^{2-s} + \ldots + \psi_{s-1}L^{-1} + \psi_s L^{0} + \psi_{s+1}L^{1} + \psi_{s+2}L^{2} + \ldots$$
The annihilation operator replaces negative powers of $L$ by zero:
$$\left[\frac{\psi(L)}{L^s}\right]_+ = \psi_s L^{0} + \psi_{s+1}L^{1} + \psi_{s+2}L^{2} + \ldots$$
so that
$$\hat{E}[Y_{t+s}|\epsilon_t,\epsilon_{t-1},\ldots] = \mu + \left[\frac{\psi(L)}{L^s}\right]_+ \epsilon_t$$
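
On a finite (truncated) vector of MA weights the annihilation operator simply drops the first $s$ coefficients; a tiny sketch (the helper name and the AR(1) example are mine):

```python
import numpy as np

def annihilate(psi, s):
    """[psi(L)/L^s]_+ : shift psi(L) back s periods and drop negative powers of L.
    psi = [psi_0, psi_1, ...]; returns [psi_s, psi_{s+1}, ...]
    (illustrative helper, truncated at the length of the supplied psi vector)."""
    return np.asarray(psi[s:])

# AR(1) example: psi_j = phi^j, so [psi(L)/L^s]_+ applied to eps_t gives
# phi^s eps_t + phi^{s+1} eps_{t-1} + ...
phi = 0.6
psi = phi ** np.arange(10)
print(annihilate(psi, 3))   # phi^3, phi^4, ...
```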

Forecast based on an infinite number of observations

Forecasting based on lagged $Y$'s. In the usual forecasting situation we have observations on lagged $Y$'s, not on lagged $\epsilon$'s. Suppose the infinite MA process has an infinite AR representation
$$\eta(L)(Y_t - \mu) = \epsilon_t$$
with $\eta(L) = \sum_{j=0}^{\infty}\eta_j L^j$, $\eta_0 = 1$, $\sum_{j=0}^{\infty}|\eta_j| < \infty$ and $\eta(L) = [\psi(L)]^{-1}$.

A covariance-stationary AR(p) satisfies
$$(1-\phi_1 L-\phi_2 L^2-\ldots-\phi_p L^p)(Y_t-\mu) = \epsilon_t$$
$$\phi(L)(Y_t-\mu) = \epsilon_t$$
with $\eta(L) = \phi(L)$ and $\psi(L) = [\phi(L)]^{-1}$.

Forecast based on an infinite number of observations

For an MA(q):
$$Y_t - \mu = (1+\theta_1 L+\ldots+\theta_q L^q)\epsilon_t = \theta(L)\epsilon_t$$
so $\psi(L) = \theta(L)$ and $\eta(L) = [\theta(L)]^{-1}$, provided that this is based on an invertible representation.

Forecast based on an infinite number of observations

An ARMA(p,q) process can be represented as an AR($\infty$), with
$$\psi(L) = \frac{\theta(L)}{\phi(L)}$$
provided that the roots of $\phi(z) = 0$ and $\theta(z) = 0$ lie outside the unit circle. When these restrictions are satisfied, observations on $\{Y_t, Y_{t-1}, Y_{t-2}, \ldots\}$ are sufficient to construct $\{\epsilon_t, \epsilon_{t-1}, \ldots\}$.

Forecast based on an infinite number of observations

For example, for an AR(1), $(1-\phi L)(Y_t-\mu) = \epsilon_t$: given $\phi$, $\mu$ and $Y_t, Y_{t-1}$, the value of $\epsilon_t$ can be constructed from
$$\epsilon_t = (Y_t-\mu) - \phi(Y_{t-1}-\mu)$$
For an invertible MA(1), $(1+\theta L)^{-1}(Y_t-\mu) = \epsilon_t$: given an infinite number of observations on $Y$, we can compute
$$\epsilon_t = (Y_t-\mu) - \theta(Y_{t-1}-\mu) + \theta^2(Y_{t-2}-\mu) - \theta^3(Y_{t-3}-\mu) + \ldots$$
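
A sketch of this construction (the function names and numerical values are illustrative): recovering the shocks from observed $Y$'s given the parameters, exactly for the AR(1) and by truncating the infinite sum at the start of the sample for the MA(1).

```python
import numpy as np

def eps_from_ar1(y, mu, phi):
    """eps_t = (Y_t - mu) - phi (Y_{t-1} - mu), for t = 1, ..., T-1 (sketch)."""
    d = np.asarray(y) - mu
    return d[1:] - phi * d[:-1]

def eps_from_ma1(y, mu, theta):
    """eps_t = (Y_t - mu) - theta (Y_{t-1} - mu) + theta^2 (Y_{t-2} - mu) - ...
    truncated at the start of the sample (sketch; exact only with an infinite past)."""
    d = np.asarray(y) - mu
    eps = np.empty_like(d)
    for t in range(len(d)):
        weights = (-theta) ** np.arange(t + 1)   # 1, -theta, theta^2, ...
        eps[t] = weights @ d[t::-1]              # (Y_t - mu), (Y_{t-1} - mu), ..., (Y_0 - mu)
    return eps

y = np.array([1.0, 1.5, 0.8, 1.2])
print(eps_from_ar1(y, mu=1.0, phi=0.6))
print(eps_from_ma1(y, mu=1.0, theta=0.4))
```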

Forecast based on an infinite number of observations

Under these conditions,
$$\hat{E}[Y_{t+s}|Y_t, Y_{t-1}, \ldots] = \mu + \left[\frac{\psi(L)}{L^s}\right]_+ \eta(L)(Y_t-\mu)$$
gives the forecast of $Y_{t+s}$ as a function of lagged $Y$'s. Using $\eta(L) = [\psi(L)]^{-1}$,
$$\hat{E}[Y_{t+s}|Y_t, Y_{t-1}, \ldots] = \mu + \left[\frac{\psi(L)}{L^s}\right]_+ [\psi(L)]^{-1}(Y_t-\mu)$$
This is the Wiener-Kolmogorov prediction formula.

Wiener-Kolmogorov prediction formula - AR(1)

For an AR(1), $(1-\phi L)(Y_t-\mu) = \epsilon_t$:
$$\psi(L) = \frac{1}{1-\phi L} = 1+\phi L+\phi^2 L^2+\ldots+\phi^s L^s+\ldots$$
Applying the annihilation operator:
$$\left[\frac{\psi(L)}{L^s}\right]_+ = \phi^s + \phi^{s+1}L + \phi^{s+2}L^2 + \ldots = \frac{\phi^s}{1-\phi L}$$
so, with $\epsilon_t = (1-\phi L)(Y_t-\mu)$,
$$\hat{E}[Y_{t+s}|Y_t,Y_{t-1},\ldots] = \mu + \left[\frac{\psi(L)}{L^s}\right]_+ \eta(L)(Y_t-\mu) = \mu + \frac{\phi^s}{1-\phi L}(1-\phi L)(Y_t-\mu)$$
$$\hat{E}[Y_{t+s}|Y_t,Y_{t-1},\ldots] = \mu + \phi^s(Y_t-\mu)$$
The forecast decays geometrically from $(Y_t-\mu)$ toward $\mu$ as $s$ increases.

Wiener-Kolmogorov prediction formula - AR(1)

Given that $\psi_j = \phi^j$, from the MSE of an MA($\infty$) forecast, the MSE of the s-period-ahead forecast error is
$$[1+\phi^2+\ldots+\phi^{2(s-1)}]\sigma^2$$
As $s \to \infty$,
$$MSE \to \frac{\sigma^2}{1-\phi^2}$$
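
A minimal sketch of the AR(1) forecast and its MSE (function and variable names are mine, not from the slides): as $s$ grows, the forecast reverts to $\mu$ and the MSE approaches $\sigma^2/(1-\phi^2)$.

```python
import numpy as np

def ar1_forecast(y_t, mu, phi, sigma2, s):
    """s-step-ahead forecast and MSE for a stationary AR(1) (sketch):
    Y_hat_{t+s|t} = mu + phi^s (Y_t - mu),  MSE = (1 + phi^2 + ... + phi^{2(s-1)}) sigma^2."""
    forecast = mu + phi ** s * (y_t - mu)
    mse = sigma2 * np.sum(phi ** (2 * np.arange(s)))
    return forecast, mse

mu, phi, sigma2 = 2.0, 0.6, 1.0
for s in (1, 2, 5, 20):
    print(s, ar1_forecast(y_t=3.0, mu=mu, phi=phi, sigma2=sigma2, s=s))
# As s grows the forecast approaches mu and the MSE approaches sigma2 / (1 - phi^2).
print("limit MSE:", sigma2 / (1 - phi ** 2))
```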

Wiener-Kolmogorov prediction formula - AR(p)

For a stationary AR(p) process,
$$Y_{t+s}-\mu = f_{11}^{(s)}(Y_t-\mu) + f_{12}^{(s)}(Y_{t-1}-\mu) + \ldots + f_{1p}^{(s)}(Y_{t-p+1}-\mu) + \epsilon_{t+s} + \psi_1\epsilon_{t+s-1} + \ldots + \psi_{s-1}\epsilon_{t+1}$$
with $\psi_j = f_{11}^{(j)}$ (the $f_{1j}^{(s)}$ are the first-row elements of $F^s$, where $F$ is the companion matrix of the AR(p)). The optimal s-period-ahead forecast is
$$\hat{Y}_{t+s|t} = \mu + f_{11}^{(s)}(Y_t-\mu) + \ldots + f_{1p}^{(s)}(Y_{t-p+1}-\mu)$$
and the forecast error is
$$Y_{t+s} - \hat{Y}_{t+s|t} = \epsilon_{t+s} + \psi_1\epsilon_{t+s-1} + \ldots + \psi_{s-1}\epsilon_{t+1}$$

Wiener-Kolmogorov prediction formula - AR(p)

To calculate the optimal forecast we use a recursion. Start with the one-period-ahead forecasts $\hat{Y}_{t+1|t}$ and $\hat{Y}_{t+2|t+1}$:
$$\hat{Y}_{t+1|t}-\mu = \phi_1(Y_t-\mu)+\ldots+\phi_p(Y_{t-p+1}-\mu)$$
$$\hat{Y}_{t+2|t+1}-\mu = \phi_1(Y_{t+1}-\mu)+\ldots+\phi_p(Y_{t-p+2}-\mu)$$
Law of Iterated Projections: projecting $\hat{Y}_{t+2|t+1}$ on the date-$t$ information set yields $\hat{Y}_{t+2|t}$:
$$\hat{Y}_{t+2|t}-\mu = \phi_1(\hat{Y}_{t+1|t}-\mu)+\phi_2(Y_t-\mu)+\ldots+\phi_p(Y_{t-p+2}-\mu)$$
Substituting for $\hat{Y}_{t+1|t}$:
$$\hat{Y}_{t+2|t}-\mu = \phi_1[\phi_1(Y_t-\mu)+\ldots+\phi_p(Y_{t-p+1}-\mu)]+\phi_2(Y_t-\mu)+\ldots+\phi_p(Y_{t-p+2}-\mu)$$

Wiener-Kolmogorov prediction formula - AR(p)

Collecting terms,
$$\hat{Y}_{t+2|t}-\mu = (\phi_1^2+\phi_2)(Y_t-\mu)+(\phi_1\phi_2+\phi_3)(Y_{t-1}-\mu)+\ldots+(\phi_1\phi_{p-1}+\phi_p)(Y_{t-p+2}-\mu)+\phi_1\phi_p(Y_{t-p+1}-\mu)$$
In general, the s-period-ahead forecast of an AR(p) process can be obtained by iterating on
$$\hat{Y}_{t+j|t}-\mu = \phi_1(\hat{Y}_{t+j-1|t}-\mu)+\ldots+\phi_p(\hat{Y}_{t+j-p|t}-\mu)$$
for $j = 1, 2, \ldots, s$, with $\hat{Y}_{\tau|t} = Y_\tau$ for $\tau \leq t$.
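
A sketch of this recursion (the function name and the AR(2) example values are illustrative assumptions): future values of $Y$ are replaced by their forecasts, while observed values are used for dates $\tau \leq t$.

```python
import numpy as np

def arp_forecasts(y, mu, phi, horizon):
    """Multi-step forecasts of an AR(p) by iterating
    Y_hat_{t+j|t} - mu = phi_1 (Y_hat_{t+j-1|t} - mu) + ... + phi_p (Y_hat_{t+j-p|t} - mu),
    with Y_hat_{tau|t} = Y_tau for tau <= t (illustrative sketch).
    y: observed series up to date t; phi = [phi_1, ..., phi_p]."""
    phi = np.asarray(phi)
    p = len(phi)
    dev = list(np.asarray(y[-p:]) - mu)            # (Y_{t-p+1} - mu), ..., (Y_t - mu)
    forecasts = []
    for _ in range(horizon):
        next_dev = phi @ np.array(dev[::-1][:p])   # weight the p most recent deviations
        forecasts.append(mu + next_dev)
        dev.append(next_dev)                       # forecast replaces the future value
    return np.array(forecasts)

# AR(2) example: phi = (0.5, 0.3), last two observations 2.4 and 3.1 (Y_{t-1}, Y_t)
print(arp_forecasts([2.4, 3.1], mu=2.0, phi=[0.5, 0.3], horizon=4))
```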

Wiener-Kolmogorov prediction formula - MA(1)

Invertible MA(1): $(Y_t-\mu) = (1+\theta L)\epsilon_t$ with $|\theta| < 1$. The Wiener-Kolmogorov formula gives
$$\hat{Y}_{t+s|t} = \mu + \left[\frac{1+\theta L}{L^s}\right]_+ (1+\theta L)^{-1}(Y_t-\mu)$$
For $s = 1$:
$$\left[\frac{1+\theta L}{L}\right]_+ = \theta$$
$$\hat{Y}_{t+1|t} = \mu + \frac{\theta}{1+\theta L}(Y_t-\mu) = \mu + \theta(Y_t-\mu) - \theta^2(Y_{t-1}-\mu) + \theta^3(Y_{t-2}-\mu) - \ldots$$

Wiener-Kolmogorov prediction formula - MA(1)

Alternatively, defining $\hat\epsilon_t = (1+\theta L)^{-1}(Y_t-\mu)$, in practice the recursion $\hat\epsilon_t = (Y_t-\mu) - \theta\hat\epsilon_{t-1}$ is used. For $s = 2, 3, \ldots$:
$$\left[\frac{1+\theta L}{L^s}\right]_+ = 0 \qquad\Rightarrow\qquad \hat{Y}_{t+s|t} = \mu$$

Wiener-Kolmogorov prediction formula - MA(q)

$$(Y_t-\mu) = \theta(L)\epsilon_t \qquad \theta(L) = 1+\theta_1 L+\theta_2 L^2+\ldots+\theta_q L^q$$
$$\hat{Y}_{t+s|t} = \mu + \left[\frac{1+\theta_1 L+\ldots+\theta_q L^q}{L^s}\right]_+ \frac{1}{\theta(L)}(Y_t-\mu)$$
Since
$$\left[\frac{1+\theta_1 L+\ldots+\theta_q L^q}{L^s}\right]_+ = \begin{cases}\theta_s+\theta_{s+1}L+\theta_{s+2}L^2+\ldots+\theta_q L^{q-s} & s = 1,\ldots,q\\ 0 & s = q+1,\ldots\end{cases}$$
the forecast is
$$\hat{Y}_{t+s|t} = \begin{cases}\mu + (\theta_s+\theta_{s+1}L+\ldots+\theta_q L^{q-s})\hat\epsilon_t & s = 1,\ldots,q\\ \mu & s = q+1,\ldots\end{cases}$$
where $\hat\epsilon_t = (Y_t-\mu) - \theta_1\hat\epsilon_{t-1} - \ldots - \theta_q\hat\epsilon_{t-q}$.

Wiener-Kolmogorov prediction formula - ARMA(1,1)

$$(1-\phi L)(Y_t-\mu) = (1+\theta L)\epsilon_t$$
Stationarity: $|\phi| < 1$. Invertibility: $|\theta| < 1$.
$$\hat{Y}_{t+s|t} = \mu + \left[\frac{1+\theta L}{(1-\phi L)L^s}\right]_+ \frac{1-\phi L}{1+\theta L}(Y_t-\mu)$$
Using $\frac{1}{1-\phi L} = 1+\phi L+\phi^2 L^2+\ldots$,
$$\left[\frac{1+\theta L}{(1-\phi L)L^s}\right]_+ = \left[\frac{1+\phi L+\phi^2 L^2+\ldots}{L^s}\right]_+ + \left[\frac{\theta L(1+\phi L+\phi^2 L^2+\ldots)}{L^s}\right]_+$$
$$= (\phi^s+\phi^{s+1}L+\phi^{s+2}L^2+\ldots) + \theta(\phi^{s-1}+\phi^s L+\phi^{s+1}L^2+\ldots)$$

Wiener-Kolmogorov prediction formula - ARMA(1,1)

$$\left[\frac{1+\theta L}{(1-\phi L)L^s}\right]_+ = \phi^s(1+\phi L+\phi^2 L^2+\ldots) + \theta\phi^{s-1}(1+\phi L+\phi^2 L^2+\ldots) = (\phi^s+\theta\phi^{s-1})(1+\phi L+\phi^2 L^2+\ldots) = \frac{\phi^s+\theta\phi^{s-1}}{1-\phi L}$$

Wiener-Kolmogorov prediction formula - ARMA(1,1)

$$\hat{Y}_{t+s|t} = \mu + \left[\frac{1+\theta L}{(1-\phi L)L^s}\right]_+ \frac{1-\phi L}{1+\theta L}(Y_t-\mu) = \mu + \frac{\phi^s+\theta\phi^{s-1}}{1-\phi L}\,\frac{1-\phi L}{1+\theta L}(Y_t-\mu) = \mu + \frac{\phi^s+\theta\phi^{s-1}}{1+\theta L}(Y_t-\mu)$$
For $s = 2, 3, \ldots$ the forecasts satisfy
$$\hat{Y}_{t+s|t}-\mu = \phi(\hat{Y}_{t+s-1|t}-\mu)$$
so the forecast decays geometrically at the rate $\phi$ toward the unconditional mean $\mu$. The one-period-ahead forecast ($s = 1$) is
$$\hat{Y}_{t+1|t} = \mu + \frac{\phi+\theta}{1+\theta L}(Y_t-\mu)$$

Wiener-Kolmogorov prediction formula - ARMA(1,1)

$$\hat{Y}_{t+1|t} = \mu + \frac{\phi(1+\theta L)+\theta(1-\phi L)}{1+\theta L}(Y_t-\mu) = \mu + \phi(Y_t-\mu) + \theta\,\frac{1-\phi L}{1+\theta L}(Y_t-\mu)$$
where
$$\hat\epsilon_t = \frac{1-\phi L}{1+\theta L}(Y_t-\mu) = (Y_t-\mu) - \phi(Y_{t-1}-\mu) - \theta\hat\epsilon_{t-1}$$
$$\hat\epsilon_t = Y_t - \hat{Y}_{t|t-1}$$

Wiener-Kolmogorov prediction formula - ARMA(1,1)

For $s = 2$:
$$\hat{Y}_{t+2|t} = \mu + \frac{\phi^2+\theta\phi}{1+\theta L}(Y_t-\mu) = \mu + \phi\,\frac{\phi+\theta}{1+\theta L}(Y_t-\mu) = \mu + \phi(\phi+\theta)(1-\theta L+\theta^2 L^2-\theta^3 L^3+\ldots)(Y_t-\mu)$$
$$= \mu + \phi(\phi+\theta)(Y_t-\mu) - \phi(\phi+\theta)\theta(Y_{t-1}-\mu)+\ldots$$

Wiener-Kolmogorov prediction formula - ARMA(p,q)

ARMA(p,q): $\phi(L)(Y_t-\mu) = \theta(L)\epsilon_t$. The one-period-ahead forecast is
$$\hat{Y}_{t+1|t}-\mu = \phi_1(Y_t-\mu)+\ldots+\phi_p(Y_{t-p+1}-\mu)+\theta_1\hat\epsilon_t+\ldots+\theta_q\hat\epsilon_{t-q+1}$$
with $\hat\epsilon_t = Y_t - \hat{Y}_{t|t-1}$ and $\hat{Y}_{\tau|t} = Y_\tau$ for $\tau \leq t$. For longer horizons,
$$\hat{Y}_{t+s|t}-\mu = \phi_1(\hat{Y}_{t+s-1|t}-\mu)+\ldots+\phi_p(\hat{Y}_{t+s-p|t}-\mu)+\theta_s\hat\epsilon_t+\ldots+\theta_q\hat\epsilon_{t+s-q} \qquad s = 1,\ldots,q$$
$$\hat{Y}_{t+s|t}-\mu = \phi_1(\hat{Y}_{t+s-1|t}-\mu)+\ldots+\phi_p(\hat{Y}_{t+s-p|t}-\mu) \qquad s = q+1,\ldots$$
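
A sketch of this ARMA(p,q) recursion (all names and example values are mine; the zero start-up values for the shocks anticipate the finite-sample approximation discussed in the next section): the shocks are filtered as $\hat\epsilon_t = Y_t - \hat{Y}_{t|t-1}$ and future shocks are set to zero.

```python
import numpy as np

def arma_forecasts(y, mu, phi, theta, horizon):
    """Multi-step forecasts of an ARMA(p,q) (illustrative sketch).
    phi = [phi_1,...,phi_p], theta = [theta_1,...,theta_q].
    Shocks are filtered recursively as eps_hat_t = Y_t - Y_hat_{t|t-1},
    with presample values set to zero (an approximation, see the next section)."""
    phi, theta = np.asarray(phi, float), np.asarray(theta, float)
    p, q = len(phi), len(theta)
    d = np.asarray(y, float) - mu
    eps = np.zeros(len(d))
    for t in range(len(d)):
        ar = sum(phi[i] * d[t - 1 - i] for i in range(p) if t - 1 - i >= 0)
        ma = sum(theta[j] * eps[t - 1 - j] for j in range(q) if t - 1 - j >= 0)
        eps[t] = d[t] - ar - ma                    # one-step-ahead forecast error
    # Iterate the forecast recursion: future shocks are zero, future Y's are forecasts
    dev, shocks = list(d), list(eps)
    out = []
    for _ in range(horizon):
        ar = sum(phi[i] * dev[-1 - i] for i in range(p))
        ma = sum(theta[j] * shocks[-1 - j] for j in range(q))
        out.append(mu + ar + ma)
        dev.append(ar + ma)
        shocks.append(0.0)
    return np.array(out)

# ARMA(1,1) example: after the first step the forecasts decay at rate phi toward mu
print(arma_forecasts([1.8, 2.6, 2.1, 2.9], mu=2.0, phi=[0.6], theta=[0.4], horizon=4))
```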

Forecasts based on a finite number of observations

Suppose only the observations $\{Y_t, Y_{t-1}, \ldots, Y_{t-m+1}\}$ are available. A first approach is to set the presample $\epsilon$'s equal to 0 and use the approximation
$$\hat{E}[Y_{t+s}|Y_t, Y_{t-1}, \ldots] \approx \hat{E}[Y_{t+s}|Y_t, \ldots, Y_{t-m+1}, \epsilon_{t-m} = 0, \epsilon_{t-m-1} = 0, \ldots]$$
For an MA(q): set $\epsilon_{t-m} = \epsilon_{t-m-1} = \ldots = \epsilon_{t-m-q+1} = 0$ and build the $\hat\epsilon$'s recursively:
$$\hat\epsilon_{t-m+1} = Y_{t-m+1}-\mu$$
$$\hat\epsilon_{t-m+2} = Y_{t-m+2}-\mu - \theta_1\hat\epsilon_{t-m+1}$$
$$\hat\epsilon_{t-m+3} = Y_{t-m+3}-\mu - \theta_1\hat\epsilon_{t-m+2} - \theta_2\hat\epsilon_{t-m+1}$$
and so on. These values are then substituted into
$$\hat{Y}_{t+s|t} = \mu + (\theta_s+\theta_{s+1}L+\theta_{s+2}L^2+\ldots+\theta_q L^{q-s})\hat\epsilon_t$$

Forecasts based on a finite number of observations

For $s = q = 1$ this yields
$$\hat{Y}_{t+1|t} = \mu + \theta(Y_t-\mu) - \theta^2(Y_{t-1}-\mu) + \ldots + (-1)^{m-1}\theta^m(Y_{t-m+1}-\mu)$$
i.e. a truncated version of the infinite AR representation. For $m$ large and $|\theta|$ small this is a good approximation; for $\theta$ close to 1 the approximation may be poorer.
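
A sketch of this finite-sample approximation for the MA(q) case (names and example values are illustrative): set the presample shocks to zero, build the $\hat\epsilon$'s recursively, and plug them into the MA(q) forecast formula; for an MA(1) with $s = 1$ this reproduces the truncated AR forecast above.

```python
import numpy as np

def ma_forecast_finite_sample(y, mu, theta, s):
    """Approximate s-step MA(q) forecast from a finite sample (illustrative sketch).
    Presample shocks are set to zero and eps_hat is built recursively:
    eps_hat_t = (Y_t - mu) - theta_1 eps_hat_{t-1} - ... - theta_q eps_hat_{t-q}."""
    theta = np.asarray(theta, float)
    q = len(theta)
    d = np.asarray(y, float) - mu
    eps = np.zeros(len(d) + q)                   # q leading zeros = presample shocks
    for t in range(len(d)):
        past = eps[t:t + q][::-1]                # eps_hat_{t-1}, ..., eps_hat_{t-q}
        eps[t + q] = d[t] - theta @ past
    if s > q:
        return mu                                # beyond the MA order: unconditional mean
    recent = eps[::-1][:q - s + 1]               # eps_hat_t, eps_hat_{t-1}, ...
    return mu + theta[s - 1:] @ recent           # mu + theta_s eps_hat_t + ...

# MA(1), theta = 0.4: the one-step forecast equals the truncated AR expansion above
print(ma_forecast_finite_sample([1.2, 0.7, 1.5, 0.9], mu=1.0, theta=[0.4], s=1))
```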

Exact Finite-sample Properties

Exact projection of $Y_{t+1}$ on a constant and its $m$ most recent values:
$$X_t = \begin{bmatrix}1\\ Y_t\\ \vdots\\ Y_{t-m+1}\end{bmatrix}$$
Linear forecast:
$$\alpha^{(m)\prime} X_t = \alpha_0^{(m)} + \alpha_1^{(m)}Y_t + \ldots + \alpha_m^{(m)}Y_{t-m+1}$$
If $Y_t$ is covariance stationary, $E[Y_t Y_{t-j}] = \gamma_j + \mu^2$.

Exact Finite-sample Properties

This implies
$$\alpha^{(m)\prime} = \begin{bmatrix}\mu & (\gamma_1+\mu^2) & \cdots & (\gamma_m+\mu^2)\end{bmatrix}
\begin{bmatrix}1 & \mu & \cdots & \mu\\ \mu & (\gamma_0+\mu^2) & \cdots & (\gamma_{m-1}+\mu^2)\\ \vdots & \vdots & & \vdots\\ \mu & (\gamma_{m-1}+\mu^2) & \cdots & (\gamma_0+\mu^2)\end{bmatrix}^{-1}$$
When a constant term is included in $X_t$, it is more convenient to express the variables in deviations from the mean.

Exact Finite-sample Properties

Calculate the projection of $(Y_{t+1}-\mu)$ on $(Y_t-\mu), (Y_{t-1}-\mu), \ldots, (Y_{t-m+1}-\mu)$:
$$\alpha^{(m)} = \begin{bmatrix}\gamma_0 & \gamma_1 & \cdots & \gamma_{m-1}\\ \gamma_1 & \gamma_0 & \cdots & \gamma_{m-2}\\ \vdots & & & \vdots\\ \gamma_{m-1} & \gamma_{m-2} & \cdots & \gamma_0\end{bmatrix}^{-1}\begin{bmatrix}\gamma_1\\ \gamma_2\\ \vdots\\ \gamma_m\end{bmatrix}$$
The s-period-ahead forecast is
$$\hat{Y}_{t+s|t} = \mu + \alpha_1^{(m,s)}(Y_t-\mu)+\ldots+\alpha_m^{(m,s)}(Y_{t-m+1}-\mu)$$
with
$$\begin{bmatrix}\alpha_1^{(m,s)}\\ \vdots\\ \alpha_m^{(m,s)}\end{bmatrix} = \begin{bmatrix}\gamma_0 & \gamma_1 & \cdots & \gamma_{m-1}\\ \vdots & & & \vdots\\ \gamma_{m-1} & \gamma_{m-2} & \cdots & \gamma_0\end{bmatrix}^{-1}\begin{bmatrix}\gamma_s\\ \vdots\\ \gamma_{s+m-1}\end{bmatrix}$$
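
A sketch of the exact finite-sample calculation (the function name and the AR(1) check are illustrative assumptions): build the $(m \times m)$ autocovariance matrix, solve the linear system for $\alpha^{(m,s)}$, and form the forecast; for an AR(1) the result coincides with $\mu + \phi^s(Y_t-\mu)$ whatever $m$ is used.

```python
import numpy as np

def exact_finite_sample_forecast(y_recent, mu, gamma, s):
    """Exact s-step forecast based on the m most recent observations (sketch).
    y_recent = [Y_t, Y_{t-1}, ..., Y_{t-m+1}]; gamma = [gamma_0, ..., gamma_{s+m-1}]
    are the process autocovariances. Solves the (m x m) Toeplitz system for alpha^{(m,s)}."""
    m = len(y_recent)
    gamma = np.asarray(gamma, float)
    Gamma = gamma[np.abs(np.subtract.outer(np.arange(m), np.arange(m)))]  # Toeplitz matrix
    alpha = np.linalg.solve(Gamma, gamma[s:s + m])                        # alpha^{(m,s)}
    return mu + alpha @ (np.asarray(y_recent, float) - mu)

# AR(1) check: gamma_j = sigma^2 phi^j / (1 - phi^2); the exact finite-sample forecast
# reproduces mu + phi^s (Y_t - mu) regardless of m.
phi, sigma2, mu = 0.6, 1.0, 2.0
gamma = sigma2 * phi ** np.arange(10) / (1 - phi ** 2)
print(exact_finite_sample_forecast([3.0, 2.5, 1.8], mu, gamma, s=2))
print(mu + phi ** 2 * (3.0 - mu))
```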

Exact Finite-sample Properties

This requires the inversion of an $(m \times m)$ matrix. Two algorithms are available:
1. the Kalman filter, to compute the exact finite-sample forecast;
2. triangular factorization.