ARMA Estimation Recipes


Econ. 241B, D. McFadden, Fall.

1. Preliminaries

These notes summarize procedures for estimating the lag coefficients in the stationary ARMA(p,q) model

(1) $y_t = \mu + a_1(y_{t-1}-\mu) + \cdots + a_p(y_{t-p}-\mu) + \varepsilon_t + b_1\varepsilon_{t-1} + \cdots + b_q\varepsilon_{t-q}$,

where $y_t$ is observed for $t = 1,\dots,T$, and the $\varepsilon_t$ are unobserved i.i.d. disturbances with mean zero and finite variance $\sigma^2$. The mean $\mu$, the lag coefficients $a_1,\dots,a_p$ and $b_1,\dots,b_q$, and $\sigma^2$ are the parameters of the model. By assumption, $a_p \neq 0$ and $b_q \neq 0$. Define the polynomials $A(z) = a_1 z + \cdots + a_p z^p$ and $B(z) = b_1 z + \cdots + b_q z^q$, and the lag operator $L$ that for any series $x_t$ is defined by $Lx_t = x_{t-1}$. Then the model can be written

(2) $(I - A(L))(y_t - \mu) = (I + B(L))\varepsilon_t$.

The model is stationary if and only if the polynomial $1 - A(z)$ is stable; i.e., all its roots lie outside the unit circle. If the model is stationary, then the lag polynomial $I - A(L)$ is invertible, and there is an MA($\infty$) representation of the model, written formally as

(3) $y_t - \mu = \dfrac{I + B(L)}{I - A(L)}\,\varepsilon_t$.

Let $z_1,\dots,z_p$ denote the roots of $1 - A(z)$, some of which may be repeated. Then this polynomial can be written $1 - A(z) = (1 - z/z_1)\cdots(1 - z/z_p)$, so that $(I - A(L))^{-1} = \prod_{k=1}^{p}\sum_{s=0}^{\infty} z_k^{-s}L^s$. Alternately, write $(I - A(L))^{-1}(I + B(L)) = \sum_{s=0}^{\infty}\psi_s L^s \equiv \Psi(L)$. The identity

(4) $I + B(L) = (I - A(L))\sum_{s=0}^{\infty}\psi_s L^s = \psi_0 I + (\psi_1 - a_1\psi_0)L + \cdots + \Bigl(\psi_s - \sum_{i=1}^{\min(p,s)} a_i\psi_{s-i}\Bigr)L^s + \cdots$

implies $\psi_0 = 1$, $\psi_1 = a_1 + b_1$, and $\psi_s = \sum_{i=1}^{\min(p,s)} a_i\psi_{s-i} + b_s$, where $b_s = 0$ for $s > q$.
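As a computational aside (not part of the original notes), the recursion for the $\psi$ coefficients is straightforward to implement. A minimal sketch in Python/NumPy follows; the function name `psi_weights` and the truncation length `n` are illustrative choices:

```python
import numpy as np

def psi_weights(a, b, n=50):
    """MA(infinity) weights psi_0, ..., psi_{n-1} for an ARMA(p,q) model,
    from the recursion psi_s = sum_{i<=min(p,s)} a_i psi_{s-i} + b_s."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    psi = np.zeros(n)
    psi[0] = 1.0
    for s in range(1, n):
        psi[s] = sum(a[i] * psi[s - 1 - i] for i in range(min(len(a), s)))
        if s <= len(b):
            psi[s] += b[s - 1]           # b_s; zero for s > q
    return psi

# Example: ARMA(1,1) with a1 = 0.5, b1 = 0.3, so psi_1 = a1 + b1 = 0.8.
print(psi_weights([0.5], [0.3], n=5))    # [1.  0.8  0.4  0.2  0.1]
```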

Another derivation of these conditions starts by noting that the covariance of $y_t$ and $\varepsilon_{t-m}$ is $\sigma^2\psi_m$ for $m \geq 0$, and zero for $m < 0$. Multiplying (1) by $\varepsilon_{t-m}$ and taking expectations then gives

(5) $\psi_m = a_1\psi_{m-1} + \cdots + a_p\psi_{m-p} + b_m$,

with $\psi_{m-s} = 0$ for $s > m$, $b_0 = 1$, and we define $b_m = 0$ for $m > q$. These are called the Yule-Walker equations. They can be used recursively to obtain the coefficients $\psi_s$ in the MA($\infty$) representation. An implication of the MA($\infty$) representation is

(6) $\gamma_0 \equiv \mathrm{Var}(y_t) = \sigma^2\sum_{s=0}^{\infty}\psi_s^2$ and $\gamma_m \equiv \mathrm{cov}(y_t,y_{t-m}) = \sigma^2\sum_{s=0}^{\infty}\psi_s\psi_{s+m}$ for $m > 0$.

Since $\mathrm{cov}(y_{t+m},y_t) = \mathrm{cov}(y_t,y_{t-m})$ for $m > 0$, one has $\gamma_{-m} = \gamma_m$.

It is sometimes convenient to summarize the autocovariances of a stationary series $x_t$ in terms of an autocovariance generating function (ACGF)

(7) $g_x(z) = \sum_{s=-\infty}^{\infty}\mathrm{cov}(x_t,x_{t-s})\,z^s$.

The ACGF has a useful convolution property: if a linear transformation $C(L) = \sum_{s=-\infty}^{\infty}c_s L^s$ is applied to a stationary series $x_t$, then $y_t = C(L)x_t$ has $g_y(z) = C(z)C(1/z)g_x(z)$. To verify this, note that $\mathrm{cov}(y_t,y_{t-m}) = \sum_{i=-\infty}^{\infty}\sum_{j=-\infty}^{\infty}c_i c_j\,\mathrm{cov}(x_{t-i},x_{t-m-j})$, and $\mathrm{cov}(x_{t-i},x_{t-m-j}) = \mathrm{cov}(x_t,x_{t-m+i-j})$. Then

$g_y(z) \equiv \sum_{m=-\infty}^{\infty}z^m\,\mathrm{cov}(y_t,y_{t-m}) = \sum_{m=-\infty}^{\infty}\sum_{i=-\infty}^{\infty}\sum_{j=-\infty}^{\infty}c_i c_j\,\mathrm{cov}(x_{t-i},x_{t-m-j})\,z^m = \sum_{i=-\infty}^{\infty}c_i z^i\sum_{j=-\infty}^{\infty}c_j z^{-j}\,g_x(z) = C(z)C(1/z)g_x(z)$,

with the third equality holding since summing $\mathrm{cov}(x_t,x_{t-m+i-j})\,z^{m-i+j}$ over $m$ for each fixed $i$ and $j$ gives $g_x(z)$.

Let $\eta_t = \varepsilon_t + b_1\varepsilon_{t-1} + \cdots + b_q\varepsilon_{t-q}$. Then $E\eta_t^2 = \sigma^2(1 + b_1^2 + \cdots + b_q^2)$, $E\eta_t\eta_{t-m} = \sigma^2(b_m + b_{m+1}b_1 + \cdots + b_q b_{q-m})$ for $1 \leq m \leq q$ and zero for $m > q$, and $E\eta_t y_{t-m} = \sigma^2(b_m\psi_0 + b_{m+1}\psi_1 + \cdots + b_q\psi_{q-m})$ for $0 \leq m \leq q$, zero for $m > q$. The i.i.d. series $\varepsilon_t$ has the constant ACGF $g_\varepsilon(z) = \sigma^2$, since all its autocovariances are zero. Then, applying the ACGF convolution formula to $\eta_t = (I + B(L))\varepsilon_t$ yields

(8) $g_\eta(z) = \sigma^2(1+B(z))(1+B(1/z)) = \sigma^2\sum_{s=-q}^{q}\bigl(b_{|s|} + b_{|s|+1}b_1 + \cdots + b_q b_{q-|s|}\bigr)z^s$.

For example, $q = 1$ yields $g_\eta(z) = \sigma^2\bigl((1+b_1^2) + b_1 z + b_1 z^{-1}\bigr)$, and $q = 2$ yields $g_\eta(z) = \sigma^2\bigl((1+b_1^2+b_2^2) + b_1(1+b_2)z + b_1(1+b_2)z^{-1} + b_2 z^2 + b_2 z^{-2}\bigr)$.
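The autocovariances in (6) can be approximated by truncating the MA($\infty$) sums. A minimal sketch (not in the original notes; the function name `arma_acov` and the truncation lengths are my own choices, and the truncation is accurate when the roots of $1-A(z)$ are well outside the unit circle):

```python
import numpy as np

def arma_acov(a, b, sigma2=1.0, m_max=10, n_psi=200):
    """Autocovariances gamma_0, ..., gamma_{m_max} via (6),
    gamma_m = sigma^2 sum_s psi_s psi_{s+m}, truncating psi at n_psi terms."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    psi = np.zeros(n_psi)
    psi[0] = 1.0
    for s in range(1, n_psi):            # Yule-Walker recursion (5)
        psi[s] = sum(a[i] * psi[s - 1 - i] for i in range(min(len(a), s)))
        if s <= len(b):
            psi[s] += b[s - 1]
    return np.array([sigma2 * psi[:n_psi - m] @ psi[m:] for m in range(m_max + 1)])

# AR(1) check: gamma_m = sigma^2 a1^m / (1 - a1^2).
print(arma_acov([0.5], [], m_max=3))     # approx [1.3333 0.6667 0.3333 0.1667]
```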

Applying the convolution formula to $y_t = (I - A(L))^{-1}\eta_t$, its ACGF is

(9) $g_y(z) = \dfrac{g_\eta(z)}{(1-A(z))(1-A(1/z))} = \sigma^2\,\dfrac{1+B(z)}{1-A(z)}\cdot\dfrac{1+B(1/z)}{1-A(1/z)} = \sigma^2\,\Psi(z)\Psi(1/z)$.

If all the roots of the polynomial $1 + B(z)$ lie outside the unit circle, then $I + B(L)$ is invertible, and there is an AR($\infty$) representation of (1), written formally as

(10) $\dfrac{I - A(L)}{I + B(L)}\,(y_t - \mu) = \varepsilon_t$.

It is not a condition for stationarity that $I + B(L)$ be invertible. However, it is always possible to re-scale the $\varepsilon$'s and redefine $B(L)$ so that the model has the same ACGF, but all the roots of $1 + B(z)$ lie outside or on the unit circle. For example, $\eta_t = (I + L/z_1)\varepsilon_t$ has ACGF $\sigma^2\bigl((1+z_1^{-2}) + z/z_1 + 1/(z z_1)\bigr)$, while $\eta_t = (I + z_1 L)(\varepsilon_t/z_1)$ has ACGF $(\sigma^2/z_1^2)\bigl((1+z_1^2) + z z_1 + z_1/z\bigr)$, which is the same; when $|z_1| < 1$ the first form is non-invertible and the second is invertible. Then one can factor $B(L)$ into an invertible term and a non-invertible term that has unit roots.

For some purposes, it is convenient to rewrite the ARMA model (1) by defining the $(p+q)\times 1$ vectors $h' = (1,0,\dots,0)$ with a one in the first component, $r' = (1,0,\dots,0,1,0,\dots,0)$ with ones in the first and $(p+1)$-st components and zeros elsewhere, and $\alpha_t' = (y_t-\mu,\dots,y_{t-p+1}-\mu,\varepsilon_t,\dots,\varepsilon_{t-q+1})$. Then

(11) $\alpha_{t+1} = F\alpha_t + r\varepsilon_{t+1}$ and $y_{t+1}-\mu = h'\alpha_{t+1}$,

where

(12) $F = \begin{bmatrix} a' & b' \\ I_{p-1,p} & 0_{p-1,q} \\ 0_{1,p} & 0_{1,q} \\ 0_{q-1,p} & I_{q-1,q} \end{bmatrix}$,

with $a' = (a_1,\dots,a_p)$, $b' = (b_1,\dots,b_q)$, $I_{rs}$ an $r\times s$ matrix with ones down the diagonal and zeros elsewhere, and $0_{rs}$ an $r\times s$ matrix of zeros. This is called a state space representation of the ARMA(p,q) model, with $\alpha_t$ called the state vector, $\alpha_{t+1} = F\alpha_t + r\varepsilon_{t+1}$ called the state equation, and $y_{t+1}-\mu = h'\alpha_{t+1}$ called the observation equation. State space representations are not unique, and the one given here is easy to interpret but not minimal in terms of dimensionality; see Harvey for a more compact and commonly used state space representation of the ARMA model.
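To make (11)-(12) concrete, here is a sketch (not from the notes; the function name is illustrative) that assembles $F$, $r$, and $h$ for given lag coefficients, following the block layout of (12):

```python
import numpy as np

def arma_state_space(a, b):
    """Assemble (F, r, h) of the state space form (11)-(12), with state
    alpha_t = (y_t-mu, ..., y_{t-p+1}-mu, eps_t, ..., eps_{t-q+1}); p >= 1."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    p, q = len(a), len(b)
    n = p + q
    F = np.zeros((n, n))
    F[0, :p], F[0, p:] = a, b                 # first row of (12): (a', b')
    F[1:p, :p] = np.eye(p)[:p - 1, :]         # I_{p-1,p}: shift the lagged y's
    F[p + 1:, p:] = np.eye(q)[:q - 1, :]      # I_{q-1,q}: shift the lagged eps's
    r = np.zeros(n)                           # row p of F stays zero, since
    r[0] = 1.0                                # eps_{t+1} enters through r
    if q > 0:
        r[p] = 1.0                            # ones in components 1 and p+1
    h = np.zeros(n)
    h[0] = 1.0                                # observation picks off y_t - mu
    return F, r, h

F, r, h = arma_state_space([0.5, -0.2], [0.3])
print(F)   # rows: (a', b'), the y-shift row, and the (zero) eps row
```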

2. Prediction

Let $G_t$ denote all of history up through $t$, including the realizations of $y_s$ and $\varepsilon_s$ for $s \leq t$. Then

(13) $E(y_{t+1}\mid G_t) = a_1 E(y_t\mid G_t) + \cdots + a_p E(y_{t-p+1}\mid G_t) + E(\varepsilon_{t+1}\mid G_t) + b_1 E(\varepsilon_t\mid G_t) + \cdots + b_q E(\varepsilon_{t-q+1}\mid G_t) = a_1 y_t + \cdots + a_p y_{t-p+1} + b_1\varepsilon_t + \cdots + b_q\varepsilon_{t-q+1}$.

As a shorthand, let $y_{t+1|t} = E(y_{t+1}\mid G_t)$; this is the forecast of $y_{t+1}$ given $G_t$ that minimizes mean square error. An implication of the formula for $y_{t+1|t}$ is that $\varepsilon_t = y_t - y_{t|t-1}$. Similarly, the minimum-MSE $m$-period-ahead forecast $y_{t+m|t} = E(y_{t+m}\mid G_t)$ is obtained using the recursion

(14) $y_{t+m|t} = a_1 y_{t+m-1|t} + \cdots + a_p y_{t+m-p|t} + b_1\varepsilon_{t+m-1|t} + \cdots + b_q\varepsilon_{t+m-q|t}$,

where $y_{t+m-i|t} = y_{t+m-i}$ and $\varepsilon_{t+m-i|t} = \varepsilon_{t+m-i}$ if $i \geq m$, and $\varepsilon_{t+m-i|t} = 0$ if $i < m$. The conditional variance of the forecast error $\nu_{t+m|t} = y_{t+m} - y_{t+m|t}$ is

(15) $v_{t+m|t} = E(\nu_{t+m|t}^2\mid G_t)$.

A convenient way of summarizing the forecasting formulas is in terms of the state space representation above, where the minimum-MSE forecast is

(16) $\alpha_{t+m|t} = F\alpha_{t+m-1|t} = F^m\alpha_t$.

The forecast error is $\nu_{t+m|t} = \alpha_{t+m} - \alpha_{t+m|t} = \sum_{s=0}^{m-1}F^s r\varepsilon_{t+m-s} = r\varepsilon_{t+m} + F\nu_{t+m-1|t}$, implying $\varepsilon_{t+1} = \alpha_{1,t+1} - \alpha_{1,t+1|t}$. The conditional covariance matrix of the forecast errors is

(17) $V_{t+m|t} = FV_{t+m-1|t}F' + \sigma^2 rr' = \sigma^2\sum_{s=0}^{m-1}F^s rr'F'^s$.

The updating formulas $\alpha_{t+m|t} = F\alpha_{t+m-1|t}$ and $V_{t+m|t} = FV_{t+m-1|t}F' + \sigma^2 rr'$ are versions of what is called the Kalman filter. This formulation has broader application in state space models for time series analysis, a topic that will be discussed elsewhere.

For some purposes, it is useful to predict backwards to period $t$ from information after $t$. Multiplying the equations $(I-A(L))y_t = \eta_t$ and $\eta_t = (I+B(L))\varepsilon_t$ by $L^{-p}$ and $L^{-q}$ respectively gives

(18) $E(y_t\mid y_{t+1},\dots;\eta_{t+1},\dots) = (y_{t+p} - a_1 y_{t+p-1} - \cdots - a_{p-1}y_{t+1} - \eta_{t+p})/a_p$,
$E(\varepsilon_t\mid \varepsilon_{t+1},\dots;\eta_{t+1},\dots) = (\eta_{t+q} - \varepsilon_{t+q} - b_1\varepsilon_{t+q-1} - \cdots - b_{q-1}\varepsilon_{t+1})/b_q$.
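A sketch of the forecast recursions (16)-(17) follows (not from the notes; `forecast_arma_state` is an illustrative name, and the example wraps an AR(1) as a one-dimensional state, a degenerate but convenient special case):

```python
import numpy as np

def forecast_arma_state(alpha_t, F, r, sigma2, m):
    """m-step-ahead state forecast and forecast-error covariance per (16)-(17):
    alpha_{t+m|t} = F^m alpha_t, V_{t+m|t} = sigma^2 sum_{s<m} F^s rr' F'^s."""
    alpha_f = np.linalg.matrix_power(F, m) @ alpha_t
    U = sigma2 * np.outer(r, r)
    V = np.zeros_like(F)
    for _ in range(m):                    # V <- F V F' + sigma^2 rr', m times
        V = F @ V @ F.T + U
    return alpha_f, V

# Degenerate illustration: an AR(1) with a1 = 0.5 as a one-dimensional state.
F, r = np.array([[0.5]]), np.array([1.0])
a_f, V = forecast_arma_state(np.array([2.0]), F, r, sigma2=1.0, m=3)
print(a_f, V)   # 0.5^3 * 2 = 0.25;  V = 1 + 0.25 + 0.0625 = 1.3125
```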

3. Method of Moments Estimation

The model (1) can be written as a regression model

(19) $y_t = c + a_1 y_{t-1} + \cdots + a_p y_{t-p} + \eta_t$,

where $c = (1 - a_1 - \cdots - a_p)\mu$ and the MA(q) disturbances $\eta_t$ have a covariance matrix given by the formulas following (6). Since $\eta_t$ is correlated with $y_{t-1},\dots,y_{t-q}$ but uncorrelated with earlier $y$'s, (19) can be estimated using 2SLS with $1, y_{t-q-1},\dots,y_{t-q-p}$ as instruments, and observations $t = p+q+1,\dots,T$. This method then loses the first $p+q$ observations in order to get the instruments.

To estimate the MA coefficients, first retrieve the residuals $\hat\eta_t$ from the 2SLS estimation of (19), and form the empirical ACGF

(20) $\hat g_\eta(z) = \sum_{s=-q}^{q}\hat\gamma_s z^s$, where $\hat\gamma_s = \frac{1}{T}\sum_{t=p+q+|s|+1}^{T}\hat\eta_t\hat\eta_{t-|s|}$

is an empirical estimate of $E\eta_t\eta_{t-|s|}$. From (8), the theoretical ACGF for $\eta$ is a bilinear function of the MA coefficients. Then, estimating the MA coefficients so that the theoretical and empirical ACGF coincide is relatively practical. As noted earlier, the solution is not in general unique, and it is preferable to pick estimates that give a stable root rather than an unstable one. This can be achieved by conducting the search over the $q$ possible roots of $1 + B(z)$ subject to the constraint that these roots lie outside or on the unit circle.

For the example of an MA(1) component in the ARMA model, matching the ACGF terms yields $\hat\gamma_0 = \sigma^2(1 + b_1^2)$ and $\hat\gamma_1 = \sigma^2 b_1$, which yields the quadratic $\hat\gamma_1 - \hat\gamma_0 b_1 + \hat\gamma_1 b_1^2 = 0$. This has the solution $b_1 = \hat\gamma_0/2\hat\gamma_1 \pm (\hat\gamma_0^2 - 4\hat\gamma_1^2)^{1/2}/2\hat\gamma_1$. If $\hat\gamma_0 \geq 2|\hat\gamma_1|$, there are two real roots, and $b_1 = \hat\gamma_0/2\hat\gamma_1 - (\hat\gamma_0^2 - 4\hat\gamma_1^2)^{1/2}/2\hat\gamma_1$ gives the stable root. If $\hat\gamma_0 < 2|\hat\gamma_1|$, there is no real root, and the empirical ACGF cannot be matched exactly by an MA(1) model. In this case, $b_1 = \mathrm{sign}(\hat\gamma_1)$ yields the MA(1) model with a root on or outside the unit circle that is closest to the empirical ACGF.

For the example of an MA(2) component, writing $1 + B(z) = (1 + z/z_1)(1 + z/z_2)$ in terms of its roots and matching the ACGF terms yields $\hat\gamma_0 = \sigma^2\bigl(1 + (1/z_1 + 1/z_2)^2 + 1/(z_1 z_2)^2\bigr)$, $\hat\gamma_1 = \sigma^2(1/z_1 + 1/z_2)\bigl(1 + 1/(z_1 z_2)\bigr)$, and $\hat\gamma_2 = \sigma^2/(z_1 z_2)$. The estimator would then be obtained by a search over $\sigma^2$, $z_1$, and $z_2$, subject to the restriction that $z_1$ and $z_2$ are in the complex plane, on or outside the unit circle, and either both real or complex conjugates.

The estimators described above are not the most efficient available among those employing only moment conditions. However, it will be convenient to develop the alternatives as estimators for the case of normal disturbances, and then note that they will be consistent even without normality.
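Returning to the MA(1) example, the root-selection rule above is easy to code. A minimal sketch (not from the notes; the function name is illustrative, and the inputs are the empirical autocovariances $\hat\gamma_0,\hat\gamma_1$ of the residuals):

```python
import numpy as np

def ma1_from_acov(g0, g1):
    """Match an MA(1) to empirical autocovariances (g0, g1): solve
    g1 b^2 - g0 b + g1 = 0 and keep the invertible root. Returns (b1, sigma2)."""
    if g1 == 0.0:
        return 0.0, g0                          # white noise case
    disc = g0 ** 2 - 4.0 * g1 ** 2
    if disc >= 0.0:                             # real roots: '-' branch is stable
        b1 = (g0 - np.sqrt(disc)) / (2.0 * g1)
    else:                                       # no real root: closest |b1| = 1
        b1 = np.sign(g1)
    return b1, g0 / (1.0 + b1 ** 2)

# Example: autocovariances generated by b1 = 0.3, sigma2 = 2.
print(ma1_from_acov(2.18, 0.6))                 # (0.3, 2.0)
```

The two real roots multiply to one, so the '-' branch is the root with $|b_1| \leq 1$, i.e., the invertible representation.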

4. Maximum Likelihood Estimation

Suppose the disturbances $\varepsilon_t$ in (1) are normal. The vector $(y_1,\dots,y_T)$ is then multivariate normal with mean $(\mu,\dots,\mu)$ and a covariance matrix $\Omega$ that is a band matrix, with all coefficients $s$ places off the diagonal equal to $\gamma_s$:

(21) $\Omega = \begin{bmatrix} \gamma_0 & \gamma_1 & \cdots & \gamma_{T-2} & \gamma_{T-1} \\ \gamma_1 & \gamma_0 & \cdots & \gamma_{T-3} & \gamma_{T-2} \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ \gamma_{T-2} & \gamma_{T-3} & \cdots & \gamma_0 & \gamma_1 \\ \gamma_{T-1} & \gamma_{T-2} & \cdots & \gamma_1 & \gamma_0 \end{bmatrix}$.

The autocovariances $\gamma_s$ in $\Omega$ are functions of the deep parameters of the model, $\sigma^2$ and the coefficients of $A(L)$ and $B(L)$; these are given by the coefficients of the ACGF (9). A brute force approach to estimating the model, which is efficient under the assumption of normality, is then to maximize the log likelihood function

(22) $L = -(T/2)\log(2\pi) - \tfrac{1}{2}\log(\det\Omega) - \tfrac{1}{2}(y_1-\mu,\dots,y_T-\mu)\,\Omega^{-1}(y_1-\mu,\dots,y_T-\mu)'$

in the parameters $\mu$, $\sigma^2$, $a_1,\dots,a_p$, $b_1,\dots,b_q$. The likelihood function is maximized in $\mu$ at the sample mean $\bar y$. Using the formulas for differentiation of determinants and inverses, the first-order condition for a parameter $\theta$ of $\Omega$ is

(23) $\partial L/\partial\theta = -\tfrac{1}{2}\,\mathrm{tr}\bigl\{\Omega^{-1}(I - uu'\Omega^{-1})\,\partial\Omega/\partial\theta\bigr\}$,

where $u' = (y_1-\bar y,\dots,y_T-\bar y)$ is the vector of deviations from the sample mean. The practical problem with the brute force approach is that the parameters of the ARMA process appear in $L$ non-linearly, deep within the $\Omega$ matrix. Therefore, considerable effort in time-series analysis goes to reformulating the maximum likelihood problem in ways that are more tractable computationally.
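A direct implementation of (21)-(22) shows why the brute force approach is expensive: each evaluation builds and factors the $T\times T$ Toeplitz matrix $\Omega$. A minimal sketch (not from the notes), assuming the autocovariances $\gamma_0,\dots,\gamma_{T-1}$ have already been computed, e.g., by truncating (6):

```python
import numpy as np
from scipy.linalg import toeplitz

def gauss_loglik(y, mu, gammas):
    """Brute-force Gaussian log likelihood (22): gammas holds the model
    autocovariances gamma_0, ..., gamma_{T-1}; Omega is the band matrix (21)."""
    T = len(y)
    Omega = toeplitz(gammas[:T])                 # symmetric Toeplitz matrix
    u = np.asarray(y, float) - mu                # deviations from the mean
    logdet = np.linalg.slogdet(Omega)[1]
    quad = u @ np.linalg.solve(Omega, u)
    return -0.5 * T * np.log(2 * np.pi) - 0.5 * logdet - 0.5 * quad

# Example: white noise with variance 2, so gamma_0 = 2 and the rest are zero.
y = np.array([0.1, -0.4, 0.3])
print(gauss_loglik(y, 0.0, np.array([2.0, 0.0, 0.0])))
```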

The model (1) can be rewritten as

(24) $\varepsilon_t = y_t - a_1 y_{t-1} - \cdots - a_p y_{t-p} - b_1\varepsilon_{t-1} - \cdots - b_q\varepsilon_{t-q}$.

Taking expectations conditioned on the information $G_{t-1}$, this equation implies $0 = y_{t|t-1} - a_1 y_{t-1} - \cdots - a_p y_{t-p} - b_1\varepsilon_{t-1} - \cdots - b_q\varepsilon_{t-q}$, and hence

(25) $\varepsilon_t = y_t - y_{t|t-1}$.

The mapping from the $\varepsilon$'s to the $y$'s is linear, with a transformation matrix that is triangular (i.e., $y_t$ depends only on $\varepsilon_s$ at or before $t$) with ones on the diagonal. Then the Jacobian of the transformation from $(y_1,\dots,y_T)$ to $(\varepsilon_1,\dots,\varepsilon_T)$ is one, and the log likelihood, written in terms of the $\varepsilon_t$, is

(26) $L = -(T/2)\log(2\pi) - (T/2)\log(\sigma^2) - \tfrac{1}{2}\sum_{t=1}^{T}\varepsilon_t^2/\sigma^2$,

with the $\varepsilon_t$ defined (recursively) as functions of the lag parameters from (24). This is called the prediction error decomposition of the likelihood. One approach to estimation is to condition on $(y_1,\dots,y_p)$ and on starting values $(\varepsilon_{p-q+1},\dots,\varepsilon_p)$, so that (24) gives the $\varepsilon$'s for $t = p+1,\dots,T$, and then to maximize the conditional log likelihood, which is equivalent to minimizing the conditional sum of squares

(27) $CSS = \sum_{t=p+1}^{T}\varepsilon_t^2$

in the lag parameters. This gives a pre-estimator (because of its dependence on the starting $\varepsilon$'s) that is equivalent to the solution at convergence obtained by iteratively applying least squares to the equation below, with the lagged $\varepsilon$'s computed using (24) and the lag coefficient estimates from the previous round:

(28) $y_t = a_1 y_{t-1} + \cdots + a_p y_{t-p} + b_1\varepsilon_{t-1} + \cdots + b_q\varepsilon_{t-q} + \varepsilon_t$, for $t = p+1,\dots,T$.

The final step is to get rid of the dependence of the estimator on the initial $\varepsilon$'s. This is often done in the computationally expedient way of replacing them by their unconditional expectations, which are zero. A superior procedure, only moderately more difficult, is described later.
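A sketch of the CSS criterion (27) with the expedient zero initialization of the pre-sample $\varepsilon$'s described above (not from the notes; it assumes $y$ has already been demeaned, and the function name is illustrative). A numerical optimizer applied to this criterion gives the conditional least squares estimates:

```python
import numpy as np

def css(params, y, p, q):
    """Conditional sum of squares (27): recover the eps_t recursively from (24),
    conditioning on the first p observations and zero pre-sample eps's."""
    a, b = params[:p], params[p:p + q]
    y = np.asarray(y, float)
    eps = np.zeros(len(y))                       # eps_1, ..., eps_p held at zero
    for t in range(p, len(y)):
        ar = sum(a[i] * y[t - 1 - i] for i in range(p))
        ma = sum(b[j] * eps[t - 1 - j] for j in range(q))
        eps[t] = y[t] - ar - ma
    return np.sum(eps[p:] ** 2)

y = np.array([0.2, 0.5, -0.1, 0.4, 0.0])
print(css(np.array([0.5, 0.3]), y, p=1, q=1))
# Minimizing over params, e.g. scipy.optimize.minimize(css, np.zeros(2),
# args=(y, 1, 1)), gives the conditional least squares estimates.
```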

For AR(p) models, a useful simplification of (22) comes from noting that the density of $(y_1,\dots,y_T)$ can be written as the product of the $p$-dimensional multivariate normal density of $(y_1,\dots,y_p)$ and the one-dimensional conditional densities of $y_t$ given $y_{t-1},\dots,y_{t-p}$ for $t = p+1,\dots,T$. In this formulation, the log likelihood is

(29) $L = -(T/2)\log(2\pi) - \tfrac{1}{2}\log(\det\Omega_p) - \tfrac{1}{2}(y_1-\mu,\dots,y_p-\mu)\,\Omega_p^{-1}(y_1-\mu,\dots,y_p-\mu)' - \tfrac{T-p}{2}\log(\sigma^2) - \tfrac{1}{2\sigma^2}\sum_{t=p+1}^{T}(y_t - a_1 y_{t-1} - \cdots - a_p y_{t-p})^2$,

where $\Omega_p$ is the covariance matrix of the first $p$ observations. If, further, one conditions on $(y_1,\dots,y_p)$, the resulting log likelihood is maximized when the quadratic form $\sum_{t=p+1}^{T}(y_t - a_1 y_{t-1} - \cdots - a_p y_{t-p})^2$ is minimized; this is exactly the same as the regression (19) in the case $q = 0$. Conditioning on the first $p$ observations, maximization of this likelihood function is equivalent to (non-iterative) least squares applied to the model (28), or to minimizing the CSS in (27).

Now consider the full stationary ARMA(p,q) model (1), and its representation (11) in state space form. Let $z_t$ denote the conditional expectation of $\alpha_t$ given $y_t,y_{t-1},\dots,y_1$. This is the optimal predictor of $\alpha_t$ given this information. Given normality, this is a linear function of the $y$'s. Let $P_t$ denote the conditional MSE of the deviation $\alpha_t - z_t$; i.e., $P_t = E\{(\alpha_t - z_t)(\alpha_t - z_t)'\mid y_t,y_{t-1},\dots,y_1\}$. Similarly, define $z_{t|t-1} = E(\alpha_t\mid y_{t-1},\dots,y_1)$ and $P_{t|t-1} = E\{(\alpha_t - z_{t|t-1})(\alpha_t - z_{t|t-1})'\mid y_{t-1},\dots,y_1\}$. Given the state equation $\alpha_t = F\alpha_{t-1} + r\varepsilon_t$ and the observation equation $y_t = h'\alpha_t$, and taking conditional expectations, one obtains the formulas

(30) $z_{t|t-1} = Fz_{t-1}$, $\quad P_{t|t-1} = FP_{t-1}F' + \sigma^2 rr'$, $\quad y_{t|t-1} = h'z_{t|t-1}$,

and from these the updating relationships for projections,

(31) $\nu_t \equiv y_t - y_{t|t-1} = h'(\alpha_t - z_{t|t-1})$, $\quad f_t = h'P_{t|t-1}h$,
$z_t = z_{t|t-1} + P_{t|t-1}h(y_t - h'z_{t|t-1})/f_t$, $\quad P_t = P_{t|t-1} - P_{t|t-1}hh'P_{t|t-1}/f_t$.

These formulas (with a different notation) are derived and discussed in Harvey for a more general model that includes the stationary ARMA as a special case. The formulas in (31) can be employed, with an initialization for $z_0$ and $P_0$, to calculate the exact joint normal density function of $(y_1,\dots,y_T)$:

(32) $L = -(T/2)\log(2\pi) - \tfrac{1}{2}\sum_{t=1}^{T}\log(f_t) - \tfrac{1}{2}\sum_{t=1}^{T}\nu_t^2/f_t$.

This is a prediction error decomposition form of the log likelihood. The maximum likelihood estimates can also be given an interpretation of minimizing a CSS; see Harvey, p. 90. Harvey, p. 88, also describes the construction of starting values. For the stationary ARMA model, they are $z_0 = (I - F)^{-1}c$, where $c$ is a vector with $\mu$ in the first $p$ components and 0 in the remaining $q$ components, and $\mathrm{vec}(P_0) = \sigma^2(I - F\otimes F)^{-1}\mathrm{vec}(rr')$. Absent normality, the predictors above remain best linear predictors, and the estimators continue to have an interpretation as minimum CSS estimators that use all of the sample information on the first two moments, with a Bayesian interpretation of the starting values.
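Finally, a sketch of the filter (30)-(32) with the stationary starting values (not from the notes). Since the state here is in deviations from $\mu$, the sketch initializes $z_0 = 0$; the notes' $z_0 = (I-F)^{-1}c$ reduces to this when the constant vector $c$ is zero. It uses $\mathrm{vec}(P_0) = \sigma^2(I - F\otimes F)^{-1}\mathrm{vec}(rr')$:

```python
import numpy as np

def kalman_loglik(y, mu, F, r, h, sigma2):
    """Prediction error decomposition likelihood (30)-(32) for the state space
    form (11), with z0 = 0 (state in deviations from mu) and the stationary
    vec(P0) = sigma^2 (I - F kron F)^{-1} vec(rr')."""
    n = F.shape[0]
    z = np.zeros(n)
    RR = sigma2 * np.outer(r, r)
    P = np.linalg.solve(np.eye(n * n) - np.kron(F, F), RR.ravel()).reshape(n, n)
    ll = 0.0
    for yt in np.asarray(y, float):
        z = F @ z                                # prediction step (30)
        P = F @ P @ F.T + RR
        v = (yt - mu) - h @ z                    # innovation nu_t
        f = h @ P @ h                            # innovation variance f_t
        ll += -0.5 * (np.log(2 * np.pi) + np.log(f) + v * v / f)
        K = P @ h / f                            # update step (31)
        z = z + K * v
        P = P - np.outer(K, h @ P)
    return ll

# Example: AR(1) with a1 = 0.5 written as a one-dimensional state space.
F, r, h = np.array([[0.5]]), np.array([1.0]), np.array([1.0])
print(kalman_loglik([0.3, -0.1, 0.4], mu=0.0, F=F, r=r, h=h, sigma2=1.0))
```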
