Time Series 3. Robert Almgren. Sept. 28, 2009


Last time we discussed two main categories of linear models, and their combination. Here $w_t$ denotes a white noise: a stationary process with $E(w_t) = 0$, $E(w_t^2) = \sigma^2$, and $E(w_t w_s) = 0$ for $s \neq t$.

MA(q):
$$x_t = \mu + w_t + \sum_{j=1}^{q} \theta_j w_{t-j} = \mu + w_t + \theta_1 w_{t-1} + \cdots + \theta_q w_{t-q}.$$
$x_t$ is stationary, with well-defined variance and covariances, for any values of the coefficients $\theta_0, \dots, \theta_q$, if $q$ is finite. In the limit $q \to \infty$, we require the rather weak condition that $\sum_j \theta_j^2$ exists. We are respecting the standard normalization condition that $\theta_0 = 1$.

AR(p):
$$x_t = c + w_t + \sum_{j=1}^{p} \phi_j x_{t-j} = c + w_t + \phi_1 x_{t-1} + \cdots + \phi_p x_{t-p}.$$
The process is stationary if and only if the characteristic polynomial $P(z) = 1 - \phi_1 z - \cdots - \phi_p z^p$ has all its roots outside the unit disk. Again we set $\phi_0 = 1$.

ARMA(p,q): It is straightforward to combine the above as
$$x_t = c + \sum_{j=1}^{p} \phi_j x_{t-j} + \sum_{j=0}^{q} \theta_j w_{t-j}.$$
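As a concrete illustration (a minimal sketch of my own, not part of the original notes; the function names and example coefficients are arbitrary), the ARMA(p,q) recursion can be simulated directly and the AR root condition checked numerically:

```python
import numpy as np

def is_stationary(phi):
    """The AR part is stationary iff P(z) = 1 - phi_1 z - ... - phi_p z^p
    has all of its roots strictly outside the unit disk."""
    asc = np.r_[1.0, -np.asarray(phi, dtype=float)]  # P(z), ascending powers
    roots = np.roots(asc[::-1])                      # np.roots wants descending
    return bool(np.all(np.abs(roots) > 1.0))

def simulate_arma(phi, theta, c=0.0, sigma=1.0, n=5000, burn=500, seed=0):
    """Simulate x_t = c + sum_{j=1}^p phi_j x_{t-j} + sum_{j=0}^q theta_j w_{t-j},
    with theta[0] = 1 under the normalization in the text."""
    rng = np.random.default_rng(seed)
    p, q = len(phi), len(theta) - 1
    w = rng.normal(0.0, sigma, size=n + burn)
    x = np.zeros(n + burn)
    for t in range(max(p, q), n + burn):
        x[t] = (c
                + sum(phi[j - 1] * x[t - j] for j in range(1, p + 1))
                + sum(theta[j] * w[t - j] for j in range(q + 1)))
    return x[burn:]

phi, theta = [0.5, -0.2], [1.0, 0.4]   # a stationary ARMA(2,1)
assert is_stationary(phi)
x = simulate_arma(phi, theta)
print(x.mean(), x.var())               # sample moments of the stationary process
```

Changing `phi` to, say, `[1.5]` makes `is_stationary` return False, and the simulated recursion then explodes instead of settling down.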

The ARMA(p,q) model is stationary if the coefficients of the AR part satisfy the condition above. With this model you can get reasonably rich behavior with fairly low orders of $p$ and $q$.

We would now like to answer two particular questions about linear time series:

1. When can we convert an MA model into an AR model, or conversely? (Answer: any stationary AR(p) model can easily be converted to an MA(∞) model. For any MA(q) model, we can find an equivalent MA representation of the same sequence $x_t$ that can be represented by an AR(∞) model.)

2. Given a sequence of numbers $\gamma_\ell$, can we always find a linear model whose autocovariance is that sequence? (Answer: yes, if the given covariance sequence is positive definite and decays.)

1 Invertibility

It is convenient to streamline our notation a bit. Let $B$ denote the lag or back-shift operator acting on infinite sequences of numbers or random variables $\{x_t\}_{t=-\infty}^{\infty}$: $y = Bx$ means $y_t = x_{t-1}$ for each $t$. Powers of $B$ mean repeated shifts: $y = B^k x$ means $y_t = x_{t-k}$ for each $t$, including the identity $B^0 = I$, and negative powers: $y = B^{-1} x$ means $y_t = x_{t+1}$ for each $t$.

Then a general ARMA(p,q) model may be written
$$P(B)\,x = c + Q(B)\,w,$$
where the characteristic polynomial of the MA(q) part is
$$Q(z) = 1 + \theta_1 z + \cdots + \theta_q z^q.$$
These polynomials both have $P(0) = Q(0) = 1$, but so far only $P(z)$ has conditions on the locations of its roots. We assume that $P(z)$ and $Q(z)$ have no common roots.
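In code, a polynomial in the back-shift operator is nothing more than a weighted sum of lagged copies of the sequence. A small sketch (again mine, with a hypothetical helper name), applying $a_0 I + a_1 B + \cdots + a_m B^m$ to a finite sample:

```python
import numpy as np

def poly_in_B(coeffs, x):
    """Apply a_0 I + a_1 B + ... + a_m B^m to a finite sequence x,
    keeping only the entries for which all lags are available."""
    m = len(coeffs) - 1
    n = len(x)
    return sum(a * x[m - k : n - k] for k, a in enumerate(coeffs))

x = np.arange(6.0)
print(poly_in_B([1.0, -0.5], x))   # x_t - 0.5 x_{t-1} for t = 1, ..., 5
```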

There is a standard mathematical relaxation: polynomial → complex function analytic near zero. Specifically, as the order of a polynomial tends to infinity, it may be interpreted as a power series representation of a more general function. If the coefficients decay reasonably rapidly, then the function is analytic in a neighborhood of zero. For example, if the coefficients are absolutely summable, then the series converges for any $z$ with $|z| \le 1$ and hence defines an analytic function in that disk. Conversely, any function that is analytic in the unit disk has a power series with summable coefficients, and may therefore be interpreted as a polynomial of infinite order.

1.1 AR → MA

For any stationary AR(p) model, the polynomial function $P(z)$ is nonzero for $|z| \le 1$. Therefore, the function $\widetilde{Q}(z) = 1/P(z)$ is analytic for $|z| \le 1$ and has a power series representation $\widetilde{Q}(z) = 1 + \tilde\theta_1 z + \tilde\theta_2 z^2 + \cdots$ with summable coefficients. Therefore, at least formally, we may rearrange the AR(p) model as
$$x = \frac{1}{P(B)}\,(c + w) = \mu + \widetilde{Q}(B)\,w$$
with $\mu = c/P(1)$, since $B$ acts as the identity on a constant sequence. Thus the coefficients in the power series expansion of $\widetilde{Q}(z)$ are the coefficients in the MA(∞) model. Let us note for future reference that $\widetilde{Q}(z)$ has no zeros in the unit disk, since $P(z)$ is finite there. In fact this works not only formally but in reality, because $B$ is a bounded operator, or just because the manipulations with powers of $B$ map exactly to shifts of the sequences.
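The power-series coefficients of $\widetilde{Q}(z) = 1/P(z)$ can be generated by a simple recursion, found by matching powers of $z$ in $P(z)\,\widetilde{Q}(z) = 1$. A sketch of mine, not from the notes:

```python
import numpy as np

def ar_to_ma(phi, n_terms=50):
    """Coefficients of Q~(z) = 1/P(z) with P(z) = 1 - phi_1 z - ... - phi_p z^p.
    Matching powers of z in P(z) Q~(z) = 1 gives theta~_0 = 1 and, for j >= 1,
    theta~_j = sum_{k=1}^{min(j,p)} phi_k theta~_{j-k}."""
    p = len(phi)
    theta = np.zeros(n_terms)
    theta[0] = 1.0
    for j in range(1, n_terms):
        theta[j] = sum(phi[k - 1] * theta[j - k] for k in range(1, min(j, p) + 1))
    return theta

print(ar_to_ma([0.5, -0.2])[:6])   # MA(infinity) weights of a stationary AR(2)
```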

Example. Last time we did the calculations explicitly for the AR(1) model, taking $P(z) = 1 - \phi z$ with $|\phi| < 1$. Then
$$\widetilde{Q}(z) = \frac{1}{P(z)} = \frac{1}{1 - \phi z} = 1 + \phi z + \phi^2 z^2 + \cdots \qquad \text{for } |z| \le 1$$
(in fact the sum would converge for $|z| < 1/|\phi|$). So the MA(∞) model has $\tilde\theta_j = \phi^j$, which is summable, and $\mu = c/(1 - \phi)$.

1.2 MA → AR

In principle, the same procedure should work in reverse. Given an MA(q) model defined by its polynomial $Q(z)$ and its mean $\mu$, we simply invert the operator to write
$$\frac{1}{Q(B)}\,x = \frac{1}{Q(B)}\,\mu + w \qquad\text{or}\qquad \widetilde{P}(B)\,x = \tilde{c} + w,$$
where $\widetilde{P}(z)$ is the power series expansion of $1/Q(z)$ and $\tilde{c} = \mu/Q(1)$. But for this to make sense, we need $\widetilde{P}(z)$ to be a generalisation of a polynomial with zeros outside the unit disk. In particular, it should be analytic in the unit disk, and this will be true if and only if the polynomial $Q(z)$ has no zeros in the unit disk. But we did not need to impose that condition on $Q$ to get a well-defined MA model: many perfectly fine MA(q) models will not satisfy this condition.

Example. Last time we studied the MA(1) model with $Q(z) = 1 + \alpha z$, which was well behaved for any value of $\alpha$. We would like to invert
$$\widetilde{P}(z) = \frac{1}{Q(z)} = \frac{1}{1 + \alpha z} = 1 - \alpha z + \alpha^2 z^2 - \alpha^3 z^3 + \cdots$$
which converges for $|z| < 1/|\alpha|$. Only if $|\alpha| < 1$ does this include the unit disk and give us a reasonable AR(∞) model. However, remember that the white noise $w_t$ is only a device that we have used to create the series $x_t$, so the precise relationship between the real $x_t$ and our particular definition of $w_t$ is not important. All that is important is the moments of $x_t$, in particular the autocovariance $\gamma_\ell = \sigma^2 \sum_{j=0}^{q-\ell} \theta_j \theta_{j+\ell}$. For this model we calculated
$$\gamma_0 = (1 + \alpha^2)\,\sigma^2, \qquad \gamma_1 = \alpha\,\sigma^2,$$
and all higher covariances are zero.
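The same recursion with a sign flip (because $Q(z) = 1 + \theta_1 z + \cdots$) generates the candidate AR(∞) coefficients, and it makes the invertibility condition visible numerically: the coefficients decay when $|\alpha| < 1$ and blow up when $|\alpha| > 1$. A sketch under the same caveats as before:

```python
import numpy as np

def ma_to_ar(theta, n_terms=30):
    """Coefficients of P~(z) = 1/Q(z), Q(z) = 1 + theta_1 z + ... + theta_q z^q:
    pi_0 = 1 and, for j >= 1, pi_j = -sum_{k=1}^{min(j,q)} theta_k pi_{j-k}."""
    q = len(theta)
    pi = np.zeros(n_terms)
    pi[0] = 1.0
    for j in range(1, n_terms):
        pi[j] = -sum(theta[k - 1] * pi[j - k] for k in range(1, min(j, q) + 1))
    return pi

print(ma_to_ar([0.5])[:5])   # (-0.5)^j: decays, the MA(1) is invertible
print(ma_to_ar([2.0])[:5])   # (-2)^j: blows up, non-invertible
```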

If we find a new process that has the same set of covariances, then we can consider it equivalent to the original, and if we can invert the new process we are happy. Thus we define a new MA(1) process, with a different coefficient $\tilde\alpha$ and a different noise variance $\tilde\sigma^2$. We choose $\tilde\alpha$ and $\tilde\sigma$ so that
$$(1 + \tilde\alpha^2)\,\tilde\sigma^2 = (1 + \alpha^2)\,\sigma^2 \qquad\text{and}\qquad \tilde\alpha\,\tilde\sigma^2 = \alpha\,\sigma^2.$$
Indeed, a solution to this (besides $\tilde\alpha = \alpha$ and $\tilde\sigma = \sigma$) is
$$\tilde\alpha = \frac{1}{\alpha} \qquad\text{and}\qquad \tilde\sigma^2 = \alpha^2 \sigma^2.$$
The new process has the same moments $\tilde\gamma_0 = \gamma_0$ and $\tilde\gamma_1 = \gamma_1$ and is hence indistinguishable. It is a different representation of the same stochastic process $x_t$. But if the original process had $|\alpha| > 1$ and was thus non-invertible, the new one has $|\tilde\alpha| < 1$ and is invertible. If we relax the normalization condition to allow $\theta_0 \neq 1$, then we can write the new process even more simply: instead of $(\theta_0, \theta_1) = (1, \alpha)$, we set $(\tilde\theta_0, \tilde\theta_1) = (\alpha, 1)$, using the same noise volatility $\sigma$.

In general, consider a polynomial $Q(z)$ of order $q$ with $Q(0) = 1$. We factor $Q$ as
$$Q(z) = (1 - \lambda_1 z)\cdots(1 - \lambda_q z),$$
where $\lambda_1, \dots, \lambda_q$ are the inverses of the roots in the complex plane. If all $|\lambda_j| < 1$ then the model is invertible. Let us suppose $|\lambda_j| > 1$ for $j = 1, \dots, r$ (the problem roots), and $|\lambda_j| < 1$ for $j = r+1, \dots, q$, ignoring the possibility that any $|\lambda_j| = 1$. Following the logic above, we simply invert the roots that cause the problems, constructing a new polynomial
$$\widetilde{Q}(z) = \Big(1 - \frac{z}{\lambda_1}\Big)\cdots\Big(1 - \frac{z}{\lambda_r}\Big)\,(1 - \lambda_{r+1} z)\cdots(1 - \lambda_q z)$$
and using a new noise process $\tilde w$ with variance
$$E(\tilde w_t^2) = \tilde\sigma^2 = |\lambda_1|^2 \cdots |\lambda_r|^2\,\sigma^2.$$
This new model has all roots outside the unit circle and is invertible. The rescaling of the noise variance is necessary only because we are enforcing $\tilde\theta_0 = 1$, so $\widetilde{Q}(0) = 1$. If we relax that, then we can write
$$\widetilde{Q}(z) = (\lambda_1 - z)\cdots(\lambda_r - z)\,(1 - \lambda_{r+1} z)\cdots(1 - \lambda_q z)$$
and use the same noise variance as in the original model.
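This construction is easy to carry out numerically. The sketch below (my own; `invertible_representation` is a hypothetical helper, not from the notes) flips the problem roots with numpy, rescales the noise variance, and confirms on the MA(1) example with $\alpha = 2$ that the autocovariances are unchanged:

```python
import numpy as np

def ma_autocov(theta, sigma2, lmax=3):
    """Autocovariances gamma_l = sigma^2 sum_j theta_j theta_{j+l}, theta_0 = 1."""
    th = np.r_[1.0, np.asarray(theta, dtype=float)]
    return [sigma2 * sum(th[j] * th[j + l] for j in range(len(th) - l))
            for l in range(lmax)]

def invertible_representation(theta, sigma2):
    """Flip the problem roots (|lambda_j| > 1, i.e. zeros of Q inside the
    unit disk) and rescale the noise variance as in the text."""
    asc = np.r_[1.0, np.asarray(theta, dtype=float)]   # Q(z), ascending powers
    lam = 1.0 / np.roots(asc[::-1])                    # inverse roots lambda_j
    bad = np.abs(lam) > 1.0
    sigma2_new = sigma2 * np.prod(np.abs(lam[bad]) ** 2)
    lam[bad] = 1.0 / lam[bad]                          # invert the problem roots
    new = np.poly(1.0 / lam)[::-1]                     # rebuild Q~, ascending
    new = new / new[0]                                 # normalize Q~(0) = 1
    return np.real(new[1:]), float(np.real(sigma2_new))

theta_new, s2_new = invertible_representation([2.0], 1.0)
print(theta_new, s2_new)               # [0.5] 4.0, i.e. alpha -> 1/alpha
print(ma_autocov([2.0], 1.0))          # [5.0, 2.0, 0.0]
print(ma_autocov(theta_new, s2_new))   # the same covariances
```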

Lemma: $\tilde x = \widetilde{Q}(B)\,w$ has the same covariances as $x = Q(B)\,w$.

Proof: Let us write the two models as
$$x = (I - \lambda_1 B)\cdots(I - \lambda_q B)\,w,$$
$$\tilde x = (\lambda_1 - B)\cdots(\lambda_r - B)\,(I - \lambda_{r+1} B)\cdots(I - \lambda_q B)\,w,$$
where we use the same noise sequence $w$. We go from $w$ to $x$ in $q$ steps: $x^{(q)}, x^{(q-1)}, \dots, x^{(1)}, x^{(0)}$, with $x^{(q)} = w$ and $x^{(0)} = x$, and similarly for $\tilde x^{(k)}$. At each step, we set
$$x^{(j-1)} = (I - \lambda_j B)\,x^{(j)} \quad\text{for } j = 1, \dots, q$$
and
$$\tilde x^{(j-1)} = \begin{cases} (I - \lambda_j B)\,\tilde x^{(j)}, & j = r+1, \dots, q, \\ (\lambda_j - B)\,\tilde x^{(j)}, & j = 1, \dots, r. \end{cases}$$

At the beginning of this process, $\tilde x^{(q)}$ has the same covariance as $x^{(q)}$, since both equal the same noise $w$. We want to show that each $\tilde x^{(j)}$ has the same covariance as $x^{(j)}$, so that finally $x = x^{(0)}$ and $\tilde x = \tilde x^{(0)}$ have the same covariance and are equivalent. Clearly, this is true as far as $x^{(r)}$, since for $j = r+1, \dots, q$ the two constructions apply identical factors.

Streamlining notation somewhat, we need to show that if $\tilde z$ has the same covariance as $z$, then
$$\tilde y = (\lambda - B)\,\tilde z \quad\text{has the same covariance as}\quad y = (I - \lambda B)\,z.$$
This notation simply means that $\tilde y_t = \lambda \tilde z_t - \tilde z_{t-1}$ and $y_t = z_t - \lambda z_{t-1}$. Let us denote by $\chi_\ell$ the common covariance of $z$ and $\tilde z$. Then we easily calculate the covariances of $\tilde y$ and of $y$ as
$$\begin{aligned}
\tilde\gamma_\ell = E\big(\tilde y_t\,\tilde y_{t-\ell}\big) &= E\big[(\lambda \tilde z_t - \tilde z_{t-1})(\lambda \tilde z_{t-\ell} - \tilde z_{t-\ell-1})\big] \\
&= E\big(\lambda^2 \tilde z_t \tilde z_{t-\ell} - \lambda\,\tilde z_t \tilde z_{t-\ell-1} - \lambda\,\tilde z_{t-1} \tilde z_{t-\ell} + \tilde z_{t-1} \tilde z_{t-\ell-1}\big) \\
&= (1 + \lambda^2)\,\chi_\ell - \lambda\,(\chi_{\ell-1} + \chi_{\ell+1})
\end{aligned}$$
and
$$\begin{aligned}
\gamma_\ell = E\big(y_t\,y_{t-\ell}\big) &= E\big[(z_t - \lambda z_{t-1})(z_{t-\ell} - \lambda z_{t-\ell-1})\big] \\
&= E\big(z_t z_{t-\ell} - \lambda\,z_{t-1} z_{t-\ell} - \lambda\,z_t z_{t-\ell-1} + \lambda^2 z_{t-1} z_{t-\ell-1}\big) \\
&= (1 + \lambda^2)\,\chi_\ell - \lambda\,(\chi_{\ell-1} + \chi_{\ell+1}),
\end{aligned}$$
which are the same. QED
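A quick numerical sanity check of the lemma (my own illustration, in the special case where $z$ and $\tilde z$ are the identical white-noise realization, so $\chi_0 = 1$ and $\chi_\ell = 0$ otherwise): both filters should give sample autocovariances near $(1 + \lambda^2) = 1.49$ at lag 0 and $-\lambda = -0.7$ at lag 1.

```python
import numpy as np

def acov(u, l):
    """Sample autocovariance of u at lag l (mean removed)."""
    u = u - u.mean()
    return float(np.mean(u[l:] * u[:len(u) - l]))

rng = np.random.default_rng(1)
z = rng.normal(size=200_000)
lam = 0.7
y  = z[1:] - lam * z[:-1]      # y  = (I - lam B) z
yt = lam * z[1:] - z[:-1]      # y~ = (lam I - B) z
print([round(acov(y,  l), 3) for l in range(3)])   # ~ [1.49, -0.7, 0.0]
print([round(acov(yt, l), 3) for l in range(3)])   # the same, as the lemma says
```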

To summarize: if $x$ and $\tilde x$ are constructed from $w$ using different formulas, then they will give different outputs if fed with the same realization of $w$. But considered as statistical processes over all the possible realizations of $w$, their statistical properties are equivalent, and there is no possible way we could distinguish $x$ from $\tilde x$ by observation, since $w$ is not observed. For any proposed MA(q) model, we may always choose to use an invertible representation, and then we may freely go back and forth between MA and AR models.

2 Reconstruction from covariance

Suppose we are given a sequence of covariances $\{\gamma_\ell\}_{\ell=-\infty}^{\infty}$, with even symmetry so that $\gamma_{-\ell} = \gamma_\ell$. Can we always determine coefficients $\theta_1, \theta_2, \dots$ and $\phi_1, \phi_2, \dots$ of an ARMA model, along with the white noise variance $\sigma^2$, so that that model has the given covariances? As noticed above, the answer is yes, as long as the given covariances have the necessary property of positive definiteness.

2.1 Necessary condition: positive semidefiniteness

A real-valued even sequence $\{\gamma_\ell\}_{\ell=-\infty}^{\infty}$ is positive semidefinite if, for all $n \ge 0$ and all real-valued vectors $(z_1, \dots, z_n)$,
$$\sum_{i,j=1}^{n} z_i\,\gamma_{i-j}\,z_j \ge 0.$$
Any sequence that is the covariance function of a stationary time series process $x_t$ must be positive semidefinite. For then, taking mean zero for convenience, we have $\gamma_{i-j} = E(x_i x_j)$, and then
$$\sum_{i,j} z_i\,\gamma_{i-j}\,z_j = E\Big[\Big(\sum_i z_i x_i\Big)\Big(\sum_j z_j x_j\Big)\Big] = \mathrm{Var}\Big(\sum_{i=1}^{n} z_i x_i\Big) \ge 0.$$

Example. Last time we considered the MA(1) model $x_t = w_t + \alpha w_{t-1}$ with $E(w_t^2) = \sigma^2$, and we computed its autocovariance sequence
$$\gamma_\ell = \begin{cases} (1 + \alpha^2)\,\sigma^2 & \text{for } \ell = 0, \\ \alpha\,\sigma^2 & \text{for } \ell = \pm 1, \\ 0 & \text{for } |\ell| \ge 2. \end{cases}$$
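Positive semidefiniteness of the sequence $\{\gamma_\ell\}$ is exactly positive semidefiniteness of every Toeplitz matrix $G_{ij} = \gamma_{i-j}$, which can be probed numerically through smallest eigenvalues. A sketch of mine, using the MA(1) covariances with $\alpha = 1/2$ and $\sigma = 1$:

```python
import numpy as np

def min_toeplitz_eig(gamma, n):
    """Smallest eigenvalue of the n x n matrix G_ij = gamma_{|i-j|}.
    A negative value certifies that the sequence is not positive
    semidefinite; gamma holds gamma_0, gamma_1, ... (zero beyond)."""
    g = np.zeros(n)
    m = min(n, len(gamma))
    g[:m] = gamma[:m]
    idx = np.abs(np.arange(n)[:, None] - np.arange(n)[None, :])
    return float(np.linalg.eigvalsh(g[idx]).min())

# MA(1), alpha = 0.5, sigma^2 = 1: gamma = (1.25, 0.5, 0, 0, ...)
print(min_toeplitz_eig(np.array([1.25, 0.5]), 50))   # nonnegative
```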

For this model we observed that the correlation $\rho_1 = \gamma_1/\gamma_0 = \alpha/(1 + \alpha^2)$ took the maximum absolute value $|\rho|_{\max} = \frac{1}{2}$. If we are given $\gamma_0$ and $\gamma_1$ with $|\gamma_1| \le \gamma_0/2$ (and $\gamma_\ell = 0$ for $|\ell| > 1$), then we can determine $\alpha$ and $\sigma$ so that this MA(1) model has the specified covariance function.

But if we are given this covariance sequence, but with $\gamma_0$ and $\gamma_1$ having $2|\gamma_1| > \gamma_0$ (we must always have $\gamma_0 \ge 0$), might there be some more complicated ARMA model having this as its covariance sequence? The answer is no, because this covariance sequence is not positive semidefinite for $|\gamma_1| > \gamma_0/2$. For $n > 0$, consider the $n$-vector $z = \big(1, -1, \dots, (-1)^{n-1}\big)$. Then
$$\sum_{i,j} z_i\,\gamma_{i-j}\,z_j = n\,\gamma_0 - 2(n-1)\,\gamma_1 = n\Big(\gamma_0 - \Big(1 - \frac{1}{n}\Big)\,2\gamma_1\Big).$$
If $2\gamma_1 > \gamma_0$, then this becomes negative for $n$ large enough. And if $2\gamma_1 < -\gamma_0$, then we do the same calculation with $z = (1, 1, \dots, 1)$.
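The failure is easy to see numerically (again a sketch of my own): for $2\gamma_1 > \gamma_0$, the quadratic form along the alternating vector equals $n\gamma_0 - 2(n-1)\gamma_1$ and turns negative as $n$ grows.

```python
import numpy as np

def quad_form(gamma0, gamma1, z):
    """sum_{i,j} z_i gamma_{i-j} z_j for a sequence (gamma_0, gamma_1, 0, ...)."""
    return gamma0 * np.dot(z, z) + 2.0 * gamma1 * np.dot(z[1:], z[:-1])

g0, g1 = 1.0, 0.6                      # violates |gamma_1| <= gamma_0 / 2
for n in [2, 5, 10, 50]:
    z = (-1.0) ** np.arange(n)         # the alternating vector (1, -1, 1, ...)
    print(n, quad_form(g0, g1, z))     # n*g0 - 2*(n-1)*g1, eventually < 0
```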

Robert Almgren: Time Series 3 Sept. 28, 2009 9 where x i) and t i) are the vectors after removing the ith component. We also need symmetry of the distribution functions under permutations but this is pretty clear.) To apply this theorem, we suppose we are given an even positive semi-definite sequence {γ l } l=. For each n, each t 1,..., t n ), and each x 1,..., x n ), consider the distribution function whose characteristic function is Φ ) u 1,..., u n ; t 1,..., t n = exp 1 2 u i γ ti t j u j Since γ is positive semi-definite, this is the characteristic function of a normal distribution with mean zero and covariance matrix Γ, Γ ij = γ ti t j. The consistency condition is easy to check by taking u i 0, and then Kolmogorov s theorem tells us that there exists a stochastic process, in fact a Gaussian process, with these distributions. I am still looking for a good way to show that there is in fact an ARMA process having the given distributions. We may make two relevant observations: 1. If we require that the ARMA process be of finite order finite values of p and q) then we must require that the covariance function decay exponentially as l ±. It would be possible to fit the behavior to any desired degree of precision, but with a finite number of parameters we will never be able to fit with complete precision. 2. If we admit processes of infinite order, then we may as well simply consider MA ) processes since stationary ARMAp,q) processes are always equivalent. Then the question is whether any stationary Gaussian random process can be represented as a linear sum of Gaussian white noises. I believe this to be true but do not have a concise proof. The question is less interesting than it might appear, since the MA coefficients {θ j } may decay less slowly than we would like.