Multivariate Time Series


1 Multivariate Time Series

Notation: I do not use boldface (or anything else) to distinguish vectors from scalars. Tsay (and many other writers) do. I denote a multivariate stochastic process in the form $\{X_t\}$, where, for any $t$, $X_t$ is a vector of the same order. We denote the individual elements with two subscripts: $X_{it}$ denotes the $i$th element of the multivariate time series at time $t$. Warning: I'm not promising I'll do this! This is Tsay's notation, and one of the two obvious notations. One common introductory-graduate-level textbook on time series is by Shumway and Stoffer, and another is by Brockwell and Davis. Shumway and Stoffer use $X_{ti}$. (I like this.) Brockwell and Davis use $X_{it}$. One further thing on notation: you do not need to use the transpose (as Tsay and many others do) on $X_t = (X_{1t}, \ldots, X_{kt})$. It's a column vector, but you're not drawing a picture!

2 Marginal Model Components

In a multivariate time series $\{X_t : t = \ldots, -1, 0, 1, \ldots\}$, each univariate series $\{X_{it} : t = \ldots, -1, 0, 1, \ldots\}$ is called a marginal time series. The model for $\{X_{it}\}$ is called a marginal model component of the model for $\{X_t\}$. This illustrates one of the disadvantages of not having a special notation for vectors!

3 Stationary Multivariate Time Series

First of all, let's establish that in time series, "stationary" means weakly stationary. (The stronger property is called "strictly" or "strongly" stationary.) If a time series is stationary, then its first and second moments are time invariant. (This is not the definition.) For a stationary time series $\{X_t\}$, we will generally denote the first moment as the vector $\mu$:
$$\mu = E(X_t)$$
and the variance-covariance as $\Gamma_0$:
$$\Gamma_0 = V(X_t) = E\left((X_t - \mu)(X_t - \mu)^T\right).$$
(Note on notation: $\Gamma$, with or without serifs, is used to denote the gamma function, but it is not a reserved symbol; $\Gamma$ is used to denote various things.)
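To make the definitions concrete, here is a minimal numpy sketch (mine, not from the notes; the data are simulated white noise) computing the sample analogues of $\mu$ and $\Gamma_0$:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((500, 2))   # rows are observations x_t of a k = 2 series

mu_hat = x.mean(axis=0)             # sample analogue of mu = E(X_t)
xc = x - mu_hat                     # centered observations
Gamma0_hat = xc.T @ xc / len(x)     # sample analogue of Gamma_0 = V(X_t)

print(mu_hat)
print(Gamma0_hat)
```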

4 Stationary Multivariate Time Series

If a time series is stationary, then its first and second moments are time invariant. Tsay (page 390) states the converse of this. That is true, if "second moments" means second auto-moments at fixed lags (including the 0 lag). I looked back to see how clear Tsay has been on that point. He has not been clear. On page 39 (in the middle), he makes an argument to the effect that time invariance of the first and second moments, together with finiteness of the autocovariances, implies stationarity. In other places, he indicates that time invariance of the autocorrelations is also required, but because he does not write as a mathematician, it is sometimes not clear.

5 Stationary Multivariate Time Series

So let's be very clear. Analogously to the autocovariance $\gamma_{s,t}$, let's define the cross-autocovariance matrix $\Gamma_{s,t}$:
$$\Gamma_{s,t} = E\left((X_s - \mu_s)(X_t - \mu_t)^T\right).$$
("Cross-autocovariance matrix" is quite a mouthful, so we'll just call it the cross-covariance matrix.) Now, suppose $\Gamma_{s,t}$ is constant if $s - t$ is constant. (Notice I did not say if $|s - t|$ is constant.) Under that additional condition (together with the time-invariance of the first two moments), we say that the multivariate time series is stationary.

6 Stationary Multivariate Time Series

In the case of stationarity, we can use the notation $\Gamma_h$, which is consistent with the notation $\Gamma_0$ used before. We refer to the $h = 0$ case as concurrent or contemporaneous. The matrix $\Gamma_0$ is the concurrent cross-covariance matrix. We now introduce another notational form so that we can easily refer to the elements of the matrices. We equate $\Gamma(h)$ with $\Gamma_h$. Now, we can denote the $ij$th element of $\Gamma(h)$ as $\Gamma_{ij}(h)$. (There is an alternative meaning of $\Gamma_p$. We have used it (I think) to refer to the $p \times p$ symmetric matrix of autocovariances of orders 0 through $p - 1$. It is often seen in the Yule-Walker equations.) The meaning is made clear by the context.

7 Stationary Multivariate Time Series

Notice that stationarity of the multivariate time series implies stationarity of the individual univariate time series. The univariate autocovariance functions are the diagonal elements of $\Gamma_h$. We sometimes use the phrase "jointly stationary" to refer to a stationary multivariate time series. (This excludes the case of a multivariate time series each of whose components is stationary but whose cross-covariances are not constant at constant lags.)

8 Cross-Correlation Matrix

The standard deviation of the $i$th component of a multivariate time series in the standard notation is $\sqrt{\Gamma_{ii}(0)}$. For a $k$-variate time series, the matrix $D = \mathrm{diag}\left(\sqrt{\Gamma_{11}(0)}, \ldots, \sqrt{\Gamma_{kk}(0)}\right)$ is very convenient. All variances are assumed to be positive, so $D^{-1}$ exists, and $D^{-1}\Gamma_h D^{-1}$ is the cross-correlation matrix of lag $h$. (If $h = 0$, of course, it is the concurrent cross-correlation matrix.) Tsay denotes it as $\rho_0$ or $\rho_l$. I like to use uppercase rho, $R_0$ or $R_h$. Of course, either way, we may use an alternative notation for the elements, $\rho_{ij}(l)$ or $R_{ij}(h)$. Furthermore, note that in my notation, I may use $\rho_{ij}(h)$ in place of $R_{ij}(h)$.

9 Properties of the Cross-Covariance and Cross-Correlation Matrices

Notice that $\Gamma_0$ is symmetric; it's an ordinary variance-covariance matrix. On the other hand, $\Gamma_h$ is not necessarily symmetric; in fact, $\Gamma_h^T = \Gamma_{-h}$. The elements of the matrix $\Gamma(h)$ have directional meanings, and these have simple interpretations. First of all, we need to be clear about what kind of relationship covariance or correlation measures. Covariance or correlation relates to linear relationships. For $X$ distributed symmetrically about 0, the correlation between $X$ and $Y = X^2$ is 0.

10 Properties of the Cross-Covariance and Cross-Correlation Matrices

Consider a given $i$ and $j$ representing the $i$th and $j$th marginal component time series. The direction of time is very important in characterizing the relationships between the $i$th and $j$th series. In the following, which is consistent with the nontechnical language in time series analysis, we will use the term "depend" loosely. (We need the "linear" qualifier.) If $\Gamma_{ij}(h) = 0$ for all $h > 0$, then the series $X_{it}$ does not depend on the past values of the series $X_{jt}$. If $\Gamma_{ij}(h) \neq 0$ for some $h > 0$, then the series $X_{it}$ does depend on the past values of the series $X_{jt}$. In this case we say $X_{jt}$ leads $X_{it}$, or $X_{jt}$ is a leading indicator of $X_{it}$. If $\Gamma_{ij}(h) = \Gamma_{ji}(h) = 0$ for all $h$, then neither depends on the past values of the other series, and we say the series are uncoupled.
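A small simulation makes the lead/lag interpretation concrete. This sketch is mine, not part of the notes: the second series leads the first by two steps, so the sample cross-correlation of $X_{1t}$ with $X_{2,t-h}$ peaks at $h = 2$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
x2 = rng.standard_normal(n)
x1 = np.zeros(n)
x1[2:] = 0.8 * x2[:-2] + 0.5 * rng.standard_normal(n - 2)  # x2 leads x1 by 2 steps

def cross_corr(x, y, h):
    """Sample correlation between x_t and y_{t-h}, for h >= 0."""
    xc, yc = x - x.mean(), y - y.mean()
    num = np.dot(xc[h:], yc[:len(x) - h]) / len(x)
    return num / (x.std() * y.std())

for h in range(5):
    print(h, round(cross_corr(x1, x2, h), 3))  # peaks at h = 2, since x2 leads x1
```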

11 The Cross-Covariance Matrix in a Strictly Stationary Process

Strict stationarity is a restrictive property. Notice, first of all, that it requires more than just time-invariance of the first two moments; it requires time-invariance of the whole distribution. It also requires stronger time-invariance of auto-properties. An iid process is obviously strictly stationary, and such a process is often our choice for examples. The following process, however, is also strictly stationary: $\ldots, X, X, X, X, \ldots$ (a single random variable repeated at every time point).

12 The Cross-Covariance Matrix in a Strictly Stationary Process and in a Serially Uncorrelated Process

The cross-covariance matrix alone does not tell us whether the process is stationary; we also need time-invariance of the first two moments. Given stationarity, it is not possible to tell from the cross-covariance matrix whether or not a process is strictly stationary. In a serially uncorrelated process, $\Gamma_h$ is a hollow matrix (zero diagonal) for all $h \neq 0$.

13 Sample Cross-Covariance and Cross-Correlation Matrices

Given a realization of a multivariate time series $\{x_t : t = 1, \ldots, n\}$, where each $x_t$ is a $k$-vector, the sample cross-covariance and cross-correlation matrices are formed from the data in the obvious ways. We use the hat notation to indicate that these are sample quantities, and also because they are often used as estimators of the population quantities. We also use the notation $\bar{x}$ to denote $\sum_t x_t / n$, and $\hat\sigma_i$ to denote $\sqrt{\sum_t (x_{it} - \bar{x}_i)^2 / n}$. (Note the divisor. It's not really important, of course.)

14 Sample Cross-Covariance and Cross-Correlation Matrices

The sample cross-covariance matrix is
$$\hat\Gamma_h = \frac{1}{n} \sum_{t=h+1}^{n} (x_t - \bar{x})(x_{t-h} - \bar{x})^T.$$
(Note the divisor.) Letting $D = \mathrm{diag}(\hat\sigma_1, \ldots, \hat\sigma_k)$, the sample cross-correlation matrix is
$$\hat R_h = D^{-1} \hat\Gamma_h D^{-1}.$$
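A direct numpy transcription of these two formulas (my own sketch; the function names are mine) might look like this:

```python
import numpy as np

def sample_cross_cov(x, h):
    """Gamma_hat(h) = (1/n) sum_{t=h+1}^n (x_t - xbar)(x_{t-h} - xbar)^T, h >= 0."""
    n = x.shape[0]
    xc = x - x.mean(axis=0)
    return xc[h:].T @ xc[:n - h] / n

def sample_cross_corr(x, h):
    """R_hat(h) = D^{-1} Gamma_hat(h) D^{-1}, D = diag of sample std deviations."""
    d = np.sqrt(np.diag(sample_cross_cov(x, 0)))
    return sample_cross_cov(x, h) / np.outer(d, d)

rng = np.random.default_rng(2)
x = rng.standard_normal((400, 3))       # white noise, so R_hat(1) should be small
print(sample_cross_corr(x, 1).round(2))
```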

15 Sample Cross-Correlation Matrices

There is a nice simplified notation that Tiao and Box introduced to indicate leading and lagging indicators as measured by a sample cross-correlation matrix. Each cell has an indicator of significant positive, significant negative, or insignificant sample correlation. Here, "significant" is defined with respect to the asymptotic 5% two-sided critical value of a sample correlation coefficient under the assumption that the true correlation coefficient is 0 (roughly twice the asymptotic standard error $1/\sqrt{n}$). The comparison value is $2/\sqrt{n}$, and the indicators are $+$, $-$, and $\cdot$; thus, at a specific lag, a correlation matrix for three component time series is summarized by a $3 \times 3$ array of these symbols.
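A sketch of the Tiao-Box summary in Python (my own code; the lag-1 correlation matrix here is hypothetical, chosen only to exercise all three symbols):

```python
import numpy as np

def tiao_box_symbols(R, n):
    """Summarize a sample cross-correlation matrix with +, -, . using 2/sqrt(n)."""
    cut = 2.0 / np.sqrt(n)
    return np.where(R > cut, '+', np.where(R < -cut, '-', '.'))

# A hypothetical lag-1 sample cross-correlation matrix for three series, n = 400
# (the comparison value is 2/sqrt(400) = 0.1).
R1 = np.array([[0.30, -0.02, 0.15],
               [0.01,  0.25, -0.18],
               [0.04,  0.03,  0.40]])
print(tiao_box_symbols(R1, n=400))
```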

16 Multivariate Portmanteau Tests

Recall the test statistic for the portmanteau test of Ljung and Box (first, recall what the portmanteau test tests):
$$Q(m) = n(n+2) \sum_{h=1}^{m} \frac{1}{n-h}\, \hat\rho_h^2.$$
The multivariate version for a $k$-variate time series is
$$Q_k(m) = n^2 \sum_{h=1}^{m} \frac{1}{n-h}\, \mathrm{tr}\left(\hat\Gamma_h^T \hat\Gamma_0^{-1} \hat\Gamma_h \hat\Gamma_0^{-1}\right).$$
Note the similarities and the differences. Under the null hypothesis and some regularity conditions, this has an asymptotic distribution of chi-squared with $k^2 m$ degrees of freedom.
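A possible implementation of $Q_k(m)$, built on the sample cross-covariance matrices above (my own sketch; the function name is mine, and scipy supplies the chi-squared tail probability):

```python
import numpy as np
from scipy.stats import chi2

def multivariate_portmanteau(x, m):
    """Q_k(m) = n^2 sum_{h=1}^m tr(G_h^T G_0^{-1} G_h G_0^{-1}) / (n - h)."""
    n, k = x.shape
    xc = x - x.mean(axis=0)
    G = [xc[h:].T @ xc[:n - h] / n for h in range(m + 1)]
    G0inv = np.linalg.inv(G[0])
    q = n**2 * sum(np.trace(G[h].T @ G0inv @ G[h] @ G0inv) / (n - h)
                   for h in range(1, m + 1))
    return q, chi2.sf(q, df=k**2 * m)   # statistic and asymptotic p-value

rng = np.random.default_rng(3)
x = rng.standard_normal((500, 2))       # serially uncorrelated: expect a large p-value
print(multivariate_portmanteau(x, m=5))
```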

17 VAR Models

In time series, VAR means vector autoregressive. (In finance generally, VaR means value at risk.) A VAR(1) model is
$$X_t = \phi_0 + \Phi X_{t-1} + A_t,$$
where $X_t$, $\phi_0$, and $X_{t-1}$ are $k$-vectors, $\Phi$ is a $k \times k$ matrix, and $\{A_t\}$ is a sequence of serially uncorrelated $k$-vectors with 0 mean and constant positive definite variance-covariance matrix $\Sigma$. Note that the systematic term may be bigger than it looks: each of the $k$ equations has $k$ linear terms. The key is that they only go back in time one step. This form is called the reduced form of a VAR model.
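A minimal simulation of a reduced-form VAR(1) (my sketch; the particular $\phi_0$, $\Phi$, and $\Sigma$ are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(4)
k, n = 2, 1000
phi0 = np.array([0.5, -0.2])
Phi = np.array([[0.5, 0.1],
                [0.2, 0.3]])            # eigenvalues well inside the unit circle
Sigma = np.array([[1.0, 0.4],
                  [0.4, 1.0]])
C = np.linalg.cholesky(Sigma)           # to draw A_t with variance-covariance Sigma

x = np.zeros((n, k))
for t in range(1, n):
    x[t] = phi0 + Phi @ x[t - 1] + C @ rng.standard_normal(k)

# The stationary mean solves mu = phi0 + Phi mu, i.e. mu = (I - Phi)^{-1} phi0.
print(np.linalg.solve(np.eye(k) - Phi, phi0))
print(x[100:].mean(axis=0))             # should be close to the stationary mean
```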

18 Structural Equations of VAR Models

The relationships among the component time series arise from the off-diagonal elements of $\Sigma$. To exhibit the concurrent relationships among the component time series, we do a diagonal decomposition of $\Sigma$, writing it as
$$\Sigma = LGL^T,$$
where $L$ is a lower triangular matrix whose diagonal entries are all 1 (which means that it is of full rank), and $G$ is a diagonal matrix with positive entries. (Such a decomposition exists for any positive definite matrix.) Note that $G = L^{-1}\Sigma(L^{-1})^T$.
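One way to compute the $LGL^T$ decomposition is to rescale the columns of the Cholesky factor; this is my sketch, for an illustrative $\Sigma$:

```python
import numpy as np

Sigma = np.array([[2.0, 0.6, 0.3],
                  [0.6, 1.5, 0.4],
                  [0.3, 0.4, 1.0]])

C = np.linalg.cholesky(Sigma)   # Sigma = C C^T with C lower triangular
d = np.diag(C)
L = C / d                       # rescale columns: unit lower triangular
G = np.diag(d**2)               # diagonal, positive entries

print(np.allclose(L @ G @ L.T, Sigma))          # True: Sigma = L G L^T
Linv = np.linalg.inv(L)
print(np.allclose(Linv @ Sigma @ Linv.T, G))    # True: G = L^{-1} Sigma L^{-T}
```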

19 Structural Equations of VAR Models

Now we transform the reduced-form equation by premultiplying each term by $L^{-1}$:
$$L^{-1}X_t = L^{-1}\phi_0 + L^{-1}\Phi X_{t-1} + L^{-1}A_t = \phi_0^* + \Phi^* X_{t-1} + B_t.$$
This is one of the most fundamental transformations in statistics. The important result is that the variance-covariance that ties the component series together concurrently, that is, $V(A_t)$, has been replaced by $V(B_t)$, which is diagonal. Because of the special structure of $L$, we can see concurrent linear dependencies of the $k$th series on the others. And by reordering the component series, we can make any one of them the last one.

20 Structural Equations of VAR Models

The last row of $L^{-1}$ has a 1 in the last position. Call the other elements $w_{k1}$, $w_{k2}$, etc. Then the last equation in the matrix equation $L^{-1}X_t = \phi_0^* + \Phi^* X_{t-1} + B_t$ can be written as
$$X_{kt} + \sum_{i=1}^{k-1} w_{ki} X_{it} = \phi_{k,0}^* + \sum_{i=1}^{k} \phi_{k,i}^* X_{i,t-1} + B_{kt}.$$
This shows the concurrent dependence of $X_{kt}$ on the other series.

21 Properties of a VAR(1) Process

There are several important properties we can see easily. Because the $\{A_t\}$ are serially uncorrelated, $\mathrm{Cov}(A_t, X_{t-h}) = 0$ for all $h > 0$. Also, we see $\mathrm{Cov}(A_t, X_t) = V(A_t) = \Sigma$. Also, we see that $X_t$ depends on the $j$th previous $X$ (and $A$) through $\Phi^j$. The process would be explosive (i.e., the variance would go to infinity) unless $\Phi^j \to 0$ as $j \to \infty$. This will be guaranteed if all eigenvalues of $\Phi$ are less than 1 in modulus. (Remember this?) Also, just as in the univariate case, we have the recursion
$$\Gamma_h = \Phi\Gamma_{h-1},$$
from which we get
$$\Gamma_h = \Phi^h \Gamma_0.$$
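These properties are easy to check numerically. In this sketch (mine; $\Phi$ and $\Sigma$ as in the earlier example), $\Gamma_0$ is obtained by vectorizing the identity $\Gamma_0 = \Phi\Gamma_0\Phi^T + \Sigma$, which follows from the model equation:

```python
import numpy as np

Phi = np.array([[0.5, 0.1],
                [0.2, 0.3]])
Sigma = np.array([[1.0, 0.4],
                  [0.4, 1.0]])
k = Phi.shape[0]

print(np.abs(np.linalg.eigvals(Phi)))   # all < 1, so the VAR(1) is stationary

# Gamma_0 solves Gamma_0 = Phi Gamma_0 Phi^T + Sigma; vectorizing gives
# vec(Gamma_0) = (I - Phi kron Phi)^{-1} vec(Sigma).
vecG0 = np.linalg.solve(np.eye(k * k) - np.kron(Phi, Phi), Sigma.ravel())
Gamma0 = vecG0.reshape(k, k)

# Gamma_h = Phi Gamma_{h-1} gives Gamma_h = Phi^h Gamma_0; check h = 2.
Gamma2 = Phi @ (Phi @ Gamma0)
print(np.allclose(Gamma2, np.linalg.matrix_power(Phi, 2) @ Gamma0))   # True
```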

22 VAR(p) Processes and Models

A VAR(p) model, for $p > 0$, is
$$X_t = \phi_0 + \Phi_1 X_{t-1} + \cdots + \Phi_p X_{t-p} + A_t,$$
where $X_t$, $\phi_0$, and $X_{t-i}$ are $k$-vectors, $\Phi_1, \ldots, \Phi_p$ are $k \times k$ matrices with $\Phi_p \neq 0$, and $\{A_t\}$ is a sequence of serially uncorrelated $k$-vectors with 0 mean and constant positive definite variance-covariance matrix $\Sigma$. We also can write this using the back-shift operator as
$$(I - \Phi_1 B - \cdots - \Phi_p B^p)X_t = \phi_0 + A_t,$$
or
$$\Phi(B)X_t = \phi_0 + A_t.$$
We also can work out the autocovariances for a VAR(p) process: for $h > 0$,
$$\Gamma_h = \Phi_1 \Gamma_{h-1} + \cdots + \Phi_p \Gamma_{h-p}.$$
These are the multivariate Yule-Walker equations.
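As a sanity check (my own sketch, with coefficient matrices made up and chosen to give a stationary process), simulated VAR(2) data approximately satisfy the $h = 2$ Yule-Walker equation:

```python
import numpy as np

rng = np.random.default_rng(5)
k, n = 2, 50000
Phi1 = np.array([[0.3, 0.1], [0.0, 0.2]])
Phi2 = np.array([[0.1, 0.0], [0.05, 0.1]])

x = np.zeros((n, k))
for t in range(2, n):
    x[t] = Phi1 @ x[t - 1] + Phi2 @ x[t - 2] + rng.standard_normal(k)

def G(h):
    xc = x - x.mean(axis=0)
    return xc[h:].T @ xc[:n - h] / n

# Multivariate Yule-Walker at h = 2: Gamma_2 = Phi_1 Gamma_1 + Phi_2 Gamma_0.
print(np.allclose(G(2), Phi1 @ G(1) + Phi2 @ G(0), atol=0.03))   # approximately True
```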

23 The Yule-Walker Equations

Let's just consider an AR(p) model. We have worked out the autocovariance function of an AR(p) model. It is
$$\gamma(h) = \phi_1\gamma(h-1) + \cdots + \phi_p\gamma(h-p),$$
and
$$\sigma_A^2 = \gamma(0) - \phi_1\gamma(1) - \cdots - \phi_p\gamma(p).$$

24 The Yule-Walker Equations

The equations involving the autocovariance function are called the Yule-Walker equations. There are $p$ such equations, for $h = 1, \ldots, p$. For an AR(p) process that yields the two sets of equations on the previous slide, we can write them in matrix notation as
$$\Gamma_p \phi = \gamma_p$$
and
$$\sigma_A^2 = \gamma(0) - \phi^T \gamma_p.$$

25 Yule-Walker Estimation of the AR Parameters

After we compute the sample autocovariance function for a given set of observations, we merely solve the Yule-Walker equations to get our estimators:
$$\hat\phi = \hat\Gamma_p^{-1} \hat\gamma_p$$
and
$$\hat\sigma_A^2 = \hat\gamma(0) - \hat\phi^T \hat\gamma_p.$$
Instead of using the sample autocovariance function, we usually use the sample ACF:
$$\hat\phi = \hat R_p^{-1} \hat\rho_p.$$
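A short numpy sketch of Yule-Walker estimation for a simulated AR(2) (my own code; the true coefficients are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(6)
n, p = 5000, 2
phi_true = np.array([0.6, -0.3])                 # AR(2), stationary
x = np.zeros(n)
for t in range(p, n):
    x[t] = phi_true[0] * x[t - 1] + phi_true[1] * x[t - 2] + rng.standard_normal()

def acvf(h):
    xc = x - x.mean()
    return np.dot(xc[h:], xc[:n - h]) / n        # sample autocovariance, divisor n

gamma = np.array([acvf(h) for h in range(p + 1)])
GammaP = np.array([[gamma[abs(i - j)] for j in range(p)] for i in range(p)])

phi_hat = np.linalg.solve(GammaP, gamma[1:])     # phi_hat = GammaP^{-1} gammaP
sigma2_hat = gamma[0] - phi_hat @ gamma[1:]      # innovation variance estimate
print(phi_hat, sigma2_hat)                       # near (0.6, -0.3) and 1
```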

26 Large Sample Properties of the Yule-Walker Estimators

A result that can be shown for the Yule-Walker estimator $\hat\phi$ is
$$\sqrt{n}(\hat\phi - \phi) \xrightarrow{d} N_p(0, \sigma_A^2 \Gamma_p^{-1}),$$
and
$$\hat\sigma_A^2 \xrightarrow{p} \sigma_A^2.$$

27 Yule-Walker Equations for a VAR(p) Model

The multivariate Yule-Walker equations,
$$\Gamma_h = \Phi_1 \Gamma_{h-1} + \cdots + \Phi_p \Gamma_{h-p},$$
can also be used in estimation. They are often expressed in the correlation form
$$R_h = \Upsilon_1 R_{h-1} + \cdots + \Upsilon_p R_{h-p},$$
where $\Upsilon_i = D^{-1} \Phi_i D$ and $D$ is the diagonal matrix of standard deviations defined earlier.

28 Companion Matrix

We can sometimes get a better understanding of a $k$-dimensional VAR(p) process by writing it as a $kp$-dimensional VAR(1). It is
$$Y_t = \Phi^* Y_{t-1} + B_t,$$
where
$$\Phi^* = \begin{pmatrix} 0 & I & 0 & \cdots & 0 \\ 0 & 0 & I & \cdots & 0 \\ \vdots & & & & \vdots \\ 0 & 0 & 0 & \cdots & I \\ \Phi_p & \Phi_{p-1} & \Phi_{p-2} & \cdots & \Phi_1 \end{pmatrix}.$$
This $\Phi^*$ is sometimes called the companion matrix.
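A sketch of building the companion matrix in the form shown on this slide and checking stationarity through its eigenvalues (my code; the $\Phi_i$ are the made-up matrices from the earlier VAR(2) example):

```python
import numpy as np

def companion(Phis):
    """kp x kp companion matrix in the slide's form: identity blocks above,
    [Phi_p, ..., Phi_1] in the last block row."""
    k, p = Phis[0].shape[0], len(Phis)
    top = np.eye(k * (p - 1), k * p, k)   # [0 I 0 ...] block rows
    bottom = np.hstack(Phis[::-1])        # [Phi_p, Phi_{p-1}, ..., Phi_1]
    return np.vstack([top, bottom])

Phi1 = np.array([[0.3, 0.1], [0.0, 0.2]])
Phi2 = np.array([[0.1, 0.0], [0.05, 0.1]])
PhiStar = companion([Phi1, Phi2])

# The VAR(p) is stationary when all eigenvalues of the companion matrix
# are less than 1 in modulus (cf. the VAR(1) condition on slide 21).
print(np.abs(np.linalg.eigvals(PhiStar)))
```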
