Multivariate Time Series: VAR(p) Processes and Models
1 Multivariate Time Series: VAR(p) Processes and Models A VAR(p) model, for p > 0, is
X_t = φ_0 + Φ_1 X_{t-1} + ... + Φ_p X_{t-p} + A_t,
where X_t, φ_0, and the X_{t-i} are k-vectors, Φ_1, ..., Φ_p are k×k matrices with Φ_p ≠ 0, and {A_t} is a sequence of serially uncorrelated k-vectors with 0 mean and constant positive definite variance-covariance matrix Σ. We can also write this using the back-shift operator as
(I - Φ_1 B - ... - Φ_p B^p) X_t = φ_0 + A_t,
or
Φ(B) X_t = φ_0 + A_t.
2 Companion Matrix We can sometimes get a better understanding of a k-dimensional VAR(p) process by writing it as a kp-dimensional VAR(1). With Y_t = (X_{t-p+1}^T, ..., X_{t-1}^T, X_t^T)^T, it is
Y_t = Φ* Y_{t-1} + B_t,
where B_t = (0, ..., 0, A_t^T)^T and
Φ* = [ 0      I        0        ...  0
       0      0        I        ...  0
       ...
       Φ_p    Φ_{p-1}  Φ_{p-2}  ...  Φ_1 ].
This matrix Φ* is sometimes called the companion matrix. The key fact here is that stationarity can be assessed by looking at the eigenvalues of Φ*: the process is stationary if all eigenvalues of Φ* are less than 1 in modulus.
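As a small check of the eigenvalue condition, here is a pure-Python sketch (not from the notes; the AR coefficients are made up) that builds the companion matrix of a scalar AR(2) — the k = 1, p = 2 case, so Φ* is 2×2 — and tests stationarity:

```python
import cmath

# Companion matrix in the slide's layout (identity block on top, AR
# coefficients in the last row) for the hypothetical scalar AR(2)
# x_t = 0.5 x_{t-1} + 0.3 x_{t-2} + a_t
phi1, phi2 = 0.5, 0.3
companion = [[0.0, 1.0],
             [phi2, phi1]]

def eig2x2(m):
    """Eigenvalues of a 2x2 matrix from its characteristic polynomial."""
    tr = m[0][0] + m[1][1]
    det = m[0][0]*m[1][1] - m[0][1]*m[1][0]
    disc = cmath.sqrt(tr*tr - 4*det)
    return (tr + disc) / 2, (tr - disc) / 2

lams = eig2x2(companion)
# Stationary iff every eigenvalue has modulus strictly less than 1
stationary = all(abs(l) < 1 for l in lams)
print(lams, stationary)
```

For larger k or p one would use a numerical eigenvalue routine (in R, `eigen`), but the criterion is the same.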
3 Number of Terms in Time Series Models The two common general types of time series models incorporate past history either through linear combinations of past observations (AR) or of previous errors (shocks) in the system (MA). To use either type of model, we need to decide on the order. For an AR model, we do that by using a sequence of partial models. Recall: we consider the models
R_t ≈ φ_{0,1} + φ_{1,1} R_{t-1}
R_t ≈ φ_{0,2} + φ_{1,2} R_{t-1} + φ_{2,2} R_{t-2}
...
R_t ≈ φ_{0,p} + φ_{1,p} R_{t-1} + ... + φ_{p,p} R_{t-p}.
The coefficients φ_{i,i} constitute the partial autocorrelation function (PACF). (The argument of the function is the index i.)
4 The Partial Autocorrelation Function (PACF) in AR Models The PACF is useful for an AR model because we can partial out the dependence. Consider AR(1): R_t = φR_{t-1} + A_t. We have γ(2) = φ²γ(0) for R_t and R_{t-2}. Could we get a covariance of something to go to 0 at lag 2? Consider R_t − φR_{t-1} and R_{t-2} − φR_{t-1}. Their covariance is 0. This is the idea behind the PACF: for R_t and R_{t+h}, regress each on the R_k's between them. The important result is that the PACF in an AR(p) model is 0 beyond lag p; that is, φ_{p+1,p+1} = 0. Hence, we can use it to identify p.
5 The Partial Autocorrelation Function (PACF) in AR Models The question is how to use the sample PACF. Often, since we are using it to build a model anyway, we just use simple graphs to decide at what order the sample PACF has died off. More formally, if the errors in the AR(p) model are iid with mean 0, then the sample PACFs beyond lag p are asymptotically iid N(0, 1/n).
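The PACF cutoff can be checked by simulation. The sketch below (illustrative only; the AR coefficient and seed are arbitrary) simulates an AR(1) with φ = 0.6 and computes the lag-1 and lag-2 sample PACFs with the Durbin-Levinson formula φ_{2,2} = (ρ(2) − ρ(1)²)/(1 − ρ(1)²); the lag-2 value should fall inside the ±2/√n band:

```python
import random

# Simulate an AR(1) with phi = 0.6 and check the PACF cutoff at lag 1
random.seed(42)
phi, n = 0.6, 5000
x = [0.0]
for _ in range(n):
    x.append(phi * x[-1] + random.gauss(0, 1))
x = x[500:]                      # drop burn-in
m = sum(x) / len(x)

def rho(h):
    """Sample autocorrelation at lag h."""
    num = sum((x[t] - m) * (x[t - h] - m) for t in range(h, len(x)))
    den = sum((v - m) ** 2 for v in x)
    return num / den

pacf1 = rho(1)                                        # phi_{1,1}, near 0.6
pacf2 = (rho(2) - rho(1)**2) / (1 - rho(1)**2)        # phi_{2,2}, near 0
print(pacf1, pacf2)
```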
6 Number of Terms in a VAR(p) Model We use similar ideas to determine the p in a VAR(p) model. Instead of scalar partial correlations, however, we have partial covariance matrices. It's a little harder even to get started. We take a parametric approach using a multivariate normal distribution. Residuals from a partial true model have a PDF of the form
f(r) = (2π)^{-k/2} |Σ|^{-1/2} exp( -(r - μ_r)^T Σ^{-1} (r - μ_r)/2 ).
7 Sequential Tests for Φ_j = 0 in a VAR(p) Model We consider a sequence of VAR models,
X_t = φ_0 + Φ_1 X_{t-1} + A_t
X_t = φ_0 + Φ_1 X_{t-1} + Φ_2 X_{t-2} + A_t
...
X_t = φ_0 + Φ_1 X_{t-1} + ... + Φ_i X_{t-i} + A_t
...
where X_t, φ_0, and the X_{t-i} are k-vectors, Φ_1, ..., Φ_i are k×k matrices with Φ_i ≠ 0, and {A_t} is a sequence of serially uncorrelated k-vectors with 0 mean and constant positive definite variance-covariance matrix Σ. We test sequentially that Φ_h = 0, using likelihood ratio tests. The likelihood ratio leads to two similar tests, Wald tests and score tests (also called "Rao tests" and "Lagrange multiplier tests"). In a Wald test, we use the MLE under the unrestricted (fitted) model.
8 Sequential Tests for Φ_j = 0 in a VAR(p) Model We'll use a Wald test, using given data x_1, ..., x_n. That means that to test a model with i − 1 terms versus a model with i terms, the log of the likelihood ratio only involves
log( |Σ̂_i| / |Σ̂_{i-1}| ),
where the Σ̂'s are the MLEs of the variance-covariance matrix of the errors in the model with the appropriate number of terms. When i = 1, Σ̂_0 is just the sample variance-covariance matrix of the x's. With the proper normalizing factors shown in equation (8.18) on page 406 (derived by Tiao and Box), the log likelihood ratio has an asymptotic chi-squared distribution with k² degrees of freedom under the null hypothesis. (This asymptotic distribution holds under what I call the Le Cam regularity conditions; see Gentle (2013), page 169. These are satisfied if our likelihood is correct in the first place!)
9 Sequential Tests for Φ_j = 0 in a VAR(p) Model We test H_0: Φ_1 = 0 versus H_1: Φ_1 ≠ 0. What next? In practice, whether or not we reject, we may try H_0: Φ_2 = 0, but usually we don't; we proceed to the next model only if we reject the preceding hypothesis. I am not sure whether there is an R function that does these tests directly, but the output of VAR in the vars package can easily be used to compute the statistic.
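The statistic itself needs only the two residual covariance determinants. The sketch below (hypothetical numbers throughout; `det2` is a helper defined here, and the generic constant `c` stands in for the exact Tiao-Box normalizing factor of equation (8.18)) shows the shape of the computation for k = 2, where k² = 4 degrees of freedom gives a 5% critical value of 9.488:

```python
import math

def det2(s):
    """Determinant of a 2x2 matrix given as nested lists."""
    return s[0][0]*s[1][1] - s[0][1]*s[1][0]

n, k, i = 400, 2, 2
sigma_prev = [[1.00, 0.30], [0.30, 1.20]]   # Sigma-hat_{i-1} (made up)
sigma_curr = [[0.95, 0.28], [0.28, 1.10]]   # Sigma-hat_i     (made up)

c = k*i + 1.5          # placeholder for the Tiao-Box constant in (8.18)
stat = -(n - c) * math.log(det2(sigma_curr) / det2(sigma_prev))

chi2_crit = 9.488      # chi-squared critical value, 4 df, 5% level
reject = stat > chi2_crit
print(stat, reject)
```

In practice one would take the two covariance matrices from successive fits of VAR in the vars package rather than typing them in.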
10 The ARCH Effect in a VAR(p) Model An extension to the VAR model allows for the volatility to vary as in an ARCH or GARCH model. In the R vars package, serial.test computes portmanteau tests on the residuals of a fitted VAR, and arch.test computes a multivariate ARCH-LM test for the ARCH effect.
11 Forecasting with a VAR(p) Model Forecasting with a VAR(p) model is similar to the same thing in a univariate model. Given x_t, ..., x_{t-p+1}, the 1-step-ahead forecast at time t is
X̂_t(1) = φ_0 + Σ_{i=1}^{p} Φ_i X_{t+1-i},
and the forecast error is A_{t+1}. Substituting, we get the 2-step-ahead forecast at time t as
X̂_t(2) = φ_0 + Φ_1 X̂_t(1) + Σ_{i=2}^{p} Φ_i X_{t+2-i},
and the forecast error is A_{t+2} + Φ_1 A_{t+1}.
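For a VAR(1) the recursion above collapses to X̂_t(1) = φ_0 + Φ_1 x_t and X̂_t(2) = φ_0 + Φ_1 X̂_t(1). A minimal sketch (all numbers hypothetical; `matvec` and `vadd` are small helpers defined here):

```python
def matvec(A, v):
    """Matrix-vector product for nested-list matrices."""
    return [sum(a*b for a, b in zip(row, v)) for row in A]

def vadd(u, v):
    return [a + b for a, b in zip(u, v)]

phi0 = [0.1, 0.2]
Phi1 = [[0.5, 0.1],
        [0.0, 0.4]]
x_t = [1.0, 2.0]             # last observed value

f1 = vadd(phi0, matvec(Phi1, x_t))   # 1-step-ahead forecast
f2 = vadd(phi0, matvec(Phi1, f1))    # 2-step-ahead: feed the forecast back in
print(f1, f2)
```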
12 Impulse Response Function We can also express a causal VAR(p) as an infinite moving average model, just as we did with a univariate model:
X_t = θ_0 + A_t + Ψ_1 A_{t-1} + Ψ_2 A_{t-2} + ...
The coefficient matrices in such an infinite MA model are called impulse response functions. What's causal? A process is causal if it admits such a representation, with X_t depending only on present and past shocks.
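For a causal VAR(1) (without intercept, for simplicity) the coefficients are simply Ψ_j = Φ^j, so the impulse responses decay geometrically when the process is stationary. A sketch with a hypothetical Φ:

```python
def matmul(A, B):
    """2x2 matrix product for nested-list matrices."""
    return [[sum(A[i][k]*B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

Phi = [[0.5, 0.1],
       [0.0, 0.4]]
psi = [[1.0, 0.0], [0.0, 1.0]]   # Psi_0 = I
responses = []
for j in range(1, 6):
    psi = matmul(psi, Phi)       # Psi_j = Psi_{j-1} Phi = Phi^j
    responses.append([row[:] for row in psi])
print(responses[0], responses[4])
```

Entry (i, l) of Ψ_j is the response of X_{i,t+j} to a unit shock in component l at time t.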
13 Vector Moving-Average or VMA(q) Models and VARMA(p, q) Models The vector moving-average or VMA(q) model is the obvious extension of the univariate MA model. We can write it as
X_t = θ_0 + A_t − Θ_1 A_{t-1} − ... − Θ_q A_{t-q},
or
X_t = θ_0 + Θ(B) A_t.
The differences between a VMA and an MA are similar to the differences between a VAR and an AR. Also, just as we combine an AR model and an MA model, we combine a VAR and a VMA to get a vector ARMA or VARMA(p, q) model.
14 Marginal Models of Components of VMA(q) Models The marginal models of a VMA(q) model are just MA(q) models. We see this because the cross-correlation matrix of X_t vanishes after lag q, and so we can write X_{it} as
X_{it} = θ_{i0} + B_{i,t} − Σ_{j=1}^{q} θ_{i,j} B_{i,t-j},
where {B_{i,t}} is a sequence of uncorrelated random variables with 0 mean and constant variance.
15 Marginal Models of Components of VAR(p) Models One approach to studying the marginal components of VAR(p) models is by use of the structural equations. These are formed by diagonalizing the variance-covariance matrix of A_t, as we discussed last week for a VAR(p) model. This approach shows the concurrent relationships of one component to all the others. Another approach is to obtain explicit representations of all of the component series as AR models. We can do this if we can diagonalize the AR polynomial coefficient matrix in a VAR(p) model.
16 Marginal Models of Components and Diagonalizing Matrices Some technical notes are in order here. A nonnegative definite matrix can always be diagonalized by a Cholesky decomposition, but not all square matrices can be diagonalized. A matrix that can be diagonalized is called a regular matrix. (See Gentle, 2007, pages 116 and following, for conditions and a general discussion of the problem.) One general method of diagonalizing a regular matrix A is to use the matrix V whose columns are linearly independent eigenvectors of A; then V^{-1} A V = C, where C is the diagonal matrix whose elements are the eigenvalues of A. This requires both premultiplication and postmultiplication of A, and if the matrix is not of full rank, it requires some rearrangement of the rows and columns.
17 Marginal Models of Components We'll just do the example in the text for the VAR(1) case with k = 2. The bivariate VAR(1) model is
[ 1 − Φ_11 B    −Φ_12 B   ] [ X_1t ]   [ A_1t ]
[ −Φ_21 B      1 − Φ_22 B ] [ X_2t ] = [ A_2t ].
We premultiply both sides by
[ 1 − Φ_22 B    Φ_12 B    ]
[ Φ_21 B       1 − Φ_11 B ].
This gives us the marginal models, in which each AR component has the coefficient operator
(1 − Φ_11 B)(1 − Φ_22 B) − Φ_12 Φ_21 B².
Note, however, that we have AR(2) operators on the left side and MA(1) models on the right side; that is, a bivariate VAR(1) model became two marginal ARMA(2,1) models.
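Expanding the determinant operator in powers of B gives 1 − (Φ_11 + Φ_22)B + (Φ_11Φ_22 − Φ_12Φ_21)B², which the sketch below verifies by polynomial multiplication (coefficients hypothetical; `polymul` is a helper defined here):

```python
# Hypothetical bivariate VAR(1) coefficients
p11, p12, p21, p22 = 0.6, 0.2, 0.3, 0.5

def polymul(a, b):
    """Multiply two polynomials given as coefficient lists in B."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

det_poly = polymul([1, -p11], [1, -p22])   # (1 - p11 B)(1 - p22 B)
det_poly[2] -= p12 * p21                   # subtract p12 p21 B^2
print(det_poly)   # [1, -(p11 + p22), p11*p22 - p12*p21]
```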
18 Marginal Models of Components This idea generalizes (with a lot of tedious algebra). A k-variate VAR(p) model yields k marginal ARMA(kp, (k−1)p) models. The VMA(q) part of an original VARMA may add up to q additional MA components; in general, the marginal models of a k-variate VARMA(p, q) are ARMA(kp, (k−1)p + q) models. We next consider some other ways that decomposing a VAR(p) model can lead to new insights about the process in some cases. This is the case where we have cointegration.
19 Unit-Root Nonstationarity Many economic time series exhibit either (apparent) random walk behavior,
P_t = P_{t-1} + A_t,
or random walk with drift behavior,
P_t = μ + P_{t-1} + A_t,
where {A_t} is iid with variance σ²_A. Either of these processes has unit-root nonstationarity. These processes can be made stationary by differencing; that is, the series is integrated. We speak of an integrated series of order d, denoted I(d), if d differences result in a stationary process. Notice the effects of the nonstationarity.
20 Simple Random Walk Process In the simple random walk process, the k-step-ahead forecast is
P̂_t(k) = E(P_{t+k} | p_t, p_{t-1}, ..., p_0) = p_t.
It is not mean reverting. The forecast error is
e_t(k) = a_{t+k} + ... + a_{t+1}.
Its variance is V(e_t(k)) = kσ²_A, which grows without bound in k. The forecast has no value.
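The variance formula V(e_t(k)) = kσ²_A can be checked by Monte Carlo: the k-step error is just the sum of the next k shocks. An illustrative simulation (seed and sizes arbitrary), with σ_A = 1 and k = 4:

```python
import random
import statistics

# Monte Carlo check that the k-step forecast error of a random walk
# has variance k * sigma_A^2 (here sigma_A = 1, k = 4)
random.seed(7)
k, reps = 4, 20000
errors = []
for _ in range(reps):
    # the forecast is p_t itself, so the error is the sum of k future shocks
    errors.append(sum(random.gauss(0, 1) for _ in range(k)))
var_hat = statistics.pvariance(errors)
print(var_hat)   # close to k = 4
```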
21 Random Walk Process with Drift In the random walk process with drift, the k-step-ahead forecast is
P̂_t(k) = E(P_{t+k} | p_t, p_{t-1}, ..., p_0) = kμ + p_t.
It is not mean reverting. The conditional variance of P_t is tσ²_A, which grows without bound. I should mention one more type of nonstationary process. It is the trend-stationary process,
P_t = α_0 + α_1 t + A_t.
Notice that this process is not stationary because of its mean; its variance, however, is time invariant. This process can be made stationary by detrending, that is, by subtracting the trend α_0 + α_1 t.
22 Spurious Regressions First, consider two trend-stationary processes,
Y_t = α_0 + α_1 t + A_t
and
X_t = δ_0 + δ_1 t + B_t,
that have nothing to do with each other (i.e., everything is independent). Now, consider the regression of Y_t on X_t:
Y_t = β_0 + β_1 X_t + ε_t
    = β_0 + β_1 (δ_0 + δ_1 t + B_t) + ε_t
    = γ_0 + (β_1 δ_1) t + ε*_t,
where γ_0 = β_0 + β_1 δ_0 and ε*_t = β_1 B_t + ε_t. The regression test will probably be significant. This results from the trends. It is spurious, however. Everybody knows this.
23 Spurious Regressions Next, consider two random walks,
Y_t = Y_{t-1} + A_t
and
X_t = X_{t-1} + B_t,
that have nothing to do with each other (i.e., everything is independent). For simplicity, assume that A_t and B_t are iid N(0,1). Now, consider the regression of Y_t on X_t (without intercept):
Y_t = βX_t + ε_t.
We see that β = Cov(Y_t, X_t)/V(X_t) and ε_t ∼ N(0, t).
24 Spurious Regressions of Random Walks Granger and Newbold, in a very famous Monte Carlo study in 1974, found that the standard t test of H_0: β = 0 rejected 76% of the time. This example is very different from the spurious regressions of one trend-stationary series on another. The problem here is unit-root nonstationarity.
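A small-scale version of the Granger-Newbold experiment is easy to run (this is an illustrative re-creation, not their original design: no intercept, n = 100, 500 replications, arbitrary seed). The rejection rate of the nominal 5% t test is far above 0.05:

```python
import math
import random

# Regress one independent random walk on another (no intercept) and
# count how often |t| > 1.96, i.e. how often we "reject" beta = 0
random.seed(1)
n, reps, rejections = 100, 500, 0
for _ in range(reps):
    y = x = 0.0
    ys, xs = [], []
    for _ in range(n):
        y += random.gauss(0, 1)
        x += random.gauss(0, 1)
        ys.append(y)
        xs.append(x)
    sxx = sum(v*v for v in xs)
    beta = sum(a*b for a, b in zip(xs, ys)) / sxx      # OLS slope
    resid = [a - beta*b for a, b in zip(ys, xs)]
    s2 = sum(r*r for r in resid) / (n - 1)             # error variance
    t = beta / math.sqrt(s2 / sxx)                     # usual t statistic
    if abs(t) > 1.96:
        rejections += 1
print(rejections / reps)   # far above the nominal 0.05
```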
25 Spurious Regressions of Random Walks: A Technical Aside Consider a regression model of the form Y_t = βx_t + ε_t, with the usual assumptions of 0 correlations among the ε_t and V(ε_t) = σ². What about the relationship between x_t and ε_t? The asymptotic properties (relating to normality) will hold if x_t and ε_t are independent. This is OK if x_t is a constant. What if x_t is a random variable? This happens all the time in financial applications. In these applications, however, we cannot assume that x_t and ε_t are independent. Can we find a weaker condition?
26 Spurious Regressions of Random Walks: A Technical Aside (continued) A weaker sufficient condition is called the martingale difference assumption:
E(ε_t | x_t, ε_{t-1}, x_{t-1}, ..., ε_1, x_1) = 0, for all t,
and
lim_{t→∞} E(ε²_t | x_t, ε_{t-1}, x_{t-1}, ..., ε_1, x_1) = σ², almost surely.
The punchline is that the second condition is not satisfied in the regression of one random walk on another. The problem is that n^{-2} Σ x²_t has a nondegenerate limiting distribution.
27 Unit-Root Nonstationarity and Cointegration The spurious regression problem (as well as other issues) makes consideration of unit-root nonstationarity in multivariate time series important. Now let's consider unit-root nonstationarity in the context of a VARMA. There are different kinds of situations. In some cases the component time series may not have any relationships to each other (although spurious regressions may exist). In some interesting cases, however, even though the component series are unit-root nonstationary, a linear combination of some of them is stationary. This phenomenon is called cointegration.
28 Unit-Root Nonstationarity and Cointegration The example in the text (p. 428) is a good simple one to illustrate the idea. We have the bivariate ARMA(1,1) model
X_t − ΦX_{t-1} = A_t − ΘA_{t-1},
where X_t = (X_1t, X_2t)^T, A_t = (A_1t, A_2t)^T, and the AR coefficient matrix is
Φ = [ 0.50   −1.00
      −0.25   0.50 ].
We first determine the eigenvalues of the AR matrix:
> phi <- matrix(c(0.50,-0.25,-1.00,0.50), nrow=2)
> eigen(phi)$values
The eigenvalues are 1 and 0 (the zero is reported as a tiny floating-point value). We note that the AR coefficient matrix is singular. Also, we see that the other eigenvalue is 1. (Having only 0s and 1s as eigenvalues is a necessary condition for an idempotent matrix, but it is not sufficient. We note in this case, however, that the coefficient matrix is idempotent.) As illustrated on the previous slides, we write the model in the form that uses the backshift operator, and then we obtain the marginal components by premultiplication by
[ 1 − 0.50B    −1.00B    ]
[ −0.25B      1 − 0.50B ].
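The eigenvalues and the idempotence claim can be verified directly in plain Python, without R, from the characteristic polynomial λ² − tr(Φ)λ + det(Φ) = 0:

```python
import math

Phi = [[0.50, -1.00],
       [-0.25, 0.50]]
tr = Phi[0][0] + Phi[1][1]                        # trace = 1.0
det = Phi[0][0]*Phi[1][1] - Phi[0][1]*Phi[1][0]   # det = 0.0 -> singular
disc = math.sqrt(tr*tr - 4*det)
lams = sorted([(tr + disc)/2, (tr - disc)/2])     # eigenvalues [0.0, 1.0]

# Check idempotence: Phi @ Phi should equal Phi
Phi2 = [[sum(Phi[i][k]*Phi[k][j] for k in range(2)) for j in range(2)]
        for i in range(2)]
idempotent = all(abs(Phi2[i][j] - Phi[i][j]) < 1e-12
                 for i in range(2) for j in range(2))
print(lams, idempotent)
```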
29 Unit-Root Nonstationarity and Cointegration This premultiplication yields the coefficient matrix on the left as
[ 1 − B    0     ]
[ 0       1 − B ];
hence, we see that each component is unit-root nonstationary. Now we seek a linear combination of the component time series that is stationary. Following Tsay, we transform the system as in equation (8.32).
30 Unit-Root Nonstationarity and Cointegration By premultiplying by the generalized inverse of the coefficient matrix, we get equation (8.32), which expresses two linear combinations of X_1t and X_2t, Y_1t and Y_2t, as an uncoupled system of the form
Y_t = Λ Y_{t-1} + B_t − Θ* B_{t-1},
with Λ diagonal. The two linear combinations of X_1t and X_2t, that is, Y_1t and Y_2t, are uncoupled. Their concurrent correlation is the correlation between B_1t and B_2t (which is not 0). Y_1t is unit-root nonstationary, but Y_2t is stationary.
31 Cointegration Y_1t = X_1t − 2X_2t is called the common trend of X_1t and X_2t. In Y_2t = 0.5X_1t + 1.0X_2t = b^T (X_1t, X_2t)^T, the vector b = (0.5, 1.0)^T, which yields a stationary process, is called the cointegration vector. In general, cointegration of order m exists within a multivariate time series whenever all of the component series are unit-root nonstationary, but there exist m > 0 linearly independent cointegration vectors. A financial interpretation of a cointegrated multivariate time series is that the components have some common threads that result in linear combinations that have long-run equilibrium, even though the individual components are nonstationary and have variances diverging to ∞.
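The effect of the cointegration vector is easy to see by simulation. The construction below is hypothetical (not Tsay's model): both series load on the same random-walk trend Z_t, and the combination b = (0.5, 1.0) cancels the trend exactly, leaving a stationary series with bounded variance while each component's sample variance grows with n:

```python
import random
import statistics

# A simulated cointegrated pair sharing a common stochastic trend Z_t
random.seed(3)
n = 4000
z = 0.0
x1, x2 = [], []
for _ in range(n):
    z += random.gauss(0, 1)                 # common random-walk trend
    x1.append(2*z + random.gauss(0, 1))     # unit-root nonstationary
    x2.append(-z + random.gauss(0, 1))      # unit-root nonstationary
combo = [0.5*a + b for a, b in zip(x1, x2)] # b^T X_t: the trend cancels

var_x1 = statistics.pvariance(x1)           # grows with the sample size
var_combo = statistics.pvariance(combo)     # stays bounded (~1.25 here)
print(var_x1, var_combo)
```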
32 Error Corrections Unit-root nonstationarity problems can often be overcome by differencing. For the multivariate ARMA(p, q) process {X_t} that is cointegrated of order m, we seek some meaningful representation of ΔX_t = X_t − X_{t-1}. In a cointegrated time series, we represent the differenced time series as
ΔX_t = CB^T X_{t-1} + Σ_{j=1}^{p-1} Φ*_j ΔX_{t-j} + A_t − Σ_{i=1}^{q} Θ_i A_{t-i},
where C and B are k×m full-rank matrices, the columns of B are the cointegrating vectors, and for j = 1, ..., p−1,
Φ*_j = −Σ_{i=j+1}^{p} Φ_i.
In this representation, B^T X_{t-1} is stationary.
33 Error Correction Model (ECM) for a VAR(p) Process Let {X_t} be an I(0) or I(1) VAR(p) process. Following the form of the representation for the cointegrated ARMA(p, q) process on the previous slide, we write the model in the form
ΔX_t = μ_t + ΠX_{t-1} + Σ_{j=1}^{p-1} Φ*_j ΔX_{t-j} + A_t,
where Π = CB^T. The term ΠX_{t-1} is called the error correction term, and the model is called the error correction model or ECM. The rank of Π determines the extent of cointegration. If rank(Π) = 0, there is no cointegration, and the differenced process is actually a VAR(p − 1).
34 If rank(Π) = k, that is, the matrix is full-rank, there is no cointegration, and the process is just a VAR(p). If 0 < rank(Π) = m < k, there is cointegration of order m.
35 Johansen's Test To test for cointegration in a nominal VAR(p) process is essentially to test the rank of Π. There is a likelihood ratio test for this, called Johansen's test. It is implemented in the R function ca.jo in the urca package.
36 Cointegrated Financial Time Series The only way to receive returns uniformly above a risk-adjusted rate is by arbitrage. In a fair and stable market, there is no arbitrage. Whenever cointegrated time series exist, however, there is often the possibility that the two series do not reflect true value at a given time. An example of a trading strategy that exploits this is pairs trading.
MFE Financial Econometrics 2018 Final Exam Model Solutions Tuesday 12 th March, 2019 1. If (X, ε) N (0, I 2 ) what is the distribution of Y = µ + β X + ε? Y N ( µ, β 2 + 1 ) 2. What is the Cramer-Rao lower
More informationLecture 5: Unit Roots, Cointegration and Error Correction Models The Spurious Regression Problem
Lecture 5: Unit Roots, Cointegration and Error Correction Models The Spurious Regression Problem Prof. Massimo Guidolin 20192 Financial Econometrics Winter/Spring 2018 Overview Stochastic vs. deterministic
More informationAR, MA and ARMA models
AR, MA and AR by Hedibert Lopes P Based on Tsay s Analysis of Financial Time Series (3rd edition) P 1 Stationarity 2 3 4 5 6 7 P 8 9 10 11 Outline P Linear Time Series Analysis and Its Applications For
More informationLecture 2: Univariate Time Series
Lecture 2: Univariate Time Series Analysis: Conditional and Unconditional Densities, Stationarity, ARMA Processes Prof. Massimo Guidolin 20192 Financial Econometrics Spring/Winter 2017 Overview Motivation:
More informationEmpirical Market Microstructure Analysis (EMMA)
Empirical Market Microstructure Analysis (EMMA) Lecture 3: Statistical Building Blocks and Econometric Basics Prof. Dr. Michael Stein michael.stein@vwl.uni-freiburg.de Albert-Ludwigs-University of Freiburg
More informationChapter 12: An introduction to Time Series Analysis. Chapter 12: An introduction to Time Series Analysis
Chapter 12: An introduction to Time Series Analysis Introduction In this chapter, we will discuss forecasting with single-series (univariate) Box-Jenkins models. The common name of the models is Auto-Regressive
More informationTime Series Methods. Sanjaya Desilva
Time Series Methods Sanjaya Desilva 1 Dynamic Models In estimating time series models, sometimes we need to explicitly model the temporal relationships between variables, i.e. does X affect Y in the same
More informationChapter 5. Analysis of Multiple Time Series. 5.1 Vector Autoregressions
Chapter 5 Analysis of Multiple Time Series Note: The primary references for these notes are chapters 5 and 6 in Enders (2004). An alternative, but more technical treatment can be found in chapters 10-11
More informationTopic 4 Unit Roots. Gerald P. Dwyer. February Clemson University
Topic 4 Unit Roots Gerald P. Dwyer Clemson University February 2016 Outline 1 Unit Roots Introduction Trend and Difference Stationary Autocorrelations of Series That Have Deterministic or Stochastic Trends
More informationChapter 4: Models for Stationary Time Series
Chapter 4: Models for Stationary Time Series Now we will introduce some useful parametric models for time series that are stationary processes. We begin by defining the General Linear Process. Let {Y t
More informationE 4160 Autumn term Lecture 9: Deterministic trends vs integrated series; Spurious regression; Dickey-Fuller distribution and test
E 4160 Autumn term 2016. Lecture 9: Deterministic trends vs integrated series; Spurious regression; Dickey-Fuller distribution and test Ragnar Nymoen Department of Economics, University of Oslo 24 October
More informationCHAPTER 21: TIME SERIES ECONOMETRICS: SOME BASIC CONCEPTS
CHAPTER 21: TIME SERIES ECONOMETRICS: SOME BASIC CONCEPTS 21.1 A stochastic process is said to be weakly stationary if its mean and variance are constant over time and if the value of the covariance between
More informationNon-Stationary Time Series and Unit Root Testing
Econometrics II Non-Stationary Time Series and Unit Root Testing Morten Nyboe Tabor Course Outline: Non-Stationary Time Series and Unit Root Testing 1 Stationarity and Deviation from Stationarity Trend-Stationarity
More information2. An Introduction to Moving Average Models and ARMA Models
. An Introduction to Moving Average Models and ARMA Models.1 White Noise. The MA(1) model.3 The MA(q) model..4 Estimation and forecasting of MA models..5 ARMA(p,q) models. The Moving Average (MA) models
More informationCointegrated VAR s. Eduardo Rossi University of Pavia. November Rossi Cointegrated VAR s Fin. Econometrics / 31
Cointegrated VAR s Eduardo Rossi University of Pavia November 2014 Rossi Cointegrated VAR s Fin. Econometrics - 2014 1 / 31 B-N decomposition Give a scalar polynomial α(z) = α 0 + α 1 z +... + α p z p
More information1 Teaching notes on structural VARs.
Bent E. Sørensen February 22, 2007 1 Teaching notes on structural VARs. 1.1 Vector MA models: 1.1.1 Probability theory The simplest (to analyze, estimation is a different matter) time series models are
More informationEconometrics. Week 11. Fall Institute of Economic Studies Faculty of Social Sciences Charles University in Prague
Econometrics Week 11 Institute of Economic Studies Faculty of Social Sciences Charles University in Prague Fall 2012 1 / 30 Recommended Reading For the today Advanced Time Series Topics Selected topics
More informationChapter 2. Some basic tools. 2.1 Time series: Theory Stochastic processes
Chapter 2 Some basic tools 2.1 Time series: Theory 2.1.1 Stochastic processes A stochastic process is a sequence of random variables..., x 0, x 1, x 2,.... In this class, the subscript always means time.
More informationVolatility. Gerald P. Dwyer. February Clemson University
Volatility Gerald P. Dwyer Clemson University February 2016 Outline 1 Volatility Characteristics of Time Series Heteroskedasticity Simpler Estimation Strategies Exponentially Weighted Moving Average Use
More informationTime Series Econometrics 4 Vijayamohanan Pillai N
Time Series Econometrics 4 Vijayamohanan Pillai N Vijayamohan: CDS MPhil: Time Series 5 1 Autoregressive Moving Average Process: ARMA(p, q) Vijayamohan: CDS MPhil: Time Series 5 2 1 Autoregressive Moving
More informationCovariance Stationary Time Series. Example: Independent White Noise (IWN(0,σ 2 )) Y t = ε t, ε t iid N(0,σ 2 )
Covariance Stationary Time Series Stochastic Process: sequence of rv s ordered by time {Y t } {...,Y 1,Y 0,Y 1,...} Defn: {Y t } is covariance stationary if E[Y t ]μ for all t cov(y t,y t j )E[(Y t μ)(y
More informationElements of Multivariate Time Series Analysis
Gregory C. Reinsel Elements of Multivariate Time Series Analysis Second Edition With 14 Figures Springer Contents Preface to the Second Edition Preface to the First Edition vii ix 1. Vector Time Series
More informationIntroduction to Eco n o m et rics
2008 AGI-Information Management Consultants May be used for personal purporses only or by libraries associated to dandelon.com network. Introduction to Eco n o m et rics Third Edition G.S. Maddala Formerly
More informationNon-Stationary Time Series, Cointegration, and Spurious Regression
Econometrics II Non-Stationary Time Series, Cointegration, and Spurious Regression Econometrics II Course Outline: Non-Stationary Time Series, Cointegration and Spurious Regression 1 Regression with Non-Stationarity
More informationVector Autoregression
Vector Autoregression Jamie Monogan University of Georgia February 27, 2018 Jamie Monogan (UGA) Vector Autoregression February 27, 2018 1 / 17 Objectives By the end of these meetings, participants should
More informationHeteroskedasticity; Step Changes; VARMA models; Likelihood ratio test statistic; Cusum statistic.
47 3!,57 Statistics and Econometrics Series 5 Febrary 24 Departamento de Estadística y Econometría Universidad Carlos III de Madrid Calle Madrid, 126 2893 Getafe (Spain) Fax (34) 91 624-98-49 VARIANCE
More informationCh 6. Model Specification. Time Series Analysis
We start to build ARIMA(p,d,q) models. The subjects include: 1 how to determine p, d, q for a given series (Chapter 6); 2 how to estimate the parameters (φ s and θ s) of a specific ARIMA(p,d,q) model (Chapter
More informationA time series is called strictly stationary if the joint distribution of every collection (Y t
5 Time series A time series is a set of observations recorded over time. You can think for example at the GDP of a country over the years (or quarters) or the hourly measurements of temperature over a
More informationEconomics 618B: Time Series Analysis Department of Economics State University of New York at Binghamton
Problem Set #1 1. Generate n =500random numbers from both the uniform 1 (U [0, 1], uniformbetween zero and one) and exponential λ exp ( λx) (set λ =2and let x U [0, 1]) b a distributions. Plot the histograms
More informationWeek 5 Quantitative Analysis of Financial Markets Characterizing Cycles
Week 5 Quantitative Analysis of Financial Markets Characterizing Cycles Christopher Ting http://www.mysmu.edu/faculty/christophert/ Christopher Ting : christopherting@smu.edu.sg : 6828 0364 : LKCSB 5036
More informationEC408 Topics in Applied Econometrics. B Fingleton, Dept of Economics, Strathclyde University
EC48 Topics in Applied Econometrics B Fingleton, Dept of Economics, Strathclyde University Applied Econometrics What is spurious regression? How do we check for stochastic trends? Cointegration and Error
More informationCointegration, Stationarity and Error Correction Models.
Cointegration, Stationarity and Error Correction Models. STATIONARITY Wold s decomposition theorem states that a stationary time series process with no deterministic components has an infinite moving average
More informationStochastic Processes
Stochastic Processes Stochastic Process Non Formal Definition: Non formal: A stochastic process (random process) is the opposite of a deterministic process such as one defined by a differential equation.
More informationTime Series 4. Robert Almgren. Oct. 5, 2009
Time Series 4 Robert Almgren Oct. 5, 2009 1 Nonstationarity How should you model a process that has drift? ARMA models are intrinsically stationary, that is, they are mean-reverting: when the value of
More informationECON/FIN 250: Forecasting in Finance and Economics: Section 7: Unit Roots & Dickey-Fuller Tests
ECON/FIN 250: Forecasting in Finance and Economics: Section 7: Unit Roots & Dickey-Fuller Tests Patrick Herb Brandeis University Spring 2016 Patrick Herb (Brandeis University) Unit Root Tests ECON/FIN
More informationVector Autogregression and Impulse Response Functions
Chapter 8 Vector Autogregression and Impulse Response Functions 8.1 Vector Autogregressions Consider two sequences {y t } and {z t }, where the time path of {y t } is affected by current and past realizations
More informationRegression with random walks and Cointegration. Spurious Regression
Regression with random walks and Cointegration Spurious Regression Generally speaking (an exception will follow) it s a really really bad idea to regress one random walk process on another. That is, if
More informationBootstrapping the Grainger Causality Test With Integrated Data
Bootstrapping the Grainger Causality Test With Integrated Data Richard Ti n University of Reading July 26, 2006 Abstract A Monte-carlo experiment is conducted to investigate the small sample performance
More informationUnivariate, Nonstationary Processes
Univariate, Nonstationary Processes Jamie Monogan University of Georgia March 20, 2018 Jamie Monogan (UGA) Univariate, Nonstationary Processes March 20, 2018 1 / 14 Objectives By the end of this meeting,
More informationMultivariate Time Series
Multivariate Time Series Fall 2008 Environmental Econometrics (GR03) TSII Fall 2008 1 / 16 More on AR(1) In AR(1) model (Y t = µ + ρy t 1 + u t ) with ρ = 1, the series is said to have a unit root or a
More informationIdentifiability, Invertibility
Identifiability, Invertibility Defn: If {ǫ t } is a white noise series and µ and b 0,..., b p are constants then X t = µ + b 0 ǫ t + b ǫ t + + b p ǫ t p is a moving average of order p; write MA(p). Q:
More information