UNIVERSITY OF CALIFORNIA, SAN DIEGO DEPARTMENT OF ECONOMICS


THE JOHANSEN-GRANGER REPRESENTATION THEOREM: AN EXPLICIT EXPRESSION FOR I(1) PROCESSES

BY PETER REINHARD HANSEN

DISCUSSION PAPER 2000-07
JULY 2000

The Johansen-Granger Representation Theorem: An Explicit Expression for I(1) Processes

Peter Reinhard Hansen†
UC San Diego and Brown University
Email: reinhard@ucsd.edu, Phone: (401) 863-3836

July 2000

Abstract: The Johansen-Granger representation theorem for the cointegrated vector autoregressive process is derived using the companion form. This approach yields an explicit representation of all coefficients and initial values. This result is useful for impulse response analysis, common feature analysis and asymptotic analysis of cointegrated processes.

† I thank Graham Elliott, James D. Hamilton, Hans Christian Kongsted, Anders Rahbek, and Halbert White for valuable comments. All errors remain my responsibility. Financial support from the Danish Social Science Research Council and the Danish Research Academy is gratefully acknowledged.

1 Introduction

The Johansen-Granger representation theorem states that a vector autoregressive process A(L)X_t = ε_t, integrated of order one, has the representation

    X_t = C Σ_{i=1}^t ε_i + C(L)ε_t + A,

where {C(L)ε_t} is stationary if {ε_t} is stationary, and where A depends on initial values (X_0, X_{-1}, …); see Johansen (1991, 1996). Johansen's result gives explicit values of C, whereas the coefficients of the lag polynomial C(L) and the initial value A are given implicitly. This representation of cointegrated processes is known as the Granger representation and is synonymous with the Wold representation for stationary processes.¹ Because the representation divides X_t into a random walk and a stationary process, it can be viewed as a multivariate Beveridge-Nelson decomposition where the labels are permanent and transitory components; see Beveridge and Nelson (1981).

The Granger representation is valuable in the asymptotic analysis of cointegrated processes, where typically only an explicit expression for C is needed. Explicit values for the coefficients in C(L) are useful in common feature analysis (see Engle and Kozicki (1993)) and in impulse response analysis (see Lütkepohl and Reimers (1992), Warne (1993), and Lütkepohl and Saikkonen (1997)), where the coefficients of C(L) are interpreted as the transitory effects of the shocks ε_t. Similarly, in asymptotic analysis of the model with structural breaks, see Hansen (2000), it is valuable to have an explicit value for A.

In this paper, explicit values of coefficients as well as initial values are found using the companion form, making use of the algebraic structure that characterizes this model. From Johansen (1996) we adopt the following definitions: for an m × n matrix a with full column rank n, we define ā = a(a′a)^{-1}, and we let the orthogonal complement of a be the full rank m × (m−n) matrix a⊥ that has a⊥′a = 0.

In Section 2 the explicit representation is derived. In Section 3 we consider deterministic aspects of the representation. Section 4 contains concluding remarks and the appendix contains the relevant algebra.

¹ The original Granger representation theorem, given by Engle and Granger (1987), asserts the existence of an error correction representation of ΔX_t, under the assumptions that ΔX_t and β′X_t have stationary and invertible VARMA representations, for some matrix β. The Johansen-Granger representation theorem, of Johansen (1991, 1996), makes assumptions on the autoregressive parameters that precisely characterize I(1) processes, and states results on the moving average representation of X_t.
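The bar and orthogonal-complement notation can be made concrete numerically. The sketch below (not part of the original paper; it uses numpy and an arbitrary random matrix) computes ā = a(a′a)^{-1} and an orthogonal complement a⊥ via the SVD, and checks the defining properties ā′a = I, a⊥′a = 0, and that (a, a⊥) has full rank.

```python
import numpy as np

rng = np.random.default_rng(0)

m, n = 5, 2
a = rng.normal(size=(m, n))   # full column rank with probability one

# a_bar = a (a'a)^{-1}, so that a_bar' a = I_n
a_bar = a @ np.linalg.inv(a.T @ a)

# a_perp: an m x (m-n) basis for the orthogonal complement of span(a),
# taken from the last m-n left singular vectors of a
U, _, _ = np.linalg.svd(a, full_matrices=True)
a_perp = U[:, n:]

print(np.allclose(a_bar.T @ a, np.eye(n)))            # True
print(np.allclose(a_perp.T @ a, 0))                   # True
print(np.linalg.matrix_rank(np.hstack([a, a_perp])))  # 5, i.e. full rank m
```

Any basis of the orthogonal complement works equally well: a⊥ is only unique up to right-multiplication by a nonsingular matrix.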

2 The Granger Representation for Autoregressive Processes Integrated of Order One

We consider the p-dimensional vector autoregressive process of order k

    X_t = Π_1 X_{t-1} + Π_2 X_{t-2} + ⋯ + Π_k X_{t-k} + ΦD_t + ε_t,  t = 1, …, T,

where the process's deterministic terms are contained in D_t and where ε_t, t = 1, …, T, is a sequence of independent identically distributed stochastic variables with mean zero.² The process can be re-written in error correction form:

    ΔX_t = ΠX_{t-1} + Σ_{i=1}^{k-1} Γ_i ΔX_{t-i} + ΦD_t + ε_t,  t = 1, …, T,

where Π = −I + Σ_{i=1}^k Π_i and Γ_i = −Σ_{j=i+1}^k Π_j. The conditions that ensure that X_t is integrated of order one, referred to as X_t being I(1), are stated in the following assumption.

Assumption 2.1 The assumptions of the Johansen-Granger representation theorem are:
(i) The roots of the characteristic polynomial det(A(z)) = det(I − Π_1 z − Π_2 z² − ⋯ − Π_k z^k) are either outside the unit circle or equal to one.
(ii) The matrix Π has reduced rank r < p, and can therefore be expressed as the product Π = αβ′, where α and β are p × r matrices of full column rank r.
(iii) The matrix α⊥′Γβ⊥ has full rank, where Γ = I − Σ_{i=1}^{k-1} Γ_i and where α⊥ and β⊥ are the orthogonal complements of α and β.

The first assumption ensures that the process is not explosive (roots in the unit circle) or seasonally cointegrated (roots on the boundary of the unit circle different from z = 1); see Hylleberg, Engle, Granger, and Yoo (1990) or Johansen and Schaumburg (1998). The second ensures that there are at least p − r unit roots and induces cointegration whenever r ≥ 1. The third assumption restricts the process from being I(2), because (iii) together with (ii) ensures that the number of unit roots is exactly p − r.

² The Granger representation does not rely on the assumptions on ε_t, since it is entirely an algebraic derivation. However, the iid assumption is important for some of the interpretations of the representation.

Under these assumptions, Johansen (1991) showed that X_t has the representation X_t = C Σ_{i=1}^t (ε_i + ΦD_i) + C(L)(ε_t + ΦD_t) + A, where C = β⊥(α⊥′Γβ⊥)^{-1}α⊥′. By using the companion form of the process, it is possible to obtain explicit values for the coefficients of the lag polynomial C(L) = C_0 + C_1 L + C_2 L² + ⋯ and the initial values contained in A, as I show below. The following lemma will be useful.

Lemma 2.2 Let a and b be m × n matrices, m ≥ n, with full column rank n, and let a⊥ and b⊥ be their orthogonal complements, respectively. The following five statements are equivalent:
(i) The matrix (I + b′a) does not have 1 as an eigenvalue.
(ii) Let v be a vector in R^n. Then (b′a)v = 0 implies v = 0.
(iii) The matrix b′a has full rank.
(iv) The m × m matrix (b, a⊥) has full rank.
(v) The matrix b⊥′a⊥ has full rank.

Proof. The equivalence of (i), (ii) and (iii) is straightforward, and the identity

    |(a, a⊥)| |(b, a⊥)| = |(a, a⊥)′(b, a⊥)| = |a′b| |a⊥′a⊥|,

which holds because (a, a⊥)′(b, a⊥) = [a′b, 0; a⊥′b, a⊥′a⊥] is block triangular, proves that (iii) holds if and only if (iv) holds. Finally, the identity

    |(b, b⊥)| |(b, a⊥)| = |(b, b⊥)′(b, a⊥)| = |b′b| |b⊥′a⊥|

completes the proof.

2.1 The Companion Form

We transform the process into the companion form, by defining

    X̃_t = (X_t′, ΔX_t′, …, ΔX_{t-k+2}′)′,

so that, with suitable definitions,

    ΔX̃_t = Π̃X̃_{t-1} + Φ̃_t + ε̃_t = α̃β̃′X̃_{t-1} + Φ̃_t + ε̃_t,

which converts the process to a vector autoregressive process of order one. The needed definitions are

    α̃ = [ α   Γ_1     Γ_2   ⋯   Γ_{k-1} ]        β̃ = [ β   0   ⋯   0 ]
         [ α   Γ_1−I   Γ_2   ⋯   Γ_{k-1} ]             [ 0   I   ⋯   0 ]
         [ 0   I       −I    ⋯   0       ]             [ ⋮       ⋱   ⋮ ]
         [ ⋮           ⋱     ⋱   ⋮       ]             [ 0   0   ⋯   I ]
         [ 0   0       ⋯     I   −I      ]

    ε̃_t = (ε_t′, ε_t′, 0, …, 0)′,   Φ̃_t = ((ΦD_t)′, (ΦD_t)′, 0, …, 0)′,

with Π̃ = α̃β̃′. It is easily verified that the orthogonal complements of α̃ and β̃ are given by

    α̃⊥′ = α⊥′(Γ, Σ_{i=1}^{k-1}Γ_i, Σ_{i=2}^{k-1}Γ_i, …, Γ_{k-1}),   β̃⊥′ = (β⊥′, 0, …, 0).

Lemma 2.3 Let α̃, β̃, α̃⊥ and β̃⊥ be defined as above, and assume that Assumption 2.1 holds. Then the eigenvalues of the matrix (I + β̃′α̃) are all less than one in absolute value.
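Before the formal proof, Lemma 2.3 can be illustrated numerically. The sketch below (not part of the original paper; the system and all numbers are illustrative assumptions) builds α̃ and β̃ for a small I(1) system with p = 2, k = 2, r = 1, and checks that the eigenvalues of I + β̃′α̃ lie strictly inside the unit circle, while the companion matrix of the levels VAR has exactly one eigenvalue at one.

```python
import numpy as np

# illustrative I(1) system (not from the paper): p = 2 variables, k = 2 lags, r = 1
alpha = np.array([[-0.5], [0.0]])
beta  = np.array([[1.0], [-1.0]])
G1    = 0.2 * np.eye(2)                      # Gamma_1

# companion-form loadings: beta_tilde = diag(beta, I),
# alpha_tilde = [[alpha, Gamma_1], [alpha, Gamma_1 - I]]
beta_t  = np.block([[beta, np.zeros((2, 2))],
                    [np.zeros((2, 1)), np.eye(2)]])
alpha_t = np.block([[alpha, G1],
                    [alpha, G1 - np.eye(2)]])

# Lemma 2.3: all eigenvalues of I + beta_tilde' alpha_tilde inside the unit circle
eig = np.linalg.eigvals(np.eye(3) + beta_t.T @ alpha_t)
print(np.abs(eig).max() < 1)                 # True

# cross-check via the levels VAR X_t = Pi_1 X_{t-1} + Pi_2 X_{t-2} + eps_t with
# Pi_1 = I + alpha beta' + Gamma_1 and Pi_2 = -Gamma_1: its companion matrix
# has p - r = 1 eigenvalue at one, the rest strictly inside the unit circle
P1 = np.eye(2) + alpha @ beta.T + G1
P2 = -G1
A  = np.block([[P1, P2], [np.eye(2), np.zeros((2, 2))]])
lam = np.sort(np.abs(np.linalg.eigvals(A)))
print(np.isclose(lam[-1], 1.0), lam[-2] < 1) # True True
```

The stationary eigenvalues of the companion matrix coincide with those of I + β̃′α̃, which is exactly what the lemma exploits.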

Proof. By Assumption 2.1(iii), the identity α̃⊥′β̃⊥ = α⊥′(I − Γ_1 − ⋯ − Γ_{k-1})β⊥ = α⊥′Γβ⊥ shows that α̃⊥′β̃⊥ has full rank, and by Lemma 2.2 we have that 1 is not an eigenvalue of (I + β̃′α̃). However, we need to show that the eigenvalues are smaller than one in absolute value. Therefore consider an eigenvector v = (v_1′, …, v_k′)′ ≠ 0 of (I + β̃′α̃), i.e. (I + β̃′α̃)v = λv. The upper r + p rows of (I + β̃′α̃)v yield

    v_1 + β′(αv_1 + Γ_1v_2 + ⋯ + Γ_{k-1}v_k) = λv_1,
    (αv_1 + Γ_1v_2 + ⋯ + Γ_{k-1}v_k) = λv_2,

which implies λβ′v_2 = (λ − 1)v_1, and the remaining part implies v_2 = λv_3 = ⋯ = λ^{k-2}v_k. The case λ = 0 clearly fulfills |λ| < 1, so assume λ ≠ 0. Multiply the second set of equations by (λ − 1)/λ^{k-1} and substitute z = 1/λ to obtain

    [I(1 − z) − αβ′z − Γ_1(1 − z)z − ⋯ − Γ_{k-1}(1 − z)z^{k-1}]v_k = 0.

This is equivalent to

    [I(1 − z) − αβ′z − Σ_{i=1}^{k-1} Γ_i(1 − z)z^i]v_k = 0,

and since Assumption 2.1 ensures |z| > 1 for any such root different from one, we conclude that |λ| < 1.

The result has the implication that under Assumption 2.1 the sum Σ_{i=0}^∞ (I + β̃′α̃)^i is convergent with limit −(β̃′α̃)^{-1}, such that a process defined by Y_t = Σ_{i=0}^∞ (I + β̃′α̃)^i u_{t-i} is stationary whenever u_t is stationary.

Lemma 2.4 With the definitions above we have the identities:

    (I − CΓ) = (I − CΓ)β̄β′,
    I = (I − CΓ)β̄β′ + C(Γ − I) + C.

Proof. Since I = β(β′β)^{-1}β′ + β⊥(β⊥′β⊥)^{-1}β⊥′ = β̄β′ + β⊥β̄⊥′, the first identity follows from

    (I − CΓ) = (I − CΓ)(β̄β′ + β⊥β̄⊥′)
             = (I − CΓ)β̄β′ + β⊥β̄⊥′ − β⊥(α⊥′Γβ⊥)^{-1}α⊥′Γβ⊥β̄⊥′
             = (I − CΓ)β̄β′,

and the second follows by applying the first identity to the decomposition I = (I − CΓ) + C(Γ − I) + C.

We are now ready to formulate the main result.

Theorem 2.5 (The Johansen-Granger representation theorem) Let a process be given by the equation

    ΔX_t = ΠX_{t-1} + Σ_{i=1}^{k-1} Γ_i ΔX_{t-i} + ΦD_t + ε_t,

and assume that Assumption 2.1 holds. Then the process has the representation

    X_t = C Σ_{i=1}^t (ε_i + ΦD_i) + C(L)(ε_t + ΦD_t) + C(X_0 − Γ_1X_{-1} − ⋯ − Γ_{k-1}X_{-k+1}),

where C = β⊥(α⊥′Γβ⊥)^{-1}α⊥′ and where the coefficients of C(L) are given by C_i = GQ^iE_{1,2}, where

    G = ((I − CΓ), −Cs_1, …, −Cs_{k-1}),   s_i = Γ_i + ⋯ + Γ_{k-1},

    Q = [ I+αβ′   Γ_1   ⋯   Γ_{k-2}   Γ_{k-1} ]
        [ αβ′     Γ_1   ⋯   Γ_{k-2}   Γ_{k-1} ]
        [ 0       I     ⋯   0         0       ]
        [ ⋮             ⋱             ⋮       ]
        [ 0       0     ⋯   I         0       ],

    E_{1,2} = (I_p, I_p, 0, …, 0)′.

Proof. Under Assumption 2.1 the pk × pk matrix (β̃, α̃⊥)′ has full rank. We can therefore obtain the Granger representation for X̃_t by finding the moving average representation for the processes β̃′X̃_t and α̃⊥′X̃_t individually, and then stacking them and multiplying by ((β̃, α̃⊥)′)^{-1}. First, consider the process

    β̃′X̃_t = (I + β̃′α̃)β̃′X̃_{t-1} + β̃′(ε̃_t + Φ̃_t).

Since all the eigenvalues of (I + β̃′α̃), according to Lemma 2.3, are smaller than one in absolute value, the process has the stationary representation

    β̃′X̃_t = Σ_{i=0}^∞ (I + β̃′α̃)^i β̃′(ε̃_{t-i} + Φ̃_{t-i}),

where by stationary we mean that β̃′X̃_t − E(β̃′X̃_t) is stationary. Next consider the random walk

    α̃⊥′X̃_t = α̃⊥′X̃_{t-1} + α̃⊥′(ε̃_t + Φ̃_t) = α̃⊥′X̃_0 + Σ_{i=1}^t α̃⊥′(ε̃_i + Φ̃_i).

A representation for X̃_t is now obtained as

    X̃_t = ((β̃, α̃⊥)′)^{-1} ( β̃′X̃_t ; α̃⊥′X̃_t ).

The entire matrix ((β̃, α̃⊥)′)^{-1} is given in the Appendix; for our purposes we only need its upper p rows, which define the equation for X_t. These rows are given by

    ((I − CΓ)β̄, −Cs_1, …, −Cs_{k-1}, β⊥(α⊥′Γβ⊥)^{-1}),

with the definition s_i = Γ_i + ⋯ + Γ_{k-1}. For simplicity, we define F = ((I − CΓ)β̄, −Cs_1, …, −Cs_{k-1}) and obtain the representation for X_t:

    X_t = F β̃′X̃_t + β⊥(α⊥′Γβ⊥)^{-1} α̃⊥′X̃_t
        = F Σ_{i=0}^∞ (I + β̃′α̃)^i β̃′(ε̃_{t-i} + Φ̃_{t-i}) + C Σ_{i=1}^t (ε_i + ΦD_i) + β⊥(α⊥′Γβ⊥)^{-1} α̃⊥′X̃_0
        = C(L)(ε_t + ΦD_t) + C Σ_{i=1}^t (ε_i + ΦD_i) + A,

where we used that α̃⊥′ε̃_i = α⊥′ε_i and α̃⊥′Φ̃_i = α⊥′ΦD_i, so that β⊥(α⊥′Γβ⊥)^{-1}α̃⊥′(ε̃_i + Φ̃_i) = C(ε_i + ΦD_i). The initial value A is explicitly given by

    A = β⊥(α⊥′Γβ⊥)^{-1} α̃⊥′X̃_0 = C(X_0 − Γ_1X_{-1} − ⋯ − Γ_{k-1}X_{-(k-1)}),

and the coefficients of the polynomial C(L) are given by

    C_i = F(I + β̃′α̃)^i E_1 = F D^i E_1,

with the additional definitions D = (I + β̃′α̃) and E_1 = β̃′E_{1,2}, where E_{1,2} = (I_p, I_p, 0, …, 0)′.

where the initial value is explicitly given by = ¹??X = (X X k X (k ) ); and the coe cients of the polynomial (L) are given by i = F (I + ) i E = FD i E ;2 ; with the additional de nitions D = (I + ) = I ;E = I p and E ;2 = I p I p : ecause (I + ) = (I + ) we have that D = Q where Q is as given in the theorem Thus, the coe cients can be written as i = FD i E ;2 = F Q i E = GQ i E ;2 where G = F = (I ); s ;:::; s k ; where we applied the identity (I ) ¹ =(I ) of Lemma 24 This completes the proof orollary 26 The coe cients of (L) can be obtained recursively from the formula ix i = i + ( + j ) i j ; i =; 2;:::; j= 9

where = I and = I and where we set j =for j k: Proof From the proof of the Johansen-Granger representation theorem we have that i = GQ i E ;2 So by de ning i = ;i 2;i k;i = Q i = Q i E ;2 ; tedious algebra (given in the ppendix) leads to the relation i = i + 2;i ; = ; i =; 2;::: and the structure of Q yields the equation ix 2;i = ( + j ) 2;i j ; 2; = I i =; 2;:::: j= y inserting 2;i j = i j i j we nd the equation of the corollary one s a special case we formulate the representation for the vector autoregressive process of order orollary 27 Let X t = X t + " t be a process ful lling ssumption 2 Then we have the representation tx X X t = " i +( ) i I + "t i + X i= i= where =? (?? )? : The result of orollary 27 is derived directly in Johansen (996) by dividing the process into its stationary and non-stationary part with the identity I = +? (??)?: The proof of Theorem 25 made use of the more general identity I =(I ) ¹ + ( I)+ ¹?? of Lemma 24, which simpli es to the identity in Johansen (996) when =I; as is the case for a VR() process 3 Deterministic Terms In this section we study the stationary polynomial s role for the deterministic term The deterministic part plays an important role for the asymptotic analysis of this model, because the limits

of some test statistics depend on the deterministic term. The literature has developed a notation for models with different deterministic terms, which we shall adopt. First we analyze the model H1. This model contains only a constant, ΦD_t = μ0, which in general will give rise to a linear trend in the process X_t. Next, we also analyze its sub-model H1*, which has the deterministic term ΦD_t = αρ0. This is equivalent to the restriction α⊥′μ0 = 0 on the constant, which is precisely what is needed for X_t not to have a linear trend. We also analyze the models H and H*. Model H has a linear deterministic trend, ΦD_t = μ0 + μ1t, which gives rise to a quadratic trend in the process X_t, and the sub-model H* has the deterministic trend restricted to ΦD_t = μ0 + αρ1t, which prevents X_t from having a quadratic trend.

3.1 The Models H1 and H1*

When the deterministic term is simply a constant, ΦD_t = μ0, the Granger representation is given by

    X_t = C Σ_{i=1}^t ε_i + C(L)ε_t + C(1)μ0 + Cμ0 t + A.

So unless Cμ0 = 0, the constant μ0 leads to a deterministic linear trend in the process X_t. The matrix C(1) is calculated in the appendix and is found to be

    C(1) = (I − CΓ)β̄ᾱ′(ΓC − I) − C(Σ_{i=1}^{k-1} iΓ_i)C = ξη − CΨC,

where ξ = (I − CΓ)β̄, η = ᾱ′(ΓC − I) and Ψ = Σ_{i=1}^{k-1} iΓ_i = Σ_{i=1}^{k-1} Σ_{j=i}^{k-1} Γ_j.

This result encompasses two findings from Hansen and Johansen (1998). The first is that E(β′X_t) = β′C(1)μ0 = ᾱ′(ΓC − I)μ0, and the second is that in H1*, where μ0 = αρ0, the linear trend vanishes while the constant in the process is given by C(1)μ0 = −(I − CΓ)β̄ρ0.
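The vanishing of the linear trend under H1* can be confirmed numerically: the trend slope is Cμ0, and since Cα = 0, any constant of the form μ0 = αρ0 is annihilated. The sketch below (not part of the original paper; all matrices are illustrative assumptions) computes C = β⊥(α⊥′Γβ⊥)^{-1}α⊥′ for a small system and checks both the unrestricted and the restricted case.

```python
import numpy as np

# illustrative system (not from the paper): p = 2, r = 1, Gamma = I - Gamma_1
alpha  = np.array([[-0.5], [0.0]])
beta   = np.array([[1.0], [-1.0]])
Gamma  = 0.8 * np.eye(2)
a_perp = np.array([[0.0], [1.0]])            # alpha_perp
b_perp = np.array([[1.0], [1.0]])            # beta_perp

# C = beta_perp (alpha_perp' Gamma beta_perp)^{-1} alpha_perp'
C = b_perp @ np.linalg.inv(a_perp.T @ Gamma @ b_perp) @ a_perp.T

# model H1: an unrestricted constant mu0 gives the trend slope C mu0
mu0 = np.array([[0.3], [0.4]])
print(C @ mu0)                               # a nonzero slope, here (0.5, 0.5)'

# model H1*: mu0 = alpha rho0, and the slope vanishes because C alpha = 0
rho0 = np.array([[0.7]])
print(np.allclose(C @ (alpha @ rho0), 0))    # True
```

The property β′C = 0 also holds, reflecting that the cointegrating relations carry neither a random-walk nor a trend component.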

3.2 The Models H and H*

When the deterministic term contains a linear trend, ΦD_t = μ0 + μ1t, the deterministic part of the Granger representation is given by

    (1/2)Cμ1t² + C(μ0 + (1/2)μ1)t + C(L)(μ0 + μ1t),

(see Hansen and Johansen (1998)). This can be re-written as

    (1/2)Cμ1t² + (C(μ0 + (1/2)μ1) + C(1)μ1)t + C(1)μ0 − (Σ_{i=1}^∞ iC_i)μ1.   (3.1)

So unless α⊥′μ1 = 0, the linear trend μ1 leads to a quadratic deterministic trend in the process X_t. The only term of (3.1) not derived previously is Σ_{i=1}^∞ iC_i. This term is derived in the appendix and is given by

    Σ_{i=1}^∞ iC_i = −C(1) − ξᾱ′Γξη − ξᾱ′(I − ΓC)ΨC + CΨξη − CΨC − CΨCΨC − C(Σ_{j=2}^{k-1} (j(j−1)/2)Γ_j)C.

In model H*, where the linear trend is restricted to μ1 = αρ1, (3.1) reduces to

    (Cμ0 − (I − CΓ)β̄ρ1)t + C(1)μ0 − (Σ_{i=1}^∞ iC_i)αρ1,

which encompasses a result from Johansen (1996, equation 5.2), because the expression for the linear trend coefficient in Johansen (1996, equation 5.2) equals Cμ0 − (I − CΓ)β̄ρ1.

4 Conclusion

We gave an explicit expression for the moving average representation of processes integrated of order one, using the companion form for the process. The explicit expression is useful to have in studies of impulse response functions and in common features analysis. As a side benefit, the approach gives a new proof of the Johansen-Granger representation theorem, a proof that some might find more intuitive and easier to follow than previous proofs.
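The identity I = α(β′α)^{-1}β′ + β⊥(α⊥′β⊥)^{-1}α⊥′, on which the VAR(1) case of Corollary 2.7 rests, is easy to verify numerically. The sketch below (not part of the original paper) draws α and β at random, constructs the orthogonal complements via the SVD, and checks that the two terms are complementary projections.

```python
import numpy as np

rng = np.random.default_rng(1)
p, r = 4, 2
alpha = rng.normal(size=(p, r))
beta  = rng.normal(size=(p, r))

# orthogonal complements via the SVD (full rank with probability one)
a_perp = np.linalg.svd(alpha)[0][:, r:]
b_perp = np.linalg.svd(beta)[0][:, r:]

# VAR(1) case (Gamma = I): C = beta_perp (alpha_perp' beta_perp)^{-1} alpha_perp'
C = b_perp @ np.linalg.inv(a_perp.T @ b_perp) @ a_perp.T

# Johansen's identity: projection onto span(beta_perp) along span(alpha),
# plus projection onto span(alpha) along span(beta_perp), equals I
I_check = C + alpha @ np.linalg.inv(beta.T @ alpha) @ beta.T
print(np.allclose(I_check, np.eye(p)))       # True
```

The identity requires β′α and α⊥′β⊥ to be nonsingular, which Lemma 2.2 shows are equivalent conditions.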

Appendix A: Proofs

A.1 The Inverse of (β̃, α̃⊥)′

In the proof of the Johansen-Granger representation theorem we need an explicit expression for the first p rows of ((β̃, α̃⊥)′)^{-1}. The entire matrix is given by

    ((β̃, α̃⊥)′)^{-1} = [ (I − CΓ)β̄   −Cs_1   ⋯   −Cs_{k-1}   β⊥(α⊥′Γβ⊥)^{-1} ]
                        [ 0            I       ⋯   0           0               ]
                        [ ⋮                    ⋱   ⋮           ⋮               ]
                        [ 0            0       ⋯   I           0               ],

which is verified by multiplying it by (β̃, α̃⊥)′ and using the identity (I − CΓ)β̄β′ = (I − CΓ) from Lemma 2.4.

A.2 The Expression for C_i

In Corollary 2.6 we asserted the relation C_i = C_{i-1} + θ_{2,i}, i = 1, 2, …. This relation is proved as follows. First, notice from the equation for θ_i,

    θ_i = (θ_{1,i}′, θ_{2,i}′, …, θ_{k,i}′)′ = Q^iE_{1,2},

and the structure of Q (given in Theorem 2.5) that θ_{m+1,i} = θ_{m,i-1} for m = 2, …, k−1, so that θ_{k,i} = θ_{2,i-k+2} for i ≥ k − 2, and that θ_{1,i} = θ_{1,i-1} + θ_{2,i}, whence θ_{1,i} = Σ_{j=0}^i θ_{2,j}. So that

    θ_{2,i} = Σ_{j=1}^i (αβ′ + Γ_j)θ_{2,i-j},  θ_{2,0} = I,  i = 1, 2, …,

and note that θ_{2,i+1} = αβ′θ_{1,i} + Γ_1θ_{2,i} + ⋯ + Γ_{k-1}θ_{k,i}. Next consider

    C_i = Gθ_i = (I − CΓ)θ_{1,i} − Cs_1θ_{2,i} − ⋯ − Cs_{k-1}θ_{k,i}
        = (I − CΓ)θ_{1,i-1} + (I − C)θ_{2,i} − Σ_{m=2}^{k-1} Cs_mθ_{m,i-1},

where we used θ_{1,i} = θ_{1,i-1} + θ_{2,i}, θ_{m+1,i} = θ_{m,i-1} and Γ + s_1 = I. Moreover, since Cα = 0,

    Cθ_{2,i} = C Σ_{m=1}^{k-1} Γ_mθ_{m+1,i-1} = Σ_{m=1}^{k-1} C(s_m − s_{m+1})θ_{m+1,i-1}
             = Σ_{m=1}^{k-1} Cs_mθ_{m+1,i-1} − Σ_{m=2}^{k-1} Cs_mθ_{m,i-1},

with s_k = 0, so that

    C_i = (I − CΓ)θ_{1,i-1} − Σ_{m=1}^{k-1} Cs_mθ_{m+1,i-1} + θ_{2,i} = Gθ_{i-1} + θ_{2,i} = C_{i-1} + θ_{2,i},

which completes the proof.

A.3 An Expression for C(1)

In the analysis of the deterministic terms we need to calculate

    C(1) = F Σ_{i=0}^∞ (I + β̃′α̃)^i E_1 = −F(β̃′α̃)^{-1}E_1.

The matrix β̃′α̃ is given by

    β̃′α̃ = [ β′α   β′Γ_1   ⋯   β′Γ_{k-2}   β′Γ_{k-1} ]
           [ α     Γ_1−I   ⋯   Γ_{k-2}     Γ_{k-1}   ]
           [ 0     I       −I  ⋯           0         ]
           [ ⋮             ⋱   ⋱           ⋮         ]
           [ 0     0       ⋯   I           −I        ],

and its inverse is

    (β̃′α̃)^{-1} = [ ᾱ′Γξ   ᾱ′(I−ΓC)s_1   ᾱ′(I−ΓC)s_2   ⋯   ᾱ′(I−ΓC)s_{k-1} ]
                  [ ξ      −(Cs_1+I)     −Cs_2         ⋯   −Cs_{k-1}       ]
                  [ ξ      −(Cs_1+I)     −(Cs_2+I)     ⋯   −Cs_{k-1}       ]
                  [ ⋮                                  ⋱   ⋮               ]
                  [ ξ      −(Cs_1+I)     −(Cs_2+I)     ⋯   −(Cs_{k-1}+I)   ],

where ξ = (I − CΓ)β̄. So

    (β̃′α̃)^{-1}E_1 = ( ᾱ′(I − ΓC); −C; …; −C ),   (A.1)

and therefore we find, with η = ᾱ′(ΓC − I),

    C(1) = −F(β̃′α̃)^{-1}E_1 = −ξᾱ′(I − ΓC) − C(Σ_{i=1}^{k-1} Σ_{j=i}^{k-1} Γ_j)C = ξη − CΨC,

where Ψ = Σ_{i=1}^{k-1} s_i = Σ_{i=1}^{k-1} iΓ_i.

A.4 An Expression for Σ_{i=1}^∞ iC_i

In the case where the deterministic term is given by ΦD_t = μ0 + μ1t, we make use of

    Σ_{i=1}^∞ i(I + β̃′α̃)^i = (β̃′α̃)^{-2} + (β̃′α̃)^{-1}.

The second term is calculated in the case with ΦD_t = μ0, where F(β̃′α̃)^{-1}E_1 = −C(1), and the term we need to add is given by

    (β̃′α̃)^{-2}E_1 = (β̃′α̃)^{-1}( ᾱ′(I − ΓC); −C; …; −C ) = ( w_1; w_2; w_2 + C; …; w_2 + (k−2)C ),

where

    w_1 = −ᾱ′Γξη − ᾱ′(I − ΓC)ΨC,   w_2 = −ξη + C + CΨC.

Thus

    F(β̃′α̃)^{-2}E_1 = ξw_1 − CΨw_2 − C(Σ_{j=2}^{k-1} (j(j−1)/2)Γ_j)C
                     = −ξᾱ′Γξη − ξᾱ′(I − ΓC)ΨC + CΨξη − CΨC − CΨCΨC − C(Σ_{j=2}^{k-1} (j(j−1)/2)Γ_j)C,

so that

    Σ_{i=1}^∞ iC_i = F((β̃′α̃)^{-2} + (β̃′α̃)^{-1})E_1 = −C(1) + F(β̃′α̃)^{-2}E_1.

References

Beveridge, S., and C. R. Nelson (1981): "A New Approach to Decomposition of Economic Time Series into Permanent and Transitory Components with Particular Attention to Measurement of the 'Business Cycle'," Journal of Monetary Economics, 7, 151-174.

Engle, R. F., and C. W. J. Granger (1987): "Co-integration and Error Correction: Representation, Estimation and Testing," Econometrica, 55, 251-276.

Engle, R. F., and S. Kozicki (1993): "Testing for Common Features," Journal of Business and Economic Statistics, 11, 369-380.

Hansen, P. R. (2000): "Structural Changes in Cointegrated Processes," Ph.D. thesis, University of California at San Diego.

Hansen, P. R., and S. Johansen (1998): Workbook on Cointegration. Oxford University Press, Oxford.

Hylleberg, S., R. F. Engle, C. W. J. Granger, and B. S. Yoo (1990): "Seasonal Integration and Cointegration," Journal of Econometrics, 44, 215-238.

Johansen, S. (1991): "Estimation and Hypothesis Testing of Cointegration Vectors in Gaussian Vector Autoregressive Models," Econometrica, 59, 1551-1580.

Johansen, S. (1996): Likelihood-Based Inference in Cointegrated Vector Autoregressive Models, 2nd edn. Oxford University Press, Oxford.

Johansen, S., and E. Schaumburg (1998): "Likelihood Analysis of Seasonal Cointegration," Journal of Econometrics, 88, 301-339.

Lütkepohl, H., and H.-E. Reimers (1992): "Impulse Response Analysis of Cointegrated Systems," Journal of Economic Dynamics and Control, 16, 53-78.

Lütkepohl, H., and P. Saikkonen (1997): "Impulse Response Analysis in Infinite Order Cointegrated Vector Autoregressive Processes," Journal of Econometrics, 81, 127-157.

Warne, A. (1993): "A Common Trends Model: Identification, Estimation and Inference," Seminar Paper No. 555, IIES, Stockholm University.

Johansen, S, and E Schaumburg (998): Likelihood nalysis of Seasonal ointegration, Journal of Econometrics, 88, 3 339 Lütkepohl, H, and H-E Reimers (992): Impulse Response nalysis of ointegrated Systems, Journal of Enonomic Dynamics and ontrol, 6, 53 78 Lütkepohl, H, and P Saikkonen (997): Impulse Response nalysis in In nite Order ointegrated Vector utoregressive Processes, Journal of Econometrics, 8, 27 57 Warne, (993): ommon Trends Model: Identi cation, Estimation and Inference, Seminar Paper No 555, IIES, Stockholm University 7