Representation of cointegrated autoregressive processes 1 Representation of cointegrated autoregressive processes with application

Size: px
Start display at page:

Download "Representation of cointegrated autoregressive processes 1 Representation of cointegrated autoregressive processes with application"

Transcription

1 Representation of cointegrated autoregressive processes Representation of cointegrated autoregressive processes with application to fractional processes Søren Johansen Institute of Mathematical Sciences and Department of Economics University of Copenhagen Short title: Representation of cointegrated autoregressive processes Abstract We analyze vector autoregressive processes using the matrix valued characteristic polynomial. The purpose of this paper is to give a survey of the mathematical results on inversion of a matrix polynomial in case there are unstable roots, to study integrated and cointegrated processes. The new results are in the I(2) representation, which contains explicit formulas for the rst two terms and a useful property of the third. We de ne a new error correction model for fractional processes and derive a representation of the solution. Keywords: Granger representation, error correction models, integration of order and 2, fractional autoregressive model, JEL Classi cation: C32 I would like to thank the Danish Social Sciences Research Council for continuing support, the members of the ESF-EMM network for discussions and Morten Ørregaard Nielsen for useful comment. The paper is partly based on a lecture presented at the Joint Statistics Meeting, San Francisco 2003, in a session in honour of T. W. Anderson.

2 Representation of cointegrated autoregressive processes 2 Introduction The integration and cointegration properties of autoregressive processes are studied by analyzing the inverse function of the matrix polynomial generating the coe cients. This is an well known technique for stationary processes, and for I() processes this was also the approach in the original proof of the Granger Representation Theorem, see Engle and Granger (987). The purpose of this paper is to give a survey of mathematical results on the inverse of a matrix polynomial. This is given in section 2. In section 3 we then discuss the application of the results to the well known models for I(), I(2); explosive, and seasonally integrated processes. The I(2) representation is extended to give the rst two terms explicitly and a useful property of the third, which implies that the cointegrating relations are I(0) not just stationary. As an application of the representation results, we consider in section 4 a new error correction model for fractionally cointegrated processes and derive a representation of the solution, see Granger (986). 2 Expansion and inversion of matrix polynomials To establish some notation consider the p dimensional autoregressive process X t given by X t = kx i X t i + " t ; t = ; : : : ; T: () i= The solution X t is a function of initial values and the shocks " t ; which we assume to be a sequence of i.i.d. variables with mean zero and variance. We de ne the

3 Representation of cointegrated autoregressive processes 3 characteristic polynomial (z) as the p p matrix function (z) = I p kx i z i ; i= and the characteristic equation by j(z)j = 0; where j(z)j = det((z)): Let the roots of the characteristic equation be z ; : : : ; z N : We call a root z i stable if jz i j > ; unstable if jz i j and explosive if jz i j < : Thus stable = minfjz i j : z i stableg > because it is the minimum of a nite set of numbers greater than one. Note that (0) = I p implies z i 6= 0: The expression (z) = Adj((z)) ; z 6= fz ; : : : ; z N g j(z)j shows that (z) has a pole at z = fz ; : : : ; z N g; since j(z m )j = 0: If the pole is of order one it has the form C m ( z=z m ) ; so that the function H(z) de ned by H(z) = (z) C m z=z m ; z 6= fz ; : : : ; z N g has no pole at z m and has a convergent power series in a neighborhood of z = z m. We rst prove two theorems on expansion of matrix polynomials, which translated into results about the autoregressive process () will give the error correction formulations of the autoregressive equations. Note that the points of expansion need not

4 Representation of cointegrated autoregressive processes 4 be roots of the polynomial. The rst result can be interpreted as an interpolation formula, due to Legendre, which nds a polynomial which at n + points takes prescribed values. Theorem Let z i ; i = 0; ; : : : ; n be distinct and z 0 = 0. The matrix polynomial (z) has the expansion around z 0 ; z ; : : : ; z n (z) = I p p(z) + (z m ) p m(z) z + p(z)z (z); p m (z m ) z m m= where (z) is a matrix polynomial and p(z) = ny ( z=z i ); p m (z) = i= ny ( z=z i ): (2) i= i6=m Proof. The matrix polynomial R(z) = (z) I p p(z) (z m ) p m(z) z ; p m (z m ) z m m= is zero for z = z j : R(z j ) = (z j ) I p p(z j ) (z m ) p m(z j ) z j = 0; j = 0; : : : ; n; p m (z m ) z m m= because (0) = I p ; p(0) = ; p(z j ) = 0 and p m (z j ) = p m (z m ) fj=mg ; j = ; : : : ; m. Hence each entry of R(z) has p(z)z as a factor so that R(z) = p(z)z (z) for some matrix polynomial (z).

5 Representation of cointegrated autoregressive processes 5 Next we give an expansion around two points, where we can prescribe the values of the function and the rst derivative at one of the points. We denote by (z) _ the derivative and by (z) the second derivative with respec to to z. A similar result can be given for more than two points and higher derivatives, Johansen (93). Theorem 2 Let z 6= z 2 : The polynomial has the expansion (z) = (z z ) 2 (z 2 z ) 2 (z 2) + ( (z z ) 2 (z 2 z ) 2 )(z ) +(z z ) (z z 2) (z z 2 ) _ (z ) + (z z ) 2 (z z 2 ) (z) for some matrix polynomial (z): Proof. The matrix polynomial R(z) = (z) (z z ) 2 (z 2 z ) 2 (z 2) ( (z z ) 2 (z 2 z ) )(z ) (z z 2 ) (z z 2) (z (z z 2 ) _ ); satis es R(z ) = R(z 2 ) = 0 and R(z _ ) = 0: Thus each entry of R(z) has the factor (z z 2 )(z z ) 2, so that R(z) = (z z 2 )(z z ) 2 (z); as was to be proved. Corresponding to the two types of expansions in Theorems and 2, we next give a representation of the inverse polynomial under various assumptions on the roots. We apply the notation (z) = m + (z z m ) _ m + 2 (z z m) 2 m + (z z m ) 3 (z); (3) and in case the root is z = ; we leave out the subscript m.

6 Representation of cointegrated autoregressive processes 6 These results have essentially been derived before, see Johansen and Schaumburg (998) and Nielsen (2005b). We give the extra term C in the expansion (7), see Rahbek and Mosconi (999) for the I() model, and as a new result a complete expression for C in (3) and a partial result on C 0 ; (4), for the I(2) model. For a p r matrix of rank r(< p) we use the notation a = a(a 0 a) ; and a? is a p (p r) matrix of rank p r for which a 0 a? = 0: We say that the root z m has multiplicity n; if det((z)) = ( z=z m ) n g(z); g(z m ) 6= 0: Theorem 3 Let z 0 be a root of the characteristic polynomial of multiplicity n and let 0 = 0 of rank r. If the I ( ) condition j 0? _ 0? j 6= 0 (4) is satis ed, then the multiplicity n equals p r and (z) = C + C + ( z=z 0 )H(z); 0 < jz z 0 j < ; (5) z=z 0 for some > 0; where H(z) is regular and has no singularity at z = z 0 and C = z 0? ( 0? _ 0? ) 0?; (6) C = 0 + C _ _ 0 C C( _ 0 0 _ )C: (7) The proof is given in the Appendix. Note that the root z 0 need not be unstable, and that there may be other roots in the characteristic equation. This implies that we can

7 Representation of cointegrated autoregressive processes 7 only expand H(z) in a neighborhood of z 0 : When we apply the result to autoregressive processes we need to take into account the position of the other roots: Corollary 4 Let fz m ; m = ; : : : ; ng be n roots of the characteristic polynomial of multiplicity n m ; m = ; : : : ; n and let (z m ) = m 0 m of rank r m. If the I() condition (4) is satis ed for all m, then n m = p r m and (z) = m= C m z=z m + K(z); (8) for some > 0; where K(z) is has no singularities for z = z ; : : : ; z n and C m is given by (6) with (; ; z 0 ; _ 0 ) replaced by ( m ; m ; z m ; _ m ). Proof. By repeated application of Theorem 3 we can remove the poles z m ; m = ; : : : ; n: If we remove all the unstable roots, then H(z) is regular on jzj < stable ; and we can hence expand around z = 0; with exponentially decreasing coe cients because stable >. These coe cients will be used to de ne a stationary process in the proof of Granger s representation theorem. Finally let us consider the case where the I() condition (4) fails. For notational reasons we formulate the I(2) result in Theorem 5 only for the root z = : We let () = 0 have rank r < p, but also assume that 0? _? = 0 ; has reduced rank s < p r; so that the I() condition (4) is not satis ed. We de ne the mutually

8 Representation of cointegrated autoregressive processes 8 orthogonal directions (; ; 2 ) = (;? ;?? ) and (; ; 2 ) = (;? ;?? ): (9) Theorem 5 Let z = be a root so that () = 0 of rank r; and assume that 0? _? = 0 of rank s: Then, if the I(2) condition j 0 2[ _ 0 _ + 2 ] 2 j 6= 0 (0) is satis ed at z =, it holds that (z) = C 2 ( z) + C 2 z + C 0 + ( z)h(z); 0 < jz j < ; () where H(z) is regular for jz j <, and C and C 2 are given by C 2 = 2 ( ) 0 2; (2) C = 0 + [ 0 0 _ ]C2 + C 2 [ 0 _ 0 ] +C 2 [ _ _ 0 _ 0 _ 0 _... + ]C 2 ; (3) 6 where = _ 0 _ + 2 : Finally C 0 satis es 0 C 0 = I r + 0 C 2 : (4)

9 Representation of cointegrated autoregressive processes 9 Note that C satis es 0 C = 0 _ C2 ; C = C 2 _ ; (5) 0 C = 0 (I p C 2 ); 0 C = I s : (6) The proof is given in the Appendix. Note that in (36) of the proof appears the relation j(z)j = ( z) p r jk(z)jja 0 Bj : This is relation (i) (det(g()) = r g(); = z) from Lemma in Engle and Granger (987) 2. The I() condition is equivalent to g(0) = jkjja 0 Bj 6= 0; or that the multiplicity of the root is the dimension minus the rank of : Thus the idea of applying a general theorem about inversion of polynomials to discuss the solution of the error correction model is found in Engle and Granger (987). The approach taken here gives some information on the conditions under which the matrix function can be inverted and leads to the explicit formulae for some of the coe cients. We start with the error correction model which we usually t to data, and then nd conditions under which the solution is I() or I(2). The approach taken by Granger was to start with an I() process, which exhibits cointegration, and then derive the (in nite lag) error correction model that it satis es. In both cases the proof of the result involves the inversion of the matrix function. For I(d) processes similar 2 A counter example to Lemma in the paper by Engle and Granger (987) is given by taking G() = diag( + ; 2 ; 2 ): What is missing in Lemma is a condition corresponding to the I() condition of cointegration (see Johansen 996, Theorem 4.5.)

10 Representation of cointegrated autoregressive processes 0 results were developed by la Cour (998). A systematic exposition of the mathematics behind the representations for I() and I(2) processes is given by Faliva and Zoia (2006). Another type of proof of these results applies the Smith form of a matrix polynomial, which is a reduction of a matrix polynomial to diagonal form using matrix polynomials with determinant one, so that the question of inverting the matrix is reduced to discussing a diagonal matrix, see Engle and Yoo (99) or Ahn and Reinsel (994) and the survey by Haldrup and Salmon (998). In this context, the I() condition is most naturally given as the assumption that the number of unit roots of (z) equals p minus the rank of (). The equivalence of the two conditions is given in Johansen (996, Corollary 4.3), see also Anderson (2003) for a simple proof. Yet another way of considering these results is via the Jordan representation of the companion form, where the I() condition is formulated as the absence of Jordan blocks (corresponding to the unit root) of order two in the companion form of the matrix polynomial, see Archontakis (998), Gregoire (999), Bauer and Wagner (2005), and Hansen (2005) for further details. 3 The autoregressive model and its solution The simplest example of the relation between the solution of the autoregressive equations and the inverse of the characteristic polynomial is given by the classical representation of a stationary autoregressive process, see for instance Anderson (974).

11 Representation of cointegrated autoregressive processes Theorem 6 If the roots of the characteristic equation are stable, then (z) has no singularities for jzj < stable : The expansion (z) = X C i z i ; jzj < stable i=0 de nes exponentially decreasing coe cients fc i g i=0: The stationary solution of the equation (L)X t = " t is given by X t = X C i " t i : i=0 3. Error correction models The error (or equilibrium) correction formulation is a convenient way of rewriting the autoregressive equations when one takes into account the existence of roots. We apply the results of Theorem and Theorem 2 to formulate two error correction models. Corollary 7 Let z = z m ; m = ; : : : ; n be distinct roots of (z), and let (z m ) = m 0 m: Then the process X t satis es the error correction model: p(l)x t = m 0 p m (L) m z m p m (z m ) X t m= Xk n + p(l) i= ix t i + " t ; (7) for some matrix coe cients i; i = ; : : : ; k n; where p(z) and p m (z) are given by (2). Proof. This follows from Theorem by expanding around (z) around 0; z ; : : : ; z n.

12 Representation of cointegrated autoregressive processes 2 Corollary 8 Let z = be a root so that () = 0 : Then the process X t satis es the error correction model 2 X t = 0 X t X t + Xk 2 i= i 2 X t i + " t ; (8) where = _ 0 ; for some matrix coe cients i; i = ; : : : ; k 2: Proof. This follows by choosing z = and z 2 = 0 in Theorem 2. Note that the formulation (8) has no assumption about the process being I(2); only that z = is a root. We next turn to the solution of the error correction models where we have to be more precise about the roots and the assumptions on the coe cients. 3.2 Granger s representation theorem The next result is a representation of the solution of (L)X t = " t. The equations determine the process X t as a function of the errors " i (i = ; : : : ; t) and the initial values of the process, but the integration properties of X t depends on the unstable roots I() processes We rst give a general result for I() processes and then apply it to some special cases Theorem 9 Let z m ; m = ; : : : ; n be the distinct unstable roots of (z). Then, if the I() condition is satis ed for all the unstable roots, X t is non-stationary, but p(l)x t and p m (L) 0 mx t can be made stationary with mean zero by a suitable choice of initial

13 Representation of cointegrated autoregressive processes 3 values. In this case, X t has the representation X t = C m S mt + m= m= z t m A m + Y t ; t = ; : : : ; T; (9) where S mt = z t m P t i= zi m" i ; Y t is stationary, C m is given by (6), and nally A m depends on initial values and are chosen so that 0 ma m = 0: If z m = is the only unstable root we nd X t = C tx " i + C " t + Z t + A; (20) i= where Z t is stationary, and C and C are given by (6) and (7), and 0 A = 0. Proof. Under the I() condition it follows from (8), by multiplying by p(z); that p(z) (z) = C m p m (z) + p(z)h(z); jzj < stable ; m= for some regular function H with no singularities for jzj < stable because we have removed all the unstable roots. We then expand H(z) = P i=0 C i z i ; jzj < stable with exponentially decreasing coe cients and de ne the stationary process Y t = P i=0 C i " t i : We nd, by multiplying (L)X t = " t by p(l) (L); that X t satis es the di erence equation We rst show that X t p(l)x t = C m p m (L)" t + p(l)y t : (2) m= = P n m= C ms mt + Y t is a solution of this equation. Because the process S mt = z t m P t i= zi m" i is a solution of the equation ( L=z m )S mt = " t ; and

14 Representation of cointegrated autoregressive processes 4 p(z) = ( z=z m )p m (z); we get that p(l)xt = C m p m (L)( L=z m )S mt + p(l)y t = C m p m (L)" t + p(l)y t : m= m= The complete solution is then given by adding the solution X 0 t to the homogeneous equation p(l)x 0 t = 0; which is given by X 0 t = P n m= z t m A m for arbitrary coe cient matrices A m : Thus X t = X t + X 0 t represents all possible solutions to (2). We see that p(l)x t = p(l)x t is stationary for any choice of coe cients A m ; and next turn to the stationarity of p k (L) 0 kx t p k (L) 0 kx t = 0 kc m p k (L)S mt + m= m= p k (L)z t m 0 ka m + p k (L) 0 ky t : (22) We use the relation p k (z) = p km (z)( z=z m ); p km (z) = Q j6=k;m ( z=z j) and show that the rst term is stationary: 0 kc m p k (L)S mt = m= 0 kc m p km (L)( L=z m )S mt = m= m6=k 0 kc m p km (L)" t m= m6=k because 0 kc k = 0 and ( L=z m )S mt = " t : The second term of (22) is p k (L)z t k 0 ka k + p km (L)( m= m6=k L=z m )z t m 0 ka m = p k (z k )z t k 0 ka k ; because ( L=z m )z t m = 0; and p k (L)z t k = p k (z k )z t k : Hence, if we choose 0 ka k = 0; the second term vanishes. Finally the third term of (22) is stationary, so that p k (L) 0 kx t

15 Representation of cointegrated autoregressive processes 5 is stationary with mean zero. If z m = is the only stable root we get from (5) the representation (20) in the same way. We next give some special cases for well known error correction models. The I() model. When z = is the only unstable root we get the usual I() model where the error correction form is given by (7) in Corollary 7 for z m = ; so that p(l) = L = ; and the error correction model becomes X t = 0 X t + Xk i= ix t i + " t : (23) The I() condition (4) asserts that j 0??j 6= 0; where = I p P k i= i: We nd the representation (20). For inference in this model, see for example Johansen (996). Note that from (20) it follows that X t and 0 X t = 0 C " t + 0 Y t are stationary. In fact 0 C = of X t can be I( I r ; so that 0 C 6= 0 and 0 X t is I(0); and no linear combination ): Similarly there can not be multi cointegration in the I() model in the sense that X t cannot cointegrate in a non-trivial sense with P t i= 0 X i ; see Granger and Lee (990). In case 0 P t i= 0 X i + 0 2X t were stationary, we would have 0 0 C + 0 2C = 0: Multiplying by we get 0 = 0 0 C = 0 ; so that = 0: For details see Engsted and Johansen (999). The model for seasonal roots. The I() model for quarterly seasonal roots is found if there are roots at z = ; i; see Hylleberg, Engle, Granger and Yoo (990), Ahn and Reinsel (994), and Schaumburg and Johansen (998), where inference is discussed. We illustrate here the simpler case of half year data with roots at z = ;

16 Representation of cointegrated autoregressive processes 6 see Lee (992). We de ne () = 0 ; ( ) = 0 : The error correction form of the equation is given by (7) in Corollary 7 with z = ; z 2 = : Because p(l) = ( L)( + L) = ( L 2 ); p (L) = ( + L); p (L) = ( L) we nd the error correction equation ( L 2 )X t = 2 ( + L) 0 X t 2 ( L) Xk 2 0 X t + ( L 2 ) i X t i= i + " t and the solution, if the I() condition (4) is satis ed at z =, is given by (9), which in this case becomes X t = C tx tx " i + C ( ) t ( ) i " i + A + ( ) t A + Y t ; i= i= where C and C are given by (6) for z m = ; and 0 A = 0 and 0 A = 0. The non-stationary behavior of the process X t is governed by the two processes S t = tx tx " i and S t = ( ) t ( ) i " i : i= i= The rst is a random walk, and P t i= ( )i " i is also a random walk, if " t has a symmetric

17 Representation of cointegrated autoregressive processes 7 distribution, but the factor ( ) t shows that S t is not a random walk. It is a nonstationary process that satis es the basic equation ( + L)S t = " t ; so that yearly aggregates of S ( ) t are stationary, whereas ( L)S t = " t shows that the di erence within a year of S t is stationary. The interpretation of cointegration in this case is not so simple. What we see is a process, which is non-stationary and even the di erence X t X t is non-stationary. If we take instead the yearly data X year t = X t + X t ; then the di erence X year t = X t + X t (X t + X t 2 ) = X t X t 2 = ( L 2 )X t is stationary. Cointegration means that even though X t X t is non-stationary, there are linear combinations for which 0 (X t X t ) is stationary. Similarly there are combinations for which 0 (X t + X t ) is stationary. Models for explosive roots. If the characteristic polynomial has one real explosive root < ; and a root at z = ; then the error correction model is found from (7) by choosing z = ; z 2 = gives p(l) = ( L)( L) = ; p (L) = ( L); p () = ; p (L) = ( L); p () = :

18 Representation of cointegrated autoregressive processes 8 The error correction model is given by X t = ( =) 0 X t + ( ) Xk 2 0 X t + i X t i + " t : i= The process has the representation, see (9), X t = C tx tx " i + C t i " i + t A + A + Y t ; i= i= provided the I() condition (4) is satis ed at z = and z =. Note that the process is non-stationary because of the random walk, but this time there is also an explosive process t P t i=0 i " i : The sum P t i=0 i " i actually converges for t! ; and the explosiveness is due to the factor t! : Note that 0 X t is not stationary. It is explosive and only ( L=) 0 X t is stationary. Similarly 0 X t is not stationary but I(): For inference the main references are the papers by Nielsen (2002, 2005a, 2005b), see also Anderson (959) I(2) processes We next turn to I(2) processes and assume that z = is a root, so that = 0 of rank r, but 0? _? = 0 of rank s; so that the I() condition is not satis ed. This determines orthogonal directions (; ; 2 ) and (; ; 2 ); which are needed in the formulation of the results, see (9). Theorem 0 Let z = be the only unstable root, and assume that the I() condition (4) does not hold. If the I(2) condition (0) is satis ed for z =, then X t is non-

19 Representation of cointegrated autoregressive processes 9 stationary, but 2 X t ; (; ) 0 X t ; and 0 X t + 0 _ Xt can be made I(0) by a suitable choice of initial values, and then X t has the representation X t = C 2 tx ix tx " j + C " i + Y t + A + Bt; (24) i= j= i= where C and C 2 are given by (2) and (3), Y t is I(0) and A and B depend on initial values and are chosen so that 0 A + 0 _ B = 0 and (; ) 0 B = 0: The linear combination 0 X t + 0 _ Xt is called the polynomial cointegrating relation. Proof. We follow the proof of Theorem 9. From () we nd ( z) 2 (z) = C 2 + C ( z) + ( z) 2 H(z); jzj < stable ; where H(z) is a regular function with no singularities for jzj < stable ; and hence given by H(z) = P i=0 C i z i ; jzj < stable ; where C i are exponentially decreasing. We de ne the stationary process Y t = P i=0 C i " t i ; and note that C 0 = P i=0 C i 6= 0 because of (4), so that Y t is I(0): Next apply 2 (L) to (L)X t = " t : We then nd the di erence equation 2 X t = C 2 " t + C " t + 2 Y t ; t = ; : : : ; T: The process X t = C 2 tx ix tx " j + C " i + Y t i= j= i=

20 Representation of cointegrated autoregressive processes 20 is clearly a solution, and the complete solution is found by adding the solution to the homogenous equation 2 X 0 t = 0; which is given by X 0 t = A + Bt; that is, any solution has the form X t = X t + A + Bt: It follows that 2 X t = 2 X t is stationary with mean zero for any choice of A and B: We want to choose A and B so that (; ) 0 X t ; and 0 X t + 0 _ Xt become stationary with mean zero. We nd (; ) 0 X t = (; ) 0 C " t + (; ) 0 Y t + (; ) 0 B; which is stationary if B is chosen so that (; ) 0 B = 0: Moreover it is I(0) because, by (6), (; ) 0 C = (0; I s ) 6= 0: Next tx tx 0 X t + 0 Xt _ = 0 C " i + 0 Y t + 0 A + 0 Bt + 0 (C2 _ " i + C " t + Y t + B) i= i= = 0 Y t + 0 _ C " t + 0 _ Yt + 0 A + 0 _ B; because 0 B = 0 and 0 C + 0 C2 _ = 0; see (5). We next choose 0 A + 0 B _ = 0; that is, A = 0 B: _ This completes the proof of the representation (24). Note that 2 X t is I(0) because C 2 6= 0; (; ) 0 X t is I(0) because (; ) 0 C 6= 0; and nally 0 X t + 0 Xt _ is I(0) because the sum of the coe cients is 0 C C _ 6= 0; because (4) and (5) show that 0 C _ C = I r + 0 C 2 0 _ C2 _ = I r + 0 C 2 0 C 2 = Ir :

21 Representation of cointegrated autoregressive processes 2 For I(2) processes we usually apply the error correction model (8), where satis es 0?? = 0 : Inference can be found in Boswijk (2000), Johansen (997, 2006) and Paruolo (996, 2000). Note that the representations (9) and (24) are chosen so that it follows directly from (9) that X t and 0 X t are stationary, and from (24) it is seen that 2 X t ; (; ) 0 X t and 0 X t + 0 _ Xt are stationary. 4 Fractional processes In this section we discuss some models for cofractional processes that have been analyzed in the literature and suggest a modi cation of the model by Granger (986), which admits a feasible representation theory. We present here the rst results for this new model and they will be followed up in a companion paper Johansen (2007), with the subsequent aim of developing model based inference for fractional and cofractional processes based upon the model. 4. Basic de nitions The concept of the fractional di erence was introduced by Granger and Joyeux (980) and Hosking (98) to discuss processes with a long memory. The fractional operator is de ned by the expansion d = ( L) d = X k=0 d ( ) k L k ; k

22 Representation of cointegrated autoregressive processes 22 and we shall use the de nition Xt d d + = ( ) k L k ; k k=0 so that d +X t = d X t ft 0g; see for instance Robinson and Iacone (2005). We denote by I(0) + the class of asymptotically stationary processes of the form Xt X t = ft 0g C i " t i ; i=0 where 0 < tr( P i=0 C i) 0 ( P i=0 C i) <. De nition We say that Y t is fractional of order d; and write Y t 2 F(d) if d +Y t t 2 I(0) + for some function, t ; of the initial values. De nition 2 If Y t 2 F(d) with d > 0 and there exists a linear combination 6= 0 so that 0 Y t 2 F(d b) for 0 < b d we call Y t cofractional, F(d; b), with cofractional vector, and the number of linearly independent such vectors is the cofractional rank. 4.2 Some models for cofractional processes A parametric model for cofractional processes, which satisfactorily describes the variation of the data, should ideally have the properties The model is a dynamic model that generates F(d; b) processes The cofractional relations as well as the adjustment towards these are modelled

23 Representation of cointegrated autoregressive processes 23 The models for di erent cofractional rank should be nested A representation theory for the solution should allow a discussion of the properties of the process, depending on the parameter values If we can nd such a model we can conduct inference by deriving estimators and tests of relevant hypotheses on long-run and short-run parameters, based upon the Gaussian likelihood function. The rst parametric model for cofractional processes F(d; b) was suggested by Granger (986) as a modi cation of the usual error correction model: A (L) d X t = ( b ) d b 0 X t + d(l)" t ; Here A (L) is a lag polynomial and d(l) is a scalar function. If we set d(l) = and change X t to X t and 0 to 0 ; we get the model A (L) d X t = ( b ) d b 0 X t + " t : (25) This is a dynamic parametric model which models both short-run and long-run parameters. The model was applied by Davidson (2002), and Lasak (2005) derives a likelihood ratio test for the hypothesis r = 0 when d = and nds the asymptotic distribution.

24 Representation of cointegrated autoregressive processes 24 Most authors consider the simpler regression type model, which explicitly models the order of cofractionality: X t = 0 X 2t + d+b + u t (26) X 2t = d + u 2t ; where u t is a general invertible I(0) process. Here X has dimension r and X 2 dimension p r: There is a large literature, based upon (26), aimed at developing (a semiparametric) estimation theory for fractional and cofractional processes, see for instance Robinson and Marinucci (200), Chen and Hurvich (2003) or Nielsen and Shimotsu (2006) to mention a few. In this literature the purpose is mainly to conduct inference on the order of fractionality and test for the occurrence of cofractionality. The relations between (26) and (25) is best seen by de ning the matrix = ( I r ; 0) 0 and = (I r ; 0 ) 0 : Then (26) can be written in an error correction form as follows: d X t = ( b ) d b 0 X t + v t ; (27) where v t = (u 0 t + u 0 2t; u 0 2t) 0 : For parametric inference in (26), a model for v t is needed, and Sowell (989) suggested that v t could be modelled as an invertible and stationary ARMA model and derived a computable expression for the variance of the fractional stationary process. For this model Dueker and Startz (998) applied Gaussian maximum likelihood esti-

25 Representation of cointegrated autoregressive processes 25 mation, using the results by Sowell. Note that for an ARMA error A(L)v t = B(L)" t ; with " t i.i.d.(0; ), we get the model A(L) d X t = A(L)( b ) d b 0 X t + B(L)" t ; which is close to the error correction formulation (25). Breitung and Hassler (2002) suggested to analyze model (27), where is an unknown parameter and construct test for the rank of and based on the assumption that d is known, and v t is an AR process (B(L) = I p ) with i.i.d. Gaussian errors. The representation theory for model (26) is easy, as X t 2 F(d; b); and X t 0 X 2t = 0 X t 2 F(d b): The adjustment ; however, is assumed known, and so is the cofractional rank r, so the models for di erent cofractional rank are not nested. The representation theory for Granger s model (25) is not so easy due to the occurrence of both the lag operator and the fractional lag operator L b = b ; which appears natural for fractional processes. We therefore propose a di erent model d X t = ( b ) 0 d b X t + kx i= i d ( b ) i X t + " t : (28) In this model the polynomial in the lag operator is replaced by a polynomial in the fractional lag operator. This model satis es the requirements for a parametric model for cofractional processes, as we shall now show that there is a feasible representation theory.

26 Representation of cointegrated autoregressive processes A representation theorem for cofractional processes We associate with (28) the (regular) characteristic function (z) = ( z) d I p 0 ( ( z) b )( z) d b kx i= i( ( z) b ) i ( z) d but also the polynomial (u) = ( u)i p 0 u kx i= iu i ( u); which is related to (z) via the substitution u = ( z) b : (u) = ( z) b d (z): Note that ( z) b and hence (z) has a singularity at z = ; unless d and b are nonnegative integers. By introducing the variable u = ( z) b the function ( z) b d (z) becomes a polynomial and the conditions on the roots are in terms of this polynomial, which is the characteristic polynomial for the usual I() model, see (23). We want to nd the solution of equation (28) under the I() condition for (u). A condition for the process to be fractional of order d and cofractional of order d b is given in Johansen (2007). We de ne = I p P k i= i: Theorem 3 If u = is the only unstable root of j (u)j = 0; if () = 0 ; and

27 Representation of cointegrated autoregressive processes 27 if the I() condition holds j 0?? j 6= 0; (29) then for jzj < ; (z) = X D i ( z) d+bi + ( z) d+b(n+) Di z i ; (30) i=0 i=0 where n = [(d b)=b] is chosen so that d + nb < 0 d + (n + )b: The rst two matrices are given by D 0 =? ( 0?? ) 0?; (3) D = [ D 0 + D D 0 ( + kx i i )D 0 ]: (32) i= The solution of (L)X t = " t has the representation X t = i=0 D i d+ib + " t + d+(n+)b + Y + t + t ; t = ; 2; : : : (33) where the rst n + terms are fractional of order d; d b; : : : ; d nb and Y + t = P t i=0 D i " t i. The function t depends on initial values. If Y t 2 I(0) + ; it follows that X t is F(d) and 0 X t is F(d b): Proof. Under the I() condition (29), (u) can be inverted using (5) with z 0 = ; D 0 = C; D = C ; so that ( u) (u) = D 0 + D ( u) + ( u) 2 H(u)

28 Representation of cointegrated autoregressive processes 28 is a regular function for juj < stable : We expand as ( u) (u) = X D i ( u) i + ( u) n+ C i u i ; juj < stable ; i=0 i=0 where n is chosen so that d + nb < 0 d + (n + )b; and the coe cients C i are exponentially decreasing. For z small we nd that ( z) d (z) = = X D i ( z) ib + ( z) (n+)b C i ( ( z) b ) i i=0 X D i ( z) ib + ( z) (n+)b Di z i ; (34) i=0 i=0 i=0 which proves (30). Here D 0 and D are given by (6) and (7), and substituting _ = 0 ; 2 = kx i= i i we get (3) and (32). In order to prove (33), we write (28) as " t = (L)X t = + (L)X t + (L)X t and multiply by (L) + to nd (L) + " t = X t t ; (35) where t = + (L) (L)X t is a function of initial values of X t. We let Y + t =

29 Representation of cointegrated autoregressive processes 29 P t i=0 D i " t i and nd the representation (33) X t = t + (L) + " t = t + i=0 D i d+ib + " t + d+(n+)b + Y + t : It follows that d +X t = d + t + D 0 " t + D b +" t + D 2 2b + " t + + D n nb + " t + (n+)b + Y + t : If Y + t 2 I(0) + ; we see that X t is fractional of order d because D 0 6= 0; and that 0 X t is fractional of order d b because 0 D 0 = 0; and 0 D 6= 0: The representation in (33) for d = b = ; and hence n = 0; is X t = D 0 tx Xt " i + Di " t i + t ; t = ; 2; : : : i= i=0 This should be compared with (20), where we have chosen a representation in terms of the stationary process C " t + Z t and given a more explicit discussion of the role of the initial values, rather than just the term t. 5 Conclusion This paper discusses the connection between the solution to the autoregressive equations and the inverse of the generating function of the coe cients under di erent conditions on the coe cient matrices. We apply these results to nd representations of cointegrated vector autoregressive processes which are I(); I(2), seasonally integrated

30 Representation of cointegrated autoregressive processes 30 and explosive. For cofractional processes we suggest a modi cation of the model suggested by Granger (986) for which the above results can be applied to give a representation theory. 6 Appendix Proof of Theorem 3. There is no loss of generality in assuming that z 0 = ; as the polynomial (uz 0 ) has a root at u = ; and satis es the I() condition at u = ; if (z) does at z = z 0 : Note that the derivative for u = is z 0 _ (z0 ): We therefore assume that z 0 = and leave out the subscript 0 for notational convenience. Let = 0 have rank r; and let C and C be given by (6) and (7). Note that (z) is a regular function with singularities at the zeros of the characteristic equation. We want to show that the function H(z) = ((z) C z C )=( z); 0 < jz j < ; extended by continuity at z = is regular for jz j < : Multiplying (3) by A = (;? ) and B = ;? we obtain 0 0 A 0 (z)b = I r + (z ) _ 00 (z ) _ 0 (z ) _ 0 (z ) _ C A + (z ) C A +(z ) 3 A 0 (z)b;

31 Representation of cointegrated autoregressive processes 3 where _ 00 = 0 _ ; _ 0 = 0? _ ; etc. We can factorize (z ) from the last p r columns by multiplying from the right by F (z) = 0 I r 0 0 (z ) I p r C A ; and we nd that K(z) = A 0 (z)bf (z) = K + (z ) _ K + (z ) 2 K (z) where 0 0 K = I r _ 0 0 _ C A ; _K = _ _ 0 2 C A : From jk(z)j = ja 0 Bjj(z)j( z) (p r) ; z 6= ; (36) it is seen that K(z) has the same roots as (z); except for z = ; where jk()j = jkj = j _ j = j 0? _? j 6= 0; if the I() condition (4) is satis ed for z = : It follows that the multiplicity of the root at z = is p r; and that K(z) is invertible for jz j < " for some small : K (z) = K (z )K _KK + (z ) 2 H (z)

32 Representation of cointegrated autoregressive processes 32 where H (z) is regular for jz j < ; and K = K _KK = 0 0 I r _ 0 _ 0 _ C A _ 00 0 _ 0 ( 0 + _ 0 _ ) _ 0 C A with ij = _ i0 _ 0j + 2 ij ; i; j = 0; : (37) Multiplying by F (z) from the left we nd F (z)k (z) = (z ) M + M 0 + (z )H 2 (z); z 6= with M = _ 0 C A ; M 0 = I r _ 0 _ 0 C A for a function H 2 (z) which is regular for jz j < : Therefore, multiplying by B and A 0 ; we nd (z) = BF (z)(a 0 (z)bf (z)) A 0 = BF (z)k(z) A 0 = (z )? _ 0? 0 +? _ 0 _ 0?? 0? +(z )BH 2 (z)a 0 = C( z) 0 + C _ _ C C( _ 0 _ + 2 )C + (z )H 3 (z);

33 Representation of cointegrated autoregressive processes 33 where C is given in (6) and H 3 (z) is a regular function for j zj < : Thus the pole at z = can be removed by subtracting the term C( z) : This proves (5), (6), and (7). Proof of Theorem 5. We de ne A = (; ; 2 ) and B = ( ; ; 2 ): Because = 0 and 0? _? = 0 a calculation will show that A 0 (z)b = 0 I r + (z ) _ 00 (z ) _ 0 (z ) _ 02 (z ) _ 0 (z )I s 0 (z ) _ (z )2 f ij g + 6 (z )3 f... ijg + (z C A ) 4 (z): Next we multiply from the right by the matrix 0 F (z) = I r 0 (z ) _ 02 0 (z ) I s (z ) 2 I p r s C A to get K(z) = A 0 (z)bf (z) = K + (z ) _ K + (z ) 2 (z)

34 Representation of cointegrated autoregressive processes 34 where K = _K = 0 0 I r _ 0 _ 00 _ I s _ 0 _ _ 20 _ _ _ + 6 _ _ + 6 _ _ C A : C A Note that K has full rank because j _ 20 _ j = j 0 2( _ 0 _ + 2 ) 2 j 6= 0: For 0 < jz j < K (z) = K (z )K _KK + (z ) 2 H (z); where H (z) is regular for jz j < : With ij given in (37) we nd 0 0 K = I r _ I s ; K = C B I r _ 0 ( 02 _ 0 2 ) 22 0 I s ; C A so that F (z)k (z) = ( z) 2 M 2 + ( z) M + M 0 + ( z)h 2 (z)

35 Representation of cointegrated autoregressive processes 35 where H 2 (z) is regular for jz j < and M 2 = ; M = B C _ I s _ = 22 [ _ _ 02 _ + 2 _ _ ] 6 ; C A 22 : The matrix M 0 is rather complicated, but it su ces to nd that it has the form 0 M 0 = I r + _ _ 20 : C A Finally we use (z) = BF (z)(a 0 (z)bf (z)) A 0 = BF (z)k(z) A 0 to nd C 2 = ; C = 0 + ( 2 _ 02 ) ( 2 0 _ 20 0 ) ; 0 C 0 = I r + 0 _ C2 _ which reduces to (2), (3), and (4). This completes the proof of Theorem 5:

36 Representation of cointegrated autoregressive processes 36 7 References References [] Ahn, S. K., Reinsel, G. C. (994). Estimation of partially non-stationary vector autoregressive models with seasonal behavior. Journal of Econometrics 62: [2] Anderson, T. W. (959). On asymptotic distributions of estimates of parameters of stochastic di erence equations. Annals of Mathematical Statistics 30: [3] Anderson, T. W. (97). The Statistical Analysis of Time Series. New York: Wiley. [4] Anderson, T. W. (2003). A simple condition for cointegration. Discussion paper, Stanford University. [5] Archontakis, F. (998). An alternative proof of Granger s representation theorem for I() systems through Jordan matrices. Journal of the Italian Statistical Society 7: 27. [6] Bauer, D., Wagner, M. (2005). A canonical form for unit root processes in the state space framework, Discussion paper, University of Bern. [7] Boswijk, P. H. (2000). Mixed Normality and Ancillarity in I(2) Systems. Econometric Theory 6:

37 Representation of cointegrated autoregressive processes 37 [8] Chen, W. W., Hurvich, C. M. (2003). Semiparametric estimation of multivariate fractional cointegration. Journal of the American Statistical Association 98: [9] Davidson, J. (2002). A model of fractional cointegration, and tests for cointegration using the bootstrap. Journal of Econometrics 0: [0] Dittmann, I. (2004). Error correction models for fractionally cointegrated time series. Journal of Time Series Analysis 25: [] Duecker, M., Startz, R. (998). Maximum likelihood estimation of fractional cointegration with an application to US and Canada bond rates. Review of Economics and Statistics 83, [2] Engle, R. F., Granger, C. W. J. (987). Co-integration and error correction: representation, estimation and testing. Econometrica 55: [3] Engle, R. F., Yoo, S. B. (99). Cointegrated economic time series: an overview with new results. In: Long-run Economic Relations: Readings in Cointegration. Oxford: Oxford University Press, pp [4] Engle R. F., Granger, C. W. J., Hylleberg, S., and H. S. Lee, (993). Seasonal cointegration: The Japanese consumption function. Journal of Econometrics 55:

38 Representation of cointegrated autoregressive processes 38 [5] Engsted, T., Johansen, S. (999). Granger s representation theorem and multicointegration. In: Cointegration, Causality and Forecasting. A Festschrift in Honour of Clive W. J. Granger. Oxford: Oxford University Press, pp [6] Faliva, M., Zoia, M. G. (2006). Topics in Dynamic Model Analysis. Lecture Notes in Economics and Mathematical Statistics, Springer, New York. [7] Granger, C. W. J. (986). Developments in the study of cointegrated economic variables. Oxford Bulletin of Economics and Statistics 48: [8] Granger, C. W. J., Lee, T. -H. (990). Multicointegration. In: Advances in Econometrics: Cointegration, Spurious Regressions and unit Roots. Greenwich: JAI Press, pp [9] Granger, C. W. J., Jouyoux, R. (980). An introduction to long memory time series models and fractional di erencing. Journal of Time Series Analysis :5 29. [20] Gregoir, S. (999). Multivariate time series with various hidden unit roots, Part I, Integral operator algebra and representation theorem. Econometric Theory 5: [2] Hansen, P. R. (2005). Granger s representation theorem. A closed form expression for I() processes. The Econometrics Journal 8: [22] Haldrup, N., Salmon, M. (998). Representations of I(2) cointegrated systems using the Smith-McMillan form. Journal of Econometrics 84: [23] Hosking, J. R. M. (98). Fractional di erencing. Biometrika 68:65 76.

39 Representation of cointegrated autoregressive processes 39 [24] Hylleberg, S., Engle, R. F., Granger, C. W. J., Yoo, S. B. (990). Seasonal integration and cointegration. Journal of Econometrics 44: [25] Johansen, P. (93). Über osculierende Interpolation. Skandinavisk Aktuarietidsskrift, [26] Johansen, S. (992). A representation of vector autoregressive processes integrated of order 2. Econometric Theory 8: [27] Johansen, S. (996). Likelihood Based Inference on Cointegration in the Vector Autoregressive Model., Oxford: Oxford University Press. [28] Johansen, S. (997). Likelihood Analysis of the I(2) Model. Scandinavian Journal of Statistics 24: [29] Johansen, S. (2007). A representation theory for a class of vector autoregressive models for fractional processes. Forthcoming Econometric Theory. [30] Johansen, S. (2006). The statistical analysis of hypotheses on the cointegrating relations in the I(2) model, Journal of Econometrics 32:8 5. [3] Johansen, S., Schaumburg, E. (998). Likelihood analysis of seasonal cointegration. Journal of Econometrics 88: [32] la Cour, L. (998). A parametric characterization of integrated vector autoregressive (VAR) processes. Econometric Theory 4: [33] Lee, H. S., (992). Maximum likelihood inference on cointegration and seasonal cointegration. Journal of Econometrics 54: 47.

40 Representation of cointegrated autoregressive processes 40 [34] Nielsen, B. (200). The asymptotic distribution of unit root tests of unstable autoregressive processes. Econometrica 69:2-29. [35] Nielsen, B. (2005a). Strong consistency results for least squares estimators in general vector autoregressions with deterministic terms. Econometric Theory 2: [36] Nielsen, B. (2005b). Analysis of co-explosive processes. Discussion paper, W8, Nu eld, Oxford. [37] Nielsen, M. Ø., Shimotsu, K. (2006). Determining the cointegrating rank in nonstationary fractional systems by the exact local Whittle approach. Forthcoming in Journal of Econometrics. [38] Paruolo, P. (996). On the determination of integration indices in I(2) systems. Journal of Econometrics, 72: [39] Paruolo, P. (2000). Asymptotic e ciency of the two step estimator in I(2) systems. Econometric Theory 6: [40] Rahbek, A. C., Mosconi, R. (999). Cointegration rank inference with stationary regressors in VAR models. Econometrics Journal 2:76 9. [4] Robinson, P. M., Iacone, F. (2005). Cointegration in fractional systems with deterministic trends. Journal of Econometrics 29: [42] Robinson, P. M., Marinucci D. (200). Semiparametric fractional cointegration analysis. Journal of Econometrics 05:

UNIVERSITY OF CALIFORNIA, SAN DIEGO DEPARTMENT OF ECONOMICS

UNIVERSITY OF CALIFORNIA, SAN DIEGO DEPARTMENT OF ECONOMICS 2-7 UNIVERSITY OF LIFORNI, SN DIEGO DEPRTMENT OF EONOMIS THE JOHNSEN-GRNGER REPRESENTTION THEOREM: N EXPLIIT EXPRESSION FOR I() PROESSES Y PETER REINHRD HNSEN DISUSSION PPER 2-7 JULY 2 The Johansen-Granger

More information

(Y jz) t (XjZ) 0 t = S yx S yz S 1. S yx:z = T 1. etc. 2. Next solve the eigenvalue problem. js xx:z S xy:z S 1

(Y jz) t (XjZ) 0 t = S yx S yz S 1. S yx:z = T 1. etc. 2. Next solve the eigenvalue problem. js xx:z S xy:z S 1 Abstract Reduced Rank Regression The reduced rank regression model is a multivariate regression model with a coe cient matrix with reduced rank. The reduced rank regression algorithm is an estimation procedure,

More information

Time Series Models and Inference. James L. Powell Department of Economics University of California, Berkeley

Time Series Models and Inference. James L. Powell Department of Economics University of California, Berkeley Time Series Models and Inference James L. Powell Department of Economics University of California, Berkeley Overview In contrast to the classical linear regression model, in which the components of the

More information

by Søren Johansen Department of Economics, University of Copenhagen and CREATES, Aarhus University

by Søren Johansen Department of Economics, University of Copenhagen and CREATES, Aarhus University 1 THE COINTEGRATED VECTOR AUTOREGRESSIVE MODEL WITH AN APPLICATION TO THE ANALYSIS OF SEA LEVEL AND TEMPERATURE by Søren Johansen Department of Economics, University of Copenhagen and CREATES, Aarhus University

More information

Parametric Inference on Strong Dependence

Parametric Inference on Strong Dependence Parametric Inference on Strong Dependence Peter M. Robinson London School of Economics Based on joint work with Javier Hualde: Javier Hualde and Peter M. Robinson: Gaussian Pseudo-Maximum Likelihood Estimation

More information

GMM-based inference in the AR(1) panel data model for parameter values where local identi cation fails

GMM-based inference in the AR(1) panel data model for parameter values where local identi cation fails GMM-based inference in the AR() panel data model for parameter values where local identi cation fails Edith Madsen entre for Applied Microeconometrics (AM) Department of Economics, University of openhagen,

More information

Notes on Time Series Modeling

Notes on Time Series Modeling Notes on Time Series Modeling Garey Ramey University of California, San Diego January 17 1 Stationary processes De nition A stochastic process is any set of random variables y t indexed by t T : fy t g

More information

Bootstrapping the Grainger Causality Test With Integrated Data

Bootstrapping the Grainger Causality Test With Integrated Data Bootstrapping the Grainger Causality Test With Integrated Data Richard Ti n University of Reading July 26, 2006 Abstract A Monte-carlo experiment is conducted to investigate the small sample performance

More information

LECTURE 12 UNIT ROOT, WEAK CONVERGENCE, FUNCTIONAL CLT

LECTURE 12 UNIT ROOT, WEAK CONVERGENCE, FUNCTIONAL CLT MARCH 29, 26 LECTURE 2 UNIT ROOT, WEAK CONVERGENCE, FUNCTIONAL CLT (Davidson (2), Chapter 4; Phillips Lectures on Unit Roots, Cointegration and Nonstationarity; White (999), Chapter 7) Unit root processes

More information

Cointegration Lecture I: Introduction

Cointegration Lecture I: Introduction 1 Cointegration Lecture I: Introduction Julia Giese Nuffield College julia.giese@economics.ox.ac.uk Hilary Term 2008 2 Outline Introduction Estimation of unrestricted VAR Non-stationarity Deterministic

More information

1 Regression with Time Series Variables

1 Regression with Time Series Variables 1 Regression with Time Series Variables With time series regression, Y might not only depend on X, but also lags of Y and lags of X Autoregressive Distributed lag (or ADL(p; q)) model has these features:

More information

A representation theory for a class of vector autoregressive models for fractional processes

A representation theory for a class of vector autoregressive models for fractional processes A representation theory for a class of vector autoregressive moels for fractional processes Søren Johansen Department of Applie Mathematics an Statistics, University of Copenhagen November 2006 Abstract

More information

Reduced rank regression in cointegrated models

Reduced rank regression in cointegrated models Journal of Econometrics 06 (2002) 203 26 www.elsevier.com/locate/econbase Reduced rank regression in cointegrated models.w. Anderson Department of Statistics, Stanford University, Stanford, CA 94305-4065,

More information

ECON 4160, Spring term Lecture 12

ECON 4160, Spring term Lecture 12 ECON 4160, Spring term 2013. Lecture 12 Non-stationarity and co-integration 2/2 Ragnar Nymoen Department of Economics 13 Nov 2013 1 / 53 Introduction I So far we have considered: Stationary VAR, with deterministic

More information

NUMERICAL DISTRIBUTION FUNCTIONS OF LIKELIHOOD RATIO TESTS FOR COINTEGRATION

NUMERICAL DISTRIBUTION FUNCTIONS OF LIKELIHOOD RATIO TESTS FOR COINTEGRATION JOURNAL OF APPLIED ECONOMETRICS J. Appl. Econ. 14: 563±577 (1999) NUMERICAL DISTRIBUTION FUNCTIONS OF LIKELIHOOD RATIO TESTS FOR COINTEGRATION JAMES G. MACKINNON, a * ALFRED A. HAUG b AND LEO MICHELIS

More information

Chapter 1. GMM: Basic Concepts

Chapter 1. GMM: Basic Concepts Chapter 1. GMM: Basic Concepts Contents 1 Motivating Examples 1 1.1 Instrumental variable estimator....................... 1 1.2 Estimating parameters in monetary policy rules.............. 2 1.3 Estimating

More information

Elements of Multivariate Time Series Analysis

Elements of Multivariate Time Series Analysis Gregory C. Reinsel Elements of Multivariate Time Series Analysis Second Edition With 14 Figures Springer Contents Preface to the Second Edition Preface to the First Edition vii ix 1. Vector Time Series

More information

Time Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY

Time Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY Time Series Analysis James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY & Contents PREFACE xiii 1 1.1. 1.2. Difference Equations First-Order Difference Equations 1 /?th-order Difference

More information

Problem set 1 - Solutions

Problem set 1 - Solutions EMPIRICAL FINANCE AND FINANCIAL ECONOMETRICS - MODULE (8448) Problem set 1 - Solutions Exercise 1 -Solutions 1. The correct answer is (a). In fact, the process generating daily prices is usually assumed

More information

Testing for non-stationarity

Testing for non-stationarity 20 November, 2009 Overview The tests for investigating the non-stationary of a time series falls into four types: 1 Check the null that there is a unit root against stationarity. Within these, there are

More information

Likelihood inference for a nonstationary fractional autoregressive model

Likelihood inference for a nonstationary fractional autoregressive model Likelihood inference for a nonstationary fractional autoregressive model Søren Johansen y University of Copenhagen and CREATES and Morten Ørregaard Nielsen z Cornell University and CREATES Preliminary

More information

Simple Estimators for Semiparametric Multinomial Choice Models

Simple Estimators for Semiparametric Multinomial Choice Models Simple Estimators for Semiparametric Multinomial Choice Models James L. Powell and Paul A. Ruud University of California, Berkeley March 2008 Preliminary and Incomplete Comments Welcome Abstract This paper

More information

Tests for Cointegration, Cobreaking and Cotrending in a System of Trending Variables

Tests for Cointegration, Cobreaking and Cotrending in a System of Trending Variables Tests for Cointegration, Cobreaking and Cotrending in a System of Trending Variables Josep Lluís Carrion-i-Silvestre University of Barcelona Dukpa Kim y Korea University May 4, 28 Abstract We consider

More information

a11 a A = : a 21 a 22

a11 a A = : a 21 a 22 Matrices The study of linear systems is facilitated by introducing matrices. Matrix theory provides a convenient language and notation to express many of the ideas concisely, and complicated formulas are

More information

Panel cointegration rank testing with cross-section dependence

Panel cointegration rank testing with cross-section dependence Panel cointegration rank testing with cross-section dependence Josep Lluís Carrion-i-Silvestre y Laura Surdeanu z September 2007 Abstract In this paper we propose a test statistic to determine the cointegration

More information

SIMILAR-ON-THE-BOUNDARY TESTS FOR MOMENT INEQUALITIES EXIST, BUT HAVE POOR POWER. Donald W. K. Andrews. August 2011

SIMILAR-ON-THE-BOUNDARY TESTS FOR MOMENT INEQUALITIES EXIST, BUT HAVE POOR POWER. Donald W. K. Andrews. August 2011 SIMILAR-ON-THE-BOUNDARY TESTS FOR MOMENT INEQUALITIES EXIST, BUT HAVE POOR POWER By Donald W. K. Andrews August 2011 COWLES FOUNDATION DISCUSSION PAPER NO. 1815 COWLES FOUNDATION FOR RESEARCH IN ECONOMICS

More information

Time Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY
