Notes on Time Series Modeling

Garey Ramey
University of California, San Diego
January 2017

1 Stationary processes

Definition. A stochastic process is any set of random variables $y_t$ indexed by $t \in T$: $\{y_t\}_{t \in T}$. These notes will consider discrete stochastic processes, i.e., processes indexed by the set of integers $T = \{\dots, -2, -1, 0, 1, 2, \dots\}$: $\{y_t\}_{t=-\infty}^{\infty}$. Moreover, each $y_t$ is a real number. Stochastic processes will be referred to more concisely as "processes."

Definition. The joint distribution of the process $\{y_t\}$ is determined by the joint distributions of all finite subsets of $y_t$'s:
$$F_{t_1,t_2,\dots,t_n}(\xi_1, \xi_2, \dots, \xi_n) = \Pr(y_{t_1} \le \xi_1,\, y_{t_2} \le \xi_2, \dots, y_{t_n} \le \xi_n),$$
for all possible collections of distinct integers $t_1, t_2, \dots, t_n$. This distribution can be used to determine moments:

Mean: $\mu_t = E(y_t) = \int y_t \, dF_t(y_t)$;

Variance: $\gamma_{0t} = E(y_t - \mu_t)^2 = \int (y_t - \mu_t)^2 \, dF_t(y_t)$;

Autocovariance:
$$\gamma_{jt} = E(y_t - \mu_t)(y_{t-j} - \mu_{t-j}) = \iint (y_t - \mu_t)(y_{t-j} - \mu_{t-j}) \, dF_{t,t-j}(y_t, y_{t-j}).$$

Definition. $\{\varepsilon_t\}$ is a white noise process if, for all $t$:
$$\mu_t = E(\varepsilon_t) = 0; \quad \gamma_{0t} = E(\varepsilon_t^2) = \sigma^2; \quad \gamma_{jt} = E(\varepsilon_t \varepsilon_{t-j}) = 0, \ j \ne 0.$$
$\{\varepsilon_t\}$ is a Gaussian white noise process if $\varepsilon_t \sim N(0, \sigma^2)$ for all $t$, i.e., $\varepsilon_t$ is normally distributed with mean $0$ and variance $\sigma^2$.

Definition. $\{y_t\}$ is an AR(1) process (first-order autoregressive) if
$$y_t = c + \phi y_{t-1} + \varepsilon_t,$$
where $\{\varepsilon_t\}$ is white noise and $c, \phi$ are arbitrary constants. For an AR(1) process with $|\phi| < 1$:
$$\mu_t = \frac{c}{1-\phi}; \quad \gamma_{0t} = \frac{\sigma^2}{1-\phi^2}; \quad \gamma_{jt} = \frac{\phi^j \sigma^2}{1-\phi^2}.$$

Note that the moments in the preceding examples do not depend on $t$. This property defines an important class of processes.

Definition. $\{y_t\}$ is covariance-stationary or weakly stationary if $\mu_t$, $\gamma_{0t}$ and $\gamma_{jt}$ do not depend on $t$. For these notes we will simply say "stationary." Intuitively, for a stationary process the effects of a given realization of $y_t$ die out as $t \to \infty$.
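These formulas are easy to check numerically. Below is a minimal simulation sketch in Python with NumPy; all parameter values are illustrative, and the long sample is chosen only so that sample averages settle near their population counterparts.

```python
# Simulate an AR(1) and compare sample moments with the formulas above.
import numpy as np

rng = np.random.default_rng(0)
c, phi, sigma = 1.0, 0.9, 1.0        # illustrative parameter values
T = 200_000                          # long sample so averages settle down

eps = rng.normal(0.0, sigma, T)      # Gaussian white noise
y = np.empty(T)
y[0] = c / (1 - phi)                 # start at the theoretical mean
for t in range(1, T):
    y[t] = c + phi * y[t - 1] + eps[t]

mu = c / (1 - phi)                   # theoretical mean
gamma0 = sigma**2 / (1 - phi**2)     # theoretical variance
print(y.mean(), mu)
print(y.var(), gamma0)
j = 3                                # lag-j autocovariance: phi^j * gamma0
print(np.mean((y[j:] - mu) * (y[:-j] - mu)), phi**j * gamma0)
```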

Definition. $\{y_t\}$ is an MA(∞) process (infinite-order moving average) if
$$y_t = \mu + \sum_{j=0}^{\infty} \psi_j \varepsilon_{t-j},$$
where $\{\varepsilon_t\}$ is white noise and $\mu, \psi_0, \psi_1, \dots$ are arbitrary constants. A sufficient condition for stationarity of an MA(∞) process is square summability of the coefficients $\{\psi_j\}$:
$$\sum_{j=0}^{\infty} \psi_j^2 < \infty.$$

2 Linear forecasting interpretation

Consider the problem of forecasting $y_t$ on the basis of past observations $y_{t-1}, y_{t-2}, \dots$. A linear forecasting rule predicts $y_t$ as a linear function of $n$ past observations $y_{t-1}, y_{t-2}, \dots, y_{t-n}$:
$$y_t = \sum_{j=1}^{n} g_j(n) y_{t-j},$$
where $g_1(n), \dots, g_n(n)$ are constants. The forecast error and mean squared error of the forecast rule are given by:
$$FE = y_t - \sum_{j=1}^{n} g_j(n) y_{t-j}; \quad MSE = E\Big(y_t - \sum_{j=1}^{n} g_j(n) y_{t-j}\Big)^2.$$

Definition. The linear projection of $y_t$ on $y_{t-1}, y_{t-2}, \dots, y_{t-n}$ is the linear forecasting rule that satisfies
$$E\Big[\Big(y_t - \sum_{j=1}^{n} g_j^P(n) y_{t-j}\Big) y_{t-s}\Big] = 0, \quad s = 1, 2, \dots, n,$$
i.e., the forecast error is uncorrelated with each of the past observations $y_{t-s}$. Let the linear projection be denoted by $y_t^P(n)$: $y_t^P(n) = \sum_{j=1}^{n} g_j^P(n) y_{t-j}$. The associated forecast error is denoted by $\varepsilon_t^P(n)$:
$$\varepsilon_t^P(n) = y_t - \sum_{j=1}^{n} g_j^P(n) y_{t-j}.$$
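For a zero-mean stationary process, the orthogonality conditions are the normal equations $\sum_{j=1}^{n} g_j^P(n)\, \gamma_{|s-j|} = \gamma_s$, $s = 1, \dots, n$, a Toeplitz linear system in the autocovariances. A small sketch in Python/NumPy, using the AR(1) autocovariances derived earlier (for which the projection should put weight $\phi$ on the first lag and zero on the rest):

```python
# Linear projection coefficients from the normal equations Gamma g = gamma_vec,
# where Gamma[s,j] = gamma_{|s-j|}. For an AR(1) the solution is (phi, 0, ..., 0).
import numpy as np

phi, sigma2, n = 0.7, 1.0, 5
gamma = (sigma2 / (1 - phi**2)) * phi ** np.arange(n + 1)   # gamma_0 .. gamma_n
Gamma = np.array([[gamma[abs(s - j)] for j in range(n)] for s in range(n)])
g = np.linalg.solve(Gamma, gamma[1:n + 1])
print(g)   # approximately [0.7, 0, 0, 0, 0]
```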

The linear projection has the following key property.

Proposition. The linear projection minimizes MSE among all linear forecasting rules.

Proof. For any linear forecasting rule, we may write
$$\begin{aligned}
MSE &= E\Big(y_t - \sum_{j=1}^{n} g_j(n) y_{t-j}\Big)^2 = E\Big(y_t - y_t^P(n) + y_t^P(n) - \sum_{j=1}^{n} g_j(n) y_{t-j}\Big)^2 \\
&= E\big(y_t - y_t^P(n)\big)^2 + E\Big(y_t^P(n) - \sum_{j=1}^{n} g_j(n) y_{t-j}\Big)^2 + 2E\Big[\big(y_t - y_t^P(n)\big)\Big(y_t^P(n) - \sum_{j=1}^{n} g_j(n) y_{t-j}\Big)\Big] \\
&= E\big(y_t - y_t^P(n)\big)^2 + E\Big(\sum_{j=1}^{n} \big(g_j^P(n) - g_j(n)\big) y_{t-j}\Big)^2 + 2\sum_{j=1}^{n} \big(g_j^P(n) - g_j(n)\big) E\big[\big(y_t - y_t^P(n)\big) y_{t-j}\big].
\end{aligned}$$
The third term is zero by the definition of linear projection. Thus, MSE is minimized by setting $g_j(n) = g_j^P(n)$ for all $j$, i.e., the linear projection gives the lowest MSE among all linear forecasting rules.

It can be shown that the linear projections $y_t^P(n)$ converge in mean square to a random variable $y_t^P$ as $n \to \infty$:
$$\lim_{n \to \infty} E\big(y_t^P(n) - y_t^P\big)^2 = 0.$$
$y_t^P$ is the linear projection of $y_t$ on $y_{t-1}, y_{t-2}, \dots$.

Definition. The fundamental innovation is the forecast error associated with the linear projection of $y_t$ on $y_{t-1}, y_{t-2}, \dots$:
$$\varepsilon_t = y_t - y_t^P.$$
Note that the fundamental innovation is a least squares residual that obeys the orthogonality condition $E(\varepsilon_t y_{t-s}) = 0$ for $s = 1, 2, \dots$.

Wold Decomposition Theorem. Any stationary process $\{y_t\}$ with $E y_t = 0$ can be represented as
$$y_t = \sum_{j=0}^{\infty} \psi_j \varepsilon_{t-j} + \kappa_t,$$
where: (i) $\psi_0 = 1$ and $\sum_{j=0}^{\infty} \psi_j^2 < \infty$ (square summability); (ii) $\{\varepsilon_t\}$ is white noise; (iii) $\varepsilon_t = y_t - y_t^P$ (fundamental innovation); (iv) $\kappa_t$ equals the linear projection of $\kappa_t$ on $y_{t-1}, y_{t-2}, \dots$; and (v) $E(\varepsilon_s \kappa_t) = 0$ for all $s$ and $t$.

$\varepsilon_t$ is called the linearly indeterministic component, and $\kappa_t$ is called the linearly deterministic component. The Wold Decomposition Theorem represents the stationary process $\{y_t\}$ in terms of processes $\{\varepsilon_t\}$ and $\{\kappa_t\}$ that are orthogonal at all leads and lags. The component $\{\kappa_t\}$ can be predicted arbitrarily well from a linear function of $y_{t-1}, y_{t-2}, \dots$, while the component $\{\varepsilon_t\}$ is the forecast error when $y_t^P$ is used to forecast $y_t$.

Definition. A stationary process $\{y_t\}$ is purely linearly indeterministic if $\kappa_t = 0$ for all $t$.

For a purely linearly indeterministic process, the Wold Decomposition Theorem shows that $\{y_t\}$ can be represented as an MA(∞) process with $\mu = 0$:
$$y_t = \sum_{j=0}^{\infty} \psi_j \varepsilon_{t-j}.$$
This is called the moving average representation of $\{y_t\}$.

Example. An AR(1) process $\{y_t\}$ may be represented using the lag operator:
$$(1 - \phi L) y_t = c + \varepsilon_t.$$
If $|\phi| < 1$:
$$y_t = \frac{1}{1 - \phi L}(c + \varepsilon_t) = \sum_{j=0}^{\infty} (\phi L)^j (c + \varepsilon_t) = \frac{c}{1-\phi} + \sum_{j=0}^{\infty} \phi^j \varepsilon_{t-j}.$$
Express as the deviation from the mean:
$$\hat y_t = y_t - \frac{c}{1-\phi} = \sum_{j=0}^{\infty} \phi^j \varepsilon_{t-j}.$$
Then $E(\hat y_t) = 0$. It follows that $\{\hat y_t\}$ is purely linearly indeterministic, and the MA(∞) representation has $\psi_j = \phi^j$. Moreover, for $s = 1, 2, \dots$:
$$E\big[(\hat y_t - \phi \hat y_{t-1}) \hat y_{t-s}\big] = E\Big[\varepsilon_t \sum_{j=0}^{\infty} \phi^j \varepsilon_{t-s-j}\Big] = \sum_{j=0}^{\infty} \phi^j E(\varepsilon_t \varepsilon_{t-s-j}) = 0.$$
Thus $\hat y_t^P = \phi \hat y_{t-1}$, and the $\varepsilon_t$'s are the fundamental innovations of the process.
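As a numerical sanity check on this example, one can truncate the MA(∞) sum at a large $J$ and compare it with the recursive simulation; the two agree up to a truncation error of order $\phi^J$. A sketch in Python/NumPy with illustrative values:

```python
# Check that the truncated MA(infinity) form of a stationary AR(1)
# reproduces the recursive simulation (the phi**j coefficients die out fast).
import numpy as np

rng = np.random.default_rng(1)
c, phi = 0.5, 0.8
T, J = 500, 200                       # J = truncation point for the MA sum
eps = rng.normal(size=T + J)

y = np.empty(T + J)                   # recursive AR(1), started at the mean
y[0] = c / (1 - phi)
for t in range(1, T + J):
    y[t] = c + phi * y[t - 1] + eps[t]

# MA representation: y_t ~ c/(1-phi) + sum_j phi^j eps_{t-j}
psi = phi ** np.arange(J)
y_ma = c / (1 - phi) + np.array(
    [psi @ eps[t - J + 1:t + 1][::-1] for t in range(J, T + J)])
print(np.max(np.abs(y[J:] - y_ma)))   # tiny, up to truncation error
```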

3 MA(∞) representation of AR(p) processes

Definition. $\{y_t\}$ is an AR(p) process ($p$th-order autoregressive) if
$$y_t = c + \phi_1 y_{t-1} + \phi_2 y_{t-2} + \dots + \phi_p y_{t-p} + \varepsilon_t = c + \sum_{i=1}^{p} \phi_i y_{t-i} + \varepsilon_t, \tag{1}$$
where $\{\varepsilon_t\}$ is white noise and $c, \phi_1, \dots, \phi_p$ are arbitrary constants.

Represent (1) using the lag operator:
$$(1 - \phi_1 L - \phi_2 L^2 - \dots - \phi_p L^p) y_t = c + \varepsilon_t.$$
To obtain an MA(∞) representation, consider the equation
$$\lambda^p - \phi_1 \lambda^{p-1} - \phi_2 \lambda^{p-2} - \dots - \phi_p = 0. \tag{2}$$
This is called the characteristic equation. According to the fundamental theorem of algebra, there are $p$ roots $\lambda_1, \lambda_2, \dots, \lambda_p$ in the complex plane such that, for any $\lambda$:
$$\lambda^p - \phi_1 \lambda^{p-1} - \phi_2 \lambda^{p-2} - \dots - \phi_p = (\lambda - \lambda_1)(\lambda - \lambda_2) \cdots (\lambda - \lambda_p).$$
Note that complex roots come in conjugate pairs $\lambda_i = a + bi$, $\lambda_k = a - bi$. Divide through by $\lambda^p$ and let $z = 1/\lambda$:
$$1 - \phi_1 z - \phi_2 z^2 - \dots - \phi_p z^p = (1 - \lambda_1 z)(1 - \lambda_2 z) \cdots (1 - \lambda_p z).$$

By setting $z = L$ we may write
$$(1 - \phi_1 L - \phi_2 L^2 - \dots - \phi_p L^p) y_t = (1 - \lambda_1 L)(1 - \lambda_2 L) \cdots (1 - \lambda_p L) y_t = c + \varepsilon_t.$$
Assume $|\lambda_i| < 1$ for all $i$, i.e., all roots lie inside the unit circle on the complex plane (recall $|a + bi| = \sqrt{a^2 + b^2}$, which is the length of the vector $(a, b)$). Solve for $y_t$:
$$y_t = \frac{1}{(1 - \lambda_1 L)(1 - \lambda_2 L) \cdots (1 - \lambda_p L)} (c + \varepsilon_t). \tag{3}$$
The MA(∞) representation is derived from (3) in two steps.

Step 1 - Constant term. Note that, for any constant $a$:
$$\frac{1}{1 - \lambda_i L}\, a = \sum_{j=0}^{\infty} (\lambda_i L)^j a = a \sum_{j=0}^{\infty} \lambda_i^j = \frac{a}{1 - \lambda_i}.$$
Thus:
$$\frac{1}{(1 - \lambda_1 L) \cdots (1 - \lambda_p L)}\, c = \frac{c}{(1 - \lambda_1)(1 - \lambda_2) \cdots (1 - \lambda_p)} = \frac{c}{1 - \phi_1 - \phi_2 - \dots - \phi_p} \equiv \mu. \tag{4}$$

Step 2 - MA coefficients. Suppose the roots of (2) are distinct, i.e., $\lambda_i \ne \lambda_k$ for all $i \ne k$. Then the product term in (3) can be expanded with partial fractions:
$$\frac{1}{(1 - \lambda_1 L) \cdots (1 - \lambda_p L)} = \sum_{i=1}^{p} \omega_i \frac{1}{1 - \lambda_i L}, \quad \text{where} \quad \omega_i \equiv \frac{\lambda_i^{p-1}}{\prod_{\substack{k=1 \\ k \ne i}}^{p} (\lambda_i - \lambda_k)}. \tag{5}$$
It can be shown that $\sum_{i=1}^{p} \omega_i = 1$. Furthermore, we can write:
$$\sum_{i=1}^{p} \omega_i \frac{1}{1 - \lambda_i L} = \sum_{i=1}^{p} \omega_i \sum_{j=0}^{\infty} (\lambda_i L)^j = \sum_{j=0}^{\infty} \Big( \sum_{i=1}^{p} \omega_i \lambda_i^j \Big) L^j = \sum_{j=0}^{\infty} \psi_j L^j,$$
where
$$\psi_j = \sum_{i=1}^{p} \omega_i \lambda_i^j. \tag{6}$$

Thus:
$$\frac{1}{(1 - \lambda_1 L) \cdots (1 - \lambda_p L)}\, \varepsilon_t = \sum_{j=0}^{\infty} \psi_j \varepsilon_{t-j}.$$
Clearly, $\psi_0 = 1$. Combining the terms gives:
$$y_t = \mu + \varepsilon_t + \sum_{j=1}^{\infty} \psi_j \varepsilon_{t-j}. \tag{7}$$
The restriction $|\lambda_i| < 1$ for all $i$ implies that $\{\psi_j\}$ satisfies square summability, and so $\{y_t\}$ is stationary. The following proposition summarizes this analysis.

Proposition. Suppose (2) has distinct roots $\lambda_1, \dots, \lambda_p$ satisfying $|\lambda_i| < 1$ for all $i$. Then the AR(p) process (1) is stationary and has an MA(∞) representation (7), where $\mu$ is given by (4) and $\psi_j$ is given by (5) and (6).

Often AR(p) processes are analyzed using this alternative form of the characteristic equation:
$$1 - \phi_1 z - \phi_2 z^2 - \dots - \phi_p z^p = 0.$$
In this case, the stationarity condition is that the roots lie outside of the unit circle, since the roots of this equation are the inverses of the earlier roots.
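Both the stationarity check and the $\psi_j$ formula are mechanical to verify numerically: compute the roots of (2), the weights in (5) and the coefficients (6), and compare with the coefficients obtained by expanding the inverse lag polynomial directly (the scalar analogue of the recursion used for VARs below). A sketch in Python/NumPy for an illustrative AR(2):

```python
# Stationarity of an AR(p) via the characteristic roots, and the MA
# coefficients psi_j computed two ways: the omega-lambda formula (5)-(6)
# and the direct recursion psi_j = phi_1 psi_{j-1} + ... + phi_p psi_{j-p}.
import numpy as np

phi = np.array([1.1, -0.3])    # AR(2): y_t = 1.1 y_{t-1} - 0.3 y_{t-2} + e_t
lam = np.roots(np.concatenate(([1.0], -phi)))   # roots of (2)
print(lam, np.all(np.abs(lam) < 1))             # inside unit circle => stationary

p = len(phi)
omega = np.array([lam[i] ** (p - 1) /
                  np.prod([lam[i] - lam[k] for k in range(p) if k != i])
                  for i in range(p)])
psi_formula = np.array([(omega * lam**j).sum().real for j in range(10)])

psi_rec = np.zeros(10)
psi_rec[0] = 1.0
for j in range(1, 10):
    for i in range(1, p + 1):
        if j - i >= 0:
            psi_rec[j] += phi[i - 1] * psi_rec[j - i]
print(np.max(np.abs(psi_formula - psi_rec)))    # ~0
```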

4 Nonstationary processes

a Trend-stationary processes

Definition. $\{y_t\}$ is a trend-stationary process if $\{y_t - y_t^{tr}\}$ is stationary, where $\{y_t^{tr}\}$ is a deterministic sequence referred to as the trend of $\{y_t\}$.

Example. Let $\{y_t\}$ be given by
$$y_t = \sum_{k=0}^{K} \delta_k t^k + \sum_{j=0}^{\infty} \psi_j \varepsilon_{t-j},$$
where $\delta_0, \delta_1, \dots, \delta_K$ are arbitrary constants. In this case the process has a polynomial trend:
$$y_t^{tr} = \sum_{k=0}^{K} \delta_k t^k.$$
$\{y_t\}$ is trend-stationary as long as its MA(∞) component is stationary.

b Unit root processes

Definition. An AR(p) process is integrated of order $r$, or I($r$), if its characteristic equation has $r$ roots equal to unity.

Let $\{y_t\}$ be an AR(p) process whose roots satisfy $|\lambda_i| < 1$, $i = 1, \dots, p-1$, and $\lambda_p = 1$. Then $\{y_t\}$ is I(1), and it may be written as
$$(1 - \phi_1 L - \dots - \phi_p L^p) y_t = (1 - \lambda_1 L) \cdots (1 - \lambda_{p-1} L)(1 - L) y_t = (1 - \lambda_1 L) \cdots (1 - \lambda_{p-1} L)\, \Delta y_t = c + \varepsilon_t,$$
where $\Delta y_t = y_t - y_{t-1}$ is the first difference of $y_t$. It follows that the process $\{\Delta y_t\}$ is stationary.

Example. The following AR(1) process is called a random walk with drift:
$$y_t = c + y_{t-1} + \varepsilon_t.$$
Define the process $\{\Delta y_t\}$ by $\Delta y_t = c + \varepsilon_t$. Then $\{\Delta y_t\}$ is stationary.
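The distinction is easy to see in simulation: the level of a random walk with drift has a variance that grows with $t$, while its first difference has constant variance. A Python/NumPy sketch with illustrative values:

```python
# A random walk with drift is I(1): the level has exploding variance,
# while the first difference c + e_t is stationary white noise around c.
import numpy as np

rng = np.random.default_rng(2)
c, T, N = 0.1, 400, 2000                 # N independent sample paths
eps = rng.normal(size=(N, T))
y = np.cumsum(c + eps, axis=1)           # y_t = c + y_{t-1} + e_t, y_0 = 0
print(y[:, 50].var(), y[:, 350].var())   # grows roughly like t
dy = np.diff(y, axis=1)                  # dy_t = c + e_t
print(dy[:, 50].var(), dy[:, 350].var()) # both near 1
```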

5 VAR(p) processes

a Definition

Definition. $\{Y_t\}$ is a VAR(p) process ($p$th-order vector autoregressive) if
$$Y_t = C + \sum_{i=1}^{p} \Phi_i Y_{t-i} + \varepsilon_t, \tag{8}$$
where
$$Y_t = \begin{bmatrix} y_{1t} \\ y_{2t} \\ \vdots \\ y_{nt} \end{bmatrix}, \quad C = \begin{bmatrix} c_1 \\ c_2 \\ \vdots \\ c_n \end{bmatrix}, \quad \Phi_i = \begin{bmatrix} \phi_{11}^i & \phi_{12}^i & \cdots & \phi_{1n}^i \\ \phi_{21}^i & \phi_{22}^i & \cdots & \phi_{2n}^i \\ \vdots & \vdots & \ddots & \vdots \\ \phi_{n1}^i & \phi_{n2}^i & \cdots & \phi_{nn}^i \end{bmatrix}, \quad \varepsilon_t = \begin{bmatrix} \varepsilon_{1t} \\ \varepsilon_{2t} \\ \vdots \\ \varepsilon_{nt} \end{bmatrix},$$
and $\varepsilon_t$ is vector white noise:
$$E(\varepsilon_t) = 0_{n \times 1}; \quad E(\varepsilon_t \varepsilon_t') = \Omega; \quad E(\varepsilon_s \varepsilon_t') = 0_{n \times n} \ \text{for all } s \ne t,$$
where $\Omega$ is a positive definite and symmetric $n \times n$ matrix. $\Omega$ is the variance-covariance matrix of the white noise vector. Positive definiteness means that $x' \Omega x > 0$ for all nonzero $n$-vectors $x$.

b Stationarity

To evaluate stationarity of a VAR(p), consider the equation
$$\big| I_n \lambda^p - \Phi_1 \lambda^{p-1} - \Phi_2 \lambda^{p-2} - \dots - \Phi_p \big| = 0, \tag{9}$$
where $|\cdot|$ denotes the determinant and $I_n$ is the $n \times n$ identity matrix. The VAR is stationary if all solutions $\lambda = \lambda_i$ to (9) satisfy $|\lambda_i| < 1$. (Note that there are $np$ roots of (9), possibly repeated, and complex roots come in conjugate pairs.) Equivalently, the VAR is stationary if all values of $z$ satisfying
$$\big| I_n - \Phi_1 z - \Phi_2 z^2 - \dots - \Phi_p z^p \big| = 0$$
lie outside of the unit circle.

The solutions to (9) can be computed using the following $np \times np$ matrix:
$$F = \begin{bmatrix} \Phi_1 & \Phi_2 & \cdots & \Phi_{p-1} & \Phi_p \\ I_n & 0_n & \cdots & 0_n & 0_n \\ 0_n & I_n & \cdots & 0_n & 0_n \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ 0_n & 0_n & \cdots & I_n & 0_n \end{bmatrix},$$
where $0_n$ is an $n \times n$ matrix of zeros. It can be shown that the eigenvalues $\lambda_1, \dots, \lambda_{np}$ of $F$ are precisely the solutions to (9).

To calculate the mean of a stationary VAR, take expectations:
$$E Y_t = C + \sum_{i=1}^{p} \Phi_i E Y_{t-i}.$$
Stationarity implies $E Y_t = \mu$ for all $t$. Thus:
$$\mu = \Big( I_n - \sum_{i=1}^{p} \Phi_i \Big)^{-1} C.$$
The variance and autocovariances of a stationary VAR are given by
$$\Gamma_j = E (Y_t - \mu)(Y_{t-j} - \mu)'.$$
Each $\Gamma_j$ is an $n \times n$ matrix, with $\gamma_{ik}^j$ giving the covariance between $y_{it}$ and $y_{k,t-j}$.
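A numerical sketch of the companion-matrix stationarity check and the mean formula, in Python/NumPy for an illustrative bivariate VAR(2):

```python
# Stationarity and mean of a VAR(2) via the companion matrix F:
# the eigenvalues of F solve (9), and mu = (I - Phi_1 - Phi_2)^{-1} C.
import numpy as np

Phi1 = np.array([[0.5, 0.1], [0.2, 0.3]])
Phi2 = np.array([[0.1, 0.0], [0.0, 0.1]])
C = np.array([1.0, 0.5])
n = 2

F = np.block([[Phi1, Phi2],
              [np.eye(n), np.zeros((n, n))]])
eig = np.linalg.eigvals(F)
print(np.abs(eig), np.all(np.abs(eig) < 1))   # all inside unit circle

mu = np.linalg.solve(np.eye(n) - Phi1 - Phi2, C)
print(mu)
```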

c MA(∞) representation

To obtain an MA(∞) representation, express (8) as
$$\Big( I_n - \sum_{i=1}^{p} \Phi_i L^i \Big) Y_t = C + \varepsilon_t.$$
Stationarity allows us to invert the lag polynomial:
$$\Big( I_n - \sum_{i=1}^{p} \Phi_i L^i \Big)^{-1} = \sum_{j=0}^{\infty} \Psi_j L^j. \tag{10}$$
Thus:
$$Y_t = \mu + \sum_{j=0}^{\infty} \Psi_j \varepsilon_{t-j}. \tag{11}$$
The values of $\Psi_j$, $j = 0, 1, 2, \dots$ may be obtained using the method of undetermined coefficients. Write (10) as
$$I_n = \Big( I_n - \sum_{i=1}^{p} \Phi_i L^i \Big) \sum_{j=0}^{\infty} \Psi_j L^j. \tag{12}$$
The constant terms on each side of (12) must agree. Thus:
$$I_n = \Psi_0. \tag{13}$$
Further, since there are no powers of $L$ on the LHS, the coefficient of $L^j$ on the RHS must equal zero for each $j > 0$:
$$\Psi_j = \Psi_{j-1} \Phi_1 + \Psi_{j-2} \Phi_2 + \dots + \Psi_{j-p} \Phi_p, \quad j = 1, 2, \dots, \tag{14}$$
where $\Psi_j = 0_n$ for $j < 0$. Given the coefficients $\Phi_i$ and $\Psi_0 = I_n$, (14) may be iterated to compute the MA coefficients $\Psi_1, \Psi_2, \Psi_3, \dots$.

Nonuniqueness. Importantly, the MA(∞) representation of a VAR is nonunique. Let $H$ be any nonsingular $n \times n$ matrix, and define $u_t \equiv H \varepsilon_t$. Note that $u_t$ is vector white noise:
$$E(u_t) = H E(\varepsilon_t) = 0_{n \times 1}; \quad E(u_t u_t') = H E(\varepsilon_t \varepsilon_t') H' = H \Omega H'; \quad E(u_s u_t') = H E(\varepsilon_s \varepsilon_t') H' = 0_n, \ s \ne t,$$
and $H \Omega H'$ is positive definite since $H' x$ is nonzero whenever $x$ is. The MA(∞) representation can be expressed as
$$Y_t = \mu + \sum_{j=0}^{\infty} \Psi_j H^{-1} H \varepsilon_{t-j} = \mu + \sum_{j=0}^{\infty} \tilde{\Psi}_j u_{t-j},$$
where $\tilde{\Psi}_j \equiv \Psi_j H^{-1}$. Note that in this case $u_t$ is not the fundamental innovation. To obtain the MA(∞) representation in terms of the fundamental innovation we must impose the normalization $\tilde{\Psi}_0 = I_n$, i.e., $H = I_n$.
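The recursion (13)-(14) is straightforward to code. A Python/NumPy sketch (the helper name ma_coeffs is ours, not standard):

```python
# MA coefficients of a stationary VAR(p) from the recursion (13)-(14):
# Psi_0 = I, Psi_j = Psi_{j-1} Phi_1 + ... + Psi_{j-p} Phi_p (Psi_j = 0, j < 0).
import numpy as np

def ma_coeffs(Phis, J):
    """Phis: list of (n,n) AR matrices Phi_1..Phi_p; returns Psi_0..Psi_J."""
    n = Phis[0].shape[0]
    Psi = [np.eye(n)]
    for j in range(1, J + 1):
        S = np.zeros((n, n))
        for i, Phi in enumerate(Phis, start=1):
            if j - i >= 0:
                S += Psi[j - i] @ Phi
        Psi.append(S)
    return Psi

Phi1 = np.array([[0.5, 0.1], [0.2, 0.3]])
Psi = ma_coeffs([Phi1], 3)
print(Psi[2])       # for a VAR(1) this equals Phi1 @ Phi1
```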

6 Identification of shocks

a Triangular factorization

We wish to assess how fluctuations in "more exogenous" variables affect "less exogenous" ones. One way to do this is to rearrange the vector of innovations $\varepsilon_t$ into components that derive from "exogenous shocks" to the $n$ variables. This can be accomplished using a triangular factorization of $\Omega$. For any positive definite symmetric matrix $\Omega$, there exists a unique representation of the form
$$\Omega = A D A', \tag{15}$$
where $A$ is a lower triangular matrix with 1's along the principal diagonal:
$$A = \begin{bmatrix} 1 & 0 & \cdots & 0 & 0 \\ a_{21} & 1 & \cdots & 0 & 0 \\ a_{31} & a_{32} & \ddots & \vdots & \vdots \\ \vdots & \vdots & & 1 & 0 \\ a_{n1} & a_{n2} & \cdots & a_{n,n-1} & 1 \end{bmatrix},$$
and $D$ is a diagonal matrix:
$$D = \begin{bmatrix} d_{11} & 0 & \cdots & 0 \\ 0 & d_{22} & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & d_{nn} \end{bmatrix},$$
with $d_{ii} > 0$ for $i = 1, \dots, n$.
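In practice, $A$ and $D$ can be read off the Cholesky factor of $\Omega$ (introduced formally below): with $P$ lower triangular and $P P' = \Omega$, setting $D = \mathrm{diag}(P)^2$ and $A = P\,\mathrm{diag}(P)^{-1}$ gives $\Omega = ADA'$. A Python/NumPy sketch with an illustrative $\Omega$:

```python
# Triangular factorization Omega = A D A' extracted from the Cholesky
# factor P = A D^{1/2}: divide each column of P by its diagonal entry.
import numpy as np

Omega = np.array([[1.0, 0.4], [0.4, 2.0]])        # an illustrative covariance
P = np.linalg.cholesky(Omega)                     # lower triangular, P P' = Omega
d = np.diag(P) ** 2                               # D = diag(P)^2
A = P / np.diag(P)                                # unit lower triangular
print(A)
print(np.allclose(A @ np.diag(d) @ A.T, Omega))   # True
```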

Use the factorization to define a vector of exogenous shocks: $u_t \equiv A^{-1} \varepsilon_t$. Substitute into the MA(∞) representation to obtain an alternative "structural" representation:
$$Y_t = \mu + \varepsilon_t + \sum_{j=1}^{\infty} \Psi_j \varepsilon_{t-j} = \mu + A u_t + \sum_{j=1}^{\infty} \Psi_j A u_{t-j} = \mu + \sum_{j=0}^{\infty} \Theta_j u_{t-j},$$
where $\Theta_0 \equiv A$ and $\Theta_j \equiv \Psi_j A$, $j = 1, 2, \dots$. Note that the shocks $u_{1t}, \dots, u_{nt}$ are mutually uncorrelated:
$$E(u_t u_t') = A^{-1} E(\varepsilon_t \varepsilon_t') (A')^{-1} = A^{-1} \Omega (A')^{-1} = A^{-1} A D A' (A')^{-1} = D.$$
Thus:
$$Var(u_{it}) = d_{ii}; \quad Cov(u_{it}, u_{kt}) = 0.$$
To implement this approach, we order the variables from "most exogenous" to "least exogenous." This means that innovations to $y_{it}$ are affected by the shocks $u_{1t}, \dots, u_{it}$, but not by $u_{i+1,t}, \dots, u_{nt}$.

Bivariate case. Let $n = 2$. (11) may be expressed as
$$\begin{bmatrix} \hat y_{1t} \\ \hat y_{2t} \end{bmatrix} = \begin{bmatrix} \varepsilon_{1t} \\ \varepsilon_{2t} \end{bmatrix} + \sum_{j=1}^{\infty} \Psi_j \begin{bmatrix} \varepsilon_{1,t-j} \\ \varepsilon_{2,t-j} \end{bmatrix},$$
where $\hat y_{it} \equiv y_{it} - \mu_i$. Here $y_{1t}$ is taken to be "most exogenous." $\Omega$ is factorized using the matrices
$$A = \begin{bmatrix} 1 & 0 \\ a_{21} & 1 \end{bmatrix}, \quad D = \begin{bmatrix} d_{11} & 0 \\ 0 & d_{22} \end{bmatrix}.$$
Thus,
$$\begin{bmatrix} \varepsilon_{1t} \\ \varepsilon_{2t} \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ a_{21} & 1 \end{bmatrix} \begin{bmatrix} u_{1t} \\ u_{2t} \end{bmatrix} = \begin{bmatrix} u_{1t} \\ a_{21} u_{1t} + u_{2t} \end{bmatrix}.$$

Innovations to $y_{1t}$ are driven by the exogenous shocks $u_{1t}$. Innovations to $y_{2t}$ are driven by both innovations to $y_{1t}$ and uncorrelated shocks $u_{2t}$. Furthermore, for $j > 0$:
$$\Theta_j = \Psi_j A = \begin{bmatrix} \psi_{11}^j & \psi_{12}^j \\ \psi_{21}^j & \psi_{22}^j \end{bmatrix} \begin{bmatrix} 1 & 0 \\ a_{21} & 1 \end{bmatrix} = \begin{bmatrix} \psi_{11}^j + a_{21} \psi_{12}^j & \psi_{12}^j \\ \psi_{21}^j + a_{21} \psi_{22}^j & \psi_{22}^j \end{bmatrix}.$$
The alternative "structural" MA(∞) representation is
$$\begin{bmatrix} \hat y_{1t} \\ \hat y_{2t} \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ a_{21} & 1 \end{bmatrix} \begin{bmatrix} u_{1t} \\ u_{2t} \end{bmatrix} + \sum_{j=1}^{\infty} \begin{bmatrix} \psi_{11}^j + a_{21} \psi_{12}^j & \psi_{12}^j \\ \psi_{21}^j + a_{21} \psi_{22}^j & \psi_{22}^j \end{bmatrix} \begin{bmatrix} u_{1,t-j} \\ u_{2,t-j} \end{bmatrix}.$$
We can use this to assess the effects of an exogenous shock to $y_{1t}$. Suppose the system begins in the nonstochastic steady state:
$$\begin{bmatrix} u_{1,t-j} \\ u_{2,t-j} \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}, \ j = 1, 2, \dots \quad \Rightarrow \quad \begin{bmatrix} \hat y_{1,t-j} \\ \hat y_{2,t-j} \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}.$$
At time $t$ there is a positive shock to $y_{1t}$, and there are no shocks following this:
$$\begin{bmatrix} u_{1t} \\ u_{2t} \end{bmatrix} = \begin{bmatrix} 1 \\ 0 \end{bmatrix}; \quad \begin{bmatrix} u_{1,t+j} \\ u_{2,t+j} \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}, \ j = 1, 2, \dots$$
Then from the above representation we have
$$\begin{bmatrix} \hat y_{1t} \\ \hat y_{2t} \end{bmatrix} = \begin{bmatrix} 1 \\ a_{21} \end{bmatrix}; \quad \begin{bmatrix} \hat y_{1,t+j} \\ \hat y_{2,t+j} \end{bmatrix} = \begin{bmatrix} \psi_{11}^j + a_{21} \psi_{12}^j \\ \psi_{21}^j + a_{21} \psi_{22}^j \end{bmatrix}, \ j = 1, 2, \dots$$
Subsequent movements in each variable are driven by the direct effect of $y_{1t}$ and an indirect effect coming through the response of $y_{2t}$. These are the orthogonalized impulse-response functions.
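The impulse-response calculations can be carried out directly. A Python/NumPy sketch for an illustrative bivariate VAR(1), where $\Psi_j = \Phi_1^j$; column $k$ of $\Theta_j$ is the response of $Y_{t+j}$ to a one-unit $u_{kt}$ shock:

```python
# Orthogonalized impulse responses for a bivariate VAR(1):
# Theta_0 = A, Theta_j = Psi_j A with Psi_j = Phi1^j.
import numpy as np

Phi1 = np.array([[0.5, 0.1], [0.2, 0.3]])
Omega = np.array([[1.0, 0.4], [0.4, 2.0]])
P = np.linalg.cholesky(Omega)
A = P / np.diag(P)               # unit lower triangular factor of Omega

Psi = np.eye(2)
for j in range(5):
    Theta = Psi @ A
    print(j, Theta[:, 0])        # response path to a u_1 shock
    Psi = Psi @ Phi1             # Psi_{j+1} = Psi_j Phi1
print(A[0, 1])                   # 0: u_2 never moves y_1 on impact
```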

We can also assess the effects of a positive shock to $y_{2t}$, as captured by $u_{2t}$. In this case the change in $y_{2t}$ is conditioned on $u_{1t}$, i.e., $u_{2t}$ indicates the movement in $y_{2t}$ that cannot be predicted after $u_{1t}$ is known:
$$\begin{bmatrix} u_{1t} \\ u_{2t} \end{bmatrix} = \begin{bmatrix} 0 \\ 1 \end{bmatrix}; \quad \begin{bmatrix} u_{1,t+j} \\ u_{2,t+j} \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}, \ j = 1, 2, \dots,$$
$$\begin{bmatrix} \hat y_{1t} \\ \hat y_{2t} \end{bmatrix} = \begin{bmatrix} 0 \\ 1 \end{bmatrix}; \quad \begin{bmatrix} \hat y_{1,t+j} \\ \hat y_{2,t+j} \end{bmatrix} = \begin{bmatrix} \psi_{12}^j \\ \psi_{22}^j \end{bmatrix}, \ j = 1, 2, \dots$$
Note that $u_{1t}$ affects $y_{2t}$ in period $t$ (as long as $a_{21} \ne 0$), but $u_{2t}$ does not affect $y_{1t}$. This is the sense in which $y_{1t}$ is "more exogenous."

Empirical implementation. For a given observed sample of size $T$, we can obtain OLS estimates $\hat C$ and $\hat \Phi_i$, $i = 1, \dots, p$, by regressing $Y_t$ on a constant term and $p$ lags $Y_{t-1}, \dots, Y_{t-p}$. Estimated innovations are obtained from the OLS residuals:
$$\hat \varepsilon_t = Y_t - \hat C - \sum_{i=1}^{p} \hat \Phi_i Y_{t-i}.$$
The variance-covariance matrix is estimated as
$$\hat \Omega = \frac{1}{T} \sum_{t=1}^{T} \hat \varepsilon_t \hat \varepsilon_t'.$$
Estimates of the MA coefficients $\hat \Psi_j$, $j = 1, 2, \dots$ can be obtained using the formulas derived above:
$$\hat \Psi_0 = I_n; \quad \hat \Psi_s - \hat \Psi_{s-1} \hat \Phi_1 - \hat \Psi_{s-2} \hat \Phi_2 - \dots - \hat \Psi_{s-p} \hat \Phi_p = 0, \ s = 1, 2, \dots$$
Orthogonalized impulse response functions are computed as
$$\hat \Theta_0 = \hat A; \quad \hat \Theta_j = \hat \Psi_j \hat A, \ j = 1, 2, \dots,$$
where $\hat A$ is obtained from the triangular factorization of $\hat \Omega$. The coefficient $\hat \theta_{ik}^j$, the $ik$-element of $\hat \Theta_j$, gives the response of $\hat y_{i,t+j}$ to a one-unit positive shock to $u_{kt}$.
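The empirical steps translate line by line into code. A Python/NumPy sketch on simulated data (a bivariate VAR(1); the OLS is done with a single least-squares call rather than equation by equation, which is numerically equivalent):

```python
# OLS estimation of a VAR(p) as in the text: regress Y_t on a constant
# and p lags, then form residuals and Omega_hat. (Simulated data.)
import numpy as np

rng = np.random.default_rng(3)
Phi1 = np.array([[0.5, 0.1], [0.2, 0.3]])
C = np.array([1.0, 0.5])
T, n, p = 500, 2, 1

Y = np.zeros((T, n))
for t in range(1, T):
    Y[t] = C + Phi1 @ Y[t - 1] + rng.normal(size=n)

X = np.column_stack([np.ones(T - p), Y[:-1]])    # regressors [1, Y_{t-1}]
B = np.linalg.lstsq(X, Y[p:], rcond=None)[0]     # rows: C', Phi1'
C_hat, Phi1_hat = B[0], B[1:].T
eps_hat = Y[p:] - X @ B
Omega_hat = eps_hat.T @ eps_hat / (T - p)
print(C_hat, Phi1_hat, Omega_hat, sep="\n")
```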

Cholesky factorization. For any positive definite symmetric matrix $\Omega$, there exists a unique representation of the form
$$\Omega = P P', \tag{16}$$
where
$$P = A D^{1/2} = \begin{bmatrix} 1 & 0 & \cdots & 0 \\ a_{21} & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ a_{n1} & a_{n2} & \cdots & 1 \end{bmatrix} \begin{bmatrix} \sqrt{d_{11}} & 0 & \cdots & 0 \\ 0 & \sqrt{d_{22}} & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \sqrt{d_{nn}} \end{bmatrix}.$$
This is called the Cholesky factorization. Using the Cholesky factorization, the vector of exogenous shocks may be defined as $v_t \equiv P^{-1} \varepsilon_t$. In the structural representation, $A$ is simply replaced by $P$. Moreover, $E(v_t v_t') = I_n$, i.e., $Var(v_{it}) = 1$ for all $i$.

b Forecast error decomposition

For a stationary VAR(p), consider the problem of forecasting $Y_{t+s}$ at period $t$. Using (11), the forecast error may be written
$$Y_{t+s} - E_t Y_{t+s} = \sum_{j=1}^{s} \Psi_{s-j} \varepsilon_{t+j},$$
where $E_t(\cdot)$ denotes expectation conditional on period $t$ information. The mean squared error of the $s$-period-ahead forecast is given by
$$\begin{aligned}
MSE(s) &= E (Y_{t+s} - E_t Y_{t+s})(Y_{t+s} - E_t Y_{t+s})' = E \Big( \sum_{j=1}^{s} \Psi_{s-j} \varepsilon_{t+j} \Big) \Big( \sum_{l=1}^{s} \Psi_{s-l} \varepsilon_{t+l} \Big)' \\
&= \sum_{j=1}^{s} \Psi_{s-j} E(\varepsilon_{t+j} \varepsilon_{t+j}') \Psi_{s-j}' + \sum_{j=1}^{s} \sum_{l \ne j} \Psi_{s-j} E(\varepsilon_{t+j} \varepsilon_{t+l}') \Psi_{s-l}' = \sum_{j=1}^{s} \Psi_{s-j}\, \Omega\, \Psi_{s-j}',
\end{aligned}$$

since $E(\varepsilon_{t+j} \varepsilon_{t+j}') = E(\varepsilon_t \varepsilon_t') = \Omega$ for all $j$, while $E(\varepsilon_{t+j} \varepsilon_{t+l}') = 0_{n \times n}$ for $l \ne j$.

$MSE(s)$ can be decomposed based on the contributions of the identified shocks $u_{1t}, \dots, u_{nt}$. The innovations $\varepsilon_t$ may be expressed as
$$\varepsilon_t = A u_t = \sum_{i=1}^{n} A_i u_{it},$$
where $A_i$ is the $i$th column of the matrix $A$ defined in (15). Thus:
$$\Omega = E(\varepsilon_t \varepsilon_t') = E \Big( \sum_{i=1}^{n} A_i u_{it} \Big) \Big( \sum_{k=1}^{n} A_k u_{kt} \Big)' = \sum_{i=1}^{n} A_i E(u_{it}^2) A_i' + \sum_{i=1}^{n} \sum_{k \ne i} A_i E(u_{it} u_{kt}) A_k' = \sum_{i=1}^{n} A_i d_{ii} A_i',$$
since $E(u_{it}^2) = Var(u_{it}) = d_{ii}$ and, for $k \ne i$, $E(u_{it} u_{kt}) = Cov(u_{it}, u_{kt}) = 0$. Substitution gives
$$MSE(s) = \sum_{j=1}^{s} \Psi_{s-j} \Big( \sum_{i=1}^{n} A_i d_{ii} A_i' \Big) \Psi_{s-j}' = \sum_{i=1}^{n} d_{ii} \sum_{j=1}^{s} \Psi_{s-j} A_i A_i' \Psi_{s-j}'. \tag{17}$$
Equation (17) decomposes $MSE(s)$ into $n$ terms, associated with variation contributed by the $n$ shocks $u_{1t}, \dots, u_{nt}$. As $s \to \infty$, stationarity implies $E_t Y_{t+s} \to \mu$, and
$$MSE(s) \to E(Y_t - \mu)(Y_t - \mu)' = \Gamma_0,$$
i.e., $MSE(s)$ converges to the variance of the VAR. Thus, (17) decomposes the variance in terms of the contributions of the underlying shocks. When the Cholesky factorization is used, the vectors $A_i$ are replaced by vectors $P_i$, which are columns of the matrix $P$ defined in (16).
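A Python/NumPy sketch of the decomposition (17) for an illustrative bivariate VAR(1), reporting each shock's share of the forecast error variance of each variable:

```python
# Forecast error variance decomposition (17): share of MSE(s) for variable i
# attributable to shock k, for a bivariate VAR(1) with Psi_j = Phi1^j.
import numpy as np

Phi1 = np.array([[0.5, 0.1], [0.2, 0.3]])
Omega = np.array([[1.0, 0.4], [0.4, 2.0]])
P = np.linalg.cholesky(Omega)
A = P / np.diag(P)                   # triangular factor of Omega
d = np.diag(P) ** 2
s = 8

Psis = [np.linalg.matrix_power(Phi1, j) for j in range(s)]   # Psi_0..Psi_{s-1}
MSE = sum(Ps @ Omega @ Ps.T for Ps in Psis)
contrib = [d[k] * sum(np.outer(Ps @ A[:, k], Ps @ A[:, k]) for Ps in Psis)
           for k in range(2)]
shares = np.column_stack([np.diag(c) for c in contrib]) / np.diag(MSE)[:, None]
print(shares, shares.sum(axis=1))    # rows sum to 1
```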

c Identification via long-run restrictions

Consider the following bivariate VAR process:
$$\begin{bmatrix} y_{1t} \\ y_{2t} \end{bmatrix} = \begin{bmatrix} y_{1,t-1} \\ 0 \end{bmatrix} + \begin{bmatrix} \varepsilon_{1t} \\ \varepsilon_{2t} \end{bmatrix} + \sum_{j=1}^{\infty} \Psi_j \begin{bmatrix} \varepsilon_{1,t-j} \\ \varepsilon_{2,t-j} \end{bmatrix}, \tag{18}$$
with variance-covariance matrix $\Omega = E(\varepsilon_t \varepsilon_t')$. Note that forecasted values of $y_{1t}$ are permanently affected by innovations, while the effects on $y_{2t}$ die out when the $\Psi_j$'s satisfy suitable stationarity restrictions. This distinction can be used to identify "permanent" versus "transitory" shocks.

Write (18) as
$$\begin{bmatrix} \Delta y_{1t} \\ y_{2t} \end{bmatrix} = \begin{bmatrix} \varepsilon_{1t} \\ \varepsilon_{2t} \end{bmatrix} + \sum_{j=1}^{\infty} \Psi_j \begin{bmatrix} \varepsilon_{1,t-j} \\ \varepsilon_{2,t-j} \end{bmatrix}, \tag{19}$$
where $\Delta y_t = y_t - y_{t-1}$, and assume that (19) is stationary. We wish to obtain a structural representation
$$\begin{bmatrix} \Delta y_{1t} \\ y_{2t} \end{bmatrix} = \sum_{j=0}^{\infty} \Theta_j \begin{bmatrix} u_{1,t-j} \\ u_{2,t-j} \end{bmatrix}, \tag{20}$$
where $u_{1t}$ and $u_{2t}$ indicate permanent and transitory shocks, respectively, and
$$\Theta_0 = \begin{bmatrix} \theta_{11}^0 & \theta_{12}^0 \\ \theta_{21}^0 & \theta_{22}^0 \end{bmatrix}.$$
Assume $Cov(u_{1t}, u_{2t}) = 0$ and $Var(u_{1t}) = Var(u_{2t}) = 1$, i.e., the variances of the shocks are normalized to unity. Furthermore, since $u_t = \Theta_0^{-1} \varepsilon_t$:
$$E(u_t u_t') = \Theta_0^{-1} E(\varepsilon_t \varepsilon_t') (\Theta_0^{-1})' = I_2 \quad \Rightarrow \quad \Theta_0 \Theta_0' = \Omega.$$
Recall that the Cholesky factorization gives a unique lower triangular matrix $P$ satisfying $P P' = \Omega$. It follows that $\Theta_0 = P Q$ for some orthogonal matrix $Q$, i.e., $Q$ satisfies $Q Q' = I$. Orthogonality implies three restrictions on $Q$, so we need one more restriction to identify $\Theta_0$.

For the fourth restriction, assume that $u_{2t}$ has no long-run effect on the level of $y_{1t}$, so that $u_{2t}$ is transitory. For this to be true, all effects on $\Delta y_{1t}$ must cancel out in the long run:
$$\sum_{j=0}^{\infty} \theta_{12}^j = 0.$$
Moreover, since $\Theta_j = \Psi_j \Theta_0$ (with $\Psi_0 = I_2$):
$$\theta_{12}^j = \psi_{11}^j \theta_{12}^0 + \psi_{12}^j \theta_{22}^0.$$
Substitute and rearrange:
$$\theta_{12}^0 \sum_{j=0}^{\infty} \psi_{11}^j + \theta_{22}^0 \sum_{j=0}^{\infty} \psi_{12}^j = 0.$$
This supplies one more restriction, and thus $\Theta_0$ is identified.
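One concrete way to compute $\Theta_0$, assuming the reduced-form parameters are known: since $\sum_j \Theta_j = \Psi(1) \Theta_0$ with $\Psi(1) \equiv \sum_j \Psi_j$, the long-run restriction makes $\Psi(1) \Theta_0$ lower triangular, so it must equal the Cholesky factor of $\Psi(1) \Omega \Psi(1)'$. This construction is a standard implementation device rather than something derived in the text. A Python/NumPy sketch for a VAR(1):

```python
# Long-run (Blanchard-Quah style) identification: Theta_0 satisfies
# Theta_0 Theta_0' = Omega and sum_j theta^j_12 = 0, so
# Theta_0 = Psi(1)^{-1} chol(Psi(1) Omega Psi(1)').
import numpy as np

Phi1 = np.array([[0.4, 0.1], [0.1, 0.2]])       # VAR(1) for (dy_1, y_2)
Omega = np.array([[1.0, 0.3], [0.3, 0.5]])
Psi1 = np.linalg.inv(np.eye(2) - Phi1)          # Psi(1) = (I - Phi1)^{-1}

S = np.linalg.cholesky(Psi1 @ Omega @ Psi1.T)   # lower triangular long-run matrix
Theta0 = np.linalg.solve(Psi1, S)
print(np.allclose(Theta0 @ Theta0.T, Omega))    # True
print((Psi1 @ Theta0)[0, 1])                    # ~0: no long-run effect of u_2 on y_1
```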

7 Granger causality

Consider two stationary processes $\{y_{1t}\}$ and $\{y_{2t}\}$. Recall that the linear projection of $y_{1t}$ on $y_{1,t-1}, y_{1,t-2}, \dots$, denoted by $y_{1t}^P$, minimizes MSE among all linear forecast rules. We are interested in whether the variable $y_{2t}$ can be used to obtain better predictions of $y_{1t}$. That is, does the linear projection of $y_{1t}$ on $y_{1,t-1}, y_{1,t-2}, \dots$ and $y_{2,t-1}, y_{2,t-2}, \dots$ give a lower MSE than $y_{1t}^P$? If not, then we say that the variable $y_{2t}$ does not Granger-cause $y_{1t}$.

Suppose $y_{1t}$ and $y_{2t}$ are given by a bivariate VAR:
$$\begin{bmatrix} y_{1t} \\ y_{2t} \end{bmatrix} = \begin{bmatrix} c_1 \\ c_2 \end{bmatrix} + \sum_{i=1}^{p} \begin{bmatrix} \phi_{11}^i & \phi_{12}^i \\ \phi_{21}^i & \phi_{22}^i \end{bmatrix} \begin{bmatrix} y_{1,t-i} \\ y_{2,t-i} \end{bmatrix} + \begin{bmatrix} \varepsilon_{1t} \\ \varepsilon_{2t} \end{bmatrix}.$$
Then $y_{2t}$ does not Granger-cause $y_{1t}$ if the coefficient matrices are lower triangular:
$$\begin{bmatrix} \phi_{11}^i & \phi_{12}^i \\ \phi_{21}^i & \phi_{22}^i \end{bmatrix} = \begin{bmatrix} \phi_{11}^i & 0 \\ \phi_{21}^i & \phi_{22}^i \end{bmatrix}, \quad i = 1, \dots, p.$$
To test for Granger causality, estimate the first equation in the VAR with and without the parameter restriction:
$$y_{1t} = c_1 + \sum_{i=1}^{p} \phi_{11}^i y_{1,t-i} + \nu_{1t};$$
$$y_{1t} = c_1 + \sum_{i=1}^{p} \big( \phi_{11}^i y_{1,t-i} + \phi_{12}^i y_{2,t-i} \big) + \varepsilon_{1t}.$$
Let $\hat \nu_{1t}$ and $\hat \varepsilon_{1t}$ be the fitted residuals and let the sample size be $T$. Define
$$RSS_0 = \sum_{t=1}^{T} \hat \nu_{1t}^2; \quad RSS_1 = \sum_{t=1}^{T} \hat \varepsilon_{1t}^2.$$
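A Python/NumPy sketch of the test on simulated data in which $y_{2t}$ does Granger-cause $y_{1t}$ (so the statistic below should be large); the effective sample size $T - p$ is used in place of $T$:

```python
# Granger causality test via restricted/unrestricted regressions and
# S = T (RSS_0 - RSS_1) / RSS_1, approximately chi2(p) under the null.
import numpy as np

rng = np.random.default_rng(4)
T, p = 400, 2
y1, y2 = np.zeros(T), np.zeros(T)
for t in range(1, T):
    y2[t] = 0.5 * y2[t - 1] + rng.normal()
    y1[t] = 0.3 * y1[t - 1] + 0.4 * y2[t - 1] + rng.normal()

def rss(X, y):
    """Residual sum of squares from an OLS regression of y on X."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    e = y - X @ beta
    return e @ e

def lags(x):
    """Columns x_{t-1}, ..., x_{t-p}, aligned with t = p, ..., T-1."""
    return np.column_stack([x[p - i:T - i] for i in range(1, p + 1)])

X0 = np.column_stack([np.ones(T - p), lags(y1)])   # restricted: own lags only
X1 = np.column_stack([X0, lags(y2)])               # unrestricted: add y_2 lags
S = (T - p) * (rss(X0, y1[p:]) - rss(X1, y1[p:])) / rss(X1, y1[p:])
print(S)   # compare with the 5% chi2(2) critical value, about 5.99
```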

Then for large $T$ the following statistic has a $\chi^2(p)$ distribution:
$$S = \frac{T (RSS_0 - RSS_1)}{RSS_1}.$$
If $S$ exceeds a designated critical value for a $\chi^2(p)$ variable (e.g., the 5% critical value), then we reject the null hypothesis that $y_{2t}$ does not Granger-cause $y_{1t}$, i.e., $y_{2t}$ does help in forecasting $y_{1t}$.

Sources

Hamilton, Time Series Analysis, 1994, chs. 3, 4, 11.

Blanchard and Quah, "The Dynamic Effects of Aggregate Demand and Supply Disturbances," AER, Sept. 1989.
