Ch. 14 Stationary ARMA Process
A general linear stochastic model supposes a time series to be generated by a linear aggregation of random shocks. For practical representation it is desirable to employ models that use parameters parsimoniously. Parsimony may often be achieved by representing the linear process in terms of a small number of autoregressive and moving average terms. This chapter introduces univariate ARMA processes, which provide a very useful class of models for describing the dynamics of an individual time series. The ARMA model is based on a principle in philosophy called reductionism. Reductionism holds that anything can be understood once it is decomposed into its basic elements.¹ Throughout this chapter we assume the time index $T$ to be $T = \{..., -2, -1, 0, 1, 2, ...\}$.

Footnote 1: Thales was thought to be the first one to use reductionism in his writing.

1 Moving Average Process

1.1 The First-Order Moving Average Process

A stochastic process $\{Y_t, t \in T\}$ is said to be a first-order moving average process ($MA(1)$) if it can be expressed in the form
$$Y_t = \mu + \varepsilon_t + \theta \varepsilon_{t-1},$$
where $\mu$ and $\theta$ are constants and $\varepsilon_t$ is a white-noise process. Remember that a white-noise process $\{\varepsilon_t, t \in T\}$ satisfies
$$E(\varepsilon_t) = 0 \quad \text{and} \quad E(\varepsilon_t \varepsilon_s) = \begin{cases} \sigma^2 & \text{when } t = s \\ 0 & \text{when } t \neq s. \end{cases}$$

1.1.1 Condition for Stationarity

The expectation of $Y_t$ is given by
$$E(Y_t) = E(\mu + \varepsilon_t + \theta \varepsilon_{t-1}) = \mu + E(\varepsilon_t) + \theta E(\varepsilon_{t-1}) = \mu, \quad \text{for all } t \in T.$$
The variance of $Y_t$ is
$$\gamma_0 = E(Y_t - \mu)^2 = E(\varepsilon_t + \theta \varepsilon_{t-1})^2 = E(\varepsilon_t^2 + 2\theta \varepsilon_t \varepsilon_{t-1} + \theta^2 \varepsilon_{t-1}^2) = \sigma^2 + 0 + \theta^2 \sigma^2 = (1 + \theta^2)\sigma^2.$$
The first autocovariance is
$$\gamma_1 = E(Y_t - \mu)(Y_{t-1} - \mu) = E(\varepsilon_t + \theta \varepsilon_{t-1})(\varepsilon_{t-1} + \theta \varepsilon_{t-2}) = E(\varepsilon_t \varepsilon_{t-1} + \theta \varepsilon_{t-1}^2 + \theta \varepsilon_t \varepsilon_{t-2} + \theta^2 \varepsilon_{t-1} \varepsilon_{t-2}) = 0 + \theta \sigma^2 + 0 + 0 = \theta \sigma^2.$$
Higher autocovariances are all zero:
$$\gamma_j = E(Y_t - \mu)(Y_{t-j} - \mu) = E(\varepsilon_t + \theta \varepsilon_{t-1})(\varepsilon_{t-j} + \theta \varepsilon_{t-j-1}) = 0 \quad \text{for } j > 1.$$
Since the mean and the autocovariances are not functions of time, an $MA(1)$ process is weakly stationary regardless of the value of $\theta$.

1.1.2 Conditions for Ergodicity

It is clear that the condition²
$$\sum_{j=0}^{\infty} |\gamma_j| = [(1 + \theta^2) + |\theta|]\sigma^2 < \infty$$
is satisfied. Thus the $MA(1)$ process is ergodic for any finite value of $\theta$.

Footnote 2: See p. 10 of the previous chapter.
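These moment calculations are easy to verify numerically. The following sketch is an illustration added here, not part of the original derivation; the values $\theta = 0.5$ and $\sigma^2 = 1$ are arbitrary choices, and the helper `sample_autocov` is a name introduced for this example. It simulates a long $MA(1)$ path and compares the sample autocovariances with $(1 + \theta^2)\sigma^2$, $\theta\sigma^2$, and zero.

```python
import numpy as np

rng = np.random.default_rng(0)
theta, sigma2, T = 0.5, 1.0, 1_000_000

eps = rng.normal(scale=np.sqrt(sigma2), size=T + 1)
y = eps[1:] + theta * eps[:-1]          # MA(1) with mu = 0

def sample_autocov(x, j):
    """Sample autocovariance at lag j."""
    x = x - x.mean()
    return (x[j:] * x[:len(x) - j]).sum() / len(x)

print("gamma_0: sample %.4f  theory %.4f" % (sample_autocov(y, 0), (1 + theta**2) * sigma2))
print("gamma_1: sample %.4f  theory %.4f" % (sample_autocov(y, 1), theta * sigma2))
print("gamma_2: sample %.4f  theory %.4f" % (sample_autocov(y, 2), 0.0))
```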
1.1.3 The Dependence Structure

The $j$th autocorrelation of a weakly stationary process is defined as its $j$th autocovariance divided by the variance,
$$r_j = \frac{\gamma_j}{\gamma_0}.$$
By the Cauchy-Schwarz inequality, we have $|r_j| \leq 1$ for all $j$. From the above results, the autocorrelation function of an $MA(1)$ process is
$$r_j = \begin{cases} 1 & \text{when } j = 0 \\ \dfrac{\theta \sigma^2}{(1 + \theta^2)\sigma^2} = \dfrac{\theta}{1 + \theta^2} & \text{when } j = 1 \\ 0 & \text{when } j > 1. \end{cases}$$
The autocorrelation $r_j$ can be plotted as a function of $j$. This plot is usually called the autocorrelogram. See the plots on p. 50 of Hamilton.

1.1.4 The Conditional First Two Moments of the MA(1) Process

Let $F_{t-1}$ denote the information set available at time $t-1$. The conditional mean of $Y_t$ is
$$E(Y_t | F_{t-1}) = E[\varepsilon_t + \theta \varepsilon_{t-1} | F_{t-1}] = \theta \varepsilon_{t-1} + E[\varepsilon_t | F_{t-1}] = \theta \varepsilon_{t-1}, \quad (\text{since } E(\varepsilon_t | F_{t-1}) = 0)$$
and from this result it follows that the conditional variance of $Y_t$ is
$$\sigma_t^2 = Var(Y_t | F_{t-1}) = E\{[Y_t - E(Y_t | F_{t-1})]^2 | F_{t-1}\} = E(\varepsilon_t^2 | F_{t-1}) = \sigma^2.$$
While the conditional mean of $Y_t$ depends upon the information at $t-1$, the conditional variance does not. Engle (1982) proposed a class of models where the variance does depend upon the past and argued for their usefulness in economics. See the later chapter on ARCH models.

1.2 The q-th Order Moving Average Process

A stochastic process $\{Y_t, t \in T\}$ is said to be a moving average process of order $q$ ($MA(q)$) if it can be expressed in the form
$$Y_t = \mu + \varepsilon_t + \theta_1 \varepsilon_{t-1} + \theta_2 \varepsilon_{t-2} + ... + \theta_q \varepsilon_{t-q} = \mu + \sum_{j=0}^{q} \theta_j \varepsilon_{t-j},$$
where $\mu, \theta_0, \theta_1, \theta_2, ..., \theta_q$ are constants with $\theta_0 = 1$ and $\varepsilon_t$ is a white-noise process.
1.2.1 Conditions for Stationarity

The expectation of $Y_t$ is given by
$$E(Y_t) = E(\mu + \varepsilon_t + \theta_1 \varepsilon_{t-1} + \theta_2 \varepsilon_{t-2} + ... + \theta_q \varepsilon_{t-q}) = \mu + E(\varepsilon_t) + \theta_1 E(\varepsilon_{t-1}) + \theta_2 E(\varepsilon_{t-2}) + ... + \theta_q E(\varepsilon_{t-q}) = \mu, \quad \text{for all } t \in T.$$
The variance of $Y_t$ is
$$\gamma_0 = E(Y_t - \mu)^2 = E(\varepsilon_t + \theta_1 \varepsilon_{t-1} + \theta_2 \varepsilon_{t-2} + ... + \theta_q \varepsilon_{t-q})^2.$$
Since the $\varepsilon_t$'s are uncorrelated, the variance of $Y_t$ is
$$\gamma_0 = \sigma^2 + \theta_1^2 \sigma^2 + \theta_2^2 \sigma^2 + ... + \theta_q^2 \sigma^2 = (1 + \theta_1^2 + \theta_2^2 + ... + \theta_q^2)\sigma^2.$$
For $j = 1, 2, ..., q$,
$$\gamma_j = E[(Y_t - \mu)(Y_{t-j} - \mu)] = E[(\varepsilon_t + \theta_1 \varepsilon_{t-1} + ... + \theta_q \varepsilon_{t-q})(\varepsilon_{t-j} + \theta_1 \varepsilon_{t-j-1} + ... + \theta_q \varepsilon_{t-j-q})] = E[\theta_j \varepsilon_{t-j}^2 + \theta_{j+1} \theta_1 \varepsilon_{t-j-1}^2 + \theta_{j+2} \theta_2 \varepsilon_{t-j-2}^2 + ... + \theta_q \theta_{q-j} \varepsilon_{t-q}^2].$$
Terms involving $\varepsilon$'s at different dates have been dropped because their product has expectation zero, and $\theta_0$ is defined to be unity. For $j > q$, there are no $\varepsilon$'s with common dates in the definition of $\gamma_j$, and so the expectation is zero. Thus,
$$\gamma_j = \begin{cases} [\theta_j + \theta_{j+1}\theta_1 + \theta_{j+2}\theta_2 + ... + \theta_q \theta_{q-j}]\sigma^2 & \text{for } j = 1, 2, ..., q \\ 0 & \text{for } j > q. \end{cases}$$
Since the mean and the autocovariances are not functions of time, an $MA(q)$ process is weakly stationary regardless of the values of $\theta_i$, $i = 1, 2, ..., q$. For example, for an $MA(2)$ process,
$$\gamma_0 = [1 + \theta_1^2 + \theta_2^2]\sigma^2, \quad \gamma_1 = [\theta_1 + \theta_2 \theta_1]\sigma^2, \quad \gamma_2 = [\theta_2]\sigma^2, \quad \gamma_3 = \gamma_4 = ... = 0.$$
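The general formula for $\gamma_j$ translates directly into a small routine. A minimal sketch, assuming an arbitrary $MA(2)$ with $\theta_1 = 0.4$ and $\theta_2 = -0.3$; the helper name `ma_autocov` is hypothetical:

```python
import numpy as np

def ma_autocov(theta, sigma2, j):
    """gamma_j = sigma^2 * sum_k theta_{j+k} * theta_k for an MA(q); theta includes theta_0 = 1."""
    q = len(theta) - 1
    if j > q:
        return 0.0
    return sigma2 * sum(theta[j + k] * theta[k] for k in range(q - j + 1))

theta = [1.0, 0.4, -0.3]   # theta_0 = 1, theta_1 = 0.4, theta_2 = -0.3 (arbitrary)
for j in range(4):
    print(j, ma_autocov(theta, 1.0, j))
# gamma_1 = (theta_1 + theta_2*theta_1)*sigma^2 = 0.28, gamma_2 = theta_2*sigma^2 = -0.3, gamma_3 = 0
```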
1.2.2 Conditions for Ergodicity

It is clear that the condition $\sum_{j=0}^{\infty} |\gamma_j| < \infty$ is satisfied. Thus the $MA(q)$ process is ergodic for any finite values of $\theta_i$, $i = 1, 2, ..., q$.

1.2.3 The Dependence Structure

The autocorrelation function is zero after $q$ lags. See the plots in Hamilton.

1.3 The Infinite-Order Moving Average Process

A stochastic process $\{Y_t, t \in T\}$ is said to be an infinite-order moving average process ($MA(\infty)$) if it can be expressed in the form
$$Y_t = \mu + \sum_{j=0}^{\infty} \varphi_j \varepsilon_{t-j} = \mu + \varphi_0 \varepsilon_t + \varphi_1 \varepsilon_{t-1} + \varphi_2 \varepsilon_{t-2} + ... \qquad (1)$$
where $\mu, \varphi_0, \varphi_1, \varphi_2, ...$ are constants with $\varphi_0 = 1$ and $\varepsilon_t$ is a white-noise process.³

Footnote 3: Consider an $MA(\infty)$ process $Y_t = \sum_{j=0}^{\infty} \beta_j u_{t-j}$ where $u_t$ is a white noise with variance $\sigma_u^2$. Without loss of generality, if $\beta_0 \neq 0$, we can simply define
$$\varepsilon_{t-j} = \beta_0 u_{t-j}, \quad \varphi_j = \beta_j \beta_0^{-1}, \quad j = 1, 2, ...,$$
and obtain the representation $Y_t = \sum_{j=0}^{\infty} \varphi_j \varepsilon_{t-j}$, where $\varphi_0 = 1$ and the $\varepsilon_t$ are uncorrelated $(0, \beta_0^2 \sigma_u^2)$ random variables.

1.3.1 Convergence of Infinite Series

Before we discuss the statistical properties of an $MA(\infty)$ process, we need an understanding of the theory of convergence of an infinite series. Let $\{a_j\}$ be a sequence of numbers. Then the formal sum
$$a_0 + a_1 + a_2 + ... + a_n + ...$$
or $\sum_{j=0}^{\infty} a_j$, is called an infinite series. The numbers $a_0, a_1, ..., a_n, ...$ are its terms, and the numbers
$$S_n \equiv \sum_{j=0}^{n} a_j$$
its partial sums. If $\lim S_n$ exists, its value $S$ is called the sum of the series. In this case, we say that the series converges and we write $S \equiv \sum_{j=0}^{\infty} a_j < \infty$. If $\lim S_n$ does not exist, we say that the series diverges.

Theorem:
Suppose that $\sum a_j$ converges. Then $\lim a_j = 0$.

Proof:
Note first that, if $\lim S_n = S$, then $\lim S_{n-1} = S$. Now $a_j = S_j - S_{j-1}$, so
$$\lim a_j = \lim (S_j - S_{j-1}) = \lim S_j - \lim S_{j-1} = S - S = 0.$$

This does not say that, if $\lim a_j = 0$, then $\sum a_j$ converges. Indeed this is not correct. It says that convergence of $\sum a_j$ implies $a_j \to 0$. Hence if $a_j$ does not tend to zero, the series cannot converge. Thus, in a given series $\sum a_j$, we can examine $\lim a_j$. If $\lim a_j = 0$, we have no information about convergence or divergence; but if $\lim a_j \neq 0$, either because it fails to exist or because it exists and has another value, then $\sum a_j$ diverges.

Theorem (Cauchy Criterion):⁴
A necessary and sufficient condition for a series $\sum a_j$ to converge is that, for each $\varsigma > 0$, there exist an $N(\varsigma)$ for which⁵
$$|a_{j+1} + a_{j+2} + ... + a_m| < \varsigma \quad \text{if } m > j > N.$$

Footnote 4: The Cauchy Criterion is an assertion about the behavior of the terms of a sequence. It says that far out in the sequence all of them are close to each other.

Footnote 5: This implies that $\lim (a_{j+1} + a_{j+2} + ... + a_m) = 0 = \lim(a_{j+1}) + \lim(a_{j+2}) + ... + \lim(a_m)$, which is satisfied from the theorem above that if $\sum a_j$ converges, then $\lim a_j = 0$.
Definition:
A sequence $\{a_j\}$ is said to be square-summable if
$$\sum_{j=0}^{\infty} a_j^2 < \infty,$$
whereas a sequence $\{a_j\}$ is said to be absolutely summable if
$$\sum_{j=0}^{\infty} |a_j| < \infty.$$

Proposition:
Absolute summability implies square-summability, but the converse does not hold.

Proof:
First we show that absolute summability implies square-summability. Suppose that $\{a_j\}$ is absolutely summable. Then there exists an $N < \infty$ such that $|a_j| < 1$ for all $j \geq N$,⁶ implying that $a_j^2 < |a_j|$ for all $j \geq N$. Then
$$\sum_{j=0}^{\infty} a_j^2 = \sum_{j=0}^{N-1} a_j^2 + \sum_{j=N}^{\infty} a_j^2 < \sum_{j=0}^{N-1} a_j^2 + \sum_{j=N}^{\infty} |a_j|.$$
But $\sum_{j=0}^{N-1} a_j^2$ is finite, since $N$ is finite, and $\sum_{j=N}^{\infty} |a_j|$ is finite, since $\{a_j\}$ is absolutely summable. Hence $\sum_{j=0}^{\infty} a_j^2 < \infty$. That the converse is not true can be verified by considering $a_j = 1/j$: the series $\sum_{j=1}^{\infty} j^{-2}$ converges, so $\{a_j\}$ is square-summable, while the harmonic series $\sum_{j=1}^{\infty} j^{-1}$ diverges, so $\{a_j\}$ is not absolutely summable.

Footnote 6: Since by assumption $\{a_j\}$ is absolutely summable, then $a_j \to 0$.

Proposition:
Given two absolutely summable sequences $\{a_j\}$ and $\{b_j\}$, the sequences $\{a_j + b_j\}$ and $\{a_j b_j\}$ are absolutely summable. It is also apparent that $\sum_{j=0}^{\infty} (a_j + b_j) < \infty$.

Proof:
$$\sum_{j=0}^{\infty} |a_j + b_j| \leq \sum_{j=0}^{\infty} (|a_j| + |b_j|) = \sum_{j=0}^{\infty} |a_j| + \sum_{j=0}^{\infty} |b_j| < \infty, \qquad (2)$$
$$\sum_{j=0}^{\infty} |a_j b_j| = \sum_{j=0}^{\infty} |a_j| \, |b_j| \leq \left( \sum_{j=0}^{\infty} (|a_j| + |b_j|) \right)^2 < \infty. \qquad (3)$$
Of course, $\left( \sum_{j=0}^{\infty} (|a_j| + |b_j|) \right)^2 < \infty$, since it is the square of the sum of an absolutely summable sequence.
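A quick numerical illustration of the distinction, added here as an aside and using the sequence $a_j = 1/j$ just mentioned (the truncation at $10^6$ terms is of course only suggestive of the limiting behavior):

```python
import numpy as np

j = np.arange(1, 10**6 + 1, dtype=float)
print((1 / j).sum())       # harmonic series: terms -> 0, yet partial sums grow like log n
print((1 / j**2).sum())    # square-summable: partial sums converge toward pi^2/6
print(np.pi**2 / 6)
```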
Proposition:
The convolution of two absolutely summable sequences $\{a_j\}$ and $\{b_j\}$, defined by
$$c_j = \sum_{k=0}^{\infty} a_k b_{j+k},$$
is absolutely summable.

Proof:
$$\sum_{j=0}^{\infty} |c_j| \leq \sum_{j=0}^{\infty} \sum_{k=0}^{\infty} |a_k| \, |b_{j+k}| \leq \sum_{k=0}^{\infty} \sum_{s=0}^{\infty} |a_k| \, |b_s| < \infty. \qquad (4)$$

Proposition:
If $\sum_{j=0}^{\infty} |a_j|$ converges, so does $\sum_{j=0}^{\infty} a_j$.

1.3.2 Is This a Well Defined Random Sequence?

Proposition:
If the coefficients of the $MA(\infty)$ in (1) are square-summable, then $\sum_{j=0}^{T} \varphi_j \varepsilon_{t-j}$ converges in mean square to some random variable $Z_t$ (say) as $T \to \infty$.

Proof:
The Cauchy criterion states that $\sum_{j=0}^{T} \varphi_j \varepsilon_{t-j}$ converges in mean square⁷ to some random variable $Z_t$ as $T \to \infty$ if and only if, for any $\varsigma > 0$, there exists a suitably large $N$ such that for any integer $M > N$
$$E\left[ \sum_{j=0}^{M} \varphi_j \varepsilon_{t-j} - \sum_{j=0}^{N} \varphi_j \varepsilon_{t-j} \right]^2 < \varsigma. \qquad (5)$$

Footnote 7: Since here we require that $Z_t$ be a covariance-stationary process, we use the convergence-in-mean-square form of the Cauchy criterion, which guarantees the existence of second moments of $Y_t$.
In words, once $N$ terms have been summed, the difference between that sum and the one obtained from summing to $M$ is a random variable whose mean and variance are both arbitrarily close to zero. Now the left hand side of (5) is simply
$$E[\varphi_M \varepsilon_{t-M} + \varphi_{M-1} \varepsilon_{t-M+1} + ... + \varphi_{N+1} \varepsilon_{t-N-1}]^2 = (\varphi_M^2 + \varphi_{M-1}^2 + ... + \varphi_{N+1}^2)\sigma^2 = \left[ \sum_{j=0}^{M} \varphi_j^2 - \sum_{j=0}^{N} \varphi_j^2 \right] \sigma^2. \qquad (6)$$
But if $\sum_{j=0}^{\infty} \varphi_j^2 < \infty$, then by the Cauchy criterion the right side of (6) may be made as small as desired by a suitably large $N$. Thus the $MA(\infty)$ is a well defined sequence, since the infinite series $\sum_{j=0}^{\infty} \varphi_j \varepsilon_{t-j}$ converges in mean square. So the $MA(\infty)$ process $Y_t$ ($= \mu + Z_t$) is a well defined random variable with finite second moment.

1.3.3 Check Stationarity

Assume the $MA(\infty)$ process to be one with absolutely summable coefficients. The expectation of $Y_t$ is given by
$$E(Y_t) = \lim_{T \to \infty} E(\mu + \varphi_0 \varepsilon_t + \varphi_1 \varepsilon_{t-1} + \varphi_2 \varepsilon_{t-2} + ... + \varphi_T \varepsilon_{t-T}) = \mu.$$
The variance of $Y_t$ is
$$\gamma_0 = E(Y_t - \mu)^2 = \lim_{T \to \infty} E(\varphi_0 \varepsilon_t + \varphi_1 \varepsilon_{t-1} + \varphi_2 \varepsilon_{t-2} + ... + \varphi_T \varepsilon_{t-T})^2 = \lim_{T \to \infty} (\varphi_0^2 + \varphi_1^2 + \varphi_2^2 + ... + \varphi_T^2)\sigma^2 < \infty,$$
from the assumption of absolutely summable coefficients.
For $j > 0$,
$$\gamma_j = E(Y_t - \mu)(Y_{t-j} - \mu) = (\varphi_j \varphi_0 + \varphi_{j+1} \varphi_1 + \varphi_{j+2} \varphi_2 + \varphi_{j+3} \varphi_3 + ...)\sigma^2 = \sigma^2 \sum_{k=0}^{\infty} \varphi_{j+k} \varphi_k \leq \sigma^2 \sum_{k=0}^{\infty} |\varphi_{j+k} \varphi_k| < \infty \quad \text{(from (3))}.$$
Thus, $E(Y_t)$ and $\gamma_j$ are both finite and independent of $t$. The $MA(\infty)$ process with absolutely summable coefficients is weakly stationary.

1.3.4 Check Ergodicity

Proposition:
The absolute summability of the moving average coefficients implies that the process is ergodic.

Proof:
From the results of (4), or recalling that the autocovariance of an $MA(\infty)$ is
$$\gamma_j = \sigma^2 \sum_{k=0}^{\infty} \varphi_{j+k} \varphi_k,$$
we have
$$|\gamma_j| = \sigma^2 \left| \sum_{k=0}^{\infty} \varphi_{j+k} \varphi_k \right| \leq \sigma^2 \sum_{k=0}^{\infty} |\varphi_{j+k} \varphi_k|,$$
and
$$\sum_{j=0}^{\infty} |\gamma_j| \leq \sigma^2 \sum_{j=0}^{\infty} \sum_{k=0}^{\infty} |\varphi_{j+k} \varphi_k| = \sigma^2 \sum_{j=0}^{\infty} \sum_{k=0}^{\infty} |\varphi_k| \, |\varphi_{j+k}|.$$
But there exists an $M < \infty$ such that $\sum_{j=0}^{\infty} |\varphi_j| < M$, and therefore $\sum_{j=0}^{\infty} |\varphi_{j+k}| < M$ for $k = 0, 1, 2, ...$, meaning that
$$\sum_{j=0}^{\infty} |\gamma_j| < \sigma^2 \sum_{k=0}^{\infty} |\varphi_k| M < \sigma^2 M^2 < \infty.$$
Hence, the $MA(\infty)$ process with absolutely summable coefficients is ergodic.
2 Autoregressive Process

2.1 The First-Order Autoregressive Process

A stochastic process $\{Y_t, t \in T\}$ is said to be a first-order autoregressive process ($AR(1)$) if it can be expressed in the form
$$Y_t = c + \phi Y_{t-1} + \varepsilon_t,$$
where $c$ and $\phi$ are constants and $\varepsilon_t$ is a white-noise process.

2.1.1 Check Stationarity and Ergodicity

Write the $AR(1)$ process in lag operator form:
$$Y_t = c + \phi L Y_t + \varepsilon_t,$$
then
$$(1 - \phi L) Y_t = c + \varepsilon_t.$$
In the case $|\phi| < 1$, we know from the properties of the lag operator in the last chapter that
$$(1 - \phi L)^{-1} = 1 + \phi L + \phi^2 L^2 + ...,$$
thus
$$Y_t = (c + \varepsilon_t)(1 + \phi L + \phi^2 L^2 + ...) = (c + \phi L c + \phi^2 L^2 c + ...) + (\varepsilon_t + \phi L \varepsilon_t + \phi^2 L^2 \varepsilon_t + ...) = (c + \phi c + \phi^2 c + ...) + (\varepsilon_t + \phi \varepsilon_{t-1} + \phi^2 \varepsilon_{t-2} + ...) = \frac{c}{1 - \phi} + \varepsilon_t + \phi \varepsilon_{t-1} + \phi^2 \varepsilon_{t-2} + ....$$
This can be viewed as an $MA(\infty)$ process with $\varphi_j$ given by $\phi^j$. When $|\phi| < 1$, this $AR(1)$ is an $MA(\infty)$ with absolutely summable coefficients:
$$\sum_{j=0}^{\infty} |\varphi_j| = \sum_{j=0}^{\infty} |\phi|^j = \frac{1}{1 - |\phi|} < \infty.$$
Therefore, the $AR(1)$ process is stationary and ergodic provided that $|\phi| < 1$.
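The geometric-series inversion of $(1 - \phi L)$ can be checked by simulation. A sketch under arbitrary assumptions $\phi = 0.8$ and $c = 1$; the truncation order $J$ is a tuning choice, and the comparison is approximate only because the $MA(\infty)$ sum is truncated:

```python
import numpy as np

rng = np.random.default_rng(1)
phi, c, T, J = 0.8, 1.0, 200, 100      # J = truncation order for the MA(infinity) form
mu = c / (1 - phi)

eps = rng.normal(size=T + J)
# Recursive AR(1): Y_t = c + phi*Y_{t-1} + eps_t, burned in over the first J shocks
y = np.empty(T + J)
prev = mu
for t in range(T + J):
    prev = c + phi * prev + eps[t]
    y[t] = prev

# Truncated MA(infinity): Y_t ~ mu + sum_{j=0}^{J-1} phi^j eps_{t-j}
w = phi ** np.arange(J)
y_ma = np.array([mu + w @ eps[t - np.arange(J)] for t in range(J, T + J)])

print(np.max(np.abs(y[J:] - y_ma)))    # on the order of phi^J: the discarded tail is negligible
```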
2.1.2 The Dependence Structure

The expectation of $Y_t$ is given by⁸
$$E(Y_t) = E\left( \frac{c}{1 - \phi} + \varepsilon_t + \phi \varepsilon_{t-1} + \phi^2 \varepsilon_{t-2} + ... \right) = \frac{c}{1 - \phi} = \mu.$$
The variance of $Y_t$ is
$$\gamma_0 = E(Y_t - \mu)^2 = E(\varepsilon_t + \phi \varepsilon_{t-1} + \phi^2 \varepsilon_{t-2} + ...)^2 = (1 + \phi^2 + \phi^4 + ...)\sigma^2 = \left( \frac{1}{1 - \phi^2} \right) \sigma^2.$$
For $j > 0$,
$$\gamma_j = E(Y_t - \mu)(Y_{t-j} - \mu) = E[(\varepsilon_t + \phi \varepsilon_{t-1} + ... + \phi^j \varepsilon_{t-j} + \phi^{j+1} \varepsilon_{t-j-1} + \phi^{j+2} \varepsilon_{t-j-2} + ...)(\varepsilon_{t-j} + \phi \varepsilon_{t-j-1} + \phi^2 \varepsilon_{t-j-2} + ...)] = (\phi^j + \phi^{j+2} + \phi^{j+4} + ...)\sigma^2 = \phi^j (1 + \phi^2 + \phi^4 + ...)\sigma^2 = \left( \frac{\phi^j}{1 - \phi^2} \right) \sigma^2 = \phi \gamma_{j-1}.$$
It follows that the autocorrelation function
$$r_j = \frac{\gamma_j}{\gamma_0} = \phi^j,$$
which follows a pattern of geometric decay, as the plots in Hamilton show.

Footnote 8: Therefore, it is noted that while $\mu$ is the mean of an MA process, the constant $c$ is not the mean of an AR process.
2.1.3 An Alternative Way to Calculate the Moments of a Stationary AR(1) Process

Assume that the $AR(1)$ process under consideration is weakly stationary; then taking expectations on both sides we have
$$E(Y_t) = c + \phi E(Y_{t-1}) + E(\varepsilon_t).$$
Since by assumption the process is stationary,
$$E(Y_t) = E(Y_{t-1}) = \mu.$$
Therefore,
$$\mu = c + \phi \mu + 0 \quad \text{or} \quad \mu = \frac{c}{1 - \phi},$$
reproducing the earlier result.

To find higher moments of $Y_t$ in an analogous manner, we rewrite this $AR(1)$ as
$$Y_t = \mu(1 - \phi) + \phi Y_{t-1} + \varepsilon_t$$
or
$$(Y_t - \mu) = \phi (Y_{t-1} - \mu) + \varepsilon_t. \qquad (7)$$
For $j \geq 0$, multiply both sides of (7) by $(Y_{t-j} - \mu)$ and take expectations:
$$\gamma_j = E[(Y_t - \mu)(Y_{t-j} - \mu)] = \phi E[(Y_{t-1} - \mu)(Y_{t-j} - \mu)] + E(Y_{t-j} - \mu)\varepsilon_t = \phi \gamma_{j-1} + E(Y_{t-j} - \mu)\varepsilon_t.$$
Next we consider the term $E(Y_{t-j} - \mu)\varepsilon_t$. When $j = 0$, multiply both sides of (7) by $\varepsilon_t$ and take expectations:
$$E(Y_t - \mu)\varepsilon_t = E[\phi (Y_{t-1} - \mu)\varepsilon_t] + E(\varepsilon_t^2).$$
Recall that $Y_{t-1} - \mu$ is a linear function of $\varepsilon_{t-1}, \varepsilon_{t-2}, ...$:
$$Y_{t-1} - \mu = \varepsilon_{t-1} + \phi \varepsilon_{t-2} + \phi^2 \varepsilon_{t-3} + ...,$$
so we have $E[\phi (Y_{t-1} - \mu)\varepsilon_t] = 0$. Therefore, $E(Y_t - \mu)\varepsilon_t = E(\varepsilon_t^2) = \sigma^2$, and when $j > 0$, it is obvious that $E(Y_{t-j} - \mu)\varepsilon_t = 0$. Therefore we obtain the results
$$\gamma_0 = \phi \gamma_1 + \sigma^2, \quad \text{for } j = 0,$$
$$\gamma_1 = \phi \gamma_0, \quad \text{for } j = 1,$$
$$\gamma_j = \phi \gamma_{j-1}, \quad \text{for } j > 1.$$
That is,
$$\gamma_0 = \phi (\phi \gamma_0) + \sigma^2 = \frac{\sigma^2}{1 - \phi^2}.$$
Besides $\gamma_0$, we need the first autocovariance ($\gamma_1$) to solve for $\gamma_0$.

2.2 The Second-Order Autoregressive Process

A stochastic process $\{Y_t, t \in T\}$ is said to be a second-order autoregressive process ($AR(2)$) if it can be expressed in the form
$$Y_t = c + \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + \varepsilon_t,$$
where $c$, $\phi_1$ and $\phi_2$ are constants and $\varepsilon_t$ is a white-noise process.

2.2.1 Check Stationarity and Ergodicity

Write the $AR(2)$ process in lag operator form:
$$Y_t = c + \phi_1 L Y_t + \phi_2 L^2 Y_t + \varepsilon_t,$$
then
$$(1 - \phi_1 L - \phi_2 L^2) Y_t = c + \varepsilon_t.$$
In the case that all the roots of the polynomial $(1 - \phi_1 L - \phi_2 L^2) = 0$ lie outside the unit circle, we know from the properties of the lag operator in the last chapter that there exists a polynomial $\varphi(L)$ such that
$$\varphi(L) = (1 - \phi_1 L - \phi_2 L^2)^{-1} = \varphi_0 + \varphi_1 L + \varphi_2 L^2 + ..., \quad \text{with } \sum_{j=0}^{\infty} |\varphi_j| < \infty.$$

Proof:
Here $\varphi_j$ is equal to $c_1 \lambda_1^j + c_2 \lambda_2^j$, where $c_1 + c_2 = 1$ and $\lambda_1, \lambda_2$ are the reciprocals of the roots of the polynomial $(1 - \phi_1 L - \phi_2 L^2) = 0$. Therefore, $\lambda_1$ and $\lambda_2$ lie inside the unit circle. See Hamilton, p. 33, [2.3.23]. Thus
$$\sum_{j=0}^{\infty} |\varphi_j| = \sum_{j=0}^{\infty} |c_1 \lambda_1^j + c_2 \lambda_2^j| \leq \sum_{j=0}^{\infty} \left( |c_1| \, |\lambda_1|^j + |c_2| \, |\lambda_2|^j \right) < \infty.$$

Then
$$Y_t = (c + \varepsilon_t)(1 + \varphi_1 L + \varphi_2 L^2 + ...) = (c + \varphi_1 L c + \varphi_2 L^2 c + ...) + (\varepsilon_t + \varphi_1 L \varepsilon_t + \varphi_2 L^2 \varepsilon_t + ...) = c(1 + \varphi_1 + \varphi_2 + ...) + (\varepsilon_t + \varphi_1 \varepsilon_{t-1} + \varphi_2 \varepsilon_{t-2} + ...) = \frac{c}{1 - \phi_1 - \phi_2} + \varepsilon_t + \varphi_1 \varepsilon_{t-1} + \varphi_2 \varepsilon_{t-2} + ...,$$
where the constant term follows from substituting $L = 1$ into the identity $(1 - \phi_1 L - \phi_2 L^2)^{-1} = 1 + \varphi_1 L + \varphi_2 L^2 + ....$
This can be viewed as an $MA(\infty)$ process with absolutely summable coefficients. Therefore, the $AR(2)$ process is stationary and ergodic provided that all the roots of $(1 - \phi_1 L - \phi_2 L^2) = 0$ lie outside the unit circle.

2.2.2 The Dependence Structure

Assume that the $AR(2)$ process under consideration is weakly stationary; then taking expectations on both sides we have
$$E(Y_t) = c + \phi_1 E(Y_{t-1}) + \phi_2 E(Y_{t-2}) + E(\varepsilon_t).$$
Since by assumption the process is stationary,
$$E(Y_t) = E(Y_{t-1}) = E(Y_{t-2}) = \mu.$$
Therefore,
$$\mu = c + \phi_1 \mu + \phi_2 \mu + 0 \quad \text{or} \quad \mu = \frac{c}{1 - \phi_1 - \phi_2}.$$
To find the higher moments of $Y_t$ in an analogous manner, we rewrite this $AR(2)$ as
$$Y_t = \mu(1 - \phi_1 - \phi_2) + \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + \varepsilon_t$$
or
$$(Y_t - \mu) = \phi_1 (Y_{t-1} - \mu) + \phi_2 (Y_{t-2} - \mu) + \varepsilon_t. \qquad (8)$$
For $j \geq 0$, multiply both sides of (8) by $(Y_{t-j} - \mu)$ and take expectations:
$$\gamma_j = E[(Y_t - \mu)(Y_{t-j} - \mu)] = \phi_1 E[(Y_{t-1} - \mu)(Y_{t-j} - \mu)] + \phi_2 E[(Y_{t-2} - \mu)(Y_{t-j} - \mu)] + E(Y_{t-j} - \mu)\varepsilon_t = \phi_1 \gamma_{j-1} + \phi_2 \gamma_{j-2} + E(Y_{t-j} - \mu)\varepsilon_t.$$
Next we consider the term $E(Y_{t-j} - \mu)\varepsilon_t$. When $j = 0$, multiply both sides of (8) by $\varepsilon_t$ and take expectations:
$$E(Y_t - \mu)\varepsilon_t = E[\phi_1 (Y_{t-1} - \mu)\varepsilon_t] + E[\phi_2 (Y_{t-2} - \mu)\varepsilon_t] + E(\varepsilon_t^2).$$
Recall that $Y_{t-1} - \mu$ is a linear function of $\varepsilon_{t-1}, \varepsilon_{t-2}, ...$; we have $E[\phi_1 (Y_{t-1} - \mu)\varepsilon_t] = 0$, and obviously $E[\phi_2 (Y_{t-2} - \mu)\varepsilon_t] = 0$ also. Therefore, $E(Y_t - \mu)\varepsilon_t = E(\varepsilon_t^2) = \sigma^2$, and when $j > 0$, it is obvious that $E(Y_{t-j} - \mu)\varepsilon_t = 0$. Therefore we obtain the results
$$\gamma_0 = \phi_1 \gamma_1 + \phi_2 \gamma_2 + \sigma^2, \quad \text{for } j = 0;$$
$$\gamma_1 = \phi_1 \gamma_0 + \phi_2 \gamma_1, \quad \text{for } j = 1;$$
$$\gamma_2 = \phi_1 \gamma_1 + \phi_2 \gamma_0, \quad \text{for } j = 2;$$
$$\gamma_j = \phi_1 \gamma_{j-1} + \phi_2 \gamma_{j-2}, \quad \text{for } j > 2.$$
That is,
$$\gamma_1 = \frac{\phi_1}{1 - \phi_2} \gamma_0, \qquad (9)$$
$$\gamma_2 = \frac{\phi_1^2}{1 - \phi_2} \gamma_0 + \phi_2 \gamma_0, \qquad (10)$$
and therefore
$$\gamma_0 = \left[ \frac{\phi_1^2}{1 - \phi_2} + \frac{\phi_2 \phi_1^2}{1 - \phi_2} + \phi_2^2 \right] \gamma_0 + \sigma^2,$$
or
$$\gamma_0 = \frac{(1 - \phi_2)\sigma^2}{(1 + \phi_2)[(1 - \phi_2)^2 - \phi_1^2]}.$$
Substituting this result into (9) and (10), we obtain $\gamma_1$ and $\gamma_2$. Besides $\gamma_0$, we need the first two autocovariances ($\gamma_1$ and $\gamma_2$) to solve for $\gamma_0$.
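The three equations for $\gamma_0, \gamma_1, \gamma_2$ are linear, so they can also be solved directly as a system. A sketch, assuming arbitrary stationary parameter values $\phi_1 = 0.5$, $\phi_2 = 0.3$, which checks the numerical solution against the closed form above:

```python
import numpy as np

phi1, phi2, sigma2 = 0.5, 0.3, 1.0     # arbitrary parameters satisfying stationarity

# gamma_0 - phi1*gamma_1 - phi2*gamma_2 = sigma^2
# -phi1*gamma_0 + (1 - phi2)*gamma_1   = 0
# -phi2*gamma_0 - phi1*gamma_1 + gamma_2 = 0
A = np.array([[1.0,   -phi1,       -phi2],
              [-phi1,  1.0 - phi2,  0.0],
              [-phi2, -phi1,        1.0]])
g0, g1, g2 = np.linalg.solve(A, np.array([sigma2, 0.0, 0.0]))

# Closed form derived in the text
g0_closed = (1 - phi2) * sigma2 / ((1 + phi2) * ((1 - phi2) ** 2 - phi1 ** 2))
print(g0, g0_closed)                    # identical up to rounding
print(g1, phi1 * g0 / (1 - phi2))       # matches equation (9)
```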
2.3 The pth-Order Autoregressive Process

A stochastic process $\{Y_t, t \in T\}$ is said to be a $p$th-order autoregressive process ($AR(p)$) if it can be expressed in the form
$$Y_t = c + \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + ... + \phi_p Y_{t-p} + \varepsilon_t,$$
where $c, \phi_1, \phi_2, ...,$ and $\phi_p$ are constants and $\varepsilon_t$ is a white-noise process.

2.3.1 Check Stationarity and Ergodicity

Write the $AR(p)$ process in lag operator form:
$$Y_t = c + \phi_1 L Y_t + \phi_2 L^2 Y_t + ... + \phi_p L^p Y_t + \varepsilon_t,$$
then
$$(1 - \phi_1 L - \phi_2 L^2 - ... - \phi_p L^p) Y_t = c + \varepsilon_t.$$
In the case that all the roots of the polynomial $(1 - \phi_1 L - \phi_2 L^2 - ... - \phi_p L^p) = 0$ lie outside the unit circle, we know from the properties of the lag operator in the last chapter that there exists a polynomial $\varphi(L)$ such that
$$\varphi(L) = (1 - \phi_1 L - \phi_2 L^2 - ... - \phi_p L^p)^{-1} = \varphi_0 + \varphi_1 L + \varphi_2 L^2 + ..., \quad \text{with } \sum_{j=0}^{\infty} |\varphi_j| < \infty.$$
Thus
$$Y_t = (c + \varepsilon_t)(1 + \varphi_1 L + \varphi_2 L^2 + ...) = c(1 + \varphi_1 + \varphi_2 + ...) + (\varepsilon_t + \varphi_1 \varepsilon_{t-1} + \varphi_2 \varepsilon_{t-2} + ...) = \frac{c}{1 - \phi_1 - \phi_2 - ... - \phi_p} + \varepsilon_t + \varphi_1 \varepsilon_{t-1} + \varphi_2 \varepsilon_{t-2} + ...,$$
where the constant term follows from substituting $L = 1$ into the identity $(1 - \phi_1 L - \phi_2 L^2 - ... - \phi_p L^p)^{-1} = 1 + \varphi_1 L + \varphi_2 L^2 + ....$ This can be viewed as an $MA(\infty)$ process with absolutely summable coefficients. Therefore, the $AR(p)$ process is stationary and ergodic provided that all the roots of $(1 - \phi_1 L - \phi_2 L^2 - ... - \phi_p L^p) = 0$ lie outside the unit circle.
2.3.2 The Dependence Structure

Assume that the $AR(p)$ process under consideration is weakly stationary; then taking expectations on both sides we have
$$E(Y_t) = c + \phi_1 E(Y_{t-1}) + \phi_2 E(Y_{t-2}) + ... + \phi_p E(Y_{t-p}) + E(\varepsilon_t).$$
Since by assumption the process is stationary,
$$E(Y_t) = E(Y_{t-1}) = E(Y_{t-2}) = ... = E(Y_{t-p}) = \mu.$$
Therefore,
$$\mu = c + \phi_1 \mu + \phi_2 \mu + ... + \phi_p \mu + 0 \quad \text{or} \quad \mu = \frac{c}{1 - \phi_1 - \phi_2 - ... - \phi_p}.$$
To find the higher moments of $Y_t$ in an analogous manner, we rewrite this $AR(p)$ as
$$Y_t = \mu(1 - \phi_1 - \phi_2 - ... - \phi_p) + \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + ... + \phi_p Y_{t-p} + \varepsilon_t$$
or
$$(Y_t - \mu) = \phi_1 (Y_{t-1} - \mu) + \phi_2 (Y_{t-2} - \mu) + ... + \phi_p (Y_{t-p} - \mu) + \varepsilon_t. \qquad (11)$$
For $j \geq 0$, multiply both sides of (11) by $(Y_{t-j} - \mu)$ and take expectations:
$$\gamma_j = E[(Y_t - \mu)(Y_{t-j} - \mu)] = \phi_1 \gamma_{j-1} + \phi_2 \gamma_{j-2} + ... + \phi_p \gamma_{j-p} + E(Y_{t-j} - \mu)\varepsilon_t.$$
Next we consider the term $E(Y_{t-j} - \mu)\varepsilon_t$. When $j = 0$, multiply both sides of (11) by $\varepsilon_t$ and take expectations:
$$E(Y_t - \mu)\varepsilon_t = E[\phi_1 (Y_{t-1} - \mu)\varepsilon_t] + E[\phi_2 (Y_{t-2} - \mu)\varepsilon_t] + ... + E[\phi_p (Y_{t-p} - \mu)\varepsilon_t] + E(\varepsilon_t^2).$$
Recall that $Y_{t-1} - \mu$ is a linear function of $\varepsilon_{t-1}, \varepsilon_{t-2}, ...$; we have $E[\phi_1 (Y_{t-1} - \mu)\varepsilon_t] = 0$ and obviously $E[\phi_i (Y_{t-i} - \mu)\varepsilon_t] = 0$, $i = 2, ..., p$, also. Therefore, $E(Y_t - \mu)\varepsilon_t = E(\varepsilon_t^2) = \sigma^2$, and when $j > 0$, it is obvious that $E(Y_{t-j} - \mu)\varepsilon_t = 0$. Therefore we obtain the results
$$\gamma_0 = \phi_1 \gamma_1 + \phi_2 \gamma_2 + ... + \phi_p \gamma_p + \sigma^2, \quad \text{for } j = 0;$$
and
$$\gamma_j = \phi_1 \gamma_{j-1} + \phi_2 \gamma_{j-2} + ... + \phi_p \gamma_{j-p}, \quad \text{for } j = 1, 2, .... \qquad (12)$$
Dividing (12) by $\gamma_0$ produces the Yule-Walker equations:
$$r_j = \phi_1 r_{j-1} + \phi_2 r_{j-2} + ... + \phi_p r_{j-p}, \quad \text{for } j = 1, 2, .... \qquad (13)$$
Intuitively, besides $\gamma_0$, we need the first $p$ autocovariances ($\gamma_1, \gamma_2, ..., \gamma_p$) to solve for $\gamma_0$.

Exercise:
Use the same realization of $\varepsilon$'s to simulate and plot the following Gaussian processes $Y_t$ (set $\sigma^2 = 1$) in a sample of size $T = 500$:
(1). $Y_t = \varepsilon_t$,
(2). $Y_t = \varepsilon_t + 0.8 \varepsilon_{t-1}$,
(3). $Y_t = \varepsilon_t - 0.8 \varepsilon_{t-1}$,
(4). $Y_t = 0.8 Y_{t-1} + \varepsilon_t$,
(5). $Y_t = -0.8 Y_{t-1} + \varepsilon_t$,
(6). $\tilde{Y}_t = \tilde{\varepsilon}_t + (1/0.8) \tilde{\varepsilon}_{t-1}$, where $Var(\tilde{\varepsilon}_t) = 0.8^2 = 0.64$.
Finally, please also plot their sample and population autocorrelograms. One way to carry the simulation out is sketched below.
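A sketch of how the exercise might be carried out in Python. The parameters of process (6) follow the reconstruction above (the noninvertible twin of process (2), anticipating Section 5) and should be treated as an assumption; the sample autocorrelograms can be computed with the sample-autocovariance helper shown earlier.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(42)
T = 500
eps = rng.normal(size=T + 1)            # one common realization of the shocks

def ar1(phi, e):
    """Recursive AR(1) started at zero: y_t = phi*y_{t-1} + e_t."""
    y = np.zeros(len(e))
    for t in range(1, len(e)):
        y[t] = phi * y[t - 1] + e[t]
    return y

series = {
    "(1) Y_t = eps_t":                 eps[1:],
    "(2) Y_t = eps_t + 0.8 eps_{t-1}": eps[1:] + 0.8 * eps[:-1],
    "(3) Y_t = eps_t - 0.8 eps_{t-1}": eps[1:] - 0.8 * eps[:-1],
    "(4) Y_t = 0.8 Y_{t-1} + eps_t":   ar1(0.8, eps)[1:],
    "(5) Y_t = -0.8 Y_{t-1} + eps_t":  ar1(-0.8, eps)[1:],
    # (6): assumed noninvertible twin of (2); eps~ = 0.8*eps, so Y~ = 0.8 eps_t + eps_{t-1}
    "(6) noninvertible MA(1)":         0.8 * eps[1:] + eps[:-1],
}

fig, axes = plt.subplots(len(series), 1, figsize=(8, 12), sharex=True)
for ax, (name, y) in zip(axes, series.items()):
    ax.plot(y)
    ax.set_title(name, fontsize=9)
plt.tight_layout()
plt.show()
```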
3 Mixed Autoregressive Moving Average Process

The dependence structure described by an $MA(q)$ process is truncated after the first $q$ periods, while it decays geometrically in an $AR(p)$ process, depending on its AR coefficients. Richer flexibility in the dependence structure at the first few lags is called for to match real phenomena. An $ARMA(p, q)$ model meets this requirement.

A stochastic process $\{Y_t, t \in T\}$ is said to be an autoregressive moving average process of order $(p, q)$ ($ARMA(p, q)$) if it can be expressed in the form
$$Y_t = c + \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + ... + \phi_p Y_{t-p} + \varepsilon_t + \theta_1 \varepsilon_{t-1} + \theta_2 \varepsilon_{t-2} + ... + \theta_q \varepsilon_{t-q},$$
where $c, \phi_1, \phi_2, ..., \phi_p$, and $\theta_1, ..., \theta_q$ are constants and $\varepsilon_t$ is a white-noise process.

3.1 Check Stationarity and Ergodicity

Write the $ARMA(p, q)$ process in lag operator form:
$$(1 - \phi_1 L - \phi_2 L^2 - ... - \phi_p L^p) Y_t = c + (1 + \theta_1 L + \theta_2 L^2 + ... + \theta_q L^q) \varepsilon_t.$$
In the case that all the roots of the polynomial $(1 - \phi_1 L - \phi_2 L^2 - ... - \phi_p L^p) = 0$ lie outside the unit circle, we know from the properties of the lag operator in the last chapter that there exists a polynomial $\varphi(L)$ such that
$$\varphi(L) = (1 - \phi_1 L - \phi_2 L^2 - ... - \phi_p L^p)^{-1} (1 + \theta_1 L + \theta_2 L^2 + ... + \theta_q L^q) = \varphi_0 + \varphi_1 L + \varphi_2 L^2 + ..., \quad \text{with } \sum_{j=0}^{\infty} |\varphi_j| < \infty.$$
Thus
$$Y_t = \mu + \varphi(L) \varepsilon_t, \quad \text{where} \quad \varphi(L) = \frac{1 + \theta_1 L + \theta_2 L^2 + ... + \theta_q L^q}{1 - \phi_1 L - \phi_2 L^2 - ... - \phi_p L^p}, \quad \sum_{j=0}^{\infty} |\varphi_j| < \infty, \quad \mu = \frac{c}{1 - \phi_1 - \phi_2 - ... - \phi_p}.$$
This can be viewed as an $MA(\infty)$ process with absolutely summable coefficients. Therefore, the $ARMA(p, q)$ process is stationary and ergodic provided that all the roots of $(1 - \phi_1 L - \phi_2 L^2 - ... - \phi_p L^p) = 0$ lie outside the unit circle. Thus, the stationarity of an $ARMA(p, q)$ process depends entirely on the autoregressive parameters $(\phi_1, \phi_2, ..., \phi_p)$ and not on the moving average parameters $(\theta_1, \theta_2, ..., \theta_q)$.

3.2 The Dependence Structure

Assume that the $ARMA(p, q)$ process under consideration is weakly stationary; then taking expectations on both sides we have
$$E(Y_t) = c + \phi_1 E(Y_{t-1}) + \phi_2 E(Y_{t-2}) + ... + \phi_p E(Y_{t-p}) + E(\varepsilon_t) + \theta_1 E(\varepsilon_{t-1}) + ... + \theta_q E(\varepsilon_{t-q}).$$
Since by assumption the process is stationary,
$$E(Y_t) = E(Y_{t-1}) = E(Y_{t-2}) = ... = E(Y_{t-p}) = \mu.$$
Therefore,
$$\mu = c + \phi_1 \mu + \phi_2 \mu + ... + \phi_p \mu \quad \text{or} \quad \mu = \frac{c}{1 - \phi_1 - \phi_2 - ... - \phi_p}.$$
To find the higher moments of $Y_t$ in an analogous manner, we rewrite this $ARMA(p, q)$ as
$$Y_t = \mu(1 - \phi_1 - \phi_2 - ... - \phi_p) + \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + ... + \phi_p Y_{t-p} + \varepsilon_t + \theta_1 \varepsilon_{t-1} + ... + \theta_q \varepsilon_{t-q}$$
or
$$(Y_t - \mu) = \phi_1 (Y_{t-1} - \mu) + \phi_2 (Y_{t-2} - \mu) + ... + \phi_p (Y_{t-p} - \mu) + \varepsilon_t + \theta_1 \varepsilon_{t-1} + ... + \theta_q \varepsilon_{t-q}. \qquad (14)$$
For $j \geq 0$, multiply both sides of (14) by $(Y_{t-j} - \mu)$ and take expectations:
$$\gamma_j = \phi_1 \gamma_{j-1} + \phi_2 \gamma_{j-2} + ... + \phi_p \gamma_{j-p} + E(Y_{t-j} - \mu)\varepsilon_t + \theta_1 E(Y_{t-j} - \mu)\varepsilon_{t-1} + ... + \theta_q E(Y_{t-j} - \mu)\varepsilon_{t-q}.$$
It is obvious that the term $E(Y_{t-j} - \mu)\varepsilon_t + \theta_1 E(Y_{t-j} - \mu)\varepsilon_{t-1} + ... + \theta_q E(Y_{t-j} - \mu)\varepsilon_{t-q} = 0$ when $j > q$. Therefore we obtain the result
$$\gamma_j = \phi_1 \gamma_{j-1} + \phi_2 \gamma_{j-2} + ... + \phi_p \gamma_{j-p}, \quad \text{for } j = q+1, q+2, .... \qquad (15)$$
Thus, after $q$ lags the autocovariance function $\gamma_j$ follows the $p$th-order difference equation governed by the autoregressive coefficients. However, (15) does not hold for $j \leq q$, owing to correlation between $\theta_j \varepsilon_{t-j}$ and $Y_{t-j}$. Hence, an $ARMA(p, q)$ process will have a more complicated autocovariance function from lag 1 through $q$ than would the corresponding $AR(p)$ process.

3.3 Common Factor

There is a potential for redundant parameterizations with ARMA processes. Consider factoring the lag polynomial operators in an $ARMA(p, q)$ process:
$$(1 - \lambda_1 L)(1 - \lambda_2 L)...(1 - \lambda_p L) Y_t = (1 - \eta_1 L)(1 - \eta_2 L)...(1 - \eta_q L) \varepsilon_t. \qquad (16)$$
We assume that $|\lambda_i| < 1$ for all $i$, so that the process is covariance-stationary. If the autoregressive operator $(1 - \phi_1 L - \phi_2 L^2 - ... - \phi_p L^p)$ and the moving average operator $(1 + \theta_1 L + \theta_2 L^2 + ... + \theta_q L^q)$ have any roots in common, say, $\lambda_i = \eta_j$ for some $i$ and $j$, then both sides of (16) can be divided by $(1 - \lambda_i L)$:
$$\prod_{k=1, k \neq i}^{p} (1 - \lambda_k L) Y_t = \prod_{k=1, k \neq j}^{q} (1 - \eta_k L) \varepsilon_t,$$
or
$$(1 - \phi_1^* L - \phi_2^* L^2 - ... - \phi_{p-1}^* L^{p-1}) Y_t = (1 + \theta_1^* L + \theta_2^* L^2 + ... + \theta_{q-1}^* L^{q-1}) \varepsilon_t, \qquad (17)$$
where
$$(1 - \phi_1^* L - \phi_2^* L^2 - ... - \phi_{p-1}^* L^{p-1}) = (1 - \lambda_1 L)(1 - \lambda_2 L)...(1 - \lambda_{i-1} L)(1 - \lambda_{i+1} L)...(1 - \lambda_p L),$$
and
$$(1 + \theta_1^* L + \theta_2^* L^2 + ... + \theta_{q-1}^* L^{q-1}) = (1 - \eta_1 L)(1 - \eta_2 L)...(1 - \eta_{j-1} L)(1 - \eta_{j+1} L)...(1 - \eta_q L).$$
The stationary $ARMA(p, q)$ process satisfying (16) is clearly identical to the stationary $ARMA(p-1, q-1)$ process satisfying (17), which is more parsimonious in parameters.
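The redundancy is easy to see numerically: an $ARMA(1,1)$ built with a common factor $\lambda = \eta$ is observationally identical to white noise. A minimal sketch, assuming the arbitrary value $\lambda = 0.6$:

```python
import numpy as np

rng = np.random.default_rng(7)
lam, T = 0.6, 500_000
eps = rng.normal(size=T)

# ARMA(1,1) with a common factor: (1 - lam L) Y_t = (1 - lam L) eps_t
y = np.zeros(T)
for t in range(1, T):
    y[t] = lam * y[t - 1] + eps[t] - lam * eps[t - 1]

yc = y - y.mean()
for j in range(4):
    print(j, (yc[j:] * yc[:T - j]).mean())  # ~1, 0, 0, 0: indistinguishable from white noise
```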
4 The Autocovariance-Generating Function

If $\{Y_t, t \in T\}$ is a stationary process with autocovariance function $\gamma_j$, then its autocovariance-generating function is defined by
$$g_Y(z) = \sum_{j=-\infty}^{\infty} \gamma_j z^j.$$

Proposition:
If two different processes share the same autocovariance-generating function, then the two processes exhibit the identical sequence of autocovariances.

As an example of calculating an autocovariance-generating function, consider the $MA(1)$ process. Its autocovariance-generating function is
$$g_Y(z) = [\theta \sigma^2] z^{-1} + [(1 + \theta^2)\sigma^2] z^0 + [\theta \sigma^2] z^1 = \sigma^2 [\theta z^{-1} + (1 + \theta^2) + \theta z] = \sigma^2 (1 + \theta z)(1 + \theta z^{-1}). \qquad (18)$$
The form of expression (18) suggests that for the $MA(q)$ process, its autocovariance-generating function might be calculated as
$$g_Y(z) = \sigma^2 (1 + \theta_1 z + \theta_2 z^2 + ... + \theta_q z^q)(1 + \theta_1 z^{-1} + \theta_2 z^{-2} + ... + \theta_q z^{-q}). \qquad (19)$$
This conjecture can be verified by carrying out the multiplication in (19) and collecting terms by powers of $z$:
$$\sigma^2 (1 + \theta_1 z + ... + \theta_q z^q)(1 + \theta_1 z^{-1} + ... + \theta_q z^{-q}) = \sigma^2 \{ (\theta_q) z^q + (\theta_{q-1} + \theta_q \theta_1) z^{q-1} + (\theta_{q-2} + \theta_{q-1} \theta_1 + \theta_q \theta_2) z^{q-2} + ... + (\theta_1 + \theta_2 \theta_1 + \theta_3 \theta_2 + ... + \theta_q \theta_{q-1}) z^1 + (1 + \theta_1^2 + \theta_2^2 + ... + \theta_q^2) z^0 + (\theta_1 + \theta_2 \theta_1 + \theta_3 \theta_2 + ... + \theta_q \theta_{q-1}) z^{-1} + ... + (\theta_q) z^{-q} \}.$$
The coefficient on $z^j$ is indeed the $j$th autocovariance of an $MA(q)$ process.
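The coefficient-collecting argument above is exactly polynomial multiplication, so it can be automated with a convolution of the coefficient sequences. A sketch, assuming the same arbitrary $MA(2)$ as earlier; the helper name `ma_acgf_coeffs` is hypothetical:

```python
import numpy as np

def ma_acgf_coeffs(theta, sigma2):
    """Coefficients of g_Y(z) = sigma^2 * theta(z) * theta(z^{-1}) for an MA(q).

    theta includes theta_0 = 1. Returns the coefficients of z^{-q}, ..., z^0, ..., z^q;
    the middle entry is gamma_0 and the entry j steps to the right is gamma_j.
    """
    th = np.asarray(theta, dtype=float)
    return sigma2 * np.convolve(th, th[::-1])

theta = [1.0, 0.4, -0.3]                  # same arbitrary MA(2) as before
print(ma_acgf_coeffs(theta, 1.0))
# [-0.3, 0.28, 1.25, 0.28, -0.3]: gamma_0 = 1.25, gamma_1 = 0.28, gamma_2 = -0.3
```

These values match the $MA(2)$ autocovariances computed from the direct formula earlier, confirming the conjecture (19) numerically for this case.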
The method for finding $g_Y(z)$ extends to the $MA(\infty)$ case. If
$$Y_t = \mu + \varphi(L) \varepsilon_t \quad \text{with} \quad \varphi(L) = \varphi_0 + \varphi_1 L + \varphi_2 L^2 + ... \quad \text{and} \quad \sum_{j=0}^{\infty} |\varphi_j| < \infty,$$
then
$$g_Y(z) = \sigma^2 \varphi(z) \varphi(z^{-1}).$$

Example:
An $AR(1)$ process, $Y_t - \mu = (1 - \phi L)^{-1} \varepsilon_t$, is in this form with $\varphi(L) = 1/(1 - \phi L)$. Thus, the autocovariance-generating function can be calculated as
$$g_Y(z) = \frac{\sigma^2}{(1 - \phi z)(1 - \phi z^{-1})}.$$

Example:
The autocovariance-generating function of an $ARMA(p, q)$ process can therefore be written
$$g_Y(z) = \frac{\sigma^2 (1 + \theta_1 z + \theta_2 z^2 + ... + \theta_q z^q)(1 + \theta_1 z^{-1} + \theta_2 z^{-2} + ... + \theta_q z^{-q})}{(1 - \phi_1 z - \phi_2 z^2 - ... - \phi_p z^p)(1 - \phi_1 z^{-1} - \phi_2 z^{-2} - ... - \phi_p z^{-p})}.$$

4.1 Filters

Sometimes the data are filtered, or treated in a particular way before they are analyzed, and we would like to summarize the effect of this treatment on the autocovariances. This calculation is particularly simple using the autocovariance-generating function. For example, suppose that the original data $Y_t$ were generated from an $MA(1)$ process:
$$Y_t = (1 + \theta L) \varepsilon_t,$$
which has autocovariance-generating function
$$g_Y(z) = \sigma^2 (1 + \theta z)(1 + \theta z^{-1}).$$
Let the data to be analyzed be $X_t$, obtained by taking the first difference of $Y_t$:
$$X_t = Y_t - Y_{t-1} = (1 - L) Y_t = (1 - L)(1 + \theta L) \varepsilon_t.$$
Regarding this $X_t$ as an $MA(2)$ process, it has the autocovariance-generating function
$$g_X(z) = \sigma^2 [(1 - z)(1 + \theta z)][(1 - z^{-1})(1 + \theta z^{-1})] = \sigma^2 [(1 - z)(1 - z^{-1})][(1 + \theta z)(1 + \theta z^{-1})] = [(1 - z)(1 - z^{-1})] g_Y(z).$$
Therefore, applying the filter $(1 - L)$ to $Y_t$ results in multiplying its autocovariance-generating function by $(1 - z)(1 - z^{-1})$.

This principle readily generalizes. Let the original data series be $Y_t$, and let it be filtered according to
$$X_t = h(L) Y_t, \quad \text{with} \quad h(L) = \sum_{j=-\infty}^{\infty} h_j L^j \quad \text{and} \quad \sum_{j=-\infty}^{\infty} |h_j| < \infty.$$
The autocovariance-generating function of $X_t$ can accordingly be calculated as
$$g_X(z) = h(z) h(z^{-1}) g_Y(z).$$
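The filter formula can be verified numerically for $h(L) = 1 - L$ applied to an $MA(1)$, again via convolutions of the coefficient sequences ($\theta = 0.5$ is an arbitrary choice):

```python
import numpy as np

theta, sigma2 = 0.5, 1.0
ma_y = np.array([1.0, theta])             # Y_t = (1 + theta L) eps_t
h = np.array([1.0, -1.0])                 # first-difference filter h(L) = 1 - L

# Direct route: X_t = (1 - L)(1 + theta L) eps_t is an MA(2)
ma_x = np.convolve(h, ma_y)               # coefficients 1, theta - 1, -theta
g_x = sigma2 * np.convolve(ma_x, ma_x[::-1])
print(g_x)                                 # coefficients of z^{-2}, ..., z^{2}

# Filter formula: g_X(z) = h(z) h(z^{-1}) g_Y(z)
g_y = sigma2 * np.convolve(ma_y, ma_y[::-1])
print(np.convolve(np.convolve(h, h[::-1]), g_y))
```

The two print statements agree because convolution of coefficient sequences is associative and commutative, which is the polynomial statement of $g_X(z) = h(z) h(z^{-1}) g_Y(z)$.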
5 Invertibility

5.1 Invertibility for the MA(1) Process

Consider an $MA(1)$ process,
$$Y_t - \mu = (1 + \theta L) \varepsilon_t, \qquad (20)$$
with
$$E(\varepsilon_t \varepsilon_s) = \begin{cases} \sigma^2 & \text{for } t = s \\ 0 & \text{otherwise.} \end{cases}$$
Provided that $|\theta| < 1$, both sides of (20) can be multiplied by $(1 + \theta L)^{-1}$ to obtain
$$(1 - \theta L + \theta^2 L^2 - \theta^3 L^3 + ...)(Y_t - \mu) = \varepsilon_t, \qquad (21)$$
which can be viewed as an $AR(\infty)$ representation. If a moving average representation such as (20) can be rewritten as an $AR(\infty)$ representation such as (21) simply by inverting the moving average operator $(1 + \theta L)$, then the moving average representation is said to be invertible. For an $MA(1)$ process, invertibility requires $|\theta| < 1$; if $|\theta| \geq 1$, then the infinite sequence in (21) would not be well defined.⁹

Let us investigate what invertibility means in terms of the first and second moments. Recall that the $MA(1)$ process (20) has mean $\mu$ and autocovariance-generating function
$$g_Y(z) = \sigma^2 (1 + \theta z)(1 + \theta z^{-1}).$$
Now consider a seemingly different $MA(1)$ process,
$$\tilde{Y}_t - \mu = (1 + \tilde{\theta} L) \tilde{\varepsilon}_t, \qquad (22)$$
with $\tilde{\varepsilon}_t$ a white-noise sequence having a different variance:
$$E(\tilde{\varepsilon}_t \tilde{\varepsilon}_s) = \begin{cases} \tilde{\sigma}^2 & \text{for } t = s \\ 0 & \text{otherwise.} \end{cases}$$

Footnote 9: When $|\theta| > 1$,
$$(1 + \theta L)^{-1} = \frac{\theta^{-1} L^{-1}}{1 + \theta^{-1} L^{-1}} = \theta^{-1} L^{-1} (1 - \theta^{-1} L^{-1} + \theta^{-2} L^{-2} - \theta^{-3} L^{-3} + ...).$$
Note that $\tilde{Y}_t$ has the same mean ($\mu$) as $Y_t$. Its autocovariance-generating function is
$$g_{\tilde{Y}}(z) = \tilde{\sigma}^2 (1 + \tilde{\theta} z)(1 + \tilde{\theta} z^{-1}) = \tilde{\sigma}^2 \{(\tilde{\theta}^{-1} z^{-1} + 1)(\tilde{\theta} z)\} \{(\tilde{\theta}^{-1} z + 1)(\tilde{\theta} z^{-1})\} = (\tilde{\sigma}^2 \tilde{\theta}^2)(1 + \tilde{\theta}^{-1} z)(1 + \tilde{\theta}^{-1} z^{-1}).$$
Suppose that the parameters of (22), $(\tilde{\theta}, \tilde{\sigma}^2)$, are related to those of (20) by the following equations:
$$\theta = \tilde{\theta}^{-1}, \qquad (23)$$
$$\sigma^2 = \tilde{\theta}^2 \tilde{\sigma}^2. \qquad (24)$$
Then the autocovariance-generating functions $g_Y(z)$ and $g_{\tilde{Y}}(z)$ would be the same, meaning that $Y_t$ and $\tilde{Y}_t$ would have identical first and second moments. Notice from (23) that if $|\theta| < 1$, then $|\tilde{\theta}| > 1$. In other words, for any invertible $MA(1)$ representation (20), we have found a noninvertible $MA(1)$ representation (22) with the same first and second moments as the invertible representation. Conversely, given any noninvertible representation with $|\tilde{\theta}| > 1$, there exists an invertible representation with $\theta = 1/\tilde{\theta}$ that has the same first and second moments as the noninvertible representation.

Not only do the invertible and noninvertible representations share the same moments; either representation (20) or (22) could be used as an equally valid description of any given $MA(1)$ process. Suppose a computer generated an infinite sequence of $\tilde{Y}_t$'s according to the noninvertible $MA(1)$ process:
$$\tilde{Y}_t - \mu = (1 + \tilde{\theta} L) \tilde{\varepsilon}_t \quad \text{with } |\tilde{\theta}| > 1.$$
In what sense could these same data be associated with an invertible $MA(1)$ representation? Imagine calculating a series $\{\varepsilon_t\}_{t=-\infty}^{\infty}$ defined by
$$\varepsilon_t \equiv (1 + \theta L)^{-1} (\tilde{Y}_t - \mu) \qquad (25)$$
$$= (\tilde{Y}_t - \mu) - \theta (\tilde{Y}_{t-1} - \mu) + \theta^2 (\tilde{Y}_{t-2} - \mu) - \theta^3 (\tilde{Y}_{t-3} - \mu) + ..., \qquad (26)$$
where $\theta = 1/\tilde{\theta}$.
The autocovariance-generating function of $\varepsilon_t$ is
$$g_{\varepsilon}(z) = (1 + \theta z)^{-1} (1 + \theta z^{-1})^{-1} g_{\tilde{Y}}(z) = (1 + \theta z)^{-1} (1 + \theta z^{-1})^{-1} (\tilde{\sigma}^2 \tilde{\theta}^2)(1 + \tilde{\theta}^{-1} z)(1 + \tilde{\theta}^{-1} z^{-1}) = \tilde{\sigma}^2 \tilde{\theta}^2,$$
where the last equality follows from the fact that $\tilde{\theta}^{-1} = \theta$. Since the autocovariance-generating function is a constant, it follows that $\varepsilon_t$ is a white-noise process with variance $\sigma^2 = \tilde{\sigma}^2 \tilde{\theta}^2$. Multiplying both sides of (25) by $(1 + \theta L)$,
$$\tilde{Y}_t - \mu = (1 + \theta L) \varepsilon_t$$
is a perfectly valid invertible $MA(1)$ representation of data that were actually generated from the noninvertible representation (22). The converse proposition is also true: suppose that the data were really generated from (20) with $|\theta| < 1$, an invertible representation. Then there exists a noninvertible representation with $\tilde{\theta} = 1/\theta$ that describes these data with equal validity.

Either the invertible or the noninvertible representation could characterize any given data equally well, though there is a practical reason for preferring the invertible representation. To find the value of $\varepsilon$ for date $t$ associated with the invertible representation as in (20), we need to know current and past values of $Y$. By contrast, to find the value of $\varepsilon$ for date $t$ associated with the noninvertible representation, we need to use all of the future values of $Y$. If the intention is to calculate the current value of $\varepsilon_t$ using real-world data, it will be feasible only to work with the invertible representation.
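This equivalence of first and second moments can be seen in simulation: data generated from a noninvertible $MA(1)$ and data from its invertible twin given by (23)-(24) have indistinguishable sample autocovariances. A sketch, assuming the arbitrary values $\tilde{\theta} = 2$ and $\tilde{\sigma}^2 = 1$:

```python
import numpy as np

rng = np.random.default_rng(3)
T = 1_000_000
theta_t, s2_t = 2.0, 1.0                      # noninvertible: |theta~| > 1
theta, s2 = 1 / theta_t, theta_t**2 * s2_t    # invertible twin from (23)-(24)

e1 = rng.normal(scale=np.sqrt(s2_t), size=T + 1)
e2 = rng.normal(scale=np.sqrt(s2), size=T + 1)
y_non = e1[1:] + theta_t * e1[:-1]
y_inv = e2[1:] + theta * e2[:-1]

def acov(x, j):
    x = x - x.mean()
    return (x[j:] * x[:len(x) - j]).mean()

for j in range(3):
    print(j, acov(y_non, j), acov(y_inv, j))  # both columns ~5, ~2, ~0
```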
5.2 Invertibility for the MA(q) Process

Consider the $MA(q)$ process,
$$Y_t - \mu = (1 + \theta_1 L + \theta_2 L^2 + ... + \theta_q L^q) \varepsilon_t, \quad E(\varepsilon_t \varepsilon_s) = \begin{cases} \sigma^2 & \text{for } t = s \\ 0 & \text{otherwise.} \end{cases}$$
Provided that all the roots of $(1 + \theta_1 L + \theta_2 L^2 + ... + \theta_q L^q) = 0$ lie outside the unit circle, this $MA(q)$ process can be written as an $AR(\infty)$ simply by inverting the MA operator:
$$(1 + \eta_1 L + \eta_2 L^2 + ...)(Y_t - \mu) = \varepsilon_t,$$
where
$$(1 + \eta_1 L + \eta_2 L^2 + ...) = (1 + \theta_1 L + \theta_2 L^2 + ... + \theta_q L^q)^{-1}.$$
When this is the case, the $MA(q)$ representation is said to be invertible.