Time Series Solutions HT 2009

1. Let $\{X_t\}$ be the ARMA(1,1) process
\[ X_t - \phi X_{t-1} = \varepsilon_t + \theta\varepsilon_{t-1}, \qquad \{\varepsilon_t\} \sim WN(0, \sigma^2), \]
where $|\phi| < 1$ and $|\theta| < 1$. Show that the autocorrelation function of $\{X_t\}$ is given by
\[ \rho(1) = \frac{(1+\phi\theta)(\phi+\theta)}{1+\theta^2+2\phi\theta}, \qquad \rho(h) = \phi^{h-1}\rho(1) \quad \text{for } h \geq 1. \]

Solution: Taking expectations gives $E(X_t) = \phi E(X_{t-1})$, and using $|\phi| < 1$ and stationarity we get $E(X_t) = E(X_{t-1}) = 0$.

For $k \geq 2$: multiplying $X_t = \phi X_{t-1} + \varepsilon_t + \theta\varepsilon_{t-1}$ by $X_{t-k}$ and taking expectations we get $\gamma_k = \phi\gamma_{k-1}$, and hence $\gamma_k = \phi^{k-1}\gamma_1$ for $k \geq 2$.

Multiplying the same equation by $X_t$ and taking expectations we get
\[ \gamma_0 = \phi\gamma_1 + E[X_t(\varepsilon_t + \theta\varepsilon_{t-1})]. \]
Also
\[ X_t = \phi X_{t-1} + \varepsilon_t + \theta\varepsilon_{t-1} = \phi[\phi X_{t-2} + \varepsilon_{t-1} + \theta\varepsilon_{t-2}] + \varepsilon_t + \theta\varepsilon_{t-1} = \phi^2 X_{t-2} + \phi\varepsilon_{t-1} + \phi\theta\varepsilon_{t-2} + \varepsilon_t + \theta\varepsilon_{t-1}, \]
and so
\[ \gamma_0 = \phi\gamma_1 + E[(\phi^2 X_{t-2} + \phi\varepsilon_{t-1} + \phi\theta\varepsilon_{t-2} + \varepsilon_t + \theta\varepsilon_{t-1})(\varepsilon_t + \theta\varepsilon_{t-1})] = \phi\gamma_1 + \sigma^2[1 + \phi\theta + \theta^2]. \]
Similarly,
\[ \gamma_1 = E(X_t X_{t+1}) = E[X_t(\phi X_t + \varepsilon_{t+1} + \theta\varepsilon_t)] = \phi\gamma_0 + E[(\phi X_{t-1} + \varepsilon_t + \theta\varepsilon_{t-1})(\varepsilon_{t+1} + \theta\varepsilon_t)] = \phi\gamma_0 + \theta\sigma^2. \]
We can now solve the two equations involving $\gamma_0$ and $\gamma_1$: substituting the second into the first gives $\gamma_0(1-\phi^2) = \sigma^2(1+2\phi\theta+\theta^2)$, so
\[ \gamma_0 = \frac{\sigma^2(1+2\phi\theta+\theta^2)}{1-\phi^2}, \qquad \gamma_1 = \phi\gamma_0 + \theta\sigma^2 = \frac{\sigma^2(1+\phi\theta)(\phi+\theta)}{1-\phi^2}. \]
Hence $\rho(1) = \gamma_1/\gamma_0 = \dfrac{(1+\phi\theta)(\phi+\theta)}{1+\theta^2+2\phi\theta}$, and $\rho(h) = \gamma_h/\gamma_0 = \phi^{h-1}\rho(1)$ for $h \geq 1$, as required.
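As a sanity check (not part of the original problem set), the $\rho(1)$ formula can be verified numerically with NumPy: simulate a long ARMA(1,1) path for arbitrary parameter values and compare the sample lag-1 autocorrelation with the closed form.

```python
import numpy as np

rng = np.random.default_rng(0)
phi, theta, sigma = 0.5, 0.3, 1.0  # arbitrary values with |phi| < 1, |theta| < 1
n = 200_000

# Simulate X_t = phi*X_{t-1} + eps_t + theta*eps_{t-1}
eps = rng.normal(0.0, sigma, size=n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + eps[t] + theta * eps[t - 1]
x = x[1000:]  # drop burn-in so the series is approximately stationary

def sample_acf(series, lag):
    """Sample autocorrelation at the given lag."""
    s = series - series.mean()
    return np.dot(s[:-lag], s[lag:]) / np.dot(s, s)

rho1_theory = (1 + phi * theta) * (phi + theta) / (1 + theta**2 + 2 * phi * theta)
rho1_sample = sample_acf(x, 1)
print(rho1_theory, rho1_sample)  # should agree to about two decimal places
```

The lag-2 sample autocorrelation should likewise be close to $\phi\,\rho(1)$, illustrating the geometric decay $\rho(h) = \phi^{h-1}\rho(1)$.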
2. Consider a process consisting of a linear trend plus an additive noise term, that is,
\[ X_t = \beta_0 + \beta_1 t + \varepsilon_t \]
where $\beta_0$ and $\beta_1$ are fixed constants, and where the $\varepsilon_t$ are independent random variables with zero means and variances $\sigma^2$. Show that $X_t$ is non-stationary, but that the first difference series $\nabla X_t = X_t - X_{t-1}$ is second-order stationary, and find the acf of $\nabla X_t$.

Solution: $E(X_t) = E(\beta_0 + \beta_1 t + \varepsilon_t) = \beta_0 + \beta_1 t$, which depends on $t$, hence $X_t$ is non-stationary.

Let $Y_t = \nabla X_t = X_t - X_{t-1}$. Then
\[ Y_t = \beta_0 + \beta_1 t + \varepsilon_t - \{\beta_0 + \beta_1(t-1) + \varepsilon_{t-1}\} = \beta_1 + \varepsilon_t - \varepsilon_{t-1}. \]
So
\[ \mathrm{cov}(Y_t, Y_{t+k}) = \mathrm{cov}(\varepsilon_t - \varepsilon_{t-1},\, \varepsilon_{t+k} - \varepsilon_{t+k-1}) = E(\varepsilon_t\varepsilon_{t+k} - \varepsilon_{t-1}\varepsilon_{t+k} - \varepsilon_t\varepsilon_{t+k-1} + \varepsilon_{t-1}\varepsilon_{t+k-1}) = \begin{cases} 2\sigma^2 & k = 0 \\ -\sigma^2 & |k| = 1 \\ 0 & |k| \geq 2. \end{cases} \]
Hence $Y_t$ is stationary and its acf is
\[ \rho_k = \begin{cases} 1 & k = 0 \\ -\tfrac{1}{2} & |k| = 1 \\ 0 & |k| \geq 2. \end{cases} \]

3. Let $\{S_t,\ t = 0, 1, 2, \ldots\}$ be the random walk with constant drift $\mu$, defined by $S_0 = 0$ and
\[ S_t = \mu + S_{t-1} + \varepsilon_t, \qquad t = 1, 2, \ldots, \]
where $\varepsilon_1, \varepsilon_2, \ldots$ are independent and identically distributed random variables with mean 0 and variance $\sigma^2$. Compute the mean of $S_t$ and the autocovariance of the process $\{S_t\}$. Show that $\{\nabla S_t\}$ is stationary and compute its mean and autocovariance function.
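The acf derived in Question 2 (lag-1 autocorrelation $-\tfrac12$, zero beyond lag 1) can be checked by simulation; the following is a quick sketch with arbitrary parameter values.

```python
import numpy as np

rng = np.random.default_rng(1)
beta0, beta1, sigma = 2.0, 0.5, 1.0  # arbitrary trend and noise parameters
n = 100_000

# X_t = beta0 + beta1*t + eps_t; differencing gives Y_t = beta1 + eps_t - eps_{t-1}
t = np.arange(n)
eps = rng.normal(0.0, sigma, size=n)
x = beta0 + beta1 * t + eps
y = np.diff(x)

yc = y - y.mean()
rho1 = np.dot(yc[:-1], yc[1:]) / np.dot(yc, yc)
rho2 = np.dot(yc[:-2], yc[2:]) / np.dot(yc, yc)
print(y.mean(), rho1, rho2)  # mean ≈ beta1, rho1 ≈ -0.5, rho2 ≈ 0
```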
Solution:
\[ S_t = \varepsilon_t + \mu + S_{t-1} = \varepsilon_t + \mu + \varepsilon_{t-1} + \mu + S_{t-2} = \varepsilon_t + \varepsilon_{t-1} + 2\mu + S_{t-2} = \cdots = \sum_{j=0}^{t-1} \varepsilon_{t-j} + t\mu + S_0. \]
So $E(S_t) = 0 + t\mu + 0 = t\mu$.

For the autocovariance of $S_t$, the autocovariance at lag $k \geq 0$ is
\[ E[\{S_t - t\mu\}\{S_{t+k} - (t+k)\mu\}] = E\Big(\sum_{j=0}^{t-1}\varepsilon_{t-j} \sum_{i=0}^{t+k-1}\varepsilon_{t+k-i}\Big) = \sum_{j=0}^{t-1} E(\varepsilon_{t-j}\varepsilon_{t-j}) = t\sigma^2 \]
since, when moving from the first expression to the single sum in the above display, $E(\varepsilon_{t-j}\varepsilon_{t+k-i}) = 0$ unless $i = j + k$.

Now
\[ Y_t = \nabla S_t = S_t - S_{t-1} = \mu + \varepsilon_t, \]
which is clearly stationary, with $E(Y_t) = \mu$. For the autocovariance of $Y_t$, note $Y_t - \mu = \varepsilon_t$ and similarly $Y_{t'} - \mu = \varepsilon_{t'}$, so for $t \neq t'$ each $Y_t$ depends on a different $\varepsilon_t$, and therefore $\mathrm{cov}(Y_t, Y_{t'}) = 0$ for all $t \neq t'$. So the autocovariance function is $\sigma^2$ at lag 0, and is zero at all other lags.

4. If
\[ X_t = a\cos(\lambda t) + \varepsilon_t \]
where $\varepsilon_t \sim WN(0, \sigma^2)$, and where $a$ and $\lambda$ are constants, show that $\{X_t\}$ is not stationary.

Now consider the process $X_t = a\cos(\lambda t + \Theta)$ where $\Theta$ is uniformly distributed on $(0, 2\pi)$, and where $a$ and $\lambda$ are constants. Is this process stationary? Find the autocorrelations and the spectrum of $X_t$.
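Returning to Question 3: the linear-in-$t$ variance and autocovariance $t\sigma^2$ of the random walk can be checked by Monte Carlo over many independent paths (a sketch; the parameter values are arbitrary).

```python
import numpy as np

rng = np.random.default_rng(2)
mu, sigma = 0.3, 1.0          # arbitrary drift and noise scale
n_paths, t_max = 50_000, 40

# Many independent drifted random walks: S_t = mu + S_{t-1} + eps_t, S_0 = 0
eps = rng.normal(0.0, sigma, size=(n_paths, t_max))
s = np.cumsum(mu + eps, axis=1)   # s[:, t-1] holds S_t

t, k = 20, 10
mean_t = s[:, t - 1].mean()                          # should be ≈ t*mu
var_t = s[:, t - 1].var()                            # should be ≈ t*sigma^2
cov_tk = np.cov(s[:, t - 1], s[:, t + k - 1])[0, 1]  # should also be ≈ t*sigma^2
print(mean_t, var_t, cov_tk)
```

The fact that both the variance and the lag-$k$ covariance grow with $t$ is exactly the non-stationarity that differencing removes.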
[To find the autocorrelations you may want to use the identity $\cos\alpha\cos\beta = \tfrac{1}{2}\{\cos(\alpha+\beta) + \cos(\alpha-\beta)\}$.]

Solution: $E(X_t) = E(a\cos(\lambda t) + \varepsilon_t) = a\cos(\lambda t)$, which depends on $t$, so $X_t$ is not stationary.

Now for $X_t = a\cos(\lambda t + \Theta)$ we need to consider the joint distributions of $(X(t_1), \ldots, X(t_k))$ and of $(X(t_1+\tau), \ldots, X(t_k+\tau))$. Since shifting time by $\tau$ is equivalent to shifting $\Theta$ by $\lambda\tau$ (modulo $2\pi$), and since $\Theta$ is uniform on $(0, 2\pi)$, these two joint distributions are the same, and so $X_t$ is stationary.
\[ E(X_t) = aE(\cos(\lambda t + \Theta)) = \frac{a}{2\pi}\int_0^{2\pi} \cos(\lambda t + \theta)\,d\theta = \frac{a}{2\pi}\big[\sin(\lambda t + \theta)\big]_0^{2\pi} = 0. \]
\[ \gamma_t = E(X_t X_0) = a^2 E(\cos(\Theta)\cos(\lambda t + \Theta)) = a^2 E\Big[\tfrac{1}{2}\{\cos(\lambda t + 2\Theta) + \cos(\lambda t)\}\Big] = \frac{a^2}{2}\Big[\frac{1}{2\pi}\int_0^{2\pi} \cos(\lambda t + 2\theta)\,d\theta + \cos(\lambda t)\Big] = \frac{a^2}{2}\cos(\lambda t). \]
So $\rho_t = \cos(\lambda t)$.

The spectrum is $F$ where $\gamma_t = \int_{-\pi}^{\pi} e^{it\omega}\,dF(\omega)$. Try the discrete distribution for $F$ which places mass $c$ at each of $\lambda$ and $-\lambda$, and no mass elsewhere. Then
\[ \gamma_t = e^{it\lambda}c + e^{-it\lambda}c = c[\cos(t\lambda) + i\sin(t\lambda) + \cos(t\lambda) - i\sin(t\lambda)] = 2c\cos(\lambda t). \]
So we want $2c = a^2/2$, or $c = a^2/4$: the spectral distribution places mass $a^2/4$ at each of $\pm\lambda$.

5. Find the Yule-Walker equations for the AR(2) process
\[ X_t = \tfrac{1}{3}X_{t-1} + \tfrac{2}{9}X_{t-2} + \varepsilon_t \]
where $\varepsilon_t \sim WN(0, \sigma^2)$. Hence show that this process has autocorrelation function
\[ \rho_k = \tfrac{16}{21}\big(\tfrac{2}{3}\big)^k + \tfrac{5}{21}\big(-\tfrac{1}{3}\big)^k. \]

[To solve an equation of the form $a\rho_k + b\rho_{k-1} + c\rho_{k-2} = 0$, try $\rho_k = A\lambda^k$ for some constants $A$ and $\lambda$: solve the resulting quadratic equation for $\lambda$ and deduce that $\rho_k$ is of the form $\rho_k = A\lambda_1^k + B\lambda_2^k$ where $A$ and $B$ are constants.]

Solution: The Yule-Walker equations are
\[ \rho_k = \tfrac{1}{3}\rho_{k-1} + \tfrac{2}{9}\rho_{k-2}. \]
So, as in the hint, to solve
\[ \rho_k - \tfrac{1}{3}\rho_{k-1} - \tfrac{2}{9}\rho_{k-2} = 0 \]
try $\rho_k = A\lambda^k$. Substituting this into the above equation, and cancelling a factor of $\lambda^{k-2}$, we get
\[ \lambda^2 - \tfrac{1}{3}\lambda - \tfrac{2}{9} = 0 \]
which has roots $\lambda = \tfrac{2}{3}$ and $\lambda = -\tfrac{1}{3}$, so $\rho_k = A(\tfrac{2}{3})^k + B(-\tfrac{1}{3})^k$. We also require $\rho_0 = 1$ and $\rho_1 = \tfrac{1}{3} + \tfrac{2}{9}\rho_1$, i.e. $\rho_1 = \tfrac{3}{7}$. Hence we can solve for $A$ and $B$: $A = \tfrac{16}{21}$ and $B = \tfrac{5}{21}$. So
\[ \rho_k = \tfrac{16}{21}\big(\tfrac{2}{3}\big)^k + \tfrac{5}{21}\big(-\tfrac{1}{3}\big)^k. \]

6. Let $\{Y_t\}$ be a stationary process with mean zero and let $a$ and $b$ be constants.

(a) If $X_t = a + bt + s_t + Y_t$ where $s_t$ is a seasonal component with period 12, show that $\nabla\nabla_{12}X_t = (1-B)(1-B^{12})X_t$ is stationary.

(b) If $X_t = (a+bt)s_t + Y_t$ where $s_t$ is again a seasonal component with period 12, show that $\nabla_{12}^2 X_t = (1-B^{12})(1-B^{12})X_t$ is stationary.

Solution: (a)
\[ \nabla X_t = a + bt + s_t + Y_t - [a + b(t-1) + s_{t-1} + Y_{t-1}] = b + s_t - s_{t-1} + Y_t - Y_{t-1} \]
\[ \nabla_{12}\nabla X_t = b + s_t - s_{t-1} + Y_t - Y_{t-1} - [b + s_{t-12} - s_{t-13} + Y_{t-12} - Y_{t-13}] = Y_t - Y_{t-1} - Y_{t-12} + Y_{t-13} \]
and this is a stationary process since $Y_t$ is stationary. (We have used the fact that $s_t = s_{t-12}$ for all $t$.)
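A small deterministic check of part (a), before moving to part (b): the operator $(1-B)(1-B^{12})$ should annihilate the trend-plus-seasonal part $a + bt + s_t$ exactly, leaving only the stationary combination of the $Y$'s. This is a sketch; the particular seasonal pattern and trend values below are arbitrary.

```python
import numpy as np

# Deterministic part only: d_t = a + b*t + s_t with s_t of period 12
a, b = 5.0, 0.25
t = np.arange(200)
s = np.sin(2 * np.pi * t / 12)   # any period-12 seasonal component works
d = a + b * t + s

def diff(x, lag):
    """Apply (1 - B^lag): x_t - x_{t-lag}."""
    return x[lag:] - x[:-lag]

out = diff(diff(d, 1), 12)  # (1 - B)(1 - B^12) applied to the deterministic part
print(np.max(np.abs(out)))  # zero up to floating-point rounding
```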
(b)
\[ \nabla_{12} X_t = (a+bt)s_t + Y_t - [(a+b(t-12))s_{t-12} + Y_{t-12}] = 12b\,s_{t-12} + Y_t - Y_{t-12} \]
\[ \nabla_{12}^2 X_t = 12b\,s_{t-12} + Y_t - Y_{t-12} - [12b\,s_{t-24} + Y_{t-12} - Y_{t-24}] = Y_t - 2Y_{t-12} + Y_{t-24} \]
and this is stationary since $Y_t$ is stationary (again using $s_t = s_{t-12}$ for all $t$).

7. Consider the univariate state-space model given by state conditions $X_0 = W_0$, $X_t = X_{t-1} + W_t$, and observations $Y_t = X_t + V_t$, $t = 1, 2, \ldots$, where $\{V_t\}$ and $\{W_t\}$ are independent, Gaussian, white noise processes with $\mathrm{var}(V_t) = \sigma_V^2$ and $\mathrm{var}(W_t) = \sigma_W^2$. Show that the data follow an ARIMA(0,1,1) model, that is, $\nabla Y_t$ follows an MA(1) model. Include in your answer an expression for the autocorrelation function of $\nabla Y_t$ in terms of $\sigma_V^2$ and $\sigma_W^2$.

Solution:
\[ \nabla Y_t = Y_t - Y_{t-1} = (X_t + V_t) - (X_{t-1} + V_{t-1}) = X_t - X_{t-1} + V_t - V_{t-1} = W_t + V_t - V_{t-1} \]
and so $\nabla Y_t$ is an MA(1). As $V_t$, $V_{t-1}$ and $W_t$ are independent,
\[ \gamma_0 = \mathrm{Var}(\nabla Y_t) = \sigma_W^2 + 2\sigma_V^2. \]
Furthermore,
\[ \gamma_1 = \mathrm{Cov}(\nabla Y_t, \nabla Y_{t+1}) = \mathrm{Cov}(W_t + V_t - V_{t-1},\, W_{t+1} + V_{t+1} - V_t) = -\sigma_V^2, \]
and, from the independence, $\gamma_k = 0$ for $k \geq 2$. Hence the acf is $\rho_0 = 1$,
\[ \rho_1 = \frac{-\sigma_V^2}{\sigma_W^2 + 2\sigma_V^2}, \]
and $\rho_k = 0$ for $k \geq 2$.
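The MA(1) structure of $\nabla Y_t$ in Question 7, including the negative lag-1 autocorrelation, can be verified by simulation (a sketch; the two noise variances below are arbitrary).

```python
import numpy as np

rng = np.random.default_rng(3)
sigma_w, sigma_v = 1.0, 0.7   # arbitrary state and observation noise scales
n = 200_000

# State: X_t = X_{t-1} + W_t; observation: Y_t = X_t + V_t
w = rng.normal(0.0, sigma_w, size=n)
v = rng.normal(0.0, sigma_v, size=n)
x = np.cumsum(w)        # random-walk state
y = x + v
dy = np.diff(y)         # should behave like the MA(1) process W_t + V_t - V_{t-1}

dc = dy - dy.mean()
gamma0 = np.dot(dc, dc) / len(dc)
rho1 = np.dot(dc[:-1], dc[1:]) / np.dot(dc, dc)
rho2 = np.dot(dc[:-2], dc[2:]) / np.dot(dc, dc)

rho1_theory = -sigma_v**2 / (sigma_w**2 + 2 * sigma_v**2)
print(gamma0, rho1, rho1_theory, rho2)
```

The sample autocorrelation should be negative at lag 1 and negligible at lag 2 and beyond, matching $\gamma_0 = \sigma_W^2 + 2\sigma_V^2$, $\gamma_1 = -\sigma_V^2$, and $\gamma_k = 0$ for $k \geq 2$.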