
ANDREW TULLOCH

TIME SERIES AND MONTE CARLO INFERENCE

TRINITY COLLEGE, THE UNIVERSITY OF CAMBRIDGE


Contents

1 Time Series Analysis
  1.1 Introduction
  1.2 Stationary Processes
  1.3 State Space Models
  1.4 Stationary Processes
      Linear Processes
      Forecasting Stationary Time Series
      Innovation Algorithm
  1.5 ARMA Processes
      ACF and PACF of an ARMA(p, q) Process
      Forecasting ARMA Processes
  1.6 Estimation of ARMA Processes
      Yule-Walker Equations
      Estimation for Moving Average Processes Using the Innovations Algorithm
      Maximum Likelihood Estimation
      Order Selection
  1.7 Spectral Analysis
      The Spectral Density of an ARMA Process
      The Periodogram

2 Bibliography

1 Time Series Analysis

1.1 Introduction

References: (i) Brockwell and Davis [2009]; (ii) Brockwell and Davis [2002].

Definition 1.1 (Time Series). A set of observations $(X_t)$, each recorded at a predictable time $t \in T_0$. In a continuous time series, $T_0$ is continuous; in a discrete time series, $T_0$ is discrete.

Definition 1.2 (Time Series Model). A specification of the joint distribution (or only of the means and covariances) of a sequence of random variables $(X_t)$ of which the data are a realization.

Remark 1.3. A complete probability model specifies the joint distribution of all the random variables $X_t$, $t \in T_0$. This typically requires too many parameters to estimate, so we usually specify only the first- and second-order moments.

Example 1.4. When the $X_t$ are IID with common distribution function $F$,
$$P(X_1 \le x_1, \ldots, X_n \le x_n) = \prod_{i=1}^n F(x_i). \qquad (1.1)$$

Example 1.5. The first-order moving average model.

Example 1.6. Trend and seasonal components.

1.2 Stationary Processes

Intuitively, a stationary time series is one whose joint distributions are invariant under time shifts.

Definition 1.7 (Mean, covariance function). Define the mean function $\mu_X(t) = E(X_t)$ and the covariance function $\gamma_X(t, s) = \mathrm{Cov}(X_t, X_s) = E((X_t - \mu_X(t))(X_s - \mu_X(s)))$.

Definition 1.8 (Weak stationarity). A time series $(X_t)$ is stationary if

(i) $E(|X_t|^2) < \infty$ for all $t \in \mathbb{Z}$,
(ii) $E(X_t) = c$ for all $t \in \mathbb{Z}$,
(iii) $\gamma_X(t, s) = \gamma_X(t + h, s + h)$ for all $t, s, h \in \mathbb{Z}$.

Definition 1.9 (Strict stationarity). A time series $(X_t)$ is said to be strictly stationary if the joint distributions of $(X_{t_1}, \ldots, X_{t_k})$ and $(X_{t_1 + h}, \ldots, X_{t_k + h})$ are identical for all $k$ and all $t_1, \ldots, t_k, h \in \mathbb{Z}$.

Definition 1.10 (Autocovariance function). For a stationary time series $(X_t)$, define the autocovariance function
$$\gamma_X(h) = \mathrm{Cov}(X_{t+h}, X_t) \qquad (1.2)$$
and the autocorrelation function
$$\rho_X(h) = \frac{\gamma_X(h)}{\gamma_X(0)}. \qquad (1.3)$$

Lemma 1.11 (Properties of the autocovariance function). For all $h$,
$$\gamma(0) \ge 0, \qquad (1.4)$$
$$|\gamma(h)| \le \gamma(0), \qquad (1.5)$$
$$\gamma(h) = \gamma(-h). \qquad (1.6)$$
Note that these all hold for the autocorrelation function $\rho$, with the additional condition that $\rho(0) = 1$.

Theorem 1.12. A real-valued function defined on the integers is the autocovariance function of a stationary time series if and only if it is even and nonnegative definite.

Example 1.13 (White noise). Let $(X_t)$ be a sequence of uncorrelated random variables with mean zero and variance $\sigma^2$. Then
$$\gamma_X(h) = \sigma^2 I(h = 0), \qquad (1.7)$$
$$\rho_X(h) = I(h = 0). \qquad (1.8)$$

Example 1.14 (First-order moving average, MA(1)).
$$X_t = Z_t + \theta Z_{t-1} \qquad (1.9)$$
with $Z_t \sim \mathrm{WN}(0, \sigma^2)$. Then
$$\gamma_X(h) = \begin{cases} \sigma^2 (1 + \theta^2) & h = 0 \\ \sigma^2 \theta & |h| = 1 \\ 0 & \text{otherwise,} \end{cases} \qquad (1.10)$$
$$\rho_X(h) = \begin{cases} 1 & h = 0 \\ \theta / (1 + \theta^2) & |h| = 1 \\ 0 & \text{otherwise.} \end{cases} \qquad (1.11)$$

Definition 1.15 (Sample autocovariance). The sample autocovariance function of $\{x_1, \ldots, x_n\}$ is defined by
$$\hat\gamma(h) = \frac{1}{n} \sum_{j=1}^{n-h} (x_{j+h} - \bar x)(x_j - \bar x), \quad 0 \le h < n, \qquad (1.12)$$
and $\hat\gamma(h) = \hat\gamma(-h)$ for $-n < h \le 0$. Note that the divisor is $n$ rather than $n - h$, since this ensures that the sample autocovariance matrix
$$\hat\Gamma_n = [\hat\gamma(i - j)]_{i,j=1}^n \qquad (1.13)$$
is positive semidefinite.
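
As a quick numerical companion (an added sketch, not part of the original notes), here is a minimal numpy implementation of the divisor-$n$ sample autocovariance (1.12):

```python
import numpy as np

def sample_acvf(x, max_lag):
    """Sample autocovariance (eq. 1.12), divisor n, so that the matrix
    [gamma_hat(i - j)] of (1.13) is positive semidefinite."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    d = x - x.mean()
    return np.array([np.dot(d[h:], d[:n - h]) / n for h in range(max_lag + 1)])

# White noise check: gamma_hat(0) near sigma^2 = 1, other lags near 0.
rng = np.random.default_rng(0)
print(sample_acvf(rng.normal(size=2000), 5))
```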

1.3 State Space Models

Definition 1.16. The observation equation is
$$Y_t = G_t X_t + W_t. \qquad (1.14)$$
The state equation is
$$X_{t+1} = F_t X_t + V_t. \qquad (1.15)$$
$\{Y_t\}$ has a state-space representation if there exists a state-space model for $\{Y_t\}$ as specified by the previous equations.

Theorem 1.17 (de Finetti). If $\{X_1, V_1, V_2, \ldots\}$ are independent, then $\{X_t\}$ has the Markov property; that is, the distribution of $X_{t+1}$ given $X_t, X_{t-1}, \ldots$ equals that of $X_{t+1}$ given $X_t$. (See all of Section 8.1 in Introduction to Time Series and Forecasting.)

Definition 1.18. The state equation is said to be stable if the matrix $F$ has all its eigenvalues in the interior of the unit circle. In the stable case, there is a unique stationary solution, given by
$$X_t = \sum_{j=0}^\infty F^j V_{t-j-1}. \qquad (1.16)$$
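
A small simulation (an illustration added here, not in the source) checks the stable case: starting the recursion at $X_0 = 0$, iterating the state equation reproduces the series representation (1.16) exactly. The matrix $F$ below is an arbitrary choice with eigenvalues inside the unit circle.

```python
import numpy as np

rng = np.random.default_rng(2)
F = np.array([[0.5, 0.2],
              [0.0, 0.3]])              # eigenvalues 0.5 and 0.3: stable
V = rng.normal(size=(500, 2))          # noise vectors V_0, ..., V_499

x = np.zeros(2)                        # X_0 = 0
for t in range(499):                   # state equation X_{t+1} = F X_t + V_t
    x = F @ x + V[t]

# Series form of X_499 from (1.16): sum_j F^j V_{498 - j}.
x_series = sum(np.linalg.matrix_power(F, j) @ V[498 - j] for j in range(499))
print(np.allclose(x, x_series))        # True
```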

1.4 Stationary Processes

Linear Processes

Definition 1.19 (Wold decomposition). If $\{X_t\}$ is a nondeterministic stationary time series, then
$$X_t = \sum_{j=0}^\infty \psi_j Z_{t-j} + V_t, \qquad (1.17)$$
where

(i) $\psi_0 = 1$ and $\sum_{j=0}^\infty \psi_j^2 < \infty$,
(ii) $Z_t \sim \mathrm{WN}(0, \sigma^2)$,
(iii) $\mathrm{Cov}(Z_s, V_t) = 0$ for all $s, t$,
(iv) $Z_t = P_t Z_t$ for all $t$,
(v) $V_t = P_s V_t$ for all $s, t$,
(vi) $\{V_t\}$ is deterministic.

The sequences $\{Z_t\}$, $\{\psi_j\}$, $\{V_t\}$ are unique and can be written explicitly as
$$Z_t = X_t - P_{t-1} X_t, \qquad (1.18)$$
$$\psi_j = \frac{E(X_t Z_{t-j})}{E(Z_t^2)}, \qquad (1.19)$$
$$V_t = X_t - \sum_{j=0}^\infty \psi_j Z_{t-j}. \qquad (1.20)$$

Definition 1.20. A time series $\{X_t\}$ is a linear process if it has the representation
$$X_t = \sum_{j=-\infty}^\infty \psi_j Z_{t-j}, \qquad (1.21)$$
where $Z_t \sim \mathrm{WN}(0, \sigma^2)$ and $\{\psi_j\}$ is a sequence of constants with $\sum_{j=-\infty}^\infty |\psi_j| < \infty$. A linear process is called a moving average or MA($\infty$) if $\psi_j = 0$ for all $j < 0$, so that
$$X_t = \sum_{j=0}^\infty \psi_j Z_{t-j}. \qquad (1.22)$$

Proposition 1.21. Let $\{Y_t\}$ be a stationary time series with mean zero and covariance function $\gamma_Y$. If $\sum_{j=-\infty}^\infty |\psi_j| < \infty$, then the time series
$$X_t = \sum_{j=-\infty}^\infty \psi_j Y_{t-j} = \psi(B) Y_t \qquad (1.23)$$
is stationary with mean zero and autocovariance function
$$\gamma_X(h) = \sum_{j=-\infty}^\infty \sum_{k=-\infty}^\infty \psi_j \psi_k \gamma_Y(h + k - j). \qquad (1.24)$$
In the special case where $\{X_t\}$ is a linear process,
$$\gamma_X(h) = \sigma^2 \sum_{j=-\infty}^\infty \psi_j \psi_{j+h}. \qquad (1.25)$$
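
The MA($\infty$) case of (1.25) is easy to evaluate numerically. The following sketch (an added illustration) takes a one-sided, truncated coefficient sequence:

```python
import numpy as np

def linear_process_acvf(psi, sigma2, max_lag):
    """ACVF of a (truncated) MA(infinity) process, eq. (1.25):
    gamma(h) = sigma2 * sum_j psi_j psi_{j+h}, psi one-sided with psi[0] = psi_0."""
    psi = np.asarray(psi, dtype=float)
    return np.array([sigma2 * np.dot(psi[:max(len(psi) - h, 0)], psi[h:])
                     for h in range(max_lag + 1)])

# MA(1) with theta = 0.5, sigma2 = 1: gamma(0) = 1.25, gamma(1) = 0.5,
# matching Example 1.14.
print(linear_process_acvf([1.0, 0.5], 1.0, 3))
```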

Forecasting Stationary Time Series

Our goal is to find the linear combination of $1, X_n, X_{n-1}, \ldots, X_1$ that forecasts $X_{n+h}$ with minimum mean squared error. The best linear predictor in terms of $1, X_n, \ldots, X_1$ will be denoted by $P_n X_{n+h}$ and clearly has the form
$$P_n X_{n+h} = a_0 + a_1 X_n + \cdots + a_n X_1. \qquad (1.26)$$
To find the coefficients, we set the derivatives of the (convex) mean squared error to zero and obtain the result given below.

Theorem 1.22 (Properties of the h-step best linear predictor $P_n X_{n+h}$).

(i) $$P_n X_{n+h} = \mu + \sum_{i=1}^n a_i (X_{n+1-i} - \mu), \qquad (1.27)$$
where $a_n = (a_1, \ldots, a_n)^\top$ satisfies
$$\Gamma_n a_n = \gamma_n(h), \qquad (1.28)$$
$$\Gamma_n = [\gamma(i - j)]_{i,j=1}^n, \qquad (1.29)$$
$$\gamma_n(h) = (\gamma(h), \gamma(h+1), \ldots, \gamma(h+n-1))^\top. \qquad (1.30)$$

(ii) $$E\big((X_{n+h} - P_n X_{n+h})^2\big) = \gamma(0) - \langle a_n, \gamma_n(h) \rangle. \qquad (1.31)$$

(iii) $$E(X_{n+h} - P_n X_{n+h}) = 0. \qquad (1.32)$$

(iv) $$E\big((X_{n+h} - P_n X_{n+h}) X_j\big) = 0 \quad \text{for } j = 1, \ldots, n. \qquad (1.33)$$
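
To make (1.28) concrete, a short numpy sketch (added here) solves the prediction equations for a zero-mean series given its ACVF; the AR(1) autocovariance used in the example is $\gamma(h) = \phi^{|h|} / (1 - \phi^2)$:

```python
import numpy as np

def best_linear_predictor(gamma, n, h):
    """Solve Gamma_n a_n = gamma_n(h) (eq. 1.28) and return the coefficient
    vector a_n and the MSE gamma(0) - <a_n, gamma_n(h)> (eq. 1.31).
    gamma[k] must hold the ACVF at lag k for k = 0, ..., n + h - 1."""
    idx = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    Gamma_n = gamma[idx]                      # [gamma(i - j)], eq. (1.29)
    gamma_nh = gamma[h:h + n]                 # eq. (1.30)
    a = np.linalg.solve(Gamma_n, gamma_nh)
    return a, gamma[0] - a @ gamma_nh

phi = 0.7
gamma = phi ** np.arange(12) / (1 - phi ** 2)
a, mse = best_linear_predictor(gamma, n=5, h=1)
print(a)    # approximately (0.7, 0, 0, 0, 0): only X_n matters for AR(1)
print(mse)  # approximately 1 = sigma^2
```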

Definition 1.23 (Prediction operator $P(\cdot \mid W)$). Suppose that $E(U^2) < \infty$, $E(V^2) < \infty$, $\Gamma = \mathrm{Cov}(W, W)$, and $\beta, \alpha_1, \ldots, \alpha_n$ are constants. Then:

(i) $$P(U \mid W) = E(U) + a^\top (W - E(W)), \qquad (1.34)$$
where $\Gamma a = \mathrm{Cov}(U, W)$.

(ii) $$E\big((U - P(U \mid W))\, W\big) = 0 \qquad (1.35)$$
and
$$E\big(U - P(U \mid W)\big) = 0. \qquad (1.36)$$

(iii) $$E\big((U - P(U \mid W))^2\big) = V(U) - a^\top \mathrm{Cov}(U, W). \qquad (1.37)$$

(iv) $$P(\alpha_1 U + \alpha_2 V + \beta \mid W) = \alpha_1 P(U \mid W) + \alpha_2 P(V \mid W) + \beta. \qquad (1.38)$$

(v) $$P\Big( \sum_{i=1}^n \alpha_i W_i + \beta \,\Big|\, W \Big) = \sum_{i=1}^n \alpha_i W_i + \beta. \qquad (1.39)$$

(vi) $$P(U \mid W) = E(U) \quad \text{if } \mathrm{Cov}(U, W) = 0. \qquad (1.40)$$

Innovation Algorithm

Theorem 1.24. Suppose $\{X_t\}$ is a zero-mean series with $E(|X_t|^2) < \infty$ for each $t$ and $E(X_i X_j) = \kappa(i, j)$. Let $\hat X_n = 0$ if $n = 1$ and $\hat X_n = P_{n-1} X_n$ if $n = 2, 3, \ldots$, and let $v_n = E\big((X_{n+1} - P_n X_{n+1})^2\big)$. Define the innovations, or one-step prediction errors, as $U_n = X_n - \hat X_n$. Then we can write
$$\hat X_{n+1} = \begin{cases} 0 & n = 0 \\ \sum_{j=1}^n \theta_{nj} (X_{n+1-j} - \hat X_{n+1-j}) & n \ge 1, \end{cases} \qquad (1.41)$$
where the coefficients $\theta_{n1}, \ldots, \theta_{nn}$ can be computed recursively from the equations
$$v_0 = \kappa(1, 1), \qquad (1.42)$$
$$\theta_{n,n-k} = \frac{1}{v_k} \Big( \kappa(n+1, k+1) - \sum_{j=0}^{k-1} \theta_{k,k-j} \theta_{n,n-j} v_j \Big), \quad 0 \le k < n, \qquad (1.43)$$
$$v_n = \kappa(n+1, n+1) - \sum_{j=0}^{n-1} \theta_{n,n-j}^2 v_j. \qquad (1.44)$$
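
A direct transcription of the recursions (1.42)-(1.44) into Python (an added sketch; `kappa(i, j)` uses the 1-based indices of the theorem):

```python
import numpy as np

def innovations(kappa, N):
    """Innovations algorithm (Theorem 1.24). Returns theta, v where
    theta[n][j - 1] = theta_{nj} and v[n] = MSE of predicting X_{n+1}."""
    v = np.zeros(N + 1)
    theta = [np.zeros(n) for n in range(N + 1)]
    v[0] = kappa(1, 1)                           # eq. (1.42)
    for n in range(1, N + 1):
        for k in range(n):                       # eq. (1.43), k = 0, ..., n-1
            s = sum(theta[k][k - 1 - j] * theta[n][n - 1 - j] * v[j]
                    for j in range(k))
            theta[n][n - 1 - k] = (kappa(n + 1, k + 1) - s) / v[k]
        v[n] = kappa(n + 1, n + 1) - sum(theta[n][n - 1 - j] ** 2 * v[j]
                                         for j in range(n))  # eq. (1.44)
    return theta, v

# Stationary case: kappa(i, j) = gamma(i - j). For the MA(1) of Example 1.14
# (theta = 0.5, sigma2 = 1), theta_{n1} tends to 0.5 and v_n to 1.
g = lambda h: {0: 1.25, 1: 0.5, -1: 0.5}.get(h, 0.0)
theta, v = innovations(lambda i, j: g(i - j), 10)
print(theta[10][0], v[10])
```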

1.5 ARMA Processes

Definition 1.25. $\{X_t\}$ is an ARMA(p, q) process if $\{X_t\}$ is stationary and if for every $t$,
$$X_t - \phi_1 X_{t-1} - \cdots - \phi_p X_{t-p} = Z_t + \theta_1 Z_{t-1} + \cdots + \theta_q Z_{t-q}, \qquad (1.45)$$
where $Z_t \sim \mathrm{WN}(0, \sigma^2)$ and the polynomials $\phi(z) = 1 - \phi_1 z - \cdots - \phi_p z^p$ and $\theta(z) = 1 + \theta_1 z + \cdots + \theta_q z^q$ have no common factors. It is often more convenient to write this in the form
$$\phi(B) X_t = \theta(B) Z_t, \qquad (1.46)$$
with $B$ the back-shift operator. ARMA(0, q) is a moving average process of order $q$ (MA(q)); ARMA(p, 0) is an autoregressive process of order $p$ (AR(p)).

Theorem 1.26. A stationary solution of (1.45) exists (and is the unique stationary solution) if and only if
$$\phi(z) = 1 - \phi_1 z - \cdots - \phi_p z^p \ne 0 \quad \text{for all } |z| = 1. \qquad (1.47)$$

Definition 1.27. An ARMA(p, q) process $\{X_t\}$ is causal (or a causal function of $\{Z_t\}$) if there exist constants $\psi_j$ such that $\sum_{j=0}^\infty |\psi_j| < \infty$ and
$$X_t = \sum_{j=0}^\infty \psi_j Z_{t-j} \qquad (1.48)$$
for all $t$.

Theorem 1.28. An ARMA(p, q) process is causal if and only if
$$\phi(z) = 1 - \phi_1 z - \cdots - \phi_p z^p \ne 0 \quad \text{for all } |z| \le 1. \qquad (1.49)$$
The coefficients $\psi_j$ are determined by
$$\psi_j - \sum_{k=1}^p \phi_k \psi_{j-k} = \theta_j, \quad j = 0, 1, \ldots, \qquad (1.50)$$
where $\theta_0 = 1$, $\theta_j = 0$ for $j > q$, and $\psi_j = 0$ for $j < 0$.
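
The recursion (1.50) gives a simple way to compute the $\psi_j$ numerically; a sketch (added here) with a quick ARMA(1, 1) check, for which $\psi_j = (\phi + \theta)\phi^{j-1}$ for $j \ge 1$:

```python
import numpy as np

def arma_to_ma(phi, theta, n):
    """psi_0, ..., psi_n of a causal ARMA via eq. (1.50):
    psi_j = theta_j + sum_{k=1}^p phi_k psi_{j-k}, with theta_0 = 1."""
    phi = np.asarray(phi, dtype=float)
    theta = np.asarray(theta, dtype=float)
    psi = np.zeros(n + 1)
    for j in range(n + 1):
        th = 1.0 if j == 0 else (theta[j - 1] if j <= len(theta) else 0.0)
        psi[j] = th + sum(phi[k - 1] * psi[j - k]
                          for k in range(1, len(phi) + 1) if j - k >= 0)
    return psi

print(arma_to_ma([0.5], [0.4], 5))  # 1, 0.9, 0.45, 0.225, ...
```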

Definition 1.29. An ARMA(p, q) process is invertible if there exist constants $\pi_j$ such that $\sum_{j=0}^\infty |\pi_j| < \infty$ and
$$Z_t = \sum_{j=0}^\infty \pi_j X_{t-j} \qquad (1.51)$$
for all $t$. The coefficients $\pi_j$ are determined by the equations
$$\pi_j + \sum_{k=1}^q \theta_k \pi_{j-k} = -\phi_j, \qquad (1.52)$$
where $\phi_0 = -1$, $\phi_j = 0$ for $j > p$, and $\pi_j = 0$ for $j < 0$.

Theorem 1.30. Invertibility is equivalent to the condition
$$\theta(z) = 1 + \theta_1 z + \cdots + \theta_q z^q \ne 0 \quad \text{for all } |z| \le 1. \qquad (1.53)$$

ACF and PACF of an ARMA(p, q) Process

Theorem 1.31. For a causal ARMA(p, q) process defined by
$$\phi(B) X_t = \theta(B) Z_t, \qquad (1.54)$$
we know we can write
$$X_t = \sum_{j=0}^\infty \psi_j Z_{t-j}, \qquad (1.55)$$
where $\sum_j \psi_j z^j = \theta(z) / \phi(z)$ for $|z| \le 1$. Thus the ACVF $\gamma$ is given by
$$\gamma(h) = E(X_{t+h} X_t) = \sigma^2 \sum_{j=0}^\infty \psi_j \psi_{j+|h|}. \qquad (1.56)$$
A second approach is to multiply each side of (1.54) by $X_{t-k}$ and take expectations, obtaining a sequence of homogeneous linear difference equations with constant coefficients. These can be solved to obtain the $\gamma(h)$ values.

Definition 1.32 (PACF). The partial autocorrelation function (PACF) of an ARMA process $\{X_t\}$ is the function $\alpha(\cdot)$ defined by
$$\alpha(0) = 1, \qquad (1.57)$$
$$\alpha(h) = \phi_{hh}, \quad h \ge 1, \qquad (1.58)$$
where $\phi_{hh}$ is the last component of $\phi_h = \Gamma_h^{-1} \gamma_h$, with $\Gamma_h = [\gamma(i - j)]_{i,j=1}^h$ and $\gamma_h = (\gamma(1), \gamma(2), \ldots, \gamma(h))^\top$.
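
Definition 1.32 translates directly into code: solve $\Gamma_h \phi_h = \gamma_h$ and take the last component. A sketch (added illustration):

```python
import numpy as np

def pacf_from_acvf(gamma, max_lag):
    """alpha(h) = last component of Gamma_h^{-1} gamma_h (Definition 1.32)."""
    gamma = np.asarray(gamma, dtype=float)
    alpha = [1.0]                                          # eq. (1.57)
    for h in range(1, max_lag + 1):
        idx = np.abs(np.subtract.outer(np.arange(h), np.arange(h)))
        phi_h = np.linalg.solve(gamma[idx], gamma[1:h + 1])
        alpha.append(phi_h[-1])                            # eq. (1.58)
    return np.array(alpha)

# AR(1) with phi = 0.7: alpha(1) = 0.7 and alpha(h) = 0 for h > 1,
# consistent with Theorem 1.34 below.
phi = 0.7
print(pacf_from_acvf(phi ** np.arange(8) / (1 - phi ** 2), 4))
```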

Theorem 1.33. For an AR(p) process, the sample PACF values at lags greater than $p$ are approximately independent $N(0, n^{-1})$ random variables. Thus, a sample PACF satisfying
$$|\hat\alpha(h)| > 1.96\, n^{-1/2} \quad \text{for } 0 \le h \le p \qquad (1.59)$$
and
$$|\hat\alpha(h)| < 1.96\, n^{-1/2} \quad \text{for } h > p \qquad (1.60)$$
suggests an AR(p) model for the data.

Theorem 1.34 (PACF summary). For an AR(p) process $\{X_t\}$, the PACF $\alpha(\cdot)$ has the properties that $\alpha(p) = \phi_p$ and $\alpha(h) = 0$ for $h > p$. For $h < p$, $\alpha(h)$ can be computed numerically from $\phi_h = \Gamma_h^{-1} \gamma_h$.

Forecasting ARMA Processes

For the causal ARMA(p, q) process
$$\phi(B) X_t = \theta(B) Z_t, \quad Z_t \sim \mathrm{WN}(0, \sigma^2), \qquad (1.61)$$
we can avoid using the full innovations algorithm by applying it to the transformed process
$$W_t = \begin{cases} \sigma^{-1} X_t & t = 1, \ldots, m \\ \sigma^{-1} \phi(B) X_t & t > m, \end{cases} \qquad (1.62)$$
where $m = \max(p, q)$. For notational convenience, take $\theta_0 = 1$ and $\theta_j = 0$ for $j > q$.

Lemma 1.35. The autocovariances $\kappa(i, j) = E(W_i W_j)$ are found from
$$\kappa(i, j) = \begin{cases} \sigma^{-2} \gamma_X(i - j) & 1 \le i, j \le m \\ \sigma^{-2} \big[ \gamma_X(i - j) - \sum_{r=1}^p \phi_r \gamma_X(r - |i - j|) \big] & \min(i, j) \le m < \max(i, j) \le 2m \\ \sum_{r=0}^q \theta_r \theta_{r + |i - j|} & \min(i, j) > m \\ 0 & \text{otherwise.} \end{cases} \qquad (1.63)$$

Applying the innovations algorithm to the process $\{W_t\}$, we obtain
$$\hat W_{n+1} = \begin{cases} \sum_{j=1}^n \theta_{nj} (W_{n+1-j} - \hat W_{n+1-j}) & 1 \le n < m \\ \sum_{j=1}^q \theta_{nj} (W_{n+1-j} - \hat W_{n+1-j}) & n \ge m, \end{cases} \qquad (1.64)$$
where the coefficients $\theta_{nj}$ and mean squared errors $r_n = E\big((W_{n+1} - \hat W_{n+1})^2\big)$ are found recursively using the innovations algorithm. The equations (1.62) allow each $X_n$ to be written as a linear combination of $W_j$, $1 \le j \le n$, and conversely each $W_n$, $n \ge 1$, to be written as a linear combination of $X_j$, $1 \le j \le n$. Thus the best linear predictor of any random variable $Y$ in terms of $\{1, X_1, \ldots, X_n\}$ is the same as the best linear predictor of $Y$ in terms of $\{1, W_1, \ldots, W_n\}$. By linearity of $P_n$,
$$\hat W_t = \begin{cases} \sigma^{-1} \hat X_t & t = 1, \ldots, m \\ \sigma^{-1} (\hat X_t - \phi_1 X_{t-1} - \cdots - \phi_p X_{t-p}) & t > m, \end{cases} \qquad (1.65)$$
which shows that
$$X_t - \hat X_t = \sigma (W_t - \hat W_t). \qquad (1.66)$$
Substituting this into (1.64), we obtain
$$\hat X_{n+1} = \begin{cases} \sum_{j=1}^n \theta_{nj} (X_{n+1-j} - \hat X_{n+1-j}) & 1 \le n < m \\ \phi_1 X_n + \cdots + \phi_p X_{n+1-p} + \sum_{j=1}^q \theta_{nj} (X_{n+1-j} - \hat X_{n+1-j}) & n \ge m, \end{cases} \qquad (1.67)$$
and
$$E\big((X_{n+1} - \hat X_{n+1})^2\big) = \sigma^2 E\big((W_{n+1} - \hat W_{n+1})^2\big) = \sigma^2 r_n, \qquad (1.68)$$
where $\theta_{nj}$ and $r_n$ are found using the innovations algorithm applied to (1.63).
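
Putting the pieces together, the sketch below (added here; it reuses `innovations`, `arma_to_ma`, and `linear_process_acvf` from the earlier sketches, and truncates the $\psi$-weights at an arbitrary 200 lags, which assumes they decay quickly) computes one-step forecasts for a causal ARMA model:

```python
import numpy as np

def arma_onestep(x, phi, theta, sigma2):
    """One-step forecasts xhat_2, ..., xhat_{n+1} and their MSEs via the
    transformed process (1.62), kappa from (1.63), recursions (1.67)/(1.68)."""
    phi = np.asarray(phi, dtype=float)
    theta = np.asarray(theta, dtype=float)
    p, q = len(phi), len(theta)
    m, n = max(p, q), len(x)
    psi = arma_to_ma(phi, theta, 200)                # truncated psi-weights
    gamma = linear_process_acvf(psi, sigma2, n + m)  # gamma_X via eq. (1.25)
    th = np.concatenate(([1.0], theta))              # theta_0 = 1

    def kappa(i, j):                                 # eq. (1.63), 1-based
        d = abs(i - j)
        if max(i, j) <= m:
            return gamma[d] / sigma2
        if min(i, j) <= m < max(i, j) <= 2 * m:
            s = sum(phi[r - 1] * gamma[abs(r - d)] for r in range(1, p + 1))
            return (gamma[d] - s) / sigma2
        if min(i, j) > m:
            return sum(th[r] * th[r + d] for r in range(q + 1 - d)) if d <= q else 0.0
        return 0.0

    theta_nj, r = innovations(kappa, n)
    xhat = np.zeros(n + 2)                           # xhat[k] predicts X_k (1-based)
    for t in range(1, n + 1):                        # eq. (1.67)
        inn = lambda j: theta_nj[t][j - 1] * (x[t - j] - xhat[t + 1 - j])
        if t < m:
            xhat[t + 1] = sum(inn(j) for j in range(1, t + 1))
        else:
            xhat[t + 1] = sum(phi[i - 1] * x[t - i] for i in range(1, p + 1)) \
                          + sum(inn(j) for j in range(1, q + 1))
    return xhat[2:], sigma2 * r[1:]                  # eq. (1.68): MSE = sigma^2 r_n
```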

1.6 Estimation of ARMA Processes

Yule-Walker Equations

Consider estimating a causal AR(p) process. We can write
$$X_t = \sum_{j=0}^\infty \psi_j Z_{t-j}, \qquad (1.69)$$
where $\sum_j \psi_j z^j = 1 / \phi(z)$ for $|z| \le 1$. Multiplying each side of the AR equations by $X_{t-j}$, $j = 0, 1, \ldots, p$, and taking expectations, we obtain the Yule-Walker equations
$$\Gamma_p \phi = \gamma_p \quad \text{and} \quad \sigma^2 = \gamma(0) - \langle \phi, \gamma_p \rangle, \qquad (1.70)$$
where $\Gamma_p = [\gamma(i - j)]_{i,j=1}^p$ and $\gamma_p = (\gamma(1), \gamma(2), \ldots, \gamma(p))^\top$. If we replace the covariances by the sample covariances $\hat\gamma(j)$, we obtain a set of equations for the so-called Yule-Walker estimators $\hat\phi$ and $\hat\sigma^2$, given by
$$\hat\Gamma_p \hat\phi = \hat\gamma_p \quad \text{and} \quad \hat\sigma^2 = \hat\gamma(0) - \langle \hat\phi, \hat\gamma_p \rangle. \qquad (1.71)$$

Theorem 1.36. If $\{X_t\}$ is the causal AR(p) process and $\hat\phi$ is the Yule-Walker estimator of $\phi$, then
$$n^{1/2} (\hat\phi - \phi) \xrightarrow{d} N(0, \sigma^2 \Gamma_p^{-1}). \qquad (1.72)$$
Moreover, $\hat\sigma^2 \xrightarrow{p} \sigma^2$.

Theorem 1.37. If $\{X_t\}$ is a causal AR(p) process and $\hat\phi_m$ is the Yule-Walker estimate of order $m > p$, then
$$n^{1/2} (\hat\phi_m - \phi_m) \xrightarrow{d} N(0, \sigma^2 \Gamma_m^{-1}), \qquad (1.73)$$
where $\phi_m$ is the coefficient vector of the best linear predictor $\langle \phi_m, X_m \rangle$ of $X_{m+1}$ based on $X_m, \ldots, X_1$; that is, $\phi_m = R_m^{-1} \rho_m$, with $R_m$ the autocorrelation matrix and $\rho_m = (\rho(1), \ldots, \rho(m))^\top$. In particular, for $m > p$,
$$n^{1/2} \hat\phi_{mm} \xrightarrow{d} N(0, 1). \qquad (1.74)$$
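
A compact numpy sketch (added illustration) of the Yule-Walker estimators (1.71), applied to a simulated AR(2):

```python
import numpy as np

def yule_walker(x, p):
    """Yule-Walker estimators (eq. 1.71) from the divisor-n sample ACVF."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    d = x - x.mean()
    g = np.array([np.dot(d[h:], d[:n - h]) / n for h in range(p + 1)])
    Gamma = g[np.abs(np.subtract.outer(np.arange(p), np.arange(p)))]
    phi_hat = np.linalg.solve(Gamma, g[1:])
    return phi_hat, g[0] - phi_hat @ g[1:]

rng = np.random.default_rng(1)
x = np.zeros(5000)
z = rng.normal(size=5000)
for t in range(2, 5000):
    x[t] = 0.5 * x[t - 1] - 0.3 * x[t - 2] + z[t]
print(yule_walker(x, 2))  # roughly (0.5, -0.3) and sigma2 near 1
```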

Theorem 1.38 (Durbin-Levinson algorithm for AR models). Consider fitting an AR(m) process
$$X_t - \hat\phi_{m1} X_{t-1} - \cdots - \hat\phi_{mm} X_{t-m} = Z_t, \quad Z_t \sim \mathrm{WN}(0, \hat v_m). \qquad (1.75)$$
If $\hat\gamma(0) > 0$, then the fitted autoregressive models for $m = 1, 2, \ldots, n - 1$ can be determined recursively from the relations
$$\hat\phi_{11} = \hat\rho(1), \qquad (1.76)$$
$$\hat v_1 = \hat\gamma(0) (1 - \hat\rho^2(1)), \qquad (1.77)$$
$$\hat\phi_{mm} = \frac{\hat\gamma(m) - \sum_{j=1}^{m-1} \hat\phi_{m-1,j}\, \hat\gamma(m - j)}{\hat v_{m-1}}, \qquad (1.78)$$
$$\begin{pmatrix} \hat\phi_{m1} \\ \vdots \\ \hat\phi_{m,m-1} \end{pmatrix} = \hat\phi_{m-1} - \hat\phi_{mm} \begin{pmatrix} \hat\phi_{m-1,m-1} \\ \vdots \\ \hat\phi_{m-1,1} \end{pmatrix}, \qquad (1.79)$$
$$\hat v_m = \hat v_{m-1} (1 - \hat\phi_{mm}^2). \qquad (1.80)$$

Theorem 1.39 (Confidence intervals for AR(p) estimation). Under the assumption that the order $p$ of the fitted model is the correct value, for large sample size $n$, the region
$$\{\phi \in \mathbb{R}^p : (\phi - \hat\phi_p)^\top \hat\Gamma_p (\phi - \hat\phi_p) \le n^{-1} \hat v_p\, \chi^2_{1-\alpha}(p)\} \qquad (1.81)$$
contains $\phi_p$ with probability close to $1 - \alpha$, where $\chi^2_{1-\alpha}(p)$ is the $(1 - \alpha)$ quantile of the chi-squared distribution with $p$ degrees of freedom. Similarly, if $\Phi_{1-\alpha}$ is the $(1 - \alpha)$ quantile of the standard normal distribution and $\hat v_{jj}$ is the $j$-th diagonal element of $\hat v_p \hat\Gamma_p^{-1}$, then for large $n$ the interval
$$\{\hat\phi_{pj} \pm \Phi_{1-\alpha/2}\, n^{-1/2}\, \hat v_{jj}^{1/2}\} \qquad (1.82)$$
contains $\phi_{pj}$ with probability close to $1 - \alpha$.
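
The recursions of Theorem 1.38 in code (an added sketch); by Definition 1.32, the sequence $\hat\phi_{mm}$ is also the sample PACF:

```python
import numpy as np

def durbin_levinson(gamma, max_order):
    """Durbin-Levinson recursions (eqs. 1.76-1.80). gamma[h] = ACVF at lag h.
    Returns dicts phi (AR(m) coefficient vectors) and v (noise variances)."""
    phi = {1: np.array([gamma[1] / gamma[0]])}               # eq. (1.76)
    v = {0: gamma[0], 1: gamma[0] * (1 - phi[1][0] ** 2)}    # eq. (1.77)
    for m in range(2, max_order + 1):
        prev = phi[m - 1]
        num = gamma[m] - sum(prev[j - 1] * gamma[m - j] for j in range(1, m))
        phi_mm = num / v[m - 1]                              # eq. (1.78)
        phi[m] = np.append(prev - phi_mm * prev[::-1], phi_mm)  # eq. (1.79)
        v[m] = v[m - 1] * (1 - phi_mm ** 2)                  # eq. (1.80)
    return phi, v

# AR(1) ACVF: the PACF phi[m][-1] is 0.7 at lag 1 and 0 beyond.
ph = 0.7
phi, v = durbin_levinson(ph ** np.arange(8) / (1 - ph ** 2), 4)
print([phi[m][-1] for m in (1, 2, 3, 4)])
```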

Estimation for Moving Average Processes Using the Innovations Algorithm

Consider estimating
$$X_t = Z_t + \hat\theta_{m1} Z_{t-1} + \cdots + \hat\theta_{mm} Z_{t-m} \qquad (1.83)$$
with $Z_t \sim \mathrm{WN}(0, \hat v_m)$.

Theorem 1.40. We can obtain the innovations estimates by applying the recursive relations
$$\hat v_0 = \hat\gamma(0), \qquad (1.84)$$
$$\hat\theta_{m,m-k} = \frac{1}{\hat v_k} \Big( \hat\gamma(m - k) - \sum_{j=0}^{k-1} \hat\theta_{m,m-j} \hat\theta_{k,k-j} \hat v_j \Big), \quad 0 \le k < m, \qquad (1.85)$$
$$\hat v_m = \hat\gamma(0) - \sum_{j=0}^{m-1} \hat\theta_{m,m-j}^2 \hat v_j. \qquad (1.86)$$

Theorem 1.41. Let $\{X_t\}$ be the causal invertible ARMA process $\phi(B) X_t = \theta(B) Z_t$ with $Z_t \sim \mathrm{WN}(0, \sigma^2)$, $E(Z_t^4) < \infty$, and let $\psi(z) = \sum_j \psi_j z^j = \theta(z) / \phi(z)$ for $|z| \le 1$, with $\psi_0 = 1$ and $\psi_j = 0$ for $j < 0$. Then for any sequence of positive integers $m = m_n$ such that $m < n$, $m \to \infty$, and $m = o(n^{1/3})$ as $n \to \infty$, we have for each $k$
$$n^{1/2} (\hat\theta_{m1} - \psi_1, \ldots, \hat\theta_{mk} - \psi_k)^\top \xrightarrow{d} N(0, A), \qquad (1.87)$$
where $A = [a_{ij}]_{i,j=1}^k$ and
$$a_{ij} = \sum_{r=1}^{\min(i,j)} \psi_{i-r} \psi_{j-r}, \qquad (1.88)$$
and
$$\hat v_m \xrightarrow{p} \sigma^2. \qquad (1.89)$$

Remark 1.42. Note that for the AR(p) process, the Yule-Walker estimator is a consistent estimator of $\phi_p$. However, for an MA(q) process, the estimator $\hat\theta_q$ is not consistent for the true parameter vector as $n \to \infty$. For consistency, it is necessary to use estimators with $m$ satisfying the conditions given in Theorem 1.41.
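
The recursions (1.84)-(1.86) are the innovations algorithm with $\kappa(i, j) = \hat\gamma(i - j)$; a sketch (added illustration):

```python
import numpy as np

def innovations_estimates(gamma_hat, m):
    """Innovations estimates (Theorem 1.40): returns
    (theta_hat_{m1}, ..., theta_hat_{mm}) and v_hat_m."""
    v = np.zeros(m + 1)
    theta = np.zeros((m + 1, m + 1))              # theta[n, j] = theta_hat_{nj}
    v[0] = gamma_hat[0]                           # eq. (1.84)
    for n in range(1, m + 1):
        for k in range(n):                        # eq. (1.85)
            s = sum(theta[n, n - j] * theta[k, k - j] * v[j] for j in range(k))
            theta[n, n - k] = (gamma_hat[n - k] - s) / v[k]
        v[n] = gamma_hat[0] - sum(theta[n, n - j] ** 2 * v[j]
                                  for j in range(n))  # eq. (1.86)
    return theta[m, 1:m + 1], v[m]

# Per Remark 1.42, read MA(q) estimates off a row with m growing with n
# (m = o(n^{1/3})), not off the fixed row m = q.
```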

Theorem 1.43 (Asymptotic confidence regions for $\theta_q$). The interval
$$\Big\{ \theta \in \mathbb{R} : |\theta - \hat\theta_{mj}| \le \Phi_{1-\alpha/2}\, n^{-1/2} \Big( \sum_{k=0}^{j-1} \hat\theta_{mk}^2 \Big)^{1/2} \Big\}, \quad \hat\theta_{m0} = 1, \qquad (1.90)$$
is a $(1 - \alpha)$ confidence interval for $\theta_{mj}$.

Maximum Likelihood Estimation

Consider $\{X_t\}$ a Gaussian time series with zero mean and autocovariance function $\kappa(i, j) = E(X_i X_j)$. Let $X_n = (X_1, \ldots, X_n)^\top$, and let $\hat X_j = P_{j-1} X_j$ (with $\hat X_1 = 0$). Let $\Gamma_n$ be the covariance matrix of $X_n$, and assume $\Gamma_n$ is nonsingular. The likelihood of $X_n$ is
$$L(\Gamma_n) = \frac{1}{(2\pi)^{n/2} (\det \Gamma_n)^{1/2}} \exp\Big( -\tfrac{1}{2} X_n^\top \Gamma_n^{-1} X_n \Big). \qquad (1.91)$$

Theorem 1.44. The likelihood of the vector $X_n$ reduces to
$$L(\Gamma_n) = \frac{1}{(2\pi)^{n/2} (r_0 r_1 \cdots r_{n-1})^{1/2}} \exp\Big( -\frac{1}{2} \sum_{j=1}^n \frac{(X_j - \hat X_j)^2}{r_{j-1}} \Big). \qquad (1.92)$$

Remark 1.45. Even if $\{X_t\}$ is not Gaussian, the large-sample estimates are the same for $Z_t \sim \mathrm{IID}(0, \sigma^2)$, regardless of whether or not $Z_t$ is Gaussian.

Theorem 1.46 (Maximum likelihood estimators for ARMA processes).
$$\hat\sigma^2 = \frac{1}{n} S(\hat\phi, \hat\theta), \qquad (1.93)$$
where $\hat\phi, \hat\theta$ are the values of $\phi, \theta$ that minimize
$$\ell(\phi, \theta) = \ln\Big( \frac{1}{n} S(\phi, \theta) \Big) + \frac{1}{n} \sum_{j=1}^n \ln r_{j-1}, \qquad (1.94)$$
and
$$S(\phi, \theta) = \sum_{j=1}^n \frac{(X_j - \hat X_j)^2}{r_{j-1}}. \qquad (1.95)$$
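
Equation (1.92) makes the Gaussian likelihood cheap to evaluate once the innovations algorithm has run. A sketch (added here; it reuses `innovations` from the earlier sketch) computing $-2 \ln L$:

```python
import numpy as np

def neg2_loglik(x, kappa):
    """-2 ln L(Gamma_n) via the innovations form (eq. 1.92)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    theta, v = innovations(kappa, n)
    xhat = np.zeros(n)                     # xhat[t] predicts x[t]; xhat[0] = 0
    for t in range(1, n):
        xhat[t] = sum(theta[t][j - 1] * (x[t - j] - xhat[t - j])
                      for j in range(1, t + 1))
    r = v[:n]                              # r_0, ..., r_{n-1}
    return n * np.log(2 * np.pi) + np.sum(np.log(r)) + np.sum((x - xhat) ** 2 / r)
```

Minimizing this over $(\phi, \theta, \sigma^2)$, with $\kappa$ built from the model ACVF, is one way to organize the maximum likelihood fit of Theorem 1.46.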

Theorem 1.47 (Asymptotic distribution of maximum likelihood estimators). For a large sample from an ARMA(p, q) process, with $\beta = (\phi, \theta)$,
$$\hat\beta \approx N\Big( \beta, \frac{1}{n} V(\beta) \Big), \qquad (1.96)$$
where
$$V(\beta) = \sigma^2 \begin{pmatrix} E(U_t U_t^\top) & E(U_t V_t^\top) \\ E(V_t U_t^\top) & E(V_t V_t^\top) \end{pmatrix}^{-1}, \qquad (1.97)$$
and $\{U_t\}$, $\{V_t\}$ are the autoregressive processes defined by $\phi(B) U_t = Z_t$ and $\theta(B) V_t = Z_t$ (with $U_t$, $V_t$ denoting the vectors of their $p$ and $q$ most recent values). Note that for $p = 0$, $V(\beta) = \sigma^2 [E(V_t V_t^\top)]^{-1}$, and for $q = 0$, $V(\beta) = \sigma^2 [E(U_t U_t^\top)]^{-1}$.

Order Selection

Definition 1.48 (Kullback-Leibler divergence). The Kullback-Leibler (KL) divergence between $f(\cdot; \psi)$ and $f(\cdot; \theta)$ is defined as
$$d(\psi \mid \theta) = \Delta(\psi \mid \theta) - \Delta(\theta \mid \theta), \qquad (1.98)$$
where
$$\Delta(\psi \mid \theta) = E_\theta(-2 \ln f(X; \psi)) \qquad (1.99)$$
is the Kullback-Leibler index of $f(\cdot; \psi)$ relative to $f(\cdot; \theta)$.

Theorem 1.49 (AICC of an ARMA(p, q) process).
$$\mathrm{AICC}(\beta) = -2 \ln L_X\Big( \beta, \frac{S_X(\beta)}{n} \Big) + \frac{2 (p + q + 1) n}{n - p - q - 2}. \qquad (1.100)$$

Theorem 1.50 (AIC of an ARMA(p, q) process).
$$\mathrm{AIC}(\beta) = -2 \ln L_X\Big( \beta, \frac{S_X(\beta)}{n} \Big) + 2 (p + q + 1). \qquad (1.101)$$

Theorem 1.51 (BIC of an ARMA(p, q) process).
$$\mathrm{BIC} = (n - p - q) \ln \frac{n \hat\sigma^2}{n - p - q} + n \big( 1 + \ln \sqrt{2\pi} \big) + (p + q) \ln \frac{\sum_{t=1}^n X_t^2 - n \hat\sigma^2}{p + q}, \qquad (1.102)$$
where $\hat\sigma^2$ is the maximum likelihood estimate of the white noise variance.
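
The criteria (1.100)-(1.101) are one-liners given $-2 \ln L$ at the fitted parameters (an added sketch):

```python
def aic_aicc(neg2_loglik_value, n, p, q):
    """AIC (eq. 1.101) and AICC (eq. 1.100) from -2 ln L at the MLE."""
    aic = neg2_loglik_value + 2 * (p + q + 1)
    aicc = neg2_loglik_value + 2 * (p + q + 1) * n / (n - p - q - 2)
    return aic, aicc

# Typical use: evaluate over candidate (p, q) and keep the minimizer of AICC.
```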

1.7 Spectral Analysis

Let $\{X_t\}$ be a zero-mean stationary time series with autocovariance function $\gamma(\cdot)$ satisfying $\sum_{h=-\infty}^\infty |\gamma(h)| < \infty$.

Definition 1.52. The spectral density of $\{X_t\}$ is the function $f(\cdot)$ defined by
$$f(\lambda) = \frac{1}{2\pi} \sum_{h=-\infty}^\infty e^{-ih\lambda} \gamma(h). \qquad (1.103)$$
The absolute summability of $\gamma$ implies that the series converges absolutely.

Theorem 1.53.
(i) $f$ is even.
(ii) $f(\lambda) \ge 0$ for all $\lambda \in (-\pi, \pi]$.
(iii) $\gamma(k) = \int_{-\pi}^\pi e^{ik\lambda} f(\lambda)\, d\lambda = \int_{-\pi}^\pi \cos(k\lambda) f(\lambda)\, d\lambda$.

Definition 1.54. A function $f$ is the spectral density of a stationary time series $\{X_t\}$ with autocovariance function $\gamma(\cdot)$ if
(i) $f(\lambda) \ge 0$ for all $\lambda \in (0, \pi]$,
(ii) $\gamma(h) = \int_{-\pi}^\pi e^{ih\lambda} f(\lambda)\, d\lambda$ for all integers $h$.

Lemma 1.55. If $f$ and $g$ are two spectral densities corresponding to the same autocovariance function $\gamma$, then $f$ and $g$ have the same Fourier coefficients and hence are equal.

Theorem 1.56. A real-valued function $f$ defined on $(-\pi, \pi]$ is the spectral density of a stationary process if and only if
(i) $f(\lambda) = f(-\lambda)$,
(ii) $f(\lambda) \ge 0$,
(iii) $\int_{-\pi}^\pi f(\lambda)\, d\lambda < \infty$.

Theorem 1.57. An absolutely summable function $\gamma(\cdot)$ is the autocovariance function of a stationary time series if and only if it is even and
$$f(\lambda) = \frac{1}{2\pi} \sum_{h=-\infty}^\infty e^{-ih\lambda} \gamma(h) \ge 0 \qquad (1.104)$$
for all $\lambda \in (-\pi, \pi]$, in which case $f(\cdot)$ is the spectral density of $\gamma(\cdot)$.
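
Since $\gamma$ is even, (1.103) reduces to a cosine sum, which the following sketch (added illustration) evaluates for a truncated ACVF:

```python
import numpy as np

def spectral_density(gamma, lams):
    """Truncated version of eq. (1.103): f(lambda) =
    (2 pi)^{-1} sum_h e^{-ih lambda} gamma(h), using gamma(-h) = gamma(h)."""
    gamma = np.asarray(gamma, dtype=float)
    hs = np.arange(1, len(gamma))
    lams = np.atleast_1d(np.asarray(lams, dtype=float))
    f = gamma[0] + 2.0 * (gamma[1:] @ np.cos(np.outer(hs, lams)))
    return f / (2 * np.pi)

# MA(1), theta = 0.5, sigma2 = 1: f(lambda) = (1.25 + cos(lambda)) / (2 pi).
print(spectral_density([1.25, 0.5], [0.0, np.pi / 2, np.pi]))
```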

Theorem 1.58 (Spectral representation of the ACVF). A function $\gamma(\cdot)$ defined on the integers is the ACVF of a stationary time series if and only if there exists a right-continuous, nondecreasing, bounded function $F$ on $[-\pi, \pi]$ with $F(-\pi) = 0$ such that
$$\gamma(h) = \int_{-\pi}^\pi e^{ih\lambda}\, dF(\lambda) \qquad (1.105)$$
for all integers $h$.

Remark 1.59. The function $F$ is a generalized distribution function on $[-\pi, \pi]$ in the sense that $G(\lambda) = F(\lambda)/F(\pi)$ is a probability distribution function on $[-\pi, \pi]$. Note that since $F(\pi) = \gamma(0) = V(X_1)$, the ACF of $\{X_t\}$ has the spectral representation
$$\rho(h) = \int_{-\pi}^\pi e^{ih\lambda}\, dG(\lambda). \qquad (1.106)$$
The function $F$ is called the spectral distribution function of $\gamma(\cdot)$. If $F(\lambda)$ can be expressed as $F(\lambda) = \int_{-\pi}^\lambda f(y)\, dy$ for all $\lambda \in [-\pi, \pi]$, then $f$ is the spectral density function and the time series is said to have a continuous spectrum. If $F$ is a discrete function, then the time series is said to have a discrete spectrum.

Theorem 1.60. A complex-valued function $\gamma(\cdot)$ is the autocovariance function of a stationary process $\{X_t\}$ if and only if either
(i) $\gamma(h) = \int_{-\pi}^\pi e^{ih\nu}\, dF(\nu)$ for all $h = 0, \pm 1, \ldots$, where $F$ is a right-continuous, nondecreasing, bounded function on $[-\pi, \pi]$ with $F(-\pi) = 0$, or
(ii) $\sum_{i,j=1}^n a_i \gamma(i - j) \bar a_j \ge 0$ for all positive integers $n$ and all $a = (a_1, \ldots, a_n) \in \mathbb{C}^n$.

The Spectral Density of an ARMA Process

Theorem 1.61. If $\{Y_t\}$ is any zero-mean, possibly complex-valued stationary process with spectral distribution function $F_Y(\cdot)$, and $\{X_t\}$ is the process $X_t = \sum_{j=-\infty}^\infty \psi_j Y_{t-j}$ where $\sum_{j=-\infty}^\infty |\psi_j| < \infty$, then $\{X_t\}$ is stationary with spectral distribution function
$$F_X(\lambda) = \int_{(-\pi, \lambda]} \Big| \sum_{j=-\infty}^\infty \psi_j e^{-ij\nu} \Big|^2\, dF_Y(\nu), \quad -\pi \le \lambda \le \pi.$$
If $\{Y_t\}$ has a spectral density $f_Y(\cdot)$, then $\{X_t\}$ has a spectral density $f_X(\cdot)$ given by $f_X(\lambda) = |\Psi(e^{-i\lambda})|^2 f_Y(\lambda)$, where $\Psi(e^{-i\lambda}) = \sum_{j=-\infty}^\infty \psi_j e^{-ij\lambda}$.

Theorem 1.62. Let $\{X_t\}$ be an ARMA(p, q) process, not necessarily causal or invertible, satisfying $\phi(B) X_t = \theta(B) Z_t$, $Z_t \sim \mathrm{WN}(0, \sigma^2)$, where $\phi(z) = 1 - \phi_1 z - \cdots - \phi_p z^p$ and $\theta(z) = 1 + \theta_1 z + \cdots + \theta_q z^q$ have no common zeroes and $\phi(z)$ has no zeroes on the unit circle. Then $\{X_t\}$ has spectral density
$$f_X(\lambda) = \frac{\sigma^2}{2\pi} \frac{|\theta(e^{-i\lambda})|^2}{|\phi(e^{-i\lambda})|^2}, \quad -\pi \le \lambda \le \pi. \qquad (1.107)$$

Theorem 1.63. The spectral density of the white noise process is constant: $f(\lambda) = \frac{\sigma^2}{2\pi}$.

The Periodogram

Definition 1.64. The periodogram of $(x_1, \ldots, x_n)$ is the function
$$I_n(\lambda) = \frac{1}{n} \Big| \sum_{t=1}^n x_t e^{-it\lambda} \Big|^2. \qquad (1.108)$$

Theorem 1.65. If $x_1, \ldots, x_n$ are any real numbers and $\omega_k$ is any of the nonzero Fourier frequencies $2\pi k / n$ in $(-\pi, \pi]$, then
$$I_n(\omega_k) = \sum_{|h| < n} \hat\gamma(h) e^{-ih\omega_k},$$
where $\hat\gamma(h)$ is the sample ACVF of $x_1, \ldots, x_n$.
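
Equation (1.107) evaluates directly from the model polynomials; a sketch (added illustration):

```python
import numpy as np

def arma_spectral_density(phi, theta, sigma2, lams):
    """f_X(lambda) = sigma^2/(2 pi) |theta(e^{-i lambda})|^2 / |phi(e^{-i lambda})|^2,
    eq. (1.107)."""
    lams = np.atleast_1d(np.asarray(lams, dtype=float))
    z = np.exp(-1j * lams)
    th = sum((t * z ** k for k, t in enumerate(theta, start=1)), np.ones_like(z))
    ph = np.ones_like(z) - sum((p * z ** k for k, p in enumerate(phi, start=1)),
                               np.zeros_like(z))
    return sigma2 / (2 * np.pi) * np.abs(th) ** 2 / np.abs(ph) ** 2

# AR(1), phi = 0.7: mass concentrated near lambda = 0 (slowly varying series).
print(arma_spectral_density([0.7], [], 1.0, [0.0, np.pi / 2, np.pi]))
```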

Theorem 1.66. Let $\{X_t\}$ be the linear process $X_t = \sum_{j=-\infty}^\infty \psi_j Z_{t-j}$, $Z_t \sim \mathrm{IID}(0, \sigma^2)$, with $\sum_{j=-\infty}^\infty |\psi_j| < \infty$. Let $I_n(\lambda)$ be the periodogram of $X_1, \ldots, X_n$, and let $f(\lambda)$ be the spectral density of $\{X_t\}$.

(i) If $f(\lambda) > 0$ for all $\lambda \in [-\pi, \pi]$ and if $0 < \lambda_1 < \cdots < \lambda_m < \pi$, then the random vector $(I_n(\lambda_1), \ldots, I_n(\lambda_m))^\top$ converges in distribution to a vector of independent and exponentially distributed random variables, the $i$-th component of which has mean $2\pi f(\lambda_i)$, $i = 1, \ldots, m$.

(ii) If $\sum_{j=-\infty}^\infty |\psi_j| |j|^{1/2} < \infty$, $E(Z_1^4) = \nu \sigma^4 < \infty$, $\omega_j = 2\pi j / n \ge 0$, and $\omega_k = 2\pi k / n \ge 0$, then
$$\mathrm{Cov}\big( I_n(\omega_j), I_n(\omega_k) \big) = \begin{cases} 2 (2\pi)^2 f^2(\omega_j) + O(n^{-1/2}) & \omega_j = \omega_k \in \{0, \pi\} \\ (2\pi)^2 f^2(\omega_j) + O(n^{-1/2}) & 0 < \omega_j = \omega_k < \pi \\ O(n^{-1}) & \omega_j \ne \omega_k. \end{cases} \qquad (1.109)$$

Definition 1.67. The estimator $\hat f(\omega) = \hat f(g(n, \omega))$, where $g(n, \omega)$ is the Fourier frequency nearest $\omega$ and
$$\hat f(\omega_j) = \frac{1}{2\pi} \sum_{|k| \le m_n} W_n(k) I_n(\omega_{j+k}),$$
with $m_n \to \infty$, $m_n / n \to 0$, $W_n(k) = W_n(-k)$, $W_n(k) \ge 0$ for all $k$, $\sum_{|k| \le m_n} W_n(k) = 1$, and $\sum_{|k| \le m_n} W_n^2(k) \to 0$ as $n \to \infty$, is called a discrete spectral average estimator of $f(\omega)$.

Theorem 1.68. Let $\{X_t\}$ be the linear process $X_t = \sum_{j=-\infty}^\infty \psi_j Z_{t-j}$, $Z_t \sim \mathrm{IID}(0, \sigma^2)$, with $\sum_{j=-\infty}^\infty |\psi_j| |j|^{1/2} < \infty$ and $E(Z_1^4) < \infty$. If $\hat f$ is a discrete spectral average estimator of the spectral density $f$, then for $\lambda, \omega \in [0, \pi]$,

(i) $\lim_{n \to \infty} E\big( \hat f(\omega) \big) = f(\omega)$;

(ii) $$\lim_{n \to \infty} \Big( \sum_{|j| \le m_n} W_n^2(j) \Big)^{-1} \mathrm{Cov}\big( \hat f(\omega), \hat f(\lambda) \big) = \begin{cases} 2 f^2(\omega) & \omega = \lambda \in \{0, \pi\} \\ f^2(\omega) & 0 < \omega = \lambda < \pi \\ 0 & \omega \ne \lambda. \end{cases} \qquad (1.110)$$
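
A sketch (added illustration) of the periodogram and a discrete spectral average estimator with the simplest weight choice, $W_n(k) = 1/(2m + 1)$ for $|k| \le m$; since $I_n$ is $2\pi$-periodic, neighbouring frequencies wrap around modulo $n$:

```python
import numpy as np

def periodogram(x):
    """I_n at the Fourier frequencies omega_k = 2 pi k / n (eq. 1.108);
    subtract the sample mean first if the series is not mean-zero."""
    x = np.asarray(x, dtype=float)
    return np.abs(np.fft.fft(x)) ** 2 / len(x)

def smoothed_periodogram(x, m):
    """Discrete spectral average estimator (Definition 1.67), uniform weights."""
    I = periodogram(x)
    n = len(I)
    f_hat = np.empty(n)
    for j in range(n):
        f_hat[j] = I[np.arange(j - m, j + m + 1) % n].mean() / (2 * np.pi)
    return f_hat   # f_hat[j] estimates f(2 pi j / n)

# White noise with sigma2 = 1: the estimate hovers near 1 / (2 pi).
rng = np.random.default_rng(3)
print(smoothed_periodogram(rng.normal(size=512), 10)[:4])
```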

2 Bibliography

Peter J. Brockwell and Richard A. Davis. Introduction to Time Series and Forecasting. Taylor & Francis US, 2002.

Peter J. Brockwell and Richard A. Davis. Time Series: Theory and Methods. Springer, 2009.
