TIME SERIES

Part III Example Sheet 1 - Solutions. YC/Lent 2015. Comments and corrections should be emailed to Y.Chen@statslab.cam.ac.uk.

1. Let $\{X_t\}$ be a weakly stationary process with mean zero and let $a$ and $b$ be constants.

(a) If $Y_t = a + bt + s_t + X_t$, where $s_t$ is a seasonal component with period 12, show that $\nabla\nabla_{12} Y_t = (1-B)(1-B^{12})Y_t$ is weakly stationary.

(b) If $Y_t = (a + bt)s_t + X_t$, where $s_t$ is again a seasonal component with period 12, show that $\nabla_{12}^2 Y_t = (1-B^{12})(1-B^{12})Y_t$ is weakly stationary.

(a) We have
$$\nabla\nabla_{12} Y_t = (1 - B - B^{12} + B^{13})Y_t = b\{t - (t-1) - (t-12) + (t-13)\} + (s_t - s_{t-12}) - (s_{t-1} - s_{t-13}) + X_t - X_{t-1} - X_{t-12} + X_{t-13} = X_t - X_{t-1} - X_{t-12} + X_{t-13},$$
since the constant and trend terms cancel and $s_t = s_{t-12}$. Therefore,
$$E(\nabla\nabla_{12} Y_t) = 0, \qquad E(\nabla\nabla_{12} Y_t)(\nabla\nabla_{12} Y_{t+h}) = E(X_t - X_{t-1} - X_{t-12} + X_{t-13})(X_{t+h} - X_{t+h-1} - X_{t+h-12} + X_{t+h-13}).$$
Since these values do not depend on $t$, $\nabla\nabla_{12} Y_t$ is weakly stationary.

(b) We have
$$R_t := (1 - B^{12})Y_t = 12b\, s_{t-12} + X_t - X_{t-12}$$
and
$$\nabla_{12}^2 Y_t = (1 - B^{12})R_t = X_t - 2X_{t-12} + X_{t-24}.$$
Hence $\nabla_{12}^2 Y_t$ is stationary, by the same argument as in (a).
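Remark (numerical check, not part of the original sheet). The cancellation in (a) can be verified by simulation: apply $(1-B)(1-B^{12})$ to a simulated $Y_t$ and compare with the differenced noise directly. The sketch below assumes NumPy; the trend parameters, the period-12 pattern, and the helper `lag_diff` are ad hoc illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 600
t = np.arange(n)
a, b = 2.0, 0.1                                  # arbitrary trend parameters
s = np.tile(rng.normal(size=12), n // 12)        # an arbitrary pattern with period 12
X = rng.normal(size=n)                           # white noise standing in for {X_t}
Y = a + b * t + s + X

def lag_diff(z, lag):                            # apply (1 - B^lag)
    return z[lag:] - z[:-lag]

D = lag_diff(lag_diff(Y, 1), 12)                 # (1 - B)(1 - B^12) Y
ref = X[13:] - X[12:-1] - X[1:-12] + X[:-13]     # X_t - X_{t-1} - X_{t-12} + X_{t-13}
print(np.allclose(D, ref))                       # True: trend and seasonality vanish
```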

2 and q q cov(w t, W t+h ) = (2q + 1) 2 cov X t+j, j= q j= q X t+h+j σ 2 /(2q + 1), h = 0 = σ 2 (2q + 1 h )/(2q + 1) 2, 0 < h 2q 0, h > 2q, which does not depend on t. This filter is a low-pass (or smoothing) filter since it takes the data {Y t } and removes from it the rapidly fluctuating (or high frequency) component {X t }, to leave the slowly varying estimated trend a + bt. 3. Which, if any, of the following functions defined on the integers is the autocovariance function of a weakly stationary time series? (a) (b) f(h) = { 1 if h = 0, 1/h if h 0. 1 if h = 0, 0.8 if h = ±1, f(h) = 0.6 if h = ±2, 0 otherwise. (a) No. An autocovariance function should be even, while f(h) is not. (b) No. An autocovariance function should be semi-positive (or non-negative) definite. It is easy to see that the following matrix is not semi-positive definite. 4. Let ω (0, π) be given, U Uniform[ π, π], and define X t = cos(ωt + U). Prove that the process is weakly stationary. In addition, show that X t = (2 cos ω)x t 1 X t 2, so we can perform perfect prediction using a linear function of past observations. Note that E(X t ) = 1 π cos(ωt + u)du = 0 2π π 2

3 and π γ h = cov(x t, X t+h ) = 1 cos(ωt + u) cos(ωt + ωh + u)du 2π π = 1 π 1 {cos(ωh) + cos(2ωt + ωh + 2u)}du 2π π 2 = 1 {2π cos(ωh) + 0} 4π = cos(ωh)/2. Thus {X t } is weakly stationary. As a side remark, γ h does not go to zero as h. Now because X t + X t 2 = cos(ωt + U) + cos(ωt 2ω + U) = 2 cos(ω) cos(ωt ω + U) = 2 cos(ω)x t 1, we have that X t = (2 cos ω)x t 1 X t For a given weakly stationary series {X t } and a pre-determined series {a 0, a 1, a 2,...} with i=0 a i <. Consider the sequence Y m t = m a j X t j j=0 for m = 0, 1, 2,.... Argue that Yt m has a mean square limit, that is, there exists a random variable Y t with EYt 2 < such that E Y t Yt m 2 0, as m. Explain why this result is useful when we write an ARMA model in its causal (i.e. MA( )) or invertible (i.e. AR( )) representation. It suffices to prove that {Yt m, m N} is a Cauchy sequence, i.e., E Yt n n > m > 0. This follows from E Y n t Y m t 2 = E = m<j n m<j n m<k n a j X t j 2 m<j n m<k n m<j n m<k n ( = var(x 1 ) m<j n a j a k E(X t j X t k ) a j a k E(X t j X t k ) a j a k (EX 2 t j) 1/2 (EX 2 t k) 1/2 a j ) 2 0, Yt m 2 0 as m, n for where we used the Cauchy Schwarz inequality in the second last line and the stationarity of {X t } in the last line. This result allows us to represent the present observation from ARMA as an infinite sum of past noise (i.e. MA( )), or represent the present noise as an infinite sum of past observations (i.e. AR( )). We now know that these representations need to be understood in the sense of mean squares convergence. 3

4. Let $\omega \in (0, \pi)$ be given, $U \sim \mathrm{Uniform}[-\pi, \pi]$, and define $X_t = \cos(\omega t + U)$. Prove that the process is weakly stationary. In addition, show that $X_t = (2\cos\omega)X_{t-1} - X_{t-2}$, so we can perform perfect prediction using a linear function of past observations.

Note that
$$E(X_t) = \frac{1}{2\pi}\int_{-\pi}^{\pi} \cos(\omega t + u)\, du = 0$$
and
$$\gamma_h = \mathrm{cov}(X_t, X_{t+h}) = \frac{1}{2\pi}\int_{-\pi}^{\pi} \cos(\omega t + u)\cos(\omega t + \omega h + u)\, du = \frac{1}{2\pi}\int_{-\pi}^{\pi} \frac{1}{2}\{\cos(\omega h) + \cos(2\omega t + \omega h + 2u)\}\, du = \frac{1}{4\pi}\{2\pi\cos(\omega h) + 0\} = \cos(\omega h)/2.$$
Thus $\{X_t\}$ is weakly stationary. As a side remark, $\gamma_h$ does not go to zero as $h \to \infty$. Now, because
$$X_t + X_{t-2} = \cos(\omega t + U) + \cos(\omega t - 2\omega + U) = 2\cos(\omega)\cos(\omega t - \omega + U) = 2\cos(\omega)X_{t-1}$$
by the sum-to-product identity, we have that $X_t = (2\cos\omega)X_{t-1} - X_{t-2}$.
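Remark (numerical check, not part of the original sheet). A quick Monte Carlo check of $\gamma_h = \cos(\omega h)/2$, and of the fact that the recursion holds exactly for every realisation; the values of $\omega$, $t$, $h$ and the sample size below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(2)
omega, t, h = 1.0, 7, 3                        # arbitrary choices for illustration
U = rng.uniform(-np.pi, np.pi, size=200_000)   # independent copies of U

def X(s):
    return np.cos(omega * s + U)               # X_s across all realisations

# Empirical E(X_t X_{t+h}) against gamma_h = cos(omega h)/2 (means are zero)
print(np.mean(X(t) * X(t + h)), np.cos(omega * h) / 2)

# The recursion X_t = (2 cos omega) X_{t-1} - X_{t-2} holds exactly, path by path
print(np.allclose(X(t), 2 * np.cos(omega) * X(t - 1) - X(t - 2)))
```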

5. For a given weakly stationary series $\{X_t\}$ and a pre-determined series $\{a_0, a_1, a_2, \ldots\}$ with $\sum_{i=0}^{\infty} |a_i| < \infty$, consider the sequence
$$Y_t^m = \sum_{j=0}^{m} a_j X_{t-j}$$
for $m = 0, 1, 2, \ldots$. Argue that $Y_t^m$ has a mean-square limit, that is, there exists a random variable $Y_t$ with $EY_t^2 < \infty$ such that $E|Y_t - Y_t^m|^2 \to 0$ as $m \to \infty$. Explain why this result is useful when we write an ARMA model in its causal (i.e. MA($\infty$)) or invertible (i.e. AR($\infty$)) representation.

Since $L^2$ is complete, it suffices to prove that $\{Y_t^m, m \in \mathbb{N}\}$ is a Cauchy sequence, i.e., $E|Y_t^n - Y_t^m|^2 \to 0$ as $m, n \to \infty$, for $n > m > 0$. This follows from
$$E|Y_t^n - Y_t^m|^2 = E\Big|\sum_{m<j\le n} a_j X_{t-j}\Big|^2 = \sum_{m<j\le n}\sum_{m<k\le n} a_j a_k E(X_{t-j}X_{t-k}) \le \sum_{m<j\le n}\sum_{m<k\le n} |a_j||a_k|\,(EX_{t-j}^2)^{1/2}(EX_{t-k}^2)^{1/2} = E(X_1^2)\Big(\sum_{m<j\le n} |a_j|\Big)^2 \to 0,$$
where we used the Cauchy-Schwarz inequality in the second-to-last step and the stationarity of $\{X_t\}$ (so that $EX_t^2$ does not depend on $t$) in the last step.

This result allows us to represent the present observation of an ARMA process as an infinite sum of past noise (i.e. MA($\infty$)), or to represent the present noise as an infinite sum of past observations (i.e. AR($\infty$)). We now know that these representations need to be understood in the sense of mean-square convergence.

6. Prove that if an MA(q) process is invertible, then the roots of the equation $\Theta(z) = 0$ all lie outside the unit circle.

Let $\Theta(B) = 1 + \theta_1 B + \theta_2 B^2 + \cdots + \theta_q B^q$ and write the MA(q) process as $X_t = \Theta(B)\epsilon_t$. By the invertibility of the process, we can find $\Pi(B) = \sum_{j=0}^{\infty} \pi_j B^j$, with $\sum_{j=0}^{\infty} |\pi_j| < \infty$, such that $\epsilon_t = \Pi(B)X_t$. Therefore, $\Theta(B)\epsilon_t = \Theta(B)\Pi(B)X_t = \Theta(B)\Pi(B)\Theta(B)\epsilon_t$, or equivalently,
$$\{\Theta(B) - \Theta(B)\Pi(B)\Theta(B)\}\epsilon_t = 0.$$
Note that $\Theta(z) - \Theta(z)\Pi(z)\Theta(z)$ is a polynomial (though possibly of infinite degree). Multiplying both sides of the above equation by $\epsilon_{t-h}$ for any integer $h \ge 0$ and taking the expectation allows us to see that the coefficient of $z^h$ in $\Theta(z) - \Theta(z)\Pi(z)\Theta(z)$ is zero. Furthermore, because $\sum_{j=0}^{\infty}|\pi_j| < \infty$, $\Theta(z) - \Theta(z)\Pi(z)\Theta(z)$ is well defined for any $|z| \le 1$ (as the limit of polynomials of finite degree). Consequently,
$$\Theta(z) - \Theta(z)\Pi(z)\Theta(z) = 0 \quad \text{for any } |z| \le 1.$$
For any $|z| \le 1$ such that $\Theta(z) \neq 0$, this implies that $\Pi(z)\Theta(z) = 1$. Because $\Pi(z)\Theta(z)$ is a continuous function, and the set $\{z : \Theta(z) = 0\}$ can contain at most $q$ points, we conclude that $\Pi(z)\Theta(z) = 1$ for any $|z| \le 1$. If there were a complex number in the closed unit disc, say $|z_0| \le 1$, for which $\Theta(z_0) = 0$, then it would imply $\Pi(z_0)\Theta(z_0) = 0$, which would lead to a contradiction. Therefore, all the roots of $\Theta(z)$ are outside the unit circle. Moreover, the series $\{\pi_j\}$ can be determined by solving
$$\Pi(z) = \sum_{j=0}^{\infty} \pi_j z^j = \frac{1}{\Theta(z)}, \quad |z| \le 1.$$
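Remark (numerical illustration, not part of the original sheet). The weights $\{\pi_j\}$ in Question 6 can be computed by matching coefficients in $\Pi(z)\Theta(z) = 1$, and their absolute summability (the mode of convergence discussed in Question 5) is visible numerically when the roots of $\Theta$ lie outside the unit circle. The particular $\Theta$ and the truncation length `m` below are arbitrary choices.

```python
import numpy as np

# Theta(z) = 1 + 0.9 z + 0.2 z^2 = (1 + 0.4 z)(1 + 0.5 z); roots -2.5 and -2, both outside |z| = 1
theta = np.array([1.0, 0.9, 0.2])
q = len(theta) - 1

# Matching coefficients in Pi(z) Theta(z) = 1 gives pi_0 = 1 and
# pi_j = -sum_{k=1}^{min(j,q)} theta_k pi_{j-k} for j >= 1.
m = 50
pi = np.zeros(m)
pi[0] = 1.0
for j in range(1, m):
    for k in range(1, min(j, q) + 1):
        pi[j] -= theta[k] * pi[j - k]

print(np.sum(np.abs(pi)))            # the partial sums of |pi_j| settle down (absolute summability)
target = np.zeros(m); target[0] = 1.0
print(np.allclose(np.convolve(theta, pi)[:m], target))   # Theta * Pi = 1 up to truncation
```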

7. For any MA(2) process, if $\eta_1, \eta_2 \in \mathbb{C}$ are the roots of the MA polynomial $\Theta(z)$, show that they can be replaced by $1/\eta_1$ and $1/\eta_2$ without changing the ACF. That is, MA processes with $\Theta(z) = (1 - \eta_1^{-1}z)(1 - \eta_2^{-1}z)$ and $\Theta^*(z) = (1 - \eta_1 z)(1 - \eta_2 z)$ have the same ACF. Briefly comment on how you would generalise this result to MA(q) with $q > 2$.

For an MA(2) process, the ACF cuts off after lag 2, so we only need to concentrate on $\rho_1$ and $\rho_2$. Suppose that the MA polynomial is $\Theta(z) = 1 + \theta_1 z + \theta_2 z^2$; then
$$\rho_1 = \frac{\theta_1 + \theta_1\theta_2}{1 + \theta_1^2 + \theta_2^2}, \qquad \rho_2 = \frac{\theta_2}{1 + \theta_1^2 + \theta_2^2}.$$
By plugging in the following two sets of coefficients (also note that when $\eta_1$ and $\eta_2$ are complex they are conjugate, so both sets are real),
$$\begin{cases} \theta_1 = -(\eta_1^{-1} + \eta_2^{-1}) \\ \theta_2 = \eta_1^{-1}\eta_2^{-1} \end{cases} \qquad\text{and}\qquad \begin{cases} \theta_1^* = -(\eta_1 + \eta_2) \\ \theta_2^* = \eta_1\eta_2, \end{cases}$$
we can verify that the ACFs are the same.

This result can be generalised to MA(q), where if $\eta_1, \ldots, \eta_q \in \mathbb{C}$ are the roots of the MA polynomial $\Theta(z)$, some (or all) of the roots $\eta_i$ can be replaced by $1/\eta_i$ ($1 \le i \le q$) without changing the ACF. Spectral analysis turns out to be extremely useful in proving this generalisation.

8. Show that in order for an AR(2) process with autoregressive polynomial $\Phi(z) = 1 - \phi_1 z - \phi_2 z^2$ to be causal, the parameters $(\phi_1, \phi_2)$ must lie in the triangular region determined by the intersection of the three regions
$$\phi_1 + \phi_2 < 1, \qquad \phi_2 - \phi_1 < 1, \qquad |\phi_2| < 1.$$

The process can be written as $\Phi(B)X_t = \epsilon_t$, where $B$ is the backward shift operator. $\{X_t\}$ is causal if all of the (complex) roots of $\Phi(z) = 0$ are outside the unit disc in $\mathbb{C}$. Note that $\Phi$ is a quadratic polynomial with real coefficients, so either we have two distinct real roots, a repeated real root, or a complex conjugate pair. Suppose that $\Phi$ can be factorised as $\Phi(z) = (1 - az)(1 - bz)$ with $|a|, |b| < 1$, so that the roots $1/a$ and $1/b$ lie outside the unit disc. Then by comparing coefficients of $z$ and $z^2$, we see that $\phi_1 = a + b$ and $\phi_2 = -ab$.

(a) Repeated root: $a = b$, thus $(\phi_1, \phi_2)$ must lie on the parametric curve $f(a) = (2a, -a^2)$, $|a| < 1$.

(b) Distinct real roots: say $a = c + d$, $b = c - d$ for some $c, d$ with $|c| < 1$ and $0 < d < 1 - |c|$. Then $\phi_1 = 2c$, $\phi_2 = d^2 - c^2$, which are bounded by the curve from the first case and the lines $\phi_1 + \phi_2 = 1$ and $\phi_2 - \phi_1 = 1$.

(c) Complex conjugate pair: let $a = c + id$, $b = c - id$ with $c^2 + d^2 < 1$. Then $\phi_1 = 2c$, $\phi_2 = -(c^2 + d^2)$, which are bounded by the curve and the line $\phi_2 = -1$.

The combination of all these cases is the interior of the given region in the $(\phi_1, \phi_2)$ plane. We conclude that the stipulated conditions are equivalent to causality.
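Remark (numerical check, not part of the original sheet). The region in Question 8 can also be checked by brute force: sample $(\phi_1, \phi_2)$, test the root condition directly, and compare with the three inequalities. This is an illustration, not a proof; the sampling range and size are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)

def causal_by_roots(phi1, phi2):
    # Roots of Phi(z) = 1 - phi1*z - phi2*z^2 (np.roots wants the highest degree first)
    return np.all(np.abs(np.roots([-phi2, -phi1, 1.0])) > 1.0)

def causal_by_region(phi1, phi2):
    return (phi1 + phi2 < 1) and (phi2 - phi1 < 1) and (abs(phi2) < 1)

pts = rng.uniform(-2.5, 2.5, size=(10_000, 2))
print(all(causal_by_roots(p1, p2) == causal_by_region(p1, p2) for p1, p2 in pts))  # True
```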

9. Let $\epsilon_t \overset{\text{i.i.d.}}{\sim} N(0, \sigma^2)$ and let $|\phi| < 1$ be a constant. Consider the process $X_1 = \epsilon_1$ and $X_t = \phi X_{t-1} + \epsilon_t$, for $t = 2, 3, \ldots$.

(a) Find the mean and the variance of the series. Is $\{X_t\}$ stationary?

(b) Find $\mathrm{corr}(X_t, X_{t+h})$ for $h \in \mathbb{N}$. Argue that for large $t$, $\mathrm{var}(X_t) \approx \sigma^2/(1-\phi^2)$ and $\mathrm{corr}(X_t, X_{t+h}) \approx \phi^h$. So in a sense, $\{X_t\}$ is asymptotically stationary.

(c) Now suppose $X_1 = \epsilon_1/\sqrt{1-\phi^2}$. Is this new process stationary? Comment on how you could use these findings to simulate $n$ observations from a stationary AR(1).

(a) Write $X_t$ as
$$X_t = \epsilon_t + \phi\epsilon_{t-1} + \phi^2\epsilon_{t-2} + \cdots + \phi^{t-2}\epsilon_2 + \phi^{t-1}\epsilon_1.$$
So $EX_t = 0$ and $EX_t^2 = \sigma^2(1-\phi^{2t})/(1-\phi^2)$. Since the variance depends on $t$, $\{X_t\}$ is not (either weakly or strongly) stationary.

(b) For $h \in \mathbb{N}$,
$$\mathrm{cov}(X_t, X_{t+h}) = \sigma^2(\phi^h + \phi^{h+2} + \cdots + \phi^{2t+h-2}) = \phi^h\,\mathrm{var}(X_t).$$
It follows that, as $t \to \infty$,
$$\mathrm{var}(X_t) = \sigma^2(1-\phi^{2t})/(1-\phi^2) \to \sigma^2/(1-\phi^2),$$
$$\mathrm{corr}(X_t, X_{t+h}) = \frac{\mathrm{cov}(X_t, X_{t+h})}{[\mathrm{var}(X_t)\,\mathrm{var}(X_{t+h})]^{1/2}} = \Big[\frac{\mathrm{var}(X_t)}{\mathrm{var}(X_{t+h})}\Big]^{1/2}\phi^h = \phi^h\sqrt{\frac{1-\phi^{2t}}{1-\phi^{2(t+h)}}} \to \phi^h.$$

(c) Now $EX_t = 0$ and
$$\mathrm{var}(X_t) = \sigma^2\{1 + \phi^2 + \phi^4 + \cdots + \phi^{2(t-2)} + \phi^{2(t-1)}/(1-\phi^2)\} = \sigma^2/(1-\phi^2).$$
Moreover, for $h \in \mathbb{N}$,
$$\mathrm{cov}(X_t, X_{t+h}) = \sigma^2\{\phi^h + \phi^{h+2} + \cdots + \phi^{2t+h-4} + \phi^{2t+h-2}/(1-\phi^2)\} = \frac{\sigma^2\phi^h}{1-\phi^2},$$
which does not depend on $t$. Therefore, the new series is (both weakly and strongly) stationary. When simulating an AR(1), we can either discard the initial chunk of data (a burn-in period), or simulate the first observation $X_1$ from its stationary distribution $N(0, \sigma^2/(1-\phi^2))$.
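Remark (numerical illustration, not part of the original sheet). The two simulation strategies at the end of Question 9 can be compared directly: starting from $X_1 = \epsilon_1$ the variance creeps up towards $\sigma^2/(1-\phi^2)$, while starting from the stationary distribution it is constant from the first observation. Parameter values below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(4)
phi, sigma, n, reps = 0.9, 1.0, 50, 50_000
eps = rng.normal(scale=sigma, size=(reps, n))

def simulate(x1):
    X = np.empty((reps, n))
    X[:, 0] = x1
    for t in range(1, n):
        X[:, t] = phi * X[:, t - 1] + eps[:, t]
    return X

X_naive = simulate(eps[:, 0])                          # X_1 = eps_1
X_stat  = simulate(eps[:, 0] / np.sqrt(1 - phi**2))    # X_1 from the stationary law

print(sigma**2 / (1 - phi**2))                         # 5.26..., the stationary variance
print(X_naive[:, 1].var(), X_naive[:, -1].var())       # ~1.81 early, growing towards 5.26
print(X_stat[:, 1].var(),  X_stat[:, -1].var())        # ~5.26 at every t
```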

10. Identify the following models as ARMA(p, q) models, and determine whether they are causal and/or invertible:

(a) $X_t = 0.4X_{t-1} + 0.45X_{t-2} + \epsilon_t + \epsilon_{t-1} + 0.25\epsilon_{t-2}$.

(b) $X_t = X_{t-1} - 0.5X_{t-2} + \epsilon_t - \epsilon_{t-1}$.

(a) Rewrite the process as
$$(1 - 0.9B)(1 + 0.5B)X_t = (1 + 0.5B)^2\epsilon_t.$$
Since $(1 + 0.5z)$ is a common factor of both the AR polynomial and the MA polynomial, there is an issue of parameter redundancy. This process is actually an ARMA(1,1) process, which can be written as
$$(1 - 0.9B)X_t = (1 + 0.5B)\epsilon_t.$$
Since both the root of $1 - 0.9z = 0$ (namely $z = 10/9$) and the root of $1 + 0.5z = 0$ (namely $z = -2$) lie outside the unit circle, we conclude that this process is both causal and invertible.

(b) Rewrite the process as $(1 - B + 0.5B^2)X_t = (1 - B)\epsilon_t$. This is an ARMA(2,1) process. Since the roots of the AR polynomial, $1 \pm i$, lie outside the unit circle, the process is causal. However, it is not invertible, as the MA polynomial has a unit root.

11. For the AR(2) model given by $X_t = -0.8X_{t-2} + \epsilon_t$, determine and sketch its ACF and PACF.

Iterating the recursion,
$$X_t = \epsilon_t - 0.8X_{t-2} = \epsilon_t - 0.8\epsilon_{t-2} + 0.64X_{t-4} = \cdots = \sum_{j=0}^{\infty} (-0.8)^j \epsilon_{t-2j}.$$
It is easy to see that $\rho_h = 0$ when $h$ is odd. When $h$ is even, we let $Y_t = X_{2t}$. Therefore,
$$\rho_h = \mathrm{corr}(X_t, X_{t+h}) = \mathrm{corr}(Y_t, Y_{t+h/2}) = (-0.8)^{h/2},$$
where the last equality follows from the fact that $\{Y_t\}$ is an AR(1) process with $\phi = -0.8$. For the PACF, because $\{X_t\}$ is an AR(2) process, $\phi_{hh} = 0$ for any $h > 2$. By definition, $\phi_{11} = \rho_1 = 0$. Lastly, since $\beta = \rho_1 = 0$, we have
$$\phi_{22} = \mathrm{corr}(X_t - \beta X_{t+1},\, X_{t+2} - \beta X_{t+1}) = \mathrm{corr}(X_t, X_{t+2}) = \rho_2 = -0.8.$$
This example demonstrates that the ACF and PACF are not necessarily decreasing functions (even in absolute value)!

12. (*) Suppose that $\{X_t\}$ and $\{Y_t\}$ are two zero-mean weakly stationary processes with the same autocovariance function and that $\{Y_t\}$ is an ARMA(p, q) process. Show that $\{X_t\}$ must also be an ARMA(p, q) process.¹

First, we state and prove the following lemma.

Lemma. If $\{X_t\}$ is a non-zero stationary process with autocovariance function $\gamma$ such that $\gamma_h = 0$ for $|h| > q$ and $\gamma_q \neq 0$, then $\{X_t\}$ is an MA(q) process, i.e., there exists a white-noise process $\{Z_t\}$ such that
$$X_t = Z_t + \theta_1 Z_{t-1} + \cdots + \theta_q Z_{t-q}. \tag{1}$$

Proof of Lemma. For each $t$, define the subspace $M_t = \overline{\mathrm{sp}}\{X_s, -\infty < s \le t\}$ of $L^2$ and set
$$Z_t = X_t - P_{M_{t-1}} X_t. \tag{2}$$
Here $\overline{\mathrm{sp}}\{\cdot\}$ is the closure of a linear span and $P_M(\cdot)$ represents the projection onto a subspace $M$. Clearly $Z_t \in M_t$, and by the definition of $P_{M_{t-1}}$, $Z_t \in M_{t-1}^{\perp}$, where $M^{\perp}$ is the orthogonal complement of $M$. Thus if $s < t$, $Z_s \in M_s \subseteq M_{t-1}$ and hence $EZ_sZ_t = 0$. Moreover,
$$P_{\mathrm{sp}\{X_s,\, s = t-n, \ldots, t-1\}}X_t \xrightarrow{\text{m.s.}} P_{M_{t-1}}X_t, \quad n \to \infty,$$
where m.s. means convergence in mean square. By weak stationarity and the continuity of the $L^2$-norm,
$$\|Z_{t+1}\| = \|X_{t+1} - P_{M_t}X_{t+1}\| = \lim_{n\to\infty} \|X_{t+1} - P_{\mathrm{sp}\{X_s,\, s=t+1-n,\ldots,t\}}X_{t+1}\| = \lim_{n\to\infty} \|X_t - P_{\mathrm{sp}\{X_s,\, s=t-n,\ldots,t-1\}}X_t\| = \|X_t - P_{M_{t-1}}X_t\| = \|Z_t\|.$$
Defining $\sigma^2 = \|Z_t\|^2$, we conclude that $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$.

Now by (2), it follows that
$$M_{t-1} = \overline{\mathrm{sp}}\{X_s, s \le t-2,\, Z_{t-1}\} = \overline{\mathrm{sp}}\{X_s, s \le t-q-1,\, Z_{t-q}, \ldots, Z_{t-1}\},$$
and consequently $M_{t-1}$ can be decomposed into the two orthogonal subspaces $\mathrm{sp}\{Z_{t-q}, \ldots, Z_{t-1}\}$ and $M_{t-q-1}$. Since $\gamma_h = 0$ for $|h| > q$, it follows that $X_t \perp M_{t-q-1}$, and
$$P_{M_{t-1}}X_t = P_{M_{t-q-1}}X_t + P_{\mathrm{sp}\{Z_{t-q},\ldots,Z_{t-1}\}}X_t = \theta_1 Z_{t-1} + \cdots + \theta_q Z_{t-q},$$
where $\theta_j := \sigma^{-2} E(X_t Z_{t-j})$, which by stationarity is independent of $t$ for $j = 1, \ldots, q$. Substituting for $P_{M_{t-1}}X_t$ in (2) gives (1). $\square$

Let $\{W_t\}$ be the time series satisfying
$$W_t := X_t - \phi_1 X_{t-1} - \cdots - \phi_p X_{t-p} = \Phi(B)X_t,$$
where $\Phi(z) = 1 - \phi_1 z - \cdots - \phi_p z^p$ and $\phi_1, \ldots, \phi_p$ are the AR coefficients for $\{Y_t\}$. The autocovariance function of $\{W_t\}$ thus satisfies
$$\gamma_h^W = \mathrm{cov}\{\Phi(B)X_t, \Phi(B)X_{t+h}\} = \mathrm{cov}\{\Phi(B)Y_t, \Phi(B)Y_{t+h}\}.$$
Since $\{\Phi(B)Y_t\}$ is an MA(q) process by assumption, we have that $\gamma_h^W = 0$ for $|h| > q$. It follows from the above lemma that $\{W_t\}$ is also an MA(q) process, and thus $\{X_t\}$ is an ARMA(p, q) process.

¹In this question, ARMA(p, q) processes should be interpreted in the broad sense, where the errors are white noise but not necessarily Gaussian.
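Remark (numerical companion to Questions 10 and 11, not part of the original sheet). The root checks of Question 10 can be automated with `np.roots`, and the PACF values claimed in Question 11 can be recovered from the ACF via the Durbin-Levinson recursion; the helper `pacf` below is a sketch of that recursion, not a library routine.

```python
import numpy as np

# Question 10: moduli of the AR and MA polynomial roots (coefficients in ascending order)
for label, ar, ma in [("(a)", [1, -0.4, -0.45], [1, 1, 0.25]),
                      ("(b)", [1, -1, 0.5],     [1, -1])]:
    print(label, np.abs(np.roots(ar[::-1])), np.abs(np.roots(ma[::-1])))
# (a): AR roots {10/9, 2} in modulus, MA roots {2, 2}; the shared root -2 is the redundant factor.
# (b): AR roots have modulus sqrt(2) > 1 (causal), the MA root has modulus 1 (not invertible).

# Question 11: ACF of X_t = -0.8 X_{t-2} + eps_t, and its PACF via Durbin-Levinson
rho = np.array([(-0.8) ** (h // 2) if h % 2 == 0 else 0.0 for h in range(11)])

def pacf(rho, H):
    """Partial autocorrelations phi_{hh}, h = 1..H, from the ACF (Durbin-Levinson)."""
    phi = np.zeros((H + 1, H + 1))
    phi[1, 1] = rho[1]
    for h in range(2, H + 1):
        phi[h, h] = (rho[h] - phi[h - 1, 1:h] @ rho[h - 1:0:-1]) / \
                    (1.0 - phi[h - 1, 1:h] @ rho[1:h])
        phi[h, 1:h] = phi[h - 1, 1:h] - phi[h, h] * phi[h - 1, h - 1:0:-1]
    return np.diag(phi)[1:]

print(rho[:7])       # 1, 0, -0.8, 0, 0.64, 0, -0.512
print(pacf(rho, 5))  # 0, -0.8, 0, 0, 0: the PACF cuts off after lag 2
```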
