Levinson-Durbin Recursions: I


Levinson-Durbin Recursions: I

- note: B&D and S&S say "Durbin-Levinson," but "Levinson-Durbin" is more commonly used (Levinson, 1947, and Durbin, 1960, are the source articles; sometimes just "Levinson" is used)
- the recursions solve
  $$\Gamma_n a_n = \gamma_n(1)$$
  efficiently, giving us the coefficients $a_n$ needed for the best linear predictor $\hat{X}_{n+1} = a_n' X_n$ of $X_{n+1}$ given $X_n = [X_n, \ldots, X_1]'$
- in doing so, the L-D recursions also give us the coefficients $a_m$ for $\hat{X}_{m+1} = a_m' X_m$, $m = 1, \ldots, n-1$, the best linear predictor of $X_{m+1}$ given $X_m = [X_m, \ldots, X_1]'$
- partial autocorrelation function (PACF), also known as the partial autocorrelation sequence or reflection coefficient sequence
- will state the L-D recursions without proof (B&D have one; S&S leave it as an exercise; an alternative proof will be given in Stat/EE 520)

BD 69, CC 113, SS 112, 165  XI 1

Levinson-Durbin Recursions: II

- to keep track of best linear predictors as the sample size n increases (and to emphasize certain connections with AR processes), will switch notation from $a_n$ to $\phi_n$ henceforth
- we now write
  $$\hat{X}_{n+1} = \phi_{n,1} X_n + \phi_{n,2} X_{n-1} + \cdots + \phi_{n,n} X_1 = \phi_n' X_n,$$
  where $\phi_n \equiv [\phi_{n,1}, \phi_{n,2}, \ldots, \phi_{n,n}]'$
- simplify $\gamma_n(1)$ to just $\gamma_n$, so that $\gamma_n = [\gamma(1), \gamma(2), \ldots, \gamma(n)]'$
- in the new notation, the L-D recursions solve for $\phi_n$ in $\Gamma_n \phi_n = \gamma_n$
- recall that $\Gamma_n$ is the covariance matrix for $X_n$, so its $(i,j)$th element is $\mathrm{cov}\{X_i, X_j\} = \gamma(i-j)$

XI 2

Levinson-Durbin Recursions: III

- referring back to overhead X 13, will denote the mean square error (MSE) associated with the predictor $\hat{X}_{n+1}$ as
  $$v_n \equiv E\{(X_{n+1} - \hat{X}_{n+1})^2\} = \mathrm{var}\{X_{n+1} - \hat{X}_{n+1}\} = \gamma(0) - \phi_n'\gamma_n = \mathrm{var}\{X_{n+1}\} - \phi_n'\,\mathrm{cov}\{X_{n+1}, X_n\}$$

BD 69, 70  XI 3

Levinson-Durbin Recursions: IV

- for n = 1, have $\hat{X}_2 \equiv \phi_{1,1} X_1$
- the equation $\Gamma_n \phi_n = \gamma_n$ becomes $\gamma(0)\phi_{1,1} = \gamma(1)$, whose solution is $\phi_{1,1} = \gamma(1)/\gamma(0) = \rho(1)$
- the associated MSE is
  $$v_1 = \gamma(0) - \phi_1'\gamma_1 = \gamma(0) - \phi_{1,1}\gamma(1) = \gamma(0) - \phi_{1,1}[\phi_{1,1}\gamma(0)] = \gamma(0)(1 - \phi_{1,1}^2) = v_0(1 - \phi_{1,1}^2),$$
  with $v_0 \equiv \gamma(0)$ (the third equality makes use of $\gamma(1) = \phi_{1,1}\gamma(0)$)
- Q: why is $\gamma(0)$ a natural definition for $v_0$?
- note the connection to the AR(1) model $X_t = \phi_{1,1} X_{t-1} + Z_t$ with $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2(1 - \phi_{1,1}^2))$, for which $\gamma(0) = \sigma^2$

BD 70, SS 112  XI 4

Levinson-Durbin Recursions: V

- given $\phi_{n-1}$ & $v_{n-1}$, the L-D recursion gets $\phi_n$ & $v_n$ in 3 steps
- 1. get the nth order partial autocorrelation (more on this later!):
  $$\phi_{n,n} = \frac{\gamma(n) - \sum_{j=1}^{n-1} \phi_{n-1,j}\,\gamma(n-j)}{v_{n-1}}$$
  (note: the sum is the inner product of $\phi_{n-1}$ and the order reversal of $\gamma_{n-1}$)
- 2. get the remaining $\phi_{n,j}$'s:
  $$\begin{bmatrix} \phi_{n,1} \\ \vdots \\ \phi_{n,n-1} \end{bmatrix} = \begin{bmatrix} \phi_{n-1,1} \\ \vdots \\ \phi_{n-1,n-1} \end{bmatrix} - \phi_{n,n} \begin{bmatrix} \phi_{n-1,n-1} \\ \vdots \\ \phi_{n-1,1} \end{bmatrix}$$
- 3. get the nth order MSE:
  $$v_n = v_{n-1}(1 - \phi_{n,n}^2)$$

BD 70, SS 112  XI 5
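As a concrete illustration of the three steps, here is a minimal R sketch (a hypothetical helper, not code from the course): given the ACVF values $\gamma(0), \ldots, \gamma(n)$, it returns $\phi_m$ and $v_m$ for $m = 1, \ldots, n$.

# sketch of the Levinson-Durbin recursions (hypothetical helper)
# gamma[h+1] holds gamma(h), h = 0, ..., n
levinson_durbin <- function(gamma) {
  n <- length(gamma) - 1
  phi <- vector("list", n)        # phi[[m]] will hold phi_{m,1}, ..., phi_{m,m}
  v <- numeric(n + 1)             # v[m+1] will hold v_m
  v[1] <- gamma[1]                # v_0 = gamma(0)
  phi[[1]] <- gamma[2] / gamma[1] # phi_{1,1} = rho(1)
  v[2] <- v[1] * (1 - phi[[1]]^2)
  if (n >= 2) for (m in 2:n) {
    prev <- phi[[m - 1]]
    # step 1: mth order partial autocorrelation
    phi_mm <- (gamma[m + 1] - sum(prev * gamma[m:2])) / v[m]
    # step 2: phi_{m,j} = phi_{m-1,j} - phi_{m,m} phi_{m-1,m-j}, then append phi_{m,m}
    phi[[m]] <- c(prev - phi_mm * rev(prev), phi_mm)
    # step 3: v_m = v_{m-1} (1 - phi_{m,m}^2)
    v[m + 1] <- v[m] * (1 - phi_mm^2)
  }
  list(phi = phi, v = v)
}

The vector phi[[n]] then solves $\Gamma_n \phi_n = \gamma_n$, and v[n+1] is the corresponding one-step-ahead MSE $v_n$.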

Levinson-Durbin Recursions: VI

- as a first example, reconsider the AR(1) process $X_t = \phi X_{t-1} + Z_t$, where $|\phi| < 1$ and $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$
- have already argued (overhead X 14) that $\hat{X}_{n+1} = \phi X_n$, i.e., $\phi_n = [\phi, 0, \ldots, 0]'$, and that $v_n = \sigma^2$ for all n since the MSE is $\sigma^2$
- accordingly, let's apply the L-D recursions to $\phi_{n-1} = [\phi, 0, \ldots, 0]'$ & $v_{n-1} = \sigma^2$ and see if the required forms for $\phi_n$ and $v_n$ pop out
- step 1: recalling that $\gamma(h) = \sigma^2\phi^h/(1 - \phi^2)$ for $h \geq 0$, we have
  $$\phi_{n,n} = \frac{\gamma(n) - \sum_{j=1}^{n-1} \phi_{n-1,j}\,\gamma(n-j)}{v_{n-1}} = \frac{\sigma^2(\phi^n - \phi_{n-1,1}\,\phi^{n-1})}{v_{n-1}(1 - \phi^2)} = \frac{\sigma^2(\phi^n - \phi^n)}{v_{n-1}(1 - \phi^2)} = 0$$

XI 6

Levinson-Durbin Recursions: VII

- step 2 yields
  $$\begin{bmatrix} \phi_{n,1} \\ \phi_{n,2} \\ \vdots \\ \phi_{n,n-2} \\ \phi_{n,n-1} \end{bmatrix} = \begin{bmatrix} \phi_{n-1,1} \\ \phi_{n-1,2} \\ \vdots \\ \phi_{n-1,n-2} \\ \phi_{n-1,n-1} \end{bmatrix} - \phi_{n,n} \begin{bmatrix} \phi_{n-1,n-1} \\ \phi_{n-1,n-2} \\ \vdots \\ \phi_{n-1,2} \\ \phi_{n-1,1} \end{bmatrix} = \begin{bmatrix} \phi \\ 0 \\ \vdots \\ 0 \\ 0 \end{bmatrix} - 0 \cdot \begin{bmatrix} 0 \\ 0 \\ \vdots \\ 0 \\ \phi \end{bmatrix},$$
  so $\phi_n = [\phi, 0, \ldots, 0]'$ as required

XI 7

Levinson-Durbin Recursions: VIII

- step 3: $v_n = v_{n-1}(1 - \phi_{n,n}^2) = v_{n-1} = \sigma^2$, as required
- note: the partial autocorrelation $\phi_{n,n}$ for the AR(1) process is $\phi$ for n = 1 and is zero for n = 2, 3, ...
- homework exercise: run the L-D recursions on an MA(1) process
- as a 2nd example, reconsider the stationary process of Problem 3(b): $X_t = Z_1\cos(\omega t) + Z_2\sin(\omega t)$, where $Z_1$ and $Z_2$ are independent $\mathcal{N}(0,1)$ RVs
- the ACVF for $\{X_t\}$ is $\gamma(h) = \cos(\omega h)$ (which is also its ACF $\rho(h)$ since $\gamma(0) = 1$)
- starting with $\hat{X}_2 \equiv \phi_{1,1} X_1$ (the n = 1 case), we have $\phi_{1,1} = \rho(1) = \cos(\omega)$ and $v_1 = \gamma(0)(1 - \phi_{1,1}^2) = 1 - \cos^2(\omega) = \sin^2(\omega)$

XI 8

Levinson-Durbin Recursions: IX

- now let us get the coefficients for $\hat{X}_3 \equiv \phi_{2,1} X_2 + \phi_{2,2} X_1$ (the n = 2 case) using the L-D recursions
- the first step,
  $$\phi_{n,n} = \frac{\gamma(n) - \sum_{j=1}^{n-1} \phi_{n-1,j}\,\gamma(n-j)}{v_{n-1}},$$
  yields, for n = 2 (recalling $\gamma(h) = \cos(\omega h)$ & $\phi_{1,1} = \cos(\omega)$),
  $$\phi_{2,2} = \frac{\gamma(2) - \phi_{1,1}\gamma(1)}{v_1} = \frac{\cos(2\omega) - \cos(\omega)\cos(\omega)}{\sin^2(\omega)} = -1$$
  because of the trig identity $\cos(2\omega) - \cos^2(\omega) = -\sin^2(\omega)$

XI 9

Levinson-Durbin Recursions: X

- the second step of the L-D recursions, namely,
  $$\begin{bmatrix} \phi_{n,1} \\ \vdots \\ \phi_{n,n-1} \end{bmatrix} = \begin{bmatrix} \phi_{n-1,1} \\ \vdots \\ \phi_{n-1,n-1} \end{bmatrix} - \phi_{n,n} \begin{bmatrix} \phi_{n-1,n-1} \\ \vdots \\ \phi_{n-1,1} \end{bmatrix},$$
  yields, for n = 2, $\phi_{2,1} = \phi_{1,1} - \phi_{2,2}\phi_{1,1} = \cos(\omega)[1 - (-1)] = 2\cos(\omega)$
- the third step of the L-D recursions, namely, $v_n = v_{n-1}(1 - \phi_{n,n}^2)$, yields, for n = 2, $v_2 = v_1[1 - (-1)^2] = 0$
- thus $X_3$ is perfectly predictable given $X_2$ & $X_1$: $\hat{X}_3 = 2\cos(\omega)X_2 - X_1 = X_3$
- thus, for all t, $X_t$ is perfectly predictable given $X_{t-1}$ & $X_{t-2}$: $\hat{X}_t = 2\cos(\omega)X_{t-1} - X_{t-2} = X_t$ (Q: why?)

BD 77  XI 10
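As a quick numerical check of this example, the levinson_durbin() sketch given earlier reproduces $\phi_{2,1} = 2\cos(\omega)$, $\phi_{2,2} = -1$, and $v_2 = 0$ (the value of omega below is just illustrative):

omega <- pi / 6
gam <- cos(omega * (0:2))    # gamma(0), gamma(1), gamma(2) for the harmonic process
out <- levinson_durbin(gam)
out$phi[[2]]                 # c(2 * cos(omega), -1)
out$v                        # c(1, sin(omega)^2, 0)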

Aside - Step-Down Levinson-Durbin Recursions: I

- application of the L-D recursions to the AR(p) process $Y_t = \phi_1 Y_{t-1} + \cdots + \phi_p Y_{t-p} + Z_t$ yields, for $n \geq p$,
  $$\hat{Y}_{n+1} = \phi_{n,1} Y_n + \cdots + \phi_{n,n} Y_1 = \phi_1 Y_n + \cdots + \phi_p Y_{n-p+1},$$
  i.e., $\hat{Y}_{n+1}$ only depends on the p most recent values and, when n > p, not on the remote values $Y_{n-p}, \ldots, Y_1$
- the associated prediction error is $Y_{n+1} - \hat{Y}_{n+1} = Y_{n+1} - \phi_1 Y_n - \cdots - \phi_p Y_{n-p+1} = Z_{n+1}$, so the MSE is $v_n = \mathrm{var}\{Y_{n+1} - \hat{Y}_{n+1}\} = \mathrm{var}\{Z_{n+1}\} = \sigma^2$
- given $\phi_{p,1} = \phi_1$, $\phi_{p,2} = \phi_2$, ..., $\phi_{p,p} = \phi_p$ and $\sigma^2$, can invert the L-D recursions to get the coefficients for the best linear predictors of orders $p-1, p-2, \ldots, 1$ and the associated MSEs

XI 11

Aside - Step-Down Levinson-Durbin Recursions: II

- given $\phi_{h,1}, \ldots, \phi_{h,h}$ & $v_h$, compute
  1. $\phi_{h-1,j} = \dfrac{\phi_{h,j} + \phi_{h,h}\,\phi_{h,h-j}}{1 - \phi_{h,h}^2}$, for $1 \leq j \leq h-1$
  2. $v_{h-1} = v_h/(1 - \phi_{h,h}^2)$
- the step-down L-D recursion yields $\phi_{h-1,1}, \ldots, \phi_{h-1,h-1}$ & $v_{h-1}$
- start with $\phi_{p,1} = \phi_1$, ..., $\phi_{p,p} = \phi_p$ & $v_p = \sigma^2$
- apply the step-down recursions to get the $\phi_{p-1,j}$'s & $v_{p-1}$, the $\phi_{p-2,j}$'s & $v_{p-2}$, ..., $\phi_{1,1}$ & $v_1$
- as opposed to the usual L-D recursions, the step-down L-D recursions do not make use of the ACVF $\gamma(h)$ for $\{Y_t\}$
- in fact, given $\phi_1, \phi_2, \ldots, \phi_p$ & $\sigma^2$, can use the results of the step-down L-D recursions to compute $\gamma(h)$ (yet another method!)

XI 12
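A minimal R sketch of one step of the step-down recursion (a hypothetical helper; the overheads do not supply code): given $\phi_{h,1}, \ldots, \phi_{h,h}$ and $v_h$, it returns $\phi_{h-1,1}, \ldots, \phi_{h-1,h-1}$ and $v_{h-1}$.

step_down <- function(phi_h, v_h) {
  h <- length(phi_h)
  k <- phi_h[h]                               # phi_{h,h}
  phi_prev <- if (h > 1) {
    j <- 1:(h - 1)
    (phi_h[j] + k * phi_h[h - j]) / (1 - k^2) # phi_{h-1,j}, 1 <= j <= h-1
  } else numeric(0)
  list(phi = phi_prev, v = v_h / (1 - k^2))   # v_{h-1}
}

Applied to phi_2 = c(3/4, -1/2) with v_2 = 1, this returns phi_{1,1} = 1/2 and v_1 = 4/3, matching the ARMA(2,2) example on overhead XI 23.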

Aside - Step-Down Levinson-Durbin Recursions: III

- to do so, return to overhead XI 4 and note that
  $$\gamma(0) \equiv v_0 = v_1/(1 - \phi_{1,1}^2) \quad \mbox{and} \quad \gamma(1) = \gamma(0)\phi_{1,1}$$
- next go to overhead XI 5, grab
  $$\phi_{n,n} = \frac{\gamma(n) - \sum_{j=1}^{n-1} \phi_{n-1,j}\,\gamma(n-j)}{v_{n-1}}$$
  and manipulate it to get
  $$\gamma(n) = \phi_{n,n} v_{n-1} + \sum_{j=1}^{n-1} \phi_{n-1,j}\,\gamma(n-j)$$
  and thus
  $$\gamma(2) = \phi_{2,2} v_1 + \phi_{1,1}\gamma(1), \quad \gamma(3) = \phi_{3,3} v_2 + \phi_{2,1}\gamma(2) + \phi_{2,2}\gamma(1),$$
  etc., ending with $\gamma(p) = \phi_{p,p} v_{p-1} + \phi_{p-1,1}\gamma(p-1) + \cdots + \phi_{p-1,p-1}\gamma(1)$

XI 13

Aside - Step-Down Levinson-Durbin Recursions: IV

- to get $\gamma(p+1), \gamma(p+2), \ldots$, make use of an equation stated on overhead IX 50:
  $$\gamma(k) = \phi_1\gamma(k-1) + \cdots + \phi_p\gamma(k-p),$$
  which holds for all $k \geq p+1$
- note: can now argue that the AR coefficients $\phi_1, \phi_2, \ldots, \phi_p$ and the sequence of partial autocorrelations $\phi_{1,1}, \phi_{2,2}, \ldots, \phi_{p,p}$ are equivalent to one another (in particular, $\phi_{p,p} = \phi_p$)
- we now return to our regularly scheduled program ...

XI 14
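Putting the last two overheads together, here is a hedged R sketch of "yet another method" for the ACVF of an AR(p) process; it reuses the step_down() helper sketched earlier and assumes h_max >= p (all names are illustrative, not from the course code).

ar_acvf <- function(phi, sigma2, h_max) {
  p <- length(phi)
  phis <- vector("list", p)            # phis[[h]] holds phi_{h,1}, ..., phi_{h,h}
  vs <- numeric(p)                     # vs[h] holds v_h
  phis[[p]] <- phi; vs[p] <- sigma2
  if (p > 1) for (h in p:2) {          # step down from order p to order 1
    sd <- step_down(phis[[h]], vs[h])
    phis[[h - 1]] <- sd$phi; vs[h - 1] <- sd$v
  }
  gamma <- numeric(h_max + 1)                 # gamma[h+1] holds gamma(h)
  gamma[1] <- vs[1] / (1 - phis[[1]][1]^2)    # gamma(0) = v_0 = v_1 / (1 - phi_{1,1}^2)
  gamma[2] <- gamma[1] * phis[[1]][1]         # gamma(1) = gamma(0) phi_{1,1}
  if (p >= 2) for (n in 2:p)                  # gamma(n) = phi_{n,n} v_{n-1} + sum_j phi_{n-1,j} gamma(n-j)
    gamma[n + 1] <- phis[[n]][n] * vs[n - 1] + sum(phis[[n - 1]] * gamma[n:2])
  if (h_max > p) for (k in (p + 1):h_max)     # gamma(k) = phi_1 gamma(k-1) + ... + phi_p gamma(k-p)
    gamma[k + 1] <- sum(phi * gamma[k:(k - p + 1)])
  gamma
}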

One-Step-Ahead Prediction Errors (Innovations): I

- given a time series $X_1, X_2, \ldots$, can use the L-D recursions to find the coefficients $\phi_{m-1}$ for $\hat{X}_m$, i.e., the best linear predictor of $X_m$ given $X_{m-1}, \ldots, X_1$
- define $\hat{X}_1 \equiv 0$ and $\hat{X}_n = [\hat{X}_n, \hat{X}_{n-1}, \ldots, \hat{X}_1]'$
- letting $m = 1, 2, \ldots, n$, can generate a series of one-step-ahead prediction errors (or innovations): $U_m = X_m - \hat{X}_m$
- collect these into $U_n = [U_n, U_{n-1}, \ldots, U_1]'$ so that we can write $U_n = X_n - \hat{X}_n$

BD 71, SS 114  XI 15

One-Step-Ahead Prediction Errors (Innovations): II

- can write $U_n = A_n' X_n$, where $A_n$ is lower triangular:
  $$A_n = \begin{bmatrix} 1 & 0 & \cdots & 0 & 0 \\ -\phi_{n-1,1} & 1 & \cdots & 0 & 0 \\ -\phi_{n-1,2} & -\phi_{n-2,1} & \cdots & 0 & 0 \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ -\phi_{n-1,n-2} & -\phi_{n-2,n-3} & \cdots & 1 & 0 \\ -\phi_{n-1,n-1} & -\phi_{n-2,n-2} & \cdots & -\phi_{1,1} & 1 \end{bmatrix}$$
- the inverse of $A_n$ is also lower triangular, so let's write it as
  $$C_n \equiv \begin{bmatrix} 1 & 0 & \cdots & 0 & 0 \\ \theta_{n-1,1} & 1 & \cdots & 0 & 0 \\ \theta_{n-1,2} & \theta_{n-2,1} & \cdots & 0 & 0 \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ \theta_{n-1,n-2} & \theta_{n-2,n-3} & \cdots & 1 & 0 \\ \theta_{n-1,n-1} & \theta_{n-2,n-2} & \cdots & \theta_{1,1} & 1 \end{bmatrix}$$

BD 72, SS 114  XI 16

One-Step-Ahead Prediction Errors (Innovations): III

- since $C_n$ is the inverse of $A_n$, $U_n = A_n' X_n$ leads to $X_n = C_n' U_n$; i.e., the time series can be reexpressed in terms of its innovations
- recall that the L-D recursions give $v_{m-1} = E\{(X_m - \hat{X}_m)^2\} = \mathrm{var}\{U_m\}$, $m = 1, 2, \ldots, n$
- can use the so-called innovations algorithm to get both $v_m$ and the elements of $C_m$ (note: take a sum with upper limit $-1$ to be 0):
  $$\theta_{m,m-k} = \frac{\gamma(m-k) - \sum_{j=0}^{k-1} \theta_{k,k-j}\,\theta_{m,m-j}\,v_j}{v_k}, \quad 0 \leq k < m$$
  $$v_m = \gamma(0) - \sum_{j=0}^{m-1} \theta_{m,m-j}^2\,v_j$$
- start with $v_0 = \gamma(0)$; get $\theta_{1,1}$ & $v_1$; get $\theta_{2,2}$, $\theta_{2,1}$ & $v_2$; etc.

BD 72, 73, SS 114  XI 17
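Here is a minimal R sketch of the innovations algorithm as stated above (a hypothetical helper): gamma[h+1] holds $\gamma(h)$, and the function returns the $\theta_{m,j}$'s and $v_0, \ldots, v_n$.

innovations <- function(gamma, n) {
  theta <- matrix(0, n, n)          # theta[m, j] will hold theta_{m,j}
  v <- numeric(n + 1)               # v[m+1] will hold v_m
  v[1] <- gamma[1]                  # v_0 = gamma(0)
  for (m in 1:n) {
    for (k in 0:(m - 1)) {
      s <- 0                        # a sum with upper limit -1 (the k = 0 case) is taken to be 0
      if (k >= 1) for (j in 0:(k - 1))
        s <- s + theta[k, k - j] * theta[m, m - j] * v[j + 1]
      theta[m, m - k] <- (gamma[m - k + 1] - s) / v[k + 1]
    }
    v[m + 1] <- gamma[1] - sum(theta[m, m:1]^2 * v[1:m])
  }
  list(theta = theta, v = v)
}

Running it with n = 2 produces $\theta_{1,1}$ & $v_1$ and then $\theta_{2,2}$, $\theta_{2,1}$ & $v_2$, in the order described above.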

One-Step-Ahead Prediction Errors (Innovations): IV

- since $X_n = C_n' U_n$, can write (with $\theta_{m,0} \equiv 1$)
  $$X_{m+1} = \sum_{j=0}^{m} \theta_{m,j} U_{m-j+1}, \quad m = 1, 2, \ldots, n-1,$$
  i.e., a linear combination of innovations yields the time series
- since $\hat{X}_n = X_n - U_n = C_n' U_n - U_n = (C_n' - I_n) U_n$, where $I_n$ is the $n \times n$ identity matrix, can also write
  $$\hat{X}_{m+1} = \sum_{j=1}^{m} \theta_{m,j} U_{m-j+1}, \quad m = 1, 2, \ldots, n-1,$$
  i.e., a linear combination of innovations also yields the predictions
- HW exercise: the innovations $U_1, U_2, \ldots, U_n$ are uncorrelated

BD 72, SS 114  XI 18

Aside - Simulation of ARMA Processes: I

- often of interest to generate realizations of ARMA processes
- first consider a stationary & causal Gaussian AR(p) process:
  $$Y_t - \phi_1 Y_{t-1} - \cdots - \phi_p Y_{t-p} = Z_t, \quad \{Z_t\} \sim \mbox{Gaussian WN}(0, \sigma^2)$$
- recall that, for any $t \geq p+1$, the best linear predictor $\hat{Y}_t$ of $Y_t$ given $Y_{t-1}, \ldots, Y_1$ takes the form $\hat{Y}_t = \phi_1 Y_{t-1} + \cdots + \phi_p Y_{t-p}$
- the innovations are $U_t = Y_t - \hat{Y}_t = Z_t$ and have MSE $v_{t-1} \equiv \mathrm{var}\{U_t\} = \sigma^2$
- can use the step-down L-D recursions to get the coefficients for $\hat{Y}_t = \phi_{t-1,1} Y_{t-1} + \cdots + \phi_{t-1,t-1} Y_1$, $t = 2, 3, \ldots, p$, and the associated MSEs $v_{t-1}$ (recall that $\hat{Y}_1 \equiv 0$ by definition)

XI 19

Aside - Simulation of ARMA Processes: II

- the innovations $U_t = Y_t - \hat{Y}_t$, $t = 1, \ldots, p$, are such that
  1. $E\{U_t\} = 0$ and $\mathrm{var}\{U_t\} = v_{t-1}$
  2. $U_1, U_2, \ldots, U_p$ are uncorrelated RVs (homework exercise), which implies independence under the Gaussian assumption
- easy to simulate the $U_t$'s: generate p independent realizations of $\mathcal{N}(0,1)$ RVs, say, $\tilde{Z}_1, \ldots, \tilde{Z}_p$, and set $U_t = v_{t-1}^{1/2}\tilde{Z}_t$
- can unroll the $U_t$'s to get simulations of the $Y_t$'s, $t = 1, \ldots, p$:
  $$U_1 = Y_1 - \hat{Y}_1 = Y_1 \mbox{ yields } Y_1 = U_1$$
  $$U_2 = Y_2 - \hat{Y}_2 = Y_2 - \phi_{1,1} Y_1 \mbox{ yields } Y_2 = \phi_{1,1} Y_1 + U_2$$
  $$U_3 = Y_3 - \hat{Y}_3 = Y_3 - \phi_{2,1} Y_2 - \phi_{2,2} Y_1 \mbox{ yields } Y_3 = \phi_{2,1} Y_2 + \phi_{2,2} Y_1 + U_3$$

XI 20

Aside - Simulation of ARMA Processes: III

- finally,
  $$U_p = Y_p - \hat{Y}_p = Y_p - \phi_{p-1,1} Y_{p-1} - \cdots - \phi_{p-1,p-1} Y_1$$
  yields
  $$Y_p = \phi_{p-1,1} Y_{p-1} + \cdots + \phi_{p-1,p-1} Y_1 + U_p$$
- can now generate the remainder of the desired simulated series using
  $$Y_t = \phi_1 Y_{t-1} + \cdots + \phi_p Y_{t-p} + \sigma\tilde{Z}_t, \quad t = p+1, p+2, \ldots,$$
  where the $\tilde{Z}_t$'s are independent realizations of $\mathcal{N}(0,1)$ RVs (these are independent of $\tilde{Z}_1, \ldots, \tilde{Z}_p$ also)

XI 21
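Gathering the last three overheads into code, here is a hedged R sketch of exact Gaussian AR(p) simulation with stationary initial conditions; it reuses the step_down() helper sketched earlier, and the function and variable names are illustrative (it also assumes n > p).

sim_ar_exact <- function(phi, sigma2, n) {
  p <- length(phi)
  phis <- vector("list", p); vs <- numeric(p)
  phis[[p]] <- phi; vs[p] <- sigma2
  if (p > 1) for (h in p:2) {                # step-down to orders p-1, ..., 1
    sd <- step_down(phis[[h]], vs[h])
    phis[[h - 1]] <- sd$phi; vs[h - 1] <- sd$v
  }
  v0 <- vs[1] / (1 - phis[[1]][1]^2)         # v_0 = gamma(0)
  y <- numeric(n)
  y[1] <- sqrt(v0) * rnorm(1)                # Y_1 = U_1
  if (p > 1) for (t in 2:p)                  # unroll the innovations U_2, ..., U_p
    y[t] <- sum(phis[[t - 1]] * y[(t - 1):1]) + sqrt(vs[t - 1]) * rnorm(1)
  for (t in (p + 1):n)                       # remainder from the AR(p) equation (sigma^2 = v_t for t >= p)
    y[t] <- sum(phi * y[(t - 1):(t - p)]) + sqrt(sigma2) * rnorm(1)
  y
}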

Aside - Simulation of ARMA Processes: IV

- knowing how to simulate the AR process $\phi(B)Y_t = Z_t$, can in turn simulate the ARMA process $\phi(B)X_t = \theta(B)Z_t$, since we can create the ARMA process $\{X_t\}$ by applying the filter $\theta(B)$ to the AR process $\{Y_t\}$:
  $$X_t = \theta(B)Y_t = \theta(B)\phi^{-1}(B)Z_t, \quad \mbox{i.e.,} \quad \phi(B)X_t = \theta(B)Z_t$$
  (see overhead IX 47)
- hence can generate a simulated ARMA series of length n via
  $$X_t = Y_t + \theta_1 Y_{t-1} + \cdots + \theta_q Y_{t-q}, \quad t = q+1, \ldots, q+n;$$
  i.e., need to make a simulated AR series of length $n + q$

XI 22
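A short sketch of the corresponding ARMA step (again hypothetical, reusing the sim_ar_exact() helper from the previous aside): simulate an AR series of length n + q and then apply the filter $\theta(B)$.

sim_arma <- function(phi, theta, sigma2, n) {
  q <- length(theta)
  y <- sim_ar_exact(phi, sigma2, n + q)       # AR series of length n + q
  x <- numeric(n)
  for (t in (q + 1):(q + n))                  # X_t = Y_t + theta_1 Y_{t-1} + ... + theta_q Y_{t-q}
    x[t - q] <- y[t] + sum(theta * y[(t - 1):(t - q)])
  x
}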

Example - Simulation of ARMA(2,2) Process: I

- consider the ARMA(2,2) process given by
  $$X_t = \tfrac{3}{4} X_{t-1} - \tfrac{1}{2} X_{t-2} + Z_t + \theta_1 Z_{t-1} + \theta_2 Z_{t-2}, \quad \{Z_t\} \sim \mathrm{WN}(0, 1),$$
  so that $v_2 = 1$
- to simulate the AR(2) process $Y_t = \tfrac{3}{4} Y_{t-1} - \tfrac{1}{2} Y_{t-2} + Z_t$, need to run the reverse L-D recursions once to obtain
  $$\phi_{1,1} = \frac{\phi_{2,1} + \phi_{2,2}\,\phi_{2,1}}{1 - \phi_{2,2}^2} = \frac{\tfrac{3}{4} - \tfrac{3}{8}}{\tfrac{3}{4}} = \frac{1}{2}, \quad v_1 = \frac{v_2}{1 - \phi_{2,2}^2} = \frac{4}{3}$$
  and hence $v_0 = \dfrac{v_1}{1 - \phi_{1,1}^2} = \dfrac{16}{9}$

XI 23

Example - Simulation of ARMA(2,2) Process: II

- thus would generate the AR(2) process using
  $$Y_1 = \tfrac{4}{3}\tilde{Z}_1$$
  $$Y_2 = \tfrac{1}{2} Y_1 + \tfrac{2}{\sqrt{3}}\tilde{Z}_2$$
  $$Y_3 = \tfrac{3}{4} Y_2 - \tfrac{1}{2} Y_1 + \tilde{Z}_3$$
  $$\vdots$$
  $$Y_{n+2} = \tfrac{3}{4} Y_{n+1} - \tfrac{1}{2} Y_n + \tilde{Z}_{n+2},$$
  where the $\tilde{Z}_t$'s are IID $\mathcal{N}(0,1)$ RVs
- the desired ARMA(2,2) process is then given by $X_t = Y_{t+2} + \theta_1 Y_{t+1} + \theta_2 Y_t$, $t = 1, \ldots, n$
- overhead VIII 24 shows the AR(2) series (n = 100) used to form the ARMA(2,2) simulation (n = 98) in the next overhead

XI 24

Realization of Second AR(2) Process

[plot of the simulated AR(2) series $x_t$ versus t]

VIII 24

Realization of ARMA(2,2) Process

[plot of the simulated ARMA(2,2) series $x_t$ versus t]

XI 25

Aside - Simulation of ARMA Processes: V

- the method described here is deemed exact because of its use of so-called stationary initial conditions (the method used in the R function arima.sim is not exact since it makes use of a burn-in period)
- the source article is Kay (1981), which is just over a page in length, making it one of the shortest useful articles relevant to time series analysis (the shortest is undoubtedly David, 1985!)

XI 26

Multi-Step-Ahead Prediction: I

- reconsider the one-step-ahead predictor $\hat{X}_{n+1}$ of $X_{n+1}$ given $X_n, X_{n-1}, \ldots, X_1$
- in preparation for considering multi-step-ahead prediction, will now denote $\hat{X}_{n+1}$ by $\hat{X}_{n+1|n}$
- $\hat{X}_{n+1|n}$ can be written as either a linear combination of previous time series values or of previous innovations:
  $$\hat{X}_{n+1|n} = \sum_{j=1}^{n} \phi_{n,j} X_{n-j+1} \quad \mbox{or} \quad \hat{X}_{n+1|n} = \sum_{j=1}^{n} \theta_{n,j} U_{n-j+1}$$
- for a given $h \geq 2$, want to formulate the best linear predictor $\hat{X}_{n+h|n}$ of $X_{n+h}$ given $X_n, X_{n-1}, \ldots, X_1$

XI 27

Multi-Step-Ahead Prediction: II

- first approach: replacing n in
  $$\hat{X}_{n+1|n} = \sum_{j=1}^{n} \phi_{n,j} X_{n-j+1}$$
  with $n+h-1$ gives
  $$\hat{X}_{n+h|n+h-1} = \sum_{j=1}^{n+h-1} \phi_{n+h-1,j} X_{n+h-j}$$
- the above involves the unobserved $X_{n+h-1}, \ldots, X_{n+1}$, but replacing these with $\hat{X}_{n+h-1|n}, \ldots, \hat{X}_{n+1|n}$ gives the desired predictor:
  $$\hat{X}_{n+h|n} = \sum_{j=1}^{h-1} \phi_{n+h-1,j}\,\hat{X}_{n+h-j|n} + \sum_{j=h}^{n+h-1} \phi_{n+h-1,j}\,X_{n+h-j}$$

XI 28

Multi-Step-Ahead Prediction: III

- leads to a recursive scheme for computing $\hat{X}_{n+h|n}$, starting with the one-step-ahead predictor $\hat{X}_{n+1|n}$ (we know how to get this!)
- two-step-ahead predictor: replace $X_{n+1}$ in
  $$\hat{X}_{n+2|n+1} = \sum_{j=1}^{n+1} \phi_{n+1,j} X_{n+2-j}$$
  with $\hat{X}_{n+1|n}$ to get
  $$\hat{X}_{n+2|n} = \phi_{n+1,1}\,\hat{X}_{n+1|n} + \sum_{j=2}^{n+1} \phi_{n+1,j} X_{n+2-j}$$

XI 29

Multi-Step-Ahead Prediction: IV

- three-step-ahead predictor: replace $X_{n+2}$ & $X_{n+1}$ in
  $$\hat{X}_{n+3|n+2} = \sum_{j=1}^{n+2} \phi_{n+2,j} X_{n+3-j}$$
  with $\hat{X}_{n+2|n}$ & $\hat{X}_{n+1|n}$ to get
  $$\hat{X}_{n+3|n} = \phi_{n+2,1}\,\hat{X}_{n+2|n} + \phi_{n+2,2}\,\hat{X}_{n+1|n} + \sum_{j=3}^{n+2} \phi_{n+2,j} X_{n+3-j}$$
- yadda, yadda, yadda, coming eventually to the desired
  $$\hat{X}_{n+h|n} = \sum_{j=1}^{h-1} \phi_{n+h-1,j}\,\hat{X}_{n+h-j|n} + \sum_{j=h}^{n+h-1} \phi_{n+h-1,j}\,X_{n+h-j}$$

XI 30
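A hedged R sketch of this recursive scheme (a hypothetical helper): x holds the observed $X_1, \ldots, X_n$, and phis[[m]] holds $\phi_{m,1}, \ldots, \phi_{m,m}$ from the L-D recursions run out to order $n+h-1$ (which requires the ACVF out to lag $n+h-1$).

predict_h_steps <- function(x, phis, h) {
  n <- length(x)
  xpred <- numeric(h)                        # xpred[k] will hold the k-step-ahead predictor
  for (k in 1:h) {
    coef <- phis[[n + k - 1]]                # phi_{n+k-1, 1}, ..., phi_{n+k-1, n+k-1}
    # most recent value first; predictors stand in for the unobserved X_{n+1}, ..., X_{n+k-1}
    past <- c(rev(xpred[seq_len(k - 1)]), rev(x))
    xpred[k] <- sum(coef * past)
  }
  xpred
}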

Multi-Step-Ahead Prediction: V

- since $\hat{X}_{n+1|n}, \ldots, \hat{X}_{n+h-1|n}$ are all linear combinations of $X_n, \ldots, X_1$, it follows that $\hat{X}_{n+h|n}$ is also such:
  $$\hat{X}_{n+h|n} = \sum_{j=1}^{h-1} \phi_{n+h-1,j}\,\hat{X}_{n+h-j|n} + \sum_{j=h}^{n+h-1} \phi_{n+h-1,j}\,X_{n+h-j} = \sum_{j=1}^{n} a_j X_{n-j+1}$$
- can show that $a_n = [a_1, \ldots, a_n]'$ so defined is a solution to $\Gamma_n a_n = \gamma_n(h)$, where the $n \times n$ matrix $\Gamma_n$ has $(i,j)$th entry $\gamma(i-j)$, while $\gamma_n(h) = [\gamma(h), \ldots, \gamma(h+n-1)]'$

XI 31

Multi-Step-Ahead Prediction: VI

- second approach: replacing n in
  $$\hat{X}_{n+1|n} = \sum_{j=1}^{n} \theta_{n,j} U_{n-j+1}$$
  with $n+h-1$ gives
  $$\hat{X}_{n+h|n+h-1} = \sum_{j=1}^{n+h-1} \theta_{n+h-1,j} U_{n+h-j}$$
- the above involves the unobserved $U_{n+h-1}, \ldots, U_{n+1}$, but replacing these with their expected values (zero!) gives the desired predictor:
  $$\hat{X}_{n+h|n} = \sum_{j=h}^{n+h-1} \theta_{n+h-1,j} U_{n+h-j} = \sum_{j=1}^{n} \theta_{n+h-1,n+h-j} U_j$$

XI 32

Multi-Step-Ahead Prediction: VII

- the MSE of the h-step-ahead forecast is
  $$E\{(X_{n+h} - \hat{X}_{n+h|n})^2\} = E\{X_{n+h}^2\} - 2E\{X_{n+h}\hat{X}_{n+h|n}\} + E\{\hat{X}_{n+h|n}^2\} = \gamma(0) - E\{\hat{X}_{n+h|n}^2\} = \gamma(0) - \mathrm{var}\{\hat{X}_{n+h|n}\}$$
  since $E\{X_{n+h}^2\} = \gamma(0)$ and $E\{X_{n+h}\hat{X}_{n+h|n}\} = E\{\hat{X}_{n+h|n}^2\}$ (homework exercise!)
- since $\mathrm{var}\{U_j\} = v_{j-1}$ and the $U_j$'s are uncorrelated,
  $$\mathrm{var}\{\hat{X}_{n+h|n}\} = \mathrm{var}\Big\{\sum_{j=1}^{n} \theta_{n+h-1,n+h-j} U_j\Big\} = \sum_{j=1}^{n} \theta_{n+h-1,n+h-j}^2\,v_{j-1}$$
- the MSE is thus given by
  $$E\{(X_{n+h} - \hat{X}_{n+h|n})^2\} = \gamma(0) - \sum_{j=1}^{n} \theta_{n+h-1,n+h-j}^2\,v_{j-1} \equiv \sigma_n^2(h)$$

BD 74, 75  XI 33
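In code, $\sigma_n^2(h)$ can be read off from the output of the innovations() sketch given earlier, provided that sketch is run out to order $n+h-1$ (a hypothetical helper; inn is the list returned by innovations()):

forecast_mse <- function(inn, n, h) {
  j <- 1:n
  # gamma(0) = v_0 minus the variance of the h-step-ahead predictor
  inn$v[1] - sum(inn$theta[n + h - 1, n + h - j]^2 * inn$v[j])
}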

Multi-Step-Ahead Prediction: VIII

- under a Gaussian assumption, can use the above to form 95% prediction bounds for the unknown $X_{n+h}$:
  $$\hat{X}_{n+h|n} \pm 1.96\,\sigma_n(h)$$
- as an example, consider the 1st part of the wind speed series, $x_1, \ldots, x_{100}$
- after centering $x_t$ by subtracting off its sample mean $\bar{x}$, we model $\tilde{x}_t = x_t - \bar{x}$ as an AR(1) process $X_t = \phi X_{t-1} + Z_t$ with $\phi$ estimated by $\hat\phi = \hat\rho(1)$ (cf. overhead X 16)
- based on $x_1, \ldots, x_{100}$, forecast the last 28 values $x_{101} - \bar{x}, \ldots, x_{128} - \bar{x}$ of the time series and see how well we do
- the following overheads show results from homegrown R code based on the theory presented above and from the built-in R functions ar and predict

BD 74, 75  XI 34
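For the built-in-R route mentioned above, a hedged sketch (this is not the homegrown code; the object wind standing in for the wind speed series is an assumption):

x   <- wind[1:100] - mean(wind[1:100])      # centered first part of the series ('wind' is a placeholder)
fit <- ar(x, aic = FALSE, order.max = 1, method = "yule-walker")
fc  <- predict(fit, n.ahead = 28)           # 28-step-ahead forecasts and standard errors
upper <- fc$pred + 1.96 * fc$se             # approximate 95% prediction bounds
lower <- fc$pred - 1.96 * fc$se

With method = "yule-walker" and order 1, the fitted AR coefficient is the lag-1 sample autocorrelation, matching the $\hat\phi = \hat\rho(1)$ estimate quoted above.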

Multi-Step-Ahead Prediction of Wind Speed

[plot of $x_t$ versus t]

XI 35

Multi-Step-Ahead Prediction of Wind Speed using R

[plot of $x_t$ versus t]

XI 36

Predictions Based on Infinite Past: I

- rather than using $X_n, \ldots, X_1$ to predict $X_{n+h}$, suppose we use, for some $m \geq 0$, $X_n, \ldots, X_1, X_0, X_{-1}, \ldots, X_{-m}$ and form a predictor to be denoted by $\hat{X}_{n+h|n,m}$
- by letting $m \to \infty$ and assuming the limit exists (in the MS sense), can write
  $$\hat{X}_{n+h|n,\infty} = \sum_{j=1}^{\infty} \alpha_j X_{n-j+1},$$
  where the $\alpha_j$'s are set by a version of the orthogonality principle:
  $$\mathrm{cov}\Big\{X_{n+h} - \sum_{j=1}^{\infty} \alpha_j X_{n-j+1},\, X_{n-i}\Big\} = 0, \quad i = 0, 1, \ldots$$

BD 75, SS 115  XI 37

Predictions Based on Infinite Past: II

- refer to $\hat{X}_{n+h|n,\infty}$ as the predictor of $X_{n+h}$ based on the infinite past $X_n, X_{n-1}, \ldots$
- the associated prediction error $X_{n+h} - \hat{X}_{n+h|n,\infty}$ has MSE
  $$E\{(X_{n+h} - \hat{X}_{n+h|n,\infty})^2\} = \mathrm{var}\{X_{n+h} - \hat{X}_{n+h|n,\infty}\},$$
  which can be compared to $\mathrm{var}\{X_{n+h} - \hat{X}_{n+h|n}\}$ to see how much can be gained from having lots more data (recall that $\hat{X}_{n+h|n}$ is based on just $X_n, X_{n-1}, \ldots, X_1$)

BD 75, 76, SS 115  XI 38

Predictions Based on Infinite Past: III

- applying the representation $X_t = \sum_{j=0}^{\infty} \psi_j Z_{t-j}$ at $t = n+h$ yields
  $$X_{n+h} = \sum_{j=0}^{\infty} \psi_j Z_{n+h-j}$$
- consider the $Z_t$'s that make up $X_{n+h}$ but not $X_n$, i.e., $Z_{n+h}, Z_{n+h-1}, \ldots, Z_{n+1}$
- replacing these h RVs by their expected values (zero) gives
  $$\hat{X}_{n+h|n,\infty} = \sum_{j=h}^{\infty} \psi_j Z_{n+h-j}$$
- the prediction error is thus
  $$X_{n+h} - \hat{X}_{n+h|n,\infty} = \sum_{j=0}^{\infty} \psi_j Z_{n+h-j} - \sum_{j=h}^{\infty} \psi_j Z_{n+h-j} = \sum_{j=0}^{h-1} \psi_j Z_{n+h-j}$$

XI 39

Predictions Based on Infinite Past: IV

- since $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$, the variance of $X_{n+h} - \hat{X}_{n+h|n,\infty} = \sum_{j=0}^{h-1} \psi_j Z_{n+h-j}$, i.e., the MSE of $\hat{X}_{n+h|n,\infty}$, is given by
  $$\mathrm{var}\{X_{n+h} - \hat{X}_{n+h|n,\infty}\} = \sigma^2 \sum_{j=0}^{h-1} \psi_j^2$$
- in particular, for h = 1, the MSE is $\mathrm{var}\{X_{n+1} - \hat{X}_{n+1|n,\infty}\} = \sigma^2$ rather than $v_n = \mathrm{var}\{X_{n+1} - \hat{X}_{n+1|n}\}$
- homework exercise: compare the MSEs for specific MA(1) and AR(1) processes with specific sample sizes n

CC 196, SS 116  XI 40

References

- H. A. David (1985), "Bias of $S^2$ Under Dependence," The American Statistician, 39, p. 201
- J. Durbin (1960), "The Fitting of Time Series Models," Revue de l'Institut International de Statistique/Review of the International Statistical Institute, 28
- S. M. Kay (1981), "Efficient Generation of Colored Noise," Proceedings of the IEEE, 69
- N. Levinson (1947), "The Wiener RMS Error Criterion in Filter Design and Prediction," Journal of Mathematical Physics, 25

XI 41
