Calculation of ACVF for ARMA Processes


1 Calculation of ACVF for ARMA Process: I

consider causal ARMA(p, q) defined by $\phi(B)X_t = \theta(B)Z_t$, $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$

want to determine ACVF $\{\gamma(h)\}$ for this process, which can be done using four complementary methods

1st method is based on MA($\infty$) representation

$X_t = \sum_{j=0}^{\infty} \psi_j Z_{t-j} = \psi(B)Z_t$, where $\psi(B) = \phi^{-1}(B)\theta(B)$

have noted (overhead VII-8) that ACVF can be expressed as

$\gamma(h) = \sigma^2 \sum_{j=0}^{\infty} \psi_j \psi_{j+|h|}$

BD: 78, CC: 56, SS: 25    IX-1

2 Calculation of ACVF for ARMA Process: II

can use recursive scheme to compute $\psi_j$'s (overhead VIII-16):

$\psi_j = \sum_{k=1}^{p} \phi_k \psi_{j-k} + \theta_j, \quad j = 0, 1, 2, \ldots$

(with the conventions $\theta_0 = 1$, $\theta_j = 0$ for $j > q$, and $\psi_j = 0$ for $j < 0$), but in general need $\psi_j$'s for an infinite number of integers $j$

since $\psi_j \to 0$ as $j \to \infty$, could compute $\psi_j$'s out to, say, $j = J + |h|$ and use

$\gamma(h) = \sigma^2 \sum_{j=0}^{\infty} \psi_j \psi_{j+|h|} \approx \sigma^2 \sum_{j=0}^{J} \psi_j \psi_{j+|h|},$

with approximation getting better with increasing $J$

if we have a manageable expression for $\psi_j$'s (true for some processes), can get analytic expression for $\gamma(h)$

BD: 78, CC: 79, SS: 93    IX-2
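The following is a minimal Python sketch (not from the overheads) of this truncated-sum approach; psi_weights implements the VIII-16 recursion under the conventions just stated, and the ARMA(1,1) values phi = 0.9, theta = 0.5, sigma^2 = 1 are purely illustrative:

```python
# Minimal sketch of the 1st method: psi-weights via the recursion
# psi_j = theta_j + sum_{k=1}^{p} phi_k psi_{j-k}, then the truncated
# approximation gamma(h) ~= sigma^2 * sum_{j=0}^{J} psi_j psi_{j+|h|}.
import numpy as np

def psi_weights(phi, theta, J):
    """Return psi_0, ..., psi_J for a causal ARMA(p, q); phi and theta
    are coefficient lists (theta excludes the leading theta_0 = 1)."""
    p, q = len(phi), len(theta)
    psi = np.zeros(J + 1)
    for j in range(J + 1):
        psi[j] = 1.0 if j == 0 else (theta[j - 1] if j <= q else 0.0)
        for k in range(1, min(j, p) + 1):    # psi_{j-k} = 0 when j - k < 0
            psi[j] += phi[k - 1] * psi[j - k]
    return psi

def acvf_ma_inf(phi, theta, sigma2, h, J=500):
    """Approximate gamma(h); the truncation error vanishes as J grows."""
    psi = psi_weights(phi, theta, J + abs(h))
    return sigma2 * np.sum(psi[: J + 1] * psi[abs(h) : abs(h) + J + 1])

# illustrative ARMA(1,1): phi = 0.9, theta = 0.5, sigma^2 = 1
print([round(acvf_ma_inf([0.9], [0.5], 1.0, h), 4) for h in range(4)])
```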

3 Example ARMA(1,1) Process: I

for an ARMA(1,1) process, overhead VII-25 says that

$X_t = Z_t + (\phi + \theta)\sum_{j=1}^{\infty} \phi^{j-1} Z_{t-j} \stackrel{\mathrm{def}}{=} \sum_{j=0}^{\infty} \psi_j Z_{t-j},$

so $\psi_0 = 1$ and $\psi_j = (\phi + \theta)\phi^{j-1}$ for $j \geq 1$

armed with $\sum_{j=0}^{\infty} x^j = \frac{1}{1-x}$ (valid for $|x| < 1$), away we go:

$\frac{\gamma(0)}{\sigma^2} = \sum_{j=0}^{\infty} \psi_j^2 = 1 + (\phi+\theta)^2\sum_{j=1}^{\infty}\phi^{2j-2} = 1 + (\phi+\theta)^2\sum_{j=0}^{\infty}\phi^{2j} = 1 + \frac{(\phi+\theta)^2}{1-\phi^2}$

BD: 78, CC: 78, SS: 96    IX-3

4 Example ARMA(1,1) Process: II

for $h > 0$, with $\psi_j = (\phi+\theta)\phi^{j-1}$ for $j \geq 1$, have

$\frac{\gamma(h)}{\sigma^2} = \sum_{j=0}^{\infty}\psi_j\psi_{j+h} = \psi_h + \sum_{j=1}^{\infty}\psi_j\psi_{j+h}$
$\qquad = (\phi+\theta)\phi^{h-1} + (\phi+\theta)^2\sum_{j=1}^{\infty}\phi^{2j+h-2}$
$\qquad = (\phi+\theta)\phi^{h-1} + \frac{\phi^h(\phi+\theta)^2}{1-\phi^2}$
$\qquad = \phi^{h-1}\left(\phi + \theta + \frac{\phi(\phi+\theta)^2}{1-\phi^2}\right)$

note: $\gamma(h) = \phi\,\gamma(h-1)$ for $h \geq 2$

BD: 78, CC: 78, SS: 96    IX-4
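As a hedged numerical check (reusing acvf_ma_inf from the sketch after overhead IX-2), the closed form above can be compared with the truncated sum:

```python
# Closed-form ARMA(1,1) ACVF from overheads IX-3 and IX-4, checked
# against the truncated MA(infinity) sum.
def acvf_arma11(phi, theta, sigma2, h):
    if h == 0:
        return sigma2 * (1.0 + (phi + theta) ** 2 / (1.0 - phi ** 2))
    return sigma2 * phi ** (abs(h) - 1) * (
        phi + theta + phi * (phi + theta) ** 2 / (1.0 - phi ** 2))

for h in range(4):   # the two columns agree to rounding error
    print(h, acvf_arma11(0.9, 0.5, 1.0, h), acvf_ma_inf([0.9], [0.5], 1.0, h))
```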

5 Example ARMA(1,1) Process: III

following overheads show ACVFs (circles) for ARMA(1,1) processes $\{X_t\}$ with $\sigma^2 = 1$, AR parameter $\phi = 0.9$ and MA parameter $\theta$ ranging from 0.99 down to $-0.99$

have $\gamma_X(h) = \phi\,\gamma_X(h-1)$ for $h \geq 2$, but not for $h = 1$

causal AR(1) process $\{Y_t\}$ has ACVF $\gamma_Y(h) = \phi^{|h|}\gamma_Y(0)$, i.e., $\gamma_Y(h) = \phi\,\gamma_Y(h-1)$ for $h \geq 1$

overheads also show ACVFs (asterisks) for $\{Y_t\}$ with $\phi = 0.9$ and with $\gamma_Y(0)$ set such that $\gamma_Y(1) = \gamma_X(1)$

note: for some $\theta$, not possible to do!

IX-5

6 ACVF for ARMA(1,1) Process, $\phi = 0.9$, $\theta = 0.99$ [figure: ACVF vs. lag h] IX-6

7 ACVF for ARMA(1,1) Process, $\phi = 0.9$, $\theta = 0.5$ [figure: ACVF vs. lag h] IX-7

8 ACVF for ARMA(1,1) Process, $\phi = 0.9$, $\theta = 0$ [figure: ACVF vs. lag h] IX-8

9 ACVF for ARMA(1,1) Process, $\phi = 0.9$, $\theta = -0.25$ [figure: ACVF vs. lag h] IX-9

10 ACVF for ARMA(1,1) Process, $\phi = 0.9$, $\theta = -0.5$ [figure: ACVF vs. lag h] IX-10

11 ACVF for ARMA(1,1) Process, $\phi = 0.9$, $\theta = -0.75$ [figure: ACVF vs. lag h] IX-11

12 ACVF for ARMA(1,1) Process, $\phi = 0.9$, $\theta = -0.9$ [figure: ACVF vs. lag h] IX-12

13 ACVF for ARMA(1,1) Process, $\phi = 0.9$, $\theta = -0.99$ [figure: ACVF vs. lag h] IX-13

14 Example ARMA(1,1) Process: IV

for three cases ($\theta = -0.25$, $-0.5$ and $-0.75$), ARMA(1,1) process $\{X_t\}$ has an ACVF $\gamma_X(h)$ that can be expressed as

$\gamma_X(h) = \begin{cases}\gamma_Y(h) + C, & h = 0;\\ \gamma_Y(h), & h \neq 0,\end{cases}$

where $C > 0$, and $\gamma_Y(h)$ is ACVF for an AR(1) process; called a "nugget effect" in the geological literature

for $\theta = 0.99$ and $0.5$, AR(1) ACVF emerges at lags $h \geq 1$, while setting $\theta = 0$ reduces ARMA(1,1) process to AR(1)

$\theta = -0.9$ reduces ARMA(1,1) to white noise (the MA polynomial then equals the AR polynomial, so the two cancel)

$\theta = -0.99$ causes slow decay of $\gamma_X(h) < 0$ toward zero; different from an AR(1) ACVF, for which $\gamma_Y(1) < 0$ forces $\phi < 0$ and hence $\gamma_Y(h)$ alternating between positive & negative as $h$ increases

IX-14

15 Example MA(q) Process

for an MA(q) process, have

$X_t = Z_t + \theta_1 Z_{t-1} + \cdots + \theta_q Z_{t-q} = \sum_{j=0}^{\infty}\psi_j Z_{t-j},$

so $\psi_0 = 1$, $\psi_j = \theta_j$ for $1 \leq j \leq q$, and $\psi_j = 0$ for $j > q$

letting $\theta_0 = 1$, have already noted (overhead VII-7) that

$\gamma(h) = \sigma^2\sum_{j=0}^{\infty}\psi_j\psi_{j+|h|} = \begin{cases}\sigma^2\sum_{j=0}^{q-|h|}\theta_j\theta_{j+|h|}, & |h| \leq q;\\ 0, & |h| > q\end{cases}$

consider $q = 12$ with $\theta_1 = \cdots = \theta_{12} = 1$ (13-point sums)

BD: 79, CC: 65, SS: 94    IX-15
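A small sketch of this special case (the q = 12, all-ones example follows the overhead; numpy and the conventions above are assumed):

```python
# MA(q) ACVF: gamma(h) = sigma^2 * sum_{j=0}^{q-|h|} theta_j theta_{j+|h|}
# for |h| <= q (with theta_0 = 1), and gamma(h) = 0 for |h| > q.
def acvf_ma(theta, sigma2, h):
    th = np.r_[1.0, theta]          # prepend theta_0 = 1
    q, h = len(theta), abs(h)
    return 0.0 if h > q else sigma2 * float(np.sum(th[: q - h + 1] * th[h:]))

# q = 12 with theta_1 = ... = theta_12 = 1: gamma(h) = 13 - |h| for |h| <= 12
print([acvf_ma(np.ones(12), 1.0, h) for h in (0, 1, 12, 13)])   # 13, 12, 1, 0
```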

16 ACVF for MA(12) Process, $\theta_1 = \cdots = \theta_{12} = 1$ [figure: ACVF vs. lag h] IX-16

17 Roots of $\theta(z)$ [figure: the 12 roots of $\theta(z)$ plotted in the complex plane] IX-17

18 Realization of MA(12) Process [figure: $x_t$ vs. t] IX-18

19 Calculation of ACVF for ARMA Process: III

2nd method (of interest when $p \geq 1$): multiply both sides of

$X_t - \phi_1 X_{t-1} - \cdots - \phi_p X_{t-p} = Z_t + \theta_1 Z_{t-1} + \cdots + \theta_q Z_{t-q}$

by $X_{t-k}$ for $k \geq 0$ and take expectations:

$\gamma(k) - \phi_1\gamma(k-1) - \cdots - \phi_p\gamma(k-p) = E\{Z_t X_{t-k}\} + \theta_1 E\{Z_{t-1}X_{t-k}\} + \cdots + \theta_q E\{Z_{t-q}X_{t-k}\}$

since $X_t = \sum_{j=0}^{\infty}\psi_j Z_{t-j}$, get

$E\{Z_{t-l}X_{t-k}\} = \sum_{j=0}^{\infty}\psi_j E\{Z_{t-l}Z_{t-k-j}\} = \sigma^2\psi_{l-k}$

(recall that $\psi_{l-k} \stackrel{\mathrm{def}}{=} 0$ when $l - k < 0$), yielding

$\gamma(k) - \phi_1\gamma(k-1) - \cdots - \phi_p\gamma(k-p) = \sigma^2\sum_{l=0}^{q}\theta_l\psi_{l-k}$

BD: 79, SS: 95    IX-19

20 Calculation of ACVF for ARMA Process: IV

since $\psi_j = 0$ for $j < 0$, right-hand side of

$\gamma(k) - \phi_1\gamma(k-1) - \cdots - \phi_p\gamma(k-p) = \sigma^2\sum_{l=0}^{q}\theta_l\psi_{l-k}$

will be 0 when $k > l$ for all $l = 0, \ldots, q$, i.e., when $k \geq q+1$

if $k \geq p$ so that $k - p \geq 0$, then $\gamma(k), \gamma(k-1), \ldots, \gamma(k-p)$ on left-hand side will involve $p+1$ distinct elements of $\{\gamma(h)\}$

thus, letting $m = \max\{q+1, p\}$, we have

$\gamma(k) - \phi_1\gamma(k-1) - \cdots - \phi_p\gamma(k-p) = 0,\quad k = m, m+1, \ldots$

theory of homogeneous linear difference equations says that, if the $p$ roots $z_j$ of $\phi(z) = 0$ are distinct, then

$\gamma(h) = \alpha_1 z_1^{-h} + \alpha_2 z_2^{-h} + \cdots + \alpha_p z_p^{-h},\quad h \geq m - p$

BD: 79, SS: 95    IX-20

21 Calculation of ACVF for ARMA Process: V

to determine $\alpha_j$'s for given $z_j$'s, plug $\gamma(h)$'s expressible as

$\gamma(h) = \alpha_1 z_1^{-h} + \alpha_2 z_2^{-h} + \cdots + \alpha_p z_p^{-h}$

into

$\gamma(k) - \phi_1\gamma(k-1) - \cdots - \phi_p\gamma(k-p) = \sigma^2\sum_{l=0}^{q}\theta_l\psi_{l-k},\quad 0 \leq k < m,$

yielding a system of $m$ linear equations to be solved for $m$ unknowns, where the unknowns are either

$\alpha_1, \ldots, \alpha_p$ when $m = p$, or

$\alpha_1, \ldots, \alpha_p, \gamma(0), \ldots, \gamma(m-p-1)$ when $m = q+1 > p$

(recall that this method is only of interest when $p \geq 1$)

note: need to recall recursive scheme for computing $\psi_j$'s given $\phi_j$'s and $\theta_j$'s (see overhead VIII-16)

BD: 79, SS: 95    IX-21

22 Example ARMA(1,1) Process: V

for ARMA(1,1) process $X_t - \phi X_{t-1} = Z_t + \theta Z_{t-1}$,

$\gamma(k) - \phi_1\gamma(k-1) - \cdots - \phi_p\gamma(k-p) = \sigma^2\sum_{l=0}^{q}\theta_l\psi_{l-k}$

becomes

$\gamma(k) - \phi\gamma(k-1) = \sigma^2(\psi_{-k} + \theta\psi_{1-k}),\quad k = 0, 1, \ldots$

overhead VII-24: $\psi_j = 0$ for $j < 0$; $\psi_0 = 1$ & $\psi_1 = \phi + \theta$, so

$\gamma(0) - \phi\gamma(1) = \sigma^2(1 + \theta\psi_1) = \sigma^2(1 + \theta[\phi+\theta])$  (1)
$\gamma(1) - \phi\gamma(0) = \sigma^2\theta$  (2)
$\gamma(k) - \phi\gamma(k-1) = 0,\quad k = 2, 3, \ldots$  (3)

root $z$ of $\phi(z) = 1 - \phi z$ is $1/\phi$, so $\gamma(h) = \alpha z^{-h} = \alpha\phi^h$, $h \geq 1$

using $\gamma(1) = \alpha\phi$ in (1) and (2) yields two linear equations to be solved to get unknowns $\alpha$ and $\gamma(0)$

BD: 79, 80    IX-22

23 Example ARMA(1,1) Process: VI

two equations are thus

$\gamma(0) - \alpha\phi^2 = \sigma^2(1 + \theta[\phi+\theta])$
$\alpha\phi - \phi\gamma(0) = \sigma^2\theta$

matrix formulation is

$\begin{bmatrix} 1 & -\phi^2 \\ -\phi & \phi \end{bmatrix}\begin{bmatrix}\gamma(0)\\ \alpha\end{bmatrix} = \begin{bmatrix}\sigma^2(1+\theta[\phi+\theta])\\ \sigma^2\theta\end{bmatrix}$

assuming $\phi \neq 0$ (i.e., ARMA(1,1) is not an MA(1)), have

$\begin{bmatrix}\gamma(0)\\ \alpha\end{bmatrix} = \frac{1}{\phi(1-\phi^2)}\begin{bmatrix}\phi & \phi^2\\ \phi & 1\end{bmatrix}\begin{bmatrix}\sigma^2(1+\theta[\phi+\theta])\\ \sigma^2\theta\end{bmatrix} = \begin{bmatrix}\sigma^2\left(1 + \frac{(\phi+\theta)^2}{1-\phi^2}\right)\\[4pt] \sigma^2\left(\frac{1+\theta(\phi+\theta)}{1-\phi^2} + \frac{\theta}{\phi(1-\phi^2)}\right)\end{bmatrix}$

BD: 79, 80    IX-23
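Numerically, the same 2x2 solve is a few lines with numpy (a sketch continuing the illustrative phi = 0.9, theta = 0.5 example; gamma(0) reproduces the 1st-method value):

```python
# 2nd method for ARMA(1,1): solve the 2x2 system above for gamma(0) and
# alpha, then gamma(h) = alpha * phi^h for h >= 1.
phi, theta, sigma2 = 0.9, 0.5, 1.0
A = np.array([[1.0, -phi ** 2],
              [-phi,  phi]])
b = sigma2 * np.array([1.0 + theta * (phi + theta), theta])
gamma0, alpha = np.linalg.solve(A, b)
print(gamma0, [alpha * phi ** h for h in (1, 2, 3)])
```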

24 Example ARMA(1,1) Process: VII

thus

$\frac{\gamma(0)}{\sigma^2} = 1 + \frac{(\phi+\theta)^2}{1-\phi^2}$

in agreement with expression obtained by 1st method (cf. overhead IX-3); hurray!!!

substituting value for $\alpha$ into $\gamma(h) = \alpha\phi^h$ yields

$\frac{\gamma(h)}{\sigma^2} = \phi^h\left(\frac{1+\theta(\phi+\theta)}{1-\phi^2} + \frac{\theta}{\phi(1-\phi^2)}\right),$

which, after some algebra, becomes

$\frac{\gamma(h)}{\sigma^2} = \phi^{h-1}\left(\phi + \theta + \frac{\phi(\phi+\theta)^2}{1-\phi^2}\right),$

the same expression we got using 1st method (see overhead IX-4); we're really on a roll!!!

BD: 79, 80    IX-24

25 Example AR(2) Process: I

for causal AR(2) process $X_t - \phi_1 X_{t-1} - \phi_2 X_{t-2} = Z_t$,

$\gamma(k) - \phi_1\gamma(k-1) - \cdots - \phi_p\gamma(k-p) = \sigma^2\sum_{l=0}^{q}\theta_l\psi_{l-k}$

becomes

$\gamma(k) - \phi_1\gamma(k-1) - \phi_2\gamma(k-2) = \sigma^2\psi_{-k},\quad k = 0, 1, \ldots$

since $\psi_0 = 1$ while $\psi_j = 0$ when $j < 0$, get

$\gamma(0) - \phi_1\gamma(1) - \phi_2\gamma(2) = \sigma^2$
$\gamma(k) - \phi_1\gamma(k-1) - \phi_2\gamma(k-2) = 0,\quad k = 1, 2, \ldots$

assuming roots $z_1$ & $z_2$ of $\phi(z) = 0$ are such that $z_1 \neq z_2$, solution takes form

$\gamma(h) = \alpha_1 z_1^{-h} + \alpha_2 z_2^{-h},\quad h \geq 0$

BD: 80-81, CC: 72    IX-25

26 Example AR(2) Process: II

substituting $\gamma(h) = \alpha_1 z_1^{-h} + \alpha_2 z_2^{-h}$ into

$\gamma(0) - \phi_1\gamma(1) - \phi_2\gamma(2) = \sigma^2$
$\gamma(1) - \phi_1\gamma(0) - \phi_2\gamma(1) = 0$

leads to

$(\alpha_1 + \alpha_2) - \phi_1\left(\alpha_1 z_1^{-1} + \alpha_2 z_2^{-1}\right) - \phi_2\left(\alpha_1 z_1^{-2} + \alpha_2 z_2^{-2}\right) = \sigma^2$
$\left(\alpha_1 z_1^{-1} + \alpha_2 z_2^{-1}\right) - \phi_1(\alpha_1 + \alpha_2) - \phi_2\left(\alpha_1 z_1^{-1} + \alpha_2 z_2^{-1}\right) = 0$

collecting terms gives

$\alpha_1\left(1 - \phi_1 z_1^{-1} - \phi_2 z_1^{-2}\right) + \alpha_2\left(1 - \phi_1 z_2^{-1} - \phi_2 z_2^{-2}\right) = \sigma^2$
$\alpha_1\left(z_1^{-1} - \phi_1 - \phi_2 z_1^{-1}\right) + \alpha_2\left(z_2^{-1} - \phi_1 - \phi_2 z_2^{-1}\right) = 0$

BD: 80-81, CC: 72    IX-26

27 Example AR(2) Process: III

in matrix form, we have

$\begin{bmatrix} 1 - \phi_1 z_1^{-1} - \phi_2 z_1^{-2} & 1 - \phi_1 z_2^{-1} - \phi_2 z_2^{-2}\\ z_1^{-1} - \phi_1 - \phi_2 z_1^{-1} & z_2^{-1} - \phi_1 - \phi_2 z_2^{-1}\end{bmatrix}\begin{bmatrix}\alpha_1\\ \alpha_2\end{bmatrix} = \begin{bmatrix}\sigma^2\\ 0\end{bmatrix}$

now

$\phi(z) = 1 - \phi_1 z - \phi_2 z^2 = \left(1 - \frac{z}{z_1}\right)\left(1 - \frac{z}{z_2}\right) = 1 - \left(\frac{1}{z_1} + \frac{1}{z_2}\right)z + \frac{z^2}{z_1 z_2}$

tells us that $\phi_1 = z_1^{-1} + z_2^{-1}$ and $\phi_2 = -z_1^{-1}z_2^{-1}$, which yields

$\begin{bmatrix} 1 - z_1^{-2} - z_1^{-1}z_2^{-1} + z_1^{-3}z_2^{-1} & 1 - z_2^{-2} - z_1^{-1}z_2^{-1} + z_1^{-1}z_2^{-3}\\ z_1^{-2}z_2^{-1} - z_2^{-1} & z_1^{-1}z_2^{-2} - z_1^{-1}\end{bmatrix}\begin{bmatrix}\alpha_1\\ \alpha_2\end{bmatrix} = \begin{bmatrix}\sigma^2\\ 0\end{bmatrix}$

solving above for $\alpha_1$ and $\alpha_2$ yields solutions in terms of $\sigma^2$ and roots $z_1$ and $z_2$

BD: 80-81, CC: 72    IX-27

28 Example AR(2) Process: IV

plugging solutions for $\alpha_1$ and $\alpha_2$ into $\gamma(h) = \alpha_1 z_1^{-h} + \alpha_2 z_2^{-h}$ yields (after a considerable amount of reduction!)

$\gamma(h) = \frac{\sigma^2 z_1^2 z_2^2}{(z_1 z_2 - 1)(z_2 - z_1)}\left[\frac{z_1^{1-h}}{z_1^2 - 1} - \frac{z_2^{1-h}}{z_2^2 - 1}\right]$

for complex conjugate roots $z_1 = re^{i\omega}$ and $z_2 = \bar z_1$, have

$\gamma(h)/\gamma(0) = r^{-h}\sin(h\omega + \psi)/\sin(\psi),\qquad (*)$

where

$\gamma(0) = \frac{\sigma^2(r^6 + r^4)}{(r^2 - 1)(r^4 - 2r^2\cos(2\omega) + 1)}$ and $\tan(\psi) = \frac{r^2 + 1}{r^2 - 1}\tan(\omega)$

note: $r > 1$ (roots are assumed to be outside unit circle)

$(*)$ is damped sinusoid with period $2\pi/\omega$, with damping slow when $r \approx 1$, i.e., when roots are close to unit circle

BD: 80-81, CC: 72    IX-28
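A hedged check of the damped-sinusoid form: for assumed illustrative roots with r = 1.1 and omega = pi/6 (so phi_1 = 2 cos(omega)/r and phi_2 = -1/r^2), the formula above matches the truncated MA(infinity) sum from the 1st-method sketch:

```python
# AR(2) with complex conjugate roots z = r e^{+-i omega}, r > 1: compare
# the damped sinusoid gamma(0) r^{-h} sin(h omega + psi)/sin(psi) with
# the numerical ACVF (sigma^2 = 1 throughout).
r, omega = 1.1, np.pi / 6                       # illustrative values
phi1, phi2 = 2 * np.cos(omega) / r, -1.0 / r ** 2
psi_ang = np.arctan((r ** 2 + 1) / (r ** 2 - 1) * np.tan(omega))
gamma0 = (r ** 6 + r ** 4) / (
    (r ** 2 - 1) * (r ** 4 - 2 * r ** 2 * np.cos(2 * omega) + 1))
for h in range(4):
    damped = gamma0 * r ** (-h) * np.sin(h * omega + psi_ang) / np.sin(psi_ang)
    print(h, damped, acvf_ma_inf([phi1, phi2], [], 1.0, h))
```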

29 ACVF for AR(2) Process with $z_1 = 2$ & $z_2 = 5$ [figure: ACVF vs. lag h] BD: 80 IX-29

30 Reciprocal Roots Plot for $z_1 = 2$ & $z_2 = 5$ [figure: complex plane] IX-30

31 Realization of AR(2) Process [figure: $x_t$ vs. t] IX-31

32 ACVF for AR(2) Process with $z_1 = 10/9$ & $z_2 = 2$ [figure: ACVF vs. lag h] BD: 81 IX-32

33 Reciprocal Roots Plot for $z_1 = 10/9$ & $z_2 = 2$ [figure: complex plane] IX-33

34 Realization of AR(2) Process [figure: $x_t$ vs. t] IX-34

35 ACVF for AR(2) Process with $z_1 = -10/9$ & $z_2 = 2$ [figure: ACVF vs. lag h] BD: 81 IX-35

36 Reciprocal Roots Plot for $z_1 = -10/9$ & $z_2 = 2$ [figure: complex plane] IX-36

37 Realization of AR(2) Process [figure: $x_t$ vs. t] IX-37

38 ACVF for AR(2) Process with complex conjugate roots $z_1$ & $z_2 = \bar z_1$ [figure: ACVF vs. lag h] BD: 82 IX-38

39 Reciprocal Roots Plot for $z_1$ & $z_2 = \bar z_1$ [figure: complex plane] IX-39

40 Realization of AR(2) Process [figure: $x_t$ vs. t] IX-40

41 ACVF for AR(2) Process, $z_1 \approx i$ & $z_2 = \bar z_1$ [figure: ACVF vs. lag h] IX-41

42 Reciprocal Roots Plot for $z_1 \approx i$ & $z_2 = \bar z_1$ [figure: complex plane] IX-42

43 Realization of AR(2) Process [figure: $x_t$ vs. t] IX-43

44 Calculation of ACVF for ARMA Process: VI

3rd method uses equations derived for 2nd method (IX-19):

$\gamma(k) - \phi_1\gamma(k-1) - \cdots - \phi_p\gamma(k-p) = \sigma^2\sum_{l=0}^{q}\theta_l\psi_{l-k},\quad k \geq 0\qquad (*)$

letting $k = 0, 1, \ldots, p$ gives following system of equations:

$\gamma(0) - \phi_1\gamma(1) - \cdots - \phi_p\gamma(p) = \sigma^2\sum_{l=0}^{q}\theta_l\psi_l \stackrel{\mathrm{def}}{=} c_0$
$\gamma(1) - \phi_1\gamma(0) - \cdots - \phi_p\gamma(p-1) = \sigma^2\sum_{l=0}^{q}\theta_l\psi_{l-1} \stackrel{\mathrm{def}}{=} c_1$
$\quad\vdots$
$\gamma(p) - \phi_1\gamma(p-1) - \cdots - \phi_p\gamma(0) = \sigma^2\sum_{l=0}^{q}\theta_l\psi_{l-p} \stackrel{\mathrm{def}}{=} c_p$

BD: 81    IX-44

45 Calculation of ACVF for ARMA Process: VII

leads to following matrix equation (negative lags on the left-hand sides are folded in via $\gamma(-m) = \gamma(m)$):

$\begin{bmatrix} 1 & -\phi_1 & -\phi_2 & \cdots & -\phi_{p-1} & -\phi_p\\ -\phi_1 & 1-\phi_2 & -\phi_3 & \cdots & -\phi_p & 0\\ -\phi_2 & -\phi_1-\phi_3 & 1-\phi_4 & \cdots & 0 & 0\\ \vdots & \vdots & \vdots & & \vdots & \vdots\\ -\phi_{p-1} & -\phi_{p-2}-\phi_p & -\phi_{p-3} & \cdots & 1 & 0\\ -\phi_p & -\phi_{p-1} & -\phi_{p-2} & \cdots & -\phi_1 & 1 \end{bmatrix}\begin{bmatrix}\gamma(0)\\ \gamma(1)\\ \gamma(2)\\ \vdots\\ \gamma(p-1)\\ \gamma(p)\end{bmatrix} = \begin{bmatrix}c_0\\ c_1\\ c_2\\ \vdots\\ c_{p-1}\\ c_p\end{bmatrix}$

solving above gives ACVF for lags 0 to $p$

lags $k \geq p+1$ can be gotten recursively by rearranging $(*)$:

$\gamma(k) = \phi_1\gamma(k-1) + \cdots + \phi_p\gamma(k-p) + \sigma^2\sum_{l=0}^{q}\theta_l\psi_{l-k}$

BD: 81    IX-45
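A sketch of the 3rd method in Python (reusing psi_weights from the 1st-method sketch; the matrix assembly folds in gamma(-m) = gamma(m) and uses theta_0 = 1):

```python
# 3rd method: assemble and solve the (p+1)x(p+1) system for
# gamma(0), ..., gamma(p), then extend by the rearranged recursion.
def acvf_arma(phi, theta, sigma2, hmax):
    p, q = len(phi), len(theta)
    th = np.r_[1.0, theta]
    psi = psi_weights(phi, theta, q)             # only psi_0, ..., psi_q needed
    c = [sigma2 * sum(th[l] * psi[l - k] for l in range(k, q + 1))
         for k in range(max(p, q) + hmax + 1)]   # c_k = 0 once k > q
    A = np.zeros((p + 1, p + 1))
    for k in range(p + 1):
        A[k, k] += 1.0
        for j in range(1, p + 1):
            A[k, abs(k - j)] -= phi[j - 1]       # gamma(k-j), with symmetry
    gamma = list(np.linalg.solve(A, c[: p + 1]))
    for k in range(p + 1, hmax + 1):
        gamma.append(sum(phi[j - 1] * gamma[k - j]
                         for j in range(1, p + 1)) + c[k])
    return np.array(gamma[: hmax + 1])

print(acvf_arma([0.9], [0.5], 1.0, 3))           # matches the earlier methods
```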

46 Example ARMA(1,1) Process: VIII

as noted before, for ARMA(1,1) process,

$\gamma(k) - \phi_1\gamma(k-1) - \cdots - \phi_p\gamma(k-p) = \sigma^2\sum_{l=0}^{q}\theta_l\psi_{l-k}$

becomes $\gamma(k) - \phi\gamma(k-1) = \sigma^2(\psi_{-k} + \theta\psi_{1-k})$, yielding

$\gamma(0) - \phi\gamma(1) = \sigma^2(1 + \theta\psi_1) = \sigma^2(1 + \theta[\phi+\theta])$
$\gamma(1) - \phi\gamma(0) = \sigma^2\theta$
$\gamma(k) - \phi\gamma(k-1) = 0,\quad k = 2, 3, \ldots$

can get $\gamma(0)$ & $\gamma(1)$ by solving

$\begin{bmatrix} 1 & -\phi\\ -\phi & 1\end{bmatrix}\begin{bmatrix}\gamma(0)\\ \gamma(1)\end{bmatrix} = \begin{bmatrix}\sigma^2(1+\theta[\phi+\theta])\\ \sigma^2\theta\end{bmatrix}$

remaining values gotten from $\gamma(k) = \phi\gamma(k-1)$, $k = 2, 3, \ldots$

IX-46

47 Calculation of ACVF for ARMA Process: VIII

4th method uses the fact that an ARMA(p, q) process can be created by filtering an AR(p) process

starting with the representation $\phi(B)X_t = \theta(B)Z_t$, note that causality allows us to write

$X_t = \phi^{-1}(B)\theta(B)Z_t = \theta(B)\phi^{-1}(B)Z_t = \theta(B)Y_t$, where $Y_t \stackrel{\mathrm{def}}{=} \phi^{-1}(B)Z_t$

since we can also write $\phi(B)Y_t = Z_t$, it follows that $\{Y_t\}$ is an AR(p) process, from which we can get $\{X_t\}$ by subjecting $\{Y_t\}$ to the MA filter $\theta(B) = 1 + \theta_1 B + \cdots + \theta_q B^q$

IX-47

48 Calculation of ACVF for ARMA Process: IX

recall that, if $\{Y_t\}$ is a stationary process with mean 0 and ACVF $\{\gamma_Y(h)\}$, then

$X_t \stackrel{\mathrm{def}}{=} \sum_{j=0}^{q}\theta_j Y_{t-j}$, where, as usual, $\theta_0 = 1$,

is stationary with mean 0 and ACVF

$\gamma_X(h) = \sum_{j=0}^{q}\sum_{k=0}^{q}\theta_j\theta_k\gamma_Y(h+k-j)$

(the above follows readily from overhead VII-4)

hence, if we can get ACVF for AR(p) process, we can readily compute ACVF for ARMA(p, q) process

IX-48
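In code, this filtering relation is just a double sum over the ACVF of {Y_t} (a sketch; gam_y is assumed to hold gamma_Y(0), ..., gamma_Y(hmax + q), with symmetry supplying negative lags):

```python
# ACVF of X_t = sum_{j=0}^{q} theta_j Y_{t-j} from the ACVF of {Y_t}:
# gamma_X(h) = sum_j sum_k theta_j theta_k gamma_Y(h + k - j).
def ma_filter_acvf(theta, gam_y, hmax):
    th = np.r_[1.0, theta]                      # theta_0 = 1, as usual
    q = len(theta)
    gy = lambda m: gam_y[abs(m)]                # gamma_Y(-m) = gamma_Y(m)
    return np.array([sum(th[j] * th[k] * gy(h + k - j)
                         for j in range(q + 1) for k in range(q + 1))
                     for h in range(hmax + 1)])
```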

49 Calculation of ACVF for ARMA Process: X

reconsider equations derived for 2nd method (IX-19), namely,

$\gamma(k) - \phi_1\gamma(k-1) - \cdots - \phi_p\gamma(k-p) = \sigma^2\sum_{l=0}^{q}\theta_l\psi_{l-k},\quad k \geq 0$

and specialize them for AR(p) case (i.e., $q = 0$):

$\gamma(k) - \phi_1\gamma(k-1) - \cdots - \phi_p\gamma(k-p) = \sigma^2\psi_{-k},\quad k \geq 0,$

where $\psi_0 = 1$, while $\psi_{-k} = 0$ for all $k > 0$

IX-49

50 Calculation of ACVF for ARMA Process: XI

leads to following matrix equation (special case of 3rd method):

$\begin{bmatrix} 1 & -\phi_1 & -\phi_2 & \cdots & -\phi_{p-1} & -\phi_p\\ -\phi_1 & 1-\phi_2 & -\phi_3 & \cdots & -\phi_p & 0\\ -\phi_2 & -\phi_1-\phi_3 & 1-\phi_4 & \cdots & 0 & 0\\ \vdots & \vdots & \vdots & & \vdots & \vdots\\ -\phi_p & -\phi_{p-1} & -\phi_{p-2} & \cdots & -\phi_1 & 1 \end{bmatrix}\begin{bmatrix}\gamma_Y(0)\\ \gamma_Y(1)\\ \gamma_Y(2)\\ \vdots\\ \gamma_Y(p)\end{bmatrix} = \begin{bmatrix}\sigma^2\\ 0\\ 0\\ \vdots\\ 0\end{bmatrix}$

after solving above to get $\gamma_Y(k)$ for lags 0 to $p$, can get it for lags $k \geq p+1$ recursively from

$\gamma_Y(k) = \phi_1\gamma_Y(k-1) + \cdots + \phi_p\gamma_Y(k-p)$

exercise: use this approach to get ARMA(1,1) $\gamma_X(h)$ (a sketch follows below)

note: after discussion of Levinson-Durbin recursions, can formulate a 5th method that is a variation on the 4th

IX-50
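A sketch of the exercise: gamma_Y comes from the q = 0 special case of acvf_arma (3rd-method sketch), and ma_filter_acvf then supplies the ARMA(1,1) ACVF, matching the earlier methods:

```python
# 4th method, ARMA(1,1) exercise: AR(1) ACVF, then the MA filter theta(B).
phi, theta, sigma2, hmax = [0.9], [0.5], 1.0, 3
gam_y = acvf_arma(phi, [], sigma2, hmax + len(theta))   # lags 0..hmax+q
print(ma_filter_acvf(theta, gam_y, hmax))               # ~[11.316 10.684 9.616 8.654]
```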

51 Calculation of ACVF for ARMA Process: Summary

1st method can lead to analytic expression based directly on

$\gamma(h) = \sigma^2\sum_{j=0}^{\infty}\psi_j\psi_{j+|h|}$

(if not, gives easy way to calculate $\gamma(h)$ approximately)

2nd method based on

$\gamma(h) - \phi_1\gamma(h-1) - \cdots - \phi_p\gamma(h-p) = \sigma^2\sum_{l=0}^{q}\theta_l\psi_{l-h}$

(gives analytic expression and/or exact calculation of $\gamma(h)$)

3rd method is variation on 2nd (starts with same equations)

4th method gets $\gamma(h)$ via two-stage procedure using idea that ARMA process is result of filtering AR process with MA filter

IX-51
