Introduction to Time Series Analysis. Lecture 15.


1 Introduction to Time Series Analysis. Lecture 15. Spectral Analysis
1. Spectral density: facts and examples.
2. Spectral distribution function.
3. Wold's decomposition.

2 Spectral Analysis
Idea: decompose a stationary time series {X_t} into a combination of sinusoids, with random (and uncorrelated) coefficients, just as in Fourier analysis, where we decompose (deterministic) functions into combinations of sinusoids.
This is referred to as spectral analysis or analysis in the frequency domain, in contrast to the time domain approach we have considered so far. The frequency domain approach considers regression on sinusoids; the time domain approach considers regression on past values of the time series.

3 A periodic time series
Consider X_t = A sin(2πνt) + B cos(2πνt) = C sin(2πνt + φ), where A, B are uncorrelated, mean zero, variance σ² = 1, and C² = A² + B², tan φ = B/A. Then
μ_t = E[X_t] = 0,
γ(t, t+h) = cos(2πνh).
So {X_t} is stationary.

4 An aside: Some trigonometric identities
tan θ = sin θ / cos θ,
sin²θ + cos²θ = 1,
sin(a + b) = sin a cos b + cos a sin b,
cos(a + b) = cos a cos b − sin a sin b.

5 A periodic time series
For X_t = A sin(2πνt) + B cos(2πνt), with uncorrelated A, B (mean 0, variance σ²),
γ(h) = σ² cos(2πνh).
The autocovariance of the sum of two uncorrelated time series is the sum of their autocovariances. Thus, the autocovariance of a sum of random sinusoids is a sum of sinusoids with the corresponding frequencies:
X_t = Σ_{j=1}^{k} (A_j sin(2πν_j t) + B_j cos(2πν_j t)),
γ(h) = Σ_{j=1}^{k} σ_j² cos(2πν_j h),
where the A_j, B_j are uncorrelated, mean zero, and Var(A_j) = Var(B_j) = σ_j².
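
As a quick numerical illustration (not from the slides), a sketch that estimates γ(h) = E[X_{t₀} X_{t₀+h}] by averaging over many independent realizations; the frequencies ν_j, variances σ_j², base time t₀, and lag range are arbitrary choices.

    import numpy as np

    rng = np.random.default_rng(0)
    nus = np.array([0.1, 0.25])        # frequencies nu_j (arbitrary choices)
    sig2 = np.array([1.0, 2.0])        # variances sigma_j^2 (arbitrary choices)
    reps, t0, hmax = 20000, 5, 10      # replications, base time, largest lag

    t = np.arange(t0, t0 + hmax + 1)   # times t0, t0+1, ..., t0+hmax
    X = np.zeros((reps, t.size))
    for nu, s2 in zip(nus, sig2):
        A = rng.normal(0.0, np.sqrt(s2), (reps, 1))
        B = rng.normal(0.0, np.sqrt(s2), (reps, 1))
        X += A * np.sin(2 * np.pi * nu * t) + B * np.cos(2 * np.pi * nu * t)

    # Ensemble estimate of gamma(h) = E[X_{t0} X_{t0+h}] against the formula.
    gamma_hat = (X[:, [0]] * X).mean(axis=0)
    h = np.arange(hmax + 1)
    gamma_theory = sum(s2 * np.cos(2 * np.pi * nu * h) for nu, s2 in zip(nus, sig2))
    print(np.round(gamma_hat, 2))
    print(np.round(gamma_theory, 2))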

6 A periodic time series
X_t = Σ_{j=1}^{k} (A_j sin(2πν_j t) + B_j cos(2πν_j t)), γ(h) = Σ_{j=1}^{k} σ_j² cos(2πν_j h).
Thus, we can represent γ(h) using a Fourier series. The coefficients are the variances of the sinusoidal components.
The spectral density is the continuous analog: the Fourier transform of γ. (The analogous spectral representation of a stationary process X_t involves a stochastic integral; a sum of discrete components at a finite number of frequencies is a special case. We won't consider this representation in this course.)

7 Spectral density
If a time series {X_t} has autocovariance γ satisfying Σ_{h=−∞}^{∞} |γ(h)| < ∞, then we define its spectral density as
f(ν) = Σ_{h=−∞}^{∞} γ(h) e^{−2πiνh}
for −∞ < ν < ∞.

8 Spectral density: Some facts
1. We have |Σ_{h=−∞}^{∞} γ(h) e^{−2πiνh}| < ∞. This is because |e^{iθ}| = |cos θ + i sin θ| = (cos²θ + sin²θ)^{1/2} = 1, and because of the absolute summability of γ.
2. f is periodic, with period 1. This is true since e^{−2πiνh} is a periodic function of ν with period 1. Thus, we can restrict the domain of f to −1/2 ≤ ν ≤ 1/2. (The text does this.)

9 Spectral density: Some facts
3. f is even (that is, f(ν) = f(−ν)). To see this, write
f(ν) = Σ_{h=−∞}^{−1} γ(h) e^{−2πiνh} + γ(0) + Σ_{h=1}^{∞} γ(h) e^{−2πiνh},
f(−ν) = Σ_{h=−∞}^{−1} γ(h) e^{2πiνh} + γ(0) + Σ_{h=1}^{∞} γ(h) e^{2πiνh}
= Σ_{h=1}^{∞} γ(−h) e^{−2πiνh} + γ(0) + Σ_{h=−∞}^{−1} γ(−h) e^{−2πiνh}
= f(ν),
since γ is even (γ(−h) = γ(h)).
4. f(ν) ≥ 0.

10 Spectral density: Some facts
5. γ(h) = ∫_{−1/2}^{1/2} e^{2πiνh} f(ν) dν. To see this, write
∫_{−1/2}^{1/2} e^{2πiνh} f(ν) dν = ∫_{−1/2}^{1/2} Σ_{j=−∞}^{∞} γ(j) e^{−2πiν(j−h)} dν
= Σ_{j=−∞}^{∞} γ(j) ∫_{−1/2}^{1/2} e^{−2πiν(j−h)} dν
= γ(h) + Σ_{j≠h} γ(j) (e^{−πi(j−h)} − e^{πi(j−h)}) / (−2πi(j−h))
= γ(h) + Σ_{j≠h} γ(j) sin(π(j−h)) / (π(j−h))
= γ(h).

11 Example: White noise
For white noise {W_t}, we have seen that γ(0) = σ_w² and γ(h) = 0 for h ≠ 0. Thus,
f(ν) = Σ_{h=−∞}^{∞} γ(h) e^{−2πiνh} = γ(0) = σ_w².
That is, the spectral density is constant across all frequencies: each frequency in the spectrum contributes equally to the variance. This is the origin of the name white noise: it is like white light, which is a uniform mixture of all frequencies in the visible spectrum.

12 Example: AR(1)
For X_t = φ₁X_{t−1} + W_t, we have seen that γ(h) = σ_w² φ₁^{|h|} / (1 − φ₁²). Thus,
f(ν) = Σ_{h=−∞}^{∞} γ(h) e^{−2πiνh} = (σ_w² / (1 − φ₁²)) Σ_{h=−∞}^{∞} φ₁^{|h|} e^{−2πiνh}
= (σ_w² / (1 − φ₁²)) (1 + Σ_{h=1}^{∞} φ₁^{h} (e^{2πiνh} + e^{−2πiνh}))
= (σ_w² / (1 − φ₁²)) (1 + φ₁e^{2πiν} / (1 − φ₁e^{2πiν}) + φ₁e^{−2πiν} / (1 − φ₁e^{−2πiν}))
= (σ_w² / (1 − φ₁²)) · (1 − φ₁²) / ((1 − φ₁e^{2πiν})(1 − φ₁e^{−2πiν}))
= σ_w² / (1 − 2φ₁ cos(2πν) + φ₁²).

13 Examples
White noise: {W_t}, γ(0) = σ_w² and γ(h) = 0 for h ≠ 0. f(ν) = γ(0) = σ_w².
AR(1): X_t = φ₁X_{t−1} + W_t, γ(h) = σ_w²φ₁^{|h|}/(1 − φ₁²). f(ν) = σ_w² / (1 − 2φ₁ cos(2πν) + φ₁²).
If φ₁ > 0 (positive autocorrelation), the spectrum is dominated by low frequency components: smooth in the time domain.
If φ₁ < 0 (negative autocorrelation), the spectrum is dominated by high frequency components: rough in the time domain.

14 Example: AR(1)
[Figure: spectral density of AR(1), X_t = +0.9 X_{t−1} + W_t, plotted as f(ν) against ν.]

15 Example: AR(1)
[Figure: spectral density of AR(1), X_t = −0.9 X_{t−1} + W_t, plotted as f(ν) against ν.]
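
A minimal sketch (not part of the slides) that evaluates the AR(1) spectral density for φ₁ = ±0.9, reproducing the qualitative shapes of the two figures above; σ_w² = 1 is assumed.

    import numpy as np

    def ar1_spectral_density(phi1, sigma2_w, nu):
        # f(nu) = sigma_w^2 / (1 - 2 phi_1 cos(2 pi nu) + phi_1^2)
        return sigma2_w / (1 - 2 * phi1 * np.cos(2 * np.pi * nu) + phi1 ** 2)

    nu = np.linspace(0, 0.5, 6)
    print(np.round(ar1_spectral_density(+0.9, 1.0, nu), 3))  # peaks at nu = 0
    print(np.round(ar1_spectral_density(-0.9, 1.0, nu), 3))  # peaks at nu = 0.5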

16 Introduction to Time Series Analysis. Lecture 16.
1. Review: Spectral density.
2. Examples.
3. Spectral distribution function.
4. Autocovariance generating function and spectral density.

17 Review: Spectral density
If a time series {X_t} has autocovariance γ satisfying Σ_{h=−∞}^{∞} |γ(h)| < ∞, then we define its spectral density as
f(ν) = Σ_{h=−∞}^{∞} γ(h) e^{−2πiνh}
for −∞ < ν < ∞.

18 Review: Spectral density
1. f(ν) is real.
2. f(ν) ≥ 0.
3. f is periodic, with period 1. So we restrict the domain of f to −1/2 ≤ ν ≤ 1/2.
4. f is even (that is, f(ν) = f(−ν)).
5. γ(h) = ∫_{−1/2}^{1/2} e^{2πiνh} f(ν) dν.

19 Examples
White noise: {W_t}, γ(0) = σ_w² and γ(h) = 0 for h ≠ 0. f(ν) = γ(0) = σ_w².
AR(1): X_t = φ₁X_{t−1} + W_t, γ(h) = σ_w²φ₁^{|h|}/(1 − φ₁²). f(ν) = σ_w² / (1 − 2φ₁ cos(2πν) + φ₁²).
If φ₁ > 0 (positive autocorrelation), the spectrum is dominated by low frequency components: smooth in the time domain.
If φ₁ < 0 (negative autocorrelation), the spectrum is dominated by high frequency components: rough in the time domain.

20 Example: AR(1)
[Figure: spectral density of AR(1), X_t = +0.9 X_{t−1} + W_t, plotted as f(ν) against ν.]

21 Example: AR(1)
[Figure: spectral density of AR(1), X_t = −0.9 X_{t−1} + W_t, plotted as f(ν) against ν.]

22 Example: MA(1)
X_t = W_t + θ₁W_{t−1}.
γ(h) = σ_w²(1 + θ₁²) if h = 0; σ_w²θ₁ if |h| = 1; 0 otherwise.
f(ν) = Σ_{h=−1}^{1} γ(h) e^{−2πiνh} = γ(0) + 2γ(1) cos(2πν)
= σ_w² (1 + θ₁² + 2θ₁ cos(2πν)).

23 Example: MA(1)
X_t = W_t + θ₁W_{t−1}. f(ν) = σ_w² (1 + θ₁² + 2θ₁ cos(2πν)).
If θ₁ > 0 (positive autocorrelation), the spectrum is dominated by low frequency components: smooth in the time domain.
If θ₁ < 0 (negative autocorrelation), the spectrum is dominated by high frequency components: rough in the time domain.

24 Example: MA(1)
[Figure: spectral density of MA(1), X_t = W_t + 0.9 W_{t−1}, plotted as f(ν) against ν.]

25 Example: MA(1)
[Figure: spectral density of MA(1), X_t = W_t − 0.9 W_{t−1}, plotted as f(ν) against ν.]
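
The same kind of sketch for the MA(1) closed form, again with σ_w² = 1 assumed, evaluated at ν = 0, 1/4, 1/2 to show the low-pass and high-pass shapes of the two figures:

    import numpy as np

    def ma1_spectral_density(theta1, sigma2_w, nu):
        # f(nu) = sigma_w^2 (1 + theta_1^2 + 2 theta_1 cos(2 pi nu))
        return sigma2_w * (1 + theta1 ** 2 + 2 * theta1 * np.cos(2 * np.pi * nu))

    nu = np.array([0.0, 0.25, 0.5])
    print(ma1_spectral_density(+0.9, 1.0, nu))  # [3.61, 1.81, 0.01]: low-pass shape
    print(ma1_spectral_density(-0.9, 1.0, nu))  # [0.01, 1.81, 3.61]: high-pass shape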

26 Introduction to Time Series Analysis. Lecture 16.
1. Review: Spectral density.
2. Examples.
3. Spectral distribution function.
4. Autocovariance generating function and spectral density.
5. Rational spectra. Poles and zeros.

27 Recall: A periodic time series
X_t = Σ_{j=1}^{k} (A_j sin(2πν_j t) + B_j cos(2πν_j t))
= Σ_{j=1}^{k} (A_j² + B_j²)^{1/2} sin(2πν_j t + tan^{−1}(B_j/A_j)).
E[X_t] = 0,
γ(h) = Σ_{j=1}^{k} σ_j² cos(2πν_j h),
Σ_{h=−∞}^{∞} |γ(h)| = ∞.

28 Discrete spectral distribution function
For X_t = A sin(2πλt) + B cos(2πλt), we have γ(h) = σ² cos(2πλh), and we can write
γ(h) = ∫_{−1/2}^{1/2} e^{2πiνh} dF(ν),
where F is the discrete distribution
F(ν) = 0 if ν < −λ; σ²/2 if −λ ≤ ν < λ; σ² otherwise.
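
Since F places mass σ²/2 at ±λ, the integral against dF reduces to a two-term sum. A sketch (λ and σ² are arbitrary choices) confirming it equals σ² cos(2πλh):

    import numpy as np

    lam, sigma2 = 0.15, 2.0   # hypothetical frequency and variance
    h = np.arange(6)

    # Integral against dF: point masses of sigma^2/2 at nu = -lam and nu = +lam.
    gamma = (sigma2 / 2) * (np.exp(-2j * np.pi * lam * h) + np.exp(2j * np.pi * lam * h))
    print(np.allclose(gamma, sigma2 * np.cos(2 * np.pi * lam * h)))  # True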

29 The spectral distribution function
For any stationary {X_t} with autocovariance γ, we can write
γ(h) = ∫_{−1/2}^{1/2} e^{2πiνh} dF(ν),
where F is the spectral distribution function of {X_t}. We can split F into three components: discrete, continuous, and singular.
If γ is absolutely summable, F is continuous: dF(ν) = f(ν) dν.
If γ is a sum of sinusoids, F is discrete.

30 The spectral distribution function
For X_t = Σ_{j=1}^{k} (A_j sin(2πν_j t) + B_j cos(2πν_j t)), the spectral distribution function is F(ν) = Σ_{j=1}^{k} σ_j² F_j(ν), where
F_j(ν) = 0 if ν < −ν_j; 1/2 if −ν_j ≤ ν < ν_j; 1 otherwise.

31 Wold's decomposition
Notice that X_t = Σ_{j=1}^{k} (A_j sin(2πν_j t) + B_j cos(2πν_j t)) is deterministic (once we've seen the past, we can predict the future without error).
Wold showed that every stationary process can be represented as
X_t = X_t^{(d)} + X_t^{(n)},
where X_t^{(d)} is purely deterministic and X_t^{(n)} is purely nondeterministic. (c.f. the decomposition of a spectral distribution function as F^{(d)} + F^{(c)}.)
Example: X_t = A sin(2πλt) + (θ(B)/φ(B)) W_t.

32 Introduction to Time Series Analysis. Lecture 16.
1. Review: Spectral density.
2. Examples.
3. Spectral distribution function.
4. Autocovariance generating function and spectral density.

33 Autocovariance generating function and spectral density
Suppose X_t is a linear process, so it can be written X_t = Σ_{i=0}^{∞} ψ_i W_{t−i} = ψ(B)W_t.
Consider the autocovariance sequence:
γ_h = Cov(X_t, X_{t+h}) = E[(Σ_{i=0}^{∞} ψ_i W_{t−i})(Σ_{j=0}^{∞} ψ_j W_{t+h−j})] = σ_w² Σ_{i=0}^{∞} ψ_i ψ_{i+h}.

34 Autocovariance generating function and spectral density
Define the autocovariance generating function as
γ(B) = Σ_{h=−∞}^{∞} γ_h B^h.
Then
γ(B) = σ_w² Σ_{h=−∞}^{∞} Σ_{i=0}^{∞} ψ_i ψ_{i+h} B^h
= σ_w² Σ_{i=0}^{∞} Σ_{j=0}^{∞} ψ_i ψ_j B^{j−i}
= σ_w² (Σ_{i=0}^{∞} ψ_i B^{−i})(Σ_{j=0}^{∞} ψ_j B^j) = σ_w² ψ(B^{−1}) ψ(B).

35 Autocovariance generating function and spectral density
Notice that γ(B) = Σ_{h=−∞}^{∞} γ_h B^h, so
f(ν) = Σ_{h=−∞}^{∞} γ_h e^{−2πiνh} = γ(e^{−2πiν})
= σ_w² ψ(e^{2πiν}) ψ(e^{−2πiν}) = σ_w² |ψ(e^{−2πiν})|².

36 Autocovariance generating function and spectral density
For example, for an MA(q), we have ψ(B) = θ(B), so
f(ν) = σ_w² θ(e^{2πiν}) θ(e^{−2πiν}) = σ_w² |θ(e^{−2πiν})|².
For MA(1),
f(ν) = σ_w² |1 + θ₁e^{−2πiν}|² = σ_w² |1 + θ₁cos(2πν) − iθ₁sin(2πν)|²
= σ_w² (1 + 2θ₁cos(2πν) + θ₁²).

37 Autocovariance generating function and spectral density
For an AR(p), we have ψ(B) = 1/φ(B), so
f(ν) = σ_w² / (φ(e^{2πiν}) φ(e^{−2πiν})) = σ_w² / |φ(e^{−2πiν})|².
For AR(1),
f(ν) = σ_w² / |1 − φ₁e^{−2πiν}|² = σ_w² / (1 − 2φ₁cos(2πν) + φ₁²).
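
A sketch (not from the slides) checking σ_w²|ψ(e^{−2πiν})|² against the closed forms above, with ψ(z) = 1 + θ₁z for MA(1) and ψ(z) = 1/(1 − φ₁z) for AR(1); the parameter values are arbitrary:

    import numpy as np

    nu = np.linspace(0, 0.5, 201)
    z = np.exp(-2j * np.pi * nu)    # the point e^{-2 pi i nu} on the unit circle
    sigma2_w, theta1, phi1 = 1.0, 0.9, 0.9

    f_ma1 = sigma2_w * np.abs(1 + theta1 * z) ** 2   # sigma_w^2 |theta(e^{-2 pi i nu})|^2
    f_ar1 = sigma2_w / np.abs(1 - phi1 * z) ** 2     # sigma_w^2 / |phi(e^{-2 pi i nu})|^2

    print(np.allclose(f_ma1, sigma2_w * (1 + 2 * theta1 * np.cos(2 * np.pi * nu) + theta1 ** 2)))
    print(np.allclose(f_ar1, sigma2_w / (1 - 2 * phi1 * np.cos(2 * np.pi * nu) + phi1 ** 2)))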

38 Spectral density of a linear process
If X_t is a linear process, it can be written X_t = Σ_{i=0}^{∞} ψ_i W_{t−i} = ψ(B)W_t. Then
f(ν) = σ_w² |ψ(e^{−2πiν})|².
That is, the spectral density f(ν) of a linear process is σ_w² times the squared modulus of the ψ (MA(∞)) polynomial evaluated at the point e^{−2πiν} on the unit circle.

39 Introduction to Time Series Analysis. Lecture 17.
1. Review: Spectral distribution function, spectral density.
2. Rational spectra. Poles and zeros.
3. Examples.
4. Time-invariant linear filters.
5. Frequency response.

40 Review: Spectral density and spectral distribution function
If a time series {X_t} has autocovariance γ satisfying Σ_{h=−∞}^{∞} |γ(h)| < ∞, then we define its spectral density as
f(ν) = Σ_{h=−∞}^{∞} γ(h) e^{−2πiνh}
for −∞ < ν < ∞. We have
γ(h) = ∫_{−1/2}^{1/2} e^{2πiνh} f(ν) dν = ∫_{−1/2}^{1/2} e^{2πiνh} dF(ν),
where dF(ν) = f(ν) dν. f measures how the variance of X_t is distributed across the spectrum.

41 Review: Spectral density of a linear process
If X_t is a linear process, it can be written X_t = Σ_{i=0}^{∞} ψ_i W_{t−i} = ψ(B)W_t. Then
f(ν) = σ_w² |ψ(e^{−2πiν})|².
That is, the spectral density f(ν) of a linear process is σ_w² times the squared modulus of the ψ (MA(∞)) polynomial evaluated at the point e^{−2πiν} on the unit circle.

42 Spectral density of a linear process
For an ARMA(p,q), ψ(B) = θ(B)/φ(B), so
f(ν) = σ_w² θ(e^{2πiν}) θ(e^{−2πiν}) / (φ(e^{2πiν}) φ(e^{−2πiν})) = σ_w² |θ(e^{−2πiν})|² / |φ(e^{−2πiν})|².
This is known as a rational spectrum.

43 Rational spectra
Consider the factorization of θ and φ as
θ(z) = θ_q (z − z₁)(z − z₂) ··· (z − z_q),
φ(z) = φ_p (z − p₁)(z − p₂) ··· (z − p_p),
where z₁, ..., z_q and p₁, ..., p_p are called the zeros and poles. Then
f(ν) = σ_w² |θ_q|² Π_{j=1}^{q} |e^{−2πiν} − z_j|² / (|φ_p|² Π_{j=1}^{p} |e^{−2πiν} − p_j|²).

44 Rational spectra
f(ν) = σ_w² |θ_q|² Π_{j=1}^{q} |e^{−2πiν} − z_j|² / (|φ_p|² Π_{j=1}^{p} |e^{−2πiν} − p_j|²).
As ν varies from 0 to 1/2, e^{−2πiν} moves clockwise around the unit circle from 1 to e^{−πi} = −1. The value of f(ν) goes up as this point moves closer to the poles p_j, and goes down as it moves closer to the zeros z_j.

45 Example: ARMA
Recall AR(1): φ(z) = 1 − φ₁z. The pole is at 1/φ₁.
If φ₁ > 0, the pole is to the right of 1, so the spectral density decreases as ν moves away from 0.
If φ₁ < 0, the pole is to the left of −1, so the spectral density is at its maximum when ν = 0.5.
Recall MA(1): θ(z) = 1 + θ₁z. The zero is at −1/θ₁.
If θ₁ > 0, the zero is to the left of −1, so the spectral density decreases as ν moves towards 1/2.
If θ₁ < 0, the zero is to the right of 1, so the spectral density is at its minimum when ν = 0.

46 Example: AR(2)
Consider X_t = φ₁X_{t−1} + φ₂X_{t−2} + W_t. Example 4.6 in the text considers this model with φ₁ = 1, φ₂ = −0.9, and σ_w² = 1. In this case, the poles (the roots of φ(z) = 1 − z + 0.9z²) are at
p₁, p₂ ≈ 0.556 ± 0.896i ≈ 1.054 e^{±2πi(0.16)}.
Thus, we have
f(ν) = σ_w² / (φ₂² |e^{−2πiν} − p₁|² |e^{−2πiν} − p₂|²),
and this gets very peaked when e^{−2πiν} passes near 1.054 e^{−2πi(0.16)}.

47 Example: AR(2)
[Figure: spectral density of AR(2), X_t = X_{t−1} − 0.9 X_{t−2} + W_t, plotted as f(ν) against ν.]
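
A sketch (numpy assumed, σ_w² = 1) locating the poles of Example 4.6 with np.roots and finding the peak of f(ν) numerically:

    import numpy as np

    # AR(2) of Example 4.6: X_t = X_{t-1} - 0.9 X_{t-2} + W_t, phi(z) = 1 - z + 0.9 z^2.
    poles = np.roots([0.9, -1.0, 1.0])               # roots of 0.9 z^2 - z + 1
    print(np.abs(poles[0]))                          # ~ 1.054
    print(np.abs(np.angle(poles[0])) / (2 * np.pi))  # ~ 0.16 cycles

    nu = np.linspace(0, 0.5, 2001)
    z = np.exp(-2j * np.pi * nu)
    f = 1.0 / np.abs(1 - z + 0.9 * z ** 2) ** 2      # sigma_w^2 = 1
    print(nu[np.argmax(f)])                          # peak near nu ~ 0.16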

48 Example: Seasonal ARMA
Consider X_t = Φ₁X_{t−12} + W_t. Then
ψ(B) = 1 / (1 − Φ₁B¹²),
f(ν) = σ_w² / ((1 − Φ₁e^{−2πi12ν})(1 − Φ₁e^{2πi12ν})) = σ_w² / (1 − 2Φ₁cos(24πν) + Φ₁²).
Notice that f(ν) is periodic with period 1/12.

49 Example: Seasonal ARMA
[Figure: spectral density of AR(1)₁₂, X_t = +0.2 X_{t−12} + W_t, plotted as f(ν) against ν.]

50 Example: Seasonal ARMA
Another view: 1 − Φ₁z¹² = 0 ⇔ z = re^{iθ}, with r = |Φ₁|^{−1/12} and e^{i12θ} = e^{i arg(Φ₁)}.
For Φ₁ > 0, the twelve poles are at |Φ₁|^{−1/12} e^{ikπ/6} for k = 0, ±1, ..., ±5, 6. So the spectral density gets peaked as e^{−2πiν} passes near
|Φ₁|^{−1/12} {1, e^{±iπ/6}, e^{±iπ/3}, e^{±iπ/2}, e^{±i2π/3}, e^{±i5π/6}, −1}.

51 Example: Multiplicative seasonal ARMA
Consider (1 − Φ₁B¹²)(1 − φ₁B)X_t = W_t. Then
f(ν) = σ_w² / ((1 − 2Φ₁cos(24πν) + Φ₁²)(1 − 2φ₁cos(2πν) + φ₁²)).
This is a scaled product of the AR(1) spectrum and the (periodic) AR(1)₁₂ spectrum. The AR(1)₁₂ poles give peaks when e^{−2πiν} is at one of the 12th roots of 1; the AR(1) poles give a peak near e^{−2πiν} = 1.

52 Example: Multiplicative seasonal ARMA
[Figure: spectral density of AR(1)AR(1)₁₂, (1 + 0.5B)(1 + 0.2B¹²)X_t = W_t, plotted as f(ν) against ν.]
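
A sketch of the multiplicative spectral density above; the values φ₁ = −0.5, Φ₁ = −0.2 are the coefficients implied by the operators (1 + 0.5B)(1 + 0.2B¹²) in the figure title, and σ_w² = 1 is assumed.

    import numpy as np

    def seasonal_ar_spec(nu, phi1, Phi1, sigma2_w=1.0):
        # f(nu) = sigma_w^2 / ((1 - 2 Phi_1 cos(24 pi nu) + Phi_1^2)
        #                      (1 - 2 phi_1 cos(2 pi nu) + phi_1^2))
        return sigma2_w / ((1 - 2 * Phi1 * np.cos(24 * np.pi * nu) + Phi1 ** 2)
                           * (1 - 2 * phi1 * np.cos(2 * np.pi * nu) + phi1 ** 2))

    nu = np.linspace(0, 0.5, 6001)
    f = seasonal_ar_spec(nu, phi1=-0.5, Phi1=-0.2)
    print(nu[np.argmax(f)], f.max())   # location and height of the largest peak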

53 Introduction to Time Series Analysis. Lecture 17.
1. Review: Spectral distribution function, spectral density.
2. Rational spectra. Poles and zeros.
3. Examples.
4. Time-invariant linear filters.
5. Frequency response.

54 Time-invariant linear filters
A filter is an operator: given a time series {X_t}, it maps to a time series {Y_t}. We can think of a linear process X_t = Σ_{j=0}^{∞} ψ_j W_{t−j} as the output of a causal linear filter with a white noise input.
A time series {Y_t} is the output of a linear filter A = {a_{t,j} : t, j ∈ Z} with input {X_t} if
Y_t = Σ_{j=−∞}^{∞} a_{t,j} X_j.
If a_{t,t−j} is independent of t (a_{t,t−j} = ψ_j), then we say that the filter is time-invariant. If ψ_j = 0 for j < 0, we say the filter ψ is causal.
We'll see that the name filter arises from the frequency domain viewpoint.

55 Time-invariant linear filters: Examples
1. Y_t = X_{−t} is linear, but not time-invariant.
2. Y_t = (1/3)(X_{t−1} + X_t + X_{t+1}) is linear, time-invariant, but not causal:
ψ_j = 1/3 if |j| ≤ 1; 0 otherwise.
3. For polynomials φ(B), θ(B) with roots outside the unit circle, ψ(B) = θ(B)/φ(B) is a linear, time-invariant, causal filter.

56 Time-invariant linear filters
The operation
Σ_{j=−∞}^{∞} ψ_j X_{t−j}
is called the convolution of X with ψ.

57 Time-invariant linear filters
The sequence ψ is also called the impulse response, since the output {Y_t} of the linear filter in response to a unit impulse,
X_t = 1 if t = 0; 0 otherwise,
is
Y_t = ψ(B)X_t = Σ_{j=−∞}^{∞} ψ_j X_{t−j} = ψ_t.
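
A sketch of a causal time-invariant filter applied by convolution; the impulse response ψ = (1, 0.5, 0.25) is a hypothetical choice.

    import numpy as np

    psi = np.array([1.0, 0.5, 0.25])   # hypothetical impulse response psi_0, psi_1, psi_2
    x = np.zeros(8)
    x[0] = 1.0                         # unit impulse at t = 0

    y = np.convolve(x, psi)[:x.size]   # Y_t = sum_j psi_j X_{t-j}
    print(y)                           # the output recovers psi: [1, 0.5, 0.25, 0, ...]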

58 Introduction to Time Series Analysis. Lecture 17.
1. Review: Spectral distribution function, spectral density.
2. Rational spectra. Poles and zeros.
3. Examples.
4. Time-invariant linear filters.
5. Frequency response.

59 Frequency response of a time-invariant linear filter
Suppose that {X_t} has spectral density f_x(ν) and ψ is stable, that is, Σ_{j=−∞}^{∞} |ψ_j| < ∞. Then Y_t = ψ(B)X_t has spectral density
f_y(ν) = |ψ(e^{−2πiν})|² f_x(ν).
The function ν ↦ ψ(e^{−2πiν}) (the polynomial ψ(z) evaluated on the unit circle) is known as the frequency response or transfer function of the linear filter. The squared modulus, ν ↦ |ψ(e^{−2πiν})|², is known as the power transfer function of the filter.

60 Frequency response of a time-invariant linear filter
For stable ψ, Y_t = ψ(B)X_t has spectral density f_y(ν) = |ψ(e^{−2πiν})|² f_x(ν).
We have seen that a linear process, Y_t = ψ(B)W_t, is a special case, since f_y(ν) = |ψ(e^{−2πiν})|² σ_w² = |ψ(e^{−2πiν})|² f_w(ν).
When we pass a time series {X_t} through a linear filter, the spectral density is multiplied, frequency by frequency, by the squared modulus of the frequency response, ν ↦ |ψ(e^{−2πiν})|². This is a version of the equality Var(aX) = a² Var(X), but the equality is true for the component of the variance at every frequency. This is also the origin of the name filter.

61 Frequency response of a filter: Details
Why is f_y(ν) = |ψ(e^{−2πiν})|² f_x(ν)? First,
γ_y(h) = E[(Σ_{j=−∞}^{∞} ψ_j X_{t−j})(Σ_{k=−∞}^{∞} ψ_k X_{t+h−k})]
= Σ_{j=−∞}^{∞} ψ_j Σ_{k=−∞}^{∞} ψ_k E[X_{t+h−k} X_{t−j}]
= Σ_{j=−∞}^{∞} ψ_j Σ_{k=−∞}^{∞} ψ_k γ_x(h+j−k)
= Σ_{j=−∞}^{∞} ψ_j Σ_{l=−∞}^{∞} ψ_{h+j−l} γ_x(l).
It is easy to check that Σ_{j=−∞}^{∞} |ψ_j| < ∞ and Σ_{h=−∞}^{∞} |γ_x(h)| < ∞ imply that Σ_{h=−∞}^{∞} |γ_y(h)| < ∞. Thus, the spectral density of Y is defined.

62 Frequency response of a filter: Details
f_y(ν) = Σ_{h=−∞}^{∞} γ_y(h) e^{−2πiνh}
= Σ_{h=−∞}^{∞} Σ_{j=−∞}^{∞} ψ_j Σ_{l=−∞}^{∞} ψ_{h+j−l} γ_x(l) e^{−2πiνh}
= Σ_{j=−∞}^{∞} ψ_j e^{2πiνj} Σ_{l=−∞}^{∞} γ_x(l) e^{−2πiνl} Σ_{h=−∞}^{∞} ψ_{h+j−l} e^{−2πiν(h+j−l)}
= ψ(e^{2πiν}) f_x(ν) ψ(e^{−2πiν})
= |ψ(e^{−2πiν})|² f_x(ν).

63 Frequency response: Examples
For a linear process Y_t = ψ(B)W_t, f_y(ν) = |ψ(e^{−2πiν})|² σ_w².
For an ARMA model, ψ(B) = θ(B)/φ(B), so {Y_t} has the rational spectrum
f_y(ν) = σ_w² |θ(e^{−2πiν})|² / |φ(e^{−2πiν})|²
= σ_w² |θ_q|² Π_{j=1}^{q} |e^{−2πiν} − z_j|² / (|φ_p|² Π_{j=1}^{p} |e^{−2πiν} − p_j|²),
where the p_j and z_j are the poles and zeros of the rational function z ↦ θ(z)/φ(z).

64 Frequency response: Examples
Consider the moving average
Y_t = (1/(2k+1)) Σ_{j=−k}^{k} X_{t−j}.
This is a time-invariant linear filter (but it is not causal). Its transfer function is the Dirichlet kernel
ψ(e^{−2πiν}) = D_k(2πν) = (1/(2k+1)) Σ_{j=−k}^{k} e^{−2πijν}
= 1 if ν = 0; sin(2π(k+1/2)ν) / ((2k+1) sin(πν)) otherwise.

65 Example: Moving average
[Figure: transfer function of the moving average filter (k = 5), plotted against ν.]

66 Example: Moving average
[Figure: squared modulus of the transfer function of the moving average filter (k = 5), plotted against ν.]
This is a low-pass filter: it preserves low frequencies and diminishes high frequencies. It is often used to estimate a monotonic trend component of a series.

67 Example: Differencing
Consider the first difference Y_t = (1 − B)X_t. This is a time-invariant, causal, linear filter. Its transfer function is
ψ(e^{−2πiν}) = 1 − e^{−2πiν},
so
|ψ(e^{−2πiν})|² = 2(1 − cos(2πν)).

68 Example: Differencing
[Figure: transfer function of the first difference, plotted against ν.]
This is a high-pass filter: it preserves high frequencies and diminishes low frequencies. It is often used to eliminate a trend component of a series.
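
A sketch comparing the two power transfer functions near ν = 0 and ν = 1/2 (k = 5, as in the figures; the evaluation points are arbitrary):

    import numpy as np

    k = 5
    nu = np.array([0.001, 0.499])   # near 0 and near 1/2

    # Moving-average smoother: Dirichlet kernel (low-pass).
    D = np.sin(2 * np.pi * (k + 0.5) * nu) / ((2 * k + 1) * np.sin(np.pi * nu))
    # First difference 1 - B (high-pass).
    H = 2 * (1 - np.cos(2 * np.pi * nu))

    print(np.round(D ** 2, 3))   # ~ [1, 0]: passes low, blocks high frequencies
    print(np.round(H, 3))        # ~ [0, 4]: blocks low, passes high frequencies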

69 Introduction to Time Series Analysis. Lecture 18.
1. Review: Spectral density, rational spectra, linear filters.
2. Frequency response of linear filters.
3. Spectral estimation.
4. Sample autocovariance.
5. Discrete Fourier transform and the periodogram.

70 Review: Spectral density
If a time series {X_t} has autocovariance γ satisfying Σ_{h=−∞}^{∞} |γ(h)| < ∞, then we define its spectral density as
f(ν) = Σ_{h=−∞}^{∞} γ(h) e^{−2πiνh}
for −∞ < ν < ∞. We have
γ(h) = ∫_{−1/2}^{1/2} e^{2πiνh} f(ν) dν.

71 Review: Rational spectra
For a linear time series with MA(∞) polynomial ψ,
f(ν) = σ_w² |ψ(e^{−2πiν})|².
If it is an ARMA(p,q), we have
f(ν) = σ_w² |θ(e^{−2πiν})|² / |φ(e^{−2πiν})|²
= σ_w² |θ_q|² Π_{j=1}^{q} |e^{−2πiν} − z_j|² / (|φ_p|² Π_{j=1}^{p} |e^{−2πiν} − p_j|²),
where z₁, ..., z_q are the zeros (roots of θ(z)) and p₁, ..., p_p are the poles (roots of φ(z)).

72 Review: Time-invariant linear filters
A filter is an operator: given a time series {X_t}, it maps to a time series {Y_t}. A linear filter satisfies
Y_t = Σ_{j=−∞}^{∞} a_{t,j} X_j.
Time-invariant (a_{t,t−j} = ψ_j): Y_t = Σ_{j=−∞}^{∞} ψ_j X_{t−j}.
Causal (j < 0 implies ψ_j = 0): Y_t = Σ_{j=0}^{∞} ψ_j X_{t−j}.

73 Time-invariant linear filters
The operation
Σ_{j=−∞}^{∞} ψ_j X_{t−j}
is called the convolution of X with ψ.

74 Time-invariant linear filters
The sequence ψ is also called the impulse response, since the output {Y_t} of the linear filter in response to a unit impulse,
X_t = 1 if t = 0; 0 otherwise,
is
Y_t = ψ(B)X_t = Σ_{j=−∞}^{∞} ψ_j X_{t−j} = ψ_t.

75 Introduction to Time Series Analysis. Lecture 18.
1. Review: Spectral density, rational spectra, linear filters.
2. Frequency response of linear filters.
3. Spectral estimation.
4. Sample autocovariance.
5. Discrete Fourier transform and the periodogram.

76 Frequency response of a time-invariant linear filter
Suppose that {X_t} has spectral density f_x(ν) and ψ is stable, that is, Σ_{j=−∞}^{∞} |ψ_j| < ∞. Then Y_t = ψ(B)X_t has spectral density
f_y(ν) = |ψ(e^{−2πiν})|² f_x(ν).
The function ν ↦ ψ(e^{−2πiν}) (the polynomial ψ(z) evaluated on the unit circle) is known as the frequency response or transfer function of the linear filter. The squared modulus, ν ↦ |ψ(e^{−2πiν})|², is known as the power transfer function of the filter.

77 Frequency response of a time-invariant linear filter
For stable ψ, Y_t = ψ(B)X_t has spectral density f_y(ν) = |ψ(e^{−2πiν})|² f_x(ν).
We have seen that a linear process, Y_t = ψ(B)W_t, is a special case, since f_y(ν) = |ψ(e^{−2πiν})|² σ_w² = |ψ(e^{−2πiν})|² f_w(ν).
When we pass a time series {X_t} through a linear filter, the spectral density is multiplied, frequency by frequency, by the squared modulus of the frequency response, ν ↦ |ψ(e^{−2πiν})|². This is a version of the equality Var(aX) = a² Var(X), but the equality is true for the component of the variance at every frequency. This is also the origin of the name filter.

78 Frequency response of a filter: Details
Why is f_y(ν) = |ψ(e^{−2πiν})|² f_x(ν)? First,
γ_y(h) = E[(Σ_{j=−∞}^{∞} ψ_j X_{t−j})(Σ_{k=−∞}^{∞} ψ_k X_{t+h−k})]
= Σ_{j=−∞}^{∞} ψ_j Σ_{k=−∞}^{∞} ψ_k E[X_{t+h−k} X_{t−j}]
= Σ_{j=−∞}^{∞} ψ_j Σ_{k=−∞}^{∞} ψ_k γ_x(h+j−k)
= Σ_{j=−∞}^{∞} ψ_j Σ_{l=−∞}^{∞} ψ_{h+j−l} γ_x(l).
It is easy to check that Σ_{j=−∞}^{∞} |ψ_j| < ∞ and Σ_{h=−∞}^{∞} |γ_x(h)| < ∞ imply that Σ_{h=−∞}^{∞} |γ_y(h)| < ∞. Thus, the spectral density of Y is defined.

79 Frequency response of a filter: Details
f_y(ν) = Σ_{h=−∞}^{∞} γ_y(h) e^{−2πiνh}
= Σ_{h=−∞}^{∞} Σ_{j=−∞}^{∞} ψ_j Σ_{l=−∞}^{∞} ψ_{h+j−l} γ_x(l) e^{−2πiνh}
= Σ_{j=−∞}^{∞} ψ_j e^{2πiνj} Σ_{l=−∞}^{∞} γ_x(l) e^{−2πiνl} Σ_{h=−∞}^{∞} ψ_{h+j−l} e^{−2πiν(h+j−l)}
= ψ(e^{2πiν}) f_x(ν) ψ(e^{−2πiν})
= |ψ(e^{−2πiν})|² f_x(ν).

80 Frequency response: Examples
For a linear process Y_t = ψ(B)W_t, f_y(ν) = |ψ(e^{−2πiν})|² σ_w².
For an ARMA model, ψ(B) = θ(B)/φ(B), so {Y_t} has the rational spectrum
f_y(ν) = σ_w² |θ(e^{−2πiν})|² / |φ(e^{−2πiν})|²
= σ_w² |θ_q|² Π_{j=1}^{q} |e^{−2πiν} − z_j|² / (|φ_p|² Π_{j=1}^{p} |e^{−2πiν} − p_j|²),
where the p_j and z_j are the poles and zeros of the rational function z ↦ θ(z)/φ(z).

81 Frequency response: Examples
Consider the moving average
Y_t = (1/(2k+1)) Σ_{j=−k}^{k} X_{t−j}.
This is a time-invariant linear filter (but it is not causal). Its transfer function is the Dirichlet kernel
ψ(e^{−2πiν}) = D_k(2πν) = (1/(2k+1)) Σ_{j=−k}^{k} e^{−2πijν}
= 1 if ν = 0; sin(2π(k+1/2)ν) / ((2k+1) sin(πν)) otherwise.

82 Example: Moving average
[Figure: transfer function of the moving average filter (k = 5), plotted against ν.]

83 Example: Moving average
[Figure: squared modulus of the transfer function of the moving average filter (k = 5), plotted against ν.]
This is a low-pass filter: it preserves low frequencies and diminishes high frequencies. It is often used to estimate a monotonic trend component of a series.

84 Example: Differencing
Consider the first difference Y_t = (1 − B)X_t. This is a time-invariant, causal, linear filter. Its transfer function is
ψ(e^{−2πiν}) = 1 − e^{−2πiν},
so
|ψ(e^{−2πiν})|² = 2(1 − cos(2πν)).

85 Example: Differencing
[Figure: transfer function of the first difference, plotted against ν.]
This is a high-pass filter: it preserves high frequencies and diminishes low frequencies. It is often used to eliminate a trend component of a series.

86 Introduction to Time Series Analysis. Lecture 18.
1. Review: Spectral density, rational spectra, linear filters.
2. Frequency response of linear filters.
3. Spectral estimation.
4. Sample autocovariance.
5. Discrete Fourier transform and the periodogram.

87 Estimating the Spectrum: Outline
We have seen that the spectral density gives an alternative view of stationary time series. Given a realization x₁, ..., x_n of a time series, how can we estimate the spectral density?
One approach: replace γ(·) in the definition
f(ν) = Σ_{h=−∞}^{∞} γ(h) e^{−2πiνh}
with the sample autocovariance γ̂(·).
Another approach, called the periodogram: compute I(ν), the squared modulus of the discrete Fourier transform (at frequencies ν = k/n).

88 Estimating the spectrum: Outline
These two approaches are identical at the Fourier frequencies ν = k/n.
The asymptotic expectation of the periodogram I(ν) is f(ν). We can derive some asymptotic properties, and hence do hypothesis testing.
Unfortunately, the asymptotic variance of I(ν) is constant: it is not a consistent estimator of f(ν).
We can reduce the variance by smoothing the periodogram: averaging over adjacent frequencies. If we average over a narrower range as n → ∞, we can obtain a consistent estimator of the spectral density.

89 Estimating the spectrum: Sample autocovariance
Idea: use the sample autocovariance γ̂(·), defined by
γ̂(h) = (1/n) Σ_{t=1}^{n−|h|} (x_{t+|h|} − x̄)(x_t − x̄), for −n < h < n,
as an estimate of the autocovariance γ(·), and then use a sample version of
f(ν) = Σ_{h=−∞}^{∞} γ(h) e^{−2πiνh}.
That is, for −1/2 ≤ ν ≤ 1/2, estimate f(ν) with
f̂(ν) = Σ_{h=−n+1}^{n−1} γ̂(h) e^{−2πiνh}.
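
A direct sketch of this estimator (numpy assumed); for white noise the estimate should be noisy but centered near the flat density f(ν) = 1.

    import numpy as np

    def sample_acov(x, h):
        # gamma_hat(h) = (1/n) sum_{t=1}^{n-|h|} (x_{t+|h|} - xbar)(x_t - xbar)
        n, h, xbar = len(x), abs(h), np.mean(x)
        return np.sum((x[h:] - xbar) * (x[:n - h] - xbar)) / n

    def f_hat(x, nu):
        # Plug gamma_hat into the definition of the spectral density.
        n = len(x)
        hs = np.arange(-n + 1, n)
        g = np.array([sample_acov(x, h) for h in hs])
        return np.sum(g * np.exp(-2j * np.pi * nu * hs)).real

    x = np.random.default_rng(1).normal(size=512)   # white noise: f(nu) = 1
    print(f_hat(x, 0.2))                            # noisy estimate near 1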

90 Estimating the spectrum: Periodogram
Another approach to estimating the spectrum is called the periodogram. It was proposed in 1897 by Arthur Schuster (at Owens College, which later became part of the University of Manchester), who used it to investigate periodicity in the occurrence of earthquakes, and in sunspot activity.
Arthur Schuster, "On Lunar and Solar Periodicities of Earthquakes," Proceedings of the Royal Society of London, Vol. 61 (1897).
To define the periodogram, we need to introduce the discrete Fourier transform of a finite sequence x₁, ..., x_n.

91 Introduction to Time Series Analysis. Lecture 18.
1. Review: Spectral density, rational spectra, linear filters.
2. Frequency response of linear filters.
3. Spectral estimation.
4. Sample autocovariance.
5. Discrete Fourier transform and the periodogram.

92 Discrete Fourier transform
For a sequence (x₁, ..., x_n), define the discrete Fourier transform (DFT) as (X(ν₀), X(ν₁), ..., X(ν_{n−1})), where
X(ν_k) = (1/√n) Σ_{t=1}^{n} x_t e^{−2πiν_k t},
and ν_k = k/n (for k = 0, 1, ..., n−1) are called the Fourier frequencies. (Think of {ν_k : k = 0, ..., n−1} as the discrete version of the frequency range ν ∈ [0,1].)
First, let's show that we can view the DFT as a representation of x in a different basis, the Fourier basis.

93 Discrete Fourier transform
Consider the space C^n of vectors of n complex numbers, with inner product ⟨a, b⟩ = a*b, where a* is the complex conjugate transpose of the vector a ∈ C^n.
Suppose that a set {φ_j : j = 0, 1, ..., n−1} of n vectors in C^n are orthonormal:
⟨φ_j, φ_k⟩ = 1 if j = k; 0 otherwise.
Then these {φ_j} span the vector space C^n, and so for any vector x, we can write x in terms of this new orthonormal basis:
x = Σ_{j=0}^{n−1} ⟨φ_j, x⟩ φ_j. (picture)

94 Discrete Fourier transform
Consider the following set of n vectors in C^n:
{ e_j = (1/√n)(e^{2πiν_j}, e^{2πi2ν_j}, ..., e^{2πinν_j}) : j = 0, ..., n−1 }.
It is easy to check that these vectors are orthonormal:
⟨e_j, e_k⟩ = (1/n) Σ_{t=1}^{n} e^{2πit(ν_k−ν_j)} = (1/n) Σ_{t=1}^{n} (e^{2πi(k−j)/n})^t
= 1 if j = k;
(1/n) e^{2πi(k−j)/n} (1 − (e^{2πi(k−j)/n})^n) / (1 − e^{2πi(k−j)/n}) = 0 otherwise,

95 Discrete Fourier transform
where we have used the fact that S_n = Σ_{t=1}^{n} α^t satisfies αS_n = S_n + α^{n+1} − α, and so S_n = α(1 − α^n)/(1 − α) for α ≠ 1.
So we can represent the real vector x = (x₁, ..., x_n)' ∈ C^n in terms of this orthonormal basis:
x = Σ_{j=0}^{n−1} ⟨e_j, x⟩ e_j = Σ_{j=0}^{n−1} X(ν_j) e_j.
That is, the vector of discrete Fourier transform coefficients (X(ν₀), ..., X(ν_{n−1})) is the representation of x in the Fourier basis.

96 Discrete Fourier transform
An alternative way to represent the DFT is by separately considering the real and imaginary parts,
X(ν_j) = ⟨e_j, x⟩ = (1/√n) Σ_{t=1}^{n} e^{−2πitν_j} x_t
= (1/√n) Σ_{t=1}^{n} cos(2πtν_j) x_t − i (1/√n) Σ_{t=1}^{n} sin(2πtν_j) x_t
= X_c(ν_j) − i X_s(ν_j),
where this defines the sine and cosine transforms, X_s and X_c, of x.
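
A sketch computing the DFT with this normalization (t running from 1 to n, scaling 1/√n) and checking both X = X_c − iX_s and the preservation of the squared norm by the orthonormal basis:

    import numpy as np

    x = np.random.default_rng(2).normal(size=16)
    n = len(x)
    t = np.arange(1, n + 1)       # t = 1, ..., n as in the definition
    nus = np.arange(n) / n        # Fourier frequencies k/n

    X = np.array([np.sum(x * np.exp(-2j * np.pi * nu * t)) for nu in nus]) / np.sqrt(n)
    Xc = np.array([np.sum(x * np.cos(2 * np.pi * nu * t)) for nu in nus]) / np.sqrt(n)
    Xs = np.array([np.sum(x * np.sin(2 * np.pi * nu * t)) for nu in nus]) / np.sqrt(n)

    print(np.allclose(X, Xc - 1j * Xs))                         # X = X_c - i X_s
    print(np.allclose(np.sum(np.abs(X) ** 2), np.sum(x ** 2)))  # norm preserved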

97 Introduction to Time Series Analysis. Lecture 19.
1. Review: Spectral density estimation, sample autocovariance.
2. The periodogram and sample autocovariance.
3. Asymptotics of the periodogram.

98 Estimating the Spectrum: Outline
We have seen that the spectral density gives an alternative view of stationary time series. Given a realization x₁, ..., x_n of a time series, how can we estimate the spectral density?
One approach: replace γ(·) in the definition
f(ν) = Σ_{h=−∞}^{∞} γ(h) e^{−2πiνh}
with the sample autocovariance γ̂(·).
Another approach, called the periodogram: compute I(ν), the squared modulus of the discrete Fourier transform (at frequencies ν = k/n).

99 Estimating the spectrum: Outline
These two approaches are identical at the Fourier frequencies ν = k/n.
The asymptotic expectation of the periodogram I(ν) is f(ν). We can derive some asymptotic properties, and hence do hypothesis testing.
Unfortunately, the asymptotic variance of I(ν) is constant: it is not a consistent estimator of f(ν).

100 Review: Spectral density estimation
If a time series {X_t} has autocovariance γ satisfying Σ_{h=−∞}^{∞} |γ(h)| < ∞, then we define its spectral density as
f(ν) = Σ_{h=−∞}^{∞} γ(h) e^{−2πiνh}
for −∞ < ν < ∞.

101 Review: Sample autocovariance
Idea: use the sample autocovariance γ̂(·), defined by
γ̂(h) = (1/n) Σ_{t=1}^{n−|h|} (x_{t+|h|} − x̄)(x_t − x̄), for −n < h < n,
as an estimate of the autocovariance γ(·), and then use
f̂(ν) = Σ_{h=−n+1}^{n−1} γ̂(h) e^{−2πiνh}
for −1/2 ≤ ν ≤ 1/2.

102 Discrete Fourier transform
For a sequence (x₁, ..., x_n), define the discrete Fourier transform (DFT) as (X(ν₀), X(ν₁), ..., X(ν_{n−1})), where
X(ν_k) = (1/√n) Σ_{t=1}^{n} x_t e^{−2πiν_k t},
and ν_k = k/n (for k = 0, 1, ..., n−1) are called the Fourier frequencies. (Think of {ν_k : k = 0, ..., n−1} as the discrete version of the frequency range ν ∈ [0,1].)
First, let's show that we can view the DFT as a representation of x in a different basis, the Fourier basis.

103 Discrete Fourier transform
Consider the space C^n of vectors of n complex numbers, with inner product ⟨a, b⟩ = a*b, where a* is the complex conjugate transpose of the vector a ∈ C^n.
Suppose that a set {φ_j : j = 0, 1, ..., n−1} of n vectors in C^n are orthonormal:
⟨φ_j, φ_k⟩ = 1 if j = k; 0 otherwise.
Then these {φ_j} span the vector space C^n, and so for any vector x, we can write x in terms of this new orthonormal basis:
x = Σ_{j=0}^{n−1} ⟨φ_j, x⟩ φ_j. (picture)

104 Discrete Fourier transform
Consider the following set of n vectors in C^n:
{ e_j = (1/√n)(e^{2πiν_j}, e^{2πi2ν_j}, ..., e^{2πinν_j}) : j = 0, ..., n−1 }.
It is easy to check that these vectors are orthonormal:
⟨e_j, e_k⟩ = (1/n) Σ_{t=1}^{n} e^{2πit(ν_k−ν_j)} = (1/n) Σ_{t=1}^{n} (e^{2πi(k−j)/n})^t
= 1 if j = k;
(1/n) e^{2πi(k−j)/n} (1 − (e^{2πi(k−j)/n})^n) / (1 − e^{2πi(k−j)/n}) = 0 otherwise,

105 Discrete Fourier transform
where we have used the fact that S_n = Σ_{t=1}^{n} α^t satisfies αS_n = S_n + α^{n+1} − α, and so S_n = α(1 − α^n)/(1 − α) for α ≠ 1.
So we can represent the real vector x = (x₁, ..., x_n)' ∈ C^n in terms of this orthonormal basis:
x = Σ_{j=0}^{n−1} ⟨e_j, x⟩ e_j = Σ_{j=0}^{n−1} X(ν_j) e_j.
That is, the vector of discrete Fourier transform coefficients (X(ν₀), ..., X(ν_{n−1})) is the representation of x in the Fourier basis.

106 Discrete Fourier transform
An alternative way to represent the DFT is by separately considering the real and imaginary parts,
X(ν_j) = ⟨e_j, x⟩ = (1/√n) Σ_{t=1}^{n} e^{−2πitν_j} x_t
= (1/√n) Σ_{t=1}^{n} cos(2πtν_j) x_t − i (1/√n) Σ_{t=1}^{n} sin(2πtν_j) x_t
= X_c(ν_j) − i X_s(ν_j),
where this defines the sine and cosine transforms, X_s and X_c, of x.

107 Periodogram
The periodogram is defined as
I(ν_j) = |X(ν_j)|² = (1/n) |Σ_{t=1}^{n} e^{−2πitν_j} x_t|² = X_c²(ν_j) + X_s²(ν_j),
where
X_c(ν_j) = (1/√n) Σ_{t=1}^{n} cos(2πtν_j) x_t,
X_s(ν_j) = (1/√n) Σ_{t=1}^{n} sin(2πtν_j) x_t.

108 Periodogram
Since I(ν_j) = |X(ν_j)|² for one of the Fourier frequencies ν_j = j/n (for j = 0, 1, ..., n−1), the orthonormality of the e_j implies that we can write
x*x = (Σ_{j=0}^{n−1} X(ν_j) e_j)* (Σ_{j=0}^{n−1} X(ν_j) e_j) = Σ_{j=0}^{n−1} |X(ν_j)|² = Σ_{j=0}^{n−1} I(ν_j).
For x̄ = 0, we can write this as
σ̂_x² = (1/n) Σ_{t=1}^{n} x_t² = (1/n) Σ_{j=0}^{n−1} I(ν_j).

109 Periodogram
This is the discrete analog of the identity
σ_x² = γ_x(0) = ∫_{−1/2}^{1/2} f_x(ν) dν.
(Think of I(ν_j) as the discrete version of f(ν) at the frequency ν_j = j/n, and think of (1/n) Σ_{ν_j} as the discrete version of ∫ dν.)

110 Estimating the spectrum: Periodogram
Why is the periodogram at a Fourier frequency (that is, ν = ν_j) the same as computing f̂(ν) from the sample autocovariance? Almost the same: they are not the same at ν₀ = 0 when x̄ ≠ 0. But if either x̄ = 0, or we consider a Fourier frequency ν_j with j ∈ {1, ..., n−1}, ...

111 Estimating the spectrum: Periodogram
I(ν_j) = (1/n) |Σ_{t=1}^{n} e^{−2πitν_j} x_t|² = (1/n) |Σ_{t=1}^{n} e^{−2πitν_j} (x_t − x̄)|²
= (1/n) (Σ_{t=1}^{n} e^{−2πitν_j} (x_t − x̄)) (Σ_{t=1}^{n} e^{2πitν_j} (x_t − x̄))
= (1/n) Σ_{s,t} e^{−2πi(s−t)ν_j} (x_s − x̄)(x_t − x̄)
= Σ_{h=−n+1}^{n−1} γ̂(h) e^{−2πihν_j},
where the fact that ν_j ≠ 0 implies Σ_{t=1}^{n} e^{−2πitν_j} = 0 (we showed this when we were verifying the orthonormality of the Fourier basis) has allowed us to subtract the sample mean in that case.
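
A numerical check of this identity at a nonzero Fourier frequency (numpy assumed; the choice ν_k = k/n with k = 7 is arbitrary):

    import numpy as np

    x = np.random.default_rng(3).normal(size=64)
    n = len(x)
    t = np.arange(1, n + 1)
    k = 7
    nu_k = k / n                  # a nonzero Fourier frequency

    # Periodogram ordinate from the DFT.
    I = np.abs(np.sum(x * np.exp(-2j * np.pi * nu_k * t))) ** 2 / n

    # The same value from the sample autocovariance.
    xbar = x.mean()
    hs = np.arange(-n + 1, n)
    g = np.array([np.sum((x[abs(h):] - xbar) * (x[:n - abs(h)] - xbar)) / n for h in hs])
    print(np.allclose(I, np.sum(g * np.exp(-2j * np.pi * nu_k * hs)).real))  # True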

112 Asymptotic properties of the periodogram
We want to understand the asymptotic behavior of the periodogram I(ν) at a particular frequency ν, as n increases. We'll see that its expectation converges to f(ν).
We'll start with a simple example: suppose that X₁, ..., X_n are i.i.d. N(0, σ²) (Gaussian white noise). From the definitions
X_c(ν_j) = (1/√n) Σ_{t=1}^{n} cos(2πtν_j) x_t, X_s(ν_j) = (1/√n) Σ_{t=1}^{n} sin(2πtν_j) x_t,
we have that X_c(ν_j) and X_s(ν_j) are normal, with
E X_c(ν_j) = E X_s(ν_j) = 0.

113 Asymptotic properties of the periodogram
Also,
Var(X_c(ν_j)) = (σ²/n) Σ_{t=1}^{n} cos²(2πtν_j) = (σ²/2n) Σ_{t=1}^{n} (cos(4πtν_j) + 1) = σ²/2.
Similarly, Var(X_s(ν_j)) = σ²/2.

114 Asymptotic properties of the periodogram
Also,
Cov(X_c(ν_j), X_s(ν_j)) = (σ²/n) Σ_{t=1}^{n} cos(2πtν_j) sin(2πtν_j) = (σ²/2n) Σ_{t=1}^{n} sin(4πtν_j) = 0,
Cov(X_c(ν_j), X_c(ν_k)) = 0, Cov(X_s(ν_j), X_s(ν_k)) = 0, Cov(X_c(ν_j), X_s(ν_k)) = 0,
for any j ≠ k.

115 Asymptotic properties of the periodogram
That is, if X₁, ..., X_n are i.i.d. N(0, σ²) (Gaussian white noise; f(ν) = σ²), then the X_c(ν_j) and X_s(ν_j) are all i.i.d. N(0, σ²/2). Thus,
(2/σ²) I(ν_j) = (2/σ²)(X_c²(ν_j) + X_s²(ν_j)) ~ χ²_2.
So for the case of Gaussian white noise, the periodogram has a chi-squared distribution that depends on the variance σ² (which, in this case, is the spectral density).

116 Asymptotic properties of the periodogram
Under more general conditions (e.g., normal {X_t}, or linear process {X_t} with rapidly decaying ACF), the X_c(ν_j), X_s(ν_j) are all asymptotically independent and N(0, f(ν_j)/2).
Consider a frequency ν. For a given value of n, let ν̂^(n) be the closest Fourier frequency (that is, ν̂^(n) = j/n for a value of j that minimizes |ν − j/n|). As n increases, ν̂^(n) → ν, and (under the same conditions that ensure the asymptotic normality and independence of the sine/cosine transforms) f(ν̂^(n)) → f(ν). (picture)
In that case, we have
(2/f(ν)) I(ν̂^(n)) = (2/f(ν))(X_c²(ν̂^(n)) + X_s²(ν̂^(n))) →_d χ²_2.

117 Asymptotic properties of the periodogram
Thus,
E I(ν̂^(n)) = (f(ν)/2) E[(2/f(ν))(X_c²(ν̂^(n)) + X_s²(ν̂^(n)))] → (f(ν)/2) E(Z₁² + Z₂²) = f(ν),
where Z₁, Z₂ are independent N(0,1). Thus, the periodogram is asymptotically unbiased.
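
A simulation sketch (Gaussian white noise, numpy) illustrating E I(ν_k) ≈ σ² and Var(2I(ν_k)/σ²) ≈ 4, the variance of a χ²_2 random variable; the sample size, replication count, and frequency index are arbitrary choices.

    import numpy as np

    rng = np.random.default_rng(4)
    sigma2, n, reps = 2.0, 256, 4000
    k = 32                                  # a fixed Fourier frequency nu_k = k/n
    t = np.arange(1, n + 1)
    e = np.exp(-2j * np.pi * (k / n) * t)

    # Periodogram of Gaussian white noise at nu_k, over many replications.
    X = rng.normal(0.0, np.sqrt(sigma2), (reps, n))
    I = np.abs(X @ e) ** 2 / n

    print(I.mean())                 # approx f(nu) = sigma^2 = 2
    print(np.var(2 * I / sigma2))   # approx Var(chi^2_2) = 4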


More information

Practical Spectral Estimation

Practical Spectral Estimation Digital Signal Processing/F.G. Meyer Lecture 4 Copyright 2015 François G. Meyer. All Rights Reserved. Practical Spectral Estimation 1 Introduction The goal of spectral estimation is to estimate how the

More information

Discrete time processes

Discrete time processes Discrete time processes Predictions are difficult. Especially about the future Mark Twain. Florian Herzog 2013 Modeling observed data When we model observed (realized) data, we encounter usually the following

More information

Econometría 2: Análisis de series de Tiempo

Econometría 2: Análisis de series de Tiempo Econometría 2: Análisis de series de Tiempo Karoll GOMEZ kgomezp@unal.edu.co http://karollgomez.wordpress.com Segundo semestre 2016 III. Stationary models 1 Purely random process 2 Random walk (non-stationary)

More information

Figure 18: Top row: example of a purely continuous spectrum (left) and one realization

Figure 18: Top row: example of a purely continuous spectrum (left) and one realization 1..5 S(). -.2 -.5 -.25..25.5 64 128 64 128 16 32 requency time time Lag 1..5 S(). -.5-1. -.5 -.1.1.5 64 128 64 128 16 32 requency time time Lag Figure 18: Top row: example o a purely continuous spectrum

More information

3 Theory of stationary random processes

3 Theory of stationary random processes 3 Theory of stationary random processes 3.1 Linear filters and the General linear process A filter is a transformation of one random sequence {U t } into another, {Y t }. A linear filter is a transformation

More information

Parametric Signal Modeling and Linear Prediction Theory 1. Discrete-time Stochastic Processes

Parametric Signal Modeling and Linear Prediction Theory 1. Discrete-time Stochastic Processes Parametric Signal Modeling and Linear Prediction Theory 1. Discrete-time Stochastic Processes Electrical & Computer Engineering North Carolina State University Acknowledgment: ECE792-41 slides were adapted

More information

IDENTIFICATION OF ARMA MODELS

IDENTIFICATION OF ARMA MODELS IDENTIFICATION OF ARMA MODELS A stationary stochastic process can be characterised, equivalently, by its autocovariance function or its partial autocovariance function. It can also be characterised by

More information

Multivariate Time Series

Multivariate Time Series Multivariate Time Series Notation: I do not use boldface (or anything else) to distinguish vectors from scalars. Tsay (and many other writers) do. I denote a multivariate stochastic process in the form

More information

Some Time-Series Models

Some Time-Series Models Some Time-Series Models Outline 1. Stochastic processes and their properties 2. Stationary processes 3. Some properties of the autocorrelation function 4. Some useful models Purely random processes, random

More information

ESSE Mid-Term Test 2017 Tuesday 17 October :30-09:45

ESSE Mid-Term Test 2017 Tuesday 17 October :30-09:45 ESSE 4020 3.0 - Mid-Term Test 207 Tuesday 7 October 207. 08:30-09:45 Symbols have their usual meanings. All questions are worth 0 marks, although some are more difficult than others. Answer as many questions

More information

ARMA Models: I VIII 1

ARMA Models: I VIII 1 ARMA Models: I autoregressive moving-average (ARMA) processes play a key role in time series analysis for any positive integer p & any purely nondeterministic process {X t } with ACVF { X (h)}, there is

More information

LECTURES 2-3 : Stochastic Processes, Autocorrelation function. Stationarity.

LECTURES 2-3 : Stochastic Processes, Autocorrelation function. Stationarity. LECTURES 2-3 : Stochastic Processes, Autocorrelation function. Stationarity. Important points of Lecture 1: A time series {X t } is a series of observations taken sequentially over time: x t is an observation

More information

Nonparametric and Parametric Defined This text distinguishes between systems and the sequences (processes) that result when a WN input is applied

Nonparametric and Parametric Defined This text distinguishes between systems and the sequences (processes) that result when a WN input is applied Linear Signal Models Overview Introduction Linear nonparametric vs. parametric models Equivalent representations Spectral flatness measure PZ vs. ARMA models Wold decomposition Introduction Many researchers

More information

Time Series 2. Robert Almgren. Sept. 21, 2009

Time Series 2. Robert Almgren. Sept. 21, 2009 Time Series 2 Robert Almgren Sept. 21, 2009 This week we will talk about linear time series models: AR, MA, ARMA, ARIMA, etc. First we will talk about theory and after we will talk about fitting the models

More information

Advanced Econometrics

Advanced Econometrics Advanced Econometrics Marco Sunder Nov 04 2010 Marco Sunder Advanced Econometrics 1/ 25 Contents 1 2 3 Marco Sunder Advanced Econometrics 2/ 25 Music Marco Sunder Advanced Econometrics 3/ 25 Music Marco

More information

Part II. Time Series

Part II. Time Series Part II Time Series 12 Introduction This Part is mainly a summary of the book of Brockwell and Davis (2002). Additionally the textbook Shumway and Stoffer (2010) can be recommended. 1 Our purpose is to

More information

A Tutorial on Stochastic Models and Statistical Analysis for Frequency Stability Measurements

A Tutorial on Stochastic Models and Statistical Analysis for Frequency Stability Measurements A Tutorial on Stochastic Models and Statistical Analysis for Frequency Stability Measurements Don Percival Applied Physics Lab, University of Washington, Seattle overheads for talk available at http://staff.washington.edu/dbp/talks.html

More information

MATH 5075: Time Series Analysis

MATH 5075: Time Series Analysis NAME: MATH 5075: Time Series Analysis Final For the entire test {Z t } WN(0, 1)!!! 1 1) Let {Y t, t Z} be a stationary time series with EY t = 0 and autocovariance function γ Y (h). Assume that a) Show

More information

Notes on the Book: Time Series Analysis: Forecasting and Control by George E. P. Box and Gwilym M. Jenkins

Notes on the Book: Time Series Analysis: Forecasting and Control by George E. P. Box and Gwilym M. Jenkins Notes on the Book: Time Series Analysis: Forecasting and Control by George E. P. Box and Gwilym M. Jenkins John L. Weatherwax June 20, 2008 Introduction Here you ll find some notes that I wrote up as I

More information

Fourier Analysis of Stationary and Non-Stationary Time Series

Fourier Analysis of Stationary and Non-Stationary Time Series Fourier Analysis of Stationary and Non-Stationary Time Series September 6, 2012 A time series is a stochastic process indexed at discrete points in time i.e X t for t = 0, 1, 2, 3,... The mean is defined

More information

STAT 720 TIME SERIES ANALYSIS. Lecture Notes. Dewei Wang. Department of Statistics. University of South Carolina

STAT 720 TIME SERIES ANALYSIS. Lecture Notes. Dewei Wang. Department of Statistics. University of South Carolina STAT 720 TIME SERIES ANALYSIS Spring 2015 Lecture Notes Dewei Wang Department of Statistics University of South Carolina 1 Contents 1 Introduction 1 1.1 Some examples........................................

More information

Lecture 2: ARMA(p,q) models (part 2)

Lecture 2: ARMA(p,q) models (part 2) Lecture 2: ARMA(p,q) models (part 2) Florian Pelgrin University of Lausanne, École des HEC Department of mathematics (IMEA-Nice) Sept. 2011 - Jan. 2012 Florian Pelgrin (HEC) Univariate time series Sept.

More information

8.2 Harmonic Regression and the Periodogram

8.2 Harmonic Regression and the Periodogram Chapter 8 Spectral Methods 8.1 Introduction Spectral methods are based on thining of a time series as a superposition of sinusoidal fluctuations of various frequencies the analogue for a random process

More information

ITSM-R Reference Manual

ITSM-R Reference Manual ITSM-R Reference Manual George Weigt February 11, 2018 1 Contents 1 Introduction 3 1.1 Time series analysis in a nutshell............................... 3 1.2 White Noise Variance.....................................

More information

Stochastic Processes. M. Sami Fadali Professor of Electrical Engineering University of Nevada, Reno

Stochastic Processes. M. Sami Fadali Professor of Electrical Engineering University of Nevada, Reno Stochastic Processes M. Sami Fadali Professor of Electrical Engineering University of Nevada, Reno 1 Outline Stochastic (random) processes. Autocorrelation. Crosscorrelation. Spectral density function.

More information

Review of Linear Systems Theory

Review of Linear Systems Theory Review of Linear Systems Theory The following is a (very) brief review of linear systems theory, convolution, and Fourier analysis. I work primarily with discrete signals, but each result developed in

More information

FILTERING IN THE FREQUENCY DOMAIN

FILTERING IN THE FREQUENCY DOMAIN 1 FILTERING IN THE FREQUENCY DOMAIN Lecture 4 Spatial Vs Frequency domain 2 Spatial Domain (I) Normal image space Changes in pixel positions correspond to changes in the scene Distances in I correspond

More information