Introduction to Time Series Analysis. Lecture 7.

1 Introduction to Time Series Analysis. Lecture 7. Peter Bartlett

Last lecture:
1. ARMA(p,q) models: stationarity, causality, invertibility.
2. The linear process representation of ARMA processes: the $\psi$-weights.
3. Autocovariance of an ARMA process.
4. Homogeneous linear difference equations.

2 Introduction to Time Series Analysis. Lecture 7. Peter Bartlett
1. Review: ARMA(p,q) models and their properties.
2. Review: Autocovariance of an ARMA process.
3. Homogeneous linear difference equations.
Forecasting:
1. Linear prediction.
2. Projection in Hilbert space.

3 Review: Autoregressive moving average models

An ARMA(p,q) process $\{X_t\}$ is a stationary process that satisfies
$$\phi(B)X_t = \theta(B)W_t,$$
where $\phi$, $\theta$ are polynomials of degree $p$, $q$ and $\{W_t\} \sim WN(0, \sigma^2)$. We'll insist that the polynomials $\phi$ and $\theta$ have no common factors.

4 Review: Properties of ARMA(p,q) models

Theorem: If $\phi$ and $\theta$ have no common factors, a (unique) stationary solution $\{X_t\}$ to $\phi(B)X_t = \theta(B)W_t$ exists iff the roots of $\phi(z) = 1 - \phi_1 z - \cdots - \phi_p z^p$ avoid the unit circle:
$$\phi(z) = 0 \;\Rightarrow\; |z| \neq 1.$$
This ARMA(p,q) process is causal iff the roots of $\phi$ lie outside the unit circle:
$$\phi(z) = 0 \;\Rightarrow\; |z| > 1.$$
It is invertible iff the roots of $\theta(z) = 1 + \theta_1 z + \cdots + \theta_q z^q$ lie outside the unit circle:
$$\theta(z) = 0 \;\Rightarrow\; |z| > 1.$$
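
As a computational aside (not part of the original slides): these root conditions are easy to check numerically. A minimal numpy sketch, where the helper name `arma_root_check` is an illustrative choice:

```python
import numpy as np

def arma_root_check(phi, theta):
    """Check the root conditions for phi(z) = 1 - phi_1 z - ... - phi_p z^p
    and theta(z) = 1 + theta_1 z + ... + theta_q z^q."""
    # np.roots expects coefficients ordered from highest degree to lowest.
    ar_roots = np.roots(np.r_[1.0, -np.asarray(phi, float)][::-1])
    ma_roots = np.roots(np.r_[1.0, np.asarray(theta, float)][::-1])
    stationary = not np.any(np.isclose(np.abs(ar_roots), 1.0))  # no AR roots on |z| = 1
    causal = bool(np.all(np.abs(ar_roots) > 1.0))               # AR roots outside unit circle
    invertible = bool(np.all(np.abs(ma_roots) > 1.0))           # MA roots outside unit circle
    return stationary, causal, invertible

# The ARMA(2,1) example used later: (1 + 0.25 B^2) X_t = (1 + 0.2 B) W_t
print(arma_root_check(phi=[0.0, -0.25], theta=[0.2]))  # (True, True, True)
```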

5 Review: Properties of ARMA(p,q) models

$$\phi(B)X_t = \theta(B)W_t \quad\Longleftrightarrow\quad X_t = \psi(B)W_t, \qquad \text{so } \theta(B) = \psi(B)\phi(B).$$
Matching coefficients:
$$1 = \psi_0, \qquad \theta_1 = \psi_1 - \phi_1\psi_0, \qquad \theta_2 = \psi_2 - \phi_1\psi_1 - \phi_2\psi_0, \qquad \ldots$$
We need to solve the linear difference equation $\theta_j = \phi(B)\psi_j$, with $\theta_0 = 1$ and $\theta_j = 0$ for $j < 0$ and $j > q$.
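
For concreteness (an addition to the transcript, not in the original slides), here is a small sketch of this recursion; `psi_weights` is a hypothetical helper name, and the coefficients are those of the ARMA(2,1) example appearing later:

```python
import numpy as np

def psi_weights(phi, theta, n):
    """First n psi-weights from psi_j - phi_1 psi_{j-1} - ... - phi_p psi_{j-p} = theta_j,
    with theta_0 = 1 and theta_j = 0 for j > q."""
    phi = np.asarray(phi, dtype=float)
    theta = np.r_[1.0, np.asarray(theta, dtype=float)]   # prepend theta_0 = 1
    psi = np.zeros(n)
    for j in range(n):
        psi[j] = theta[j] if j < len(theta) else 0.0
        for i in range(1, min(j, len(phi)) + 1):
            psi[j] += phi[i - 1] * psi[j - i]
    return psi

# (1 + 0.25 B^2) X_t = (1 + 0.2 B) W_t, i.e. phi = (0, -0.25), theta = (0.2,)
print(psi_weights([0.0, -0.25], [0.2], 8))
# [1, 0.2, -0.25, -0.05, 0.0625, 0.0125, -0.015625, -0.003125]
```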

6 Review: Autocovariance functions of ARMA processes

For $\phi(B)X_t = \theta(B)W_t$, i.e. $X_t = \psi(B)W_t$,
$$\gamma(h) - \phi_1\gamma(h-1) - \cdots - \phi_p\gamma(h-p) = \sigma_w^2 \sum_{j=h}^{q} \theta_j \psi_{j-h}.$$
We need to solve the homogeneous linear difference equation
$$\phi(B)\gamma(h) = 0 \qquad (h > q),$$
with initial conditions
$$\gamma(h) - \phi_1\gamma(h-1) - \cdots - \phi_p\gamma(h-p) = \sigma_w^2 \sum_{j=h}^{q} \theta_j \psi_{j-h}, \qquad h = 0, \ldots, q.$$

7 Introduction to Time Series Analysis. Lecture 7.
1. Review: ARMA(p,q) models and their properties.
2. Review: Autocovariance of an ARMA process.
3. Homogeneous linear difference equations.
Forecasting:
1. Linear prediction.
2. Projection in Hilbert space.

8 Homogeneous linear diff eqns with constant coefficients

$$a_0 x_t + a_1 x_{t-1} + \cdots + a_k x_{t-k} = 0$$
$$\Longleftrightarrow\quad \left(a_0 + a_1 B + \cdots + a_k B^k\right) x_t = 0 \quad\Longleftrightarrow\quad a(B)\, x_t = 0.$$
Auxiliary equation:
$$a_0 + a_1 z + \cdots + a_k z^k = 0 \quad\Longleftrightarrow\quad (z - z_1)(z - z_2)\cdots(z - z_k) = 0,$$
where $z_1, z_2, \ldots, z_k \in \mathbb{C}$ are the roots of this characteristic polynomial. Thus,
$$a(B)x_t = 0 \quad\Longleftrightarrow\quad (B - z_1)(B - z_2)\cdots(B - z_k)\, x_t = 0.$$
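
To preview the autocovariance example later in this lecture: for $a(B) = 1 + 0.25 B^2$ the auxiliary equation is
$$1 + 0.25 z^2 = \tfrac{1}{4}\left(4 + z^2\right) = \tfrac{1}{4}(z - 2i)(z + 2i) = 0,$$
with the conjugate root pair $z_{1,2} = \pm 2i$.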

9 Homogeneous linear diff eqns with constant coefficients

$$a(B)x_t = 0 \quad\Longleftrightarrow\quad (B - z_1)(B - z_2)\cdots(B - z_k)\, x_t = 0.$$
So any $\{x_t\}$ satisfying $(B - z_i)x_t = 0$ for some $i$ also satisfies $a(B)x_t = 0$.
Three cases:
1. The $z_i$ are real and distinct.
2. The $z_i$ are complex and distinct.
3. Some $z_i$ are repeated.

10 Homogeneous linear diff eqns with constant coefficients

1. The $z_i$ are real and distinct. Then
$$a(B)x_t = 0 \quad\Longleftrightarrow\quad (B - z_1)(B - z_2)\cdots(B - z_k)\, x_t = 0$$
$\Longleftrightarrow$ $x_t$ is a linear combination of solutions to $(B - z_1)x_t = 0$, $(B - z_2)x_t = 0$, ..., $(B - z_k)x_t = 0$
$$\Longleftrightarrow\quad x_t = c_1 z_1^{-t} + c_2 z_2^{-t} + \cdots + c_k z_k^{-t},$$
for some constants $c_1, \ldots, c_k$.

11 Homogeneous linear diff eqns with constant coefficients

1. The $z_i$ are real and distinct. e.g., $z_1 = 1.2$, $z_2 = $ …
[Figure: plots of $c_1 z_1^{-t} + c_2 z_2^{-t}$ for three $(c_1, c_2)$ choices, including $(1, 0)$ and $(0, 1)$.]

12 Reminder: Complex exponentials

$$a + ib = re^{i\theta} = r(\cos\theta + i\sin\theta),$$
where
$$r = |a + ib| = \sqrt{a^2 + b^2}, \qquad \theta = \tan^{-1}\!\left(\frac{b}{a}\right) \in (-\pi, \pi].$$
Thus,
$$r_1 e^{i\theta_1}\, r_2 e^{i\theta_2} = (r_1 r_2)\, e^{i(\theta_1 + \theta_2)}, \qquad z\bar{z} = |z|^2.$$

13 Homogeneous linear diff eqns with constant coefficients

2. The $z_i$ are complex and distinct. As before,
$$a(B)x_t = 0 \quad\Longleftrightarrow\quad x_t = c_1 z_1^{-t} + c_2 z_2^{-t} + \cdots + c_k z_k^{-t}.$$
If $z_1 \notin \mathbb{R}$, then since $a_1, \ldots, a_k$ are real, we must also have the complex conjugate root, $z_j = \bar{z}_1$. And for $x_t$ to be real, we must have $c_j = \bar{c}_1$. For example:
$$x_t = c\, z_1^{-t} + \bar{c}\, \bar{z}_1^{-t} = r e^{i\theta} |z_1|^{-t} e^{-i\omega t} + r e^{-i\theta} |z_1|^{-t} e^{i\omega t} = r |z_1|^{-t}\left(e^{i(\theta - \omega t)} + e^{-i(\theta - \omega t)}\right) = 2r\, |z_1|^{-t} \cos(\omega t - \theta),$$
where $z_1 = |z_1| e^{i\omega}$ and $c = re^{i\theta}$.
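
A quick numerical confirmation of this identity (added here; the constants $r$, $\omega$, $\theta$ below are arbitrary made-up choices, not from the slides):

```python
import numpy as np

# Check: c z1^{-t} + conj(c) conj(z1)^{-t} = 2 r |z1|^{-t} cos(omega t - theta)
z1 = 1.2 * np.exp(1j * 0.8)      # |z1| = 1.2, omega = 0.8
c = 0.7 * np.exp(1j * 0.3)       # r = 0.7, theta = 0.3
t = np.arange(10)
lhs = c * z1 ** (-t) + np.conj(c) * np.conj(z1) ** (-t)
rhs = 2 * 0.7 * 1.2 ** (-t) * np.cos(0.8 * t - 0.3)
print(np.allclose(lhs.real, rhs), np.allclose(lhs.imag, 0))  # True True
```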

14 Homogeneous linear diff eqns with constant coefficients

2. The $z_i$ are complex and distinct. e.g., $z_1 = 1.2 + i$, $z_2 = 1.2 - i$.
[Figure: plots of $c_1 z_1^{-t} + c_2 z_2^{-t}$ for three choices of the conjugate pair $c, \bar{c}$.]

15 Homogeneous linear diff eqns with constant coefficients

2. The $z_i$ are complex and distinct. e.g., $z_1 = 1 + 0.1i$, $z_2 = 1 - 0.1i$.
[Figure: plots of $c_1 z_1^{-t} + c_2 z_2^{-t}$ for three choices of the conjugate pair $c, \bar{c}$.]

16 Homogeneous linear diff eqns with constant coefficients

3. Some $z_i$ are repeated.
$$a(B)x_t = 0 \quad\Longleftrightarrow\quad (B - z_1)(B - z_2)\cdots(B - z_k)\, x_t = 0.$$
If $z_1 = z_2$, then
$$(B - z_1)(B - z_2)x_t = 0 \quad\Longleftrightarrow\quad (B - z_1)^2 x_t = 0.$$
We can check that $(c_1 + c_2 t)\, z_1^{-t}$ is a solution. More generally, $(B - z_1)^m x_t = 0$ has the solution
$$\left(c_1 + c_2 t + \cdots + c_m t^{m-1}\right) z_1^{-t}.$$
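
The "we can check" step is easy to verify numerically (a sketch added to the transcript; $z_1$, $c_1$, $c_2$ are arbitrary made-up values): since $(B - z_1)^2 = B^2 - 2 z_1 B + z_1^2$, the claimed solution should satisfy $z_1^2 x_t - 2 z_1 x_{t-1} + x_{t-2} = 0$.

```python
import numpy as np

# Check that x_t = (c1 + c2 t) z1^{-t} solves (B - z1)^2 x_t = 0.
z1, c1, c2 = 0.9, 1.0, 2.0
t = np.arange(2, 30)
x = lambda s: (c1 + c2 * s) * z1 ** (-s)
resid = z1**2 * x(t) - 2 * z1 * x(t - 1) + x(t - 2)
print(np.allclose(resid, 0))  # True
```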

17 Homogeneous linear diff eqns with constant coefficients

3. Some $z_i$ are repeated. e.g., $z_1 = z_2 = $ …
[Figure: plots of $(c_1 + c_2 t)\, z_1^{-t}$ for three $(c_1, c_2)$ choices, including $(1, 0)$ and $(0, 2)$.]

18 Solving linear diff eqns with constant coefficients

Solve:
$$a_0 x_t + a_1 x_{t-1} + \cdots + a_k x_{t-k} = 0,$$
with initial conditions $x_1, \ldots, x_k$. Auxiliary equation in $z \in \mathbb{C}$:
$$a_0 + a_1 z + \cdots + a_k z^k = 0 \quad\Longleftrightarrow\quad (z - z_1)^{m_1} (z - z_2)^{m_2} \cdots (z - z_l)^{m_l} = 0,$$
where $z_1, z_2, \ldots, z_l \in \mathbb{C}$ are the roots of the characteristic polynomial, and $z_i$ occurs with multiplicity $m_i$. Solutions:
$$c_1(t)\, z_1^{-t} + c_2(t)\, z_2^{-t} + \cdots + c_l(t)\, z_l^{-t},$$
where $c_i(t)$ is a polynomial in $t$ of degree $m_i - 1$. We determine the coefficients of the $c_i(t)$ using the initial conditions (which might be linear constraints on the initial values $x_1, \ldots, x_k$).
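
In practice one can also just iterate the recursion forward from the initial values and compare against the closed form. A minimal sketch (the function name and the initial values are illustrative choices, not from the slides):

```python
import numpy as np

def iterate_homogeneous(a, x_init, n):
    """Iterate a_0 x_t + a_1 x_{t-1} + ... + a_k x_{t-k} = 0 forward
    from the k initial values (assumes a_0 != 0)."""
    a = np.asarray(a, dtype=float)
    k = len(a) - 1
    x = list(x_init)
    for _ in range(k, n):
        # solve for x_t given the previous k values
        x.append(-np.dot(a[1:], x[-1:-k - 1:-1]) / a[0])
    return np.array(x)

# e.g. the recursion gamma(h) + 0.25 gamma(h-2) = 0 from the example below,
# started from two illustrative initial values:
print(iterate_homogeneous([1.0, 0.0, 0.25], [1.0, 0.5], 10))
```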

19 Autocovariance functions of ARMA processes: Example

$$(1 + 0.25 B^2)\, X_t = (1 + 0.2 B)\, W_t, \qquad X_t = \psi(B) W_t,$$
$$\psi = \left(1,\ \tfrac{1}{5},\ -\tfrac{1}{4},\ -\tfrac{1}{20},\ \tfrac{1}{16},\ \tfrac{1}{80},\ -\tfrac{1}{64},\ -\tfrac{1}{320},\ \ldots\right).$$
$$\gamma(h) - \phi_1 \gamma(h-1) - \phi_2 \gamma(h-2) = \gamma(h) + 0.25\, \gamma(h-2) = \sigma_w^2 \sum_{j=0}^{q-h} \theta_{h+j} \psi_j = \begin{cases} \sigma_w^2 \left(\psi_0 + 0.2\, \psi_1\right) & \text{if } h = 0, \\ 0.2\, \sigma_w^2\, \psi_0 & \text{if } h = 1, \\ 0 & \text{otherwise.} \end{cases}$$

20 Autocovariance functions of ARMA processes: Example

We have the homogeneous linear difference equation
$$\gamma(h) + 0.25\, \gamma(h-2) = 0 \qquad \text{for } h \geq 2,$$
with initial conditions
$$\gamma(0) + 0.25\, \gamma(-2) = \sigma_w^2 \left(1 + \tfrac{1}{25}\right), \qquad \gamma(1) + 0.25\, \gamma(-1) = \frac{\sigma_w^2}{5}.$$

21 Autocovariance functions of ARMA processes: Example

Homogeneous lin. diff. eqn:
$$\gamma(h) + 0.25\, \gamma(h-2) = 0.$$
The characteristic polynomial is
$$1 + \frac{z^2}{4} = \frac{1}{4}\left(4 + z^2\right) = \frac{1}{4}(z - 2i)(z + 2i),$$
which has roots at $z_1 = 2e^{i\pi/2}$, $\bar{z}_1 = 2e^{-i\pi/2}$. The solution is of the form
$$\gamma(h) = c\, z_1^{-h} + \bar{c}\, \bar{z}_1^{-h}.$$

22 Autocovariance functions of ARMA processes: Example

$z_1 = 2e^{i\pi/2}$, $\bar{z}_1 = 2e^{-i\pi/2}$, $c = |c|\, e^{i\theta}$. We have
$$\gamma(h) = c\, z_1^{-h} + \bar{c}\, \bar{z}_1^{-h} = |c|\, 2^{-h} \left(e^{i(\theta - h\pi/2)} + e^{-i(\theta - h\pi/2)}\right) = c_1\, 2^{-h} \cos\!\left(\frac{h\pi}{2} - \theta\right),$$
where $c_1 = 2|c|$. And we determine $c_1$, $\theta$ from the initial conditions
$$\gamma(0) + 0.25\, \gamma(-2) = \sigma_w^2\left(1 + \tfrac{1}{25}\right), \qquad \gamma(1) + 0.25\, \gamma(-1) = \frac{\sigma_w^2}{5}.$$

23 Autocovariance functions of ARMA processes: Example

We determine $c_1$, $\theta$ from the initial conditions: we plug
$$\gamma(0) = c_1 \cos(\theta), \qquad \gamma(1) = \frac{c_1}{2} \sin(\theta), \qquad \gamma(2) = -\frac{c_1}{4} \cos(\theta)$$
into
$$\gamma(0) + 0.25\, \gamma(2) = \sigma_w^2\left(1 + \tfrac{1}{25}\right), \qquad 1.25\, \gamma(1) = \frac{\sigma_w^2}{5}.$$
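
As a cross-check on this example (an addition, reusing the `psi_weights` sketch from earlier in this transcript), the same values follow directly from $\gamma(h) = \sigma_w^2 \sum_j \psi_j \psi_{j+h}$, here with $\sigma_w^2 = 1$:

```python
import numpy as np

psi = psi_weights([0.0, -0.25], [0.2], 200)    # the tail beyond 200 is negligible
gamma = [float(np.dot(psi[: len(psi) - h], psi[h:])) for h in range(6)]
print(gamma[0], gamma[1])                       # about 1.1093 and 0.16
for h in range(2, 6):                           # recursion gamma(h) = -0.25 gamma(h-2)
    assert np.isclose(gamma[h], -0.25 * gamma[h - 2])
```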

24 Autocovariance functions of ARMA processes: Example

[Figure: the autocovariance function $\gamma(h)$ of this ARMA(2,1) process, oscillating with period 4 and decaying geometrically like $2^{-h}$.]

25 Introduction to Time Series Analysis. Lecture 7.
1. Review: ARMA(p,q) models and their properties.
2. Review: Autocovariance of an ARMA process.
3. Homogeneous linear difference equations.
Forecasting:
1. Linear prediction.
2. Projection in Hilbert space.

26 Review: least squares linear prediction

Consider a linear predictor of $X_{n+h}$ given $X_n = x_n$:
$$f(x_n) = \alpha_0 + \alpha_1 x_n.$$
For a stationary time series $\{X_t\}$, the best linear predictor is $f^*(x_n) = (1 - \rho(h))\mu + \rho(h)\, x_n$:
$$E\left(X_{n+h} - (\alpha_0 + \alpha_1 X_n)\right)^2 \;\geq\; E\left(X_{n+h} - f^*(X_n)\right)^2 = \sigma^2\left(1 - \rho(h)^2\right).$$
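
A small Monte Carlo illustration (an addition; the AR(1) model and all constants are made-up for the demo): for $X_t = 0.8 X_{t-1} + W_t$ with $\sigma_w^2 = 1$ we have $\mu = 0$ and $\rho(1) = 0.8$, so the best one-lag predictor is $0.8\, x_n$ and its MSE should be $\gamma(0)(1 - \rho(1)^2) = 1$:

```python
import numpy as np

rng = np.random.default_rng(0)
phi, n = 0.8, 100_000
x = np.zeros(n)
for t in range(1, n):                           # simulate the AR(1)
    x[t] = phi * x[t - 1] + rng.standard_normal()
mse = np.mean((x[1:] - phi * x[:-1]) ** 2)      # empirical one-lag prediction MSE
gamma0 = 1 / (1 - phi ** 2)                     # stationary variance gamma(0)
print(mse, gamma0 * (1 - phi ** 2))             # both close to 1
```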

27 Linear prediction

Given $X_1, X_2, \ldots, X_n$, the best linear predictor
$$X_{n+m}^n = \alpha_0 + \sum_{i=1}^{n} \alpha_i X_i$$
of $X_{n+m}$ satisfies the prediction equations
$$E\left(X_{n+m} - X_{n+m}^n\right) = 0, \qquad E\left[\left(X_{n+m} - X_{n+m}^n\right) X_i\right] = 0 \quad \text{for } i = 1, \ldots, n.$$
This is a special case of the projection theorem.

28 Projection Theorem

If $H$ is a Hilbert space, $M$ is a closed linear subspace of $H$, and $y \in H$, then there is a point $Py \in M$ (the projection of $y$ on $M$) satisfying
1. $\|Py - y\| \leq \|w - y\|$ for all $w \in M$,
2. $\|Py - y\| < \|w - y\|$ for $w \in M$, $w \neq Py$,
3. $\langle y - Py, w \rangle = 0$ for all $w \in M$.
[Figure: $y$, its projection $Py$ in the subspace $M$, and the orthogonal error $y - Py$.]

29 Hilbert spaces

Hilbert space = complete inner product space.
Inner product space: a vector space with an inner product $\langle a, b \rangle$ satisfying
$$\langle a, b \rangle = \langle b, a \rangle, \qquad \langle \alpha_1 a_1 + \alpha_2 a_2, b \rangle = \alpha_1 \langle a_1, b \rangle + \alpha_2 \langle a_2, b \rangle, \qquad \langle a, a \rangle = 0 \iff a = 0.$$
Norm: $\|a\|^2 = \langle a, a \rangle$.
Complete: limits of Cauchy sequences are in the space.
Examples:
1. $\mathbb{R}^n$, with the Euclidean inner product $\langle x, y \rangle = \sum_i x_i y_i$.
2. $\{$random variables $X : EX^2 < \infty\}$, with inner product $\langle X, Y \rangle = E(XY)$. (Strictly, equivalence classes of a.s. equal random variables.)

30 Projection theorem. Example: Linear regression

Given $y = (y_1, y_2, \ldots, y_n)' \in \mathbb{R}^n$ and $Z = (z_1, \ldots, z_q) \in \mathbb{R}^{n \times q}$, choose $\beta = (\beta_1, \ldots, \beta_q)' \in \mathbb{R}^q$ to minimize $\|y - Z\beta\|^2$. Here, $H = \mathbb{R}^n$ with $\langle a, b \rangle = \sum_i a_i b_i$, and
$$M = \{Z\beta : \beta \in \mathbb{R}^q\} = \mathrm{sp}\{z_1, \ldots, z_q\}.$$

31 Projection theorem

If $H$ is a Hilbert space, $M$ is a closed subspace of $H$, and $y \in H$, then there is a point $Py \in M$ (the projection of $y$ on $M$) satisfying
1. $\|Py - y\| \leq \|w - y\|$ for all $w \in M$,
2. $\langle y - Py, w \rangle = 0$ for all $w \in M$.
[Figure: $y$, its projection $Py$ in $M$, and the orthogonal error $y - Py$.]

32 Projection theorem: Linear regression

Orthogonality of the error $y - Z\hat{\beta}$ to the columns of $Z$:
$$\langle y - Z\hat{\beta}, z_i \rangle = 0 \ \text{ for } i = 1, \ldots, q \quad\Longleftrightarrow\quad Z'Z\hat{\beta} = Z'y \quad\Longleftrightarrow\quad \hat{\beta} = (Z'Z)^{-1} Z'y,$$
the normal equations.
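
A compact numerical illustration (made-up data, not from the slides; solving the normal equations directly rather than forming the inverse):

```python
import numpy as np

rng = np.random.default_rng(1)
Z = rng.standard_normal((50, 3))
y = Z @ np.array([2.0, -1.0, 0.5]) + 0.1 * rng.standard_normal(50)
beta_hat = np.linalg.solve(Z.T @ Z, Z.T @ y)     # normal equations
print(beta_hat)                                  # close to (2, -1, 0.5)
print(np.allclose(Z.T @ (y - Z @ beta_hat), 0))  # residual orthogonal to columns of Z
```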

33 Projection theorem. Example: Linear prediction

Given $1, X_1, X_2, \ldots, X_n \in \{$r.v.s $X : EX^2 < \infty\}$, choose $\alpha_0, \alpha_1, \ldots, \alpha_n \in \mathbb{R}$ so that $Z = \alpha_0 + \sum_{i=1}^n \alpha_i X_i$ minimizes $E(X_{n+m} - Z)^2$. Here, $\langle X, Y \rangle = E(XY)$,
$$M = \left\{Z = \alpha_0 + \sum_{i=1}^n \alpha_i X_i : \alpha_i \in \mathbb{R}\right\} = \mathrm{sp}\{1, X_1, \ldots, X_n\},$$
and $y = X_{n+m}$.

34 Projection theorem

If $H$ is a Hilbert space, $M$ is a closed subspace of $H$, and $y \in H$, then there is a point $Py \in M$ (the projection of $y$ on $M$) satisfying
1. $\|Py - y\| \leq \|w - y\|$ for all $w \in M$,
2. $\langle y - Py, w \rangle = 0$ for all $w \in M$.
[Figure: $y$, its projection $Py$ in $M$, and the orthogonal error $y - Py$.]

35 Projection theorem: Linear prediction

Let $X_{n+m}^n$ denote the best linear predictor:
$$\|X_{n+m}^n - X_{n+m}\|^2 \leq \|Z - X_{n+m}\|^2 \quad \text{for all } Z \in M.$$
The projection theorem implies the orthogonality
$$\langle X_{n+m}^n - X_{n+m}, Z \rangle = 0 \quad \text{for all } Z \in M$$
$$\Longleftrightarrow\quad \langle X_{n+m}^n - X_{n+m}, Z \rangle = 0 \quad \text{for all } Z \in \{1, X_1, \ldots, X_n\}$$
$$\Longleftrightarrow\quad E\left(X_{n+m}^n - X_{n+m}\right) = 0, \qquad E\left[\left(X_{n+m}^n - X_{n+m}\right) X_i\right] = 0.$$
That is, the prediction errors $(X_{n+m}^n - X_{n+m})$ are uncorrelated with the prediction variables $(1, X_1, \ldots, X_n)$.

36 Linear prediction

The error $(X_{n+m}^n - X_{n+m})$ is uncorrelated with the prediction variable 1:
$$E\left(X_{n+m}^n - X_{n+m}\right) = 0 \quad\Longleftrightarrow\quad E\left(\alpha_0 + \sum_i \alpha_i X_i - X_{n+m}\right) = 0 \quad\Longleftrightarrow\quad \mu\left(1 - \sum_i \alpha_i\right) = \alpha_0.$$

37 Linear prediction

... $\mu\left(1 - \sum_i \alpha_i\right) = \alpha_0$. Substituting for $\alpha_0$ in $X_{n+m}^n = \alpha_0 + \sum_i \alpha_i X_i$, we get
$$X_{n+m}^n = \mu + \sum_i \alpha_i (X_i - \mu).$$
So we can subtract $\mu$ from all variables:
$$X_{n+m}^n - \mu = \sum_i \alpha_i (X_i - \mu).$$
Thus, for forecasting, we can assume $\mu = 0$, and we'll ignore $\alpha_0$.

38 One-step-ahead linear prediction

Write
$$X_{n+1}^n = \phi_{n1} X_n + \phi_{n2} X_{n-1} + \cdots + \phi_{nn} X_1.$$
Prediction equations:
$$E\left[\left(X_{n+1}^n - X_{n+1}\right) X_i\right] = 0, \qquad \text{for } i = 1, \ldots, n$$
$$\Longleftrightarrow\quad \sum_{j=1}^{n} \phi_{nj}\, E(X_{n+1-j} X_i) = E(X_{n+1} X_i)$$
$$\Longleftrightarrow\quad \sum_{j=1}^{n} \phi_{nj}\, \gamma(i - j) = \gamma(i)$$
$$\Longleftrightarrow\quad \Gamma_n \phi_n = \gamma_n.$$

39 One-step-ahead linear prediction

Prediction equations: $\Gamma_n \phi_n = \gamma_n$, where
$$\Gamma_n = \begin{pmatrix} \gamma(0) & \gamma(1) & \cdots & \gamma(n-1) \\ \gamma(1) & \gamma(0) & \cdots & \gamma(n-2) \\ \vdots & \vdots & \ddots & \vdots \\ \gamma(n-1) & \gamma(n-2) & \cdots & \gamma(0) \end{pmatrix},$$
$$\phi_n = (\phi_{n1}, \phi_{n2}, \ldots, \phi_{nn})', \qquad \gamma_n = (\gamma(1), \gamma(2), \ldots, \gamma(n))'.$$

40 Mean squared error of one-step-ahead linear prediction

$$P_{n+1}^n = E\left(X_{n+1} - X_{n+1}^n\right)^2 = E\left(\left(X_{n+1} - X_{n+1}^n\right)\left(X_{n+1} - X_{n+1}^n\right)\right) = E\left(\left(X_{n+1} - X_{n+1}^n\right) X_{n+1}\right)$$
$$= \gamma(0) - E\left(\phi_n' X\, X_{n+1}\right) = \gamma(0) - \gamma_n' \Gamma_n^{-1} \gamma_n,$$
where $X = (X_n, X_{n-1}, \ldots, X_1)'$.
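
Putting slides 38-40 together in a short sketch (an addition to the transcript; the function name is hypothetical, and the AR(1) autocovariance $\gamma(h) = \phi^h \sigma_w^2 / (1 - \phi^2)$ is used as made-up test input):

```python
import numpy as np
from scipy.linalg import toeplitz

def one_step_predictor(gamma, n):
    """Solve Gamma_n phi_n = gamma_n and return (phi_n, one-step MSE),
    given gamma = (gamma(0), ..., gamma(n))."""
    Gamma_n = toeplitz(gamma[:n])            # Gamma_n[i, j] = gamma(|i - j|)
    gamma_n = np.asarray(gamma[1:n + 1])
    phi_n = np.linalg.solve(Gamma_n, gamma_n)
    mse = gamma[0] - gamma_n @ phi_n         # gamma(0) - gamma_n' Gamma_n^{-1} gamma_n
    return phi_n, mse

# AR(1) with phi = 0.8, sigma_w^2 = 1
gamma = [0.8 ** h / 0.36 for h in range(6)]
phi_n, mse = one_step_predictor(gamma, 5)
print(phi_n)   # approximately (0.8, 0, 0, 0, 0)
print(mse)     # approximately 1
```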

41 Mean squared error of one-step-ahead linear prediction

The variance is reduced:
$$P_{n+1}^n = E\left(X_{n+1} - X_{n+1}^n\right)^2 = \gamma(0) - \gamma_n' \Gamma_n^{-1} \gamma_n$$
$$= \mathrm{Var}(X_{n+1}) - \mathrm{Cov}(X_{n+1}, X)\, \mathrm{Cov}(X, X)^{-1}\, \mathrm{Cov}(X, X_{n+1})$$
$$= E\left(X_{n+1} - 0\right)^2 - \mathrm{Cov}(X_{n+1}, X)\, \mathrm{Cov}(X, X)^{-1}\, \mathrm{Cov}(X, X_{n+1}),$$
where $X = (X_n, X_{n-1}, \ldots, X_1)'$.

42 Introduction to Time Series Analysis. Lecture 7. Peter Bartlett
1. Review: ARMA(p,q) models and their properties.
2. Review: Autocovariance of an ARMA process.
3. Homogeneous linear difference equations.
Forecasting:
1. Linear prediction.
2. Projection in Hilbert space.
