Ch 4. Models For Stationary Time Series. Time Series Analysis

This chapter discusses the basic concept of a broad class of stationary parametric time series models: the autoregressive moving average (ARMA) models. Let $\{Y_t\}$ denote the observed time series, and $\{e_t\}$ an unobserved white noise series (i.i.d. random variables with zero mean).

Assumptions for the models:

1. $\{Y_t\}$ is stationary with zero mean. (If $\{Y_t\}$ has a nonzero mean $\mu$, we may replace $\{Y_t\}$ by $\{Y_t - \mu\}$ to get a zero-mean series, e.g., $Y_t - \mu = (Y_{t-1} - \mu) - 0.24(Y_{t-2} - \mu) + e_t$.)
2. $e_t$ is independent of $Y_{t-k}$ (thus $E(e_t Y_{t-k}) = 0$) for $k = 1, 2, 3, \ldots$

One important issue is the pattern of the autocorrelation function $\{\rho_k\}$, which will be estimated by the sample autocorrelation function $\{r_k\}$ to build appropriate models in later chapters. The $\{\rho_k\}$ can be solved recursively from the Yule-Walker equations.

4.1 General Linear Processes

Def. A general linear process $\{Y_t\}$ is one that can be represented as a weighted linear combination of (finitely or infinitely many) present and past white noise terms:
$$Y_t = e_t + \Psi_1 e_{t-1} + \Psi_2 e_{t-2} + \cdots$$
Let $\Psi_0 = 1$. We assume that $\sum_{i=0}^{\infty} \Psi_i^2 < \infty$, so that $\mathrm{Var}(Y_t) = \sigma_e^2 \sum_{i=0}^{\infty} \Psi_i^2 < \infty$ and the process is meaningful.

Ex. An important example is $\Psi_i = \phi^i$ for a given $\phi \in (-1, 1)$. So
$$Y_t = e_t + \phi e_{t-1} + \phi^2 e_{t-2} + \cdots = e_t + \phi(e_{t-1} + \phi e_{t-2} + \cdots) = e_t + \phi Y_{t-1}$$
(an AR(1) process). We have
$$E(Y_t) = 0, \qquad \mathrm{Var}(Y_t) = \mathrm{Var}\Big(\sum_{i=0}^{\infty} \phi^i e_{t-i}\Big) = \sum_{i=0}^{\infty} \phi^{2i}\,\mathrm{Var}(e_{t-i}) = \sigma_e^2 \sum_{i=0}^{\infty} \phi^{2i} = \frac{\sigma_e^2}{1-\phi^2}.$$
Since $Y_t = \phi Y_{t-1} + e_t$, and $Y_{t-1}$ and $e_t$ are independent, we get
$$\mathrm{Cov}(Y_t, Y_{t-1}) = \mathrm{Cov}(\phi Y_{t-1} + e_t,\; Y_{t-1}) = \phi\,\mathrm{Var}(Y_{t-1}) = \frac{\phi\sigma_e^2}{1-\phi^2},$$
$$\mathrm{Corr}(Y_t, Y_{t-1}) = \left[\frac{\phi\sigma_e^2}{1-\phi^2}\right] \Big/ \left[\frac{\sigma_e^2}{1-\phi^2}\right] = \phi.$$
Similarly, we get
$$\mathrm{Cov}(Y_t, Y_{t-k}) = \frac{\phi^k\sigma_e^2}{1-\phi^2}, \qquad \mathrm{Corr}(Y_t, Y_{t-k}) = \phi^k.$$
Clearly, $\{Y_t\}$ is stationary.
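The closed-form autocovariances above can be checked numerically by truncating the linear-process sum. The following is a sketch, not from the text; the value $\phi = 0.6$ is an arbitrary illustration.

```python
# Truncated check of gamma_k = sigma_e^2 * sum_{i>=0} phi^i * phi^(i+k)
# against the closed form phi^k * sigma_e^2 / (1 - phi^2), for phi = 0.6.
phi, sigma2 = 0.6, 1.0

def gamma_truncated(k, n_terms=400):
    # partial sum of the linear-process covariance series
    return sigma2 * sum(phi ** i * phi ** (i + k) for i in range(n_terms))

def gamma_closed(k):
    return phi ** k * sigma2 / (1 - phi ** 2)

for k in range(6):
    assert abs(gamma_truncated(k) - gamma_closed(k)) < 1e-10
```

The geometric tail beyond 400 terms is far below the tolerance, so the truncation is harmless here.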

In general, every general linear process (with finite variance)
$$Y_t = e_t + \Psi_1 e_{t-1} + \Psi_2 e_{t-2} + \cdots$$
is stationary, with
$$E(Y_t) = 0, \qquad \gamma_k = \mathrm{Cov}(Y_t, Y_{t-k}) = \sigma_e^2 \sum_{i=0}^{\infty} \Psi_i \Psi_{i+k}, \quad k \ge 0.$$

4.2 Moving Average Processes

Def. A general linear process with only finitely many nonzero terms,
$$Y_t = e_t - \theta_1 e_{t-1} - \theta_2 e_{t-2} - \cdots - \theta_q e_{t-q},$$
is called a moving average process of order q (abbr. MA(q)).

Remarks.

1. We change notation from $\Psi$'s to $\theta$'s in MA processes.
2. The R software uses + signs before the $\theta$'s.

4.2.1 MA(1) Process: $Y_t = e_t - \theta e_{t-1}$. By direct computation,
$$E(Y_t) = 0, \qquad \mathrm{Var}(Y_t) = \sigma_e^2(1+\theta^2),$$
$$\mathrm{Cov}(Y_t, Y_{t-1}) = \mathrm{Cov}(e_t - \theta e_{t-1},\; e_{t-1} - \theta e_{t-2}) = -\theta\sigma_e^2,$$
$$\mathrm{Cov}(Y_t, Y_{t-k}) = \mathrm{Cov}(e_t - \theta e_{t-1},\; e_{t-k} - \theta e_{t-k-1}) = 0 \quad\text{for } k \ge 2.$$
Important fact: the MA(1) process has no correlation beyond lag 1.

Theorem 1. For an MA(1) model $Y_t = e_t - \theta e_{t-1}$:
$$E(Y_t) = 0, \quad \gamma_0 = \sigma_e^2(1+\theta^2), \quad \gamma_1 = -\theta\sigma_e^2, \quad \rho_1 = \frac{-\theta}{1+\theta^2}, \quad \gamma_k = \rho_k = 0 \text{ for } k \ge 2.$$
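Theorem 1 also follows from the general linear process formula $\gamma_k = \sigma_e^2\sum_i \Psi_i\Psi_{i+k}$ with $\Psi = (1, -\theta)$. A small numerical sketch (illustrative value $\theta = 0.9$, matching the later ma1.1.s example):

```python
# Check Theorem 1 for theta = 0.9 via gamma_k = sigma_e^2 * sum_i psi_i * psi_{i+k},
# with psi-weights (1, -theta, 0, 0, ...).
theta, sigma2 = 0.9, 1.0
psi = [1.0, -theta]

def gamma(k):
    if k >= len(psi):
        return 0.0
    return sigma2 * sum(psi[i] * psi[i + k] for i in range(len(psi) - k))

assert abs(gamma(0) - sigma2 * (1 + theta ** 2)) < 1e-12   # gamma_0
assert abs(gamma(1) - (-theta * sigma2)) < 1e-12           # gamma_1
assert gamma(2) == 0.0                                     # no correlation beyond lag 1
rho1 = gamma(1) / gamma(0)
assert abs(rho1 - (-theta / (1 + theta ** 2))) < 1e-12     # about -0.497
```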

Ex. Exhibit 4.1 displays a graph of the lag 1 autocorrelation values for $\theta$ ranging from -1 to +1.

rho=function(theta){-theta/(1+theta^2)} # define rho_1 as a function of theta
plot(rho,xlim=c(-1,1),ylab=expression(rho[1]),xlab=expression(theta),
  main=expression(paste("Lag 1 Autocorrelation of an MA(1) Process for Different ",theta)))

R code explanations:

1. ylab=expression(rho[1]) specifies that the y label is $\rho_1$; similarly for the expression commands in xlab and main.
2. Try ?legend or ?plotmath for more about typesetting or plotting a formula. See TS-ch4.R.

Ex. An MA(1) series with MA coefficient equal to $\theta_1 = -0.9$ and of length n = 100 can be simulated as follows (see TS-ch4.R):

set.seed(12345) # initialize the seed of the random number generator to reproduce a simulation
y=arima.sim(model=list(ma=-c(-0.9)),n=100) # simulate a realization of size 100 of an MA(1) model with theta_1 = -0.9

R code explanations:

1. The arima.sim function simulates a time series from a given ARIMA model, passed into the function as a list that contains the AR and MA parameters as vectors.
2. R uses a plus convention in parameterizing the MA part, so we have to add a minus sign before the vector of MA values to agree with our parameterization.
3. A list object consists of a list of components, each of which may contain data with a different data structure. The elements of a list are ordered according to the order they are entered. The list is the most flexible data structure in R.

10 The plot shows a moderately strong positive correlation at lag 1.

11 The plot shows a moderately strong upward trend.


The dataset ma1.1.s is simulated from an MA(1) process with $\theta = 0.9$. We compute that $\rho_1 = -0.9/(1+0.9^2) \approx -0.497$. The plot shows a moderately strong negative correlation at lag 1.


4.2.2 MA(2) Process: $Y_t = e_t - \theta_1 e_{t-1} - \theta_2 e_{t-2}$. We compute that
$$\gamma_0 = \mathrm{Var}(Y_t) = \mathrm{Var}(e_t - \theta_1 e_{t-1} - \theta_2 e_{t-2}) = (1+\theta_1^2+\theta_2^2)\sigma_e^2,$$
$$\gamma_1 = \mathrm{Cov}(Y_t, Y_{t-1}) = (-\theta_1 + \theta_1\theta_2)\sigma_e^2,$$
$$\gamma_2 = \mathrm{Cov}(Y_t, Y_{t-2}) = -\theta_2\sigma_e^2,$$
$$\gamma_k = 0 \quad\text{for } k > 2.$$

Theorem 2. For the MA(2) model $Y_t = e_t - \theta_1 e_{t-1} - \theta_2 e_{t-2}$:
$$\rho_1 = \frac{-\theta_1 + \theta_1\theta_2}{1+\theta_1^2+\theta_2^2}, \qquad \rho_2 = \frac{-\theta_2}{1+\theta_1^2+\theta_2^2}, \qquad \rho_k = 0 \text{ for } k = 3, 4, \ldots$$
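As a sketch (not from the text), Theorem 2 can be checked with the $\Psi$-weights $(1, -\theta_1, -\theta_2)$; the values $(\theta_1, \theta_2) = (1, -0.6)$ anticipate Exhibit 4.8 below.

```python
# Check Theorem 2 for (theta1, theta2) = (1, -0.6) using psi-weights (1, -theta1, -theta2).
theta1, theta2 = 1.0, -0.6
psi = [1.0, -theta1, -theta2]
den = sum(p * p for p in psi)                      # 1 + theta1^2 + theta2^2
rho1 = (psi[0] * psi[1] + psi[1] * psi[2]) / den   # (-theta1 + theta1*theta2) / den
rho2 = (psi[0] * psi[2]) / den                     # -theta2 / den
assert abs(rho1 - (-theta1 + theta1 * theta2) / (1 + theta1 ** 2 + theta2 ** 2)) < 1e-12
assert abs(rho2 - (-theta2) / (1 + theta1 ** 2 + theta2 ** 2)) < 1e-12
```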

Ex 4.8. (Time Plot of an MA(2) Process with $\theta_1 = 1$ and $\theta_2 = -0.6$.) The dataset ma2.s is simulated from an MA(2) process with $\theta_1 = 1$ and $\theta_2 = -0.6$, that is, $Y_t = e_t - e_{t-1} + 0.6\,e_{t-2}$. We have $\rho_1 = -1.6/2.36 \approx -0.678$ and $\rho_2 = 0.6/2.36 \approx 0.254$.

18 The scatterplot apparently reflects the negative autocorrelation at lag 1.

19 The plot shows a weak positive autocorrelation at lag 2.

The plot suggests the lack of autocorrelation at lag 3.

4.2.3 The General MA(q) Process

Theorem 3. For the MA(q) process $Y_t = e_t - \theta_1 e_{t-1} - \theta_2 e_{t-2} - \cdots - \theta_q e_{t-q}$, we have
$$\gamma_0 = (1+\theta_1^2+\theta_2^2+\cdots+\theta_q^2)\sigma_e^2,$$
$$\rho_k = \begin{cases} \dfrac{-\theta_k + \theta_1\theta_{k+1} + \theta_2\theta_{k+2} + \cdots + \theta_{q-k}\theta_q}{1+\theta_1^2+\theta_2^2+\cdots+\theta_q^2}, & k = 1, 2, \ldots, q,\\[1ex] 0, & k > q,\end{cases}$$
where the numerator of $\rho_q$ is just $-\theta_q$.
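Theorem 3 translates directly into a short helper. The function name ma_acf is my own (not from the text); the sign convention is the one used here, with minus signs on the $\theta$'s.

```python
# rho_k for an MA(q) with coefficients theta_1..theta_q (minus-sign convention).
def ma_acf(thetas, k):
    q = len(thetas)
    if k == 0:
        return 1.0
    if k > q:
        return 0.0                                  # cutoff after lag q
    psi = [1.0] + [-t for t in thetas]              # psi-weights (1, -theta_1, ..., -theta_q)
    num = sum(psi[i] * psi[i + k] for i in range(q + 1 - k))
    den = sum(p * p for p in psi)
    return num / den

# The MA(1) and MA(2) special cases recover Theorems 1 and 2:
assert abs(ma_acf([0.9], 1) - (-0.9 / 1.81)) < 1e-12
assert ma_acf([0.9], 2) == 0.0
assert abs(ma_acf([1.0, -0.6], 2) - (0.6 / 2.36)) < 1e-12
```

For $k = q$ the numerator reduces to the single product $\Psi_0\Psi_q = -\theta_q$, as stated in the theorem.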

4.3 Autoregressive Processes

Def. (AR(p)) A pth-order autoregressive process $\{Y_t\}$ satisfies the equation
$$Y_t = \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + \cdots + \phi_p Y_{t-p} + e_t.$$
For every t, we assume that $e_t$ is independent of $Y_{t-1}, Y_{t-2}, Y_{t-3}, \ldots$

Autoregressive models are based on the idea that the current value $Y_t$ of the series can be explained as a function of the p most recent past values plus an innovation term $e_t$ that incorporates everything new in the series at time t.

4.3.1 The AR(1) Process: $Y_t = \phi Y_{t-1} + e_t$. Assume stationarity and zero mean for $\{Y_t\}$. Taking variances of both sides of $Y_t = \phi Y_{t-1} + e_t$, we obtain
$$\gamma_0 = \phi^2\gamma_0 + \sigma_e^2 \;\Longrightarrow\; \gamma_0 = \frac{\sigma_e^2}{1-\phi^2}.$$
Since $\gamma_0 > 0$, we must have $|\phi| < 1$. For any $k > 0$, we multiply both sides of $Y_t = \phi Y_{t-1} + e_t$ by $Y_{t-k}$ and take expected values:
$$E(Y_t Y_{t-k}) = \phi E(Y_{t-1}Y_{t-k}) + E(e_t Y_{t-k}) \;\Longrightarrow\; \gamma_k = \phi\gamma_{k-1}, \quad k = 1, 2, \ldots$$

Theorem 4. For the AR(1) process $Y_t = \phi Y_{t-1} + e_t$, we have
$$\gamma_k = \phi^k\frac{\sigma_e^2}{1-\phi^2}, \qquad \rho_k = \phi^k, \qquad k = 0, 1, 2, \ldots$$
Since $|\phi| < 1$, the magnitude of the autocorrelation function decreases exponentially as the number of lags k increases.

1. If $0 < \phi < 1$, all correlations are positive.
2. If $-1 < \phi < 0$, the lag 1 autocorrelation $\rho_1 = \phi$ is negative, and the signs of successive autocorrelations alternate, with their magnitudes decreasing exponentially.

In R, the theoretical ACF of a stationary ARMA process can be computed by the ARMAacf function. The AR (resp. MA) parameter vector, if present, is passed into the function via the ar (resp. ma) argument. The maximum lag may be specified by the lag.max argument. Type ?ARMAacf for more options (e.g., pacf). We define a function AR1acf to plot the autocorrelation functions of AR(1) models with different $\phi$ (type ?plotmath for more about displaying math symbols):

See TS-ch4.R. The file also shows a simulated AR(1) series of size n = 300 with $\phi = 0.4$, together with a plot of its sample ACF.

Ex 4.13. The dataset ar1.s is simulated from an AR(1) process with $\phi = 0.9$. The smoothness of the plot shows a strong autocorrelation at lag 1.

28 The plot shows a strong autocorrelation at lag 1.

29 The plot shows a strong autocorrelation at lag 2.

30 The plot shows a high autocorrelation at lag 3.

The AR(1) model may be represented as a general linear process:
$$\begin{aligned}
Y_t &= e_t + \phi Y_{t-1} = e_t + \phi(e_{t-1} + \phi Y_{t-2}) = e_t + \phi e_{t-1} + \phi^2 Y_{t-2}\\
&= e_t + \phi e_{t-1} + \phi^2(e_{t-2} + \phi Y_{t-3}) = e_t + \phi e_{t-1} + \phi^2 e_{t-2} + \phi^3 Y_{t-3}\\
&= \cdots = e_t + \phi e_{t-1} + \phi^2 e_{t-2} + \cdots + \phi^{k-1}e_{t-k+1} + \phi^k Y_{t-k},
\end{aligned}$$
so that
$$Y_t = e_t + \phi e_{t-1} + \phi^2 e_{t-2} + \phi^3 e_{t-3} + \cdots$$
The stationarity condition for the AR(1) process $Y_t = \phi Y_{t-1} + e_t$ is $|\phi| < 1$.

4.3.2 The AR(2) Process: $Y_t = \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + e_t$. We assume that $e_t$ is independent of $Y_{t-1}, Y_{t-2}, \ldots$. The process is equivalent to $e_t = Y_t - \phi_1 Y_{t-1} - \phi_2 Y_{t-2}$.

Def. For the AR(2) process $Y_t = \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + e_t$, the AR characteristic polynomial is
$$\phi(x) = 1 - \phi_1 x - \phi_2 x^2$$
and the AR characteristic equation is
$$1 - \phi_1 x - \phi_2 x^2 = 0.$$
The equation has two (possibly complex) roots
$$x_1, x_2 = \frac{\phi_1 \pm \sqrt{\phi_1^2 + 4\phi_2}}{-2\phi_2}.$$

The stationarity of an AR process is determined by the roots of its characteristic equation.

Theorem 5. A stationary process $Y_t = \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + e_t$ exists iff both roots of the AR characteristic equation have modulus exceeding 1, iff the stationarity conditions for the AR(2) model hold:
$$\phi_1 + \phi_2 < 1, \qquad \phi_2 - \phi_1 < 1, \qquad |\phi_2| < 1.$$
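Theorem 5 can be spot-checked numerically: for a few illustrative $(\phi_1, \phi_2)$ pairs, the root criterion and the triangle conditions agree. A sketch, not from the text:

```python
import numpy as np

# Compare "both roots of 1 - phi1*x - phi2*x^2 = 0 lie outside the unit circle"
# with the triangle conditions of Theorem 5, for a few (phi1, phi2) pairs.
def stationary_by_roots(phi1, phi2):
    roots = np.roots([-phi2, -phi1, 1.0])   # coefficients of -phi2 x^2 - phi1 x + 1
    return all(abs(r) > 1 for r in roots)

def stationary_by_conditions(phi1, phi2):
    return (phi1 + phi2 < 1) and (phi2 - phi1 < 1) and (abs(phi2) < 1)

for phi1, phi2 in [(0.5, 0.25), (1.5, -0.75), (0.2, 0.9), (1.2, 0.3)]:
    assert stationary_by_roots(phi1, phi2) == stationary_by_conditions(phi1, phi2)
```

The pair $(1.5, -0.75)$ gives complex roots of modulus greater than 1 and is stationary; $(0.2, 0.9)$ and $(1.2, 0.3)$ violate $\phi_1 + \phi_2 < 1$ and are not.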


To derive the autocorrelation function of the AR(2) process, we multiply both sides of $Y_t = \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + e_t$ by $Y_{t-k}$ and take expected values. Assuming stationarity, zero means, and that $e_t$ is independent of $Y_{t-k}$, we get
$$\gamma_k = \phi_1\gamma_{k-1} + \phi_2\gamma_{k-2}, \quad k = 1, 2, 3, \ldots \tag{1}$$
and, dividing through by $\gamma_0$,
$$\rho_k = \phi_1\rho_{k-1} + \phi_2\rho_{k-2}, \quad k = 1, 2, 3, \ldots \tag{2}$$
Equations (1) and (2) are called the Yule-Walker equations, especially the set of equations for $k = 1$ and $k = 2$:
$$\rho_1 = \phi_1 + \phi_2\rho_1, \qquad \rho_2 = \phi_1\rho_1 + \phi_2. \tag{3}$$
The $\rho_1$ and $\rho_2$ can be solved from (3), and successive $\rho_k$ may be calculated by the Yule-Walker equations (2).
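A worked example of the procedure just described, with illustrative values $\phi_1 = 0.5$ and $\phi_2 = 0.25$ (a sketch, not from the text):

```python
# Solve (3) for rho_1 and rho_2, then extend by the recursion (2).
phi1, phi2 = 0.5, 0.25
rho1 = phi1 / (1 - phi2)            # from rho_1 = phi_1 + phi_2 * rho_1
rho2 = phi1 * rho1 + phi2
rho = [1.0, rho1, rho2]
for k in range(3, 11):
    rho.append(phi1 * rho[k - 1] + phi2 * rho[k - 2])

assert abs(rho[1] - 2 / 3) < 1e-12          # rho_1 = 2/3
assert abs(rho[2] - 7 / 12) < 1e-12         # rho_2 = 7/12
assert all(abs(rho[k]) < abs(rho[k - 1]) for k in range(2, 11))  # dies out
```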

The variance $\gamma_0$ of the AR(2) process may be found by jointly solving the equations obtained by

1. taking variances on both sides of $Y_t = \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + e_t$, and
2. the Yule-Walker equation (1) for $k = 1$.

The result is
$$\gamma_0 = \left(\frac{1-\phi_2}{1+\phi_2}\right)\frac{\sigma_e^2}{(1-\phi_2)^2 - \phi_1^2}.$$


Let $G_1$ and $G_2$ denote the reciprocals of the roots of the AR characteristic equation. Clearly $|G_1|, |G_2| < 1$ for stationary processes.

1. If the roots are real and distinct, then $\rho_k$ and $\gamma_k$ are linear combinations of $G_1^k$ and $G_2^k$; similarly if the roots are identical. The curves die out exponentially.
2. If the roots are complex, then $\rho_k$ and $\gamma_k$ are linear combinations of $R^k\sin(\Theta k)$ and $R^k\cos(\Theta k)$, where $R = \sqrt{-\phi_2}$ and $\cos\Theta = \phi_1/(2\sqrt{-\phi_2})$. The curves display a damped sine wave behavior.


The smoothness of the plot reflects the strong correlation between successive points.

The $\Psi$-coefficients for the AR(2) model as a general linear process may be obtained by substituting the general linear process representations of $Y_t$, $Y_{t-1}$, and $Y_{t-2}$,
$$Y_t = e_t + \Psi_1 e_{t-1} + \Psi_2 e_{t-2} + \cdots,$$
into $Y_t = \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + e_t$, then equating coefficients of $e_{t-k}$ to get the recursive relationships
$$\Psi_0 = 1, \qquad \Psi_1 - \phi_1\Psi_0 = 0, \qquad \Psi_k - \phi_1\Psi_{k-1} - \phi_2\Psi_{k-2} = 0 \text{ for } k = 2, 3, \ldots$$
The $\Psi_k$ can be solved recursively. The sequence $\{\Psi_k\}$ has a pattern similar to that of $\{\rho_k\}$ and $\{\gamma_k\}$ (determined by whether the roots of the characteristic equation are real or complex).
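A sketch of the recursion, again with the illustrative values $\phi_1 = 0.5$ and $\phi_2 = 0.25$; as a consistency check, $\sigma_e^2\sum_k \Psi_k^2$ should reproduce the $\gamma_0$ formula given earlier.

```python
# Psi-weights of an AR(2) from psi_k = phi1*psi_{k-1} + phi2*psi_{k-2},
# cross-checked against the closed form for gamma_0.
phi1, phi2, sigma2 = 0.5, 0.25, 1.0
psi = [1.0, phi1]                                   # psi_0 = 1, psi_1 = phi_1
for k in range(2, 300):
    psi.append(phi1 * psi[k - 1] + phi2 * psi[k - 2])

gamma0_series = sigma2 * sum(p * p for p in psi)    # gamma_0 = sigma_e^2 * sum psi_k^2
gamma0_closed = ((1 - phi2) / (1 + phi2)) * sigma2 / ((1 - phi2) ** 2 - phi1 ** 2)
assert abs(gamma0_series - gamma0_closed) < 1e-8
```

With these values both expressions equal 1.92; the truncation at 300 terms is harmless because the weights decay geometrically.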

4.3.3 The General Autoregressive Process

Def. The pth-order autoregressive model AR(p),
$$Y_t = \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + \cdots + \phi_p Y_{t-p} + e_t, \tag{4}$$
has AR characteristic polynomial
$$\phi(x) = 1 - \phi_1 x - \phi_2 x^2 - \cdots - \phi_p x^p$$
and AR characteristic equation
$$1 - \phi_1 x - \phi_2 x^2 - \cdots - \phi_p x^p = 0.$$

Theorem 6 (Stationarity). The AR(p) process is stationary iff each of the p roots of the characteristic equation exceeds 1 in modulus.

Assuming stationarity and zero means, we may multiply equation (4) by $Y_{t-k}$, take expectations, divide by $\gamma_0$, and obtain the important recursive relationship
$$\rho_k = \phi_1\rho_{k-1} + \phi_2\rho_{k-2} + \phi_3\rho_{k-3} + \cdots + \phi_p\rho_{k-p} \quad\text{for } k \ge 1. \tag{5}$$
Putting $k = 1, 2, \ldots, p$ into equation (5) and using $\rho_0 = 1$ and $\rho_{-k} = \rho_k$, we get the general Yule-Walker equations
$$\begin{aligned}
\rho_1 &= \phi_1 + \phi_2\rho_1 + \phi_3\rho_2 + \cdots + \phi_p\rho_{p-1}\\
\rho_2 &= \phi_1\rho_1 + \phi_2 + \phi_3\rho_1 + \cdots + \phi_p\rho_{p-2}\\
\rho_3 &= \phi_1\rho_2 + \phi_2\rho_1 + \phi_3 + \cdots + \phi_p\rho_{p-3}\\
&\;\;\vdots\\
\rho_p &= \phi_1\rho_{p-1} + \phi_2\rho_{p-2} + \phi_3\rho_{p-3} + \cdots + \phi_p
\end{aligned} \tag{6}$$
Given $\phi_1, \ldots, \phi_p$, we can solve for $\rho_1, \ldots, \rho_p$ from the Yule-Walker equations (6), and solve for the other $\rho$'s by (5).
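The Yule-Walker system (6) is linear in $\rho_1, \ldots, \rho_p$ and can be solved directly. The sketch below (an illustration with an arbitrary stationary AR(3), not from the text) also cross-checks the solution against the ACF computed independently from the $\Psi$-weights.

```python
import numpy as np

# Solve the Yule-Walker equations (6) for an AR(3) with
# (phi_1, phi_2, phi_3) = (0.4, 0.2, 0.1).
phi = [0.4, 0.2, 0.1]
p = len(phi)

# Build A * rho = b with unknowns rho_1..rho_p, using rho_0 = 1, rho_{-k} = rho_k.
A = np.eye(p)
b = np.zeros(p)
for k in range(1, p + 1):
    for i in range(1, p + 1):
        m = abs(k - i)
        if m == 0:
            b[k - 1] += phi[i - 1]       # the rho_0 = 1 term goes to the right side
        else:
            A[k - 1, m - 1] -= phi[i - 1]
rho = np.linalg.solve(A, b)

# Independent check: psi-weights, then gamma_k = sum_i psi_i * psi_{i+k}.
psi = [1.0]
for k in range(1, 600):
    psi.append(sum(phi[j] * psi[k - 1 - j] for j in range(p) if k - 1 - j >= 0))
gamma = [sum(psi[i] * psi[i + k] for i in range(len(psi) - k)) for k in range(p + 1)]
for k in range(1, p + 1):
    assert abs(rho[k - 1] - gamma[k] / gamma[0]) < 1e-8
```

For these coefficients, solving by hand gives $\rho_1 = 0.56$ and $\rho_2 = 0.48$, which the code reproduces.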

Multiplying (4) by $e_t$ and taking expectations, we get $E(e_t Y_t) = \sigma_e^2$. Multiplying (4) by $Y_t$ and taking expectations, we get
$$\gamma_0 = \phi_1\gamma_1 + \phi_2\gamma_2 + \cdots + \phi_p\gamma_p + \sigma_e^2.$$
Using $\rho_k = \gamma_k/\gamma_0$, we get
$$\gamma_0 = \frac{\sigma_e^2}{1 - \phi_1\rho_1 - \phi_2\rho_2 - \cdots - \phi_p\rho_p}$$
and can then solve for the other $\gamma$'s.

Facts: each of the $\rho_k$, $\gamma_k$, and $\Psi_k$ (in the general linear process representation) is a linear combination of exponentially decaying terms and damped sine wave terms corresponding to the roots of the characteristic equation.

4.4 The Mixed Autoregressive Moving Average Model

Def. A process $\{Y_t\}$ is called a mixed autoregressive moving average process of orders p and q (abbr. ARMA(p, q)) if
$$Y_t = \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + \cdots + \phi_p Y_{t-p} + e_t - \theta_1 e_{t-1} - \theta_2 e_{t-2} - \cdots - \theta_q e_{t-q}. \tag{7}$$

Remark. We assume that there are no common factors in the autoregressive and moving average polynomials
$$1 - \phi_1 x - \phi_2 x^2 - \cdots - \phi_p x^p \quad\text{and}\quad 1 - \theta_1 x - \theta_2 x^2 - \cdots - \theta_q x^q.$$
If there were, we could cancel them and the model would reduce to an ARMA model of lower order.

The ARMA(1,1) Model: The defining equation is
$$Y_t = \phi Y_{t-1} + e_t - \theta e_{t-1}. \tag{8}$$
To derive Yule-Walker type equations:
$$E(e_t Y_t) = E[e_t(\phi Y_{t-1} + e_t - \theta e_{t-1})] = \sigma_e^2,$$
$$E(e_{t-1} Y_t) = E[e_{t-1}(\phi Y_{t-1} + e_t - \theta e_{t-1})] = (\phi - \theta)\sigma_e^2,$$
$$E(Y_{t-k} Y_t) = E[Y_{t-k}(\phi Y_{t-1} + e_t - \theta e_{t-1})].$$
The last equation for $k = 0, 1, 2, 3, \ldots$ yields
$$\begin{aligned}
\gamma_0 &= \phi\gamma_1 + [1 - \theta(\phi-\theta)]\sigma_e^2,\\
\gamma_1 &= \phi\gamma_0 - \theta\sigma_e^2,\\
\gamma_k &= \phi\gamma_{k-1} \quad\text{for } k \ge 2.
\end{aligned} \tag{9}$$

We can solve these to get
$$\gamma_0 = \frac{1 - 2\phi\theta + \theta^2}{1-\phi^2}\,\sigma_e^2 \tag{10}$$
and
$$\rho_k = \frac{(1-\theta\phi)(\phi-\theta)}{1 - 2\theta\phi + \theta^2}\,\phi^{k-1} \quad\text{for } k \ge 1. \tag{11}$$
The ARMA(1,1) autocorrelation function decays exponentially with damping factor $\phi$ as the lag k increases, starting from the initial value $\rho_1$. In contrast, the AR(1) autocorrelation function decays with damping factor $\phi$ from the initial value $\rho_0 = 1$. (See Exercises 4.19 and 4.20.)

The general linear process form of the ARMA(1,1) model is
$$Y_t = e_t + (\phi-\theta)\sum_{j=1}^{\infty}\phi^{j-1}e_{t-j}.$$
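Formulas (10) and (11) can be verified against the general linear process form just stated; the parameter values $\phi = 0.6$ and $\theta = 0.3$ below are an arbitrary illustration, not from the text.

```python
# Check (10) and (11) using the psi-weights psi_0 = 1, psi_j = (phi - theta)*phi^(j-1).
phi, theta, sigma2 = 0.6, 0.3, 1.0
psi = [1.0] + [(phi - theta) * phi ** (j - 1) for j in range(1, 500)]
gamma = [sigma2 * sum(psi[i] * psi[i + k] for i in range(len(psi) - k)) for k in range(4)]

gamma0 = (1 - 2 * phi * theta + theta ** 2) / (1 - phi ** 2) * sigma2   # equation (10)
assert abs(gamma[0] - gamma0) < 1e-10

def rho(k):                                                             # equation (11)
    return (1 - theta * phi) * (phi - theta) / (1 - 2 * theta * phi + theta ** 2) * phi ** (k - 1)

for k in (1, 2, 3):
    assert abs(gamma[k] / gamma[0] - rho(k)) < 1e-10
assert abs(rho(2) / rho(1) - phi) < 1e-12   # exponential decay with damping factor phi
```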

The ARMA(p,q) Model:

Theorem 7 (Stationarity). A stationary solution to the model
$$Y_t = \phi_1 Y_{t-1} + \phi_2 Y_{t-2} + \cdots + \phi_p Y_{t-p} + e_t - \theta_1 e_{t-1} - \theta_2 e_{t-2} - \cdots - \theta_q e_{t-q}$$
exists iff all the roots of the AR characteristic equation $\phi(x) = 0$ exceed 1 in modulus.

The autocorrelation function can be shown to satisfy
$$\rho_k = \phi_1\rho_{k-1} + \phi_2\rho_{k-2} + \cdots + \phi_p\rho_{k-p} \quad\text{for } k > q. \tag{12}$$
Similar equations can be developed for $k = 1, 2, \ldots, q$ that involve $\theta_1, \theta_2, \ldots, \theta_q$. The autocorrelation function can be computed by ARMAacf in R.

The ARMA(p,q) model can be written as a general linear process with the $\Psi$ coefficients determined by equations similar to those of the AR(p) model, except that $\theta_1, \ldots, \theta_q$ are involved.

4.5 Invertibility

The MA(1) model with parameter $\theta$ has the same autocorrelation function as the MA(1) model with parameter $1/\theta$. A similar nonuniqueness of MA(q) models for a given autocorrelation function exists for every q. It can be resolved by requiring invertibility of the MA(q) model.

An AR process can always be expressed as a general linear process (an infinite-order MA process). Conversely, how and when can an MA process be expressed as an infinite-order AR process?

Ex. Consider the MA(1) model $Y_t = e_t - \theta e_{t-1}$. We get
$$e_t = Y_t + \theta e_{t-1} = Y_t + \theta(Y_{t-1} + \theta e_{t-2}) = Y_t + \theta Y_{t-1} + \theta^2 e_{t-2} = \cdots = Y_t + \theta Y_{t-1} + \theta^2 Y_{t-2} + \cdots$$
When $|\theta| < 1$, we succeed in converting the MA(1) process to an (infinite-order) AR process:
$$Y_t = -\theta Y_{t-1} - \theta^2 Y_{t-2} - \cdots + e_t.$$
So the MA(1) model is invertible iff $|\theta| < 1$.
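The expansion can be demonstrated numerically: build an MA(1) series from a fixed shock sequence and recover $e_t$ from the truncated AR($\infty$) representation. A sketch with an arbitrary $\theta = 0.5$ and made-up shocks, not from the text:

```python
# Invert an MA(1) numerically: Y_t = e_t - theta*e_{t-1}, then recover
# e_t = Y_t + theta*Y_{t-1} + theta^2*Y_{t-2} + ... (series started at time 0,
# so the telescoping recovery is exact).
theta = 0.5
e = [((-1) ** n) * (1.0 / (n + 1)) for n in range(200)]   # arbitrary fixed "shocks"
y = [e[0]] + [e[t] - theta * e[t - 1] for t in range(1, 200)]

t = 199
e_hat = sum(theta ** j * y[t - j] for j in range(t + 1))
assert abs(e_hat - e[t]) < 1e-12
```

With $|\theta| > 1$ the weights $\theta^j$ would grow without bound and the same recovery would diverge, which is exactly why invertibility requires $|\theta| < 1$.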

Def. For a general MA(q) or ARMA(p,q) model, we define the MA characteristic polynomial as
$$\theta(x) = 1 - \theta_1 x - \theta_2 x^2 - \cdots - \theta_q x^q$$
and the MA characteristic equation as
$$1 - \theta_1 x - \theta_2 x^2 - \cdots - \theta_q x^q = 0.$$

Theorem 8 (Invertibility). The MA(q) or ARMA(p,q) model is invertible, that is, there are coefficients $\pi_j$ such that
$$Y_t = \pi_1 Y_{t-1} + \pi_2 Y_{t-2} + \pi_3 Y_{t-3} + \cdots + e_t,$$
iff the roots of the MA characteristic equation exceed 1 in modulus.

Theorem 9. Given a suitable autocorrelation function, there is only one set of parameter values that yields an invertible MA process.

From now on, we require both stationarity and invertibility for an ARMA(p,q) model.


More information

ESSE Mid-Term Test 2017 Tuesday 17 October :30-09:45

ESSE Mid-Term Test 2017 Tuesday 17 October :30-09:45 ESSE 4020 3.0 - Mid-Term Test 207 Tuesday 7 October 207. 08:30-09:45 Symbols have their usual meanings. All questions are worth 0 marks, although some are more difficult than others. Answer as many questions

More information

Classic Time Series Analysis

Classic Time Series Analysis Classic Time Series Analysis Concepts and Definitions Let Y be a random number with PDF f Y t ~f,t Define t =E[Y t ] m(t) is known as the trend Define the autocovariance t, s =COV [Y t,y s ] =E[ Y t t

More information

STAT 443 (Winter ) Forecasting

STAT 443 (Winter ) Forecasting Winter 2014 TABLE OF CONTENTS STAT 443 (Winter 2014-1141) Forecasting Prof R Ramezan University of Waterloo L A TEXer: W KONG http://wwkonggithubio Last Revision: September 3, 2014 Table of Contents 1

More information

Stat 5100 Handout #12.e Notes: ARIMA Models (Unit 7) Key here: after stationary, identify dependence structure (and use for forecasting)

Stat 5100 Handout #12.e Notes: ARIMA Models (Unit 7) Key here: after stationary, identify dependence structure (and use for forecasting) Stat 5100 Handout #12.e Notes: ARIMA Models (Unit 7) Key here: after stationary, identify dependence structure (and use for forecasting) (overshort example) White noise H 0 : Let Z t be the stationary

More information

Lecture 16: State Space Model and Kalman Filter Bus 41910, Time Series Analysis, Mr. R. Tsay

Lecture 16: State Space Model and Kalman Filter Bus 41910, Time Series Analysis, Mr. R. Tsay Lecture 6: State Space Model and Kalman Filter Bus 490, Time Series Analysis, Mr R Tsay A state space model consists of two equations: S t+ F S t + Ge t+, () Z t HS t + ɛ t (2) where S t is a state vector

More information

Midterm Suggested Solutions

Midterm Suggested Solutions CUHK Dept. of Economics Spring 2011 ECON 4120 Sung Y. Park Midterm Suggested Solutions Q1 (a) In time series, autocorrelation measures the correlation between y t and its lag y t τ. It is defined as. ρ(τ)

More information

Chapter 1. Basics. 1.1 Definition. A time series (or stochastic process) is a function Xpt, ωq such that for

Chapter 1. Basics. 1.1 Definition. A time series (or stochastic process) is a function Xpt, ωq such that for Chapter 1 Basics 1.1 Definition A time series (or stochastic process) is a function Xpt, ωq such that for each fixed t, Xpt, ωq is a random variable [denoted by X t pωq]. For a fixed ω, Xpt, ωq is simply

More information

Lecture 1: Stationary Time Series Analysis

Lecture 1: Stationary Time Series Analysis Syllabus Stationarity ARMA AR MA Model Selection Estimation Lecture 1: Stationary Time Series Analysis 222061-1617: Time Series Econometrics Spring 2018 Jacek Suda Syllabus Stationarity ARMA AR MA Model

More information

Difference equations. Definitions: A difference equation takes the general form. x t f x t 1,,x t m.

Difference equations. Definitions: A difference equation takes the general form. x t f x t 1,,x t m. Difference equations Definitions: A difference equation takes the general form x t fx t 1,x t 2, defining the current value of a variable x as a function of previously generated values. A finite order

More information

6 NONSEASONAL BOX-JENKINS MODELS

6 NONSEASONAL BOX-JENKINS MODELS 6 NONSEASONAL BOX-JENKINS MODELS In this section, we will discuss a class of models for describing time series commonly referred to as Box-Jenkins models. There are two types of Box-Jenkins models, seasonal

More information

Università di Pavia. Forecasting. Eduardo Rossi

Università di Pavia. Forecasting. Eduardo Rossi Università di Pavia Forecasting Eduardo Rossi Mean Squared Error Forecast of Y t+1 based on a set of variables observed at date t, X t : Yt+1 t. The loss function MSE(Y t+1 t ) = E[Y t+1 Y t+1 t ]2 The

More information

Calculation of ACVF for ARMA Process: I consider causal ARMA(p, q) defined by

Calculation of ACVF for ARMA Process: I consider causal ARMA(p, q) defined by Calculation of ACVF for ARMA Process: I consider causal ARMA(p, q) defined by φ(b)x t = θ(b)z t, {Z t } WN(0, σ 2 ) want to determine ACVF {γ(h)} for this process, which can be done using four complementary

More information

Econometrics II Heij et al. Chapter 7.1

Econometrics II Heij et al. Chapter 7.1 Chapter 7.1 p. 1/2 Econometrics II Heij et al. Chapter 7.1 Linear Time Series Models for Stationary data Marius Ooms Tinbergen Institute Amsterdam Chapter 7.1 p. 2/2 Program Introduction Modelling philosophy

More information

A Data-Driven Model for Software Reliability Prediction

A Data-Driven Model for Software Reliability Prediction A Data-Driven Model for Software Reliability Prediction Author: Jung-Hua Lo IEEE International Conference on Granular Computing (2012) Young Taek Kim KAIST SE Lab. 9/4/2013 Contents Introduction Background

More information

Identifiability, Invertibility

Identifiability, Invertibility Identifiability, Invertibility Defn: If {ǫ t } is a white noise series and µ and b 0,..., b p are constants then X t = µ + b 0 ǫ t + b ǫ t + + b p ǫ t p is a moving average of order p; write MA(p). Q:

More information

Time Series. Chapter Time Series Data

Time Series. Chapter Time Series Data Chapter 10 Time Series 10.1 Time Series Data The main difference between time series data and cross-sectional data is the temporal ordering. To emphasize the proper ordering of the observations, Table

More information

CHAPTER 8 MODEL DIAGNOSTICS. 8.1 Residual Analysis

CHAPTER 8 MODEL DIAGNOSTICS. 8.1 Residual Analysis CHAPTER 8 MODEL DIAGNOSTICS We have now discussed methods for specifying models and for efficiently estimating the parameters in those models. Model diagnostics, or model criticism, is concerned with testing

More information

Module 4. Stationary Time Series Models Part 1 MA Models and Their Properties

Module 4. Stationary Time Series Models Part 1 MA Models and Their Properties Module 4 Stationary Time Series Models Part 1 MA Models and Their Properties Class notes for Statistics 451: Applied Time Series Iowa State University Copyright 2015 W. Q. Meeker. February 14, 2016 20h

More information

Lecture note 2 considered the statistical analysis of regression models for time

Lecture note 2 considered the statistical analysis of regression models for time DYNAMIC MODELS FOR STATIONARY TIME SERIES Econometrics 2 LectureNote4 Heino Bohn Nielsen March 2, 2007 Lecture note 2 considered the statistical analysis of regression models for time series data, and

More information

Problem Set 2: Box-Jenkins methodology

Problem Set 2: Box-Jenkins methodology Problem Set : Box-Jenkins methodology 1) For an AR1) process we have: γ0) = σ ε 1 φ σ ε γ0) = 1 φ Hence, For a MA1) process, p lim R = φ γ0) = 1 + θ )σ ε σ ε 1 = γ0) 1 + θ Therefore, p lim R = 1 1 1 +

More information

LINEAR STOCHASTIC MODELS

LINEAR STOCHASTIC MODELS LINEAR STOCHASTIC MODELS Let {x τ+1,x τ+2,...,x τ+n } denote n consecutive elements from a stochastic process. If their joint distribution does not depend on τ, regardless of the size of n, then the process

More information

Autoregressive Moving Average (ARMA) Models and their Practical Applications

Autoregressive Moving Average (ARMA) Models and their Practical Applications Autoregressive Moving Average (ARMA) Models and their Practical Applications Massimo Guidolin February 2018 1 Essential Concepts in Time Series Analysis 1.1 Time Series and Their Properties Time series:

More information

6.3 Forecasting ARMA processes

6.3 Forecasting ARMA processes 6.3. FORECASTING ARMA PROCESSES 123 6.3 Forecasting ARMA processes The purpose of forecasting is to predict future values of a TS based on the data collected to the present. In this section we will discuss

More information

Levinson Durbin Recursions: I

Levinson Durbin Recursions: I Levinson Durbin Recursions: I note: B&D and S&S say Durbin Levinson but Levinson Durbin is more commonly used (Levinson, 1947, and Durbin, 1960, are source articles sometimes just Levinson is used) recursions

More information

Stochastic Modelling Solutions to Exercises on Time Series

Stochastic Modelling Solutions to Exercises on Time Series Stochastic Modelling Solutions to Exercises on Time Series Dr. Iqbal Owadally March 3, 2003 Solutions to Elementary Problems Q1. (i) (1 0.5B)X t = Z t. The characteristic equation 1 0.5z = 0 does not have

More information

Introduction to Time Series Analysis. Lecture 11.

Introduction to Time Series Analysis. Lecture 11. Introduction to Time Series Analysis. Lecture 11. Peter Bartlett 1. Review: Time series modelling and forecasting 2. Parameter estimation 3. Maximum likelihood estimator 4. Yule-Walker estimation 5. Yule-Walker

More information

Basic concepts and terminology: AR, MA and ARMA processes

Basic concepts and terminology: AR, MA and ARMA processes ECON 5101 ADVANCED ECONOMETRICS TIME SERIES Lecture note no. 1 (EB) Erik Biørn, Department of Economics Version of February 1, 2011 Basic concepts and terminology: AR, MA and ARMA processes This lecture

More information

Applied time-series analysis

Applied time-series analysis Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna October 18, 2011 Outline Introduction and overview Econometric Time-Series Analysis In principle,

More information

A SARIMAX coupled modelling applied to individual load curves intraday forecasting

A SARIMAX coupled modelling applied to individual load curves intraday forecasting A SARIMAX coupled modelling applied to individual load curves intraday forecasting Frédéric Proïa Workshop EDF Institut Henri Poincaré - Paris 05 avril 2012 INRIA Bordeaux Sud-Ouest Institut de Mathématiques

More information

For a stochastic process {Y t : t = 0, ±1, ±2, ±3, }, the mean function is defined by (2.2.1) ± 2..., γ t,

For a stochastic process {Y t : t = 0, ±1, ±2, ±3, }, the mean function is defined by (2.2.1) ± 2..., γ t, CHAPTER 2 FUNDAMENTAL CONCEPTS This chapter describes the fundamental concepts in the theory of time series models. In particular, we introduce the concepts of stochastic processes, mean and covariance

More information

MODELING INFLATION RATES IN NIGERIA: BOX-JENKINS APPROACH. I. U. Moffat and A. E. David Department of Mathematics & Statistics, University of Uyo, Uyo

MODELING INFLATION RATES IN NIGERIA: BOX-JENKINS APPROACH. I. U. Moffat and A. E. David Department of Mathematics & Statistics, University of Uyo, Uyo Vol.4, No.2, pp.2-27, April 216 MODELING INFLATION RATES IN NIGERIA: BOX-JENKINS APPROACH I. U. Moffat and A. E. David Department of Mathematics & Statistics, University of Uyo, Uyo ABSTRACT: This study

More information

Time Series Examples Sheet

Time Series Examples Sheet Lent Term 2001 Richard Weber Time Series Examples Sheet This is the examples sheet for the M. Phil. course in Time Series. A copy can be found at: http://www.statslab.cam.ac.uk/~rrw1/timeseries/ Throughout,

More information

Chapter 8: Model Diagnostics

Chapter 8: Model Diagnostics Chapter 8: Model Diagnostics Model diagnostics involve checking how well the model fits. If the model fits poorly, we consider changing the specification of the model. A major tool of model diagnostics

More information

Circle the single best answer for each multiple choice question. Your choice should be made clearly.

Circle the single best answer for each multiple choice question. Your choice should be made clearly. TEST #1 STA 4853 March 6, 2017 Name: Please read the following directions. DO NOT TURN THE PAGE UNTIL INSTRUCTED TO DO SO Directions This exam is closed book and closed notes. There are 32 multiple choice

More information

Dynamic models. Dependent data The AR(p) model The MA(q) model Hidden Markov models. 6 Dynamic models

Dynamic models. Dependent data The AR(p) model The MA(q) model Hidden Markov models. 6 Dynamic models 6 Dependent data The AR(p) model The MA(q) model Hidden Markov models Dependent data Dependent data Huge portion of real-life data involving dependent datapoints Example (Capture-recapture) capture histories

More information

Ch. 15 Forecasting. 1.1 Forecasts Based on Conditional Expectations

Ch. 15 Forecasting. 1.1 Forecasts Based on Conditional Expectations Ch 15 Forecasting Having considered in Chapter 14 some of the properties of ARMA models, we now show how they may be used to forecast future values of an observed time series For the present we proceed

More information

Econometrics of financial markets, -solutions to seminar 1. Problem 1

Econometrics of financial markets, -solutions to seminar 1. Problem 1 Econometrics of financial markets, -solutions to seminar 1. Problem 1 a) Estimate with OLS. For any regression y i α + βx i + u i for OLS to be unbiased we need cov (u i,x j )0 i, j. For the autoregressive

More information

Statistics of stochastic processes

Statistics of stochastic processes Introduction Statistics of stochastic processes Generally statistics is performed on observations y 1,..., y n assumed to be realizations of independent random variables Y 1,..., Y n. 14 settembre 2014

More information

Ross Bettinger, Analytical Consultant, Seattle, WA

Ross Bettinger, Analytical Consultant, Seattle, WA ABSTRACT DYNAMIC REGRESSION IN ARIMA MODELING Ross Bettinger, Analytical Consultant, Seattle, WA Box-Jenkins time series models that contain exogenous predictor variables are called dynamic regression

More information

A time series is called strictly stationary if the joint distribution of every collection (Y t

A time series is called strictly stationary if the joint distribution of every collection (Y t 5 Time series A time series is a set of observations recorded over time. You can think for example at the GDP of a country over the years (or quarters) or the hourly measurements of temperature over a

More information

Part III Example Sheet 1 - Solutions YC/Lent 2015 Comments and corrections should be ed to

Part III Example Sheet 1 - Solutions YC/Lent 2015 Comments and corrections should be  ed to TIME SERIES Part III Example Sheet 1 - Solutions YC/Lent 2015 Comments and corrections should be emailed to Y.Chen@statslab.cam.ac.uk. 1. Let {X t } be a weakly stationary process with mean zero and let

More information

Lecture 1: Stationary Time Series Analysis

Lecture 1: Stationary Time Series Analysis Syllabus Stationarity ARMA AR MA Model Selection Estimation Forecasting Lecture 1: Stationary Time Series Analysis 222061-1617: Time Series Econometrics Spring 2018 Jacek Suda Syllabus Stationarity ARMA

More information

18.S096 Problem Set 4 Fall 2013 Time Series Due Date: 10/15/2013

18.S096 Problem Set 4 Fall 2013 Time Series Due Date: 10/15/2013 18.S096 Problem Set 4 Fall 2013 Time Series Due Date: 10/15/2013 1. Covariance Stationary AR(2) Processes Suppose the discrete-time stochastic process {X t } follows a secondorder auto-regressive process

More information

Levinson Durbin Recursions: I

Levinson Durbin Recursions: I Levinson Durbin Recursions: I note: B&D and S&S say Durbin Levinson but Levinson Durbin is more commonly used (Levinson, 1947, and Durbin, 1960, are source articles sometimes just Levinson is used) recursions

More information

Final Examination 7/6/2011

Final Examination 7/6/2011 The Islamic University of Gaza Faculty of Commerce Department of Economics & Applied Statistics Time Series Analysis - Dr. Samir Safi Spring Semester 211 Final Examination 7/6/211 Name: ID: INSTRUCTIONS:

More information

Part 1. Multiple Choice (50 questions, 1 point each) Part 2. Problems/Short Answer (10 questions, 5 points each)

Part 1. Multiple Choice (50 questions, 1 point each) Part 2. Problems/Short Answer (10 questions, 5 points each) GROUND RULES: This exam contains two parts: Part 1. Multiple Choice (50 questions, 1 point each) Part 2. Problems/Short Answer (10 questions, 5 points each) The maximum number of points on this exam is

More information