Time Series Examples Sheet


Lent Term 2001, Richard Weber

This is the examples sheet for the M.Phil. course in Time Series. A copy can be found at http://www.statslab.cam.ac.uk/~rrw1/timeseries/

Throughout, unless otherwise stated, the sequence $\{\epsilon_t\}$ is white noise with variance $\sigma^2$.

1. Find the Yule-Walker equations for the AR(2) process
$$X_t = \tfrac{1}{3}X_{t-1} + \tfrac{2}{9}X_{t-2} + \epsilon_t.$$
Hence show that it has autocorrelation function
$$\rho_k = \tfrac{16}{21}\left(\tfrac{2}{3}\right)^{|k|} + \tfrac{5}{21}\left(-\tfrac{1}{3}\right)^{|k|}, \qquad k \in \mathbb{Z}.$$

[ The Yule-Walker equations are
$$\rho_k - \tfrac{1}{3}\rho_{k-1} - \tfrac{2}{9}\rho_{k-2} = 0, \qquad k \geq 2.$$
On trying $\rho_k = A\lambda^k$, we require $\lambda^2 - \tfrac{1}{3}\lambda - \tfrac{2}{9} = 0$. This has roots $\tfrac{2}{3}$ and $-\tfrac{1}{3}$, so
$$\rho_k = A\left(\tfrac{2}{3}\right)^{|k|} + B\left(-\tfrac{1}{3}\right)^{|k|},$$
where $\rho_0 = A + B = 1$. We also require $\rho_{-1} = \rho_1$, so the $k = 1$ equation $\rho_1 = \tfrac{1}{3}\rho_0 + \tfrac{2}{9}\rho_{-1}$ gives $\rho_1 = \tfrac{3}{7}$, and thus we require $\tfrac{2}{3}A - \tfrac{1}{3}B = \tfrac{3}{7}$. These give $A = \tfrac{16}{21}$, $B = \tfrac{5}{21}$. ]
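
As a quick sanity check, the Yule-Walker recursion and the closed form can be compared numerically (a Python sketch, not part of the original sheet; the truncation length $K = 10$ is arbitrary):

```python
import numpy as np

# AR(2) of Question 1: X_t = (1/3) X_{t-1} + (2/9) X_{t-2} + eps_t
phi1, phi2 = 1/3, 2/9

# iterate the Yule-Walker recursion rho_k = phi1 rho_{k-1} + phi2 rho_{k-2}
K = 10
rho = np.empty(K)
rho[0] = 1.0
rho[1] = phi1 / (1 - phi2)        # the k = 1 equation gives rho_1 = 3/7
for k in range(2, K):
    rho[k] = phi1 * rho[k-1] + phi2 * rho[k-2]

# closed form: rho_k = (16/21)(2/3)^|k| + (5/21)(-1/3)^|k|
k = np.arange(K)
closed = (16/21) * (2/3)**k + (5/21) * (-1/3)**k

assert np.allclose(rho, closed)   # agreement to machine precision
print(np.c_[rho, closed])
```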

2. Let $X_t = A\cos(\Omega t + U)$, where $A$ is an arbitrary constant, $\Omega$ and $U$ are independent random variables, $\Omega$ has distribution function $F$ over $[0, \pi]$, and $U$ is uniform over $[0, 2\pi]$. Find the autocorrelation function and spectral density function of $\{X_t\}$. Hence show that, for any positive definite set of covariances $\{\gamma_k\}$, there exists a process with autocovariances $\{\gamma_k\}$ such that every realization is a sine wave. [Use the following definition: $\{\gamma_k\}$ are positive definite if there exists a nondecreasing function $F$ such that $\gamma_k = \int_{-\pi}^{\pi} e^{ik\omega}\,dF(\omega)$.]

[ First,
$$E[X_t \mid \Omega] = \frac{1}{2\pi}\int_0^{2\pi} A\cos(\Omega t + u)\,du = \frac{1}{2\pi}\Big[A\sin(\Omega t + u)\Big]_{u=0}^{u=2\pi} = 0.$$
Then
$$E[X_{t+s}X_t] = \frac{1}{2\pi}\int_0^{\pi}\!\!\int_0^{2\pi} A\cos(\omega(t+s)+u)\,A\cos(\omega t+u)\,du\,dF(\omega)$$
$$= \frac{1}{4\pi}\int_0^{\pi}\!\!\int_0^{2\pi} A^2\big[\cos(\omega(2t+s)+2u) + \cos(\omega s)\big]\,du\,dF(\omega)$$
$$= \frac{1}{2}\int_0^{\pi} A^2\cos(\omega s)\,dF(\omega) = \frac{1}{4}\int_0^{\pi} A^2\big[e^{i\omega s} + e^{-i\omega s}\big]\,dF(\omega) = \frac{A^2}{4}\int_{-\pi}^{\pi} e^{i\omega s}\,d\tilde F(\omega),$$
where we define over the range $[-\pi, \pi]$ the nondecreasing function $\tilde F$ by $\tilde F(-\omega) = F(\pi) - F(\omega)$ and $\tilde F(\omega) = F(\omega) + F(\pi) - 2F(0)$, for $\omega \in [0, \pi]$. ]
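
A small ensemble simulation illustrates the covariance formula (an illustrative sketch; the two-point distribution chosen for $\Omega$, and all the numbers, are made up for the example):

```python
import numpy as np

rng = np.random.default_rng(0)
A = 2.0
# a simple choice of F: Omega takes the values 0.5 and 2.0
# with probabilities 0.3 and 0.7
omegas = np.array([0.5, 2.0])
probs = np.array([0.3, 0.7])

n = 200_000                           # independent realizations of (Omega, U)
Om = rng.choice(omegas, size=n, p=probs)
U = rng.uniform(0, 2*np.pi, size=n)

for s in range(5):
    # ensemble estimate of E[X_{t+s} X_t] at t = 0 (the process is stationary)
    emp = np.mean(A*np.cos(Om*s + U) * A*np.cos(U))
    theory = (A**2/2) * np.sum(probs * np.cos(omegas*s))
    print(s, round(emp, 3), round(theory, 3))
```

Every realization here is a single sine wave, yet the ensemble autocovariances match $(A^2/2)\int_0^\pi \cos(\omega s)\,dF(\omega)$, which is the point of the question.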

3. Find the spectral density function of the AR(2) process $X_t = \phi_1 X_{t-1} + \phi_2 X_{t-2} + \epsilon_t$. What conditions on $(\phi_1, \phi_2)$ are required for this process to be indeterministic second-order stationary? Sketch the stationary region in the $(\phi_1, \phi_2)$ plane.

[ We have
$$f_X(\omega)\,\big|1 - \phi_1 e^{-i\omega} - \phi_2 e^{-2i\omega}\big|^2 = \sigma^2/\pi.$$
Hence
$$f_X(\omega) = \frac{\sigma^2/\pi}{1 + \phi_1^2 + \phi_2^2 - 2\phi_1(1 - \phi_2)\cos\omega - 2\phi_2\cos 2\omega}.$$
The Yule-Walker equations have solutions of the form $\rho_k = A\lambda_1^{|k|} + B\lambda_2^{|k|}$, where $\lambda_1, \lambda_2$ are roots of $g(\lambda) = \lambda^2 - \phi_1\lambda - \phi_2 = 0$. The roots are $\lambda = \big[\phi_1 \pm \sqrt{\phi_1^2 + 4\phi_2}\,\big]/2$. To be indeterministic second-order stationary these roots must have modulus less than 1. If $\phi_1^2 + 4\phi_2 > 0$ then the roots are real and lie in $(-1, 1)$ if and only if $g(-1) > 0$ and $g(1) > 0$, i.e., $\phi_1 + \phi_2 < 1$ and $\phi_1 - \phi_2 > -1$. If $\phi_1^2 + 4\phi_2 < 0$ then the roots are complex and their product, $-\phi_2$, must be less than 1, i.e., $\phi_2 > -1$. The union of these two regions, corresponding to possible $(\phi_1, \phi_2)$ with real and complex roots, is simply the triangular region
$$\phi_1 + \phi_2 < 1, \qquad \phi_1 - \phi_2 > -1, \qquad \phi_2 > -1. \;]$$
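
A numerical check of the expansion of $|1 - \phi_1 e^{-i\omega} - \phi_2 e^{-2i\omega}|^2$ (a sketch; the coefficients are an arbitrary point inside the stationarity triangle):

```python
import numpy as np

phi1, phi2, sigma2 = 0.5, -0.3, 1.0   # inside the stationarity triangle
w = np.linspace(0.0, np.pi, 7)

# direct form: (sigma^2/pi) / |1 - phi1 e^{-iw} - phi2 e^{-2iw}|^2
direct = (sigma2/np.pi) / np.abs(1 - phi1*np.exp(-1j*w) - phi2*np.exp(-2j*w))**2

# expanded form from the solution
denom = 1 + phi1**2 + phi2**2 - 2*phi1*(1 - phi2)*np.cos(w) - 2*phi2*np.cos(2*w)
expanded = (sigma2/np.pi) / denom

assert np.allclose(direct, expanded)
print(expanded)
```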

4. For a stationary process define the covariance generating function
$$g(z) = \sum_{k=-\infty}^{\infty} \gamma_k z^k, \qquad |z| < 1.$$
Suppose $\{X_t\}$ satisfies $X = C(B)\epsilon$, that is, it has the Wold representation
$$X_t = \sum_{r=0}^{\infty} c_r \epsilon_{t-r},$$
where $\{c_r\}$ are constants satisfying $\sum_{r=0}^{\infty} c_r^2 < \infty$ and $C(z) = \sum_{r=0}^{\infty} c_r z^r$. Show that
$$g(z) = C(z)C(z^{-1})\sigma^2.$$
Explain how this can be used to derive autocovariances for the ARMA($p, q$) model. Hence show that for ARMA(1,1), $\rho_2^2 = \rho_1\rho_3$. How might this fact be useful?

[ We have
$$\gamma_k = E[X_t X_{t+k}] = E\Big[\sum_{r=0}^{\infty} c_r \epsilon_{t-r} \sum_{s=0}^{\infty} c_s \epsilon_{t+k-s}\Big] = \sigma^2 \sum_{r=0}^{\infty} c_r c_{k+r}.$$
Now
$$C(z)C(z^{-1}) = \sum_{r=0}^{\infty} c_r z^r \sum_{s=0}^{\infty} c_s z^{-s}.$$
The coefficients of $z^k$ and $z^{-k}$ are clearly $c_k c_0 + c_{k+1} c_1 + c_{k+2} c_2 + \cdots$, from which the result follows.

For the ARMA($p, q$) model, $\phi(B)X = \theta(B)\epsilon$, or $X = \frac{\theta(B)}{\phi(B)}\epsilon$, where $\phi$ and $\theta$ are polynomials of degrees $p$ and $q$. Hence $C(z) = \theta(z)/\phi(z)$, and $\gamma_k$ can be found as the coefficient of $z^k$ in the power series expansion of $\sigma^2\theta(z)\theta(1/z)/[\phi(z)\phi(1/z)]$. For ARMA(1,1), with $X_t = \phi X_{t-1} + \epsilon_t + \theta\epsilon_{t-1}$, this is
$$\sigma^2(1 + \theta z)(1 + \theta z^{-1})(1 + \phi z + \phi^2 z^2 + \cdots)(1 + \phi z^{-1} + \phi^2 z^{-2} + \cdots),$$
from which we have
$$\gamma_1 = \big[\theta(1 + \phi^2 + \phi^4 + \cdots) + (\phi + \phi^3 + \phi^5 + \cdots)(1 + \theta^2) + \theta(\phi^2 + \phi^4 + \phi^6 + \cdots)\big]\sigma^2 = \frac{\theta + \phi(1 + \theta^2) + \phi^2\theta}{1 - \phi^2}\,\sigma^2,$$
and similarly
$$\gamma_2 = \phi\,\frac{\theta + \phi(1 + \theta^2) + \phi^2\theta}{1 - \phi^2}\,\sigma^2, \qquad \gamma_3 = \phi^2\,\frac{\theta + \phi(1 + \theta^2) + \phi^2\theta}{1 - \phi^2}\,\sigma^2.$$
Hence $\rho_2^2 = \rho_1\rho_3$. This might be used as a diagnostic to test the appropriateness of an ARMA(1,1) model, by reference to the correlogram, where we would expect to see $r_2^2 \approx r_1 r_3$. ]
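
As a check (a sketch; $\phi = 0.6$, $\theta = 0.4$ are arbitrary), one can build the Wold coefficients of the ARMA(1,1) numerically and confirm $\rho_2^2 = \rho_1\rho_3$:

```python
import numpy as np

phi, theta, sigma2 = 0.6, 0.4, 1.0

# Wold coefficients of the ARMA(1,1): C(z) = (1 + theta z)/(1 - phi z)
K = 200
c = np.empty(K)
c[0] = 1.0
c[1] = phi + theta
for r in range(2, K):
    c[r] = phi * c[r-1]               # c_r = phi c_{r-1} for r >= 2

# gamma_k = sigma^2 * sum_r c_r c_{r+k}   (truncated; phi^K is negligible)
gamma = np.array([sigma2 * np.sum(c[:K-k] * c[k:]) for k in range(4)])
rho = gamma / gamma[0]

print(rho[1:])                        # rho_1, rho_2, rho_3
assert abs(rho[2]**2 - rho[1]*rho[3]) < 1e-12
```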

5. Consider the ARMA(2, 1) process defined as
$$X_t = \phi_1 X_{t-1} + \phi_2 X_{t-2} + \epsilon_t + \theta_1\epsilon_{t-1}.$$
Show that the coefficients of the Wold representation satisfy the difference equation
$$c_k = \phi_1 c_{k-1} + \phi_2 c_{k-2}, \qquad k \geq 2,$$
and hence that
$$c_k = A z_1^{-k} + B z_2^{-k},$$
where $z_1$ and $z_2$ are zeros of $\phi(z) = 1 - \phi_1 z - \phi_2 z^2$, and $A$ and $B$ are constants. Explain how in principle one could find $A$ and $B$.

[ The recurrence is produced by substituting $X_t = \sum_{r=0}^{\infty} c_r\epsilon_{t-r}$ into the defining equation, and similarly for $X_{t-1}$ and $X_{t-2}$, multiplying by $\epsilon_{t-k}$, $k \geq 2$, and taking the expected value. The general solution to such a second order linear recurrence relation is of the form given, since $z_1^{-1}$ and $z_2^{-1}$ are the roots of $\lambda^2 - \phi_1\lambda - \phi_2 = 0$. We find $A$ and $B$ by noting that
$$X_t = \phi_1(\phi_1 X_{t-2} + \phi_2 X_{t-3} + \epsilon_{t-1} + \theta_1\epsilon_{t-2}) + \phi_2 X_{t-2} + \epsilon_t + \theta_1\epsilon_{t-1},$$
so that $c_0 = 1$ and $c_1 = \theta_1 + \phi_1$. Hence $A + B = 1$ and $A z_1^{-1} + B z_2^{-1} = \theta_1 + \phi_1$. These can be solved for $A$ and $B$. ]
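
A sketch of "in principle one could find $A$ and $B$" made concrete (illustrative parameter values; any stationary choice works):

```python
import numpy as np

phi1, phi2, theta1 = 0.5, 0.2, 0.7    # illustrative values

# zeros z1, z2 of phi(z) = 1 - phi1 z - phi2 z^2
z1, z2 = np.roots([-phi2, -phi1, 1.0])

# solve A + B = 1 and A/z1 + B/z2 = theta1 + phi1
A, B = np.linalg.solve([[1.0, 1.0], [1/z1, 1/z2]], [1.0, theta1 + phi1])

# compare with the recursion c_k = phi1 c_{k-1} + phi2 c_{k-2}
c = [1.0, theta1 + phi1]
for k in range(2, 8):
    c.append(phi1*c[-1] + phi2*c[-2])

closed = [A*z1**(-k) + B*z2**(-k) for k in range(8)]
assert np.allclose(c, closed)
print(c)
```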

6. Suppose $Y_t = X_t + \epsilon_t$, $X_t = \alpha X_{t-1} + \eta_t$, where $\{\epsilon_t\}$ and $\{\eta_t\}$ are independent white noise sequences with common variance $\sigma^2$. Show that the spectral density function of $\{Y_t\}$ is
$$f_Y(\omega) = \frac{\sigma^2}{\pi}\left\{\frac{2 - 2\alpha\cos\omega + \alpha^2}{1 - 2\alpha\cos\omega + \alpha^2}\right\}.$$
For what values of $p, d, q$ is the autocovariance function of $\{Y_t\}$ identical to that of an ARIMA($p, d, q$) process?

[
$$f_Y(\omega) = f_X(\omega) + f_\epsilon(\omega) = \frac{1}{\big|1 - \alpha e^{-i\omega}\big|^2}f_\eta(\omega) + f_\epsilon(\omega) = \frac{\sigma^2}{\pi}\left\{\frac{1}{1 - 2\alpha\cos\omega + \alpha^2} + 1\right\} = \frac{\sigma^2}{\pi}\left\{\frac{2 - 2\alpha\cos\omega + \alpha^2}{1 - 2\alpha\cos\omega + \alpha^2}\right\}.$$
We recognise this as the spectral density of an ARMA(1, 1) model, so $(p, d, q) = (1, 0, 1)$. E.g., $Z_t - \alpha Z_{t-1} = \xi_t - \theta\xi_{t-1}$, choosing $\theta$ and $\sigma_\xi^2$ such that
$$(1 - 2\theta\cos\omega + \theta^2)\,\sigma_\xi^2 = \sigma^2(2 - 2\alpha\cos\omega + \alpha^2),$$
i.e., choosing $\theta$ such that $(1 + \theta^2)/\theta = (2 + \alpha^2)/\alpha$. ]
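
One can check the matching numerically (a sketch; $\alpha = 0.7$ is arbitrary, and the root with $|\theta| < 1$ is taken so that the ARMA(1,1) representation is invertible):

```python
import numpy as np

alpha, sigma2 = 0.7, 1.0

# choose theta with (1 + theta^2)/theta = (2 + alpha^2)/alpha
r = (2 + alpha**2) / alpha
theta = (r - np.sqrt(r**2 - 4)) / 2         # root with |theta| < 1
sigma2_xi = sigma2 * alpha / theta          # matches the cos(omega) terms

w = np.linspace(0.1, np.pi, 5)
f_Y = (sigma2/np.pi) * (2 - 2*alpha*np.cos(w) + alpha**2) \
      / (1 - 2*alpha*np.cos(w) + alpha**2)
f_ARMA = (sigma2_xi/np.pi) * (1 - 2*theta*np.cos(w) + theta**2) \
         / (1 - 2*alpha*np.cos(w) + alpha**2)

assert np.allclose(f_Y, f_ARMA)
print(theta, sigma2_xi)
```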

7. Suppose $X_1, \ldots, X_T$ are values of a time series. Prove that
$$\hat\gamma_0 + 2\sum_{k=1}^{T-1}\hat\gamma_k = 0,$$
where $\hat\gamma_k$ is the usual estimator of the $k$th order autocovariance,
$$\hat\gamma_k = \frac{1}{T}\sum_{t=k+1}^{T}(X_t - \bar X)(X_{t-k} - \bar X).$$
Hint: Consider $0 = \sum_{t=1}^{T}(X_t - \bar X)$. Hence deduce that not all ordinates of the correlogram can have the same sign.

Suppose $f(\cdot)$ is the spectral density and $I(\cdot)$ the periodogram. Suppose $f$ is continuous and $f(0) \neq 0$. Does $EI(2\pi/T) \to f(0)$ as $T \to \infty$?

[ The result follows directly from
$$0 = \frac{1}{T}\Big[\sum_{t=1}^{T}(X_t - \bar X)\Big]^2 = \hat\gamma_0 + 2\sum_{k=1}^{T-1}\hat\gamma_k.$$
Note that formally
$$I(0) = \frac{1}{\pi}\Big[\hat\gamma_0 + 2\sum_{k=1}^{T-1}\hat\gamma_k\Big] = 0,$$
so it might appear that $EI(2\pi/T) \to I(0) = 0 \neq f(0)$ as $T \to \infty$. However, this would be mistaken. It is a theorem that as $T \to \infty$, $I(\omega_j) \to f(\omega_j)\chi_2^2/2$ in distribution; since $E[\chi_2^2/2] = 1$, for large $T$ we have $EI(2\pi/T) \approx f(0)$. So the answer is yes. ]
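
The identity is purely algebraic, so any series demonstrates it (a short sketch):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=100)          # any series works; the identity is algebraic
T = len(X)
Xc = X - X.mean()

gamma_hat = np.array([np.sum(Xc[k:] * Xc[:T-k]) / T for k in range(T)])
print(gamma_hat[0] + 2 * gamma_hat[1:].sum())   # zero, up to rounding error
```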

8. Suppose $I(\cdot)$ is the periodogram of $\epsilon_1, \ldots, \epsilon_T$, where these are i.i.d. $N(0, 1)$ and $T = 2m + 1$. Let $\omega_j = 2\pi j/T$ and $\omega_k = 2\pi k/T$ be two distinct Fourier frequencies. Show that $I(\omega_j)$ and $I(\omega_k)$ are independent random variables. What are their distributions?

If it is suspected that $\{\epsilon_t\}$ departs from white noise because of the presence of a single harmonic component at some unknown frequency $\omega$, a natural test statistic is the maximum periodogram ordinate
$$T^* = \max_{j=1,\ldots,m} I(\omega_j).$$
Show that under the hypothesis that $\{\epsilon_t\}$ is white noise,
$$P(T^* > t) = 1 - \big\{1 - \exp(-\pi t/\sigma^2)\big\}^m.$$

[ The independence of $I(\omega_j)$ and $I(\omega_k)$ was proved in lectures. Their distributions are $(\sigma^2/2\pi)\chi_2^2$, which is equivalent to the exponential distribution with mean $\sigma^2/\pi$. Hence the probability that the maximum is less than $t$ is the probability that all the ordinates are, i.e.,
$$P(T^* < t) = \big\{1 - \exp(-\pi t/\sigma^2)\big\}^m. \;]$$
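
A Monte Carlo check of this null distribution (a sketch; $m = 20$ and the grid of thresholds are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
m = 20
T = 2*m + 1
sigma2 = 1.0
reps = 20_000

eps = rng.normal(scale=np.sqrt(sigma2), size=(reps, T))

# periodogram I(w_j) = |sum_t eps_t e^{i t w_j}|^2 / (pi T), w_j = 2 pi j / T
j = np.arange(1, m+1)
E = np.exp(1j * np.outer(np.arange(T), 2*np.pi*j/T))   # T x m phases
I = np.abs(eps @ E)**2 / (np.pi * T)                   # reps x m ordinates
Tstar = I.max(axis=1)

for t in [0.5, 1.0, 1.5, 2.0]:
    emp = np.mean(Tstar > t)
    theory = 1 - (1 - np.exp(-np.pi*t/sigma2))**m
    print(t, round(emp, 3), round(theory, 3))
```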

9. Complete this sketch of the fast Fourier transform. From data $X_0, \ldots, X_T$, with $T = 2^M - 1$, we want to compute the $2^{M-1}$ ordinates of the periodogram
$$I(\omega_j) = \frac{1}{\pi T}\Big|\sum_{t=0}^{T} X_t e^{it2\pi j/2^M}\Big|^2, \qquad j = 1, \ldots, 2^{M-1}.$$
This requires order $T$ multiplications for each $j$, and so order $T^2$ multiplications in all. However,
$$\sum_{t=0}^{2^M-1} X_t e^{it2\pi j/2^M} = \sum_{t \text{ even}} X_t e^{it2\pi j/2^M} + \sum_{t \text{ odd}} X_t e^{it2\pi j/2^M} = \sum_{t=0}^{2^{M-1}-1} X_{2t}\, e^{i2t\cdot 2\pi j/2^M} + \sum_{t=0}^{2^{M-1}-1} X_{2t+1}\, e^{i(2t+1)2\pi j/2^M}$$
$$= \sum_{t=0}^{2^{M-1}-1} X_{2t}\, e^{it2\pi j/2^{M-1}} + e^{i2\pi j/2^M} \sum_{t=0}^{2^{M-1}-1} X_{2t+1}\, e^{it2\pi j/2^{M-1}}.$$
Note that the value of either sum on the right hand side at $j = k$ is the complex conjugate of its value at $j = 2^{M-1} - k$; so these sums need only be computed for $j = 1, \ldots, 2^{M-2}$. Thus we have two sums, each of which is similar to the sum on the left hand side, but for a problem half as large. Suppose the computational effort required to work out each right hand side sum (for all $2^{M-2}$ values of $j$) is $\Theta(M-1)$. The sum on the left hand side is obtained (for all $2^{M-1}$ values of $j$) by combining the right hand side sums, with further computational effort of order $2^{M-1}$. Explain
$$\Theta(M) = a2^{M-1} + 2\Theta(M-1).$$
Hence deduce that $I(\cdot)$ can be computed (by the FFT) in time $O(T\log_2 T)$.

[ The derivation of the recurrence for $\Theta(M)$ should be obvious. With $\Theta(1) = a$, it gives $\Theta(M) = aM2^{M-1} = O(T\log_2 T)$. ]
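
The divide-and-conquer step translates directly into code. Below is a minimal recursive radix-2 FFT (an illustrative sketch, not from the sheet; it follows the question's $e^{+it\omega}$ sign convention, which is why it is checked against numpy's inverse transform):

```python
import numpy as np

def fft_recursive(x):
    """Radix-2 decimation-in-time FFT; len(x) must be a power of two.

    Returns X[j] = sum_t x[t] e^{i t 2 pi j / n}: the even/odd split
    of the question, applied recursively.
    """
    n = len(x)
    if n == 1:
        return np.array(x, dtype=complex)
    even = fft_recursive(x[0::2])         # problem of half the size
    odd = fft_recursive(x[1::2])
    twiddle = np.exp(2j*np.pi*np.arange(n//2)/n)
    # combine: order-n extra work at each of the log2(n) recursion levels
    return np.concatenate([even + twiddle*odd, even - twiddle*odd])

rng = np.random.default_rng(3)
M = 8
x = rng.normal(size=2**M)
n = len(x)

I = np.abs(fft_recursive(x))**2 / (np.pi*n)             # periodogram ordinates
assert np.allclose(fft_recursive(x), np.fft.ifft(x)*n)  # same sign convention
print(I[:4])
```

Each level of the recursion does order-$n$ combining work and there are $\log_2 n$ levels, which is exactly the recurrence $\Theta(M) = a2^{M-1} + 2\Theta(M-1)$.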

10. Suppose we have the ARMA(1, 1) process
$$X_t = \phi X_{t-1} + \epsilon_t + \theta\epsilon_{t-1},$$
with $|\phi| < 1$, $|\theta| < 1$, $\phi + \theta \neq 0$, observed up to time $T$, and we want to calculate the $k$-step ahead forecasts $\hat X_{T,k}$, $k \geq 1$. Derive a recursive formula to calculate $\hat X_{T,k}$ for $k = 1$ and $k = 2$.

[
$$\hat X_{T,1} = \phi X_T + \hat\epsilon_{T+1} + \theta\hat\epsilon_T = \phi X_T + \theta(X_T - \hat X_{T-1,1}),$$
$$\hat X_{T,2} = \phi\hat X_{T,1} + \hat\epsilon_{T+2} + \theta\hat\epsilon_{T+1} = \phi\hat X_{T,1},$$
since $\hat\epsilon_{T+1} = \hat\epsilon_{T+2} = 0$ and $\hat\epsilon_T = X_T - \hat X_{T-1,1}$. ]
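
A sketch of the recursion (illustrative; $\phi = 0.6$, $\theta = 0.4$ and the start-up value $\hat X_{0,1} = 0$ are arbitrary choices, and the effect of the start-up value decays geometrically):

```python
import numpy as np

phi, theta = 0.6, 0.4
rng = np.random.default_rng(4)

# simulate a sample path of X_t = phi X_{t-1} + eps_t + theta eps_{t-1}
T = 500
eps = rng.normal(size=T+1)
X = np.zeros(T+1)
for t in range(1, T+1):
    X[t] = phi*X[t-1] + eps[t] + theta*eps[t-1]

# run the recursion X^_{t,1} = phi X_t + theta (X_t - X^_{t-1,1})
xhat = 0.0                            # start-up value for X^_{0,1}
for t in range(1, T+1):
    eps_hat = X[t] - xhat             # estimate of eps_t
    xhat = phi*X[t] + theta*eps_hat

print("one-step:", xhat, "  two-step:", phi*xhat)
print("innovation estimate vs actual:", eps_hat, eps[T])
```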

11. Consider the stationary scalar-valued process $\{X_t\}$ generated by the moving average
$$X_t = \epsilon_t - \theta\epsilon_{t-1}.$$
Determine the linear least-squares predictor of $X_t$ in terms of $X_{t-1}, X_{t-2}, \ldots$.

[ We can directly apply our results to give
$$\hat X_{t-1,1} = -\theta\hat\epsilon_{t-1} = -\theta\big[X_{t-1} - \hat X_{t-2,1}\big] = -\theta X_{t-1} + \theta\hat X_{t-2,1} = -\theta X_{t-1} + \theta\big[-\theta X_{t-2} + \theta\hat X_{t-3,1}\big] = \cdots = -\theta X_{t-1} - \theta^2 X_{t-2} - \theta^3 X_{t-3} - \cdots.$$
Alternatively, take the linear predictor as $\hat X_{t-1,1} = \sum_{r=1}^{\infty} a_r X_{t-r}$ and seek to minimize $E[X_t - \hat X_{t-1,1}]^2$. We have
$$E[X_t - \hat X_{t-1,1}]^2 = E\Big[\epsilon_t - \theta\epsilon_{t-1} - \sum_{r=1}^{\infty} a_r(\epsilon_{t-r} - \theta\epsilon_{t-r-1})\Big]^2 = \sigma^2\big[1 + (\theta + a_1)^2 + (\theta a_1 - a_2)^2 + (\theta a_2 - a_3)^2 + \cdots\big].$$
Note that all terms but the first are minimized to 0 by taking $a_r = -\theta^r$. ]
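
A numerical check (an illustrative sketch, truncating the predictor at $R = 30$ lags): the mean squared prediction error should approach $\sigma^2$, the innovation variance.

```python
import numpy as np

theta, sigma2 = 0.5, 1.0
rng = np.random.default_rng(5)

n = 200_000
eps = rng.normal(scale=np.sqrt(sigma2), size=n)
X = eps[1:] - theta*eps[:-1]          # X_t = eps_t - theta eps_{t-1}

# truncated predictor X^_t = -sum_{r=1}^{R} theta^r X_{t-r}
R = 30
pred = np.zeros(len(X))
for r in range(1, R+1):
    pred[R:] += -theta**r * X[R-r:len(X)-r]

mse = np.mean((X[R:] - pred[R:])**2)
print(mse)        # close to sigma^2 = 1, the one-step prediction error bound
```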

12. Consider the ARIMA(0, 2, 2) model
$$(I - B)^2 X = (I - 0.81B + 0.38B^2)\epsilon,$$
where $\{\epsilon_t\}$ is white noise with variance 1.

(a) With data up to time $T$, calculate the $k$-step ahead optimal forecast $\hat X_{T,k}$ for all $k \geq 1$. By giving a general formula relating $\hat X_{T,k}$, $k \geq 3$, to $\hat X_{T,1}$ and $\hat X_{T,2}$, determine the curve on which all these forecasts lie.

[ The model is
$$X_t = 2X_{t-1} - X_{t-2} + \epsilon_t - 0.81\epsilon_{t-1} + 0.38\epsilon_{t-2}.$$
Hence
$$\hat X_{T,1} = 2X_T - X_{T-1} + \hat\epsilon_{T+1} - 0.81\hat\epsilon_T + 0.38\hat\epsilon_{T-1} = 2X_T - X_{T-1} - 0.81\big[X_T - \hat X_{T-1,1}\big] + 0.38\big[X_{T-1} - \hat X_{T-2,1}\big],$$
and similarly
$$\hat X_{T,2} = 2\hat X_{T,1} - X_T + 0.38\big[X_T - \hat X_{T-1,1}\big],$$
$$\hat X_{T,k} = 2\hat X_{T,k-1} - \hat X_{T,k-2}, \qquad k \geq 3.$$
This implies that the forecasts lie on a straight line. ]

(b) Suppose now that $T = 95$. Calculate numerically the forecasts $\hat X_{95,k}$, $k = 1, 2, 3$, and their mean squared prediction errors when the last five observations are $X_{91} = 15.1$, $X_{92} = 15.8$, $X_{93} = 15.9$, $X_{94} = 15.2$, $X_{95} = 15.$ [You will need estimates for $\epsilon_{94}$ and $\epsilon_{95}$. Start by assuming $\epsilon_{91} = \epsilon_{92} = 0$, then calculate $\hat\epsilon_{93} = \epsilon_{93} = X_{93} - \hat X_{92,1}$, and so on, until $\epsilon_{94}$ and $\epsilon_{95}$ are obtained.]

[ Using the above formulae we obtain a table of $t$, $X_t$, $\hat X_{t,1}$, $\hat X_{t,2}$, $\hat X_{t,3}$, $\hat\epsilon_t$ for $t = 91, \ldots, 95$ (numerical entries omitted). Now
$$X_{T+k} = \sum_{r=0}^{\infty} c_r \epsilon_{T+k-r} \quad\text{and}\quad \hat X_{T,k} = \sum_{r=k}^{\infty} c_r \epsilon_{T+k-r}.$$
Thus
$$E\big[X_{T+k} - \hat X_{T,k}\big]^2 = \sigma_\epsilon^2 \sum_{r=0}^{k-1} c_r^2,$$
where $\sigma_\epsilon^2 = 1$. Now
$$X_T = \epsilon_T + (2 - 0.81)\epsilon_{T-1} + (3 - 2(0.81) + 0.38)\epsilon_{T-2} + \cdots = \epsilon_T + 1.19\,\epsilon_{T-1} + 1.76\,\epsilon_{T-2} + \cdots$$
Hence the mean squared errors of $\hat X_{95,1}$, $\hat X_{95,2}$, $\hat X_{95,3}$ are respectively $1$, $1 + 1.19^2 = 2.4161$ and $1 + 1.19^2 + 1.76^2 = 5.5137$. ]
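
The $c_r$ and the mean squared errors can be checked mechanically (a sketch, assuming the MA polynomial $1 - 0.81B + 0.38B^2$ as above; it iterates $c_r = 2c_{r-1} - c_{r-2} + \theta_r$, which follows from $C(z)(1 - z)^2 = \theta(z)$):

```python
import numpy as np

# theta_0 = 1, theta_1 = -0.81, theta_2 = 0.38; zero beyond lag 2
theta = {0: 1.0, 1: -0.81, 2: 0.38}

K = 6
c = []
for r in range(K):
    val = theta.get(r, 0.0)       # c_r = 2 c_{r-1} - c_{r-2} + theta_r
    if r >= 1:
        val += 2*c[r-1]
    if r >= 2:
        val -= c[r-2]
    c.append(val)

print(c[:3])                      # 1.0, 1.19, 1.76
mse = np.cumsum(np.square(c))     # k-step MSE = sum_{r < k} c_r^2
print(mse[:3])                    # 1.0, 2.4161, 5.5137
```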

13. Consider the state space model
$$X_t = S_t + v_t, \qquad S_t = S_{t-1} + w_t,$$
where $X_t$ and $S_t$ are both scalars, $X_t$ is observed, $S_t$ is unobserved, and $\{v_t\}$, $\{w_t\}$ are Gaussian white noise sequences with variances $V$ and $W$ respectively. Write down the Kalman filtering equations for $\hat S_t$ and $P_t$. Show that $P_t \equiv P$ (independently of $t$) if and only if $P^2 + PW = WV$, and show that in this case the Kalman filter for $\hat S_t$ is equivalent to exponential smoothing.

[ This is the same as Section 8.4 of the notes. Here $F_t = 1$, $G_t = 1$, $V_t = V$, $W_t = W$, and $R_t = P_{t-1} + W$. So if $(S_{t-1} \mid X_1, \ldots, X_{t-1}) \sim N(\hat S_{t-1}, P_{t-1})$ then $(S_t \mid X_1, \ldots, X_t) \sim N(\hat S_t, P_t)$, where
$$\hat S_t = \hat S_{t-1} + R_t(V + R_t)^{-1}(X_t - \hat S_{t-1}),$$
$$P_t = R_t - \frac{R_t^2}{V + R_t} = \frac{VR_t}{V + R_t} = \frac{V(P_{t-1} + W)}{V + P_{t-1} + W}.$$
$P_t$ is constant if $P_t = P$, where $P$ is the positive root of $P^2 + WP - WV = 0$. In this case $\hat S_t$ behaves like
$$\hat S_t = (1 - \alpha)\sum_{r=0}^{\infty} \alpha^r X_{t-r}, \qquad \text{where } \alpha = V/(V + W + P).$$
This is simple exponential smoothing. ]
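
A sketch of the filter in Python (illustrative values $V = 1$, $W = 0.5$, for which $P = 0.5$ and $\alpha = 0.5$), checking both that $P_t$ converges to the positive root and that the steady-state filter coincides with exponential smoothing:

```python
import numpy as np

V, W = 1.0, 0.5
rng = np.random.default_rng(6)

# steady-state error variance: positive root of P^2 + W P - W V = 0
P = (-W + np.sqrt(W**2 + 4*W*V)) / 2
alpha = V / (V + W + P)

# simulate the local-level model and run the Kalman filter
T = 300
S = np.cumsum(rng.normal(scale=np.sqrt(W), size=T))      # random-walk state
X = S + rng.normal(scale=np.sqrt(V), size=T)             # noisy observations

s_hat, p = 0.0, 10.0          # diffuse-ish start
for t in range(T):
    R = p + W
    s_hat = s_hat + R/(V + R) * (X[t] - s_hat)
    p = V*R / (V + R)

print(p, P)                   # p has converged to the steady-state P

# with p = P, the filter is exponential smoothing with parameter alpha
smooth = 0.0
for t in range(T):
    smooth = alpha*smooth + (1 - alpha)*X[t]
print(s_hat, smooth)          # close, up to start-up transients
```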
