Time Series Examples Sheet

Lent Term 2001
Richard Weber

This is the examples sheet for the M.Phil. course in Time Series. A copy can be found at: http://www.statslab.cam.ac.uk/~rrw1/timeseries/

Throughout, unless otherwise stated, the sequence $\{\epsilon_t\}$ is white noise with variance $\sigma^2$.

1. Find the Yule-Walker equations for the AR(2) process
$$X_t = \tfrac{1}{3}X_{t-1} + \tfrac{2}{9}X_{t-2} + \epsilon_t.$$
Hence show that it has autocorrelation function
$$\rho_k = \frac{16}{21}\left(\frac{2}{3}\right)^{|k|} + \frac{5}{21}\left(-\frac{1}{3}\right)^{|k|}, \qquad k \in \mathbb{Z}.$$
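
The closed form can be sanity-checked numerically: the Yule-Walker recursion $\rho_k = \frac{1}{3}\rho_{k-1} + \frac{2}{9}\rho_{k-2}$ (with $\rho_0 = 1$ and $\rho_1 = 3/7$) should reproduce it. A minimal sketch, not part of the original sheet, assuming Python with numpy:

```python
# Compare the claimed closed form for rho_k with the Yule-Walker
# recursion rho_k = (1/3) rho_{k-1} + (2/9) rho_{k-2}.
import numpy as np

K = 20
rho = np.empty(K)
rho[0] = 1.0
rho[1] = 3.0 / 7.0          # from rho_1 = 1/3 + (2/9) rho_1
for k in range(2, K):
    rho[k] = rho[k - 1] / 3.0 + 2.0 * rho[k - 2] / 9.0

closed = (16/21) * (2/3) ** np.arange(K) + (5/21) * (-1/3) ** np.arange(K)
print(np.allclose(rho, closed))   # True
```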

2. Let $X_t = A\cos(\Omega t + U)$, where $A$ is an arbitrary constant, $\Omega$ and $U$ are independent random variables, $\Omega$ has distribution function $F$ over $[0, \pi]$, and $U$ is uniform over $[0, 2\pi]$. Find the autocorrelation function and spectral density function of $\{X_t\}$. Hence show that, for any positive definite set of covariances $\{\gamma_k\}$, there exists a process with autocovariances $\{\gamma_k\}$ such that every realization is a sine wave.

[Use the following definition: $\{\gamma_k\}$ are positive definite if there exists a nondecreasing function $F$ such that $\gamma_k = \int_{-\pi}^{\pi} e^{ik\omega}\, dF(\omega)$.]
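
A Monte Carlo illustration of the autocovariance, assuming for concreteness that $F$ is uniform on $[0,\pi]$ (a hypothetical choice for the sketch): averaging over the ensemble, $E[X_t X_{t+k}] = \frac{A^2}{2}E[\cos(k\Omega)]$.

```python
# Ensemble estimate of gamma_k versus (A^2/2) E cos(k * Omega),
# with the assumption Omega ~ U[0, pi].
import numpy as np

rng = np.random.default_rng(0)
A, k, n = 2.0, 3, 200_000
omega = rng.uniform(0.0, np.pi, n)    # Omega ~ F = U[0, pi] (assumption)
u = rng.uniform(0.0, 2*np.pi, n)      # U ~ U[0, 2*pi]
t = 5                                  # any fixed t; the process is stationary
x_t  = A*np.cos(omega*t + u)
x_tk = A*np.cos(omega*(t + k) + u)
print(np.mean(x_t * x_tk))                     # ensemble estimate of gamma_k
print(0.5 * A**2 * np.mean(np.cos(k*omega)))   # (A^2/2) E cos(k Omega)
```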

3. Find the spectral density function of the AR(2) process
$$X_t = \phi_1 X_{t-1} + \phi_2 X_{t-2} + \epsilon_t.$$
What conditions on $(\phi_1, \phi_2)$ are required for this process to be indeterministic and second-order stationary? Sketch the stationarity region in the $(\phi_1, \phi_2)$ plane.
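
A sketch for checking the answer numerically, assuming the convention used elsewhere on this sheet, $f(\omega) = \sigma^2/(\pi\,|\phi(e^{i\omega})|^2)$ on $[0, \pi]$; the stationarity conditions tested are the usual triangle $\phi_2 + \phi_1 < 1$, $\phi_2 - \phi_1 < 1$, $|\phi_2| < 1$.

```python
# Evaluate the AR(2) spectral density on a grid and test the
# stationarity triangle for a given (phi1, phi2).
import numpy as np

def ar2_spectral_density(phi1, phi2, sigma2=1.0, n=512):
    w = np.linspace(0.0, np.pi, n)
    phi_e = 1.0 - phi1*np.exp(1j*w) - phi2*np.exp(2j*w)
    return w, sigma2 / (np.pi * np.abs(phi_e)**2)

def is_stationary(phi1, phi2):
    return (phi2 + phi1 < 1) and (phi2 - phi1 < 1) and (abs(phi2) < 1)

print(is_stationary(1/3, 2/9))     # True: the AR(2) of question 1
w, f = ar2_spectral_density(1/3, 2/9)
```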

4. For a stationary process define the covariance generating function
$$g(z) = \sum_{k=-\infty}^{\infty} \gamma_k z^k, \qquad |z| < 1.$$
Suppose $\{X_t\}$ satisfies $X = C(B)\epsilon$, that is, it has the Wold representation
$$X_t = \sum_{r=0}^{\infty} c_r \epsilon_{t-r},$$
where $\{c_r\}$ are constants satisfying $\sum_{r=0}^{\infty} c_r^2 < \infty$ and $C(z) = \sum_{r=0}^{\infty} c_r z^r$. Show that $g(z) = C(z)C(z^{-1})\sigma^2$. Explain how this can be used to derive autocovariances for the ARMA(p, q) model. Hence show that for ARMA(1, 1), $\rho_2^2 = \rho_1 \rho_3$. How might this fact be useful?
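
The identity $\rho_2^2 = \rho_1\rho_3$ can be checked numerically from the Wold weights of an ARMA(1,1) $X_t = \phi X_{t-1} + \epsilon_t + \theta\epsilon_{t-1}$, which are $c_0 = 1$ and $c_r = (\phi+\theta)\phi^{r-1}$ for $r \geq 1$; the parameter values below are arbitrary.

```python
# gamma_k = sigma^2 * sum_r c_r c_{r+k}; check rho_2^2 = rho_1 * rho_3.
import numpy as np

phi, theta, R = 0.6, 0.3, 2000           # R terms suffice since |phi| < 1
c = np.empty(R)
c[0] = 1.0
c[1:] = (phi + theta) * phi ** np.arange(R - 1)

def gamma(k):
    return np.dot(c[:R - k], c[k:])      # sigma^2 = 1

rho = np.array([gamma(k) for k in range(4)]) / gamma(0)
print(rho[2]**2, rho[1]*rho[3])          # equal up to truncation error
```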

5. Consider the ARMA(2, 1) process defined as
$$X_t = \phi_1 X_{t-1} + \phi_2 X_{t-2} + \epsilon_t + \theta_1 \epsilon_{t-1}.$$
Show that the coefficients of the Wold representation satisfy the difference equation
$$c_k = \phi_1 c_{k-1} + \phi_2 c_{k-2}, \qquad k \geq 2,$$
and hence that
$$c_k = A z_1^{-k} + B z_2^{-k},$$
where $z_1$ and $z_2$ are zeros of $\phi(z) = 1 - \phi_1 z - \phi_2 z^2$, and $A$ and $B$ are constants. Explain how in principle one could find $A$ and $B$.
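
A numerical sketch of "in principle": compute the Wold weights by the recursion (with $c_0 = 1$, $c_1 = \phi_1 + \theta_1$), solve a 2-by-2 linear system for $A$ and $B$, and confirm the closed form; parameter values are arbitrary and chosen to keep the roots real.

```python
# Wold weights of the ARMA(2,1) by recursion, then fit A, B.
import numpy as np

phi1, phi2, theta1, K = 0.5, 0.2, 0.4, 30
c = np.empty(K)
c[0], c[1] = 1.0, phi1 + theta1
for k in range(2, K):
    c[k] = phi1*c[k-1] + phi2*c[k-2]

z1, z2 = np.roots([-phi2, -phi1, 1.0])   # zeros of 1 - phi1 z - phi2 z^2
# Solve for A, B from c_1 and c_2 (any two equations would do).
M = np.array([[z1**-1, z2**-1], [z1**-2, z2**-2]])
A, B = np.linalg.solve(M, c[1:3])
k = np.arange(1, K)
print(np.allclose(c[1:], A*z1**(-k) + B*z2**(-k)))   # True
```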

6. Suppose
$$Y_t = X_t + \epsilon_t, \qquad X_t = \alpha X_{t-1} + \eta_t,$$
where $\{\epsilon_t\}$ and $\{\eta_t\}$ are independent white noise sequences with common variance $\sigma^2$. Show that the spectral density function of $\{Y_t\}$ is
$$f_Y(\omega) = \frac{\sigma^2}{\pi}\left\{\frac{2 - 2\alpha\cos\omega + \alpha^2}{1 - 2\alpha\cos\omega + \alpha^2}\right\}.$$
For what values of $p, d, q$ is the autocovariance function of $\{Y_t\}$ identical to that of an ARIMA(p, d, q) process?
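
A numeric check of the claimed $f_Y$, assuming the convention $f(\omega) = \frac{1}{\pi}\left(\gamma_0 + 2\sum_{k\geq 1}\gamma_k\cos k\omega\right)$ on $[0,\pi]$; here $\gamma_Y(0) = \sigma^2/(1-\alpha^2) + \sigma^2$ and $\gamma_Y(k) = \sigma^2\alpha^k/(1-\alpha^2)$ for $k \geq 1$.

```python
# Compare the cosine-sum spectral density with the closed form.
import numpy as np

alpha, sigma2 = 0.7, 1.0
w = np.linspace(0.0, np.pi, 200)
k = np.arange(1, 4000)
gamma0 = sigma2/(1 - alpha**2) + sigma2
gk = sigma2 * alpha**k / (1 - alpha**2)
f_sum = (gamma0 + 2*(gk[:, None]*np.cos(np.outer(k, w))).sum(axis=0)) / np.pi
f_formula = (sigma2*(2 - 2*alpha*np.cos(w) + alpha**2)
             / (np.pi*(1 - 2*alpha*np.cos(w) + alpha**2)))
print(np.max(np.abs(f_sum - f_formula)))   # ~0 up to truncation
```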

7. Suppose $X_1, \ldots, X_T$ are values of a time series. Prove that
$$\hat\gamma_0 + 2\sum_{k=1}^{T-1} \hat\gamma_k = 0,$$
where $\hat\gamma_k$ is the usual estimator of the $k$th order autocovariance,
$$\hat\gamma_k = \frac{1}{T}\sum_{t=k+1}^{T}(X_t - \bar X)(X_{t-k} - \bar X).$$
Hint: Consider $0 = \sum_{t=1}^{T}(X_t - \bar X)$. Hence deduce that not all ordinates of the correlogram can have the same sign.

Suppose $f(\cdot)$ is the spectral density and $I(\cdot)$ the periodogram. Suppose $f$ is continuous and $f(0) \neq 0$. Does $E\,I(2\pi/T) \to f(0)$ as $T \to \infty$?
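
The identity is purely algebraic, so it holds for any series; a quick sketch verifying it with the estimator as defined above:

```python
# gamma_hat_0 + 2 * sum_{k=1}^{T-1} gamma_hat_k should be exactly 0.
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(100).cumsum()    # any series works
T, xbar = len(x), x.mean()

def gamma_hat(k):
    return np.sum((x[k:] - xbar) * (x[:T - k] - xbar)) / T

print(gamma_hat(0) + 2*sum(gamma_hat(k) for k in range(1, T)))   # ~0
```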

8. Suppose $I(\cdot)$ is the periodogram of $\epsilon_1, \ldots, \epsilon_T$, where these are i.i.d. $N(0, 1)$ and $T = 2m + 1$. Let $\omega_j, \omega_k$ be two distinct Fourier frequencies. Show that $I(\omega_j)$ and $I(\omega_k)$ are independent random variables. What are their distributions?

If it is suspected that $\{\epsilon_t\}$ departs from white noise because of the presence of a single harmonic component at some unknown frequency $\omega$, a natural test statistic is the maximum periodogram ordinate
$$T^* = \max_{j=1,\ldots,m} I(\omega_j).$$
Show that under the hypothesis that $\{\epsilon_t\}$ is white noise,
$$P(T^* > t) = 1 - \left\{1 - \exp\left(-\pi t/\sigma^2\right)\right\}^m.$$
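
A simulation sketch, assuming the periodogram convention $I(\omega) = |\sum_t X_t e^{it\omega}|^2/(\pi T)$, under which the ordinates at Fourier frequencies are i.i.d. exponential with mean $\sigma^2/\pi$:

```python
# Empirical tail probability of the maximum ordinate versus the formula.
import numpy as np

rng = np.random.default_rng(2)
m = 500
T = 2*m + 1
reps = 2000
mx = np.empty(reps)
for r in range(reps):
    x = rng.standard_normal(T)
    J = np.fft.fft(x)                      # sum_t x_t e^{-2 pi i j t / T}
    I = np.abs(J[1:m + 1])**2 / (np.pi*T)  # ordinates at j = 1, ..., m
    mx[r] = I.max()

t0 = 2.0
print(np.mean(mx > t0))                    # empirical P(T* > t0)
print(1 - (1 - np.exp(-np.pi*t0))**m)      # formula, sigma^2 = 1
```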

9. Complete this sketch of the fast Fourier transform. From data $X_0, \ldots, X_T$, with $T = 2^M - 1$, we want to compute the $2^{M-1}$ ordinates of the periodogram
$$I(\omega_j) = \frac{1}{\pi T}\left|\sum_{t=0}^{2^M-1} X_t e^{it2\pi j/2^M}\right|^2, \qquad j = 1, \ldots, 2^{M-1}.$$
This requires order $T$ multiplications for each $j$ and so order $T^2$ multiplications in all. However,
$$\sum_{t=0}^{2^M-1} X_t e^{it2\pi j/2^M} = \sum_{t=0,2,\ldots,2^M-2} X_t e^{it2\pi j/2^M} + \sum_{t=1,3,\ldots,2^M-1} X_t e^{it2\pi j/2^M}$$
$$= \sum_{t=0}^{2^{M-1}-1} X_{2t}\, e^{i2t\cdot 2\pi j/2^M} + \sum_{t=0}^{2^{M-1}-1} X_{2t+1}\, e^{i(2t+1)2\pi j/2^M}$$
$$= \sum_{t=0}^{2^{M-1}-1} X_{2t}\, e^{it2\pi j/2^{M-1}} + e^{i2\pi j/2^M} \sum_{t=0}^{2^{M-1}-1} X_{2t+1}\, e^{it2\pi j/2^{M-1}}.$$
Note that the value of either sum on the right hand side at $j = k$ is the complex conjugate of its value at $j = 2^{M-1} - k$; so these sums need only be computed for $j = 1, \ldots, 2^{M-2}$. Thus we have two sums, each of which is similar to the sum on the left hand side, but for a problem half as large. Suppose the computational effort required to work out each right hand side sum (for all $2^{M-2}$ values of $j$) is $\Theta(M-1)$. The sum on the left hand side is obtained (for all $2^{M-1}$ values of $j$) by combining the right hand sums, with further computational effort of order $2^{M-1}$. Explain
$$\Theta(M) = a2^{M-1} + 2\Theta(M-1).$$
Hence deduce that $I(\cdot)$ can be computed (by the FFT) in time $T \log_2 T$.
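
The even/odd split above is exactly the radix-2 decimation-in-time FFT. A minimal recursive implementation, checked against numpy's FFT (numpy's sign convention is $e^{-2\pi i jt/n}$; the argument is identical either way):

```python
# Recursive radix-2 FFT: each call does order-n combine work and
# recurses on two half-size problems, giving n log2(n) overall.
import numpy as np

def fft_recursive(x):
    n = len(x)                         # n must be a power of two
    if n == 1:
        return x.astype(complex)
    even = fft_recursive(x[0::2])      # sum over X_{2t}: half-size problem
    odd = fft_recursive(x[1::2])       # sum over X_{2t+1}: half-size problem
    twiddle = np.exp(-2j*np.pi*np.arange(n // 2)/n)
    return np.concatenate([even + twiddle*odd, even - twiddle*odd])

x = np.random.default_rng(3).standard_normal(1 << 8)
print(np.allclose(fft_recursive(x), np.fft.fft(x)))   # True
```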

10. Suppose we have the ARMA(1, 1) process
$$X_t = \phi X_{t-1} + \epsilon_t + \theta\epsilon_{t-1},$$
with $|\phi| < 1$, $|\theta| < 1$, $\phi + \theta \neq 0$, observed up to time $T$, and we want to calculate $k$-step ahead forecasts $\hat X_{T,k}$, $k \geq 1$. Derive a recursive formula to calculate $\hat X_{T,k}$ for $k = 1$ and $k = 2$.
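
A sketch of the standard forecasts (an illustration, not the derivation asked for): with residuals built up recursively from the starting assumption $\hat\epsilon_1 = 0$, $\hat X_{T,1} = \phi X_T + \theta\hat\epsilon_T$, and $\hat X_{T,k} = \phi\hat X_{T,k-1}$ for $k \geq 2$, since future $\epsilon$'s have conditional mean zero.

```python
# ARMA(1,1) k-step forecasts from a simulated sample path.
import numpy as np

def arma11_forecasts(x, phi, theta, kmax=2):
    eps = np.zeros(len(x))             # eps_hat_1 = 0 (starting assumption)
    for t in range(1, len(x)):
        eps[t] = x[t] - phi*x[t-1] - theta*eps[t-1]
    f = [phi*x[-1] + theta*eps[-1]]    # one-step forecast
    for _ in range(kmax - 1):
        f.append(phi*f[-1])            # k-step: AR part only
    return f

rng = np.random.default_rng(4)
e = rng.standard_normal(300)
x = np.zeros(300)
for t in range(1, 300):
    x[t] = 0.7*x[t-1] + e[t] + 0.4*e[t-1]
print(arma11_forecasts(x, 0.7, 0.4))
```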

11. Consider the stationary scalar-valued process $\{X_t\}$ generated by the moving average
$$X_t = \epsilon_t - \theta\epsilon_{t-1}.$$
Determine the linear least-squares predictor of $X_t$ in terms of $X_{t-1}, X_{t-2}, \ldots$.
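
A numeric illustration, assuming $|\theta| < 1$ so the MA(1) is invertible: inverting gives $\epsilon_t = \sum_{j\geq 0}\theta^j X_{t-j}$, hence the predictor $\hat X_t = -\sum_{j\geq 1}\theta^j X_{t-j}$, whose prediction error should be $\epsilon_t$.

```python
# Check that the truncated predictor's error recovers eps_t.
import numpy as np

theta, n = 0.5, 10_000
rng = np.random.default_rng(5)
e = rng.standard_normal(n)
x = e.copy()
x[1:] -= theta*e[:-1]                  # X_t = eps_t - theta*eps_{t-1}

J = 40                                 # truncate the infinite sum
w = -theta**np.arange(1, J + 1)        # weights on X_{t-1}, X_{t-2}, ...
t = np.arange(J, n)
xhat = sum(w[j - 1]*x[t - j] for j in range(1, J + 1))
print(np.std(x[t] - xhat - e[t]))      # ~0: prediction error is eps_t
```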

12. Consider the ARIMA(0, 2, 2) model
$$(I - B)^2 X = (I - 0.81B + 0.38B^2)\epsilon,$$
where $\{\epsilon_t\}$ is white noise with variance 1.

(a) With data up to time $T$, calculate the $k$-step ahead optimal forecast $\hat X_{T,k}$ for all $k \geq 1$. By giving a general formula relating $\hat X_{T,k}$, $k \geq 3$, to $\hat X_{T,1}$ and $\hat X_{T,2}$, determine the curve on which all these forecasts lie.

(b) Suppose now that $T = 95$. Calculate numerically the forecasts $\hat X_{95,k}$, $k = 1, 2, 3$, and their mean squared prediction errors when the last five observations are $X_{91} = 15.1$, $X_{92} = 15.8$, $X_{93} = 15.9$, $X_{94} = 15.2$, $X_{95} = 15.9$.

[You will need estimates for $\epsilon_{94}$ and $\epsilon_{95}$. Start by assuming $\epsilon_{91} = \epsilon_{92} = 0$, then calculate $\hat\epsilon_{93} = \epsilon_{93} = X_{93} - \hat X_{92,1}$, and so on, until $\epsilon_{94}$ and $\epsilon_{95}$ are obtained.]
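
A direct implementation of the hinted procedure (an illustrative sketch, not the worked answer): expanding, $X_t = 2X_{t-1} - X_{t-2} + \epsilon_t - 0.81\epsilon_{t-1} + 0.38\epsilon_{t-2}$, and future $\epsilon$'s are replaced by zero in the forecasts.

```python
# Residuals by the hint, then point forecasts and k-step MSEs.
x = {91: 15.1, 92: 15.8, 93: 15.9, 94: 15.2, 95: 15.9}
eps = {91: 0.0, 92: 0.0}                   # starting assumption from the hint
for t in range(93, 96):
    pred = 2*x[t-1] - x[t-2] - 0.81*eps[t-1] + 0.38*eps[t-2]
    eps[t] = x[t] - pred                   # eps_t = X_t - X_hat_{t-1,1}

f1 = 2*x[95] - x[94] - 0.81*eps[95] + 0.38*eps[94]
f2 = 2*f1 - x[95] + 0.38*eps[95]
f3 = 2*f2 - f1
print(f1, f2, f3)

psi1 = 2 - 0.81                            # Wold weights of the model
psi2 = 2*psi1 - 1 + 0.38
print(1.0, 1.0 + psi1**2, 1.0 + psi1**2 + psi2**2)   # MSEs, sigma^2 = 1
```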

13. Consider the state space model
$$X_t = S_t + v_t, \qquad S_t = S_{t-1} + w_t,$$
where $X_t$ and $S_t$ are both scalars, $X_t$ is observed, $S_t$ is unobserved, and $\{v_t\}$, $\{w_t\}$ are Gaussian white noise sequences with variances $V$ and $W$ respectively. Write down the Kalman filtering equations for $\hat S_t$ and $P_t$. Show that $P_t = P$ (independently of $t$) if and only if $P^2 + PW = WV$, and show that in this case the Kalman filter for $\hat S_t$ is equivalent to exponential smoothing.
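
A minimal Kalman filter sketch for this local-level model (the diffuse-ish start $P = 10$ and the seed are arbitrary assumptions): $P_t$ settles at the positive root of $P^2 + PW = WV$, and with constant gain $K = (P+W)/(P+W+V)$ the update $\hat S_t = (1-K)\hat S_{t-1} + K X_t$ is exponential smoothing.

```python
# Run the filter and compare the limiting P_t with the quadratic's root.
import numpy as np

V, W = 1.0, 0.5
rng = np.random.default_rng(6)
s, xs = 0.0, []
for _ in range(200):
    s += np.sqrt(W)*rng.standard_normal()       # S_t = S_{t-1} + w_t
    xs.append(s + np.sqrt(V)*rng.standard_normal())

S_hat, P = 0.0, 10.0
for x in xs:
    R = P + W                                   # prediction variance
    K = R / (R + V)                             # Kalman gain
    S_hat = S_hat + K*(x - S_hat)               # = (1-K) S_hat + K x
    P = R*V / (R + V)
print(P, np.roots([1.0, W, -W*V]).max())        # P -> positive root
```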