Applied Time Series Analysis. Wayne A. Woodward, Henry L. Gray, Alan C. Elliott. Dallas, Texas, USA


Applied Time Series Analysis
Wayne A. Woodward, Southern Methodist University, Dallas, Texas, USA
Henry L. Gray, Southern Methodist University, Dallas, Texas, USA
Alan C. Elliott, University of Texas Southwestern Medical Center at Dallas, Dallas, Texas, USA
CRC Press, Taylor & Francis Group, Boca Raton / London / New York
CRC Press is an imprint of the Taylor & Francis Group, an Informa business
A CHAPMAN & HALL BOOK

Contents
Preface xvii
Acknowledgments xxiii

1. Stationary Time Series 1
1.1 Time Series 2
1.2 Stationary Time Series 5
1.3 Autocovariance and Autocorrelation Functions for Stationary Time Series 7
1.4 Estimation of the Mean, Autocovariance, and Autocorrelation for Stationary Time Series 11
1.4.1 Estimation of μ 11
1.4.1.1 Ergodicity of X̄ 11
1.4.1.2 Variance of X̄ 17
1.4.2 Estimation of γk 18
1.4.3 Estimation of ρk 19
1.5 Power Spectrum 22
1.6 Estimating the Power Spectrum and Spectral Density for Discrete Time Series 32
1.7 Time Series Examples 36
1.7.1 Simulated Data 36
1.7.2 Real Data 40
1.A Appendix 45
Exercises 47

2. Linear Filters 53
2.1 Introduction to Linear Filters 53
2.1.1 Relationship between the Spectra of the Input and Output of a Linear Filter 55
2.2 Stationary General Linear Processes 55
2.2.1 Spectrum and Spectral Density for a General Linear Process 57
2.3 Wold Decomposition Theorem 58
2.4 Filtering Applications 58
2.4.1 Butterworth Filters 63
2.A Appendix 68
Exercises 69
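Sections 1.4.2 and 1.4.3 of the contents cover estimating γk and ρk. As an illustrative sketch (Python/numpy, not taken from the book), the standard sample estimators with divisor n are:

```python
import numpy as np

def sample_autocorr(x, max_lag):
    """Sample autocovariances gamma_k and autocorrelations rho_k
    for lags 0..max_lag, using the usual divisor n (not n - k)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xbar = x.mean()
    gamma = np.array([np.sum((x[:n - k] - xbar) * (x[k:] - xbar)) / n
                      for k in range(max_lag + 1)])
    rho = gamma / gamma[0]  # rho_0 is always 1 by construction
    return gamma, rho
```

Dividing by n rather than n − k biases the estimates slightly but guarantees a nonnegative-definite autocovariance sequence, which is the convention most time series texts adopt.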

3. ARMA Time Series Models 73
3.1 Moving Average Processes 73
3.1.1 MA(1) Model 75
3.1.2 MA(2) Model 79
3.2 Autoregressive Processes 79
3.2.1 Inverting the Operator 83
3.2.2 AR(1) Model 84
3.2.3 AR(p) Model for p > 1 90
3.2.4 Autocorrelations of an AR(p) Model 90
3.2.5 Linear Difference Equations 91
3.2.6 Spectral Density of an AR(p) Model 94
3.2.7 AR(2) Model 94
3.2.7.1 Autocorrelations of an AR(2) Model 94
3.2.7.2 Spectral Density of an AR(2) 97
3.2.7.3 Stationary/Causal Region of an AR(2) 98
3.2.7.4 ψ-Weights of an AR(2) Model 98
3.2.8 Summary of AR(1) and AR(2) Behavior 105
3.2.9 AR(p) Model 106
3.2.10 AR(1) and AR(2) Building Blocks of an AR(p) Model 109
3.2.11 Factor Tables 111
3.2.12 Invertibility/Infinite-Order Autoregressive Processes 118
3.2.13 Two Reasons for Imposing Invertibility 119
3.3 Autoregressive-Moving Average Processes 120
3.3.1 Stationarity and Invertibility Conditions for an ARMA(p,q) Model 123
3.3.2 Spectral Density of an ARMA(p,q) Model 123
3.3.3 Factor Tables and ARMA(p,q) Models 124
3.3.4 Autocorrelations of an ARMA(p,q) Model 127
3.3.5 ψ-Weights of an ARMA(p,q) 131
3.3.6 Approximating ARMA(p,q) Processes Using High-Order AR(p) Models 133
3.4 Visualizing Autoregressive Components 134
3.5 Seasonal ARMA(p,q) × (Ps,Qs)s Models 136
3.6 Generating Realizations from ARMA(p,q) Processes 140
3.6.1 MA(q) Model 140
3.6.2 AR(2) Model 140
3.6.3 General Procedure 141
3.7 Transformations 142
3.7.1 Memoryless Transformations 142
3.7.2 Autoregressive Transformations 143
3.A Appendix: Proofs of Theorems 147
Exercises 151
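Section 3.6.2 of the contents addresses generating realizations from an AR(2) model. A minimal sketch of the usual approach (Python/numpy, written here for illustration rather than taken from the book) iterates the difference equation with Gaussian noise and discards a burn-in transient:

```python
import numpy as np

def generate_ar2(phi1, phi2, n, burn_in=200, seed=0):
    """Generate a realization of length n from the zero-mean AR(2) model
    X_t = phi1*X_{t-1} + phi2*X_{t-2} + a_t, with a_t ~ N(0, 1)."""
    rng = np.random.default_rng(seed)
    a = rng.standard_normal(n + burn_in)
    x = np.zeros(n + burn_in)
    for t in range(2, n + burn_in):
        x[t] = phi1 * x[t - 1] + phi2 * x[t - 2] + a[t]
    # Discard the burn-in so the startup transient from x[0] = x[1] = 0
    # has largely died out and the retained series is near-stationary.
    return x[burn_in:]

x = generate_ar2(1.5, -0.75, 500)
```

The coefficients φ1 = 1.5, φ2 = −0.75 give complex characteristic roots inside the stationary region, producing the pseudo-cyclic realizations the chapter uses to motivate factor tables.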

4. Other Stationary Time Series Models 159
4.1 Stationary Harmonic Models 159
4.1.1 Pure Harmonic Models 161
4.1.2 Harmonic Signal-plus-Noise Models 163
4.1.3 ARMA Approximation to the Harmonic Signal-plus-Noise Model 165
4.2 ARCH and GARCH Processes 168
4.2.1 ARCH Processes 169
4.2.1.1 The ARCH(1) Model 169
4.2.1.2 The ARCH(q0) Model 172
4.2.2 The GARCH(p0,q0) Process 173
4.2.3 AR Processes with ARCH or GARCH Noise 175
Exercises 177

5. Nonstationary Time Series Models 179
5.1 Deterministic Signal-plus-Noise Models 179
5.1.1 Trend-Component Models 180
5.1.2 Harmonic Component Models 182
5.2 ARIMA(p,d,q) and ARUMA(p,d,q) Processes 183
5.2.1 Extended Autocorrelations of an ARUMA(p,d,q) Process 184
5.2.2 Cyclical Models 189
5.3 Multiplicative Seasonal ARUMA(p,d,q) × (Ps,Ds,Qs)s Process 190
5.3.1 Factor Tables for Seasonal Models of the Form (5.17) with s = 4 and s = 12 190
5.4 Random Walk Models 192
5.4.1 Random Walk 192
5.4.2 Random Walk with Drift 193
5.5 G-Stationary Models for Data with Time-Varying Frequencies 194
Exercises 195

6. Forecasting 197
6.1 Mean Square Prediction Background 197
6.2 Box-Jenkins Forecasting for ARMA(p,q) Models 200
6.3 Properties of the Best Forecast 201
6.4 π-Weight Form of the Forecast Function 203
6.5 Forecasting Based on the Difference Equation 204
6.6 Eventual Forecast Function 210
6.7 Probability Limits for Forecasts 212
6.8 Forecasts Using ARIMA(p,d,q) Models 216
6.9 Forecasts Using Multiplicative Seasonal ARUMA Models 222
6.10 Forecasts Based on Signal-plus-Noise Models 225
6.A Appendix 229
Exercises 230
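Section 6.5 of the contents covers forecasting based on the difference equation. As a hedged illustration (Python, not from the book), h-step forecasts from a zero-mean AR(2) follow by iterating the model's difference equation with future noise set to its mean of zero:

```python
def forecast_ar2(history, phi1, phi2, steps):
    """h-step-ahead forecasts from the zero-mean AR(2) difference equation:
    Xhat_{t+h} = phi1*Xhat_{t+h-1} + phi2*Xhat_{t+h-2},
    seeded with the last two observed values (forecasts replace
    unobserved values as the horizon grows)."""
    x = list(history[-2:])
    out = []
    for _ in range(steps):
        nxt = phi1 * x[-1] + phi2 * x[-2]
        out.append(nxt)
        x.append(nxt)
    return out
```

For a stationary model these forecasts damp toward the process mean as the horizon increases, which is the "eventual forecast function" behavior of Section 6.6.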

7. Parameter Estimation 235
7.1 Introduction 235
7.2 Preliminary Estimates 236
7.2.1 Preliminary Estimates for AR(p) Models 236
7.2.1.1 Yule-Walker Estimates 236
7.2.1.2 Least Squares Estimation 238
7.2.1.3 Burg Estimates 240
7.2.2 Preliminary Estimates for MA(q) Models 242
7.2.2.1 Method-of-Moments Estimation for an MA(q) 243
7.2.2.2 MA(q) Estimation Using the Innovations Algorithm 244
7.2.3 Preliminary Estimates for ARMA(p,q) Models 245
7.2.3.1 Extended Yule-Walker Estimates of the Autoregressive Parameters 246
7.2.3.2 Tsay-Tiao (TT) Estimates of the Autoregressive Parameters 246
7.2.3.3 Estimating the Moving Average Parameters 248
7.3 Maximum Likelihood Estimation of ARMA(p,q) Parameters 248
7.3.1 Conditional and Unconditional Maximum Likelihood Estimation 248
7.3.2 ML Estimation Using the Innovations Algorithm 253
7.4 Backcasting and Estimating 254
7.5 Asymptotic Properties of Estimators 257
7.5.1 Autoregressive Case 257
7.5.1.1 Confidence Intervals: Autoregressive Case 258
7.5.2 ARMA(p,q) Case 259
7.5.2.1 Confidence Intervals for ARMA(p,q) Parameters 262
7.5.3 Asymptotic Comparisons of Estimators for an MA(1) 263
7.6 Estimation Examples Using Data 264
7.7 ARMA Spectral Estimation 270
7.8 ARUMA Spectral Estimation 274
Exercises 275

8. Model Identification 279
8.1 Preliminary Check for White Noise 279
8.2 Model Identification for Stationary ARMA Models 282
8.2.1 Model Identification Based on AIC and Related Measures 283

8.3 Model Identification for Nonstationary ARUMA(p,d,q) Models 286
8.3.1 Including a Nonstationary Factor in the Model 287
8.3.2 Identifying Nonstationary Component(s) in a Model 287
8.3.3 Decision between a Stationary or a Nonstationary Model 292
8.3.4 Deriving a Final ARUMA Model 292
8.3.5 More on the Identification of Nonstationary Components 295
8.3.5.1 Including a Factor (1 − B)^d in the Model 295
8.3.5.2 Testing for a Unit Root 298
8.3.5.3 Including a Seasonal Factor (1 − B^s) in the Model 299
8.A Appendix: Model Identification Based on Pattern Recognition 309
Exercises 323

9. Model Building 327
9.1 Residual Analysis 327
9.1.1 Check Sample Autocorrelations of Residuals versus 95% Limit Lines 328
9.1.2 Ljung-Box Test 328
9.1.3 Other Tests for Randomness 329
9.1.4 Testing Residuals for Normality 332
9.2 Stationarity versus Nonstationarity 333
9.3 Signal-plus-Noise versus Purely Autocorrelation-Driven Models 338
9.3.1 Cochrane-Orcutt, ML, and Frequency Domain Methods 339
9.3.2 A Bootstrapping Approach 341
9.3.3 Other Methods for Trend Testing 341
9.4 Checking Realization Characteristics 342
9.5 Comprehensive Analysis of Time Series Data: A Summary 347
Exercises 348

10. Vector-Valued (Multivariate) Time Series 351
10.1 Multivariate Time Series Basics 351
10.2 Stationary Multivariate Time Series 353
10.2.1 Estimating the Mean and Covariance for Stationary Multivariate Processes 357
10.2.1.1 Estimating μ 357
10.2.1.2 Estimating Γ 358

10.3 Multivariate (Vector) ARMA Processes 358
10.3.1 Forecasting Using VAR(p) Models 364
10.3.2 Spectrum of a VAR(p) Model 367
10.3.3 Estimating the Coefficients of a VAR(p) Model 367
10.3.3.1 Yule-Walker Estimation 367
10.3.3.2 Least Squares and Conditional Maximum Likelihood Estimation 368
10.3.3.3 Burg-Type Estimation 368
10.3.4 Calculating the Residuals and Estimating Γa 369
10.3.5 VAR(p) Spectral Density Estimation 369
10.3.6 Fitting a VAR(p) Model to Data 370
10.3.6.1 Model Selection 370
10.3.6.2 Estimating the Parameters 370
10.3.6.3 Testing the Residuals for White Noise 370
10.4 Nonstationary VARMA Processes 371
10.5 Testing for Association between Time Series 372
10.5.1 Testing for Independence of Two Stationary Time Series 374
10.5.2 Testing for Cointegration between Nonstationary Time Series 377
10.6 State-Space Models 379
10.6.1 State Equation 379
10.6.2 Observation Equation 379
10.6.3 Goals of State-Space Modeling 382
10.6.4 Kalman Filter 383
10.6.4.1 Prediction (Forecasting) 383
10.6.4.2 Filtering 383
10.6.4.3 Smoothing Using the Kalman Filter 384
10.6.4.4 h-Step Ahead Predictions 384
10.6.5 Kalman Filter and Missing Data 387
10.6.6 Parameter Estimation 389
10.6.7 Using State-Space Methods to Find Additive Components of a Univariate Autoregressive Realization 389
10.6.7.1 Revised State-Space Model 390
10.6.7.2 ψ Real 391
10.6.7.3 ψ Complex 391
10.A Appendix: Derivation of State-Space Results 393
Exercises 398

11. Long-Memory Processes 401
11.1 Long Memory 402
11.2 Fractional Difference and FARMA Processes 403
11.3 Gegenbauer and GARMA Processes 409
11.3.1 Gegenbauer Polynomials 410

11.3.2 Gegenbauer Process 410
11.3.3 GARMA Process 414
11.4 k-Factor Gegenbauer and GARMA Processes 417
11.4.1 Calculating Autocovariances 421
11.4.2 Generating Realizations 423
11.5 Parameter Estimation and Model Identification 424
11.6 Forecasting Based on the k-Factor GARMA Model 428
11.7 Modeling Atmospheric CO2 Data Using Long-Memory Models 429
Exercises 433

12. Wavelets 435
12.1 Shortcomings of Traditional Spectral Analysis for TVF Data 435
12.2 Window-Based Methods That Localize the "Spectrum" in Time 438
12.2.1 Gabor Spectrogram 438
12.2.2 Wigner-Ville Spectrum 441
12.3 Wavelet Analysis 441
12.3.1 Fourier Series Background 442
12.3.2 Wavelet Analysis Introduction 442
12.3.3 Fundamental Wavelet Approximation Result 446
12.3.4 Discrete Wavelet Transform for Data Sets of Finite Length 447
12.3.5 Pyramid Algorithm 451
12.3.6 Multiresolution Analysis 451
12.3.7 Wavelet Shrinkage 457
12.3.8 Scalogram: Time-Scale Plot 459
12.3.9 Wavelet Packets 463
12.3.10 Two-Dimensional Wavelets 469
12.5 Concluding Remarks on Wavelets 472
12.A Appendix: Mathematical Preliminaries for This Chapter 473
Exercises 476

13. G-Stationary Processes 479
13.1 Generalized-Stationary Processes 479
13.1.1 General Strategy for Analyzing G-Stationary Processes 480
13.2 M-Stationary Processes 481
13.2.1 Continuous M-Stationary Process 481
13.2.2 Discrete M-Stationary Process 483
13.2.3 Discrete Euler(p) Model 483
13.2.4 Time Transformation and Sampling 484

13.3 G(λ)-Stationary Processes 488
13.3.1 Continuous G(p;λ) Model 490
13.3.2 Sampling the Continuous G(λ)-Stationary Processes 492
13.3.2.1 Equally Spaced Sampling from G(p;λ) Processes 492
13.3.3 Analyzing TVF Data Using the G(p;λ) Model 493
13.3.3.1 G(p;λ) Spectral Density 495
13.4 Linear Chirp Processes 505
13.4.1 Models for Generalized Linear Chirps 508
13.5 Concluding Remarks 512
13.A Appendix 512
Exercises 516

References 519
Index 529