Automatic Autocorrelation and Spectral Analysis

Piet M.T. Broersen
Automatic Autocorrelation and Spectral Analysis
With 104 Figures
Springer

Contents

1 Introduction 1
  1.1 Time Series Problems 1
2 Basic Concepts 11
  2.1 Random Variables 11
  2.2 Normal Distribution 14
  2.3 Conditional Densities 17
  2.4 Functions of Random Variables 18
  2.5 Linear Regression 20
  2.6 General Estimation Theory 23
  2.7 Exercises 26
3 Periodogram and Lagged Product Autocorrelation 29
  3.1 Stochastic Processes 29
  3.2 Autocorrelation Function 31
  3.3 Spectral Density Function 33
  3.4 Estimation of Mean and Variance 38
  3.5 Autocorrelation Estimation 40
  3.6 Periodogram Estimation 49
  3.7 Summary of Nonparametric Methods 55
  3.8 Exercises 56
4 ARMA Theory 59
  4.1 Time Series Models 59
  4.2 White Noise 60
  4.3 Moving Average Processes 61
    4.3.1 MA(1) Process with Zero Outside the Unit Circle 63
  4.4 Autoregressive Processes 63
    4.4.1 AR(1) Processes 64
    4.4.2 AR(1) Processes with a Pole Outside the Unit Circle 68
    4.4.3 AR(2) Processes 69
    4.4.4 AR(p) Processes 72
  4.5 ARMA(p,q) Processes 74
  4.6 Harmonic Processes with Poles on the Unit Circle 78
  4.7 Spectra of Time Series Models 80
    4.7.1 Some Examples 82
  4.8 Exercises 86
5 Relations for Time Series Models 89
  5.1 Time Series Estimation 89
  5.2 Yule-Walker Relations and the Levinson-Durbin Recursion 89
  5.3 Additional AR Representations 95
  5.4 Additional AR Relations 96
    5.4.1 The Relation between the Variances of x_n and e_n for an AR(p) Process 96
    5.4.2 Parameters from Reflection Coefficients 96
    5.4.3 Reflection Coefficients from Parameters 97
    5.4.4 Autocorrelations from Reflection Coefficients 97
    5.4.5 Autocorrelations from Parameters 98
  5.5 Relation for MA Parameters 98
  5.6 Accuracy Measures for Time Series Models 99
    5.6.1 Prediction Error 99
    5.6.2 Model Error 102
    5.6.3 Power Gain 103
    5.6.4 Spectral Distortion 104
    5.6.5 More Relative Measures 104
    5.6.6 Absolute and Squared Measures 105
    5.6.7 Cepstrum as a Measure for Autocorrelation Functions 107
  5.7 ME and the Triangular Bias 108
  5.8 Computational Rules for the ME 111
  5.9 Exercises 113
6 Estimation of Time Series Models 117
  6.1 Historical Remarks About Spectral Estimation 117
  6.2 Are Time Series Models Generally Applicable? 120
  6.3 Maximum Likelihood Estimation 121
    6.3.1 AR ML Estimation 121
    6.3.2 MA ML Estimation 122
    6.3.3 ARMA ML Estimation 123
  6.4 AR Estimation Methods 124
    6.4.1 Yule-Walker Method 124
    6.4.2 Forward Least-squares Method 125
    6.4.3 Forward and Backward Least-squares Method 125
    6.4.4 Burg's Method 126
    6.4.5 Asymptotic AR Theory 129
    6.4.6 Finite-sample Practice for Burg Estimates of White Noise 130
    6.4.7 Finite-sample Practice for Burg Estimates of an AR(2) Process 133
    6.4.8 Model Error (ME) of Burg Estimates of an AR(2) Process 134
  6.5 MA Estimation Methods 135
  6.6 ARMA Estimation Methods 140
    6.6.1 ARMA(p,q) Estimation, First-stage 141
    6.6.2 ARMA(p,q) Estimation, First-stage Long AR 142
    6.6.3 ARMA(p,q) Estimation, First-stage Long MA 143
    6.6.4 ARMA(p,q) Estimation, First-stage Long COV 143
    6.6.5 ARMA(p,q) Estimation, First-stage Long Rinv 144
    6.6.6 ARMA(p,q) Estimation, Second-stage 144
    6.6.7 ARMA(p,q) Estimation, Simulations 146
  6.7 Covariance Matrix of ARMA Parameters 155
    6.7.1 The Covariance Matrix of Estimated AR Parameters 155
    6.7.2 The Covariance Matrix of Estimated MA Parameters 157
    6.7.3 The Covariance Matrix of Estimated ARMA Parameters 157
  6.8 Estimated Autocovariance and Spectrum 160
    6.8.1 Estimators for the Mean and the Variance 160
    6.8.2 Estimation of the Autocorrelation Function 160
    6.8.3 The Residual Variance 162
    6.8.4 The Power Spectral Density 162
  6.9 Exercises 164
7 AR Order Selection 167
  7.1 Overview of Order Selection 167
  7.2 Order Selection in Linear Regression 169
  7.3 Asymptotic Order-selection Criteria 176
  7.4 Relations for Order-selection Criteria 180
  7.5 Finite-sample Order-selection Criteria 183
  7.6 Kullback-Leibler Discrepancy 187
  7.7 The Penalty Factor 192
  7.8 Finite-sample AR Criterion CIC 200
  7.9 Order-selection Simulations 203
  7.10 Subset Selection 208
  7.11 Exercises 208
8 MA and ARMA Order Selection 209
  8.1 Introduction 209
  8.2 Intermediate AR Orders for MA and ARMA Estimation 210
  8.3 Reduction of the Number of ARMA Candidate Models 213
  8.4 Order Selection for MA Estimation 216
  8.5 Order Selection for ARMA Estimation 218
  8.6 Exercises 221
9 ARMASA Toolbox with Applications 223
  9.1 Introduction 223
  9.2 Selection of the Model Type 223
  9.3 The Language of Random Data 226
  9.4 Reduced-statistics Order Selection 227
  9.5 Accuracy of Reduced-statistics Estimation 230
  9.6 ARMASA Applied to Harmonic Processes 233
  9.7 ARMASA Applied to Simulated Random Data 235
  9.8 ARMASA Applied to Real-life Data 236
    9.8.1 Turbulence Data 236
    9.8.2 Radar Data 243
    9.8.3 Satellite Data 244
    9.8.4 Lung Noise Data 245
    9.8.5 River Data 246
  9.9 Exercises 248
  ARMASA Toolbox 250
10 Advanced Topics in Time Series Estimation 251
  10.1 Accuracy of Lagged Product Autocovariance Estimates 251
  10.2 Generation of Data 262
  10.3 Subband Spectral Analysis 264
  10.4 Missing Data 268
  10.5 Irregular Data 276
    10.5.1 Multishift, Slotted, Nearest-neighbour Resampling 282
    10.5.2 ARMAsel for Irregular Data 283
    10.5.3 Performance of ARMAsel for Irregular Data 284
  10.6 Exercises 286
Bibliography 287
Index 295
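
The contents above point at the computational core of the book: the Yule-Walker relations solved by the Levinson-Durbin recursion (Section 5.2), which underpin the AR estimation methods of Chapter 6. As an informal illustration only, and not the book's ARMASA toolbox code, the following Python sketch solves the Yule-Walker equations with the Levinson-Durbin recursion; the function name, the biased lagged-product autocovariance estimator in the usage example, and the NumPy dependency are assumptions made for this sketch.

```python
import numpy as np

def levinson_durbin(r, p):
    """Solve the Yule-Walker equations for an AR(p) model with the
    Levinson-Durbin recursion (illustrative sketch, not ARMASA code).

    r : autocovariances r[0], ..., r[p], with r[0] the signal variance
    p : AR order to fit

    Returns (a, k, sigma2): AR parameters a[0..p] with a[0] = 1,
    reflection coefficients k[1..p], and the residual variance sigma2.
    """
    a = np.zeros(p + 1)
    a[0] = 1.0
    k = np.zeros(p + 1)
    sigma2 = r[0]
    for m in range(1, p + 1):
        # Reflection coefficient of order m from the order m-1 model.
        acc = r[m] + np.dot(a[1:m], r[m - 1:0:-1])
        k[m] = -acc / sigma2
        # Update a_1 ... a_{m-1} and append a_m = k_m.
        a_prev = a.copy()
        for i in range(1, m):
            a[i] = a_prev[i] + k[m] * a_prev[m - i]
        a[m] = k[m]
        # Each order shrinks the prediction-error variance by (1 - k_m^2).
        sigma2 *= 1.0 - k[m] ** 2
    return a, k[1:], sigma2

# Usage sketch: fit AR(2) parameters to the biased lagged-product
# autocovariance estimate of a white-noise test signal.
rng = np.random.default_rng(0)
x = rng.standard_normal(1000)
N = len(x)
r = np.array([np.dot(x[:N - m], x[m:]) / N for m in range(3)])
a_hat, k_hat, s2_hat = levinson_durbin(r, 2)
print(a_hat, k_hat, s2_hat)
```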