Multiresolution Models of Time Series

Andrea Tamoni (Bocconi University), 2011

General Framework: Time-scale decomposition

Begin with a univariate time series $\{x_{t-i}\}_{i \in \mathbb{Z}}$. For a specified positive integer $m$, define the process $x^{(m)}_s$ by
$$x^{(m)}_s = \frac{1}{m}\sum_{i=0}^{m-1} x_{t+(s-1)m-i}$$
The $x^{(m)}_s$ values are the averages of non-overlapping groups of $m$ consecutive $x$ values. Refer to $m$ as the coarsening window, to $x_t$ as the fine-level process, and to $x^{(m)}_s$ as the coarse-level aggregate process.
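As a concrete illustration of the coarsening step, the following minimal sketch (the helper name `coarsen` is hypothetical, not from the slides) forms the non-overlapping m-period averages with NumPy; dropping a trailing incomplete window is an assumption the slides do not discuss.

```python
import numpy as np

def coarsen(x, m):
    """Non-overlapping m-period averages: the coarse-level process x^(m)."""
    x = np.asarray(x, dtype=float)
    n_blocks = len(x) // m            # trailing incomplete window is dropped
    return x[:n_blocks * m].reshape(n_blocks, m).mean(axis=1)

# Example: m = 3 turns 12 "monthly" observations into 4 "quarterly" averages
x = np.arange(1.0, 13.0)
print(coarsen(x, 3))                  # -> [ 2.  5.  8. 11.]
```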

Time-scale Decomposition

Let $\{x_{t-i}\}_{i \in \mathbb{Z}}$ be the observed time series. Let us focus on a block of length $N = 4$ (for convenience, we assume $N = 2^J$ for some $J$):
$$X_t = [x_{t-3},\, x_{t-2},\, x_{t-1},\, x_t]$$
Consider the two-level (orthogonal) transform matrix
$$W_2 = \begin{bmatrix} \tfrac{1}{4} & \tfrac{1}{4} & \tfrac{1}{4} & \tfrac{1}{4} \\ -\tfrac{1}{4} & -\tfrac{1}{4} & \tfrac{1}{4} & \tfrac{1}{4} \\ -\tfrac{1}{2} & \tfrac{1}{2} & 0 & 0 \\ 0 & 0 & -\tfrac{1}{2} & \tfrac{1}{2} \end{bmatrix}$$

Time-scale Decomposition (cont'd)

The two-level decomposition is performed as follows:
$$\mathcal{W} = W_2 X_t, \qquad \mathcal{W} = (a_{2,t},\, w_{2,t},\, w_{1,t-2},\, w_{1,t})$$
with
$$a_{2,t} = \frac{x_t + x_{t-1} + x_{t-2} + x_{t-3}}{4}, \qquad w_{2,t} = \frac{x_t + x_{t-1} - x_{t-2} - x_{t-3}}{4},$$
$$w_{1,t-2} = \frac{x_{t-2} - x_{t-3}}{2}, \qquad w_{1,t} = \frac{x_t - x_{t-1}}{2}.$$
It is possible to reconstruct the time series at different levels of aggregation, i.e.
$$x_t = a_{2,t} + w_{2,t} + w_{1,t}, \qquad x^{(2)}_t = a_{2,t} + w_{2,t}, \qquad x^{(4)}_t = a_{2,t}.$$
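To make the algebra concrete, here is a minimal numerical check (the block values are illustrative, not from the slides) that the four coefficients reproduce $x_t$, $x^{(2)}_t$ and $x^{(4)}_t$ exactly as stated above.

```python
import numpy as np

# Hypothetical block of four observations [x_{t-3}, x_{t-2}, x_{t-1}, x_t]
x_tm3, x_tm2, x_tm1, x_t = 1.0, 3.0, 2.0, 6.0

a_2    = (x_t + x_tm1 + x_tm2 + x_tm3) / 4     # level-2 average
w_2    = (x_t + x_tm1 - x_tm2 - x_tm3) / 4     # level-2 detail
w_1t   = (x_t - x_tm1) / 2                     # level-1 detail at t
w_1tm2 = (x_tm2 - x_tm3) / 2                   # level-1 detail at t-2

# Reconstruction at different levels of aggregation
assert np.isclose(x_t, a_2 + w_2 + w_1t)                       # fine level
assert np.isclose((x_t + x_tm1) / 2, a_2 + w_2)                # x_t^(2)
assert np.isclose((x_t + x_tm1 + x_tm2 + x_tm3) / 4, a_2)      # x_t^(4)
print("reconstruction identities hold")
```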

Decomposition into time-scale components: Interpretation

Note that the $j$-th coefficient is in general obtained as follows:
$$w_{j,t} = \underbrace{\frac{1}{2^j}\sum_{i=0}^{2^j-1} x_{t-i}}_{\text{low-pass filter on } [0,\, 1/2^j]} \;-\; \underbrace{\frac{1}{2^{j-1}}\sum_{i=0}^{2^{j-1}-1} x_{t-2^{j-1}-i}}_{\text{low-pass filter on } [0,\, 1/2^{j-1}]}$$
Overall, $w_{j,t}$ acts as a band-pass filter on $[1/2^{j},\, 1/2^{j-1}]$.

Recall that $x_t = \sum_{j=1}^{2} w_{j,t} + a_{2,t}$, $x^{(2)}_t = w_{2,t} + a_{2,t}$ and $x^{(4)}_t = a_{2,t}$.

Intuitively:
$w_{1,t}$ yields changes on scale 1, i.e. $[\ldots, t, t+1, t+2, \ldots]$
$w_{2,t}$ yields changes on scale 2, i.e. $[\ldots, t, t+2, t+4, \ldots]$
$a_{2,t}$ yields averages on scale 4, i.e. $[\ldots, t, t+4, t+8, \ldots]$
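The band-pass reading of $w_{j,t}$ can be checked numerically. The sketch below (random data, $j = 2$; the variable names are mine) computes the coefficient both as the difference of the two averages above and directly from the $W_2$ row on the previous slide.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(16)          # arbitrary fine-level sample path
t = len(x) - 1                       # index of the most recent observation
j = 2

# Difference of two averages (low-pass minus low-pass over the older half) ...
full_avg  = x[t - 2**j + 1 : t + 1].mean()               # average over 2^j points
older_avg = x[t - 2**j + 1 : t - 2**(j-1) + 1].mean()    # average over the older half
w_j_bandpass = full_avg - older_avg

# ... equals the Haar detail coefficient from the transform on the earlier slide.
w_j_direct = (x[t] + x[t-1] - x[t-2] - x[t-3]) / 4
print(np.isclose(w_j_bandpass, w_j_direct))              # True
```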

Decomposition into time-scale components: Extensions

Let $X_t = [x_{t-N+1}, \ldots, x_{t-2}, x_{t-1}, x_t]$, where $N = 2^J$. Consider the $J$-level (orthogonal) transform matrix $W_J$ and construct $\mathcal{W} = W_J X_t$. We can partition $\mathcal{W}$ as follows:
$$\mathcal{W} = \begin{bmatrix} a_J \\ W_J \\ W_{J-1} \\ \vdots \\ W_1 \end{bmatrix}$$
We obtain $J+1$ time series $\{\{W_j\}_j,\, a_J\}$, each $W_j$ containing $N/2^j$ coefficients, i.e.
$$W_j = (w_{j,t-2^J+1}, \ldots, w_{j,t-2^j}, w_{j,t})$$
The series $W_j$ is related to times spaced $2^j$ units apart.
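One possible way to build the $J$-level transform just described is sketched below. The helper name `haar_matrix` is hypothetical, and the row ordering ($a_J$ first, then $w_J$ down to $w_1$, older sub-blocks before newer ones) is an assumption chosen to match the $W_2$ example on the earlier slide.

```python
import numpy as np

def haar_matrix(J):
    """J-level Haar-type transform acting on X = [x_{t-N+1}, ..., x_t], N = 2**J.

    Row 0 returns the block average a_J; the remaining rows return the detail
    coefficients w_J, w_{J-1}, ..., w_1, each equal to
    (mean of the newer half - mean of the older half) / 2 over its sub-block.
    """
    N = 2 ** J
    rows = [np.full(N, 1.0 / N)]                           # a_J
    for j in range(J, 0, -1):                              # coarse to fine details
        width, half = 2 ** j, 2 ** (j - 1)
        for start in range(0, N, width):                   # non-overlapping sub-blocks
            row = np.zeros(N)
            row[start:start + half] = -1.0 / width         # older half
            row[start + half:start + width] = 1.0 / width  # newer half
            rows.append(row)
    return np.vstack(rows)

# J = 2 reproduces the W_2 matrix of the earlier slide
print(haar_matrix(2))
```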

Statistical Properties of the coefficients $W_j$

Let $\{x_{t-i}\}_{i \in \mathbb{Z}}$ be a wide-sense stationary process. Wong (1993) establishes that, for fixed $j$, the $\{w_{j,k}\}$ are wide-sense stationary. We can write the following representation (understood in the mean-square sense):
$$x_t = \sum_{j=1}^{J}\sum_{k=0}^{\infty} w_{j,k}\,\epsilon_{j,t-k2^j} \tag{1}$$
where
$$\epsilon_{j,t} = x^{(2^j)}_t - P_{M_{j,t-2^j}}\, x^{(2^j)}_t$$
are called Haar innovations and we let $M_{j,t} = \mathrm{sp}\{ x^{(2^j)}_{t-k2^j} \}_{k \in \mathbb{N}}$.

Representation (1) holds for a broader class of processes known as affine stationary processes (see Yazici and Kashyap, 1996).

Statistical Properties of the coefficients $W_j$ (cont'd)

Whenever
$$\underbrace{\epsilon_{j,t}}_{x^{(2^j)}_t - P_{M_{j,t-2^j}} x^{(2^j)}_t} \;=\; \sum_{i=0}^{2^j-1} \frac{1}{2^j}\,\underbrace{\epsilon_{t-i}}_{x_{t-i} - P_{M_{t-i-1}} x_{t-i}}$$
where $M_{j,t} = \mathrm{sp}\{ x^{(2^j)}_{t-k2^j} \}_{k \in \mathbb{N}}$ and $M_t = \mathrm{sp}\{ x_{t-k} \}_{k \in \mathbb{N}}$, then we obtain the standard Wold decomposition.

The span $\mathrm{sp}\{ x^{(2^j)}_{t-k2^j} \}_{k \in \mathbb{N}}$ is the set of all linear combinations of the $x^{(2^j)}_{t-k2^j}$ and is therefore a subset of the sigma-algebra generated by $\{ x^{(2^j)}_t, x^{(2^j)}_{t-2^j}, \ldots \}$.

Time series and shocks at different time scales [figure]

Classical vs. Multiscale modeling: ARMA model for the fine-level process

In this section we not only aggregate the data but we also aggregate the model! Let $x_t$ be an ARMA(p, q) process. Brewer (1973) shows that the time series model governing $x^{(m)}_s$ is an ARMA(p', q') with
$$p' = p, \qquad q' = p + 1 + \frac{q - p - 1}{m}$$
for any finite $m$.

Assume $x_t$ follows an AR(1) process with autoregressive coefficient $\rho$. The pattern of autocorrelations of the temporally aggregated variable $x^{(m)}_s$ is then determined by the autoregressive coefficient $\rho^m$.
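A quick Monte Carlo check of the AR(1) aggregation result is sketched below (parameter values are arbitrary). Since the m-period average of an AR(1) is an ARMA(1,1), its autocovariances satisfy $\gamma(k+1) = \phi\,\gamma(k)$ for $k \ge 1$, so the ratio $\hat\gamma(2)/\hat\gamma(1)$ computed on the aggregate should recover $\rho^m$.

```python
import numpy as np

rng = np.random.default_rng(0)
rho, m, T = 0.9, 4, 400_000

# Fine-level AR(1): x_t = rho * x_{t-1} + e_t
e = rng.standard_normal(T)
x = np.empty(T)
x[0] = e[0]
for t in range(1, T):
    x[t] = rho * x[t - 1] + e[t]

# Coarse-level process: non-overlapping averages over windows of length m
xm = x.reshape(T // m, m).mean(axis=1)

# The aggregate is ARMA(1,1); gamma(2) = phi * gamma(1), so gamma(2)/gamma(1)
# estimates its autoregressive coefficient.
xc = xm - xm.mean()
gamma1 = np.mean(xc[1:] * xc[:-1])
gamma2 = np.mean(xc[2:] * xc[:-2])
print(gamma2 / gamma1, rho ** m)      # both should be close to 0.9**4 = 0.6561
```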

Multi-scale time-series modeling

What if instead the autocorrelation function at horizon $h$ has a decay governed by $\rho_h > \rho$? The question is how we can make the autocorrelation functions at the fine and coarse levels compatible.

We use the approach suggested by Dijkerman (1994) and define a multiresolution model on the $\{W_j\}$ series. The multi-scale autoregressive process is defined by an AR(1) process at each level $j$, i.e.
$$w_{j,t+2^j} = \rho_j\, w_{j,t} + \epsilon_{j,t+2^j} \qquad \text{for all } j,$$
where $\epsilon_{j,t+2^j}$ is Gaussian noise with variance $\sigma_j$, independent across scales.

An example: Time-scale decomposition

The fine-scale interval is 1 year. Suppose $J = 3$:
$$a_{3,t} = \rho_4\, a_{3,t-8} + \epsilon_{4,t}$$
$$w_{3,t} = \epsilon_{3,t}$$
$$w_{2,t} = \epsilon_{2,t}$$
$$w_{1,t} = \rho_1\, w_{1,t-2} + \epsilon_{1,t}$$
Recall that $x^{(8)}_t = a_{3,t}$; thus $a_{3,t}$ pins down the autocorrelation of the process at the coarse level. Moreover, $x_t = w_{1,t} + w_{2,t} + w_{3,t} + a_{3,t}$.

The decay of the autocorrelation function at the fine level is controlled by the parameters $\sigma_j$, $j = 1, \ldots, J$ (only through their ratios).
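A minimal simulation sketch of this J = 3 specification follows. The parameter values and helper names (`haar_matrix`, `ar1`) are illustrative assumptions, and the fine-level path is recovered block by block by inverting the J-level transform matrix from the earlier slides; this is one way to implement the reconstruction, not necessarily the author's.

```python
import numpy as np

rng = np.random.default_rng(1)
J, n_blocks = 3, 20_000
N = 2 ** J                                   # block length = 8 fine-level periods

def haar_matrix(J):
    # Same J-level transform as on the earlier slides (oldest observation first).
    N = 2 ** J
    rows = [np.full(N, 1.0 / N)]
    for j in range(J, 0, -1):
        width, half = 2 ** j, 2 ** (j - 1)
        for start in range(0, N, width):
            row = np.zeros(N)
            row[start:start + half] = -1.0 / width
            row[start + half:start + width] = 1.0 / width
            rows.append(row)
    return np.vstack(rows)

def ar1(rho, sigma, n):
    # AR(1) recursion on the component's own dyadic grid: w_k = rho * w_{k-1} + eps_k
    w = np.zeros(n)
    eps = sigma * rng.standard_normal(n)
    for k in range(1, n):
        w[k] = rho * w[k - 1] + eps[k]
    return w

# Illustrative parameters (assumptions): AR coefficients and innovation std. devs.
rho_1, rho_4 = 0.3, 0.9
sigma = {1: 1.0, 2: 0.7, 3: 0.5, "a": 0.4}

w1 = ar1(rho_1, sigma[1], 4 * n_blocks)              # level 1: 4 coefficients per block
w2 = sigma[2] * rng.standard_normal(2 * n_blocks)    # level 2: white noise
w3 = sigma[3] * rng.standard_normal(n_blocks)        # level 3: white noise
a3 = ar1(rho_4, sigma["a"], n_blocks)                # approximation: AR(1), spacing 8

# Reconstruct the fine-level path block by block: solve W_3 X = coefficients.
W3 = haar_matrix(J)
coeffs = np.column_stack([a3, w3,
                          w2.reshape(n_blocks, 2),
                          w1.reshape(n_blocks, 4)])
x = np.linalg.solve(W3, coeffs.T).T.ravel()          # fine-level series, length 8 * n_blocks

# Sample autocorrelations of the fine-level series at a few horizons.
xc = x - x.mean()
acf = [np.mean(xc[h:] * xc[:-h]) / np.mean(xc * xc) for h in (1, 2, 4, 8, 16)]
print(np.round(acf, 3))
```

Raising the ratio of sigma["a"] to sigma[1] in this sketch slows the decay of the fine-level autocorrelations, in line with the last point on the slide.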

Correlation functions on different time scales [figure]

Correlation functions on different time scales, cont'd [figure]

Literature

Tiao, G. C., 1972, Asymptotic behavior of temporal aggregates of time series, Biometrika 59, 525-531.
Brewer, K. R. W., 1973, Some consequences of temporal aggregation and systematic sampling for ARMA and ARMAX models, Journal of Econometrics 1, 133-154.
Wong, P. W., 1993, Multiresolution decomposition of harmonizable random processes, IEEE Transactions on Information Theory, pp. 7-18.
Dijkerman, R. W., and R. R. Mazumdar, 1994, Wavelet representations of stochastic processes and multiresolution stochastic models, IEEE Transactions on Signal Processing, Vol. 42, No. 7.
Yazici, B., and R. L. Kashyap, 1996, Affine stationary processes, Proceedings of the Conference on Information Sciences and Systems, Princeton, NJ, March, pp. 259-263.

Affine Stationary Processes

A stochastic process $x_{j,t}$ is called affine stationary if it satisfies the following condition:
$$E[x_{j,t}\, x_{j',t'}] = R\!\left(\frac{j}{j'},\, \frac{1}{j'}(t - t')\right)$$
Within the same scale, i.e. $j = j'$, an affine stationary process reduces to an ordinary wide-sense stationary process. Across scales, for a fixed shift index, i.e. $t = t'$, the process reduces to a class of self-similar processes known as scale-stationary processes. Example: the increments of fractional Brownian motion.