Time series models in the frequency domain: the power spectrum and spectral analysis


Relationship between the periodogram and the autocorrelations

For a Fourier frequency $\lambda_j = 2\pi j/T$ (so that $\sum_{t=1}^{T}\cos\lambda_j t = \sum_{t=1}^{T}\sin\lambda_j t = 0$),

$$I(\lambda_j) = \frac{T}{2}\big(\hat\alpha_j^2 + \hat\beta_j^2\big) = \frac{2}{T}\Big[\Big(\sum_{t=1}^{T} Y_t \cos\lambda_j t\Big)^2 + \Big(\sum_{t=1}^{T} Y_t \sin\lambda_j t\Big)^2\Big] = \frac{2}{T}\Big[\Big(\sum_{t=1}^{T} (Y_t-\bar Y) \cos\lambda_j t\Big)^2 + \Big(\sum_{t=1}^{T} (Y_t-\bar Y) \sin\lambda_j t\Big)^2\Big].$$

Expanding the squares and using $\cos\lambda t\cos\lambda s + \sin\lambda t\sin\lambda s = \cos\lambda(s-t)$,

$$I(\lambda_j) = \frac{2}{T}\sum_{t,s=1}^{T}(Y_t-\bar Y)(Y_s-\bar Y)\cos\lambda_j(s-t) = \frac{2}{T}\Big[\sum_{t=1}^{T}(Y_t-\bar Y)^2 + 2\sum_{t<s}(Y_t-\bar Y)(Y_s-\bar Y)\cos\lambda_j(s-t)\Big].$$

Setting $p = s-t$,

$$I(\lambda_j) = 2\Big[C_0 + 2\sum_{p=1}^{T-1} C_p\cos\lambda_j p\Big] = 2C_0\Big[1 + 2\sum_{p=1}^{T-1} r_p\cos\lambda_j p\Big],$$

where $C_p = T^{-1}\sum_{t=1}^{T-p}(Y_t-\bar Y)(Y_{t+p}-\bar Y)$ and $r_p = C_p/C_0$.

Periodogram and autocorrelations (cont.)

$$I(\lambda_j) = 2\Big[C_0 + 2\sum_{p=1}^{T-1} C_p\cos\lambda_j p\Big].$$

The periodogram is thus a weighted average of the autocorrelations, but the two statistics play different roles. The periodogram represents cyclical effects, which are deterministic. Thus, if we have high correlations at lags 12, 24, we should see them in the periodogram. However, if we analyse, for example, a seasonal MA(1) process, the seasonality is no longer deterministic: the seasonal effects change over time, and we only have $r_{12} \neq 0$, with the other empirical correlations close to zero (unlike in a deterministic process). The periodogram may no longer indicate this phenomenon, because the correlation is not the outcome of a single cycle or a few cycles with discrete frequencies. This suggests considering all the frequencies in the range $[0, \pi]$, which is what spectral analysis does. By considering the spectrum we should observe a hump around the seasonal frequencies.
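The identity between the periodogram and the sample autocovariances is easy to check numerically. A minimal numpy sketch (the function names and the simulated series are illustrative, not from the notes):

```python
import numpy as np

def periodogram_direct(y, j):
    # I(lambda_j) = (2/T) [ (sum_t d_t cos(lambda_j t))^2 + (sum_t d_t sin(lambda_j t))^2 ]
    T = len(y)
    lam = 2 * np.pi * j / T
    t = np.arange(1, T + 1)
    d = y - y.mean()
    return (2.0 / T) * ((d @ np.cos(lam * t)) ** 2 + (d @ np.sin(lam * t)) ** 2)

def periodogram_from_autocov(y, j):
    # I(lambda_j) = 2 [ C_0 + 2 sum_{p=1}^{T-1} C_p cos(lambda_j p) ]
    T = len(y)
    lam = 2 * np.pi * j / T
    d = y - y.mean()
    C = np.array([d[:T - p] @ d[p:] / T for p in range(T)])
    return 2 * (C[0] + 2 * sum(C[p] * np.cos(lam * p) for p in range(1, T)))

rng = np.random.default_rng(0)
y = rng.normal(size=200)
print(np.allclose(periodogram_direct(y, 5), periodogram_from_autocov(y, 5)))  # -> True
```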

The power spectrum

The power spectrum of a stationary process is defined by the continuous function

$$f(\lambda) = \frac{1}{\pi}\Big[\gamma_0 + 2\sum_{\tau=1}^{\infty}\gamma_\tau\cos\lambda\tau\Big], \qquad 0 \le \lambda \le \pi.$$

Compare with the periodogram, $I(\lambda) = 2[C_0 + 2\sum_{p=1}^{T-1} C_p\cos\lambda p]$: $f(\lambda)$ is defined for all $\lambda$ and not just for $\lambda_j = 2\pi j/T$, and the summation runs to $\infty$. If $Y_t$ is white noise, $\gamma(\tau) = 0$ for all $\tau \neq 0$, and $f(\lambda) = \gamma_0/\pi = \text{const}$. Thus the spectrum is flat, or: the process may be regarded as consisting of an infinite number of cyclical components, all of which have equal weight. (This is what defines white noise.)

The power spectrum (cont.)

Notice that for white noise $\int_0^\pi (\gamma_0/\pi)\,d\lambda = \gamma_0$ (the area under the spectrum). In fact, it is true in general that

$$\int_0^\pi f(\lambda)\,d\lambda = \gamma_0.$$

(This is because $\int_0^\pi \cos(\lambda\tau)\,d\lambda = [\sin(\lambda\tau)/\tau]_0^\pi = 0$ for $\tau = 1, 2, \dots$.) The function $f(\lambda)/\gamma(0)$ is called the spectral density (it uses correlations instead of covariances).

Example, the MA(1) process $Y_t = \varepsilon_t + \theta\varepsilon_{t-1}$:

$$\gamma_0 = \sigma_\varepsilon^2(1+\theta^2), \quad \gamma_1 = \theta\sigma_\varepsilon^2, \quad \gamma_\tau = 0 \ (\tau \ge 2); \qquad f(\lambda) = \frac{\sigma_\varepsilon^2}{\pi}\big[(1+\theta^2) + 2\theta\cos\lambda\big].$$

For $\theta = 0.5$, $f(\lambda) = \frac{\sigma_\varepsilon^2}{4\pi}[5 + 4\cos\lambda]$; for $\theta = -0.5$, $f(\lambda) = \frac{\sigma_\varepsilon^2}{4\pi}[5 - 4\cos\lambda]$.
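Both the MA(1) formula and the area property can be verified numerically. A small sketch, assuming $\theta = 0.5$ and $\sigma_\varepsilon^2 = 1$ as in the example (the midpoint-rule integration is an implementation choice of this sketch):

```python
import numpy as np

def ma1_spectrum(lam, theta, sigma2=1.0):
    # f(lambda) = (sigma_eps^2 / pi) [ (1 + theta^2) + 2 theta cos(lambda) ]
    return (sigma2 / np.pi) * (1 + theta ** 2 + 2 * theta * np.cos(lam))

# Midpoint rule over (0, pi): the area under the spectrum should equal gamma_0.
N = 200000
lam = (np.arange(N) + 0.5) * np.pi / N
area = ma1_spectrum(lam, theta=0.5).mean() * np.pi
gamma0 = 1 + 0.5 ** 2            # sigma_eps^2 (1 + theta^2) = 1.25
print(round(area, 4), gamma0)    # -> 1.25 1.25
```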

[Figure: plot of the MA(1) spectrum $f(\lambda) = \frac{\sigma_\varepsilon^2}{4\pi}[5 + 4\cos\lambda]$ for $\theta = 0.5$, $\sigma_\varepsilon^2 = 1$.]

[Figure: plot of the MA(1) spectrum $f(\lambda) = \frac{\sigma_\varepsilon^2}{4\pi}[5 - 4\cos\lambda]$ for $\theta = -0.5$, $\sigma_\varepsilon^2 = 1$.]

Exercises

Exercise 5: Compute the power spectrum of the MA(2) model $Y_t = \varepsilon_t - 0.5\varepsilon_{t-1} - 0.6\varepsilon_{t-2}$, $\mathrm{Var}(\varepsilon_t) = 1$.

Exercise 6: Let $X_t$, $Y_t$ be two independent stationary series.
6.1. Show that the series $Z_t = X_t + Y_t$ is also stationary.
6.2. Let $f_X(\lambda)$, $f_Y(\lambda)$ denote the power spectra of the series $X$, $Y$. Show that $f_Z(\lambda) = f_X(\lambda) + f_Y(\lambda)$.
6.3. Compute $f_Z(\lambda)$ for the case where $X_t = \varepsilon_t - 0.5\varepsilon_{t-1}$, $\mathrm{Var}(\varepsilon_t) = 1$; $Y_t = e_t + 0.5e_{t-1}$, $\mathrm{Var}(e_t) = 1$; $\mathrm{Cov}(\varepsilon_t, e_s) = 0$ for all $t, s$.

Properties of the periodogram as an estimator of the spectrum

Power spectrum: $f(\lambda) = \frac{1}{\pi}\big[\gamma_0 + 2\sum_{\tau=1}^{\infty}\gamma_\tau\cos\lambda\tau\big]$.
Periodogram: $I(\lambda) = 2\big[C_0 + 2\sum_{p=1}^{T-1} C_p\cos\lambda p\big]$.

It is natural to think of $I(\lambda)/2\pi$, evaluated at any $\lambda$, as an estimate of the spectrum. This is so because we would usually expect the covariances $\gamma(\tau)$, and hence $C(\tau)$, to decay to zero as $\tau \to \infty$. However, things are not so simple; to illustrate the problem, consider the case where $Y_1, \dots, Y_T$ are normal white noise.

Properties of the periodogram as an estimator of the power spectrum (cont.)

For the white noise case, at $\lambda_K = 2\pi K/T$,

$$\frac{I(\lambda_K)}{2\pi} = \frac{T(\hat\alpha_K^2 + \hat\beta_K^2)}{4\pi} \sim \frac{\sigma^2}{2\pi}\chi^2_{(2)}, \qquad E\Big[\frac{I(\lambda_K)}{2\pi}\Big] = \frac{\sigma^2}{\pi} = f(\lambda_K).$$

However,

$$\mathrm{Var}\Big[\frac{I(\lambda_K)}{2\pi}\Big] = \frac{4\sigma^4}{4\pi^2} = \frac{\sigma^4}{\pi^2} = f^2(\lambda_K).$$

Thus, $I(\lambda_K)/2\pi$ is unbiased for $f(\lambda_K)$, but the variance does not decay to zero with $T$, so the estimator is not consistent. This happens because we actually estimate $T$ parameters (the covariances appearing in the expression for the periodogram) from $T$ observations. Each covariance estimate is $O(1/T)$, but the cumulative effect is $O(1)$: if $U = \sum_{i=1}^{T} V_i$ with uncorrelated $V_i$ and $\mathrm{Var}(V_i) = c/T$, then $\mathrm{Var}(U) = c \neq 0$.
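The inconsistency shows up clearly in simulation. A hedged sketch (seed, sample sizes and replication counts are arbitrary choices of this sketch): the mean of $I(\lambda_K)/2\pi$ stays near $f = \sigma^2/\pi$, while the variance refuses to shrink as $T$ grows:

```python
import numpy as np

rng = np.random.default_rng(1)

def pgram_over_2pi(y, j):
    # I(lambda_j)/(2 pi) with I(lambda_j) = (T/2)(alpha_hat^2 + beta_hat^2)
    T = len(y)
    t = np.arange(1, T + 1)
    lam = 2 * np.pi * j / T
    a = (2.0 / T) * (y @ np.cos(lam * t))
    b = (2.0 / T) * (y @ np.sin(lam * t))
    return (T / 2.0) * (a ** 2 + b ** 2) / (2 * np.pi)

# Normal white noise with sigma^2 = 1, so f(lambda) = 1/pi ~ 0.318 everywhere.
for T in (64, 256, 1024):
    est = [pgram_over_2pi(rng.normal(size=T), T // 4) for _ in range(4000)]
    print(T, round(np.mean(est), 3), round(np.var(est), 3))
# The mean stays near 1/pi, but the variance stays near f^2 = 1/pi^2 ~ 0.101
# instead of shrinking with T: unbiased but not consistent.
```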

Properties of the periodogram as an estimator of the power spectrum (cont.)

Notice also that the periodogram ordinates are uncorrelated, $\mathrm{Cov}[\hat f(\lambda_j), \hat f(\lambda_K)] = 0$ for $j \neq K$ (where $\hat f(\lambda) = I(\lambda)/2\pi$), so that the periodogram has a very irregular appearance. It can be shown that for any stationary process,

$$\frac{2\hat f(\lambda)}{f(\lambda)} \xrightarrow{d} \chi^2_{(2)} \quad \text{for all } 0 < \lambda < \pi; \qquad E[\hat f(\lambda)] \to f(\lambda), \quad \mathrm{Var}[\hat f(\lambda)] \to f^2(\lambda).$$

Also, the ordinates at different frequencies are asymptotically independent, giving rise to the erratic form of the estimates. In the white noise case,

$$\frac{\hat f(\lambda_K)}{f(\lambda_K)} = \frac{T(\hat\alpha_K^2 + \hat\beta_K^2)/4\pi}{\sigma^2/\pi} = \frac{T(\hat\alpha_K^2 + \hat\beta_K^2)}{4\sigma^2}.$$

Estimation of the power spectrum

We are interested in estimating a continuous function, and hence we would like our estimate to be a smooth function.

Spectral analysis: how to estimate the power spectrum consistently.

Assumption: the series is stationary, i.e., no trend or deterministic seasonality. This is also needed from a practical point of view: otherwise we may not be able to detect frequencies other than those corresponding to the deterministic trends and seasonals.

More on the stationarity requirement

The stationarity requirement seems to rule out the use of the model

$$Y_t = \alpha_0 + \sum_{k=1}^{[n/2]}\big(u_k\cos\lambda_k t + v_k\sin\lambda_k t\big) + \varepsilon_t.$$

However, if we assume that the $u_k$ and $v_k$ are random variables satisfying

$$E(u_k) = E(v_k) = 0; \quad \mathrm{Var}(u_k) = \mathrm{Var}(v_k) = \sigma_k^2; \quad E(u_i u_j) = E(v_i v_j) = 0 \ (i \neq j); \quad E(u_i v_j) = 0 \ \text{for all } i, j,$$

then $E(Y_t) = \alpha_0$, $\mathrm{Var}(Y_t) = \sigma_\varepsilon^2 + \sum_{k=1}^{[n/2]}\sigma_k^2$, and for $\tau \neq 0$

$$\gamma(\tau) = E[(Y_t-\alpha_0)(Y_{t-\tau}-\alpha_0)] = \sum_{k=1}^{[n/2]}\sigma_k^2\big\{\cos(\lambda_k t)\cos[\lambda_k(t-\tau)] + \sin(\lambda_k t)\sin[\lambda_k(t-\tau)]\big\} = \sum_{k=1}^{[n/2]}\sigma_k^2\cos(\lambda_k\tau).$$
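The covariance calculation for the cyclical part can be confirmed by Monte Carlo, drawing fresh $(u_k, v_k)$ for each realization. A sketch (the frequencies and variances below are arbitrary illustrations, not from the notes; $\alpha_0$ and $\varepsilon_t$ are omitted since they do not affect $\gamma(\tau)$ for $\tau \neq 0$):

```python
import numpy as np

rng = np.random.default_rng(4)
lam_k = np.array([0.5, 1.2])      # illustrative frequencies
sig2 = np.array([2.0, 1.0])       # sigma_k^2 = Var(u_k) = Var(v_k)
reps = 200_000
u = rng.normal(scale=np.sqrt(sig2), size=(reps, 2))
v = rng.normal(scale=np.sqrt(sig2), size=(reps, 2))

def Y(t):
    # The cyclical signal sum_k (u_k cos(lambda_k t) + v_k sin(lambda_k t)),
    # one value per Monte Carlo realization of (u_k, v_k)
    return u @ np.cos(lam_k * t) + v @ np.sin(lam_k * t)

t, tau = 10, 5
emp = np.mean(Y(t) * Y(t - tau))               # Monte Carlo E[Y_t Y_{t-tau}]
theory = np.sum(sig2 * np.cos(lam_k * tau))    # sum_k sigma_k^2 cos(lambda_k tau)
print(round(float(theory), 3))                 # -> -0.642
```

The empirical moment depends on $t$ and $t-\tau$ only through $\tau$, which is the stationarity claim.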

Stationarity requirement (cont.)

The signal is stationary and yet deterministic (fixed) for any given realization, so it is more a matter of interpretation.

Estimation of the power spectrum

Method 1: use estimators of the form

$$\hat f(\lambda) = \frac{1}{\pi}\Big[w_0 C_0 + 2\sum_{k=1}^{M} w_k C_k\cos(\lambda k)\Big],$$

where the $\{w_k\}$ are decreasing weights called the lag window, and $M \ll T$ is the truncation point.

Basic idea: we only consider some of the covariances, and since the precision of the estimates $C_k$ decreases as $k$ increases, the weights decrease in $k$.

Under normality assumptions, the periodogram estimates $\hat I(\lambda_K)$ are sufficient statistics for the periodogram ordinates $I(\lambda_K)$ (considered as parameters). This means that if we are looking for smoothed estimates of the power spectrum, all that we need to do is smooth the periodogram estimates. Indeed, the estimator $\hat f(\lambda)$ defined above can be shown to be a smoothed function of the periodogram.

Lag windows in common use

1) Parzen window:

$$w_k = \begin{cases} 1 - 6(k/M)^2 + 6(k/M)^3, & 0 \le k \le M/2, \\ 2(1 - k/M)^3, & M/2 < k \le M, \\ 0, & k > M. \end{cases}$$

2) Tukey window:

$$w_k = \tfrac{1}{2}\Big[1 + \cos\frac{\pi k}{M}\Big], \quad k = 0, \dots, M; \qquad w_k = 0, \quad k > M.$$

The two windows are similar, but Tukey's weights are somewhat larger.

3) Daniell (rectangular) window:

$$w_k = \frac{\sin(\pi k/M)}{\pi k/M}, \quad k = 1, 2, \dots; \qquad w_0 = 1.$$
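The three windows can be written down directly. A numpy sketch, vectorised over $k$ (note that `np.sinc(x)` computes $\sin(\pi x)/(\pi x)$, which matches the Daniell weights):

```python
import numpy as np

def parzen(k, M):
    # Parzen lag window: 1 - 6r^2 + 6r^3 for r <= 1/2, 2(1-r)^3 for 1/2 < r <= 1
    r = np.abs(np.asarray(k, float)) / M
    return np.where(r <= 0.5, 1 - 6 * r**2 + 6 * r**3,
                    np.where(r <= 1, 2 * (1 - r)**3, 0.0))

def tukey(k, M):
    # Tukey lag window: (1/2)[1 + cos(pi k / M)] for k <= M
    r = np.abs(np.asarray(k, float)) / M
    return np.where(r <= 1, 0.5 * (1 + np.cos(np.pi * r)), 0.0)

def daniell(k, M):
    # Daniell weights sin(pi k/M)/(pi k/M); np.sinc gives w_0 = 1 automatically
    return np.sinc(np.asarray(k, float) / M)

M = 20
# At k = M/2 the Tukey weight (0.5) exceeds the Parzen weight (0.25),
# matching the remark that Tukey's weights are somewhat larger.
print(float(parzen(M // 2, M)), float(tukey(M // 2, M)))   # -> 0.25 0.5
```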

Sampling properties of spectral estimates

Suppose first that $w_k = 1$ for $k = 0, 1, \dots, M$, so that

$$\hat f(\lambda) = \frac{1}{\pi}\Big[C_0 + 2\sum_{k=1}^{M} C_k\cos(\lambda k)\Big].$$

Since

$$E\Big[\frac{1}{T}\sum_{t=1}^{T-k}(Y_t - \mu_y)(Y_{t+k} - \mu_y)\Big] = \frac{T-k}{T}\gamma_k,$$

$C_k$, obtained by substituting $\bar Y$ for $\mu_y$ and dividing by $T$ instead of $T-k$, is consistent for $\gamma_k$. We find that for $M, T \to \infty$ with $(M/T) \to 0$,

$$E[\hat f(\lambda)] \approx \frac{1}{\pi}\Big[\gamma_0 + 2\sum_{k=1}^{M}\frac{T-k}{T}\gamma_k\cos(\lambda k)\Big] \to f(\lambda), \qquad \mathrm{Var}[\hat f(\lambda)] \to 0,$$

since we only estimate $M$ covariances and $M \ll T$.

Properties of spectral estimates (cont.)

$$\hat f(\lambda) = \frac{1}{\pi}\Big[w_0 C_0 + 2\sum_{k=1}^{M} w_k C_k\cos(\lambda k)\Big].$$

The following results can be shown to hold:

$$\lim_{T\to\infty} E[\hat f(\lambda)] = f(\lambda) \ \text{for all } \lambda; \qquad \mathrm{Var}[\hat f(\lambda)] \approx \frac{1 + \delta_\lambda}{T}\, f^2(\lambda)\sum_{K=-M}^{M} w_K^2,$$

where $w_{-K} = w_K$ and $\delta_\lambda = 1$ if $\lambda = 0, \pi$ and $\delta_\lambda = 0$ otherwise.

Also, $\hat f(\lambda_1)$ and $\hat f(\lambda_2)$ are positively correlated when $\lambda_1 \neq \lambda_2$, and the correlation increases as $|\lambda_1 - \lambda_2|$ decreases, so $\hat f(\lambda)$ is a smooth function. The asymptotic results assume certain behaviour of the weights $w_K$. For consistency it is sufficient that $M/T \to 0$ as $T \to \infty$.
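Putting the pieces together, a minimal sketch of the Method 1 estimator (the choice of the Parzen window, the white-noise test series and the seed are illustrative assumptions of this sketch):

```python
import numpy as np

def autocov(y, max_lag):
    # C_k = (1/T) sum_{t=1}^{T-k} (Y_t - Ybar)(Y_{t+k} - Ybar)
    T = len(y)
    d = y - y.mean()
    return np.array([d[:T - k] @ d[k:] / T for k in range(max_lag + 1)])

def parzen(k, M):
    r = np.asarray(k, float) / M
    return np.where(r <= 0.5, 1 - 6 * r**2 + 6 * r**3,
                    np.where(r <= 1, 2 * (1 - r)**3, 0.0))

def lag_window_estimate(y, lam, M, window=parzen):
    # f_hat(lambda) = (1/pi) [ w_0 C_0 + 2 sum_{k=1}^M w_k C_k cos(lambda k) ]
    C = autocov(y, M)
    k = np.arange(M + 1)
    w = window(k, M)
    terms = w * C * np.cos(np.outer(np.atleast_1d(lam), k))
    return (terms[:, 0] + 2 * terms[:, 1:].sum(axis=1)) / np.pi

# White noise check: the true spectrum is flat at sigma^2/pi ~ 0.318.
rng = np.random.default_rng(2)
y = rng.normal(size=2000)
lam = np.linspace(0.1, np.pi - 0.1, 8)
est = lag_window_estimate(y, lam, M=30)
print(np.round(est, 3))   # values hovering around 1/pi
```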

Bias and variance of window estimates

Window    Bias                              Var$\{\hat f(\lambda)\} / [(M/T)\, f^2(\lambda)]$
Parzen    $(6/M^2)\, f''(\lambda)$          0.54
Tukey     $(\pi^2/4M^2)\, f''(\lambda)$     0.75
Daniell   $(\pi^2/6M^2)\, f''(\lambda)$     1.00

The smaller the bias, the larger the variance. The values in the table assume $\lambda \neq 0, \pi$; for $\lambda = 0, \pi$ the variances should be multiplied by 2.

Example: MA(2)

$Y_t = \varepsilon_t - 0.5\varepsilon_{t-1} - 0.6\varepsilon_{t-2}$, $\mathrm{Var}(\varepsilon_t) = 1$:

$$\gamma(0) = 1 + 0.25 + 0.36 = 1.61; \qquad \gamma(1) = -0.5 + 0.5 \times 0.6 = -0.2; \qquad \gamma(2) = -0.6.$$

$$f(\lambda) = \frac{1}{\pi}\Big[\gamma(0) + 2\sum_{\tau=1}^{2}\gamma(\tau)\cos(\lambda\tau)\Big] = \frac{1}{\pi}\big[1.61 + 2(-0.2\cos\lambda - 0.6\cos 2\lambda)\big] = \frac{1}{\pi}\big[1.61 - 0.4\cos\lambda - 1.2\cos 2\lambda\big].$$

$$f'(\lambda) = \frac{1}{\pi}\big[0.4\sin\lambda + 2.4\sin 2\lambda\big]; \qquad f''(\lambda) = \frac{1}{\pi}\big[0.4\cos\lambda + 4.8\cos 2\lambda\big].$$

[Figure: plot of $f(\lambda) = \frac{1}{\pi}[1.61 - 0.4\cos\lambda - 1.2\cos 2\lambda]$, the MA(2) spectrum with $\sigma_\varepsilon^2 = 1$.]

Example (cont.)

Suppose that $T = 100$, $M = 20$. Then $(M/T)\, f^2(\pi/2) \approx 0.16$ and $(M/T)\, f^2(\pi) \approx 0.013$, and

$$f'(\pi/2) = 0.13, \quad f'(\pi) = 0; \qquad f''(\pi/2) = -1.53, \quad f''(\pi) = 1.4.$$

Applying the bias and variance expressions of the three windows:

Window    Bias($\pi/2$)  Bias($\pi$)  Var($\pi/2$)  Var($\pi$)  MSE($\pi/2$)  MSE($\pi$)
Parzen    -.023          .021         .086          .014        .087          .014
Tukey     -.009          .009         .120          .020        .120          .020
Daniell   -.006          .006         .160          .026        .160          .026
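The table entries can be reproduced from the bias and variance formulas and the MA(2) spectrum derived above. A short sketch:

```python
import numpy as np

# MA(2) example: f and f'' in closed form (sigma_eps^2 = 1)
f  = lambda lam: (1.61 - 0.4 * np.cos(lam) - 1.2 * np.cos(2 * lam)) / np.pi
d2 = lambda lam: (0.4 * np.cos(lam) + 4.8 * np.cos(2 * lam)) / np.pi

T, M = 100, 20
# (bias constant c, variance ratio v): bias ~ (c/M^2) f''(lam), var ~ v (M/T) f(lam)^2
windows = {"Parzen": (6.0, 0.54), "Tukey": (np.pi**2 / 4, 0.75), "Daniell": (np.pi**2 / 6, 1.00)}

for lam, mult in ((np.pi / 2, 1), (np.pi, 2)):     # the variance doubles at lambda = pi
    for name, (c, v) in windows.items():
        bias = (c / M**2) * d2(lam)
        var = mult * v * (M / T) * f(lam)**2
        print(f"{name:8s} lam={lam:.2f} bias={bias:.3f} var={var:.3f} mse={bias**2 + var:.3f}")
```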

Exercise 7: Compute the bias, variance and MSE of the Parzen, Tukey and Daniell windows at $\lambda = 0.1, 0.5, 0.9$ for the model $Y_t = \varepsilon_t - \varepsilon_{t-1}$, $\sigma_\varepsilon^2 = 1$. Assume $T = 100$, $M = 20$.

Choice of M (truncation point)

- Can be chosen in an optimal way if the underlying process is known.
- Compromise between bias and variance: the smaller $M$, the smaller the variance but the larger the bias.
- Small values of $M$: possibly too smooth, but may give an idea of the large peaks (while possibly hiding the small peaks).
- Large values of $M$: possibly too many peaks (as with the periodogram).

Choice of M (cont.)

A possible procedure for determining $M$: use the estimated autocorrelations and set $M$ such that $\hat\rho_j \approx 0$ for $j > M$. This is not very efficient, because the $\hat\rho_j$ are themselves autocorrelated, so they tend to decay more slowly than the theoretical autocorrelations.

Another procedure is "window closing", where we try several values of $M$, say 3 values with $(M_3/M_1) \approx 4$, and study the results. Low values will highlight the large peaks; high values will show other important peaks but also spurious peaks; a compromise is achieved by studying the results for the intermediate values of $M$.

Choice of M (cont.)

It is sometimes recommended in the literature to choose $M$ as a fixed function of $T$, say $M = 2\sqrt{T}$ (which implies $(M/T) \to 0$ as $T \to \infty$), but there is no theoretical justification for this strategy. For example, in the white noise case the best choice is $M = 0$.

In principle, we can estimate the spectrum at any value of $\lambda$, but often it is evaluated at the frequencies $\pi j/Q$, $j = 0, \dots, Q$, where $Q$ is chosen to be sufficiently large to show all the important cycles (common choice: $Q = M$).

Estimation of the power spectrum, an alternative method

So far we considered estimators of the form

$$\hat f(\lambda) = \frac{1}{\pi}\Big[w_0 C_0 + 2\sum_{k=1}^{M} w_k C_k\cos(\lambda k)\Big].$$

Method 2: an alternative method is to use weighted averages of the periodogram ordinates, i.e.,

$$\hat f(\lambda_j) = \sum_{l=-m}^{m} w_l^*\,\frac{I(\lambda_{j+l})}{2\pi}; \qquad \lambda_j = \frac{2\pi j}{T},$$

where $l$ varies over $2m+1$ successive integers such that the $\lambda_{j+l}$ are symmetric around the $\lambda_j$ for which the estimator is computed.

Example:

$$\hat f(\lambda_j) = \frac{1}{2m+1}\sum_{l=-m}^{m}\frac{I(\lambda_{j+l})}{2\pi}.$$

See later for other examples of weights.
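A sketch of Method 2 with the equal weights of the example above, computing the periodogram via the FFT; for brevity this sketch simply drops the edge frequencies instead of applying the end-point formulas:

```python
import numpy as np

def periodogram(y):
    # I(lambda_j)/(2 pi) at the Fourier frequencies lambda_j = 2 pi j/T, j = 1..T/2
    T = len(y)
    d = y - y.mean()
    return np.abs(np.fft.rfft(d))[1:T // 2 + 1] ** 2 / (np.pi * T)

def smoothed(p, m):
    # Equal weights w*_l = 1/(2m+1): average the 2m+1 neighbouring ordinates
    w = np.ones(2 * m + 1) / (2 * m + 1)
    return np.convolve(p, w, mode="valid")    # end-point corrections skipped here

rng = np.random.default_rng(3)
y = rng.normal(size=1024)                     # white noise: f(lambda) = 1/pi everywhere
p = periodogram(y)
fhat = smoothed(p, m=10)
print(round(np.var(p) / np.var(fhat)))        # variance reduction factor, roughly 2m+1
```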

Alternative method (cont.)

General expression of the estimators:

$$\hat f(\lambda_j) = \sum_{l=-m}^{m} w_l^*\,\frac{I(\lambda_{j+l})}{2\pi}; \qquad w_{-l}^* = w_l^*, \quad w_0^* \ge w_1^* \ge \dots \ge w_m^* > 0, \quad \sum_{l=-m}^{m} w_l^* = 1.$$

At the end points,

$$\hat f(0) = \frac{1}{2\pi}\Big[w_0^* I(0) + 2\sum_{l=1}^{m} w_l^* I(2\pi l/T)\Big]; \qquad \hat f(\pi) = \frac{1}{2\pi}\Big[w_0^* I(\pi) + 2\sum_{l=1}^{m} w_l^* I(\pi - 2\pi l/T)\Big].$$

This assumes symmetry of the periodogram around $0$ and $\pi$, and uses $I(0) = 0$.

Rationale of Method 2

The periodogram ordinates are asymptotically unbiased, and since neighbouring ordinates are asymptotically uncorrelated, the variance of $\hat f(\lambda_j)$ obtained this way is of order $1/(2m+1)$. However, $\hat f(\lambda_j)$ is generally biased:

$$E[\hat f(\lambda_j)] = \sum_{l=-m}^{m} w_l^*\, E\Big[\frac{I(\lambda_{j+l})}{2\pi}\Big] \approx \sum_{l=-m}^{m} w_l^*\, f(\lambda_{j+l}) \neq f(\lambda_j).$$

However, if $f(\lambda)$ is approximately linear around $\lambda_j$, the bias will be small.

Method 2 (cont.)

In general, the bias is small if $m$ is small and $f(\lambda)$ is a reasonably smooth function of $\lambda$.

Choice of $m$: the same problem as with $M$ under the previous procedure. The larger $m$, the smaller the variance but the larger the bias.

Advantage of Method 2 over Method 1: Method 2 is more efficient computationally, particularly with the use of the Fast Fourier Transform, and this is now the standard method used in computer software.

Confidence intervals under Method 1

It can be shown that for the estimator

$$\hat f(\lambda) = \frac{1}{\pi}\Big[w_0 C_0 + 2\sum_{k=1}^{M} w_k C_k\cos(\lambda k)\Big] \quad \text{(Method 1)},$$

$$\frac{q\hat f(\lambda)}{f(\lambda)} \overset{\text{asym.}}{\sim} \chi^2_{(q)}; \qquad q = 2T\Big/\Big[w_0^2 + 2\sum_{k=1}^{M} w_k^2\Big].$$

A $100(1-\alpha)\%$ confidence interval for $f(\lambda)$ is

$$\Big[\frac{q\hat f(\lambda)}{\chi^2_{q,\,\alpha/2}},\ \frac{q\hat f(\lambda)}{\chi^2_{q,\,1-\alpha/2}}\Big],$$

where $\chi^2_{q,\,p}$ denotes the upper-$p$ point of the $\chi^2_{(q)}$ distribution. For the Parzen window, $q = 3.71\,T/M$; for the Tukey window, $q = 2.67\,T/M$.
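A sketch of the interval computation. Since only two $\chi^2$ quantiles are needed, the Wilson-Hilferty approximation is used here instead of a statistics library; that approximation, the helper names, and the numerical inputs are all choices of this sketch, not of the notes:

```python
import numpy as np

Z = 1.959964   # standard normal 0.975 quantile

def chi2_ppf(p, q):
    # Wilson-Hilferty approximation to the chi-square quantile (adequate for q ~ 20);
    # only the 0.025 / 0.975 quantiles are needed below
    z = Z if p > 0.5 else -Z
    return q * (1 - 2 / (9 * q) + z * np.sqrt(2 / (9 * q))) ** 3

def parzen_ci(fhat, T, M, alpha=0.05):
    # [ q f_hat / chi2_{q, alpha/2} , q f_hat / chi2_{q, 1-alpha/2} ] with q = 3.71 T/M,
    # where chi2_{q,p} is the upper-p point (i.e. the (1-p) cumulative quantile)
    q = 3.71 * T / M
    return q * fhat / chi2_ppf(1 - alpha / 2, q), q * fhat / chi2_ppf(alpha / 2, q)

lo, hi = parzen_ci(fhat=0.32, T=100, M=20)
print(round(lo, 3), round(hi, 3))    # -> 0.184 0.691
```

Note how wide the interval is even with $T = 100$: with $q \approx 18.6$ degrees of freedom, the estimate is only accurate to within roughly a factor of two.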