STAD57 Time Series Analysis. Lecture 23



Spectral Representation

The spectral representation of a stationary process {X_t} is

X_t = \int_{-1/2}^{1/2} e^{2\pi i \omega t} \, dU(\omega)

for U(ω) a stochastic process with independent increments dU(ω) = U(ω + dω) − U(ω), such that:

E[dU(\omega)] = 0 \quad \& \quad \mathrm{Var}[dU(\omega)] = f(\omega)\, d\omega

\mathrm{Var}[U(\omega)] = \int_{-1/2}^{\omega} f(\nu)\, d\nu = F(\omega)

Frequency Domain Representation

Spectral Representation Theorem: For any stationary process {X_t} with autocovariance γ(h) there is a spectral density f(ω) such that

\gamma(h) = \int_{-1/2}^{1/2} e^{2\pi i \omega h} f(\omega)\, d\omega \quad \& \quad X_t = \int_{-1/2}^{1/2} e^{2\pi i \omega t}\, dU(\omega)

where U(ω) is a stochastic process with independent zero-mean increments and \mathrm{Var}[U(\omega)] = F(\omega) = \int_{-1/2}^{\omega} f(\nu)\, d\nu.

Moreover, the spectral density f(ω) is given by

f(\omega) = \sum_{h=-\infty}^{\infty} \gamma(h)\, e^{-2\pi i \omega h}, \qquad -\tfrac{1}{2} \le \omega \le \tfrac{1}{2}
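As a quick numerical illustration of this duality (my own sketch, not from the slides): for an MA(1) process the only nonzero autocovariances are γ(0) = (1 + θ²)σ² and γ(±1) = θσ², so the defining sum can be evaluated directly and checked against the closed-form spectral density.

# Sketch: f(omega) = sum_h gamma(h) exp(-2*pi*i*omega*h) for an MA(1), theta = .5, sigma2 = 1
theta <- 0.5; sigma2 <- 1
h     <- -1:1
gam   <- c(theta, 1 + theta^2, theta) * sigma2    # gamma(-1), gamma(0), gamma(1)
f     <- function(w) Re(sum(gam * exp(-2i * pi * w * h)))
f(0.2)                                            # from the autocovariance sum
1 + theta^2 + 2 * theta * cos(2 * pi * 0.2)       # closed form: same value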

Frequency Domain Representation

The spectral density f(ω) uniquely determines the autocovariance γ(h) & vice versa: there is a duality between the time & frequency domains. For a real series (X_t ∈ ℝ) the spectral density f(ω) is symmetric around 0 & measures the variance (i.e. strength) at frequency ω in the series:

f(-\omega) = f(\omega), \qquad F(1/2) = \int_{-1/2}^{1/2} f(\omega)\, d\omega = \gamma(0)

Example

Find the spectral density of white noise {W_t} ~ WN(0, σ_w²). Since γ(0) = σ_w² and γ(h) = 0 for h ≠ 0, the defining sum collapses to a single term:

f(\omega) = \sum_h \gamma(h)\, e^{-2\pi i \omega h} = \gamma(0) = \sigma_w^2

i.e. white noise has a flat spectrum, with the same variance at every frequency.

Spectral Representation of ARMA Process

For a causal & invertible ARMA(p,q) process φ(B)X_t = θ(B)W_t, W_t ~ WN(0, σ_w²), one can show that its spectral density is given by

f(\omega) = \sigma_w^2 \, \frac{|\theta(e^{-2\pi i \omega})|^2}{|\phi(e^{-2\pi i \omega})|^2}

where \phi(z) = 1 - \sum_{j=1}^{p} \phi_j z^j and \theta(z) = 1 + \sum_{k=1}^{q} \theta_k z^k, i.e. a ratio of squared complex moduli of the MA & AR polynomials.
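A minimal R sketch of this formula (the function name and arguments are mine): evaluate the AR & MA polynomials at the point e^{−2πiω} on the unit circle and take the ratio of squared moduli.

# Sketch: ARMA(p,q) spectral density at a single frequency w in [-1/2, 1/2]
arma_spec_density <- function(w, phi = numeric(0), theta = numeric(0), sigma2 = 1) {
  z   <- exp(-2i * pi * w)                            # e^{-2*pi*i*w} on the unit circle
  num <- Mod(1 + sum(theta * z^seq_along(theta)))^2   # |theta(e^{-2*pi*i*w})|^2
  den <- Mod(1 - sum(phi   * z^seq_along(phi)))^2     # |phi(e^{-2*pi*i*w})|^2
  sigma2 * num / den
}
arma_spec_density(0.1, theta = 0.5)      # MA(1): equals 1.25 + cos(2*pi*0.1)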

Example

Spectral density of MA(1): X_t = W_t + θ W_{t−1}

Example

For X_t = W_t + 0.5 W_{t−1} with θ = 0.5, σ_w² = 1:

f(\omega) = 1.25 + \cos(2\pi\omega)

[Plot of f(ω).]
In R: arma.spec(ma = .5, var.noise = 1, log = 'no'), where ma specifies the MA part, var.noise is σ_w², and log = 'no' means f(ω) is not plotted on a log scale.

Example

Spectral density of AR(1): X_t = φ X_{t−1} + W_t

Example

For X_t = 0.5 X_{t−1} + W_t with φ = 0.5, σ_w² = 1:

f(\omega) = \frac{1}{1.25 - \cos(2\pi\omega)}

[Plot of f(ω).]

Example

Spectral densities f(ω) of:

AR(2): X_t = .9 X_{t−1} − .5 X_{t−2} + W_t
MA(2): X_t = W_t + .6 W_{t−1} + .6 W_{t−2}
ARMA(2,2): X_t = .9 X_{t−1} − .5 X_{t−2} + W_t + .6 W_{t−1} + .6 W_{t−2}

[Plots of the three spectral densities.]
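These three spectra can be drawn with astsa::arma.spec, as in the MA(1) example above (a sketch, using the coefficients as reconstructed on this slide):

library(astsa)                                           # provides arma.spec
par(mfrow = c(3, 1))
arma.spec(ar = c(.9, -.5), log = 'no')                   # AR(2)
arma.spec(ma = c(.6, .6), log = 'no')                    # MA(2)
arma.spec(ar = c(.9, -.5), ma = c(.6, .6), log = 'no')   # ARMA(2,2)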

Example

Signals from an earthquake (eq) & an explosion (ex). [Time series plots of the (eq) and (ex) series, roughly 2000 time points each.] We want an automatic way to distinguish the two phenomena based on the characteristics of the series.

Example (cont'd)

[ACF and PACF plots of the (eq) and (ex) series, for lags 0 to 100.]

Example (cont'd)

Can try to use ARMA, but the fitted models are hard to interpret in terms of signal behavior:

(eq) ARIMA(3,0,4) with zero mean
         ar1      ar2     ar3     ma1      ma2      ma3      ma4
      2.3782  -1.8504  0.4543  0.3872  -0.5436  -0.4937  -0.2900
s.e.  0.0695   0.1357  0.0688  0.0718   0.0497   0.0632   0.0567
sigma^2 estimated as 0.000144: log likelihood = 1504.99

(ex) ARIMA(3,0,4) with zero mean
         ar1      ar2     ar3      ma1      ma2     ma3     ma4
      1.9941  -1.5476  0.3741  -0.9165  -0.6511  0.6297  0.0915
s.e.  0.1269   0.2086  0.1222   0.1326   0.1154  0.0965  0.0970
sigma^2 estimated as 0.002096: log likelihood = 831.22

Example (cont'd)

Look at the corresponding spectral densities instead. [Plots of the fitted spectral densities f(ω) for (eq) and (ex).]

Example (cont'd)

Or, more commonly, use the log-spectrum log f(ω). [The same plots with a log scale on the y-axis.]

Parametric Spectral Estimation

The spectral densities in the previous example were derived from fitted ARMA models, i.e. from the estimated parameters \hat\phi_1, \ldots, \hat\phi_p, \hat\theta_1, \ldots, \hat\theta_q, \hat\sigma_w^2. This is called parametric spectral estimation. In practice, it is often preferable to use an AR instead of an ARMA model (see the sketch below):
- AR is easier/faster to fit (with Yule-Walker)
- there is always an AR model that approximates f(ω) pretty well, but possibly with a large order
- can use AIC/BIC to select the AR model order p
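A minimal R sketch of this approach (assuming the earthquake series is available; in the astsa package the earthquake/explosion data appear as EQ5 and EXP6): stats::spec.ar fits an AR model by Yule-Walker and, when no order is given, chooses it by AIC.

library(astsa)
eq <- EQ5                              # assumption: astsa's earthquake series
spec.ar(eq, method = "yule-walker")    # AR order chosen by AIC; plots the estimate of f(omega)
# equivalently, as on the next slide: spectrum(eq, method = "ar")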

Example

AR vs ARMA parametric estimates of f(ω). [Plots of log f(ω): for (eq), an AR(12) estimate vs the ARMA(3,4) estimate (dashed); for (ex), an AR(26) estimate vs the ARMA(3,4) estimate (dashed).] Parametric f(ω) estimation with AR is better suited for spectra with lots of peaks.
R function: spectrum(eq, method="ar")

Nonparametric Spectral Estimation

It is possible to avoid fitting an ARMA model & estimate f(ω) directly from the data instead. This method is called nonparametric spectral estimation and uses the periodogram. For data (x_1, …, x_n) & frequencies ω_j = j/n, j = 0, …, n−1:

Discrete Fourier Transform (DFT):
d(\omega_j) = n^{-1/2} \sum_{t=1}^{n} x_t\, e^{-2\pi i \omega_j t}

Periodogram:
I(\omega_j) = |d(\omega_j)|^2, \qquad j = 0, \ldots, n-1

The ω_j = j/n, j = 0, …, n−1 are called the Fourier or fundamental frequencies.
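A short R sketch of these definitions (variable names are mine): R's fft computes the unnormalized DFT sum, so its squared modulus divided by n gives the periodogram at the fundamental frequencies.

x    <- rnorm(256)                   # any data series
n    <- length(x)
I    <- Mod(fft(x))^2 / n            # periodogram I(omega_j), j = 0, ..., n-1
freq <- (0:(n - 1)) / n              # fundamental frequencies omega_j = j/n
plot(freq[2:(n/2)], I[2:(n/2)], type = "h")   # usually plotted for 0 < omega < 1/2
# spec.pgram(x, taper = 0, detrend = FALSE) returns the same values,
# up to R's scaling/plotting conventions.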

Periodogram

The periodogram is like a sample version of f(ω):

I(\omega_j) = \sum_{h=-(n-1)}^{n-1} \hat\gamma(h)\, e^{-2\pi i \omega_j h} \qquad \& \qquad f(\omega) = \sum_{h=-\infty}^{\infty} \gamma(h)\, e^{-2\pi i \omega h}

In particular, as n → ∞, we have

E[I(\omega_j)] \to \sum_{h=-\infty}^{\infty} \gamma(h)\, e^{-2\pi i \omega h} = f(\omega)

i.e. the periodogram I(ω_j) is an asymptotically unbiased estimate of the spectral density f(ω), where ω_j is the fundamental frequency closest to ω (for the given number n of data points).
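A quick simulation sketch of this property (my own illustration): averaging the periodograms of many white-noise series should give roughly the flat spectrum f(ω) = σ² = 1 at every frequency.

n <- 128; reps <- 2000
Iavg <- rowMeans(replicate(reps, Mod(fft(rnorm(n)))^2 / n))   # average periodogram
summary(Iavg[2:(n/2)])       # values hover around f(omega) = 1 for WN(0,1)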

Example

[Plots on a log scale: raw periodogram I(ω) vs the ARMA-based f(ω) (dashed), for (eq) and (ex).]
R function: spec.pgram(eq, taper=0)

Smoothed Periodogram

The raw periodogram tends to be noisy (choppy), so we typically look at the smoothed periodogram instead. To smooth I(ω_j), use a kernel smoother, i.e. a weighted moving average: average the I(ω_j) within each rolling window with certain weights (see the sketch below). [Plot of a raw periodogram on a log scale.]
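A minimal sketch of such a kernel smoother (assumed detail: a Daniell kernel of half-width 4): stats::kernel builds the weights and kernapply applies the weighted moving average to the raw periodogram.

x <- rnorm(256); n <- length(x)
I <- (Mod(fft(x))^2 / n)[2:(n/2)]          # raw periodogram at 0 < omega_j < 1/2
k <- kernel("daniell", 4)                  # equal weights over 2*4 + 1 = 9 ordinates
Ismooth <- kernapply(I, k, circular = TRUE)
plot(Ismooth, type = "l")                  # much less choppy than the raw I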

Example

[Plots on a log scale: smoothed periodogram I(ω) vs the ARMA-based f(ω) (dashed), for (eq) and (ex).]
R function: spectrum(eq, spans = c(13,3)), where spans are the smoothing parameters.

Final Exam

Monday Apr 29, 9am-12pm @ IC212; 3 hrs total, ~8 questions. Aids allowed:
- scientific calculator
- one 2-sided, standard letter-sized (8½ × 11) aid sheet with your own notes
Will hold office hours before the exam; see announcements on Bb.

Final Exam

Material covered: everything in lectures 1-23. In terms of the textbook:
- Chapter 1: sections 1.1-1.6
- Chapter 2: sections 2.1-2.3 (2.2 is a regression review; just read it through)
- Chapter 3: sections 3.1-3.9 (for 3.3, just study the examples, not the solutions to difference equations)
- Chapter 4: sections 4.1-4.3
- Chapter 5: sections 5.2 and 5.4

Final Exam

There will be no R programming questions on the final, but you should know how to interpret R output. To prepare: go over the problem sets, term tests & assignments. You can also look at past exams from the library, but I don't have solutions for those. Note: there will be no re-marking of term tests & assignments after Mon, April 22; contact me earlier if you suspect a marking error.