Introduction to Time Series Analysis. Lecture 19.


1 Introduction to Time Series Analysis. Lecture 19.

1. Review: Spectral density estimation, sample autocovariance.
2. The periodogram and sample autocovariance.
3. Asymptotics of the periodogram.

2 Estimating the Spectrum: Outline

We have seen that the spectral density gives an alternative view of stationary time series. Given a realization $x_1, \ldots, x_n$ of a time series, how can we estimate the spectral density?

One approach: replace $\gamma(\cdot)$ in the definition
$$f(\nu) = \sum_{h=-\infty}^{\infty} \gamma(h) e^{-2\pi i \nu h},$$
with the sample autocovariance $\hat\gamma(\cdot)$.

Another approach, called the periodogram: compute $I(\nu)$, the squared modulus of the discrete Fourier transform (at frequencies $\nu = k/n$).

3 Estimating the spectrum: Outline

These two approaches are identical at the Fourier frequencies $\nu = k/n$.

The asymptotic expectation of the periodogram $I(\nu)$ is $f(\nu)$. We can derive some asymptotic properties, and hence do hypothesis testing.

Unfortunately, the asymptotic variance of $I(\nu)$ is constant. It is not a consistent estimator of $f(\nu)$.

4 Review: Spectral density estimation

If a time series $\{X_t\}$ has autocovariance $\gamma$ satisfying $\sum_{h=-\infty}^{\infty} |\gamma(h)| < \infty$, then we define its spectral density as
$$f(\nu) = \sum_{h=-\infty}^{\infty} \gamma(h) e^{-2\pi i \nu h}$$
for $-\infty < \nu < \infty$.

5 Review: Sample autocovariance

Idea: use the sample autocovariance $\hat\gamma(\cdot)$, defined by
$$\hat\gamma(h) = \frac{1}{n} \sum_{t=1}^{n-|h|} (x_{t+|h|} - \bar{x})(x_t - \bar{x}), \qquad \text{for } -n < h < n,$$
as an estimate of the autocovariance $\gamma(\cdot)$, and then use
$$\hat{f}(\nu) = \sum_{h=-n+1}^{n-1} \hat\gamma(h) e^{-2\pi i \nu h}$$
for $-1/2 \le \nu \le 1/2$.
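A minimal NumPy sketch of this estimator, computing $\hat\gamma(h)$ and then $\hat f(\nu)$ by direct summation (the function names are illustrative and no attempt is made at efficiency):

import numpy as np

def sample_autocovariance(x, h):
    # gamma_hat(h) = (1/n) * sum_{t=1}^{n-|h|} (x_{t+|h|} - xbar)(x_t - xbar)
    n, h, xbar = len(x), abs(h), x.mean()
    return np.sum((x[h:] - xbar) * (x[:n - h] - xbar)) / n

def f_hat(x, nu):
    # f_hat(nu) = sum_{h=-(n-1)}^{n-1} gamma_hat(h) * exp(-2*pi*i*nu*h)
    n = len(x)
    hs = np.arange(-(n - 1), n)
    gam = np.array([sample_autocovariance(x, h) for h in hs])
    return np.sum(gam * np.exp(-2j * np.pi * nu * hs)).real

# White noise example: the true spectral density is the constant sigma^2 = 1
x = np.random.default_rng(0).normal(size=256)
print(f_hat(x, 0.25))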

6 Discrete Fourier transform

For a sequence $(x_1, \ldots, x_n)$, define the discrete Fourier transform (DFT) as $(X(\nu_0), X(\nu_1), \ldots, X(\nu_{n-1}))$, where
$$X(\nu_k) = \frac{1}{\sqrt{n}} \sum_{t=1}^{n} x_t e^{-2\pi i \nu_k t},$$
and $\nu_k = k/n$ (for $k = 0, 1, \ldots, n-1$) are called the Fourier frequencies. (Think of $\{\nu_k : k = 0, \ldots, n-1\}$ as the discrete version of the frequency range $\nu \in [0,1]$.)

First, let's show that we can view the DFT as a representation of $x$ in a different basis, the Fourier basis.
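A short sketch of the DFT as defined here (with the $1/\sqrt{n}$ scaling and the time index running from 1 to $n$), compared against NumPy's FFT, which sums over $t = 0, \ldots, n-1$ and uses no scaling:

import numpy as np

def dft(x):
    # X(nu_k) = n^{-1/2} * sum_{t=1}^{n} x_t * exp(-2*pi*i*nu_k*t), with nu_k = k/n
    n = len(x)
    t = np.arange(1, n + 1)
    nus = np.arange(n) / n
    return np.array([np.sum(x * np.exp(-2j * np.pi * nu * t)) for nu in nus]) / np.sqrt(n)

x = np.random.default_rng(1).normal(size=8)
n = len(x)
X = dft(x)
# numpy's fft differs by the phase factor exp(-2*pi*i*k/n) (from the index shift)
# and by the 1/sqrt(n) scaling
X_fft = np.fft.fft(x) * np.exp(-2j * np.pi * np.arange(n) / n) / np.sqrt(n)
print(np.allclose(X, X_fft))    # True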

7 Discrete Fourier transform

Consider the space $\mathbb{C}^n$ of vectors of $n$ complex numbers, with inner product $\langle a, b \rangle = a^* b$, where $a^*$ is the complex conjugate transpose of the vector $a \in \mathbb{C}^n$.

Suppose that a set $\{\phi_j : j = 0, 1, \ldots, n-1\}$ of $n$ vectors in $\mathbb{C}^n$ is orthonormal:
$$\langle \phi_j, \phi_k \rangle = \begin{cases} 1 & \text{if } j = k, \\ 0 & \text{otherwise.} \end{cases}$$
Then these $\{\phi_j\}$ span the vector space $\mathbb{C}^n$, and so for any vector $x$ we can write $x$ in terms of this new orthonormal basis:
$$x = \sum_{j=0}^{n-1} \langle \phi_j, x \rangle \phi_j. \qquad \text{(picture)}$$

8 Discrete Fourier transform

Consider the following set of $n$ vectors in $\mathbb{C}^n$:
$$\left\{ e_j = \frac{1}{\sqrt{n}} \left( e^{2\pi i \nu_j}, e^{2\pi i 2\nu_j}, \ldots, e^{2\pi i n \nu_j} \right)' : j = 0, \ldots, n-1 \right\}.$$
It is easy to check that these vectors are orthonormal:
$$\langle e_j, e_k \rangle = \frac{1}{n} \sum_{t=1}^{n} e^{2\pi i t (\nu_k - \nu_j)}
= \frac{1}{n} \sum_{t=1}^{n} \left( e^{2\pi i (k-j)/n} \right)^t
= \begin{cases} 1 & \text{if } j = k, \\[4pt] \dfrac{1}{n}\, e^{2\pi i (k-j)/n} \dfrac{1 - \left(e^{2\pi i (k-j)/n}\right)^n}{1 - e^{2\pi i (k-j)/n}} & \text{otherwise} \end{cases}
= \begin{cases} 1 & \text{if } j = k, \\ 0 & \text{otherwise,} \end{cases}$$

9 Discrete Fourier transform

where we have used the fact that $S_n = \sum_{t=1}^{n} \alpha^t$ satisfies $\alpha S_n = S_n + \alpha^{n+1} - \alpha$, and so $S_n = \alpha(1 - \alpha^n)/(1 - \alpha)$ for $\alpha \ne 1$.

So we can represent the real vector $x = (x_1, \ldots, x_n)' \in \mathbb{C}^n$ in terms of this orthonormal basis:
$$x = \sum_{j=0}^{n-1} \langle e_j, x \rangle e_j = \sum_{j=0}^{n-1} X(\nu_j) e_j.$$
That is, the vector of discrete Fourier transform coefficients $(X(\nu_0), \ldots, X(\nu_{n-1}))$ is the representation of $x$ in the Fourier basis.
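The orthonormality of the Fourier basis is easy to check numerically; a small sketch:

import numpy as np

n = 6
t = np.arange(1, n + 1)
# e_j = n^{-1/2} * (exp(2*pi*i*nu_j), exp(2*pi*i*2*nu_j), ..., exp(2*pi*i*n*nu_j))
E = np.array([np.exp(2j * np.pi * (j / n) * t) for j in range(n)]) / np.sqrt(n)
# Gram matrix of inner products <e_j, e_k> = e_j^* e_k; it should be the identity
G = np.conj(E) @ E.T
print(np.allclose(G, np.eye(n)))    # True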

10 Discrete Fourier transform

An alternative way to represent the DFT is by separately considering the real and imaginary parts,
$$X(\nu_j) = \langle e_j, x \rangle = \frac{1}{\sqrt{n}} \sum_{t=1}^{n} e^{-2\pi i t \nu_j} x_t
= \frac{1}{\sqrt{n}} \sum_{t=1}^{n} \cos(2\pi t \nu_j) x_t \;-\; i\, \frac{1}{\sqrt{n}} \sum_{t=1}^{n} \sin(2\pi t \nu_j) x_t
= X_c(\nu_j) - i X_s(\nu_j),$$
where this defines the sine and cosine transforms, $X_s$ and $X_c$, of $x$.

11 Periodogram

The periodogram is defined as
$$I(\nu_j) = |X(\nu_j)|^2 = \frac{1}{n} \left| \sum_{t=1}^{n} e^{-2\pi i t \nu_j} x_t \right|^2 = X_c^2(\nu_j) + X_s^2(\nu_j),$$
where
$$X_c(\nu_j) = \frac{1}{\sqrt{n}} \sum_{t=1}^{n} \cos(2\pi t \nu_j) x_t, \qquad
X_s(\nu_j) = \frac{1}{\sqrt{n}} \sum_{t=1}^{n} \sin(2\pi t \nu_j) x_t.$$
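A direct sketch of the periodogram through the sine and cosine transforms (equivalently, via the squared modulus of the DFT); the function name is illustrative:

import numpy as np

def periodogram(x):
    # I(nu_j) = X_c(nu_j)^2 + X_s(nu_j)^2 at the Fourier frequencies nu_j = j/n
    n = len(x)
    t = np.arange(1, n + 1)
    nus = np.arange(n) / n
    Xc = np.array([np.sum(np.cos(2 * np.pi * t * nu) * x) for nu in nus]) / np.sqrt(n)
    Xs = np.array([np.sum(np.sin(2 * np.pi * t * nu) * x) for nu in nus]) / np.sqrt(n)
    return nus, Xc**2 + Xs**2

nus, I = periodogram(np.random.default_rng(2).normal(size=128))
print(nus[:3], I[:3])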

12 Periodogram

Since $I(\nu_j) = |X(\nu_j)|^2$ for one of the Fourier frequencies $\nu_j = j/n$ (for $j = 0, 1, \ldots, n-1$), the orthonormality of the $e_j$ implies that we can write
$$x^* x = \left\langle \sum_{j=0}^{n-1} X(\nu_j) e_j,\; \sum_{j=0}^{n-1} X(\nu_j) e_j \right\rangle = \sum_{j=0}^{n-1} |X(\nu_j)|^2 = \sum_{j=0}^{n-1} I(\nu_j).$$
For $\bar{x} = 0$, we can write this as
$$\hat\sigma_x^2 = \frac{1}{n} \sum_{t=1}^{n} x_t^2 = \frac{1}{n} \sum_{j=0}^{n-1} I(\nu_j).$$
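This identity can be checked numerically; a quick sketch using the periodogram computed directly from its definition:

import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=100)
n = len(x)
t = np.arange(1, n + 1)
# I(nu_j) = (1/n) * |sum_t x_t * exp(-2*pi*i*t*nu_j)|^2
I = np.array([abs(np.sum(x * np.exp(-2j * np.pi * (k / n) * t)))**2 / n for k in range(n)])
# x'x = sum_j I(nu_j), by orthonormality of the Fourier basis
print(np.allclose(np.sum(x**2), np.sum(I)))    # True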

13 Periodogram

This is the discrete analog of the identity
$$\sigma_x^2 = \gamma_x(0) = \int_{-1/2}^{1/2} f_x(\nu)\, d\nu.$$
(Think of $I(\nu_j)$ as the discrete version of $f(\nu)$ at the frequency $\nu_j = j/n$, and think of $\frac{1}{n} \sum_{\nu_j}$ as the discrete version of $\int_\nu d\nu$.)

14 Estimating the spectrum: Periodogram

Why is the periodogram at a Fourier frequency (that is, $\nu = \nu_j$) the same as computing $\hat f(\nu)$ from the sample autocovariance?

Almost the same: they are not the same at $\nu_0 = 0$ when $\bar{x} \ne 0$. But if either $\bar{x} = 0$, or we consider a Fourier frequency $\nu_j$ with $j \in \{1, \ldots, n-1\}$, ...

15 Estimating the spectrum: Periodogram

$$\begin{aligned}
I(\nu_j) &= \frac{1}{n} \left| \sum_{t=1}^{n} e^{-2\pi i t \nu_j} x_t \right|^2
= \frac{1}{n} \left| \sum_{t=1}^{n} e^{-2\pi i t \nu_j} (x_t - \bar{x}) \right|^2 \\
&= \frac{1}{n} \left( \sum_{s=1}^{n} e^{-2\pi i s \nu_j} (x_s - \bar{x}) \right) \left( \sum_{t=1}^{n} e^{2\pi i t \nu_j} (x_t - \bar{x}) \right) \\
&= \frac{1}{n} \sum_{s,t} e^{-2\pi i (s-t) \nu_j} (x_s - \bar{x})(x_t - \bar{x})
= \sum_{h=-n+1}^{n-1} \hat\gamma(h) e^{-2\pi i h \nu_j},
\end{aligned}$$
where the fact that $\nu_j \ne 0$ implies $\sum_{t=1}^{n} e^{-2\pi i t \nu_j} = 0$ (we showed this when we were verifying the orthonormality of the Fourier basis) has allowed us to subtract the sample mean in that case.
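As a numerical check of this equivalence, here is a sketch comparing the raw periodogram ordinate with the sample-autocovariance estimate at a nonzero Fourier frequency:

import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(size=64)
n, xbar = len(x), x.mean()
t = np.arange(1, n + 1)

def gamma_hat(h):
    h = abs(h)
    return np.sum((x[h:] - xbar) * (x[:n - h] - xbar)) / n

k = 5                               # any Fourier frequency index with k != 0
nu = k / n
I_nu = abs(np.sum(x * np.exp(-2j * np.pi * nu * t)))**2 / n
hs = np.arange(-(n - 1), n)
f_hat_nu = np.sum([gamma_hat(h) * np.exp(-2j * np.pi * nu * h) for h in hs]).real
print(np.allclose(I_nu, f_hat_nu))  # True: identical at nu_j = j/n for j = 1, ..., n-1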

16 Asymptotic properties of the periodogram

We want to understand the asymptotic behavior of the periodogram $I(\nu)$ at a particular frequency $\nu$, as $n$ increases. We'll see that its expectation converges to $f(\nu)$.

We'll start with a simple example: suppose that $X_1, \ldots, X_n$ are i.i.d. $N(0, \sigma^2)$ (Gaussian white noise). From the definitions
$$X_c(\nu_j) = \frac{1}{\sqrt{n}} \sum_{t=1}^{n} \cos(2\pi t \nu_j) x_t, \qquad
X_s(\nu_j) = \frac{1}{\sqrt{n}} \sum_{t=1}^{n} \sin(2\pi t \nu_j) x_t,$$
we have that $X_c(\nu_j)$ and $X_s(\nu_j)$ are normal, with $E X_c(\nu_j) = E X_s(\nu_j) = 0$.

17 Asymptotic properties of the periodogram

Also,
$$\mathrm{Var}(X_c(\nu_j)) = \frac{\sigma^2}{n} \sum_{t=1}^{n} \cos^2(2\pi t \nu_j)
= \frac{\sigma^2}{2n} \sum_{t=1}^{n} \left( \cos(4\pi t \nu_j) + 1 \right) = \frac{\sigma^2}{2},$$
since $\sum_{t=1}^{n} \cos(4\pi t \nu_j) = 0$ (for $\nu_j \ne 0, 1/2$). Similarly, $\mathrm{Var}(X_s(\nu_j)) = \sigma^2/2$.

18 Asymptotic properties of the periodogram

Also,
$$\mathrm{Cov}(X_c(\nu_j), X_s(\nu_j)) = \frac{\sigma^2}{n} \sum_{t=1}^{n} \cos(2\pi t \nu_j) \sin(2\pi t \nu_j)
= \frac{\sigma^2}{2n} \sum_{t=1}^{n} \sin(4\pi t \nu_j) = 0,$$
and
$$\mathrm{Cov}(X_c(\nu_j), X_c(\nu_k)) = 0, \qquad
\mathrm{Cov}(X_s(\nu_j), X_s(\nu_k)) = 0, \qquad
\mathrm{Cov}(X_c(\nu_j), X_s(\nu_k)) = 0,$$
for any $j \ne k$.

19 Asymptotic properties of the periodogram

That is, if $X_1, \ldots, X_n$ are i.i.d. $N(0, \sigma^2)$ (Gaussian white noise; $f(\nu) = \sigma^2$), then the $X_c(\nu_j)$ and $X_s(\nu_j)$ are all i.i.d. $N(0, \sigma^2/2)$. Thus,
$$\frac{2}{\sigma^2} I(\nu_j) = \frac{2}{\sigma^2} \left( X_c^2(\nu_j) + X_s^2(\nu_j) \right) \sim \chi^2_2.$$
So for the case of Gaussian white noise, the periodogram has a chi-squared distribution that depends on the variance $\sigma^2$ (which, in this case, is the spectral density).
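A small simulation sketch of this distributional fact for Gaussian white noise: the scaled periodogram ordinate should behave like a $\chi^2_2$ variable, which has mean 2 and variance 4.

import numpy as np

rng = np.random.default_rng(5)
sigma2, n, reps, k = 2.0, 256, 5000, 10    # k indexes the Fourier frequency nu_k = k/n
t = np.arange(1, n + 1)
e = np.exp(-2j * np.pi * (k / n) * t)

I = np.empty(reps)
for r in range(reps):
    x = rng.normal(scale=np.sqrt(sigma2), size=n)
    I[r] = abs(np.sum(x * e))**2 / n       # periodogram ordinate at nu_k

z = 2 * I / sigma2                         # should be chi-squared with 2 degrees of freedom
print(z.mean(), z.var())                   # approximately 2 and 4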

20 Asymptotic properties of the periodogram

Under more general conditions (e.g., normal $\{X_t\}$, or linear process $\{X_t\}$ with rapidly decaying ACF), the $X_c(\nu_j), X_s(\nu_j)$ are all asymptotically independent and $N(0, f(\nu_j)/2)$.

Consider a frequency $\nu$. For a given value of $n$, let $\hat\nu^{(n)}$ be the closest Fourier frequency (that is, $\hat\nu^{(n)} = j/n$ for a value of $j$ that minimizes $|\nu - j/n|$). As $n$ increases, $\hat\nu^{(n)} \to \nu$, and (under the same conditions that ensure the asymptotic normality and independence of the sine/cosine transforms), $f(\hat\nu^{(n)}) \to f(\nu)$. (picture)

In that case, we have
$$\frac{2}{f(\nu)} I(\hat\nu^{(n)}) = \frac{2}{f(\nu)} \left( X_c^2(\hat\nu^{(n)}) + X_s^2(\hat\nu^{(n)}) \right) \stackrel{d}{\to} \chi^2_2.$$

21 Asymptotic properties of the periodogram

Thus,
$$E\, I(\hat\nu^{(n)}) = \frac{f(\nu)}{2}\, E\left( \frac{2}{f(\nu)} \left( X_c^2(\hat\nu^{(n)}) + X_s^2(\hat\nu^{(n)}) \right) \right)
\to \frac{f(\nu)}{2}\, E\left( Z_1^2 + Z_2^2 \right) = f(\nu),$$
where $Z_1, Z_2$ are independent $N(0,1)$.

Thus, the periodogram is asymptotically unbiased.
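To illustrate this asymptotic unbiasedness beyond white noise, here is a rough Monte Carlo sketch for a Gaussian AR(1) process, whose spectral density in the convention used here is $f(\nu) = \sigma^2 / |1 - \phi e^{-2\pi i \nu}|^2$ (the parameter values and burn-in length are arbitrary choices):

import numpy as np

rng = np.random.default_rng(6)
phi, sigma2 = 0.6, 1.0
n, reps, nu = 512, 2000, 0.2
k = int(round(nu * n))                    # nearest Fourier frequency index
t = np.arange(1, n + 1)
e = np.exp(-2j * np.pi * (k / n) * t)

I = np.empty(reps)
for r in range(reps):
    w = rng.normal(scale=np.sqrt(sigma2), size=n + 200)
    x = np.zeros(n + 200)
    for s in range(1, n + 200):           # simulate AR(1): x_t = phi * x_{t-1} + w_t
        x[s] = phi * x[s - 1] + w[s]
    x = x[200:]                           # discard burn-in
    I[r] = abs(np.sum(x * e))**2 / n      # periodogram ordinate at nu_k = k/n

f_true = sigma2 / abs(1 - phi * np.exp(-2j * np.pi * k / n))**2
print(I.mean(), f_true)                   # the mean periodogram ordinate is close to f(nu)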

22 Introduction to Time Series Analysis. Lecture 19.

1. Review: Spectral density estimation, sample autocovariance.
2. The periodogram and sample autocovariance.
3. Asymptotics of the periodogram.
