
Stochastic Processes: I

consider bowl of worms model for oscilloscope experiment:

[Figure: SAPAscope 2.0 oscilloscope display with RESET control]

SAPA2e 22, 23 II 1

Stochastic Processes: II

a stochastic process is:
informally: bowl + drawing mechanism
formally: family of random variables (RVs) indexed by $t$

$t$ called time for convenience

real-valued RV: mapping from sample space to real line

denote individual real-valued RVs by $X_t$ or $X(t)$

notation:
discrete parameter stochastic process: $\{X_t\} = \{X_t : t \in \mathbb{Z}\}$, where $\mathbb{Z} = \{\ldots, -2, -1, 0, 1, 2, \ldots\}$
continuous parameter stochastic process: $\{X(t)\} = \{X(t) : t \in \mathbb{R}\}$, where $\mathbb{R} = \{t : -\infty < t < \infty\}$

SAPA2e 22, 23, 24 II 2

Basic Theory for Stochastic Processes: I

cumulative probability distribution function (CPDF): $F_t(a) \equiv P[X_t \le a]$ (note dependence on $t$)

can define bivariate CPDF (and multivariate CPDF):
$$F_{t_1,t_2}(a_1, a_2) \equiv P[X_{t_1} \le a_1,\, X_{t_2} \le a_2]$$

too complicated to use as models for time series

summarize using moments (assuming they exist)

SAPA2e 24, 25 II 3

Basic Theory for Stochastic Processes: II

first moment:
$$\mu_t \equiv E\{X_t\} = \int_{-\infty}^{\infty} x \, dF_t(x) \quad \text{(see pages 26--7)}$$
$$= \int_{-\infty}^{\infty} x f_t(x) \, dx \quad \text{if } f_t(x) = \frac{dF_t(x)}{dx} \text{ exists}$$
$$= \sum_i x_i\, p(x_i) \quad \text{if } F_t(\cdot) \text{ is a step function}$$

second-order central moments:
$$\operatorname{cov}\{X_{t_0}, X_{t_1}\} \equiv E\{(X_{t_0} - \mu_{t_0})(X_{t_1} - \mu_{t_1})\}$$
$$\sigma_t^2 \equiv \operatorname{var}\{X_t\} = \int_{-\infty}^{\infty} (x - \mu_t)^2 \, dF_t(x)$$

still too complicated!

SAPA2e 25, 26 II 4

Simplification: Stationary Processes

$\{X_t\}$ is by definition stationary if:
1. $E\{X_t\} = \mu$, a finite constant independent of $t$
2. $\operatorname{cov}\{X_{t+\tau}, X_t\} = s_\tau$, a finite constant that can depend on $\tau$ (known as the lag), but not on $t$

autocovariance sequence (ACVS) for $\{X_t\}$: $\{s_\tau\} = \{s_\tau : \tau \in \mathbb{Z}\}$

autocovariance function (ACVF) for $\{X(t)\}$: $s(\cdot)$ = function of $\tau$ defined by $s(\tau)$ for $\tau \in \mathbb{R}$

several terms for this type of stationarity: second-order, covariance, wide sense, weak

other type of stationarity: complete (strong, strict) stationarity

SAPA2e 28, 29 II 5

Basic Properties: I

$\operatorname{var}\{X_t\} = \operatorname{cov}\{X_t, X_t\} = s_0$ (i.e., variance constant over $t$)

$s_{-\tau} = s_\tau$ (i.e., symmetric about $\tau = 0$)

correlation between $X_t$ and $X_{t+\tau}$:
$$\rho_\tau \equiv \operatorname{corr}\{X_{t+\tau}, X_t\} = \frac{\operatorname{cov}\{X_{t+\tau}, X_t\}}{\sqrt{\operatorname{var}\{X_t\}\operatorname{var}\{X_{t+\tau}\}}} = \frac{\operatorname{cov}\{X_{t+\tau}, X_t\}}{\operatorname{var}\{X_t\}} = \frac{s_\tau}{s_0}$$

autocorrelation sequence (ACS) for $\{X_t\}$: $\{\rho_\tau\} = \{\rho_\tau : \tau \in \mathbb{Z}\}$

autocorrelation function (ACF) for $\{X(t)\}$: $\rho(\cdot)$ = function defined by $\rho(\tau)$ for $\tau \in \mathbb{R}$

$|\rho_\tau| \le 1$ implies $|s_\tau| \le s_0$

SAPA2e 29 II 6
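Aside (not part of the SAPA2e slides): the ACVS and ACS can be estimated from a single realization. Below is a minimal NumPy sketch of the usual biased estimators; the names `sample_acvs` and `sample_acs` are my own.

```python
import numpy as np

def sample_acvs(x, max_lag):
    """Biased sample ACVS: s_hat_tau = (1/N) * sum_t (x_{t+tau} - xbar)(x_t - xbar)."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    xc = x - x.mean()
    return np.array([np.sum(xc[tau:] * xc[:N - tau]) / N
                     for tau in range(max_lag + 1)])

def sample_acs(x, max_lag):
    """Sample ACS: rho_hat_tau = s_hat_tau / s_hat_0."""
    s = sample_acvs(x, max_lag)
    return s / s[0]

# for white noise (next slides), rho_hat_tau should be near 0 for all tau != 0
rng = np.random.default_rng(0)
print(np.round(sample_acs(rng.standard_normal(1000), 3), 2))
```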

Basic Properties: II

$\{s_\tau\}$ is positive semidefinite, i.e.,
$$\sum_{j=0}^{n-1} \sum_{k=0}^{n-1} s_{t_j - t_k}\, a_j a_k \ge 0 \qquad (1)$$
holds for any positive integer $n$, any set of $n$ real numbers $a_0, \ldots, a_{n-1}$ and any set of $n$ integers $t_0, \ldots, t_{n-1}$

proof: for $W \equiv \sum_{j=0}^{n-1} a_j X_{t_j} = \mathbf{a}^T \mathbf{V}$, have
$$0 \le \operatorname{var}\{W\} = \operatorname{var}\{\mathbf{a}^T \mathbf{V}\} \stackrel{[2.2]}{=} \mathbf{a}^T \Sigma\, \mathbf{a}$$
$\Sigma$ is var/cov matrix for $\mathbf{V}$, i.e., its $(j,k)$th element is $\operatorname{cov}\{X_{t_j}, X_{t_k}\} = s_{t_j - t_k}$, so $\mathbf{a}^T \Sigma\, \mathbf{a} =$ left-hand side of (1)

(1) imposes severe limitation (Makhoul, 1990)

SAPA2e 29, 30 II 7

Basic Properties: III

var/cov matrix for $[X_0, \ldots, X_{N-1}]^T$ is Toeplitz; when $N = 6$, have
$$\begin{bmatrix} s_0 & s_1 & s_2 & s_3 & s_4 & s_5 \\ s_1 & s_0 & s_1 & s_2 & s_3 & s_4 \\ s_2 & s_1 & s_0 & s_1 & s_2 & s_3 \\ s_3 & s_2 & s_1 & s_0 & s_1 & s_2 \\ s_4 & s_3 & s_2 & s_1 & s_0 & s_1 \\ s_5 & s_4 & s_3 & s_2 & s_1 & s_0 \end{bmatrix}$$

if $\{X_t\}$ Gaussian, completely characterized by $\mu$ and $\{s_\tau\}$

extension: complex-valued stationary process $\{Z_t\}$:
$$\operatorname{cov}\{Z_{t+\tau}, Z_t\} \equiv E\{(Z_{t+\tau} - \mu)(Z_t - \mu)^*\} = s_\tau$$
(asterisk indicates complex conjugate)

now have $s_{-\tau} = s_\tau^*$ (rather than $s_{-\tau} = s_\tau$)

SAPA2e 30, 31 II 8
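Aside: both the Toeplitz structure and the positive semidefinite property are easy to see numerically. A small sketch of my own, using as a test case the MA(1) ACVS derived later in these slides ($s_0 = 2$, $s_1 = -1$ for $X_t = \epsilon_t - \epsilon_{t-1}$ with $\sigma_\epsilon^2 = 1$):

```python
import numpy as np

def acvs_to_toeplitz(s):
    """Var/cov matrix of [X_0, ..., X_{N-1}]^T built from s_0, ..., s_{N-1}."""
    n = len(s)
    idx = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))  # |j - k|
    return np.asarray(s, dtype=float)[idx]

Sigma = acvs_to_toeplitz([2.0, -1.0, 0.0, 0.0, 0.0, 0.0])
# a valid ACVS can never produce a negative eigenvalue (positive semidefiniteness)
print(np.linalg.eigvalsh(Sigma).min() >= -1e-12)  # True
```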

Example: White Noise Process

assume $E\{X_t\} = \mu$ and $\operatorname{var}\{X_t\} = \sigma^2$ ($< \infty$)

$\{X_t\}$ called white noise if, when $\tau \ne 0$, $X_{t+\tau}$ and $X_t$ are uncorrelated; i.e., $\operatorname{cov}\{X_{t+\tau}, X_t\} = 0$

white noise is stationary process with ACVS
$$s_\tau = \begin{cases} \sigma^2, & \tau = 0; \\ 0, & \text{otherwise} \end{cases}$$

six realizations of white noise follow, all with $\mu = 0$ and $\sigma^2 = 1$

SAPA2e 32, 33 II 9

White Noise with Gaussian Distribution
[Figure: one realization, plotted for t = 0 to 100]
SAPA2e 33 II 10

White Noise with Uniform Distribution
[Figure: one realization, plotted for t = 0 to 100]
SAPA2e 33 II 11

White Noise with Double Exponential Distribution
[Figure: one realization, plotted for t = 0 to 100]
SAPA2e 33 II 12

White Noise with Discrete Distribution
[Figure: one realization, plotted for t = 0 to 100]
SAPA2e 33 II 13

White Noise with Random Distributions
[Figure: one realization, plotted for t = 0 to 100]
SAPA2e 33 II 14

White Noise with Blocky Distribution
[Figure: one realization, plotted for t = 0 to 100]
SAPA2e 33 II 15

White Noise and IID Noise

all six examples have exactly the same second-order properties!

as fifth and sixth examples illustrate, no need for distributions of $X_t$ and $X_{t+1}$ to be the same for second-order stationarity to hold

related concept: IID noise, i.e., $\{X_t : t \in \mathbb{Z}\}$ are independent and identically distributed

all IID noise with finite variance is white noise, but converse is false

SAPA2e 33 II 16
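Aside: a quick NumPy sketch of the point made above. Three distributions are scaled to mean 0 and variance 1, so the resulting white noise processes share identical second-order properties; this is my reconstruction, since the slides do not spell out all six distributions used in the plots.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100

gaussian   = rng.standard_normal(N)
uniform    = rng.uniform(-np.sqrt(3), np.sqrt(3), N)  # var of U(-a, a) is a^2/3 = 1
double_exp = rng.laplace(0.0, 1.0 / np.sqrt(2), N)    # var of Laplace(0, b) is 2 b^2 = 1

for x in (gaussian, uniform, double_exp):
    # sample mean near 0 and sample variance near 1 in every case
    print(round(float(x.mean()), 2), round(float(x.var()), 2))
```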

Example: Moving Average Process: I

let $\{\epsilon_t\}$ be white noise, mean 0, variance $\sigma_\epsilon^2$

construct MA(q) process:
$$X_t = \mu - \sum_{j=0}^{q} \theta_{q,j}\, \epsilon_{t-j} = \mu - \sum_{j=-\infty}^{\infty} \theta_{q,j}\, \epsilon_{t-j},$$
where $\theta_{q,0} \ne 0$, $\theta_{q,q} \ne 0$ and $\theta_{q,j} = 0$ for $j < 0$ or $j > q$

claim: $\{X_t\}$ is stationary, which follows if we can show that
$E\{X_t\}$ does not depend on $t$
$\operatorname{cov}\{X_{t+\tau}, X_t\}$ does not depend on $t$ (but can depend on $\tau$)

now $E\{X_t\} = \mu$ (why is this true?)

SAPA2e 34 II 17

Example: Moving Average Process: II

since $X_t - \mu = -\sum_j \theta_{q,j}\, \epsilon_{t-j}$, have
$$\operatorname{cov}\{X_{t+\tau}, X_t\} = E\left\{\left(\sum_k \theta_{q,k}\, \epsilon_{t+\tau-k}\right)\left(\sum_j \theta_{q,j}\, \epsilon_{t-j}\right)\right\}$$
$$= \sum_k \sum_j \theta_{q,k}\, \theta_{q,j}\, E\{\epsilon_{t+\tau-k}\, \epsilon_{t-j}\} = \sigma_\epsilon^2 \sum_j \theta_{q,j+\tau}\, \theta_{q,j} \equiv s_\tau \quad \text{(why?)}$$

note: no restrictions on $\theta_{q,j}$'s

SAPA2e 34 II 18

Example: Moving Average Process: III

in terms of just $\theta_{q,0}, \ldots, \theta_{q,q}$, can reexpress ACVS as
$$s_\tau = \begin{cases} \sigma_\epsilon^2 \sum_{j=0}^{q-|\tau|} \theta_{q,j+|\tau|}\, \theta_{q,j}, & |\tau| \le q; \\ 0, & \text{otherwise} \end{cases}$$

note: ACVS for MA(q) process is identically zero when $|\tau| > q$

note: variance of MA(q) process is
$$s_0 = \sigma_\epsilon^2 \sum_{j=0}^{q} \theta_{q,j}^2$$

two realizations of MA(1) process $X_t = \epsilon_t - \theta_{1,1}\, \epsilon_{t-1}$ constructed from Gaussian white noise $\{\epsilon_t\}$, one with $\theta_{1,1} = +1$, and the other, $\theta_{1,1} = -1$

SAPA2e 34 II 19
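Aside: the closed-form ACVS above is straightforward to code. A sketch of my own, with the coefficient vector written under the minus-sign convention of slide II 17 (so $X_t = \epsilon_t - \theta_{1,1}\epsilon_{t-1}$ corresponds to $\theta_{1,0} = -1$), checked against the two MA(1) cases shown next:

```python
import numpy as np

def ma_acvs(theta, sigma2, max_lag):
    """ACVS of X_t = mu - sum_j theta[j] e_{t-j}, theta = [theta_{q,0}, ..., theta_{q,q}]."""
    theta = np.asarray(theta, dtype=float)
    s = np.zeros(max_lag + 1)
    for tau in range(min(len(theta) - 1, max_lag) + 1):
        s[tau] = sigma2 * np.sum(theta[tau:] * theta[:len(theta) - tau])
    return s  # s_tau for tau = 0, ..., max_lag; symmetric in tau

print(ma_acvs([-1.0, +1.0], 1.0, 3))  # [ 2. -1.  0.  0.] -> rho_1 = -1/2
print(ma_acvs([-1.0, -1.0], 1.0, 3))  # [ 2.  1.  0.  0.] -> rho_1 = +1/2
```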

MA(1) Process with $\theta_{1,1} = +1$; $\rho_1 = s_1/s_0 = -1/2$
[Figure: one realization, plotted for t = 0 to 120]
SAPA2e 35 II 20

MA(1) Process with $\theta_{1,1} = -1$; $\rho_1 = 1/2$
[Figure: one realization, plotted for t = 0 to 120]
SAPA2e 35 II 21

Example: Autoregressive Process

let $\{\epsilon_t\}$ be white noise, mean 0, variance $\sigma_p^2$

construct AR(p) process:
$$X_t = \mu + \sum_{j=1}^{p} \phi_{p,j}(X_{t-j} - \mu) + \epsilon_t \quad \text{with } \phi_{p,p} \ne 0$$

claim: $\{X_t\}$ is stationary under restrictions on $\phi_{p,j}$'s

proof: Priestley (1981)

rich class of processes (Chapter 8)

extension: ARMA(p,q) process (Box & Jenkins, 1970)

SAPA2e 34, 35, 37 II 22
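Aside: a minimal sketch of how realizations like those on the next slides can be generated, starting from zero initial conditions and discarding a burn-in so the output is approximately a draw from the stationary process. This is my own helper, not the method behind the SAPA2e figures.

```python
import numpy as np

def simulate_ar(phi, sigma2, N, burn=1000, rng=None):
    """One realization of X_t = sum_j phi[j-1] X_{t-j} + e_t (mu = 0), after burn-in."""
    rng = rng or np.random.default_rng()
    p = len(phi)
    phi = np.asarray(phi, dtype=float)
    e = rng.normal(0.0, np.sqrt(sigma2), N + burn)
    x = np.zeros(N + burn + p)                 # p leading zeros as initial conditions
    for t in range(p, len(x)):
        x[t] = phi @ x[t - p:t][::-1] + e[t - p]  # [X_{t-1}, ..., X_{t-p}] dot phi
    return x[-N:]

# the AR(2) example of slide II 23 and the AR(4) example of slide II 24
x2 = simulate_ar([0.75, -0.5], 1.0, 1024)
x4 = simulate_ar([2.7607, -3.8106, 2.6535, -0.9238], 0.002, 1024)
```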

Second Order Autoregressive Process
[Figure: 1st example of a realization of the AR(2) process $X_t = 0.75X_{t-1} - 0.5X_{t-2} + \epsilon_t$ with $\sigma_2^2 = 1$, plotted for t = 0 to 1000]
SAPA2e 36 II 23

Second Order Autoregressive Process
[Figure: 2nd example of a realization of the AR(2) process $X_t = 0.75X_{t-1} - 0.5X_{t-2} + \epsilon_t$ with $\sigma_2^2 = 1$, plotted for t = 0 to 1000]
SAPA2e 36 II 23

Second Order Autoregressive Process
[Figure: 3rd example of a realization of the AR(2) process $X_t = 0.75X_{t-1} - 0.5X_{t-2} + \epsilon_t$ with $\sigma_2^2 = 1$, plotted for t = 0 to 1000]
SAPA2e 36 II 23

Second Order Autoregressive Process
[Figure: 4th example of a realization of the AR(2) process $X_t = 0.75X_{t-1} - 0.5X_{t-2} + \epsilon_t$ with $\sigma_2^2 = 1$, plotted for t = 0 to 1000]
SAPA2e 36 II 23

Fourth Order Autoregressive Process
[Figure: 1st example of a realization of the AR(4) process $X_t = 2.7607X_{t-1} - 3.8106X_{t-2} + 2.6535X_{t-3} - 0.9238X_{t-4} + \epsilon_t$ (here $\sigma_4^2 = 0.002$), plotted for t = 0 to 1000]
SAPA2e 36 II 24

Fourth Order Autoregressive Process
[Figure: 2nd example of a realization of the AR(4) process $X_t = 2.7607X_{t-1} - 3.8106X_{t-2} + 2.6535X_{t-3} - 0.9238X_{t-4} + \epsilon_t$ (here $\sigma_4^2 = 0.002$), plotted for t = 0 to 1000]
SAPA2e 36 II 24

Fourth Order Autoregressive Process
[Figure: 3rd example of a realization of the AR(4) process $X_t = 2.7607X_{t-1} - 3.8106X_{t-2} + 2.6535X_{t-3} - 0.9238X_{t-4} + \epsilon_t$ (here $\sigma_4^2 = 0.002$), plotted for t = 0 to 1000]
SAPA2e 36 II 24

Fourth Order Autoregressive Process
[Figure: 4th example of a realization of the AR(4) process $X_t = 2.7607X_{t-1} - 3.8106X_{t-2} + 2.6535X_{t-3} - 0.9238X_{t-4} + \epsilon_t$ (here $\sigma_4^2 = 0.002$), plotted for t = 0 to 1000]
SAPA2e 36 II 24

Example: Harmonic Process: I

for fixed $L > 0$, let $A_1, \ldots, A_L, B_1, \ldots, B_L$ be a set of uncorrelated zero mean RVs such that $\operatorname{var}\{A_l\} = \operatorname{var}\{B_l\} = \sigma_l^2$

construct harmonic process:
$$X_t = \mu + \sum_{l=1}^{L} A_l \cos(2\pi f_l t) + B_l \sin(2\pi f_l t),$$
where $\mu$ and the $f_l$'s are real-valued constants

can reexpress harmonic process as
$$X_t = \mu + \sum_{l=1}^{L} D_l \cos(2\pi f_l t + \phi_l),$$
where $D_l^2 = A_l^2 + B_l^2$ and $\tan(\phi_l) = -B_l/A_l$

note: also have $A_l = D_l \cos(\phi_l)$ and $B_l = -D_l \sin(\phi_l)$

SAPA2e 37 II 25
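Aside: generating a realization of the harmonic process is a one-liner per component. A sketch of my own with Gaussian $A_l$ and $B_l$ (the specialization taken up on slide II 28):

```python
import numpy as np

def harmonic_process(sigmas, freqs, N, rng=None):
    """X_t = sum_l A_l cos(2 pi f_l t) + B_l sin(2 pi f_l t) with mu = 0."""
    rng = rng or np.random.default_rng()
    t = np.arange(N)
    x = np.zeros(N)
    for sigma_l, f_l in zip(sigmas, freqs):
        # uncorrelated zero-mean amplitudes with variance sigma_l^2 (Gaussian here)
        A_l, B_l = rng.normal(0.0, sigma_l, 2)
        x += A_l * np.cos(2 * np.pi * f_l * t) + B_l * np.sin(2 * np.pi * f_l * t)
    return x

# the L = 1, f_1 = 1/20, sigma_1 = 1 case shown on the next three slides
x = harmonic_process([1.0], [1 / 20], 100)
```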

First Order Gaussian Harmonic Process
[Figure: 1st example of a realization of the harmonic process $X_t = A \cos(2\pi t/20) + B \sin(2\pi t/20)$, where $A$ and $B$ are independent Gaussian RVs with unit variance; plotted for t = 0 to 100]
SAPA2e 38 II 26

First Order Gaussian Harmonic Process
[Figure: 2nd example of a realization of the harmonic process $X_t = A \cos(2\pi t/20) + B \sin(2\pi t/20)$, where $A$ and $B$ are independent Gaussian RVs with unit variance; plotted for t = 0 to 100]
SAPA2e 38 II 26

First Order Gaussian Harmonic Process
[Figure: 3rd example of a realization of the harmonic process $X_t = A \cos(2\pi t/20) + B \sin(2\pi t/20)$, where $A$ and $B$ are independent Gaussian RVs with unit variance; plotted for t = 0 to 100]
SAPA2e 38 II 26

Example: Harmonic Process: II

claim: harmonic process
$$X_t = \mu + \sum_{l=1}^{L} A_l \cos(2\pi f_l t) + B_l \sin(2\pi f_l t)$$
is stationary (?!)

proof is subject of Exercise [38], which shows ACVS to be
$$s_\tau = \sum_{l=1}^{L} \sigma_l^2 \cos(2\pi f_l \tau)$$
(same $f_l$'s as making up $X_t$, but cosine terms are all "in phase")

SAPA2e 37, 38 II 27

Example: Harmonic Process: III

specialize to case where RVs $A_l$ and $B_l$ are Gaussian

recall, if $Y_0, Y_1, \ldots, Y_{\nu-1}$ are IID zero mean Gaussian RVs with unit variance, $\chi_\nu^2 \equiv Y_0^2 + Y_1^2 + \cdots + Y_{\nu-1}^2$ is said to have a chi-square distribution with $\nu$ degrees of freedom

since $A_l/\sigma_l$ and $B_l/\sigma_l$ are independent zero mean, unit variance Gaussian RVs, $D_l^2/\sigma_l^2 = (A_l^2 + B_l^2)/\sigma_l^2$ has chi-square distribution with 2 degrees of freedom

probability density function (PDF) for $\chi_2^2$ RV given by
$$f_{\chi_2^2}(u) = \begin{cases} e^{-u/2}/2, & u \ge 0; \\ 0, & u < 0 \end{cases}$$

SAPA2e 38, 39 II 28

Example: Harmonic Process: IV

can deduce PDF for $D_l^2$ from fact that PDF for $D_l^2/\sigma_l^2$ is $\chi_2^2$:
$$f_{D_l^2}(u) = \begin{cases} e^{-u/(2\sigma_l^2)}/(2\sigma_l^2), & u \ge 0; \\ 0, & u < 0 \end{cases}$$

above is special case of exponential PDF $f(u) = \exp(-u/\lambda)/\lambda$ with mean value $\lambda = 2\sigma_l^2$

random amplitude $D_l$ is thus the square root of an exponential RV and is said to obey a Rayleigh distribution

symmetry of bivariate distribution for $A_l$ & $B_l$ dictates $\phi_l$ to be uniformly distributed over $(-\pi, \pi]$ and independent of $D_l$

thus Gaussian harmonic processes involve Rayleigh distributed $D_l$'s and uniformly distributed $\phi_l$'s, with all $2L$ RVs being independent of one another

SAPA2e 39 II 29
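Aside: these distributional claims are easy to confirm by simulation. A quick Monte Carlo sketch of my own (the tolerances are loose enough that the checks pass with high probability):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma_l = 1.5
A = rng.normal(0.0, sigma_l, 100_000)
B = rng.normal(0.0, sigma_l, 100_000)

D2 = A**2 + B**2           # should be exponential with mean 2 sigma_l^2
phi = np.arctan2(-B, A)    # tan(phi) = -B/A; should be uniform over (-pi, pi]

print(np.isclose(D2.mean(), 2 * sigma_l**2, rtol=0.02))      # True
print(np.isclose(D2.var(), (2 * sigma_l**2)**2, rtol=0.05))  # exponential: var = mean^2
print(np.isclose(phi.var(), np.pi**2 / 3, rtol=0.05))        # uniform on (-pi, pi]
```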

Example: Harmonic Process: V

now consider
$$X_t = \mu + \sum_{l=1}^{L} D_l \cos(2\pi f_l t + \phi_l),$$
where $D_l$'s are positive constants, while $\phi_l$'s are IID RVs uniformly distributed over $(-\pi, \pi]$

$X_t$ is necessarily bounded and hence cannot have a Gaussian distribution

SAPA2e 39 II 30

First-Order Non-Gaussian Process $X_t$
[Figure: 1st example of a realization of the process $X_t = \sqrt{2}\cos(2\pi t/20 + \phi)$, where $\phi$ is uniformly distributed over the interval $(-\pi, \pi]$; plotted for t = 0 to 100]
SAPA2e 38 II 31

First-Order Non-Gaussian Process $X_t$
[Figure: 2nd example of a realization of the process $X_t = \sqrt{2}\cos(2\pi t/20 + \phi)$, where $\phi$ is uniformly distributed over the interval $(-\pi, \pi]$; plotted for t = 0 to 100]
SAPA2e 38 II 31

First-Order Non-Gaussian Process $X_t$
[Figure: 3rd example of a realization of the process $X_t = \sqrt{2}\cos(2\pi t/20 + \phi)$, where $\phi$ is uniformly distributed over the interval $(-\pi, \pi]$; plotted for t = 0 to 100]
SAPA2e 38 II 31

Example: Harmonic Process: VI

claim:
$$X_t = \mu + \sum_{l=1}^{L} D_l \cos(2\pi f_l t + \phi_l)$$
is a stationary process

to prove when $L = 1$, i.e., $X_t = \mu + D\cos(2\pi f t + \phi)$, need to show $E\{X_t\}$ & $\operatorname{cov}\{X_{t+\tau}, X_t\}$ do not depend on $t$

now
$$E\{X_t\} = \mu + D\, E\{\cos(2\pi f t + \phi)\} = \mu + D \int_{-\pi}^{\pi} \cos(2\pi f t + \phi)\, \frac{1}{2\pi}\, d\phi = \mu \quad \text{(why?)}$$

SAPA2e 39 II 32

Example: Harmonic Process: VII

since $X_t - \mu = D\cos(2\pi f t + \phi)$, have
$$\operatorname{cov}\{X_{t+\tau}, X_t\} = E\{(D\cos(2\pi f[t+\tau] + \phi))(D\cos(2\pi f t + \phi))\}$$
$$= D^2 \int_{-\pi}^{\pi} \cos(2\pi f[t+\tau] + \phi)\cos(2\pi f t + \phi)\, \frac{1}{2\pi}\, d\phi$$
$$\stackrel{\text{trig}}{=} \frac{D^2}{4\pi} \int_{-\pi}^{\pi} \cos(2\pi f[2t+\tau] + 2\phi) + \cos(2\pi f \tau)\, d\phi = D^2 \cos(2\pi f \tau)/2 \equiv s_\tau$$

thus can dispense with random amplitudes!

basis for spectral representation theorem (Chapter 4)

Exercises [2.20] & [2.21]: non-uniformly distributed $\phi_l$ doesn't work, and neither do uncorrelated uniformly distributed $\phi_l$'s

SAPA2e 39, 40, 49, 50 II 33
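Aside: the covariance computed on this slide can be checked by Monte Carlo over the random phase; a sketch of my own, with arbitrary choices of $D$, $f$, $t$ and $\tau$:

```python
import numpy as np

rng = np.random.default_rng(1)
D, f = np.sqrt(2.0), 1 / 20
t, tau = 7, 3                               # any t gives the same answer
phi = rng.uniform(-np.pi, np.pi, 1_000_000)

cov_mc = np.mean(D * np.cos(2 * np.pi * f * (t + tau) + phi)
                 * D * np.cos(2 * np.pi * f * t + phi))  # E{X_t} = mu, taken as 0 here
cov_theory = D**2 * np.cos(2 * np.pi * f * tau) / 2
print(np.isclose(cov_mc, cov_theory, atol=0.01))         # True
```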

Stationary Processes as Models: I

stationarity: property of models, not data

spectral analysis assumes stationarity

need to examine assumption for each time series

example: spinning rotor series (Figure 41)
[Figure: spinning rotor series plotted versus bin number, 0 to 200]

SAPA2e 41, 42 II 34

Stationary Processes as Models: II

can detrend using linear regression model: $X_t = \alpha + \beta t + Y_t$

letting $\hat{\alpha}$ and $\hat{\beta}$ denote least squares estimates, detrended series gotten from $\hat{Y}_t = X_t - (\hat{\alpha} + \hat{\beta} t)$
[Figure: detrended spinning rotor series plotted versus bin number, 0 to 200]

SAPA2e 41, 42 II 35
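Aside: a two-line version of this regression detrending using `np.polyfit`; a sketch, not the code behind the SAPA2e figure.

```python
import numpy as np

def detrend_linear(x):
    """Fit X_t = alpha + beta t + Y_t by least squares; return Y_hat_t."""
    x = np.asarray(x, dtype=float)
    t = np.arange(len(x))
    beta_hat, alpha_hat = np.polyfit(t, x, 1)  # coefficients, highest degree first
    return x - (alpha_hat + beta_hat * t)
```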

Stationary Processes as Models: III

can also get detrended series by first differencing (filtering):
$$X_t^{(1)} \equiv X_t - X_{t-1} = \beta + Y_t^{(1)}, \quad \text{where } Y_t^{(1)} \equiv Y_t - Y_{t-1}$$
[Figure: first difference of spinning rotor series plotted versus bin number, 0 to 200]

SAPA2e 41, 42 II 36
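Aside: the differencing alternative in one line. Under the regression model it leaves $\beta + Y_t^{(1)}$, so the sample mean of the differenced series estimates the slope $\beta$.

```python
import numpy as np

def first_difference(x):
    """X_t^(1) = X_t - X_{t-1} for t = 1, ..., N-1."""
    return np.diff(np.asarray(x, dtype=float))
```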

Stationary Processes as Models: IV

second example: standard resistor series (Figure 43)
[Figure: standard resistor series plotted versus days (from 1/1/80), 0 to 2000]

SAPA2e 43 II 37