Statistics of stochastic processes

Introduction: Statistics of stochastic processes

Generally, statistics is performed on observations $y_1, \dots, y_n$ assumed to be realizations of independent random variables $Y_1, \dots, Y_n$. In statistics of stochastic processes (= time series analysis) we instead assume $y_1, \dots, y_n$ to be realizations of a stochastic process $\dots, Y_1, \dots, Y_n, \dots$ with some rules for dependence.

Introduction: Aims of time series analysis

Drawing inference from available data; but first we need to find an appropriate model. Once a model has been selected, we can:
- provide a compact and correct description of the data (trend, seasonal and random terms);
- adjust the data (filtering, missing values) [separating noise from signal];
- test hypotheses (is there an increasing trend? which factors have influence?) and understand causes;
- predict future values.

Remarks:
- The random terms are generally described by a stationary process.
- We use linear analysis (additive decomposition into trend, seasonal and stationary components).

Introduction: Stationary processes

Definition. A stochastic process $\{X_t\}_{t \in \mathbb{Z}}$ is (strictly) stationary if the joint distribution of $(X_{t_1}, X_{t_2}, \dots, X_{t_k})$ equals that of $(X_{t_1+h}, X_{t_2+h}, \dots, X_{t_k+h})$ for all $k \in \mathbb{N}$, $h \in \mathbb{Z}$ and $t_1, t_2, \dots, t_k \in \mathbb{Z}$.

In particular, if a stationary stochastic process has finite second moment, then $E(X_t)$ and $\mathrm{Cov}(X_t, X_{t+h})$ do not depend on $t$.

Introduction: Stationary processes, 2

Linear time series analysis looks only at second-order properties. Hence:

Definition. A stochastic process $\{X_t\}_{t \in \mathbb{Z}}$ is (weakly) stationary if it is in $L^2$ and
$$E(X_t) = \mu, \qquad \mathrm{Cov}(X_t, X_{t+h}) = \gamma(h).$$

If a Gaussian process is stationary, then it is strictly stationary. (A Gaussian process is one all of whose finite-dimensional distributions are multivariate normal.)

Introduction: Reminders on the multivariate normal

Definition. $Y = (Y_1, \dots, Y_n)$ is multivariate normal if, for all $a \in \mathbb{R}^n$, $a^t Y$ is univariate normal.

Equivalently, $Y$ is multivariate normal iff there exist $b \in \mathbb{R}^n$, an $n \times m$ matrix $A$, and $X = (X_1, \dots, X_m)$ independent standard normal r.v.'s such that $Y = AX + b$. It follows that $E(Y) = b$ and $\mathrm{Cov}(Y) = AA^t$, i.e. $Y \sim N(b, AA^t)$.

An alternative characterization is via the characteristic function.

If $\mathrm{Cov}(Y) = S$ is positive definite (i.e. invertible), $Y \sim N(\mu, S)$ has density
$$f_Y(y) = (2\pi)^{-n/2} |S|^{-1/2} \exp\{-(y - \mu)^t S^{-1} (y - \mu)/2\}$$
(a non-singular distribution).
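As a quick illustration of the characterization $Y = AX + b$, here is a minimal R sketch (the choices of $A$, $b$ and all names are mine, purely for illustration) that samples such a $Y$ and compares its sample covariance with $AA^t$:

```r
## Sample Y = A X + b from independent standard normals X and check Cov(Y) = A A^t.
set.seed(1)
n <- 3; m <- 2
A <- matrix(c(1, 0.5, -1, 2, 0, 1), nrow = n, ncol = m)   # arbitrary n x m matrix
b <- c(1, 2, 3)
N <- 1e5                                   # number of draws
X <- matrix(rnorm(N * m), nrow = m)        # each column is one draw of X
Y <- A %*% X + b                           # each column is one draw of Y
round(cov(t(Y)), 2)                        # sample covariance ...
round(A %*% t(A), 2)                       # ... should be close to A A^t
```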

Introduction: Gaussian processes

Definition. A process $\{X_t\}$ is Gaussian if for any $n > 0$ and any $(t_1, \dots, t_n)$ the vector $X = (X_{t_1}, \dots, X_{t_n})$ has a non-singular multivariate normal distribution.

Let $\mu = (\mu_{t_1}, \dots, \mu_{t_n}) = E(X)$ and $\mathrm{Cov}(X) = \Gamma = \{\gamma(t_i, t_j),\ i, j = 1, \dots, n\}$. Then $X$ has density function
$$g(x; \mu, \Gamma) = (2\pi)^{-n/2} |\Gamma|^{-1/2} \exp\left\{-\tfrac{1}{2} \langle \Gamma^{-1}(x - \mu),\, x - \mu \rangle\right\}.$$

$\{X_t\}$ is (weakly) stationary if $\mu_t \equiv \mu$ and $\gamma(t_i, t_j) = \gamma(|t_i - t_j|)$; it is then also strictly stationary, as the distribution depends only on $\mu$ and $\Gamma$.

Linear time series analysis is very well suited to Gaussian processes; less so to non-Gaussian ones.

Introduction: Hilbert spaces

Many time series problems can be solved using Hilbert space theory. Indeed the space $L^2(\Omega)$ is a Hilbert space with
$$\langle X, Y \rangle = E(XY), \qquad \|X - Y\|^2 = E(|X - Y|^2).$$
Restricting to the zero-mean subspace, $\langle X, Y \rangle = \mathrm{Cov}(X, Y)$.

Introduction: Detrending data

Often data do not appear to arise from a stationary process. Two approaches (sketched in R below):
- estimate the trend (by smoothing, or by fitting a polynomial, especially a straight line), then study the residuals, i.e. the differences from the trend;
- study the differenced series.
In all cases, transformations of the data may be useful. More systematic model fitting will come later.
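A minimal R sketch of both routes, using the Lake Huron series shown in the figures below (base R's built-in LakeHuron; the smoothing window width is an arbitrary choice of mine):

```r
## Detrending a series in two ways: residuals from a fitted trend, or differencing.
x <- LakeHuron                              # yearly level of Lake Huron, 1875-1972

## 1) Fit a straight-line trend in time and study the residuals.
tm <- time(x)
fit <- lm(x ~ tm)
resid_from_trend <- residuals(fit)

## 2) Alternatively, a 5-point moving-average smoother as the trend estimate.
trend5 <- stats::filter(x, rep(1/5, 5), sides = 2)
dev_from_smooth <- x - trend5

## 3) Or difference the series: first differences remove a linear trend.
dx <- diff(x)
```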

[Figure: Johnson & Johnson quarterly earnings per share, 1960-1980, with 3-point and 5-point smoothing.]

[Figure: Johnson & Johnson data, deviations from the moving average.]

[Figure: Johnson & Johnson data, deviations (in log scale) from the moving average.]

[Figure: Sunspot numbers, 1700-1980.]

[Figure: Sunspot numbers, 1700-1980, square-root transformation.]

[Figure: PanAm international air passengers (thousands), monthly, 1949-1960.]

[Figure: PanAm air passengers aggregated to yearly totals.]

[Figure: Seasonal component in air passengers, by month.]

[Figure: Level of Lake Huron (ft), 1875-1972.]

[Figure: Lake Huron level, deviations from the trend (ft).]

[Figure: Red wine sales in Australia (kilolitres), 1980-1991.]

[Figure: Deviations from the trend in sales of red wine (kilolitres).]

[Figure: Seasonal variation in wine sales (Australia), by month.]

[Figure: Global temperature anomalies from the 1961-1990 mean, 1856-2005, monthly and yearly averages.]

[Figure: Global temperatures 1971-2005, with the regression line.]

[Figure: Measles cases per biweek in England, 1944-1967.]

[Figure: EEG recording from a subject with epilepsy (time in arbitrary units).]

De-trend and de-seasonalize (period $T = 2q$)

Trend by centred ("yearly") moving average:
$$m_t = \frac{1}{T}\Big(\tfrac{1}{2} x_{t-q} + \sum_{j=-(q-1)}^{q-1} x_{t+j} + \tfrac{1}{2} x_{t+q}\Big).$$

Seasonal deviations:
$$w_k = \frac{1}{n}\sum_{j=0}^{n-1} (x_{jT+k} - m_{jT+k}), \qquad k = 1, \dots, T.$$

Seasonal component:
$$\hat s_k = w_k - \frac{1}{T}\sum_{i=1}^{T} w_i, \quad k = 1, \dots, T; \qquad \hat s_t = \hat s_{t - \lfloor (t-1)/T \rfloor T}, \quad t > T.$$

Deseasonalized data: $d_t = x_t - \hat s_t$.

$\hat m_t$: trend component re-estimated on the deseasonalized data.

$\hat Y_t = x_t - \hat m_t - \hat s_t$: random component.

Otherwise, difference the data: $\nabla_T X_t := X_t - X_{t-T}$. The $\nabla_T X_t$ are de-seasonalized; a trend can then be eliminated from them. A sketch of the moving-average route follows.
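A minimal R sketch of this decomposition for monthly data ($T = 12$, $q = 6$); variable names mirror the formulas, and the dataset choice (the air-passenger series from the figures, via base R's AirPassengers) is mine:

```r
## Classical decomposition: centred moving average, seasonal component, residual.
x <- log(AirPassengers)                 # log makes the seasonality roughly additive
per <- 12; q <- per / 2                 # period T = 2q

## m_t: centred moving average with half weights at the two ends
m <- stats::filter(x, c(0.5, rep(1, per - 1), 0.5) / per, sides = 2)

## w_k: average deviation from m_t in season k, across the years
dev <- matrix(x - m, nrow = per)        # row k = season k, column j = year j
w <- rowMeans(dev, na.rm = TRUE)

## s_k: centre the w_k so the seasonal component sums to zero over a period
s_hat <- w - mean(w)

## d_t: deseasonalized data; a trend can now be re-estimated on d
d <- x - rep(s_hat, length.out = length(x))
```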

Autocovariance and autocorrelation functions

If a process $\{X_t\}$ is stationary, $\gamma(h) := \mathrm{Cov}(X_t, X_{t+h})$ is its autocovariance function (ACVF).

Recall the correlation $\rho(X, Y) = \mathrm{Cov}(X, Y)/\sqrt{V(X)V(Y)}$. For a stationary process $V(X_t) = V(X_{t+h}) = \gamma(0)$, hence
$$\rho(h) = \rho(X_t, X_{t+h}) = \frac{\gamma(h)}{\gamma(0)}$$
is the autocorrelation function (ACF).

First properties of the ACVF:
- $\gamma(h) = \gamma(-h)$ [stationarity implies $\mathrm{Cov}(X_t, X_{t+h}) = \mathrm{Cov}(X_{t-h}, X_t)$];
- $|\gamma(h)| \le \gamma(0)$ [since $|\rho(X, Y)| \le 1$].
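On data, the ACVF and ACF are estimated by their sample versions; a minimal R sketch (the function name is mine, and the usual biased divisor $1/n$ is used, matching base R's acf):

```r
## Sample autocovariance at lag h, with the usual 1/n divisor.
sample_acvf <- function(x, h) {
  n <- length(x); xbar <- mean(x)
  sum((x[1:(n - h)] - xbar) * (x[(1 + h):n] - xbar)) / n
}

z <- rnorm(500)                                  # white noise, for illustration
rho <- sapply(0:10, function(h) sample_acvf(z, h) / sample_acvf(z, 0))
round(rho, 2)     # 1 at lag 0, near 0 elsewhere; compare acf(z, plot = FALSE)
```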

Simple stationary processes and their ACVF

IID$(0, \sigma^2)$: $\{X_t\}_{t \in \mathbb{Z}}$ independent and identically distributed r.v.'s with $E(X_t) = 0$, $V(X_t) = \sigma^2$. Then $\gamma(0) = \sigma^2$ and $\gamma(h) = 0$ for $h > 0$.

WN$(0, \sigma^2)$ [white noise]: $\{X_t\}_{t \in \mathbb{Z}}$ uncorrelated random variables with mean 0 and variance $\sigma^2$. The ACVF is the same: $\gamma(0) = \sigma^2$, $\gamma(h) = 0$ for $h > 0$.

A WN$(0, \sigma^2)$ process need not be independent. For instance, if the $\{Z_t\}_{t \in \mathbb{Z}}$ are IID $N(0, 1)$, then
$$X_t = \begin{cases} Z_t & t \text{ odd} \\ (Z_{t-1}^2 - 1)/\sqrt{2} & t \text{ even} \end{cases}$$
is WN$(0, 1)$ but not IID$(0, 1)$. It is not IID since (e.g.) $X_1$ and $X_2$ are obviously not independent; that $\{X_t\}$ is WN is left as an exercise. (See the simulation sketch below.)

Less contrived examples of $\{X_t\}$ WN but not IID will be seen later in the course.
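A minimal R simulation of this example (purely illustrative): the sample autocorrelations look like white noise, yet each even-indexed value is a deterministic function of its predecessor.

```r
## WN(0,1) but not IID: X_t = Z_t (t odd), X_t = (Z_{t-1}^2 - 1)/sqrt(2) (t even).
set.seed(2)
n <- 1e5
z <- rnorm(n)
zlag <- c(NA, z[-n])                                   # zlag[t] = z[t-1]
x <- ifelse(seq_len(n) %% 2 == 1, z, (zlag^2 - 1) / sqrt(2))
c(mean(x), var(x))                                     # approximately (0, 1)
acf(x, lag.max = 5, plot = FALSE)                      # autocorrelations near 0
## ... yet X_{t+1} = (X_t^2 - 1)/sqrt(2) exactly, for every odd t:
cor(x[seq(1, n - 1, 2)]^2, x[seq(2, n, 2)])            # equals 1
```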

Moving average processes and their ACVF

MA(1) [moving average]: $\{X_t\}_{t \in \mathbb{Z}}$ is MA(1) if
$$X_t = Z_t + \vartheta Z_{t-1}, \qquad t \in \mathbb{Z},$$
where $\vartheta \in \mathbb{R}$ and $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$. A simple computation gives
$$\gamma(0) = \sigma^2(1 + \vartheta^2), \qquad \gamma(1) = \vartheta \sigma^2, \qquad \gamma(h) = 0 \ \text{for } h > 1.$$

Similarly, $\{X_t\}_{t \in \mathbb{Z}}$ is MA(q) if $X_t = Z_t + \vartheta_1 Z_{t-1} + \dots + \vartheta_q Z_{t-q}$, $t \in \mathbb{Z}$, with $\vartheta_1, \dots, \vartheta_q \in \mathbb{R}$ and $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$. Another simple computation leads to $\gamma(h) = 0$ for $h > q$.
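A minimal R check of the MA(1) formulas by simulation (the parameter values are arbitrary choices of mine):

```r
## Simulate X_t = Z_t + theta Z_{t-1} and compare sample vs. theoretical ACVF.
set.seed(3)
theta <- 0.6; sigma2 <- 1; n <- 1e5
z <- rnorm(n + 1, sd = sqrt(sigma2))
x <- z[-1] + theta * z[-(n + 1)]          # X_t = Z_t + theta * Z_{t-1}
emp  <- drop(acf(x, lag.max = 3, type = "covariance", plot = FALSE)$acf)
theo <- c(sigma2 * (1 + theta^2), theta * sigma2, 0, 0)
round(rbind(empirical = emp, theoretical = theo), 3)
```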

Autoregressive processes

AR(1) [autoregressive]: $\{X_t\}_{t \in \mathbb{Z}}$ is AR(1) if it is stationary and
$$X_t = \phi X_{t-1} + Z_t, \qquad t \in \mathbb{Z}, \qquad (1)$$
where $\phi \in \mathbb{R}$ and $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$.

(1) is an infinite set of equations; it is not obvious that a stationary process satisfying them exists (this will be discussed later). Note that we are not saying $\{X_t\}_{t \in \mathbb{N}}$ is the Markov chain defined through $X_t = \phi X_{t-1} + Z_t$, $t > 0$, with $X_0$ some prescribed r.v.

Now assume a stationary process $\{X_t\}_{t \in \mathbb{Z}}$ exists satisfying (1) and $E(X_t Z_s) = 0$ for $t < s$ (this latter property seems natural, as $X_t$ should be defined in terms of $Z_t$ and the previous noise terms). Then
$$\gamma(0) = V(X_t) = E\big((\phi X_{t-1} + Z_t)^2\big) = \phi^2 V(X_{t-1}) + \sigma^2 + 2\phi E(X_{t-1} Z_t) = \phi^2 \gamma(0) + \sigma^2.$$
Hence $\gamma(0) = \dfrac{\sigma^2}{1 - \phi^2}$, which makes sense only if $\phi^2 < 1$.

Autoregressive processes, 2

Remarks: we have found $\phi^2 < 1$, i.e. $|\phi| < 1$, as a necessary condition for an AR(1) satisfying $E(X_t Z_s) = 0$ for $t < s$; it will also turn out to be sufficient. An implicit assumption in the computations was $E(X_t) = 0$, which can be proved analogously. One can then compute $\gamma(h)$ for $h > 0$ even more simply (left as an exercise); a simulation check follows.
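A minimal R simulation consistent with the computation above: start the recursion at $X_0 = 0$, discard a burn-in so the starting value is forgotten, and compare the sample variance with $\sigma^2/(1 - \phi^2)$ (the parameter values and burn-in length are mine):

```r
## Simulate X_t = phi X_{t-1} + Z_t with |phi| < 1 and check gamma(0).
set.seed(4)
phi <- 0.7; sigma2 <- 1
n <- 1e5; burn <- 1000
z <- rnorm(n + burn, sd = sqrt(sigma2))
x <- numeric(n + burn)                     # recursion started from X_0 = 0
for (t in 2:(n + burn)) x[t] <- phi * x[t - 1] + z[t]
x <- x[-(1:burn)]                          # drop burn-in to approximate stationarity
c(sample = var(x), theory = sigma2 / (1 - phi^2))
```

(Discarding a burn-in only approximates the stationary solution; whether such a solution exists at all is exactly the question deferred to later in the course.)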