Identifiability, Invertibility


Defn: If $\{\epsilon_t\}$ is a white noise series and $\mu$ and $b_0, \dots, b_p$ are constants, then

$$X_t = \mu + b_0 \epsilon_t + b_1 \epsilon_{t-1} + \cdots + b_p \epsilon_{t-p}$$

is a moving average of order $p$; write MA($p$).

Q: From observations on $X$ can we estimate the $b$'s and $\sigma^2 = \mathrm{Var}(\epsilon_t)$ accurately? NO.

Defn: A model for data $X$ is a family $\{P_\theta;\ \theta \in \Theta\}$ of possible distributions for $X$.

Defn: A model is identifiable if $\theta_1 \neq \theta_2$ implies $P_{\theta_1} \neq P_{\theta_2}$; different $\theta$'s give different distributions for the data.

Unidentifiable model: there are different values of $\theta$ which make exactly the same predictions about the data. So: the data do not distinguish between these $\theta$ values.
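
A minimal simulation sketch of an MA($p$) process (Python with numpy, assuming Gaussian white noise; the helper name `simulate_ma` is just for illustration):

```python
import numpy as np

def simulate_ma(n, mu, b, sigma, seed=None):
    """Simulate n values of X_t = mu + b[0]*eps_t + ... + b[p]*eps_{t-p}."""
    rng = np.random.default_rng(seed)
    p = len(b) - 1
    eps = rng.normal(0.0, sigma, size=n + p)       # iid N(0, sigma^2) white noise
    return mu + np.convolve(eps, b, mode="valid")  # length-n moving average

x = simulate_ma(1000, mu=0.0, b=[1.0, 0.6, -0.3], sigma=2.0, seed=42)
```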

Example: Suppose $\epsilon$ is an iid $N(0, \sigma^2)$ series and that $X_t = b_0 \epsilon_t + b_1 \epsilon_{t-1}$. Then the series $X$ has mean 0 and covariance

$$C_X(h) = \begin{cases} (b_0^2 + b_1^2)\sigma^2 & h = 0 \\ b_0 b_1 \sigma^2 & h = \pm 1 \\ 0 & \text{otherwise.} \end{cases}$$

Fact: a normal distribution is specified by its mean and its variance. Consequence: two mean-0 normal time series with the same covariance function have the same distribution.

Observe: if you multiply the $\epsilon$'s by $a$ and divide both $b_0$ and $b_1$ by $a$ then the covariance function of $X$ is unchanged. Thus: we cannot hope to estimate all three parameters $b_0$, $b_1$ and $\sigma$. Arbitrary choice: $b_0 = 1$.
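
A quick numerical check of this rescaling (a Python sketch; `ma1_acov` is an illustrative helper name):

```python
def ma1_acov(b0, b1, sigma):
    """C(0) and C(1) for X_t = b0*eps_t + b1*eps_{t-1}, Var(eps_t) = sigma^2."""
    return (b0**2 + b1**2) * sigma**2, b0 * b1 * sigma**2

a = 2.5
print(ma1_acov(1.0, 0.4, 3.0))            # original parameters
print(ma1_acov(1.0/a, 0.4/a, a * 3.0))    # rescaled -- identical covariances
```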

With $b_0 = 1$, are the parameters $b = b_1$ and $\sigma$ identifiable? We try to solve the equations

$$C(0) = (1 + b^2)\sigma^2 \quad \text{and} \quad C(1) = b\sigma^2$$

to see if the solution is unique. Divide the two equations to see

$$\frac{C(1)}{C(0)} = \frac{b}{1 + b^2}$$

or

$$b^2 - \frac{C(0)}{C(1)}\, b + 1 = 0,$$

which has the solutions

$$b = \frac{1}{2}\left(\frac{C(0)}{C(1)} \pm \sqrt{\left(\frac{C(0)}{C(1)}\right)^2 - 4}\,\right).$$
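
A sketch of this computation in Python/numpy (the helper name `ma1_roots` is illustrative, not from the notes):

```python
import numpy as np

def ma1_roots(c0, c1):
    """Solve b^2 - (C(0)/C(1)) b + 1 = 0 for the MA(1) coefficient (b0 = 1)."""
    r = c0 / c1
    disc = r * r - 4.0
    if disc < 0:
        raise ValueError("no real solution: need |C(1)/C(0)| <= 1/2 for an MA(1)")
    return (r - np.sqrt(disc)) / 2.0, (r + np.sqrt(disc)) / 2.0

c0, c1 = (1 + 0.4**2) * 9.0, 0.4 * 9.0   # generated from b = 0.4, sigma^2 = 9
print(ma1_roots(c0, c1))                 # (0.4, 2.5): two roots, product 1
```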

You should notice two things:

1. If $|C(0)/C(1)| < 2$ there are no real solutions. Since $C(0) = \sqrt{\mathrm{Var}(X_t)\,\mathrm{Var}(X_{t+1})}$ (both variances equal $C(0)$), we see $C(1)/C(0)$ is the correlation between $X_t$ and $X_{t+1}$. So: we have proved that for an MA(1) process this correlation cannot be more than 1/2 in absolute value.

2. If $|C(0)/C(1)| > 2$ there are two solutions.

Note: the two solutions multiply together to give the constant term in the quadratic equation, namely 1. If the two roots are distinct it follows that one of them is larger than 1 and the other smaller in absolute value. Let $b$ and $b^*$ denote the two roots. Let $\alpha = C(1)/b$ and $\alpha^* = C(1)/b^*$. Let $\epsilon_t$ be iid $N(0, \alpha)$ and $\epsilon^*_t$ be iid $N(0, \alpha^*)$. Then

$$X_t = \epsilon_t + b\,\epsilon_{t-1} \quad \text{and} \quad X^*_t = \epsilon^*_t + b^*\,\epsilon^*_{t-1}$$

have identical means and covariance functions. Observing $X_t$ you cannot distinguish the first of these models from the second. We will fit MA(1) models by requiring our estimated $b$ to have $|\hat{b}| \le 1$.
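
A sketch checking the indistinguishability numerically (the values match the roots 0.4 and 2.5 from the solver above):

```python
def ma1_acov_pair(b, sigma2):
    """C(0) and C(1) for X_t = eps_t + b*eps_{t-1}, Var(eps_t) = sigma2."""
    return (1 + b**2) * sigma2, b * sigma2

b, sigma2 = 0.4, 9.0
b_star, sigma2_star = 1.0 / b, b**2 * sigma2   # the other root, rescaled variance
print(ma1_acov_pair(b, sigma2))                # (10.44, 3.6)
print(ma1_acov_pair(b_star, sigma2_star))      # (10.44, 3.6) -- indistinguishable
# Convention: report the invertible parameterization, |b| <= 1.
```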

Reason: manipulate the model equation for $X$ as for an autoregressive process:

$$\epsilon_t = X_t - b\epsilon_{t-1} = X_t - b(X_{t-1} - b\epsilon_{t-2}) = \cdots = \sum_{j=0}^{\infty} (-b)^j X_{t-j}.$$

This manipulation makes sense if $|b| < 1$. If so then we can rearrange the equation to get

$$X_t = \epsilon_t - \sum_{j=1}^{\infty} (-b)^j X_{t-j},$$

which is an (infinite order) autoregressive process.
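
A numerical check of the inversion (a sketch; the noise before time 0 is taken to be zero, so the truncated sum recovers $\epsilon_t$ up to floating point):

```python
import numpy as np

rng = np.random.default_rng(0)
b, n = 0.6, 500                       # |b| < 1: expansion in past X's converges
eps = rng.normal(size=n)
x = eps.copy()
x[1:] += b * eps[:-1]                 # X_t = eps_t + b*eps_{t-1}

t = n - 1
j = np.arange(t + 1)
print(np.sum((-b)**j * x[t - j]), eps[t])   # sum_j (-b)^j X_{t-j} recovers eps_t
```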

If, on the other hand, $|b| > 1$ then we can write

$$X_t = \frac{1}{b}(b\epsilon_t) + b\epsilon_{t-1}.$$

Let $\epsilon^*_t = b\epsilon_t$; $\epsilon^*$ is also white noise. We find

$$\epsilon^*_{t-1} = X_t - \frac{1}{b}\epsilon^*_t = X_t - \frac{1}{b}\left(X_{t+1} - \frac{1}{b}\epsilon^*_{t+1}\right) = \cdots = \sum_{j=0}^{\infty} (-1/b)^j X_{t+j},$$

which means

$$X_t = \epsilon^*_{t-1} - \sum_{j=1}^{\infty} (-1/b)^j X_{t+j}.$$

This represents the current value as depending on the future, which seems physically far less natural than the other choice.
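
The forward-looking expansion can be checked the same way (a sketch; truncating at the end of the sample leaves an error of order $|1/b|^n$, negligible here):

```python
import numpy as np

rng = np.random.default_rng(1)
b, n = 2.5, 400                       # |b| > 1: expansion in past X's would diverge
eps = rng.normal(size=n)
x = eps.copy()
x[1:] += b * eps[:-1]                 # X_t = eps_t + b*eps_{t-1}

t = 1
j = np.arange(n - t)                  # forward-looking sum, truncated at sample end
eps_star = np.sum((-1.0 / b)**j * x[t + j])
print(eps_star, b * eps[t - 1])       # recovers eps*_{t-1} = b*eps_{t-1} from future X's
```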

Defn: An MA($p$) process is invertible if it can be written in the form

$$X_t = \sum_{j=1}^{\infty} a_j X_{t-j} + \epsilon_t.$$

Defn: A process $X$ is an autoregression of order $p$ (written AR($p$)) if

$$X_t = \sum_{j=1}^{p} a_j X_{t-j} + \epsilon_t$$

(so an invertible MA is an infinite order autoregression).

Defn: The backshift operator transforms a time series into another time series by shifting it back one time unit; if $X$ is a time series then $BX$ is the time series with $(BX)_t = X_{t-1}$. The identity operator $I$ satisfies $IX = X$. We use $B^j$ for $j = 1, 2, \dots$ to denote $B$ composed with itself $j$ times, so that $(B^j X)_t = X_{t-j}$. For $j = 0$ this gives $B^0 = I$.
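
A sketch of the backshift operator on a finite sample (values before the start of the sample are unavailable, so they are marked NaN; `backshift` is an illustrative helper):

```python
import numpy as np

def backshift(x, j=1):
    """Apply B^j: (B^j X)_t = X_{t-j}; entries before the sample start become NaN."""
    out = np.full_like(x, np.nan, dtype=float)
    out[j:] = x[:-j] if j > 0 else x
    return out

x = np.array([1.0, 2.0, 3.0, 4.0])
print(backshift(x))        # [nan 1. 2. 3.]; j = 0 gives the identity operator I
```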

Now use $B$ to develop a formal method for studying the existence of a given AR($p$) and the invertibility of a given MA($p$). An AR(1) process satisfies

$$(I - a_1 B)X = \epsilon.$$

Think of $I - a_1 B$ as an infinite dimensional matrix; we get the formal identity

$$X = (I - a_1 B)^{-1}\epsilon.$$

So how will we define this inverse of an infinite matrix? We use the idea of a geometric series expansion. If $b$ is a real number with $|ab| < 1$ then

$$(1 - ab)^{-1} = \frac{1}{1 - ab} = \sum_{j=0}^{\infty} (ab)^j,$$

so we hope that $(I - a_1 B)^{-1}$ can be defined by

$$(I - a_1 B)^{-1} = \sum_{j=0}^{\infty} a_1^j B^j.$$

This would mean

$$X = \sum_{j=0}^{\infty} a_1^j B^j \epsilon,$$

or, looking at the formula for a particular $t$ and remembering the meaning of $B^j$, we get

$$X_t = \sum_{j=0}^{\infty} a_1^j \epsilon_{t-j}.$$

This is the formula I had in lecture 2. Now consider a general AR($p$) process:

$$\Big(I - \sum_{j=1}^{p} a_j B^j\Big)X = \epsilon.$$

We will factor the operator applied to $X$. Let

$$\phi(x) = 1 - \sum_{j=1}^{p} a_j x^j.$$

Then $\phi$ is a degree $p$ polynomial, so it has (by a theorem of C. F. Gauss) $p$ roots $1/b_1, \dots, 1/b_p$. (None of the roots is 0 because the constant term in $\phi$ is 1.) This means we can factor $\phi$ as

$$\phi(x) = \prod_{j=1}^{p} (1 - b_j x).$$
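
A numerical check of the geometric expansion for an AR(1) (a sketch; the infinite sum is truncated where $a^j$ is negligible):

```python
import numpy as np

rng = np.random.default_rng(2)
a, n = 0.7, 2000                      # |a| < 1, so the geometric expansion converges
eps = rng.normal(size=n)

x = np.zeros(n)                       # build X from the recursion (I - aB)X = eps
x[0] = eps[0]
for t in range(1, n):
    x[t] = a * x[t - 1] + eps[t]

t = n - 1                             # compare with X_t = sum_j a^j eps_{t-j}
j = np.arange(200)                    # a^200 is negligible
print(x[t], np.sum(a**j * eps[t - j]))
```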

Now back to the definition of $X$:

$$\prod_{j=1}^{p} (I - b_j B)\, X = \epsilon$$

can be solved by inverting each term in the product (in any order; the terms in the product commute) to get

$$X = \prod_{j=1}^{p} (I - b_j B)^{-1}\epsilon.$$

The inverse of $I - b_j B$ will exist if the sum

$$\sum_{k=0}^{\infty} b_j^k B^k$$

converges; this requires $|b_j| < 1$. Thus a stationary AR($p$) solution of the equations exists if every root of the characteristic polynomial $\phi$ is larger than 1 in absolute value (actually the roots can be complex, and I mean larger than 1 in modulus).
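
A root-checking sketch for a general AR($p$) (numpy's `roots` expects coefficients with the highest power first, hence the reversal; `ar_is_stationary` is an illustrative helper):

```python
import numpy as np

def ar_is_stationary(a):
    """X_t = a[0] X_{t-1} + ... + a[p-1] X_{t-p} + eps_t is (asymptotically)
    stationary iff all roots of phi(x) = 1 - a_1 x - ... - a_p x^p have modulus > 1."""
    phi = np.concatenate(([1.0], -np.asarray(a, dtype=float)))  # low order first
    return bool(np.all(np.abs(np.roots(phi[::-1])) > 1.0))

print(ar_is_stationary([0.5, 0.3]))   # True  (roots ~ 1.17 and -2.84)
print(ar_is_stationary([1.2]))        # False (root 1/1.2 is inside the unit circle)
```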

Summary

An MA($q$) process $X_t = \epsilon_t + \sum_{j=1}^{q} b_j \epsilon_{t-j}$ is invertible iff all roots of the characteristic polynomial $\psi(x) = 1 + \sum_{j=1}^{q} b_j x^j$ lie outside the unit circle in the complex plane.

For a given covariance function of an MA($q$) process there is only one set of coefficients $b_1, \dots, b_q$ for which the process is invertible.

An AR($p$) process $X_t - \sum_{j=1}^{p} a_j X_{t-j} = \epsilon_t$ is asymptotically stationary iff all roots of the characteristic polynomial $\phi(x) = 1 - \sum_{j=1}^{p} a_j x^j$ lie outside the unit circle in the complex plane.

(Asymptotically stationary means: make $X_1, X_2, \dots, X_p$ anything; use the equation defining the AR($p$) to define the rest of the $X$ values; then as $t \to \infty$ the process gets closer to being stationary. Asymptotic stationarity is equivalent to the existence of an exactly stationary solution of the equations.)
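
The invertibility check for an MA($q$) is the same root computation applied to $\psi$ (a sketch; note that $b = 2.5$ is exactly the non-invertible twin of $b = 0.4$ from the MA(1) example):

```python
import numpy as np

def ma_is_invertible(b):
    """X_t = eps_t + b[0] eps_{t-1} + ... + b[q-1] eps_{t-q} is invertible iff all
    roots of psi(x) = 1 + b_1 x + ... + b_q x^q lie outside the unit circle."""
    psi = np.concatenate(([1.0], np.asarray(b, dtype=float)))
    return bool(np.all(np.abs(np.roots(psi[::-1])) > 1.0))

print(ma_is_invertible([0.4]))   # True:  root -2.5, outside the unit circle
print(ma_is_invertible([2.5]))   # False: root -0.4, inside the unit circle
```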

Defn: A process $X$ is an ARMA($p, q$) (mixed autoregressive of order $p$ and moving average of order $q$) if it satisfies

$$\phi(B)X = \psi(B)\epsilon,$$

where $\epsilon$ is white noise and

$$\phi(B) = I - \sum_{j=1}^{p} a_j B^j \quad \text{and} \quad \psi(B) = I + \sum_{j=1}^{q} b_j B^j.$$

The ideas we used above can be stretched to show that the process $X$ is identifiable and invertible (can be written as an infinite order autoregression on the past) if the roots of $\psi(x)$ lie outside the unit circle. A stationary solution, which can be written as an infinite order causal (no future $\epsilon$'s in the average) moving average, exists if all the roots of $\phi(x)$ lie outside the unit circle.
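
If statsmodels is available, its ArmaProcess class performs these root checks directly; a small sketch (it takes the coefficients of $\phi$ and $\psi$, low order first, including the leading 1):

```python
from statsmodels.tsa.arima_process import ArmaProcess

# phi(x) = 1 - 0.5x and psi(x) = 1 + 0.4x,
# i.e. X_t - 0.5 X_{t-1} = eps_t + 0.4 eps_{t-1}
proc = ArmaProcess(ar=[1, -0.5], ma=[1, 0.4])
print(proc.isstationary, proc.isinvertible)   # True True
print(proc.arroots, proc.maroots)             # [2.] [-2.5]: outside the unit circle
```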