Diebold, Chapter 7. Francis X. Diebold, Elements of Forecasting, 4th Edition (Mason, Ohio: Cengage Learning, 2006). Chapter 7. Characterizing Cycles


After completing this reading you should be able to:
- Define covariance stationary, autocovariance function, autocorrelation function, partial autocorrelation function, and autoregression.
- Describe the requirements for a series to be covariance stationary.
- Explain the implications of working with models that are not covariance stationary.
- Define white noise; describe independent white noise and normal (Gaussian) white noise.
- Explain the characteristics of the dynamic structure of white noise.
- Explain how a lag operator works.
- Describe Wold's theorem.
- Define a general linear process.
- Relate rational distributed lags to Wold's theorem.
- Calculate the sample mean and sample autocorrelation, and describe the Box-Pierce Q-statistic and the Ljung-Box Q-statistic.
- Describe sample partial autocorrelation.

Learning objective: Define covariance stationary, autocovariance function, autocorrelation function, partial autocorrelation function, and autoregression.

Learning objective: Describe the requirements for a series to be covariance stationary.

Learning objective: Explain the implications of working with models that are not covariance stationary.

Covariance Stationary

When an independent variable in a regression equation is a lagged value of the dependent variable (as is the case in autoregressive time series models), statistical inferences based on OLS regression are not always valid. In order to conduct statistical inference based on these models, we must assume that the time series is covariance stationary, or weakly stationary. There are three basic requirements for a time series to be covariance stationary:
1. The expected value (mean) of the time series must be constant and finite in all periods.
2. The variance of the time series must be constant and finite in all periods.
3. The covariance of the time series with itself, for a fixed number of periods in the past or future, must be constant and finite in all periods.
It is a good time to note what type of time series is not mean stationary: if the time series has a trend line, such as GDP growth or employment, by definition that time series is not mean stationary. Same thing with variance: if the time series has a trending or changing variance, it is not variance stationary.
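The mean requirement above can be checked numerically. A minimal sketch, comparing half-sample means of a white-noise series against a trending one; the series, seed, and drift coefficient are illustrative choices, not from the text:

```python
import numpy as np

rng = np.random.default_rng(42)
T = 1000

# A covariance stationary series: Gaussian white noise (constant mean and variance).
stationary = rng.normal(loc=0.0, scale=1.0, size=T)

# A non-stationary series: the same noise plus a deterministic trend,
# so the mean drifts upward over time, violating requirement 1.
trending = stationary + 0.01 * np.arange(T)

def half_sample_means(y):
    """Compare the mean of the first and second halves of the sample."""
    h = len(y) // 2
    return y[:h].mean(), y[h:].mean()

m1, m2 = half_sample_means(stationary)   # roughly equal
t1, t2 = half_sample_means(trending)     # second half clearly larger
```

This is only an informal diagnostic, not a formal stationarity test; it just makes the "constant mean in all periods" requirement concrete.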

Quantitative Analysis (QA)

For notation in this reading we are going to use E(y_t) = μ to say the expectation of our sample path y_t is the mean of y. In other words, this is requirement number 1.

Autocovariance Function

To determine whether our series has a stable covariance structure over different periods of time, we use an autocovariance function. Series analysis is usually limited to time series in finance, but it can apply to any type of data. We refer to a displacement as the next step in a series, be it three-dimensional data or linear time series data. We need to include both displacement and time in our autocovariance function:

γ(t, τ) = cov(y_t, y_{t−τ}) = E[(y_t − μ)(y_{t−τ} − μ)]

This just says our function has two variables, time t and displacement τ, and the autocovariance is the expected product of the deviations of y_t and y_{t−τ} from the mean μ (which, from requirement 1, we assume to be constant). In order for the series to be covariance stationary, the autocovariance cannot depend on time, which reduces the function to depend only on displacement:

γ(t, τ) = γ(τ)

Autocorrelation Function

Now, just like covariance in other areas of analysis, autocovariance doesn't tell us much on its own (there are no units, it can vary widely, etc.), so we use correlation to normalize covariance. Recall that we calculate correlation by dividing covariance by the product of standard deviations; in just the same way, we create the autocorrelation function by dividing the autocovariance function by the variance:

ρ(τ) = γ(τ) / γ(0)

When the autocorrelations are plotted by time step, we can graphically see how the dependence pattern changes by lag step.

Learning objective: Define white noise, describe independent white noise and normal (Gaussian) white noise.

White noise is a special characterization of a type of error term within a time series. Recall that we use y_t to denote an observed time series, and we further want to say the error terms have mean 0 and some known, constant variance. Formulaically:

y_t = ε_t,  ε_t ~ (0, σ²)

Furthermore, if the error terms are uncorrelated over time, that process is called white noise. If we can further show that the series y_t is independent and identically distributed, the white noise becomes independent white noise. Lastly, if we can show it is i.i.d. as well as normally distributed, we say the white noise is Gaussian (normally distributed) white noise.

Learning objective: Explain the characteristics of the dynamic structure of white noise.

It is important to understand that white noise is uncorrelated over time, has zero autocorrelation at all displacements, and is basically a random stochastic process. The take-away from this reading is that forecast errors should be white noise. This is counterintuitive at first, but if the errors aren't white noise then the errors are serially correlated, which means the errors are forecastable, which in turn means the forecast itself is flawed or unreliable.

Learning objective: Explain how a lag operator works.

Since we are manipulating time series data to explain how the past evolves into the future, we have to manipulate the forecast model, and a lag operator turns the current observation into the previous one, like this:

L y_t = y_{t−1}

A lag of two steps would have L raised to the second power, and so forth. For now, just know the notation; we will get into the reason, use, and meaning later.

L² y_t = L(L y_t) = L y_{t−1} = y_{t−2}

So the first lag of our autocovariance equation would look like this:

γ(1) = cov(y_t, y_{t−1}) = E[(y_t − μ)(y_{t−1} − μ)]

You can visualize a lag by taking any graph and shifting the entire chart by one or more time steps, however large the lag may be. Now, we can also apply a polynomial in the lag operator, where we set the observation at t plus some fraction of prior observations, like this:

(1 + 0.4L + 0.6L²) y_t = y_t + 0.4 y_{t−1} + 0.6 y_{t−2}

Learning objective: Describe Wold's theorem.
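Before moving on, the autocovariance, autocorrelation, lag-operator, and white-noise ideas above can be tried out numerically. A minimal sketch; the AR(1) process, seed, and the 0.4/0.6 polynomial coefficients are illustrative values, not from Diebold:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 20_000

def lag(y, k):
    """The lag operator L^k: returns the series y_{t-k}, NaN-padded at the start."""
    out = np.full(len(y), np.nan)
    out[k:] = y[:len(y) - k]
    return out

def autocovariance(y, tau):
    """gamma(tau) = E[(y_t - mu)(y_{t-tau} - mu)], with mu replaced by the sample mean."""
    mu = y.mean()
    return np.mean((y[tau:] - mu) * (y[:len(y) - tau] - mu))

def autocorrelation(y, tau):
    """rho(tau) = gamma(tau) / gamma(0): autocovariance normalized by the variance."""
    return autocovariance(y, tau) / autocovariance(y, 0)

# Gaussian white noise: autocorrelation near zero at every displacement.
e = rng.normal(size=T)
rho_wn = autocorrelation(e, 1)

# An AR(1) series y_t = 0.5 y_{t-1} + e_t has theoretical rho(tau) = 0.5**tau,
# so the sample autocorrelations should come out near 0.5 and 0.25.
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.5 * y[t - 1] + e[t]
rho1, rho2 = autocorrelation(y, 1), autocorrelation(y, 2)

# Applying the example lag polynomial (1 + 0.4L + 0.6L^2) to y_t:
z = y + 0.4 * lag(y, 1) + 0.6 * lag(y, 2)
```

The contrast between the near-zero autocorrelations of the white noise and the decaying autocorrelations of the AR(1) is exactly the "dynamic structure" the learning objective refers to.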

Time series analysis is all about picking the right model to fit a series of data. When we have a set of data that is covariance stationary, there are a lot of model choices that could fit the data with different degrees of effectiveness. Fit alone doesn't tell us anything about whether or not it is the right model; think of this idea as analogous to "correlation doesn't equal causation."

Wold's theorem breaks the time series down into two pieces, one deterministic and one stochastic (a.k.a. random, or white noise); Wold's theorem is the model of the covariance stationary residual. Time series models are constructed as linear functions of fundamental forecasting errors, also called innovations or shocks. These basic building blocks of models are the same: mean 0 with some known variance, serially uncorrelated, a.k.a. white noise. In this sense, all errors are white noise and unforecastable. Now, in this reading the error terms are often referred to as innovations, which gets very confusing. The reason this came about is that if we have an actual error, not white noise, in a time series, then it actually introduces new information to the system that can be modeled, can be predicted, and is in some way correlated to the rest of the time series.

Furthermore, a distributed lag is a weighted sum of previous values that factor in some way into our estimation of the current value of the time series, which is exactly what the equation from the last learning objective is: a distributed lag, meaning the current value's weight is distributed over several previous values of the time series. Recall the exponentially weighted model, where we set lambda at some value to drive the decay of the informational value of the historical data. If we want to grab all historical values, the lag is infinitely distributed. The formula looks like this:

y_t = B(L) ε_t = Σ_{i=0}^{∞} b_i ε_{t−i},  ε_t ~ WN(0, σ²)

Learning objective: Define a general linear process.

This reading is quite complicated, and it is easy to get bogged down in formulas when all they are asking for is a definition.
In this case, a general linear process describes Wold's theorem, because the previous formula is a series of linear functions and its innovations. This is probably a low probability for the exam.

Learning objective: Relate rational distributed lags to Wold's theorem.
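The general linear process defined above can be simulated directly. A sketch with geometrically decaying weights b_i = 0.8^i (the weights, seed, and truncation point are invented for illustration; with geometric decay the series has the same autocorrelation function as an AR(1) with coefficient 0.8):

```python
import numpy as np

rng = np.random.default_rng(7)
T = 10_000

# General linear process: y_t = sum_{i>=0} b_i * e_{t-i}, e_t ~ WN(0, 1).
# Weights decay geometrically, so truncating the infinite sum at 60 terms
# loses almost nothing; note the whole lag structure depends on just one
# decay parameter, anticipating the rational-distributed-lag idea.
n_terms = 60
b = 0.8 ** np.arange(n_terms)
e = rng.normal(size=T + n_terms - 1)

# A "valid" convolution of the innovations with the weights yields exactly
# y[t] = sum_i b[i] * e[t - i] for the fully-initialized sample of length T.
y = np.convolve(e, b, mode="valid")

sample_var = y.var()                      # theory: sum(b_i^2) = 1/(1 - 0.64) ~ 2.78
rho1 = np.corrcoef(y[1:], y[:-1])[0, 1]   # theory: 0.8
```

Comparing the sample variance and lag-1 autocorrelation against their theoretical values confirms the moving-average representation behaves as Wold's theorem says it should.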

There are two types of lags in time series analysis: infinite and finite. In an infinite series, we assume all data in the past has an impact on the dependent variable we are modeling, so it is infinitely distributed. Same for a finite distribution; we just have to define how many lagged periods impact the time series. In these types of lagged models, we assume there is some weight applied to each lag, but there can also be polynomials in the lag factor. The problem arises because models with an infinite number of parameters cannot be estimated from a finite sample of data. However, an infinite polynomial in the lag operator won't necessarily have infinite free parameters; we can have an infinite series of polynomials that depend on only, say, two parameters. A rational distributed lag expresses the infinite lag polynomial as a ratio of two finite-order lag polynomials, so that we can approximate an infinitely distributed lag with a finite number of parameters, which is how we recover Wold's theorem from rational distributed lags.

Learning objective: Calculate the sample mean and sample autocorrelation, and describe the Box-Pierce Q-statistic and the Ljung-Box Q-statistic.

Recall that when we are dealing with any type of sample, we are dealing with estimators, not parameters, and we extend the mean and autocorrelation to accommodate the fact that there is some degree of error in the estimator. This is called replacing expectations with sample averages. If we have a sample of size T, the sample mean is calculated as:

ȳ = (1/T) Σ_{t=1}^{T} y_t

This is not interesting in itself, but we can use it to calculate the sample autocorrelation function:

ρ̂(τ) = [ Σ_{t=τ+1}^{T} (y_t − ȳ)(y_{t−τ} − ȳ) ] / [ Σ_{t=1}^{T} (y_t − ȳ)² ]

Learning objective: Describe sample partial autocorrelation.

This is a low priority for the exam and is only a "describe" learning statement. Instead of obtaining partial autocorrelations in a thought experiment of infinite regressions on an infinite data set, we now perform the same thought experiment on a more manageable, finite data set, and that is why it is called a sample partial autocorrelation.
This is purely theoretical and a near-zero chance on exam day.
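The sample mean and sample autocorrelation calculations above, together with the Box-Pierce and Ljung-Box Q-statistics named in the learning objective, can be sketched as follows. The formulas Q_BP = T Σ ρ̂(τ)² and Q_LB = T(T+2) Σ ρ̂(τ)²/(T−τ), each summed over τ = 1..m and compared to a chi-square(m) distribution, are the standard ones; the data and seed are invented:

```python
import numpy as np

def sample_acf(y, max_lag):
    """rho_hat(tau) for tau = 1..max_lag, using the sample mean and the
    full-sample denominator sum_t (y_t - ybar)^2."""
    d = y - y.mean()
    denom = np.sum(d * d)
    return np.array([np.sum(d[tau:] * d[:len(y) - tau]) / denom
                     for tau in range(1, max_lag + 1)])

def box_pierce(y, m):
    """Q_BP = T * sum_{tau=1}^{m} rho_hat(tau)^2, roughly chi-square(m) under
    the white-noise null."""
    return len(y) * np.sum(sample_acf(y, m) ** 2)

def ljung_box(y, m):
    """Q_LB = T(T+2) * sum_{tau=1}^{m} rho_hat(tau)^2 / (T - tau); the
    finite-sample correction brings it closer to its chi-square(m) null."""
    T = len(y)
    rho = sample_acf(y, m)
    taus = np.arange(1, m + 1)
    return T * (T + 2) * np.sum(rho ** 2 / (T - taus))

rng = np.random.default_rng(3)
wn = rng.normal(size=500)
q = ljung_box(wn, 10)   # under the white-noise null, roughly chi-square(10)
```

Since (T+2)/(T−τ) > 1 for every τ, the Ljung-Box statistic always exceeds the Box-Pierce statistic on the same data; for white noise, both should be unremarkable relative to the chi-square(m) reference.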
