Midterm Suggested Solutions

CUHK Dept. of Economics, Spring 2011
ECON 4120, Sung Y. Park

Midterm Suggested Solutions

Q1

(a) In a time series, the autocorrelation measures the correlation between y_t and its lag y_{t-τ}. It is defined as

    ρ(τ) = Cov(y_t, y_{t-τ}) / [ sqrt(Var(y_t)) · sqrt(Var(y_{t-τ})) ] = γ(τ) / γ(0).

The partial autocorrelation measures the association between y_t and y_{t-τ} after controlling for the effects of y_{t-1}, y_{t-2}, ..., y_{t-τ+1}. Thus the autocorrelation captures both the direct and the indirect relationship between y_t and y_{t-τ}, whereas the partial autocorrelation measures only the direct relation between them. That explains why the autocorrelation at a certain displacement can be positive while the partial autocorrelation at the same displacement is negative.

(b) Covariance stationarity requires (1) a time-invariant mean, (2) a time-invariant variance, and (3) an autocovariance that depends only on the displacement τ, not on time t. Given the candidate autocovariance functions, we can check whether (2) and (3) are satisfied.

(i) From γ(t, τ) = α we see that γ(t, τ) is a constant, independent of t, and that γ(0) = α, a constant. Hence this autocovariance function is consistent with covariance stationarity.

(ii) From γ(t, τ) = e^{ατ} we see that γ(t, τ) depends only on the displacement τ, and that γ(0) = e^0 = 1, a constant. Hence this autocovariance function is consistent with covariance stationarity.

(iii) From γ(t, τ) = τα we see that γ(t, τ) depends only on the displacement τ, and that γ(0) = 0, a constant. Hence this autocovariance function is consistent with covariance stationarity.
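
As a quick illustration of Q1(a), the sample ACF and PACF can be computed directly; a minimal Python sketch, assuming statsmodels is available (the simulated AR(1) data are purely illustrative):

```python
import numpy as np
from statsmodels.tsa.stattools import acf, pacf

# Hypothetical data: an AR(1) series, used only to illustrate the calculation.
rng = np.random.default_rng(0)
eps = rng.normal(size=500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.7 * y[t - 1] + eps[t]

# Sample autocorrelations rho(tau) = gamma(tau)/gamma(0) and partial autocorrelations.
rho = acf(y, nlags=10)
phi_kk = pacf(y, nlags=10)
print(np.round(rho, 3))     # decays gradually for an AR(1)
print(np.round(phi_kk, 3))  # cuts off after lag 1 for an AR(1)
```

For an AR(1) the two patterns differ exactly as described above: the ACF mixes direct and indirect effects of past values, while the PACF isolates the direct effect at each lag.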

Q2

The Wold theorem is important because any covariance stationary series can be written as a linear combination of its innovations. Since these innovations ε_t ~ WN(0, σ²), we can easily model and forecast covariance stationary series and examine their unconditional and conditional moment structures using the properties of a white noise process. Although in the context of the Wold theorem {y_t} is a zero-mean series, this involves no loss of generality, because a zero-mean series can always be constructed as deviations from the mean, i.e. {y_t − μ}.

The Wold representation has infinitely many lags, which is not practical. However, an infinite-order lag polynomial can be approximated by a ratio of two finite-order polynomials (a rational polynomial). Hence

    y_t = B(L) ε_t ≈ [Θ(L) / Φ(L)] ε_t,

where Θ(L) = Σ_{i=0}^{q} θ_i L^i and Φ(L) = Σ_{i=0}^{p} φ_i L^i are finite-order lag polynomials.

For an AR(p),

    y_t = φ_1 y_{t-1} + ... + φ_p y_{t-p} + ε_t = φ_1 L y_t + φ_2 L² y_t + ... + φ_p L^p y_t + ε_t,

or

    Φ(L) y_t = ε_t,  where Φ(L) = 1 − φ_1 L − ... − φ_p L^p.

If {y_t} is covariance stationary, which requires that all roots of Φ(L) lie outside the unit circle, we can write

    y_t = [1 / Φ(L)] ε_t,

which is a rational polynomial whose numerator polynomial is of degree 0. Therefore AR(p) is a special case of the Wold representation, under the condition that all roots of the lag-operator polynomial lie outside the unit circle.

For an MA(q),

    y_t = ε_t + θ_1 ε_{t-1} + ... + θ_q ε_{t-q} = Θ(L) ε_t,  ε_t ~ WN(0, σ²),

which is a rational polynomial whose denominator polynomial is of degree 0. Therefore MA(q) is a special case of the Wold representation. No condition is needed here, as an MA(q) is always covariance stationary.

For an ARMA(p, q),

    y_t = φ_1 y_{t-1} + ... + φ_p y_{t-p} + ε_t + θ_1 ε_{t-1} + ... + θ_q ε_{t-q},  ε_t ~ WN(0, σ²),

or

    Φ(L) y_t = Θ(L) ε_t.

If {y_t} is covariance stationary, which requires that all roots of Φ(L) lie outside the unit circle, we can write

    y_t = [Θ(L) / Φ(L)] ε_t,

which is exactly the rational-polynomial approximation of the Wold representation. Therefore ARMA(p, q) is a special case of the Wold representation, under the condition that all roots of the AR lag-operator polynomial lie outside the unit circle.
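
The root condition used above can be checked numerically for any given AR polynomial; a minimal Python sketch (the AR(2) coefficients 1.1 and −0.18 are hypothetical, chosen only for illustration):

```python
import numpy as np

# Hypothetical AR(2): y_t = 1.1 y_{t-1} - 0.18 y_{t-2} + e_t,
# so Phi(z) = 1 - 1.1 z + 0.18 z^2.
phi = [1.0, -1.1, 0.18]            # coefficients of Phi(z) in increasing powers of z

# numpy.roots expects coefficients from the highest power down, so reverse the list.
roots = np.roots(phi[::-1])
print(roots)                       # roots of Phi(z) = 0 (here 5 and about 1.11)
print(np.all(np.abs(roots) > 1))   # True -> all roots outside the unit circle -> covariance stationary
```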

Q3

The number of observations in the sample is T = 47 (2000Q1 to 2011Q3), and we are asked to construct out-of-sample forecasts for 2011Q4 through 2012Q4, so 2012Q4 corresponds to the 5-step-ahead forecast (h = 5).

For the period 2012Q4, TIME_{2012Q4} = T + h = 52, and the seasonal dummies are D_{i,2012Q4} = 1 for i = 4 and D_{i,2012Q4} = 0 for i = 1, 2, 3.

Point forecast: ŷ_{2012Q4,2011Q3} = ŷ_{52,47} = 104.6, obtained by plugging TIME = 52 and the Q4 seasonal dummy into the estimated regression.

Interval forecast (95%): ŷ_{2012Q4,2011Q3} ± 1.96 σ̂, where σ̂² = 16 (σ̂ = 4), i.e. 104.6 ± 7.84, or y_{2012Q4,2011Q3} ∈ [96.76, 112.44].

Density forecast: y_{2012Q4,2011Q3} ~ N(104.6, 16).

Q4

(a) A precise estimate requires an efficient estimator, and the OLS estimator is efficient if homoskedasticity and no serial correlation are satisfied. Hence, to check whether her estimates are precise, we only need to check these two conditions. To test for homoskedasticity, we can perform a joint LM test or White's test. To test for serial correlation, we can check the Durbin-Watson statistic to see whether there is first-order serial correlation. More generally, we can plot the correlogram of the residuals and perform the Box-Pierce or Ljung-Box test to see whether ε_t is white noise.

(b) As mentioned in the question, ε_t has significant non-zero serial correlation, so we need to tackle the serial-correlation problem. One solution is to use an ARMA model to capture the dynamics in the residuals. A simpler alternative is to add lagged dependent variables as regressors.
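
The serial-correlation checks in Q4(a) are easy to carry out in practice; a minimal Python sketch, assuming statsmodels is available (the AR(1)-correlated residuals below are simulated purely for illustration):

```python
import numpy as np
from statsmodels.stats.stattools import durbin_watson
from statsmodels.stats.diagnostic import acorr_ljungbox

# Hypothetical regression residuals with first-order serial correlation.
rng = np.random.default_rng(1)
eta = rng.normal(size=200)
resid = np.zeros(200)
for t in range(1, 200):
    resid[t] = 0.6 * resid[t - 1] + eta[t]

# Durbin-Watson statistic: values near 2 suggest no first-order serial correlation.
print(durbin_watson(resid))

# Ljung-Box test of the joint null that the first 8 residual autocorrelations are zero.
print(acorr_ljungbox(resid, lags=[8]))
```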

(c) If we model the residuals by an ARMA process:

Step 1: Plot the correlogram of the residuals from the original model and see how many significant lags there are in the ACF and PACF. Say we observe a significant ACF up to the q̄-th lag and a significant PACF up to the p̄-th lag.

Step 2: Fit the residuals by ARMA(p, q) for p ∈ {0, 1, ..., p̄} and q ∈ {0, 1, ..., q̄}, and estimate the model again. Compare the AIC and SIC of each specification and choose the one with the lowest AIC and SIC. Say it is ARMA(p*, q*).

Step 3: Check the residuals of the model whose errors are modeled by ARMA(p*, q*) in the same way as in Step 1. In addition, perform the Box-Pierce or Ljung-Box test to seek further evidence of white noise residuals.

If the model passes the tests in Step 3, we can proceed with it and believe it produces precise estimates. If it does not, we go back to Step 2, choose the model with the second-lowest AIC and SIC, and repeat the process until we eventually find a model with white noise residuals.

Q5

(a) The 2-step-ahead forecast is

    ŷ_{T+2,T} = φ̂ ŷ_{T+1,T} + ε_{T+2,T} + θ̂_1 ε_{T+1,T} + θ̂_2 ε_{T,T}
              = φ̂ (φ̂ y_{T,T} + ε_{T+1,T} + θ̂_1 ε_{T,T} + θ̂_2 ε_{T-1,T}) + ε_{T+2,T} + θ̂_1 ε_{T+1,T} + θ̂_2 ε_{T,T}
              = φ̂² y_T + φ̂ θ̂_1 ε_T + φ̂ θ̂_2 ε_{T-1} + θ̂_2 ε_T
              = φ̂² y_T + (φ̂ θ̂_1 + θ̂_2) ε_T + φ̂ θ̂_2 ε_{T-1},

where the unrealized future innovations are replaced by their projections of zero (ε_{T+2,T} = ε_{T+1,T} = 0), while y_{T,T} = y_T, ε_{T,T} = ε_T and ε_{T-1,T} = ε_{T-1}.

(b) The 2-step-ahead forecast error is

    e_{T+2,T} = y_{T+2} − ŷ_{T+2,T}
              = (φ y_{T+1} + ε_{T+2} + θ_1 ε_{T+1} + θ_2 ε_T) − (φ² y_T + (φ θ_1 + θ_2) ε_T + φ θ_2 ε_{T-1})
              = [φ (φ y_T + ε_{T+1} + θ_1 ε_T + θ_2 ε_{T-1}) + ε_{T+2} + θ_1 ε_{T+1} + θ_2 ε_T] − [φ² y_T + (φ θ_1 + θ_2) ε_T + φ θ_2 ε_{T-1}]
              = (φ + θ_1) ε_{T+1} + ε_{T+2},

so

    Var(e_{T+2,T}) = [(φ + θ_1)² + 1] σ².

Hence the 2-step-ahead 95% interval forecast is

    φ̂² y_T + (φ̂ θ̂_1 + θ̂_2) ε_T + φ̂ θ̂_2 ε_{T-1} ± 1.96 √[(φ̂ + θ̂_1)² + 1] σ̂.
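
To make the Q5 formulas concrete, here is a minimal Python sketch; every number below (the parameter estimates, σ̂, y_T and the residuals) is hypothetical and serves only to illustrate the arithmetic:

```python
import math

# Hypothetical ARMA(1,2) estimates and end-of-sample quantities (illustration only).
phi, theta1, theta2 = 0.6, 0.4, 0.2
sigma = 1.5                          # estimated innovation standard deviation
y_T, eps_T, eps_Tm1 = 2.0, 0.3, -0.1

# Point forecast: phi^2 * y_T + (phi*theta1 + theta2) * eps_T + phi*theta2 * eps_{T-1}
point = phi**2 * y_T + (phi * theta1 + theta2) * eps_T + phi * theta2 * eps_Tm1

# Forecast-error standard deviation: sqrt[(phi + theta1)^2 + 1] * sigma
se = math.sqrt((phi + theta1) ** 2 + 1.0) * sigma

lower, upper = point - 1.96 * se, point + 1.96 * se
print(point, (lower, upper))         # 2-step-ahead point and 95% interval forecast
```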

Q6

(a) On one hand, MSE is a biased estimator of the out-of-sample 1-step-ahead prediction error variance: it provides an overly optimistic assessment of that variance. On the other hand, MSE does not penalize for the degrees of freedom used, so it never rises when more independent variables are included in the model. Hence it may lead to incorrect model selection.

(b) AIC and SIC penalize degrees of freedom by penalty factors that are functions of k/T, so a more parsimonious model can be chosen by AIC and SIC. Moreover, AIC is asymptotically efficient and SIC is consistent. Therefore, with a larger and larger sample size, we are more likely to select a good model that captures the true data-generating process.

(c) For SIC to choose a simpler model than AIC does, SIC should penalize degrees of freedom more heavily than AIC. Note that

    AIC = e^{2k/T} σ̂² = (e²)^{k/T} σ̂²  and  SIC = T^{k/T} σ̂²,

so SIC > AIC whenever T > e² ≈ 7.39, i.e. for all T ≥ 8. Therefore T should be at least 8 for SIC to choose a more parsimonious model than AIC does.

Q7

(a) Both GNI and EXP do not seem stationary. First, they appear to have upward trends, so the constant-mean condition is violated; we should detrend or take first differences. Second, there seems to be seasonality in both series, which violates the condition that the autocovariance depends only on the displacement; we should deseasonalize.

(b) Apply the Yule-Walker equations. Starting from

    ε_t = ρ_1 ε_{t-1} + ρ_2 ε_{t-2} + η_t,

multiply both sides by ε_{t-h},

    ε_t ε_{t-h} = ρ_1 ε_{t-1} ε_{t-h} + ρ_2 ε_{t-2} ε_{t-h} + η_t ε_{t-h},

and take expectations on both sides:

    γ(h) = ρ_1 γ(h−1) + ρ_2 γ(h−2) + Cov(η_t, ε_{t-h}).

For h = 0,

    γ(0) = ρ_1 γ(1) + ρ_2 γ(2) + σ²,  since Cov(η_t, ε_t) = Cov(η_t, ρ_1 ε_{t-1} + ρ_2 ε_{t-2} + η_t) = Cov(η_t, η_t) = σ².

For h ≥ 1,

    γ(h) = ρ_1 γ(h−1) + ρ_2 γ(h−2),

which for h = 1, 2 gives the system

    γ̂(1) = ρ̂_1 γ̂(0) + ρ̂_2 γ̂(1),
    γ̂(2) = ρ̂_1 γ̂(1) + ρ̂_2 γ̂(0).

Plugging in γ̂(0) = 0.5, γ̂(1) = 0.4, γ̂(2) = 0.3 and solving the system, we get ρ̂_1 = 8/9 and ρ̂_2 = −1/9.
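
The Q7(b) system is a 2×2 linear system and can be verified numerically; a minimal Python sketch:

```python
import numpy as np

# Sample autocovariances given in the question.
g0, g1, g2 = 0.5, 0.4, 0.3

# Yule-Walker system for the AR(2) error process:
#   g1 = rho1*g0 + rho2*g1
#   g2 = rho1*g1 + rho2*g0
A = np.array([[g0, g1],
              [g1, g0]])
b = np.array([g1, g2])

rho1, rho2 = np.linalg.solve(A, b)
print(rho1, rho2)   # 0.888... and -0.111..., i.e. 8/9 and -1/9
```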

(c) Recursive residuals and the CUSUM test can be used to test parameter stability. Recursive residuals come from recursive parameter estimation. For the standard linear regression model

    y_t = Σ_{i=1}^{k} β_i x_{i,t} + ε_t,  ε_t ~ iid N(0, σ²),

instead of immediately using all the data to estimate the model, we begin the estimation with the first k observations and add one observation at a time until the sample is exhausted. At each t, t = k, ..., T−1, we compute the 1-step-ahead forecast ŷ_{t+1,t} = Σ_{i=1}^{k} β̂_{i,t} x_{i,t+1}. The corresponding forecast errors are the recursive residuals, ê_{t+1,t} = y_{t+1} − ŷ_{t+1,t}. As the variance of ê_{t+1,t} grows with the sample size,

    ê_{t+1,t} ~ N(0, σ² r_t),

where r_t > 1 for all t and r_t is a function of the data. We can examine the plot of the recursive residuals and the estimated two-standard-error bands (±2 σ̂ √r_t). Under the null hypothesis of parameter stability, we expect all recursive residuals to lie inside the two-standard-error bands.

CUSUM is the cumulative sum of the standardized recursive residuals,

    CUSUM_t = Σ_{τ=k}^{t} w_{τ+1,τ},  t = k, ..., T−1,  where  w_{t+1,t} = ê_{t+1,t} / (σ̂ √r_t) ~ iid N(0, 1).

We can examine the time-series plot of the CUSUM and its 95% probability bounds. Under the null hypothesis of parameter stability, we expect the CUSUM not to cross the bounds at any point. From the results in Figure 2, although the recursive residuals do not reveal parameter instability, the CUSUM drops outside the lower band, which raises an alert for parameter instability.

(d) Since we suspect the true β is time-varying, we can propose a time-varying parameter model. Namely, the model can be modified to

    ln(EXP_t) = α + β(t) ln(GNI_t) + ε_t,

where β(t) is a function of time t.
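
One simple way to operationalize the time-varying coefficient in Q7(d), though by no means the only one, is to let β(t) be linear in time, β(t) = β_0 + β_1 t, which turns the model into an OLS regression with an interaction term. A minimal Python sketch with entirely hypothetical data standing in for ln(GNI) and ln(EXP):

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical series standing in for log GNI and log EXP (illustration only).
rng = np.random.default_rng(2)
T = 80
t = np.arange(1, T + 1)
ln_gni = 0.02 * t + rng.normal(scale=0.05, size=T)
ln_exp = 1.0 + (0.8 + 0.002 * t) * ln_gni + rng.normal(scale=0.05, size=T)

# beta(t) = b0 + b1*t  =>  ln(EXP_t) = alpha + b0*ln(GNI_t) + b1*(t*ln(GNI_t)) + e_t
X = sm.add_constant(np.column_stack([ln_gni, t * ln_gni]))
res = sm.OLS(ln_exp, X).fit()
print(res.params)   # [alpha, b0, b1]; a significant b1 is evidence of a time-varying beta
```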

Bonus

1. Generate 1000 samples from the ARMA(1,1) process

    y_t = 0.9 y_{t-1} + ε_t + ε_{t-1},

with a sample size of 500 each.

2. Fit an ARMA(1,1) model to each sample, and save all the estimated parameters and the means of the residuals in the object estimate.
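
A minimal Python sketch of this Monte Carlo exercise, assuming statsmodels is available; this is just one possible way to carry it out, and the MA coefficient of 1.0 follows the process exactly as written above:

```python
import numpy as np
from statsmodels.tsa.arima_process import arma_generate_sample
from statsmodels.tsa.arima.model import ARIMA

np.random.seed(3)
n_rep, n_obs = 1000, 500

# y_t = 0.9 y_{t-1} + e_t + e_{t-1}, i.e. AR polynomial (1 - 0.9L), MA polynomial (1 + L).
# Note the MA root sits on the invertibility boundary, so some fits may issue warnings.
ar = [1, -0.9]
ma = [1, 1.0]

estimate = []                     # rows: (ar1_hat, ma1_hat, mean of residuals)
for _ in range(n_rep):            # this loop can take a while
    y = arma_generate_sample(ar, ma, nsample=n_obs)
    res = ARIMA(y, order=(1, 0, 1), trend="n").fit()
    params = res.params           # with trend="n": [ar.L1, ma.L1, sigma2]
    estimate.append((params[0], params[1], res.resid.mean()))

estimate = np.array(estimate)
print(estimate.mean(axis=0))      # average estimates across the 1000 replications
```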
