Forecasting

Let {y_t} be a covariance stationary and ergodic process, e.g. an ARMA(p,q) process with Wold representation

  y_t = μ + Σ_{j=0}^∞ ψ_j ε_{t−j},  ε_t ~ WN(0, σ²)
      = μ + ε_t + ψ_1 ε_{t−1} + ψ_2 ε_{t−2} + ⋯

Let I_t = {y_t, y_{t−1}, …} denote the information set available at time t. Recall

  E[y_t] = μ,  var(y_t) = σ² Σ_{j=0}^∞ ψ_j²

Goal: Using I_t, produce optimal forecasts of y_{t+h} for h = 1, 2, …, s.

Define y_{t+h|t} as the forecast of y_{t+h} based on I_t with known parameters. The forecast error is

  ε_{t+h|t} = y_{t+h} − y_{t+h|t}

and the mean squared error of the forecast is

  MSE(ε_{t+h|t}) = E[ε_{t+h|t}²] = E[(y_{t+h} − y_{t+h|t})²]

Theorem: The minimum MSE forecast (best forecast) of y_{t+h} based on I_t is

  y_{t+h|t} = E[y_{t+h} | I_t]

Proof: See Hamilton.

Note:

  y_{t+h} = μ + ε_{t+h} + ψ_1 ε_{t+h−1} + ⋯ + ψ_{h−1} ε_{t+1} + ψ_h ε_t + ψ_{h+1} ε_{t−1} + ⋯
Remarks

1. The computation of E[y_{t+h} | I_t] depends on the distribution of {ε_t} and may be a very complicated nonlinear function of the history of {ε_t}. Even if {ε_t} is an uncorrelated process (e.g. white noise), it may be the case that E[ε_{t+1} | I_t] ≠ 0.

2. If {ε_t} is independent white noise, then E[ε_{t+1} | I_t] = 0 and E[y_{t+h} | I_t] will be a simple linear function of {ε_t}:

  y_{t+h|t} = μ + ψ_h ε_t + ψ_{h+1} ε_{t−1} + ⋯

Linear Predictors

A linear predictor of y_{t+h} is a linear function of the variables in I_t.

Theorem: The minimum MSE linear forecast (best linear predictor) of y_{t+h} based on I_t is

  y_{t+h|t} = μ + ψ_h ε_t + ψ_{h+1} ε_{t−1} + ⋯

Proof: See Hamilton, page 74.

The forecast error of the best linear predictor is

  ε_{t+h|t} = y_{t+h} − y_{t+h|t}
            = (μ + ε_{t+h} + ψ_1 ε_{t+h−1} + ⋯ + ψ_{h−1} ε_{t+1} + ψ_h ε_t + ⋯) − (μ + ψ_h ε_t + ψ_{h+1} ε_{t−1} + ⋯)
            = ε_{t+h} + ψ_1 ε_{t+h−1} + ⋯ + ψ_{h−1} ε_{t+1}

and the MSE of the forecast error is

  MSE(ε_{t+h|t}) = σ²(1 + ψ_1² + ⋯ + ψ_{h−1}²)
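The MSE formula for the best linear predictor is easy to check numerically. A minimal sketch, where the function name `blp_mse` and the ψ values are illustrative assumptions, not from the notes:

```python
# h-step forecast MSE of the best linear predictor from Wold weights:
# MSE(eps_{t+h|t}) = sigma^2 * (1 + psi_1^2 + ... + psi_{h-1}^2)

def blp_mse(sigma2, psi, h):
    """MSE of the h-step BLP given innovation variance sigma2 and
    Wold coefficients psi = [psi_1, psi_2, ...] (psi_0 = 1 implied)."""
    weights = [1.0] + list(psi[:h - 1])   # 1, psi_1, ..., psi_{h-1}
    return sigma2 * sum(w ** 2 for w in weights)

# Illustrative AR(1)-like weights psi_j = 0.8^j
print(blp_mse(1.0, [0.8, 0.64], 1))  # 1.0 (one-step MSE is sigma^2)
print(blp_mse(1.0, [0.8, 0.64], 2))  # sigma^2 (1 + 0.8^2) ≈ 1.64
```

Note how the one-step MSE is always σ², since only the unforecastable innovation ε_{t+1} remains.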
Remarks

1. E[ε_{t+h|t}] = 0.
2. ε_{t+h|t} is uncorrelated with any element in I_t.
3. The form of y_{t+h|t} is closely related to the IRF.
4. MSE(ε_{t+h|t}) = var(ε_{t+h|t}) ≤ var(y_t).
5. lim_{h→∞} y_{t+h|t} = μ.
6. lim_{h→∞} MSE(ε_{t+h|t}) = var(y_t).

Example: BLP for MA(1) process

Here

  y_t = μ + ε_t + θ ε_{t−1},  ε_t ~ WN(0, σ²)

so ψ_1 = θ and ψ_h = 0 for h > 1. Therefore,

  y_{t+1|t} = μ + θ ε_t
  y_{t+2|t} = μ
  y_{t+h|t} = μ for h > 1

The forecast errors and MSEs are

  ε_{t+1|t} = ε_{t+1},           MSE(ε_{t+1|t}) = σ²
  ε_{t+2|t} = ε_{t+2} + θ ε_{t+1},  MSE(ε_{t+2|t}) = σ²(1 + θ²)
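The MA(1) example above can be sketched in a few lines; the function name `ma1_forecast` and the numbers are illustrative:

```python
# Best linear predictor for an MA(1): y_t = mu + eps_t + theta*eps_{t-1}.
# The forecast collapses to the unconditional mean after one step.

def ma1_forecast(mu, theta, eps_t, h):
    """h-step BLP of an MA(1) given the current innovation eps_t."""
    return mu + theta * eps_t if h == 1 else mu

print(ma1_forecast(1.0, 0.5, 2.0, 1))  # mu + theta*eps_t = 2.0
print(ma1_forecast(1.0, 0.5, 2.0, 2))  # mu = 1.0
```

This illustrates remark 5 directly: for any h > 1 the MA(1) forecast is already at its long-run limit μ.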
Prediction Confidence Intervals

If {ε_t} is Gaussian, then

  y_{t+h} | I_t ~ N(y_{t+h|t}, σ²(1 + ψ_1² + ⋯ + ψ_{h−1}²))

A 95% confidence interval for the h-step prediction has the form

  y_{t+h|t} ± 1.96 · sqrt(σ²(1 + ψ_1² + ⋯ + ψ_{h−1}²))

Predictions with Estimated Parameters

Let ŷ_{t+h|t} denote the BLP with estimated parameters:

  ŷ_{t+h|t} = μ̂ + ψ̂_h ε̂_t + ψ̂_{h+1} ε̂_{t−1} + ⋯

where ε̂_t is the estimated residual from the fitted model. The forecast error with estimated parameters is

  ε̂_{t+h|t} = y_{t+h} − ŷ_{t+h|t}
            = (μ − μ̂) + ε_{t+h} + ψ_1 ε_{t+h−1} + ⋯ + ψ_{h−1} ε_{t+1} + (ψ_h ε_t − ψ̂_h ε̂_t) + (ψ_{h+1} ε_{t−1} − ψ̂_{h+1} ε̂_{t−1}) + ⋯

Obviously, MSE(ε̂_{t+h|t}) ≠ MSE(ε_{t+h|t}) = σ²(1 + ψ_1² + ⋯ + ψ_{h−1}²).

Note: Most software computes

  MSE^(ε_{t+h|t}) = σ̂²(1 + ψ̂_1² + ⋯ + ψ̂_{h−1}²)

which ignores estimation error in the parameters.
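The Gaussian interval formula can be sketched as follows; `prediction_interval` is a hypothetical helper, not a library function:

```python
import math

def prediction_interval(y_hat, sigma2, psi, h, z=1.96):
    """95% interval for the h-step forecast under Gaussian errors:
    y_{t+h|t} +/- z * sqrt(sigma^2 (1 + psi_1^2 + ... + psi_{h-1}^2))."""
    mse = sigma2 * (1.0 + sum(p ** 2 for p in psi[:h - 1]))
    half = z * math.sqrt(mse)
    return y_hat - half, y_hat + half

# One-step interval: only sigma^2 enters, so the half-width is 1.96*sigma
lo, hi = prediction_interval(0.0, 1.0, [0.5], 1)
print(round(lo, 2), round(hi, 2))  # -1.96 1.96
```

Because the ψ_j² terms accumulate with h, the interval widens monotonically toward the ±1.96·sd(y_t) band as h grows.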
Computing the Best Linear Predictor

The BLP y_{t+h|t} may be computed in many different but equivalent ways. The algorithm for computing y_{t+h|t} from an AR(1) model is simple, and the methodology extends to general ARMA models as well as multivariate models.

Example: AR(1) Model

  y_t − μ = φ(y_{t−1} − μ) + ε_t,  ε_t ~ WN(0, σ²)

with μ, φ, σ² known. In the Wold representation, ψ_j = φ^j. Starting at t and iterating forward h periods gives

  y_{t+h} = μ + φ^h(y_t − μ) + ε_{t+h} + φ ε_{t+h−1} + ⋯ + φ^{h−1} ε_{t+1}
          = μ + φ^h(y_t − μ) + ε_{t+h} + ψ_1 ε_{t+h−1} + ⋯ + ψ_{h−1} ε_{t+1}

The best linear forecasts of y_{t+1}, y_{t+2}, …, y_{t+h} are computed using the chain rule of forecasting (law of iterated projections):

  y_{t+1|t} = μ + φ(y_t − μ)
  y_{t+2|t} = μ + φ(y_{t+1|t} − μ) = μ + φ²(y_t − μ)
  ⋮
  y_{t+h|t} = μ + φ(y_{t+h−1|t} − μ) = μ + φ^h(y_t − μ)

The corresponding forecast errors are

  ε_{t+1|t} = y_{t+1} − y_{t+1|t} = ε_{t+1}
  ε_{t+2|t} = y_{t+2} − y_{t+2|t} = ε_{t+2} + φ ε_{t+1} = ε_{t+2} + ψ_1 ε_{t+1}
  ⋮
  ε_{t+h|t} = y_{t+h} − y_{t+h|t} = ε_{t+h} + φ ε_{t+h−1} + ⋯ + φ^{h−1} ε_{t+1} = ε_{t+h} + ψ_1 ε_{t+h−1} + ⋯ + ψ_{h−1} ε_{t+1}
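The chain rule above can be sketched as a short iteration; `ar1_forecasts` and the numbers below are illustrative:

```python
# Chain rule of forecasting for an AR(1): iterating one step at a time
# reproduces the closed form y_{t+h|t} = mu + phi^h (y_t - mu).

def ar1_forecasts(mu, phi, y_t, hmax):
    """Return [y_{t+1|t}, ..., y_{t+hmax|t}] by iterating the chain rule."""
    forecasts, prev = [], y_t
    for _ in range(hmax):
        prev = mu + phi * (prev - mu)   # y_{t+h|t} = mu + phi*(y_{t+h-1|t} - mu)
        forecasts.append(prev)
    return forecasts

print(ar1_forecasts(0.0, 0.5, 8.0, 3))  # [4.0, 2.0, 1.0]
```

Each forecast feeds the next, which is exactly the law of iterated projections; the geometric decay toward μ mirrors ψ_j = φ^j.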
The forecast error variances are

  var(ε_{t+1|t}) = σ²
  var(ε_{t+2|t}) = σ²(1 + φ²) = σ²(1 + ψ_1²)
  ⋮
  var(ε_{t+h|t}) = σ²(1 + φ² + ⋯ + φ^{2(h−1)}) = σ²(1 − φ^{2h})/(1 − φ²) = σ²(1 + ψ_1² + ⋯ + ψ_{h−1}²)

Clearly,

  lim_{h→∞} y_{t+h|t} = μ = E[y_t]
  lim_{h→∞} var(ε_{t+h|t}) = σ²/(1 − φ²) = σ² Σ_{j=0}^∞ ψ_j² = var(y_t)

AR(p) Models

Consider the AR(p) model

  φ(L)(y_t − μ) = ε_t,  ε_t ~ WN(0, σ²)
  φ(L) = 1 − φ_1 L − ⋯ − φ_p L^p

The forecasting algorithm for AR(p) models is essentially the same as that for AR(1) models once we put the AR(p) model in state space form. Let X_t = y_t − μ. The AR(p) in state space form is

  ξ_t = F ξ_{t−1} + w_t,  var(w_t) = Σ_w

where

  ξ_t = (X_t, X_{t−1}, …, X_{t−p+1})′,  w_t = (ε_t, 0, …, 0)′

and F is the p×p companion matrix with first row (φ_1, φ_2, …, φ_p), ones on the subdiagonal, and zeros elsewhere.
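Building the companion matrix is mechanical; `companion` is a hypothetical helper written out here as a sketch:

```python
# Companion (state-space) form of an AR(p): first row holds phi_1..phi_p,
# the subdiagonal is the identity, everything else is zero.

def companion(phis):
    """Build the p x p companion matrix F for AR coefficients phis."""
    p = len(phis)
    F = [[0.0] * p for _ in range(p)]
    F[0] = list(phis)          # first row: (phi_1, ..., phi_p)
    for i in range(1, p):
        F[i][i - 1] = 1.0      # subdiagonal of ones shifts the state down
    return F

print(companion([0.5, 0.3]))  # [[0.5, 0.3], [1.0, 0.0]]
```

For p = 1 this degenerates to the scalar F = [[φ]], so the AR(1) algorithm really is the special case.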
Starting at t and iterating forward h periods gives

  ξ_{t+h} = F^h ξ_t + w_{t+h} + F w_{t+h−1} + ⋯ + F^{h−1} w_{t+1}

Then the best linear forecasts of y_{t+1}, y_{t+2}, …, y_{t+h}, computed using the chain rule of forecasting, are

  ξ_{t+1|t} = F ξ_t
  ξ_{t+2|t} = F ξ_{t+1|t} = F² ξ_t
  ⋮
  ξ_{t+h|t} = F ξ_{t+h−1|t} = F^h ξ_t

The forecast of y_{t+h} is μ plus the first element of

  ξ_{t+h|t} = F^h (y_t − μ, y_{t−1} − μ, …, y_{t−p+1} − μ)′

The forecast errors are

  w_{t+1|t} = ξ_{t+1} − ξ_{t+1|t} = w_{t+1}
  w_{t+2|t} = ξ_{t+2} − ξ_{t+2|t} = w_{t+2} + F w_{t+1}
  ⋮
  w_{t+h|t} = ξ_{t+h} − ξ_{t+h|t} = w_{t+h} + F w_{t+h−1} + ⋯ + F^{h−1} w_{t+1}

and the corresponding forecast MSE matrices are

  var(w_{t+1|t}) = var(w_t) = Σ_w
  var(w_{t+2|t}) = var(w_{t+2}) + F var(w_{t+1}) F′ = Σ_w + F Σ_w F′
  ⋮
  var(w_{t+h|t}) = Σ_{j=0}^{h−1} F^j Σ_w (F^j)′

Notice that

  var(w_{t+h|t}) = Σ_w + F var(w_{t+h−1|t}) F′
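The recursion V_h = Σ_w + F V_{h−1} F′ can be sketched directly; the helper names (`matmul`, `forecast_mse`) are illustrative and the matrices are kept as plain lists to stay self-contained:

```python
# Recursion for the h-step forecast MSE matrix:
# V_1 = Sigma_w, V_h = Sigma_w + F V_{h-1} F'.

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def forecast_mse(F, Sigma_w, h):
    """Return var(w_{t+h|t}) via the recursion V_h = Sigma_w + F V_{h-1} F'."""
    V = [row[:] for row in Sigma_w]  # V_1 = Sigma_w
    for _ in range(h - 1):
        FVFt = matmul(matmul(F, V), transpose(F))
        V = [[s + f for s, f in zip(srow, frow)] for srow, frow in zip(Sigma_w, FVFt)]
    return V

# AR(1) as a 1x1 state space: V_2 = sigma^2 (1 + phi^2)
print(forecast_mse([[0.5]], [[1.0]], 2))  # [[1.25]]
```

In the scalar case the recursion reproduces σ²(1 + φ² + ⋯ + φ^{2(h−1)}), matching the AR(1) variances on the previous page.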
Forecast Evaluation: Diebold-Mariano Test for Equal Predictive Accuracy

Let {y_t} denote the series to be forecast, and let y¹_{t+h|t} and y²_{t+h|t} denote two competing forecasts of y_{t+h} based on I_t. For example, y¹_{t+h|t} could be computed from an AR(p) model and y²_{t+h|t} from an ARMA(p,q) model. The forecast errors from the two models are

  ε¹_{t+h|t} = y_{t+h} − y¹_{t+h|t}
  ε²_{t+h|t} = y_{t+h} − y²_{t+h|t}

The h-step forecasts are assumed to be computed for t = t₀, …, T, for a total of T₀ forecasts, giving

  {ε¹_{t+h|t}}_{t=t₀}^T,  {ε²_{t+h|t}}_{t=t₀}^T

Because the h-step forecasts use overlapping data, the forecast errors in {ε¹_{t+h|t}}_{t=t₀}^T and {ε²_{t+h|t}}_{t=t₀}^T will be serially correlated.
The accuracy of each forecast is measured by a particular loss function

  L(y_{t+h}, yⁱ_{t+h|t}) = L(εⁱ_{t+h|t}),  i = 1, 2

Some popular loss functions are:

  L(εⁱ_{t+h|t}) = (εⁱ_{t+h|t})² : squared error loss
  L(εⁱ_{t+h|t}) = |εⁱ_{t+h|t}| : absolute error loss

To determine whether one model predicts better than another, we may test the null hypothesis

  H₀: E[L(ε¹_{t+h|t})] = E[L(ε²_{t+h|t})]

against the alternative

  H₁: E[L(ε¹_{t+h|t})] ≠ E[L(ε²_{t+h|t})]

The Diebold-Mariano test is based on the loss differential

  d_t = L(ε¹_{t+h|t}) − L(ε²_{t+h|t})

The null of equal predictive accuracy is then

  H₀: E[d_t] = 0

The Diebold-Mariano test statistic is

  S = d̄ / (avar̂(d̄))^{1/2} = d̄ / (LRV̂_d / T₀)^{1/2}

where

  d̄ = (1/T₀) Σ_{t=t₀}^T d_t
  LRV_d = γ₀ + 2 Σ_{j=1}^∞ γ_j,  γ_j = cov(d_t, d_{t−j})

Note: The long-run variance is used in the statistic because the sample of loss differentials {d_t}_{t=t₀}^T is serially correlated for h > 1.
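The statistic can be sketched in a few lines; this is a minimal version that truncates the long-run variance at h−1 autocovariance lags (one common choice, assumed here), with `dm_statistic` as a hypothetical helper:

```python
# Diebold-Mariano statistic from a sample of loss differentials d_t,
# using a truncated long-run variance with h-1 autocovariance lags.

def dm_statistic(d, h):
    """S = dbar / sqrt(LRV_hat / T0) for loss differentials d and horizon h."""
    T0 = len(d)
    dbar = sum(d) / T0

    def gamma(j):  # sample autocovariance of d_t at lag j
        return sum((d[t] - dbar) * (d[t - j] - dbar) for t in range(j, T0)) / T0

    lrv = gamma(0) + 2.0 * sum(gamma(j) for j in range(1, h))
    return dbar / (lrv / T0) ** 0.5

# Illustrative loss differentials (made up): model 1 consistently loses
d = [0.4, 0.1, 0.3, 0.2, 0.5, 0.3, 0.2, 0.4]
print(abs(dm_statistic(d, 1)) > 1.96)  # True: reject equal accuracy
```

For h = 1 the sum of lagged autocovariances is empty and LRV̂ reduces to γ̂₀, the ordinary sample variance.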
Diebold and Mariano (1995) show that, under the null of equal predictive accuracy,

  S ~ᴬ N(0, 1)

So we reject the null of equal predictive accuracy at the 5% level if |S| > 1.96. One-sided tests may also be computed.