Forecasting

1. Optimal Forecast Criterion - Minimum Mean Square Error Forecast

We have now considered how to determine which ARIMA model we should fit to our data; we have also examined how to estimate the parameters of the ARIMA model and how to check the goodness of fit of our model. So we are finally in a position to use our model to forecast new values of our time series.

Suppose we have observed a time series up to time $T$, so that we know the values $X_1, X_2, \dots, X_T$, and consequently we also know $Z_1, Z_2, \dots, Z_T$. This collection of known information will be denoted as:

$$I_T = \{X_1, X_2, \dots, X_T;\ Z_1, Z_2, \dots, Z_T\}$$

Suppose that we want to forecast a value for $X_{T+k}$, $k$ time units into the future. We now introduce the most prevalent optimality criterion for forecasting.

Theorem: The optimal $k$-step-ahead forecast $X_{T+k,T}$ (which is a function of $I_T$) that minimizes the mean square error $E(X_{T+k} - X_{T+k,T})^2$ is

$$X_{T+k,T} = E(X_{T+k} \mid I_T)$$

This optimal forecast is referred to as the Minimum Mean Square Error Forecast. It is unbiased because

$$E(X_{T+k} - X_{T+k,T}) = E\left[E\left((X_{T+k} - X_{T+k,T}) \mid I_T\right)\right] = E\left[X_{T+k,T} - X_{T+k,T}\right] = 0$$

Lemma: Suppose $Z$ and $W$ are real random variables. Then

$$\min_h E\left[(Z - h(W))^2\right] = E\left[(Z - E(Z \mid W))^2\right]$$
That is, the posterior mean $E(Z \mid W)$ minimizes the quadratic loss (mean square error).

Proof:

$$E\left[(Z - h(W))^2\right] = E\left[(Z - E(Z \mid W) + E(Z \mid W) - h(W))^2\right]$$
$$= E\left[(Z - E(Z \mid W))^2\right] + 2E\left[(Z - E(Z \mid W))(E(Z \mid W) - h(W))\right] + E\left[(E(Z \mid W) - h(W))^2\right]$$

Conditioning on $W$, the cross term is zero. Thus

$$E\left[(Z - h(W))^2\right] = E\left[(Z - E(Z \mid W))^2\right] + E\left[(E(Z \mid W) - h(W))^2\right] \ge E\left[(Z - E(Z \mid W))^2\right]$$

with equality when $h(W) = E(Z \mid W)$.

Note: We usually omit the word "optimal" in the $k$-periods-ahead (optimal) forecast because, in the time series context, when we refer to a forecast we mean the optimal minimum mean square error forecast by default.
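The lemma is easy to illustrate by simulation. Below is a minimal sketch (Python/NumPy; not part of the original notes) using a jointly Gaussian pair for which $E(Z \mid W) = 0.8W$ exactly; the coefficients 0.8 and 0.5 are arbitrary choices for illustration. Any other predictor $h(W)$ incurs a strictly larger mean square error:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Jointly Gaussian pair: Z = 0.8*W + noise, so E(Z | W) = 0.8*W exactly.
W = rng.normal(size=n)
Z = 0.8 * W + rng.normal(scale=0.5, size=n)

posterior_mean = 0.8 * W          # the conditional mean E(Z | W)
alternative    = 0.5 * W + 0.1    # any other predictor h(W)

print(np.mean((Z - posterior_mean) ** 2))  # ~0.25 = Var(noise)
print(np.mean((Z - alternative) ** 2))     # strictly larger (~0.35 here)
```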
2. Forecast in MA(1)

The MA(1) model is:

$$X_t = Z_t + \theta Z_{t-1}, \quad Z_t \sim WN(0, \sigma^2)$$

Given the observed data $\{X_1, X_2, \dots, X_T\}$, the white noise terms $Z_t$ are not directly observed; however, we can perform a recursive estimation of the white noise as follows. Given the initial condition $Z_0$ (not important), we have:

$$Z_1 = X_1 - \theta Z_0$$
$$Z_2 = X_2 - \theta Z_1$$
$$\vdots$$
$$Z_T = X_T - \theta Z_{T-1}$$

Therefore, given $\{X_1, X_2, \dots, X_T\}$, we can also obtain $\{Z_1, Z_2, \dots, Z_T\}$, so the entire information known can be written as:

$$I_T = \{X_1, X_2, \dots, X_T;\ Z_1, Z_2, \dots, Z_T\}$$

Since $X_{T+1} = Z_{T+1} + \theta Z_T$, the one-period-ahead (optimal) forecast is

$$X_{T+1,T} = E(X_{T+1} \mid I_T) = E(Z_{T+1} + \theta Z_T \mid I_T) = E(Z_{T+1} \mid I_T) + \theta E(Z_T \mid I_T) = E(Z_{T+1}) + \theta E(Z_T \mid Z_T) = 0 + \theta Z_T = \theta Z_T$$

Note: Per convention in the time series literature, we wrote $E(Z_T \mid Z_T) = Z_T$, although one can write this more rigorously as $E(Z_T \mid Z_T = z_T) = z_T$. Similarly, we write $E(X_T \mid X_T) = X_T$, which can be written more rigorously as $E(X_T \mid X_T = x_T) = x_T$.

The forecast error is

$$e_{T+1,T} = X_{T+1} - X_{T+1,T} = Z_{T+1}$$

Since the forecast is unbiased, the minimum mean square error equals the forecast error variance:

$$E(X_{T+1} - X_{T+1,T})^2 = Var(X_{T+1} - X_{T+1,T}) = Var(Z_{T+1}) = \sigma^2$$

Given $I_T = \{X_1, \dots, X_T;\ Z_1, \dots, Z_T\}$ and $X_{T+2} = Z_{T+2} + \theta Z_{T+1}$, the two-periods-ahead (optimal) forecast is

$$X_{T+2,T} = E(X_{T+2} \mid I_T) = E(Z_{T+2} + \theta Z_{T+1} \mid I_T) = E(Z_{T+2} \mid I_T) + \theta E(Z_{T+1} \mid I_T) = 0 + \theta \cdot 0 = 0$$
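The recursive estimation translates directly into a few lines of code. A minimal sketch (Python/NumPy; not from the notes), assuming the observed data sit in an array `x` and that $\theta$ is known; the simulated series and the value $\theta = 0.6$ are illustrative assumptions, and in practice $\theta$ would be an estimate:

```python
import numpy as np

def ma1_one_step_forecast(x, theta, z0=0.0):
    """Recursively back out the white-noise terms Z_t = X_t - theta*Z_{t-1},
    then return the one-step-ahead MMSE forecast theta*Z_T."""
    z = z0
    for x_t in x:
        z = x_t - theta * z
    return theta * z

# Simulated MA(1) data with a hypothetical theta, for illustration only.
rng = np.random.default_rng(1)
theta, sigma = 0.6, 1.0
noise = rng.normal(scale=sigma, size=501)
x = noise[1:] + theta * noise[:-1]

print(ma1_one_step_forecast(x, theta))  # approximately theta * Z_T
```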
The forecast error is

$$e_{T+2,T} = X_{T+2} - X_{T+2,T} = Z_{T+2} + \theta Z_{T+1}$$

Since the forecast is unbiased, the minimum mean square error equals the forecast error variance:

$$E(X_{T+2} - X_{T+2,T})^2 = Var(X_{T+2} - X_{T+2,T}) = Var(Z_{T+2} + \theta Z_{T+1}) = (1 + \theta^2)\sigma^2$$

In summary, for more than one period ahead, just as in the two-periods-ahead case, the forecast is zero, the unconditional mean. That is,

$$X_{T+2,T} = E(X_{T+2} \mid I_T) = 0, \quad X_{T+3,T} = E(X_{T+3} \mid I_T) = 0, \quad \dots$$

The MA(1) process is not forecastable more than one period ahead (apart from the unconditional mean).

Forecast for MA(1) with Intercept (non-zero mean)

If the MA(1) model includes an intercept,

$$X_t = \mu + Z_t + \theta Z_{t-1}, \quad Z_t \sim WN(0, \sigma^2)$$

we can perform forecasting using the same approach. For example, since $X_{T+1} = \mu + Z_{T+1} + \theta Z_T$, the one-period-ahead (optimal) forecast is

$$X_{T+1,T} = E(X_{T+1} \mid I_T) = E(\mu + Z_{T+1} + \theta Z_T \mid I_T) = \mu + E(Z_{T+1} \mid I_T) + \theta E(Z_T \mid I_T) = \mu + 0 + \theta Z_T = \mu + \theta Z_T$$

$$e_{T+1,T} = X_{T+1} - X_{T+1,T} = Z_{T+1}$$

Since the forecast is unbiased, the minimum mean square error equals the forecast error variance:

$$E(X_{T+1} - X_{T+1,T})^2 = Var(X_{T+1} - X_{T+1,T}) = Var(Z_{T+1}) = \sigma^2$$

3. Generalization to MA(q)

For an MA(q) process, we can forecast up to $q$ out-of-sample periods. Can you write down the (optimal) point forecast for each period, the forecast error, and the forecast error variance? Please practice before the exam.
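As a study aid (not part of the notes), the general MA(q) pattern can be checked numerically: for horizon $k \le q$ only the already-realized noise terms $Z_T, Z_{T-1}, \dots$ contribute, and for $k > q$ the forecast collapses to the unconditional mean 0. A hedged sketch under the convention $\theta_0 = 1$; the helper name `maq_forecasts` and the example values are hypothetical:

```python
import numpy as np

def maq_forecasts(z, thetas, sigma2, k_max):
    """MMSE forecasts for X_t = Z_t + thetas[0]*Z_{t-1} + ... + thetas[q-1]*Z_{t-q},
    given the recovered noise history z = [Z_1, ..., Z_T].
    Returns (forecast, error variance) for each horizon k = 1..k_max."""
    coefs = np.concatenate(([1.0], np.asarray(thetas, dtype=float)))  # theta_0 = 1
    q = len(thetas)
    out = []
    for k in range(1, k_max + 1):
        # Future Z's have conditional mean 0; only terms with index <= T survive,
        # i.e. coefficients theta_j with j >= k, paired with Z_{T+k-j}.
        fc = sum(coefs[j] * z[-(j - k) - 1] for j in range(k, q + 1) if (j - k) < len(z))
        # Error = sum of the first min(k, q+1) noise terms => variance below.
        var = sigma2 * float(np.sum(coefs[:min(k, q + 1)] ** 2))
        out.append((fc, var))
    return out

# MA(2) illustration: forecasts theta1*Z_T + theta2*Z_{T-1}, theta2*Z_T, then 0.
print(maq_forecasts(z=[0.2, -0.5, 0.9], thetas=[0.4, 0.3], sigma2=1.0, k_max=3))
```

With $q = 1$ this reproduces the MA(1) results above: $\theta Z_T$ with variance $\sigma^2$ at $k = 1$, then $0$ with variance $(1 + \theta^2)\sigma^2$.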
4. Forecast in AR(1)

For the AR(1) model,

$$X_t = \phi X_{t-1} + Z_t, \quad Z_t \sim WN(0, \sigma^2)$$

given $I_T = \{X_1, \dots, X_T;\ Z_1, \dots, Z_T\}$, the one-period-ahead (optimal) forecast can be computed as follows. Since

$$X_{T+1} = \phi X_T + Z_{T+1}$$

given the data $I_T$, the one-period-ahead forecast is

$$X_{T+1,T} = E(X_{T+1} \mid I_T) = E(\phi X_T + Z_{T+1} \mid I_T) = \phi E(X_T \mid I_T) + E(Z_{T+1} \mid I_T) = \phi X_T + 0 = \phi X_T$$

$$e_{T+1,T} = X_{T+1} - X_{T+1,T} = Z_{T+1}$$

The forecast error variance:

$$Var(e_{T+1,T}) = Var(Z_{T+1}) = \sigma^2$$

Two Periods Ahead Forecast

By back-substitution,

$$X_{T+2} = \phi X_{T+1} + Z_{T+2} = \phi(\phi X_T + Z_{T+1}) + Z_{T+2} = \phi^2 X_T + \phi Z_{T+1} + Z_{T+2}$$

So, given the data $I_T$, the two-periods-ahead (optimal) forecast is

$$X_{T+2,T} = E(X_{T+2} \mid I_T) = E(\phi^2 X_T + \phi Z_{T+1} + Z_{T+2} \mid I_T) = \phi^2 X_T$$

That is, the two-periods-ahead forecast is also a linear function of the final observed value, but with the coefficient $\phi^2$.

$$e_{T+2,T} = X_{T+2} - X_{T+2,T} = \phi Z_{T+1} + Z_{T+2}$$

The forecast error variance is:

$$Var(e_{T+2,T}) = Var(\phi Z_{T+1} + Z_{T+2}) = (1 + \phi^2)\sigma^2$$

k Periods Ahead Forecast

Similarly,

$$X_{T+k} = \phi^k X_T + \phi^{k-1} Z_{T+1} + \dots + \phi Z_{T+k-1} + Z_{T+k}$$

So, given $I_T$, the $k$-periods-ahead optimal forecast is

$$X_{T+k,T} = E(X_{T+k} \mid I_T) = \phi^k X_T$$

AR(1) with Intercept

If the AR(1) model includes an intercept,

$$X_t = \alpha + \phi X_{t-1} + Z_t, \quad Z_t \sim WN(0, \sigma^2)$$

then the one-period-ahead forecast is

$$X_{T+1,T} = E(X_{T+1} \mid I_T) = \alpha + \phi E(X_T \mid I_T) + E(Z_{T+1} \mid I_T) = \alpha + \phi X_T$$

What is the two-periods-ahead forecast?
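These closed forms are easy to check numerically, and the sketch below (Python/NumPy; not from the notes) can also serve as a hint for the question just asked. The parameter values in the example call are arbitrary; with $\alpha = 0$ it reproduces $\phi^k X_T$ and the error variance $(1 + \phi^2 + \dots + \phi^{2(k-1)})\sigma^2$:

```python
import numpy as np

def ar1_forecasts(x_T, phi, sigma2, k_max, alpha=0.0):
    """k-step-ahead MMSE forecasts and error variances for
    X_t = alpha + phi*X_{t-1} + Z_t, computed recursively."""
    forecasts, variances = [], []
    fc, var = x_T, 0.0
    for k in range(1, k_max + 1):
        fc = alpha + phi * fc        # X_{T+k,T} = alpha + phi * X_{T+k-1,T}
        var = sigma2 + phi**2 * var  # adds phi^{2(k-1)} * sigma2 at each step
        forecasts.append(fc)
        variances.append(var)
    return np.array(forecasts), np.array(variances)

print(ar1_forecasts(x_T=2.0, phi=0.7, sigma2=1.0, k_max=3))
# forecasts: [1.4, 0.98, 0.686]; variances: [1.0, 1.49, 1.7301]
```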
Forecast AR(1) Recursively

Alternatively, rather than using the back-substitution method directly as we have shown above, we can forecast the AR(1) recursively as follows. Since

$$X_{T+2} = \phi X_{T+1} + Z_{T+2}$$

given the data $I_T$, the two-periods-ahead (optimal) forecast is

$$X_{T+2,T} = E(X_{T+2} \mid I_T) = E(\phi X_{T+1} + Z_{T+2} \mid I_T) = \phi E(X_{T+1} \mid I_T) = \phi X_{T+1,T}$$

Similarly, one can obtain the general recursive forecast relationship:

$$X_{T+k,T} = E(X_{T+k} \mid I_T) = E(\phi X_{T+k-1} + Z_{T+k} \mid I_T) = \phi E(X_{T+k-1} \mid I_T) = \phi X_{T+k-1,T} = \dots = \phi^{k-1} X_{T+1,T}$$

This way, once you have computed the one-step-ahead forecast,

$$X_{T+1,T} = E(X_{T+1} \mid I_T) = E(\phi X_T + Z_{T+1} \mid I_T) = \phi E(X_T \mid I_T) + E(Z_{T+1} \mid I_T) = \phi X_T + 0 = \phi X_T$$

you can obtain the rest of the forecasts easily.

Note: As mentioned in class, I prefer the back-substitution method (introduced first) in general because it also provides you with the forecast error directly.

5. AR(p) Process

Consider an AR(2) model with intercept,

$$X_t = \alpha + \phi_1 X_{t-1} + \phi_2 X_{t-2} + Z_t, \quad Z_t \sim WN(0, \sigma^2)$$

We have

$$X_{T+1} = \alpha + \phi_1 X_T + \phi_2 X_{T-1} + Z_{T+1}$$

Then the one-period-ahead forecast is

$$X_{T+1,T} = E(X_{T+1} \mid I_T) = \alpha + \phi_1 E(X_T \mid I_T) + \phi_2 E(X_{T-1} \mid I_T) + E(Z_{T+1} \mid I_T) = \alpha + \phi_1 X_T + \phi_2 X_{T-1}$$

The two-periods-ahead forecast (by the recursive forecasting method) is

$$X_{T+2,T} = E(X_{T+2} \mid I_T) = \alpha + \phi_1 E(X_{T+1} \mid I_T) + \phi_2 E(X_T \mid I_T) + E(Z_{T+2} \mid I_T)$$
$$= \alpha + \phi_1 X_{T+1,T} + \phi_2 X_T = \alpha + \phi_1(\alpha + \phi_1 X_T + \phi_2 X_{T-1}) + \phi_2 X_T$$
$$= (1 + \phi_1)\alpha + (\phi_1^2 + \phi_2)X_T + \phi_1\phi_2 X_{T-1}$$
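A short sketch (Python/NumPy; an illustration, not the course's official code) of the recursive forecasting method for the AR(2) with intercept; the sanity check compares the two-step forecast against the closed form derived above, with arbitrary parameter values:

```python
import numpy as np

def ar2_forecasts(x_T, x_Tm1, alpha, phi1, phi2, k_max):
    """Recursive MMSE forecasts for the AR(2) model with intercept:
    X_{T+k,T} = alpha + phi1*X_{T+k-1,T} + phi2*X_{T+k-2,T},
    where forecasts of already-observed values are the values themselves."""
    hist = [x_Tm1, x_T]  # ..., X_{T-1}, X_T
    for _ in range(k_max):
        hist.append(alpha + phi1 * hist[-1] + phi2 * hist[-2])
    return np.array(hist[2:])  # [X_{T+1,T}, ..., X_{T+k_max,T}]

# Sanity check against (1 + phi1)*alpha + (phi1^2 + phi2)*X_T + phi1*phi2*X_{T-1}:
alpha, phi1, phi2, xT, xTm1 = 0.5, 0.6, 0.2, 1.0, 2.0
fc = ar2_forecasts(xT, xTm1, alpha, phi1, phi2, k_max=2)
print(fc[1], (1 + phi1) * alpha + (phi1**2 + phi2) * xT + phi1 * phi2 * xTm1)  # both 1.6
```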
Alternatively, the two-periods-ahead forecast (by the back-substitution method) can be derived as:

$$X_{T+2} = \alpha + \phi_1 X_{T+1} + \phi_2 X_T + Z_{T+2}$$
$$= \alpha + \phi_1(\alpha + \phi_1 X_T + \phi_2 X_{T-1} + Z_{T+1}) + \phi_2 X_T + Z_{T+2}$$
$$= (1 + \phi_1)\alpha + (\phi_1^2 + \phi_2)X_T + \phi_1\phi_2 X_{T-1} + \phi_1 Z_{T+1} + Z_{T+2}$$

Therefore

$$X_{T+2,T} = E(X_{T+2} \mid I_T) = (1 + \phi_1)\alpha + (\phi_1^2 + \phi_2)X_T + \phi_1\phi_2 X_{T-1}$$

$$e_{T+2,T} = X_{T+2} - X_{T+2,T} = \phi_1 Z_{T+1} + Z_{T+2}$$

The forecast error variance is:

$$Var(e_{T+2,T}) = Var(\phi_1 Z_{T+1} + Z_{T+2}) = (\phi_1^2 + 1)\sigma^2$$

What are the forecasts for more future periods? Please practice before the exam.

6. ARMA Process

Consider an ARMA(1,1) model,

$$X_t = \phi X_{t-1} + Z_t + \theta Z_{t-1}, \quad Z_t \sim WN(0, \sigma^2)$$

Again we can use the back-substitution method to compute point forecasts:

$$X_{T+1} = \phi X_T + Z_{T+1} + \theta Z_T$$
$$X_{T+2} = \phi X_{T+1} + Z_{T+2} + \theta Z_{T+1} = \phi(\phi X_T + Z_{T+1} + \theta Z_T) + Z_{T+2} + \theta Z_{T+1}$$

One-step-ahead forecast:

$$X_{T+1,T} = E(X_{T+1} \mid I_T) = \phi E(X_T \mid I_T) + E(Z_{T+1} \mid I_T) + \theta E(Z_T \mid I_T) = \phi X_T + \theta Z_T$$

Two-step-ahead forecast:

$$X_{T+2,T} = E(X_{T+2} \mid I_T) = \phi E(X_{T+1} \mid I_T) + E(Z_{T+2} \mid I_T) + \theta E(Z_{T+1} \mid I_T) = \phi E(X_{T+1} \mid I_T) = \phi(\phi X_T + \theta Z_T)$$

The two-step-ahead forecast error is

$$e_{T+2,T} = X_{T+2} - X_{T+2,T} = \phi Z_{T+1} + Z_{T+2} + \theta Z_{T+1} = (\phi + \theta)Z_{T+1} + Z_{T+2}$$

Again, since the forecast is unbiased, the minimum mean square error equals the forecast error variance:

$$E(X_{T+2} - X_{T+2,T})^2 = Var(X_{T+2} - X_{T+2,T}) = Var[(\phi + \theta)Z_{T+1} + Z_{T+2}] = [(\phi + \theta)^2 + 1]\sigma^2$$
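A minimal sketch (Python/NumPy; illustrative only, assuming $\phi$ and $\theta$ are known and $Z_T$ has already been recovered by the recursive method of Section 2) of the ARMA(1,1) forecasts. It makes the structure explicit: only the one-step forecast uses the MA term, and every later forecast is just $\phi$ times the previous one, exactly as derived above:

```python
import numpy as np

def arma11_forecasts(x_T, z_T, phi, theta, k_max):
    """MMSE forecasts for X_t = phi*X_{t-1} + Z_t + theta*Z_{t-1}."""
    fc = phi * x_T + theta * z_T  # X_{T+1,T}
    out = [fc]
    for _ in range(k_max - 1):
        fc = phi * fc             # X_{T+k,T} = phi * X_{T+k-1,T} for k >= 2
        out.append(fc)
    return np.array(out)

print(arma11_forecasts(x_T=1.0, z_T=0.3, phi=0.5, theta=0.4, k_max=3))
# [0.62, 0.31, 0.155]
```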
7. ARIMA Process

Suppose $X_t$ follows the ARIMA(1,1,0) model. That means its first difference $Y_t = \nabla X_t = X_t - X_{t-1}$ follows the ARMA(1,0) model, which is simply the AR(1) model:

$$Y_t = \phi Y_{t-1} + Z_t$$

Here we assume $Z_t \sim WN(0, \sigma_Z^2)$.

Note: We used $\sigma_Z^2$ instead of $\sigma^2$ so you can get familiar with different notations.

Substituting $Y_t = \nabla X_t = X_t - X_{t-1}$ and $Y_{t-1} = \nabla X_{t-1} = X_{t-1} - X_{t-2}$, we can write the original ARIMA(1,1,0) model as:

$$X_t - X_{t-1} = \phi(X_{t-1} - X_{t-2}) + Z_t$$

That is:

$$X_t = (1 + \phi)X_{t-1} - \phi X_{t-2} + Z_t$$

One-step-ahead forecast:

$$X_{T+1,T} = E[X_{T+1} \mid I_T] = E[(1 + \phi)X_T - \phi X_{T-1} + Z_{T+1} \mid I_T] = (1 + \phi)X_T - \phi X_{T-1}$$

Thus the forecast error is

$$e_{T+1,T} = X_{T+1} - X_{T+1,T} = (1 + \phi)X_T - \phi X_{T-1} + Z_{T+1} - [(1 + \phi)X_T - \phi X_{T-1}] = Z_{T+1}$$

and the variance of the forecast error is

$$Var(e_{T+1,T}) = Var(Z_{T+1}) = \sigma_Z^2$$

Two-step-ahead forecast:

$$X_{T+2} = (1 + \phi)X_{T+1} - \phi X_T + Z_{T+2} = (1 + \phi)[(1 + \phi)X_T - \phi X_{T-1} + Z_{T+1}] - \phi X_T + Z_{T+2}$$
$$= (1 + \phi + \phi^2)X_T - (\phi + \phi^2)X_{T-1} + Z_{T+2} + (1 + \phi)Z_{T+1}$$

Therefore

$$X_{T+2,T} = E[X_{T+2} \mid I_T] = (1 + \phi + \phi^2)X_T - (\phi + \phi^2)X_{T-1}$$

Thus the forecast error is

$$e_{T+2,T} = X_{T+2} - X_{T+2,T} = Z_{T+2} + (1 + \phi)Z_{T+1}$$

and the variance of the forecast error is

$$Var(e_{T+2,T}) = [1 + (1 + \phi)^2]\sigma_Z^2 = (2 + 2\phi + \phi^2)\sigma_Z^2$$
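A hedged numerical sketch (Python/NumPy; parameter values are illustrative) of the ARIMA(1,1,0) recursion. It also shows the equivalent route of forecasting the AR(1) differences $Y_{T+k,T} = \phi^k(X_T - X_{T-1})$ and cumulating them back onto the last observed level:

```python
import numpy as np

def arima110_forecasts(x_T, x_Tm1, phi, k_max):
    """Recursive MMSE forecasts for ARIMA(1,1,0), using the expanded form
    X_{T+k,T} = (1 + phi)*X_{T+k-1,T} - phi*X_{T+k-2,T}."""
    hist = [x_Tm1, x_T]
    for _ in range(k_max):
        hist.append((1 + phi) * hist[-1] - phi * hist[-2])
    return np.array(hist[2:])

phi, xT, xTm1 = 0.5, 3.0, 2.0
print(arima110_forecasts(xT, xTm1, phi, k_max=3))                # [3.5, 3.75, 3.875]
# Equivalent: forecast the differences, then cumulate onto the last level.
print(xT + np.cumsum(phi ** np.arange(1, 4) * (xT - xTm1)))      # same numbers
```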
8. Prediction (Forecast) Error and Prediction Interval

Recall that the k-step-ahead prediction error is defined as

$$e_{T+k,T} = X_{T+k} - E(X_{T+k} \mid I_T)$$

By the law of total variance,

$$Var(Y) = E[Var(Y \mid X)] + Var[E(Y \mid X)]$$

we can derive the variance of the prediction error:

$$Var(e_{T+k,T}) = Var(X_{T+k} \mid I_T)$$

Based on the predictor and its error variance, we may construct prediction intervals. For example, if the $X$'s are normally distributed, we can consider the 95% prediction interval

$$E(X_{T+k} \mid I_T) \pm 1.96\sqrt{Var(X_{T+k} \mid I_T)}$$
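In practice, software automates both the point forecasts and the prediction intervals. A sketch using statsmodels (assuming the package is installed; the simulated AR(1) series and its parameters are purely illustrative, and the fitted model includes a constant by default):

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(2)
phi, n = 0.7, 500

# Simulate an AR(1) series as an illustration.
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()

fit = ARIMA(x, order=(1, 0, 0)).fit()
fc = fit.get_forecast(steps=5)
print(fc.predicted_mean)        # point forecasts E(X_{T+k} | I_T)
print(fc.conf_int(alpha=0.05))  # 95% prediction intervals, widening with k
```

Note how the intervals widen with the horizon $k$, reflecting the growing forecast error variance derived in the earlier sections.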