Stochastic Modelling Solutions to Exercises on Time Series
Stochastic Modelling: Solutions to Exercises on Time Series

Dr. Iqbal Owadally, March 3, 2003

Solutions to Elementary Problems

Q1. (i) (1 - 0.5B)X_t = Z_t. The characteristic equation 1 - 0.5z = 0 does not have a unit root and the time series is ARIMA(1, 0, 0). The root of the characteristic equation is 2. Since |2| > 1, the time series is stationary. It is also (trivially) invertible.

(ii) (1 - 0.5B)X_t = (1 - 1.3B + 0.4B^2)Z_t = (1 - 0.8B)(1 - 0.5B)Z_t. The equation 1 - 0.5z = 0 does not have a unit root and the time series is ARIMA(1, 0, 2). The root of the characteristic equation is 2. Since |2| > 1, the time series is stationary. The roots of the equation (1 - 0.8z)(1 - 0.5z) = 0 are 1.25 and 2. Since |1.25| > 1 and |2| > 1, the time series is invertible.

(iii) (1 - 1.5B + 0.6B^2)X_t = Z_t. The characteristic equation does not have a unit root and the time series is ARIMA(2, 0, 0). The roots of the characteristic equation are (15 ± i sqrt(15))/12. Since |(15 ± i sqrt(15))/12|^2 = (225 + 15)/144 = 5/3 > 1, the time series is stationary. It is also (trivially) invertible.

(iv) The constant 0.2 causes only a translation of the mean level of the process and may be ignored here: let Y_t = X_t - 0.2 and represent the time series as (1 - B)(1 - 0.2B)Y_t = (1 - 0.5B)Z_t. The characteristic equation does have a unit root and the time series is ARIMA(1, 1, 1). The time series is non-stationary (being an I(1) process) but is invertible (the root of 1 - 0.5z = 0 being greater than 1 in magnitude).

Q2. These processes are stationary: EX_t and VarX_t are constant and Cov[X_t, X_{t-k}] depends on k and not on t. (i) EX_t + 0.5EX_{t-1} - 0.1EX_{t-2} = EZ_t = 0 and hence EX_t = 0 for all t, and ρ_k = ρ_{-k} = E[X_t X_{t-k}]/VarX_t. Furthermore, Z_t is independent of X_{t-k} for k ≥ 1, so that E[Z_t X_{t-k}] = EZ_t EX_{t-k} = 0. Multiplying X_t + 0.5X_{t-1} - 0.1X_{t-2} = Z_t across by X_{t-k} for k ≥ 1 gives X_t X_{t-k} + 0.5X_{t-1} X_{t-k} - 0.1X_{t-2} X_{t-k} = Z_t X_{t-k}.
Solutions to past exam questions are adapted from examiners' reports to the Exam Board of the Institute and Faculty of Actuaries and to the Exam Board, Faculty of Actuarial Science and Statistics, Cass Business School, City University. Contact details: Cass Building Room 5071, extension 8478, iqbal@city.ac.uk.
Taking expectations on both sides yields E[X_t X_{t-k}] + 0.5E[X_{t-1} X_{t-k}] - 0.1E[X_{t-2} X_{t-k}] = 0, and dividing by VarX_t on both sides gives ρ_k + 0.5ρ_{k-1} - 0.1ρ_{k-2} = 0. When k = 1, ρ_1 + 0.5ρ_0 - 0.1ρ_1 = 0, which solves to ρ_1 = -0.5/0.9 = -5/9 (noting that ρ_{-k} = ρ_k and ρ_0 = 1 by definition). When k = 2, ρ_2 + 0.5ρ_1 - 0.1ρ_0 = 0, yielding ρ_2 = 0.1 - 0.5(-5/9) = 17/45. When k = 3, ρ_3 + 0.5ρ_2 - 0.1ρ_1 = 0, yielding ρ_3 = 0.1(-5/9) - 0.5(17/45) = -11/45.

(ii) {X_t} in X_t + 0.6X_{t-2} = Z_t will clearly have zero autocorrelation at odd lags. ρ_k + 0.6ρ_{k-2} = 0 for k ≥ 1. When k = 1, ρ_1 + 0.6ρ_1 = 0 and ρ_1 = 0. When k = 2, ρ_2 + 0.6ρ_0 = 0 and ρ_2 = -0.6. When k = 3, ρ_3 + 0.6ρ_1 = 0 and ρ_3 = 0.

(iii) ρ_k - 1.1ρ_{k-1} + 0.18ρ_{k-2} = 0 for k ≥ 1. When k = 1, ρ_1 - 1.1ρ_0 + 0.18ρ_1 = 0 and ρ_1 = 1.1/1.18 = 55/59. When k = 2, ρ_2 - 1.1ρ_1 + 0.18ρ_0 = 0 and ρ_2 = 1.1(55/59) - 0.18 = 1247/1475. When k = 3, ρ_3 - 1.1ρ_2 + 0.18ρ_1 = 0 and ρ_3 = 1.1(1247/1475) - 0.18(55/59) = 5621/7375.

(iv) ρ_k + αρ_{k-1} + α^2 ρ_{k-2} + α^3 ρ_{k-3} = 0 for k ≥ 1. When k = 1, ρ_1(1 + α^2) + α + α^3 ρ_2 = 0. When k = 2, ρ_1(α + α^3) + α^2 + ρ_2 = 0. (Note in the above that ρ_0 = 1 and ρ_{-k} = ρ_k.) Hence ρ_1 = -α/(1 + α^2) and ρ_2 = 0. When k = 3, ρ_3 + αρ_2 + α^2 ρ_1 + α^3 ρ_0 = 0. Hence, ρ_3 = -α^5/(1 + α^2).

Q3. (i) Y_t = Z_t - βZ_{t-1}. VarY_t = VarZ_t + β^2 VarZ_{t-1} = σ_Z^2 (1 + β^2). Cov[Y_t, Y_{t-1}] = Cov[Z_t - βZ_{t-1}, Z_{t-1} - βZ_{t-2}] = -βσ_Z^2. Cov[Y_t, Y_{t-k}] = 0 for k ≥ 2. Since ρ_k = γ_k/γ_0, ρ_1 = -β/(1 + β^2) and ρ_k = 0 for k ≥ 2.

(ii) Y_t = Z_t + 2.4Z_{t-1} + 0.8Z_{t-2}. VarY_t = σ_Z^2 (1 + 2.4^2 + 0.8^2) = 7.4σ_Z^2. Cov[Y_t, Y_{t-1}] = Cov[Z_t + 2.4Z_{t-1} + 0.8Z_{t-2}, Z_{t-1} + 2.4Z_{t-2} + 0.8Z_{t-3}] = (2.4 + 2.4 × 0.8)σ_Z^2 = 4.32σ_Z^2. Cov[Y_t, Y_{t-2}] = 0.8σ_Z^2. Cov[Y_t, Y_{t-k}] = 0 for k ≥ 3. Since ρ_k = γ_k/γ_0, ρ_1 = 4.32/7.4 ≈ 0.584, ρ_2 = 0.8/7.4 ≈ 0.108 and ρ_k = 0 for k ≥ 3.

Q4. (i) The correlogram is a plot of the autocorrelation function against lag, i.e. ρ_k vs. k.
An MA(q) process has a correlogram with a cutoff at lag q, whereas a stationary AR process has a correlogram that tapers off gradually (possibly with some damped oscillations) to zero as the lag increases. Comments. Plotting the correlogram is therefore a way of distinguishing MA from AR time series, although in practice estimation errors may mean that it is not so easy to discern a difference.
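The contrast just described, and the Yule-Walker recursions of Q2 behind it, can be checked with a short computation. The sketch below is illustrative and not part of the original solutions; it uses exact rational arithmetic, the AR(2) model of Q2(i), and the MA(1) of Q3(i) with the arbitrary value β = 1/2:

```python
from fractions import Fraction

def ar2_acf(n):
    """Theoretical ACF of X_t + 0.5 X_{t-1} - 0.1 X_{t-2} = Z_t via the
    Yule-Walker recursion rho_k + 0.5 rho_{k-1} - 0.1 rho_{k-2} = 0, k >= 1,
    with rho_0 = 1 and rho_{-k} = rho_k."""
    rho = [Fraction(1)]
    # k = 1 uses rho_{-1} = rho_1:  rho_1 + 0.5 rho_0 - 0.1 rho_1 = 0.
    rho.append(Fraction(-1, 2) / (1 - Fraction(1, 10)))
    for k in range(2, n + 1):
        rho.append(-Fraction(1, 2) * rho[k - 1] + Fraction(1, 10) * rho[k - 2])
    return rho

acf = ar2_acf(3)
print(acf[1], acf[2], acf[3])    # -5/9 17/45 -11/45: a gradual, oscillating taper

# MA(1) Y_t = Z_t - beta Z_{t-1}: the correlogram cuts off sharply after lag 1.
beta = Fraction(1, 2)            # illustrative value, not from the exercises
ma_acf = [Fraction(1), -beta / (1 + beta**2), Fraction(0), Fraction(0)]
print(ma_acf[1])                 # -2/5; every later lag is exactly zero
```

The recursion reproduces the ρ_1, ρ_2, ρ_3 computed in Q2(i) exactly, while the MA autocorrelations vanish beyond lag q, which is the cutoff behaviour used above to tell the two model classes apart.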
(ii) The process is assumed to be stationary and we can proceed to find the autocorrelation function by using the usual method for stationary time series. The time series is shifted by µ and EX_t = µ for all t. Let Y_t = X_t - µ. Then EY_t = 0 for all t. The autocovariance function is γ_k = Cov[X_t, X_{t-k}] = Cov[Y_t, Y_{t-k}] = E[Y_t Y_{t-k}]. Recall that γ_{-k} = γ_k. Also, EZ_t = 0 and VarZ_t = σ_Z^2 for all t. Multiply Y_t - αY_{t-1} = Z_t - βZ_{t-1} by Y_{t-k} on both sides and take expectations to obtain γ_k - αγ_{k-1} = E[Y_{t-k} Z_t] - βE[Y_{t-k} Z_{t-1}]. Observe that Z_t is statistically independent of Y_{t-k}, k ≥ 1. Therefore,

when k = 0, γ_0 - αγ_1 = E[Y_t Z_t] - βE[Y_t Z_{t-1}],   (1)
when k = 1, γ_1 - αγ_0 = -βE[Y_{t-1} Z_{t-1}],   (2)
when k > 1, γ_k - αγ_{k-1} = 0.   (3)

The non-zero cross-covariances involving Y_t and Z_t are easily found. Multiply Y_t - αY_{t-1} = Z_t - βZ_{t-1} by Z_t on both sides and take expectations to obtain E[Y_t Z_t] = σ_Z^2, noting that {Z_t} is a sequence of independent and identically distributed random variables. Also, multiply Y_t - αY_{t-1} = Z_t - βZ_{t-1} by Z_{t-1} on both sides and take expectations to obtain E[Y_t Z_{t-1}] - αE[Y_{t-1} Z_{t-1}] = -βσ_Z^2. Hence, E[Y_t Z_{t-1}] = σ_Z^2 (α - β). Equation (1) may be simplified to γ_0 - αγ_1 = σ_Z^2 [1 - αβ + β^2] and equation (2) simplifies to γ_1 - αγ_0 = -βσ_Z^2. Solving these two equations simultaneously yields:

γ_0 = σ_Z^2 (1 - 2αβ + β^2)/(1 - α^2),   γ_1 = σ_Z^2 (α - β)(1 - αβ)/(1 - α^2).

Finally, equation (3) solves to γ_k = α^{k-1} γ_1, for k > 1. Since ρ_k = γ_k/γ_0 by definition,

ρ_k = 1 for k = 0,
ρ_k = (α - β)(1 - αβ)/(1 - 2αβ + β^2) for k = 1,
ρ_k = α^{k-1} ρ_1 for k > 1.

(iii) The correlogram of the ARMA(1, 1) time series has a kink at lag 1. At lag 1, both the MA and AR components affect the correlogram. From lag 2 onwards, it looks rather like the correlogram of an AR series and it decays exponentially.

Q5. Comments. If the process were known (or assumed) to be stationary over time, then EX_t = 0 and ρ_k = 0.4^k, using the properties of a stationary AR(1) process.
But the process has a finite history (with a given initial condition) and so it will be non-stationary for t < ∞.

Solution. (i) In terms of the backward shift operator, X_t = Z_t/(1 - 0.4B) = Z_t + 0.4Z_{t-1} + 0.4^2 Z_{t-2} + ⋯
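As an aside (not part of the original solution), the geometric expansion of 1/(1 - 0.4B) can be checked numerically: running the recursion X_t = 0.4X_{t-1} + Z_t forward reproduces the truncated sum Σ_j 0.4^j Z_{t-j}. A minimal sketch with an arbitrary simulated innovation sequence:

```python
import random

random.seed(0)
z = [random.gauss(0, 1) for _ in range(200)]   # illustrative Z_1, ..., Z_200

# Recursive (AR) form: X_t = 0.4 X_{t-1} + Z_t, started from X_0 = 0.
x = 0.0
for zt in z:
    x = 0.4 * x + zt

# MA(infinity) form: X_t = sum_j 0.4^j Z_{t-j}, truncated at the start.
x_ma = sum(0.4**j * z[-1 - j] for j in range(len(z)))

print(abs(x - x_ma) < 1e-12)   # True: the two representations agree
```

Because the weights 0.4^j decay geometrically, the truncation at the start of the sample is numerically irrelevant after a moderate number of steps, which is why the two forms coincide to machine precision.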
The process is arbitrarily started at t = 0 with X_0 = x_0 and X_j = 0 for j ≤ -1. Hence, Z_0 = x_0 and Z_j = 0 for j ≤ -1. Therefore, X_t = Σ_{j=0}^{t-1} 0.4^j Z_{t-j} + 0.4^t x_0. {Z_t} is a sequence of zero-mean random variables. Hence, EX_t = 0.4^t x_0 for t ≥ 0.

(ii) Note that X_t - EX_t = Σ_{j=0}^{t-1} 0.4^j Z_{t-j}, that X_{t+k} - EX_{t+k} = Σ_{j=0}^{t+k-1} 0.4^j Z_{t+k-j} = Σ_{j=-k}^{t-1} 0.4^{j+k} Z_{t-j}, and also that Cov[X_t, X_{t+k}] = E{(X_t - EX_t)(X_{t+k} - EX_{t+k})}. For k ≥ 0,

Cov[X_t, X_{t+k}] = E[ (Σ_{j=0}^{t-1} 0.4^j Z_{t-j}) (Σ_{j=-k}^{t-1} 0.4^{j+k} Z_{t-j}) ] = σ_Z^2 Σ_{j=0}^{t-1} 0.4^j 0.4^{j+k}

(where σ_Z^2 = VarZ_t) because {Z_t} is a sequence of zero-mean independent and identically distributed random variables. Hence, for k ≥ 0 and t ≥ 0,

Cov[X_t, X_{t+k}] = σ_Z^2 0.4^k (1 - 0.4^{2t})/(1 - 0.4^2).   (4)

(iii) The process is not stationary for t < ∞, as both EX_t and Cov[X_t, X_{t+k}] vary with t.

(iv) The process is stationary in the limit, since the magnitude of the root of the characteristic equation 1 - 0.4z = 0 is 2.5 and 2.5 > 1. As t → ∞, EX_t = 0.4^t x_0 → 0, Cov[X_t, X_{t+k}] → σ_Z^2 0.4^k/(1 - 0.4^2) and VarX_t → σ_Z^2/(1 - 0.4^2). In the limit, the autocorrelation coefficient is ρ_k = Cov[X_t, X_{t-k}]/VarX_t = 0.4^k.

(v) Suppose that the initial value x_0 at time 0 is itself a random variate that is normally distributed with zero mean and variance σ_Z^2/(1 - 0.4^2). This distribution is the same as that of X̃_t for all t in the stationary AR(1) process {X̃_t} given by X̃_t = 0.4X̃_{t-1} + Z_t. One may therefore choose to put x_0 = X̃_0 = Σ_{j=0}^∞ 0.4^j Z_{-j}. Since X_t = Σ_{j=0}^{t-1} 0.4^j Z_{t-j} + 0.4^t x_0, it follows that X_t = Σ_{j=0}^{t-1} 0.4^j Z_{t-j} + 0.4^t Σ_{j=0}^∞ 0.4^j Z_{-j} = Σ_{j=0}^∞ 0.4^j Z_{t-j}. In other words, an initial condition that is random and taken from N(0, σ_Z^2/(1 - 0.4^2)) yields an AR(1) process that is stationary ab initio.
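The variance implied by equation (4) at k = 0 can be verified against the one-step variance recursion VarX_t = 0.4^2 VarX_{t-1} + σ_Z^2, with VarX_0 = 0 because x_0 is a fixed number. A short illustrative check (σ_Z^2 = 1 is an arbitrary choice):

```python
sigma2 = 1.0                 # sigma_Z^2, chosen arbitrarily for illustration
v = 0.0                      # Var X_0 = 0: the start x_0 is deterministic
for t in range(1, 11):
    v = 0.16 * v + sigma2    # Var X_t = 0.4^2 Var X_{t-1} + sigma_Z^2
    formula = sigma2 * (1 - 0.4 ** (2 * t)) / (1 - 0.4 ** 2)
    assert abs(v - formula) < 1e-12   # matches equation (4) with k = 0

print(v, sigma2 / (1 - 0.16))   # by t = 10 already close to the limit in (iv)
```

The recursion also makes the limiting behaviour of part (iv) visible: the term 0.4^{2t} dies away and the variance settles at σ_Z^2/(1 - 0.4^2).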
Q6. (i) EW_t = EX_t + EY_t is constant over time. The autocovariance of the process {W_t} is Cov[W_t, W_{t-k}] = Cov[X_t + Y_t, X_{t-k} + Y_{t-k}] = Cov[X_t, X_{t-k}] + Cov[Y_t, Y_{t-k}] and depends on the lag k and not on the time t.

(ii) X_t = ln CPI_t - ln CPI_{t-1} and hence ln CPI_t = Σ_{j=-∞}^t X_j. Since X_t is an AR(1) process, it is clear that ln CPI_t is integrated once and is ARIMA(1, 1, 0). Alternatively, note that (1 - B)(1 - αB) ln CPI_t = Z_t + µ(1 - α) and a unit root occurs.

Solutions to Past Exam Questions

Q1. (i) (a) γ_0 = Var(e_t + β_1 e_{t-1}) = (1 + β_1^2)σ_e^2 and γ_1 = Cov[e_t + β_1 e_{t-1}, e_{t-1} + β_1 e_{t-2}] = β_1 σ_e^2, with γ_k = 0 for k > 1. This gives ρ_0 = 1, ρ_1 = β_1/(1 + β_1^2), ρ_k = 0 otherwise. (b) Invertibility requires that |β_1| < 1, so that the sum X_t - β_1 X_{t-1} + β_1^2 X_{t-2} - ⋯ converges. µ and σ_e are irrelevant.

(ii) We need to solve (1 + β_1^2)σ_e^2 = 14.5, β_1 σ_e^2 = 5.0. Eliminating σ_e^2, we have 1 + β_1^2 = 2.9β_1, or β_1 = (2.9 ± sqrt(2.9^2 - 4))/2 = 2.5 or 0.4. β_1 = 2.5 corresponds to σ_e^2 = 2, whereas β_1 = 0.4 corresponds to σ_e^2 = 12.5. For invertibility, solve 1 + β_1 z = 0. In the first case, |z| = 0.4 (no good); in the second, |z| = 2.5 (OK).

Q2. (i) (a) γ_1 = Cov[X_t, X_{t-1}] = Cov[α_1 X_{t-1} + α_2 X_{t-2} + e_t, X_{t-1}] = α_1 γ_0 + α_2 γ_1 + 0, since e_t is independent of X_{t-1}. (b) Similarly γ_2 = α_1 γ_1 + α_2 γ_0 and γ_0 = α_1 γ_1 + α_2 γ_2 + Cov[X_t, e_t]. A further application of the same technique gives Cov[X_t, e_t] = σ_e^2. Thus,

γ_1 = α_1 γ_0/(1 - α_2),   γ_2 = (α_2 + α_1^2/(1 - α_2)) γ_0.

(c) ρ_k is found by the relation ρ_k = γ_k/γ_0.

(ii) We have α̂_1 = r_1(1 - α̂_2) and α̂_2 + α̂_1^2/(1 - α̂_2) = r_2, which are solved by

α̂_1 = r_1(1 - r_2)/(1 - r_1^2),   α̂_2 = (r_2 - r_1^2)/(1 - r_1^2).

Q3. (i) γ_1 = 0.8γ_0 - 0.4γ_1 and γ_2 = 0.8γ_1 - 0.4γ_0. Hence, γ_1 = (4/7)γ_0 and γ_2 = (2/35)γ_0, implying that ρ_1 = 4/7 and ρ_2 = 2/35. φ_1 = ρ_1 = 4/7 and φ_2 = (ρ_2 - ρ_1^2)/(1 - ρ_1^2) = -0.4.

(ii) The acf will reduce to zero as k increases; the pacf, however, will be equal to zero for all k > 2.
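The numbers in Q2(ii) and Q3 can be cross-checked with exact arithmetic. The sketch below (not part of the examiners' solution) feeds the theoretical ρ_1 = 4/7, ρ_2 = 2/35 of Q3 into the moment estimators of Q2(ii) and recovers the AR(2) coefficients 0.8 and -0.4:

```python
from fractions import Fraction

# Q3: the AR(2) X_t = 0.8 X_{t-1} - 0.4 X_{t-2} + e_t has rho_1 = 4/7, rho_2 = 2/35.
r1, r2 = Fraction(4, 7), Fraction(2, 35)

# Q2(ii): method-of-moments estimates from the first two sample autocorrelations.
a1 = r1 * (1 - r2) / (1 - r1**2)
a2 = (r2 - r1**2) / (1 - r1**2)
print(a1, a2)        # 4/5 -2/5, i.e. alpha_1 = 0.8 and alpha_2 = -0.4

# Q3: the lag-2 partial autocorrelation equals alpha_2 for an AR(2).
phi2 = (r2 - r1**2) / (1 - r1**2)
print(phi2)          # -2/5
```

That φ_2 coincides with the estimator for α_2 is no accident: for an AR(2), the partial autocorrelation at lag 2 is exactly the second autoregressive coefficient.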
Q4. (i) Identification: determination of the parameters p, d and q in the ARIMA(p, d, q) model. Estimation: determination of the p AR parameters α_1, α_2, α_3, ..., α_p and the q MA parameters β_1, β_2, β_3, ..., β_q in the stationary ARMA(p, q) model for the dth differences of the observed series. Diagnosis: testing the goodness of fit of the proposed model.

(ii) The optimal value of d is the smallest value that produces a stationary series. Three criteria used are: (a) the sample acf should tend rapidly to zero; (b) the sample variance should be minimised; (c) the series should exhibit a constant mean.

(iii) Parsimony is using the lowest values for the parameters p, d and q which adequately model the observed series, i.e. additional parameters are only added if they significantly improve the fit of the model.

(iv) Q_10 = N Σ_{k=1}^{10} r_k^2 = 9.07. Under the null hypothesis of independence of the residuals, Q_10 has a χ^2 distribution with 10 - 1 = 9 degrees of freedom. From tables, the 95th percentile of χ^2_9 is 16.92. Hence, accept H_0.

(v) Let M = number of turning points = 48. Let N = number of residuals = 78. Under the null hypothesis of independence of the residuals, M is approximately N(2(N - 2)/3, (16N - 29)/90). Then, M ≈ N(152/3, 1219/90).

P(M ≤ 48) = P(Z ≤ (48.5 - 152/3)/sqrt(1219/90)) = P(Z ≤ -0.59) = 1 - Φ(0.59) = 0.2776,

so accept H_0 at the 5% level. (Exam Board, Faculty of Actuarial Science and Statistics, Cass Business School, City University)

Q5. (i) Consumer prices do tend to exhibit regular seasonal variation, though not a great deal these days. And, since prices tend to go up rather more than they come down, it is probably worth including a trend term in any model. It is certainly possible to test whether the trend term is equal to zero.

(ii) (a) X_{n+1} - x_n = α(x_n - x_{n-1}) + e_{n+1} + βe_n. (b) The parameters are α, β and σ_e^2. The trend removal process would have accounted for any µ parameter.

(iii) x̂_n(1) = E[X_{n+1} | x_n, ..., x_1] = x_n + α(x_n - x_{n-1}) + E[e_{n+1} + βe_n | x_n, ..., x_1].
Now e_{n+1} has mean 0 and is conventionally supposed independent of everything that happens before n. On the other hand, e_n can be deduced from past data, e.g. e_n = x_n - x_{n-1} - α(x_{n-1} - x_{n-2}) - βe_{n-1}, which may be iterated back to get e_n in terms of the known x's and the known e_0. Thus, x̂_n(1) = x_n + α(x_n - x_{n-1}) + βe_n. Similarly,

x̂_n(2) = E[X_{n+2} | F_n] = E[X_{n+1} + α(X_{n+1} - x_n) + e_{n+2} + βe_{n+1} | F_n] = (1 + α)x̂_n(1) - αx_n.
We see that X_{n+1} - x̂_n(1) = e_{n+1}, so that the prediction variance is just Var(e_{n+1}) = σ_e^2.

(iv) Since e_n = x_n - x̂_{n-1}(1), we have x̂_n(1) = x_n + α(x_n - x_{n-1}) + β(x_n - x̂_{n-1}(1)). If we set α = 0 and β = -ξ, the equation is identical to the updating equation for exponential smoothing.

(v) An ARIMA(p, d, q) model is I(d); in this case, x is I(1). A stationary (I(0)) model has an equilibrium distribution: the distribution of the forecast of X_{n+k} would converge to the equilibrium for large k. An I(1) process is the partial sum of an I(0) process, so would have increasing variance, even if the mean happened to be stable.

(vi) Two series {x} and {y} are cointegrated if both are I(1) but there are some constants a and b such that {ax + by} is stationary. Two processes are likely to be cointegrated if one drives the other, or if both are driven by the same underlying process. In the given instance the suggestion is certainly worth investigating.
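The forecast recursion of Q5(iii) and (iv) can be sketched in code. Everything below is illustrative: the function name, the starting residual (taken as 0, matching the convention of a known e_0), and the data are assumptions, not part of the exam solution:

```python
def one_step_forecasts(x, alpha, beta):
    """One-step forecasts for the ARIMA(1,1,1) of Q5:
        xhat_n(1) = x_n + alpha*(x_n - x_{n-1}) + beta*e_n,
    with residuals reconstructed as e_{n+1} = x_{n+1} - xhat_n(1)."""
    e = 0.0                              # assumed starting residual
    forecasts = []
    for n in range(1, len(x)):
        xhat = x[n] + alpha * (x[n] - x[n - 1]) + beta * e
        forecasts.append(xhat)           # xhat_n(1), the forecast of x_{n+1}
        if n + 1 < len(x):
            e = x[n + 1] - xhat          # e_{n+1} = x_{n+1} - xhat_n(1)
    return forecasts

# Illustrative data and parameter values only:
x = [100.0, 101.0, 102.5, 102.0, 103.1]
print(one_step_forecasts(x, alpha=0.3, beta=0.2))
```

Setting α = 0 collapses the recursion to the exponential-smoothing update discussed in part (iv), and with α = β = 0 it reduces to the random-walk forecast x̂_n(1) = x_n.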
More informationSOME BASICS OF TIME-SERIES ANALYSIS
SOME BASICS OF TIME-SERIES ANALYSIS John E. Floyd University of Toronto December 8, 26 An excellent place to learn about time series analysis is from Walter Enders textbook. For a basic understanding of
More informationSTAD57 Time Series Analysis. Lecture 8
STAD57 Time Series Analysis Lecture 8 1 ARMA Model Will be using ARMA models to describe times series dynamics: ( B) X ( B) W X X X W W W t 1 t1 p t p t 1 t1 q tq Model must be causal (i.e. stationary)
More informationinterval forecasting
Interval Forecasting Based on Chapter 7 of the Time Series Forecasting by Chatfield Econometric Forecasting, January 2008 Outline 1 2 3 4 5 Terminology Interval Forecasts Density Forecast Fan Chart Most
More informationA SARIMAX coupled modelling applied to individual load curves intraday forecasting
A SARIMAX coupled modelling applied to individual load curves intraday forecasting Frédéric Proïa Workshop EDF Institut Henri Poincaré - Paris 05 avril 2012 INRIA Bordeaux Sud-Ouest Institut de Mathématiques
More informationTime Series Analysis -- An Introduction -- AMS 586
Time Series Analysis -- An Introduction -- AMS 586 1 Objectives of time series analysis Data description Data interpretation Modeling Control Prediction & Forecasting 2 Time-Series Data Numerical data
More information1 Linear Difference Equations
ARMA Handout Jialin Yu 1 Linear Difference Equations First order systems Let {ε t } t=1 denote an input sequence and {y t} t=1 sequence generated by denote an output y t = φy t 1 + ε t t = 1, 2,... with
More informationProblem Set 2: Box-Jenkins methodology
Problem Set : Box-Jenkins methodology 1) For an AR1) process we have: γ0) = σ ε 1 φ σ ε γ0) = 1 φ Hence, For a MA1) process, p lim R = φ γ0) = 1 + θ )σ ε σ ε 1 = γ0) 1 + θ Therefore, p lim R = 1 1 1 +
More informationEstimating AR/MA models
September 17, 2009 Goals The likelihood estimation of AR/MA models AR(1) MA(1) Inference Model specification for a given dataset Why MLE? Traditional linear statistics is one methodology of estimating
More informationPart 1. Multiple Choice (50 questions, 1 point each) Part 2. Problems/Short Answer (10 questions, 5 points each)
GROUND RULES: This exam contains two parts: Part 1. Multiple Choice (50 questions, 1 point each) Part 2. Problems/Short Answer (10 questions, 5 points each) The maximum number of points on this exam is
More information7. Forecasting with ARIMA models
7. Forecasting with ARIMA models 309 Outline: Introduction The prediction equation of an ARIMA model Interpreting the predictions Variance of the predictions Forecast updating Measuring predictability
More informationdistributed approximately according to white noise. Likewise, for general ARMA(p,q), the residuals can be expressed as
library(forecast) log_ap
More informationFinQuiz Notes
Reading 9 A time series is any series of data that varies over time e.g. the quarterly sales for a company during the past five years or daily returns of a security. When assumptions of the regression
More informationFORECASTING SUGARCANE PRODUCTION IN INDIA WITH ARIMA MODEL
FORECASTING SUGARCANE PRODUCTION IN INDIA WITH ARIMA MODEL B. N. MANDAL Abstract: Yearly sugarcane production data for the period of - to - of India were analyzed by time-series methods. Autocorrelation
More informationSimple Descriptive Techniques
Simple Descriptive Techniques Outline 1 Types of variation 2 Stationary Time Series 3 The Time Plot 4 Transformations 5 Analysing Series that Contain a Trend 6 Analysing Series that Contain Seasonal Variation
More informationBooth School of Business, University of Chicago Business 41914, Spring Quarter 2017, Mr. Ruey S. Tsay. Solutions to Midterm
Booth School of Business, University of Chicago Business 41914, Spring Quarter 017, Mr Ruey S Tsay Solutions to Midterm Problem A: (51 points; 3 points per question) Answer briefly the following questions
More informationChapter 8: Model Diagnostics
Chapter 8: Model Diagnostics Model diagnostics involve checking how well the model fits. If the model fits poorly, we consider changing the specification of the model. A major tool of model diagnostics
More informationTime Series Econometrics 4 Vijayamohanan Pillai N
Time Series Econometrics 4 Vijayamohanan Pillai N Vijayamohan: CDS MPhil: Time Series 5 1 Autoregressive Moving Average Process: ARMA(p, q) Vijayamohan: CDS MPhil: Time Series 5 2 1 Autoregressive Moving
More information3. ARMA Modeling. Now: Important class of stationary processes
3. ARMA Modeling Now: Important class of stationary processes Definition 3.1: (ARMA(p, q) process) Let {ɛ t } t Z WN(0, σ 2 ) be a white noise process. The process {X t } t Z is called AutoRegressive-Moving-Average
More informationModelling using ARMA processes
Modelling using ARMA processes Step 1. ARMA model identification; Step 2. ARMA parameter estimation Step 3. ARMA model selection ; Step 4. ARMA model checking; Step 5. forecasting from ARMA models. 33
More informationProblem Set 2 Solution Sketches Time Series Analysis Spring 2010
Problem Set 2 Solution Sketches Time Series Analysis Spring 2010 Forecasting 1. Let X and Y be two random variables such that E(X 2 ) < and E(Y 2 )
More informationForecasting using R. Rob J Hyndman. 2.5 Seasonal ARIMA models. Forecasting using R 1
Forecasting using R Rob J Hyndman 2.5 Seasonal ARIMA models Forecasting using R 1 Outline 1 Backshift notation reviewed 2 Seasonal ARIMA models 3 ARIMA vs ETS 4 Lab session 12 Forecasting using R Backshift
More informationLecture 1: Fundamental concepts in Time Series Analysis (part 2)
Lecture 1: Fundamental concepts in Time Series Analysis (part 2) Florian Pelgrin University of Lausanne, École des HEC Department of mathematics (IMEA-Nice) Sept. 2011 - Jan. 2012 Florian Pelgrin (HEC)
More informationEcon 623 Econometrics II Topic 2: Stationary Time Series
1 Introduction Econ 623 Econometrics II Topic 2: Stationary Time Series In the regression model we can model the error term as an autoregression AR(1) process. That is, we can use the past value of the
More informationPart II. Time Series
Part II Time Series 12 Introduction This Part is mainly a summary of the book of Brockwell and Davis (2002). Additionally the textbook Shumway and Stoffer (2010) can be recommended. 1 Our purpose is to
More informationChapter 3. ARIMA Models. 3.1 Autoregressive Moving Average Models
Chapter 3 ARIMA Models Classical regression is often insu cient for explaining all of the interesting dynamics of a time series. For example, the ACF of the residuals of the simple linear regression fit
More informationFinal Examination 7/6/2011
The Islamic University of Gaza Faculty of Commerce Department of Economics & Applied Statistics Time Series Analysis - Dr. Samir Safi Spring Semester 211 Final Examination 7/6/211 Name: ID: INSTRUCTIONS:
More information