ARCH and GARCH Processes


Chapter 8

ARCH and GARCH Processes

8.1 Introduction

When modelling time series, there are broadly speaking two approaches: the fundamentalist and the data analyst. The first aims to construct a model based on the fundamental principles of the situation and then uses data to estimate the parameters, so that the model may be used for forecasting and prediction. The second makes no attempt to search for underlying principles and simply tries to find a model that fits the available data. The ARCH and GARCH processes, motivated by the observed volatility of financial series, fall firmly into the second category.

The ARCH and GARCH processes are designed for situations where, after fitting a stationary time series model, the residuals satisfy {X_t} ~ WN(0, \sigma^2) but do not behave like IID(0, \sigma^2), because the volatility appears to vary over time, with bursts of high volatility. More information on the nature of the residuals can lead to more accurate prediction; the ARCH / GARCH processes are one attempt to model this variable volatility.

Let {X_t : t \in \mathbb{Z}} be a stationary, mean zero time series; that is, the mean and any trend have already been removed. Suppose that the {X_t} appear uncorrelated, but the WN(0, \sigma^2) assumption does not appear justified by a time series plot: the noise level is not constant. One approach is to model this by processes of the form

    X_t = \sigma_t Z_t,    {Z_t} ~ IID N(0,1),    \sigma_t \in F^{(Z)}_{t-1},        (8.1)

where F^{(Z)}_{t-1} denotes the collection {Z_s : s \le t-1} and \sigma_t \in F^{(Z)}_{t-1} denotes that \sigma_t is a function of these random variables.

8.2 The ARCH Process

A prominent model of this class is the ARCH process, introduced by R.F. Engle in [4] (1982). The abbreviation ARCH stands for autoregressive conditional heteroscedasticity.

Definition 8.1. The process {X_t : t \in \mathbb{Z}} is said to be an ARCH(p) process if it is stationary and

    X_t = \sigma_t Z_t,    {Z_t} ~ IID N(0,1),
    \sigma_t^2 = \alpha_0 + \alpha_1 X_{t-1}^2 + ... + \alpha_p X_{t-p}^2,        (8.2)
    \alpha_0 > 0,    \alpha_j \ge 0,  j = 1,...,p.

The requirements \alpha_0 > 0 and \alpha_j \ge 0 for j = 1,...,p ensure that \sigma_t > 0. The coefficients \alpha_0, \alpha_1, ..., \alpha_p have to be chosen such that the process {X_t : t \in \mathbb{Z}} defined in this way is stationary.

The following lemma, which holds for all processes of the form given by Equation (8.1), helps to establish when {X_t^2} may be represented as a stationary process with WN(0, \tilde{\sigma}^2) innovations, for some well defined \tilde{\sigma}^2 < +\infty.

Lemma 8.2. Let {X_t} be a stationary process satisfying Equation (8.1). Let

    \tilde{Z}_t = X_t^2 - \sigma_t^2 = \sigma_t^2 (Z_t^2 - 1).        (8.3)

Assume that {\sigma_t : t \in \mathbb{Z}} is a stationary process satisfying E[\sigma_t^4] < +\infty. Then {\tilde{Z}_t : t \in \mathbb{Z}} is WN(0, \tilde{\sigma}^2), where \tilde{\sigma}^2 = 2 E[\sigma_t^4].

Proof. By construction, since \sigma_t is independent of Z_t,

    E[\tilde{Z}_t] = E[\sigma_t^2 (Z_t^2 - 1)] = E[\sigma_t^2] (E[Z_t^2] - 1) = 0

and

    E[\tilde{Z}_t^2] = E[\sigma_t^4] E[Z_t^4 - 2 Z_t^2 + 1] = 2 E[\sigma_t^4] = \tilde{\sigma}^2,

and the proof is complete.

Lemma 8.3. Let {X_t : t \in \mathbb{Z}} be an ARCH(p) process defined by Equation (8.2) satisfying E[X_t^4] < +\infty. Let \tilde{Z}_t be defined by Equation (8.3). Then

    X_t^2 - \sum_{j=1}^p \alpha_j X_{t-j}^2 = \alpha_0 + \tilde{Z}_t.

If {X_t} is defined by Equation (8.2), it follows that {X_t^2} is an AR(p) process with mean

    E[X_t^2] = \mu = \frac{\alpha_0}{1 - \sum_{j=1}^p \alpha_j}

if and only if

1. \sum_{j=1}^p \alpha_j < 1, and
2. E[X_t^4] < +\infty.

Proof. From Equation (8.3),

    \tilde{Z}_t = X_t^2 - \sigma_t^2 = X_t^2 - \alpha_0 - \sum_{j=1}^p \alpha_j X_{t-j}^2,

giving

    X_t^2 - \sum_{j=1}^p \alpha_j X_{t-j}^2 = \alpha_0 + \tilde{Z}_t.

This has the form of an AR(p) process. Taking expectations gives

    \mu \Big(1 - \sum_{j=1}^p \alpha_j\Big) = \alpha_0,

from which it follows that

    \mu = \frac{\alpha_0}{1 - \sum_{j=1}^p \alpha_j}.

It remains to show that E[\tilde{Z}_t^2] < +\infty if and only if E[X_t^4] < +\infty. Writing X_t^2 - \sum_j \alpha_j X_{t-j}^2 = (1 - \sum_j \alpha_j) X_t^2 + \sum_j \alpha_j (X_t^2 - X_{t-j}^2), this follows directly from:

    E[(\tilde{Z}_t + \alpha_0)^2]
      = \Big(1 - \sum_{j=1}^p \alpha_j\Big)^2 E[X_t^4]
        + 2 \sum_{j=1}^p \alpha_j \Big(1 - \sum_{k=1}^p \alpha_k\Big) E[X_t^2 (X_t^2 - X_{t-j}^2)]
        + \sum_{j=1}^p \sum_{k=1}^p \alpha_j \alpha_k E[(X_t^2 - X_{t-j}^2)(X_t^2 - X_{t-k}^2)]
      = I + II + III.

Firstly, E[\tilde{Z}_t^2] < +\infty if and only if E[(\tilde{Z}_t + \alpha_0)^2] < +\infty, and each of the three terms is non-negative. For the first,

    I = c^2 E[X_t^4],    c = 1 - \sum_{j=1}^p \alpha_j > 0.

Since E[X_t^2 X_{t-j}^2] \le E[X_t^4] by Hölder's inequality together with stationarity, it follows that

    0 \le II \le 2 \Big(1 - \sum_{j=1}^p \alpha_j\Big) \Big(\sum_{j=1}^p \alpha_j\Big) E[X_t^4] = 2 c (1 - c) E[X_t^4].

Finally, let \gamma(j,k) = E[(X_t^2 - X_{t-j}^2)(X_t^2 - X_{t-k}^2)]; since the variables X_t^2 - X_{t-j}^2 have mean zero by stationarity, \gamma is a covariance matrix and hence non-negative definite, so that III = \alpha^t \gamma \alpha \ge 0. Moreover |\gamma(j,k)| \le 4 E[X_t^4], so that

    0 \le III \le 4 (1 - c)^2 E[X_t^4].

The if and only if condition now follows directly.

Although it is the process {X_t} that is of interest, the process {X_t^2}, and its representation as an AR(p) process, is very useful for parameter estimation. Most statistical tests rely on a central limit theorem effect, which requires the innovations to have a well defined finite variance.
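Before looking at specific parameter ranges, it may help to see the AR structure of the squared series on simulated data. The following R sketch (not part of the original notes; the parameter values are chosen purely for illustration) simulates an ARCH(2) process directly from Definition 8.1 and inspects the acf of X_t and the pacf of X_t^2.

set.seed(1)
n      <- 5000
alpha0 <- 0.1
alpha  <- c(0.3, 0.2)                      # alpha_1 + alpha_2 < 1, illustrative values
z      <- rnorm(n)
x      <- numeric(n)
sigma2 <- numeric(n)
sigma2[1:2] <- alpha0 / (1 - sum(alpha))   # start at the unconditional variance
x[1:2]      <- sqrt(sigma2[1:2]) * z[1:2]
for (t in 3:n) {
  sigma2[t] <- alpha0 + alpha[1] * x[t-1]^2 + alpha[2] * x[t-2]^2
  x[t]      <- sqrt(sigma2[t]) * z[t]
}
acf(x)       # the series itself looks like white noise
pacf(x^2)    # the squared series shows AR(2)-type dependence at lags 1 and 2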

For general models, there may be difficulties establishing conditions on the coefficients so that the innovations for the AR(p) process corresponding to {X_t^2} are WN(0, \tilde{\sigma}^2) for a well defined \tilde{\sigma}^2 < +\infty. The following example establishes the parameter range for the ARCH(1) process.

Example 8.1 (ARCH(1) process). Let {X_t} satisfy

    X_t = \sigma_t Z_t,    {Z_t} ~ IID N(0,1),
    \sigma_t^2 = \alpha_0 + \alpha_1 X_{t-1}^2.

Suppose that E[X_t^2] = \mu < +\infty. By stationarity, it follows from the first equation that

    E[X_t^2] = E[\sigma_t^2] = \mu > 0.

From the second, it therefore follows that

    \mu = \alpha_0 + \alpha_1 \mu,    so    \mu = E[X_t^2] = \frac{\alpha_0}{1 - \alpha_1},

in line with the general result.

Now consider the time series {X_t^2 : t \in \mathbb{Z}}. With \tilde{Z}_t = X_t^2 - \sigma_t^2, so that

    X_t^2 - \sigma_t^2 = \tilde{Z}_t = X_t^2 - \alpha_0 - \alpha_1 X_{t-1}^2,

it follows that

    X_t^2 - \alpha_1 X_{t-1}^2 = \alpha_0 + \tilde{Z}_t.

Hence, provided E[\tilde{Z}_t^2] < +\infty and \alpha_1 < 1, the process {X_t^2} is a causal AR(1) process with mean \mu = \alpha_0 / (1 - \alpha_1).

For the ARCH(1) process, precise conditions on the coefficient can be established to ensure that {X_t^2} is an AR(1) process with white noise innovations; the following computation shows that this is the case if and only if \alpha_1^2 < 1/3. Using X_t = \sigma_t Z_t,

    X_t^4 = \sigma_t^4 Z_t^4 = (\alpha_0 + \alpha_1 X_{t-1}^2)^2 Z_t^4,

it follows, using E[Z_t^4] = 3, that

    E[X_t^4] = 3 E[\alpha_0^2 + 2\alpha_0 \alpha_1 X_{t-1}^2 + \alpha_1^2 X_{t-1}^4]
             = 3\Big(\alpha_0^2 + \frac{2\alpha_0^2 \alpha_1}{1 - \alpha_1}\Big) + 3\alpha_1^2 E[X_t^4]
             = \frac{3\alpha_0^2 (1 + \alpha_1)}{1 - \alpha_1} + 3\alpha_1^2 E[X_t^4].

Clearly, E[X_t^4] = +\infty for 3\alpha_1^2 \ge 1. For 3\alpha_1^2 < 1,

    E[X_t^4] = \frac{3\alpha_0^2 (1 + \alpha_1)}{(1 - \alpha_1)(1 - 3\alpha_1^2)}.
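As a quick sanity check of the fourth-moment formula, the following R sketch (not from the original notes; the parameter values are illustrative and the sample average of X_t^4 converges slowly) simulates a long ARCH(1) path and compares the empirical fourth moment with the theoretical value.

set.seed(2)
alpha0 <- 0.2
alpha1 <- 0.5                              # 3 * alpha1^2 = 0.75 < 1, so E[X_t^4] is finite
n <- 200000
x <- numeric(n)
for (t in 2:n) {
  s2   <- alpha0 + alpha1 * x[t-1]^2
  x[t] <- sqrt(s2) * rnorm(1)
}
theory <- 3 * alpha0^2 * (1 + alpha1) / ((1 - alpha1) * (1 - 3 * alpha1^2))
c(empirical = mean(x[-(1:1000)]^4), theoretical = theory)   # drop a burn-in period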

It follows that {X_t^2} is a causal AR(1) process if and only if \alpha_1 < 1/\sqrt{3}. Since E[X_t] = 0, it follows that the (excess) kurtosis of X_t is:

    \frac{E[X_t^4]}{V(X_t)^2} - 3 = \frac{3\alpha_0^2 (1 + \alpha_1)}{(1 - \alpha_1)(1 - 3\alpha_1^2)} \cdot \frac{(1 - \alpha_1)^2}{\alpha_0^2} - 3
        = \frac{3(1 - \alpha_1^2)}{1 - 3\alpha_1^2} - 3 = \frac{6\alpha_1^2}{1 - 3\alpha_1^2} > 0.

The excess kurtosis of an ARCH(p) process is always greater than 0, indicating thicker tails than a normal distribution.

Generating Polynomials for the ARCH(p) Process

Let \alpha(z) = \alpha_1 z + ... + \alpha_p z^p. Note that \alpha_0 has not been included in the polynomial. Then the equation for \sigma_t^2 may be written in the form:

    \sigma_t^2 = \alpha_0 + \alpha(B) X_t^2.

In this notation, \sum_{j=1}^p \alpha_j = \alpha(1) and hence the formula for E[X_t^2] may be written as:

    E[X_t^2] = \alpha_0 + \alpha(1) E[X_t^2],    so that    E[X_t^2] = \frac{\alpha_0}{1 - \alpha(1)}.

ARCH-t Process

Sometimes the underlying noise {Z_t} is not IID N(0,1); it is heavier tailed. Instead, IID t may be used; the degrees of freedom of the t distribution may be altered to fit the empirical distribution of the residuals.

8.3 Testing for the ARCH Effect

Let (z_t)_{t=1}^T denote the estimates of the innovations of the time series. There are two prominent tests available to determine whether a data set exhibits an ARCH effect. The first is simply the Ljung-Box test. The Ljung-Box test determines whether or not the acf is significantly different from the acf of white noise. This is applied to both the series (z_t)_{t=1}^T and (z_t^2)_{t=1}^T. If the first series is white noise, but the second is not, then this is an indicator of ARCH effects.

Ljung-Box test

The Ljung-Box test (named for Greta M. Ljung and George E. P. Box) is a statistical test of whether any of a group of autocorrelations of a time series are different from zero. The Ljung-Box test is defined as follows.

Let

    H_0: the data are an observed random sample from WN(0, \sigma^2),
    H_1: the data are not an observed random sample from WN(0, \sigma^2).

The test statistic is:

    Q = T (T + 2) \sum_{k=1}^h \frac{\hat{\rho}_k^2}{T - k},

where T is the sample size, \hat{\rho}_k is the sample autocorrelation at lag k, and h is the number of lags being tested. For significance level \alpha, the critical region for rejection of the null hypothesis H_0 is

    Q > \chi^2_{h,\alpha},

where \chi^2_{h,\alpha} is the 1 - \alpha quantile of the chi-squared distribution with h degrees of freedom.

Note that the test is applied to the residuals of the fitted model, not the original series. This means, for example, that if a causal invertible ARMA(p,q) is fitted to the original series {X_t}_{t \in \mathbb{Z}}:

    \phi(B) X_t = \theta(B) Z_t,

then {Z_t}_{t \in \mathbb{Z}} is recovered by:

    Z_t = \theta(B)^{-1} \phi(B) X_t

and the Ljung-Box test applied to {Z_t}_{t \in \mathbb{Z}}. In such applications the hypothesis being tested is that the residuals have no autocorrelation. When testing the residuals of an estimated model, the degrees of freedom need to be adjusted to reflect the number of parameters estimated. In a model with p + q estimated parameters (for example ARMA(p,q)), the degrees of freedom should be set to h - p - q. The distribution of the test statistic is derived under the assumption that the residuals are normally distributed; when the data are not normal, it is approximate and relies on a central limit theorem effect.

Engle's test

The second test available is the Lagrange multiplier test of Engle [4] (1982). This test is equivalent to the usual F statistic for testing \alpha_j = 0 for all j = 1,...,p in the linear regression

    z_t^2 = \alpha_0 + \alpha_1 z_{t-1}^2 + ... + \alpha_p z_{t-p}^2 + \epsilon_t,    t = p+1,...,T,        (8.4)

where \epsilon_t denotes the error term, p is a pre-specified positive integer and T is the sample size. The null hypothesis is:

    H_0: \alpha_1 = ... = \alpha_p = 0.

Let SSR_0 = \sum_{t=p+1}^T (z_t^2 - \bar{\omega})^2, where \bar{\omega} = \frac{1}{T} \sum_{t=1}^T z_t^2 is the sample mean of (z_t^2)_{t=1}^T, and let SSR_1 = \sum_{t=p+1}^T \hat{e}_t^2, where \hat{e}_t is the least squares residual for the regression problem of Equation (8.4). Then

    F = \frac{(SSR_0 - SSR_1)/p}{SSR_1 / (T - 2p - 1)}

is asymptotically \chi^2_p distributed as T \to +\infty. The decision rule is to reject H_0 if F > \chi^2_{p,\alpha}. Again, note that the statistic is based on a central limit theorem effect.
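The regression form of Engle's test in Equation (8.4) is straightforward to carry out by hand. The following R sketch (a hypothetical helper, not a packaged implementation) builds the lag matrix, runs the least squares regression and forms the F statistic with its asymptotic chi-squared p-value.

engle_test <- function(z, p = 5) {         # 'engle_test' is a hypothetical name
  T  <- length(z)
  z2 <- z^2
  X  <- sapply(1:p, function(j) z2[(p + 1 - j):(T - j)])   # lag matrix, rows t = p+1,...,T
  y  <- z2[(p + 1):T]
  SSR1 <- sum(residuals(lm(y ~ X))^2)      # unrestricted regression (8.4)
  SSR0 <- sum((y - mean(z2))^2)            # restricted model: the overall mean only
  F    <- ((SSR0 - SSR1) / p) / (SSR1 / (T - 2 * p - 1))
  c(statistic = F, p.value = pchisq(F, df = p, lower.tail = FALSE))
}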

Example

Consider the monthly log stock returns of Intel Corporation from January 1973 to December 2008, found in the file m-intc7308.txt. Figures 8.1, 8.2, 8.3, 8.4 and 8.5 indicate that there is conditional heteroscedasticity. Firstly, the time series plot indicates clusters of high volatility. Secondly, while the acf of the log stock returns shows no significant serial correlations, with only minor exceptions at lags 7 and 14, the acf and pacf of the squared log returns show serial correlations, indicating that the monthly returns are not independent.

Figure 8.1: Intel log stock returns time series plot

These observations are confirmed by the Ljung-Box test: the Q(m) statistic of the log return series gives Q(12) with a p-value of 0.11, confirming no serial correlations in the data (significance level 5%). On the other hand, the Ljung-Box test on the squared log return series gives a Q(12) value with a p-value close to 0, thus indicating that strong ARCH effects are present.

> m.intc7308 <- read.table("~/download/m-intc7308.txt", header=T, quote="\"")
> View(m.intc7308)
> da = m.intc7308
> intc = log(da[,2]+1)
> plot(intc, type = "l")
> acf(intc)
> pacf(intc)
> acf(intc^2)

Figure 8.2: Intel log stock returns acf

Figure 8.3: Intel log stock returns pacf

> pacf(intc^2)
> Box.test(intc, lag=12, type="Ljung")

        Box-Ljung test

data:  intc
X-squared = , df = 12, p-value =

> at = intc - mean(intc)
> Box.test(at^2, lag=12, type="Ljung")

        Box-Ljung test

Figure 8.4: Intel log stock returns squared acf

Figure 8.5: Intel log stock returns squared pacf

data:  at^2
X-squared = , df = 12, p-value = 5.274e

8.4 Building an ARCH model

Using the fact that {X_t^2} is an AR(p) process, it follows that the pacf of {X_t^2} is 0 for lags greater than p. The sample pacf may therefore be used to select the order of an ARCH model. Having selected the order, there are several methods for estimating the parameters. One way is to consider {X_t^2 : t \in \mathbb{Z}}, subtract the mean, and apply the techniques for fitting an AR(p) process.
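A minimal R sketch of this AR-based approach is given below (not part of the original notes); it assumes the Intel log return series intc from the previous example has already been loaded, and the chosen order p = 1 is only for illustration.

at <- intc - mean(intc)                    # demeaned log returns from the example above
pacf(at^2)                                 # significant lags suggest the ARCH order p
p  <- 1                                    # order chosen for illustration
fit <- ar(at^2, order.max = p, aic = FALSE)
fit$ar                                     # rough estimates of alpha_1, ..., alpha_p
fit$x.mean * (1 - sum(fit$ar))             # rough estimate of alpha_0, since mu = alpha_0 / (1 - sum alpha_j)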

Likelihood Method

Under the distributional assumption that {Z_t} ~ IID N(0,1), the likelihood function of an ARCH(p) model is:

    f(z_1,...,z_T | \alpha) = f(z_T | F_{T-1}) f(z_{T-1} | F_{T-2}) ... f(z_{p+1} | F_p) f(z_1,...,z_p | \alpha)
                            = \prod_{t=p+1}^T \frac{1}{\sqrt{2\pi \sigma_t^2}} \exp\Big\{-\frac{z_t^2}{2\sigma_t^2}\Big\} f(z_1,...,z_p | \alpha),

where \alpha = (\alpha_0,...,\alpha_p)^t. It follows that the log likelihood function L is:

    L(z_{p+1},...,z_T | \alpha) = const - \sum_{t=p+1}^T \Big\{ \frac{1}{2} \ln(\sigma_t^2) + \frac{1}{2} \frac{z_t^2}{\sigma_t^2} \Big\},

where

    \sigma_t^2 = \alpha_0 + \alpha_1 z_{t-1}^2 + ... + \alpha_p z_{t-p}^2.

The conditional likelihood is usually used, since f(z_1,...,z_p | \alpha) is usually rather complicated. In heavy tailed applications, the normal density may be replaced by a standardised Student-t distribution with degrees of freedom chosen to reflect the desired tail. Other distributions may be chosen to reflect the fact that the innovations are often skewed.

8.5 Forecasting

Forecasts from the ARCH model can be obtained recursively in exactly the same way as those for an AR model. Consider an ARCH(p) model; at the forecast origin h, the 1-step ahead forecast of \sigma^2_{h+1} is:

    \sigma^2_h(1) = \alpha_0 + \alpha_1 X_h^2 + ... + \alpha_p X^2_{h+1-p}.

Since \sigma^2_{h+1} is F_h measurable, it follows that \sigma^2_h(1) = \sigma^2_{h+1}. The two step ahead forecast is:

    \sigma^2_h(2) = \alpha_0 + \alpha_1 \sigma^2_h(1) + \alpha_2 X_h^2 + ... + \alpha_p X^2_{h+2-p},

which follows because E[X^2_{h+1} | F_h] = \sigma^2_{h+1} = \sigma^2_h(1). Similarly, the l-step ahead forecast of \sigma^2_{h+l} is:

    \sigma^2_h(l) = \alpha_0 + \sum_{i=1}^p \alpha_i E[X^2_{h+l-i} | F_h] = \alpha_0 + \sum_{i=1}^p \alpha_i \sigma^2_h(l-i),

where \sigma^2_h(l-i) = X^2_{h+l-i} for l - i \le 0.
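The recursion above is easy to implement directly. The following R sketch (a hypothetical helper function, not part of the original notes) computes \sigma^2_h(1), ..., \sigma^2_h(l) for an ARCH(p) model with given coefficients, taking the forecast origin h to be the last observation of the supplied series.

arch_forecast <- function(alpha0, alpha, x, l.max = 5) {   # 'arch_forecast' is a hypothetical name
  p    <- length(alpha)
  h    <- length(x)                         # forecast origin: the last observation
  sig2 <- numeric(l.max)
  for (l in 1:l.max) {
    past <- sapply(1:p, function(i)
      if (l - i <= 0) x[h + l - i]^2        # already observed: use X^2
      else            sig2[l - i])          # not yet observed: use its own forecast
    sig2[l] <- alpha0 + sum(alpha * past)
  }
  sig2                                      # sigma^2_h(1), ..., sigma^2_h(l.max)
}
# e.g. arch_forecast(alpha0 = 0.01, alpha = c(0.4), x = intc - mean(intc))  # illustrative values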

Example

The following output uses the fGarch package with the command garchFit. Again, the data considered are the log stock returns of Intel Corporation from January 1973 to December 2008. The ARCH(1) model returned has the form:

    R_t = \mu + X_t,    X_t = \sigma_t Z_t,
    \sigma_t^2 = \omega + \alpha_1 X_{t-1}^2,

with the parameter estimates given in the output below. All the parameter estimates are highly significant. The acf of the residuals {x_t} gives Q(10) with a p-value of 0.25 and that of {x_t^2} gives Q(10) with a p-value of 0.10, neither significant at the 5% level, indicating that the model fits.

> m.intc7308 <- read.table("~/data/m-intc7308.txt", header=T, quote="\"")
> View(m.intc7308)
> install.packages("fGarch")
> library(fGarch)
> da = m.intc7308
> intc = log(da[,2]+1)
> m1 = garchFit(intc~garch(1,0), data=intc, trace=F)
> summary(m1)

Title:
 GARCH Modelling

Call:
 garchFit(formula = intc ~ garch(1, 0), data = intc, trace = F)

Mean and Variance Equation:
 data ~ garch(1, 0)
<environment: 0x5c7c490>
 [data = intc]

Conditional Distribution:
 norm

Coefficient(s):
    mu    omega    alpha1

Std. Errors:
 based on Hessian

Error Analysis:
        Estimate  Std. Error  t value  Pr(>|t|)

mu                                           *
omega                             < 2e-16  ***
alpha1                                      **
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Log Likelihood:
 normalized:

Description:
 Thu May 23 16:38: by user: john

Standardised Residuals Tests:
                                Statistic  p-value
 Jarque-Bera Test   R    Chi^2
 Shapiro-Wilk Test  R    W                 e-08
 Ljung-Box Test     R    Q(10)
 Ljung-Box Test     R    Q(15)
 Ljung-Box Test     R    Q(20)
 Ljung-Box Test     R^2  Q(10)
 Ljung-Box Test     R^2  Q(15)
 Ljung-Box Test     R^2  Q(20)
 LM Arch Test       R    TR^2

Information Criterion Statistics:
  AIC   BIC   SIC  HQIC

> predict(m1,5)   # obtain 1 to 5 step predictions
  meanForecast  meanError  standardDeviation

Now try fitting a GARCH model (this will be considered later); the tests indicate that this is a substantially better fit.

> m2 = garchFit(intc~garch(1,1), data=intc, trace=F)
> summary(m2)

Title:
 GARCH Modelling

Call:
 garchFit(formula = intc ~ garch(1, 1), data = intc, trace = F)

Mean and Variance Equation:
 data ~ garch(1, 1)
<environment: 0x5f20c70>
 [data = intc]

Conditional Distribution:
 norm

Coefficient(s):
    mu    omega    alpha1    beta1

Std. Errors:
 based on Hessian

Error Analysis:
        Estimate  Std. Error  t value  Pr(>|t|)
mu
omega                                         *
alpha1                                       **
beta1                             <2e-16    ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Log Likelihood:
 normalized:

Description:
 Thu May 23 16:40: by user: john

Standardised Residuals Tests:
                                Statistic  p-value

 Jarque-Bera Test   R    Chi^2
 Shapiro-Wilk Test  R    W                 e-07
 Ljung-Box Test     R    Q(10)
 Ljung-Box Test     R    Q(15)
 Ljung-Box Test     R    Q(20)
 Ljung-Box Test     R^2  Q(10)
 Ljung-Box Test     R^2  Q(15)
 Ljung-Box Test     R^2  Q(20)
 LM Arch Test       R    TR^2

Information Criterion Statistics:
  AIC   BIC   SIC  HQIC

Now try fitting an ARCH(1) model with Student t distribution. The statistics illustrate that this model does not give an adequate fit for the volatility.

> m3 = garchFit(intc~garch(1,0), data=intc, trace=F, cond.dist="std")
> summary(m3)

Title:
 GARCH Modelling

Call:
 garchFit(formula = intc ~ garch(1, 0), data = intc, cond.dist = "std", trace = F)

Mean and Variance Equation:
 data ~ garch(1, 0)
<environment: 0x61f48f0>
 [data = intc]

Conditional Distribution:
 std

Coefficient(s):
    mu    omega    alpha1    shape

Std. Errors:
 based on Hessian

Error Analysis:
        Estimate  Std. Error  t value  Pr(>|t|)
mu                                            **
omega                              e-14      ***
alpha1                                        **
shape                                        ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Log Likelihood:
 normalized:

Description:
 Thu May 23 16:44: by user: john

Standardised Residuals Tests:
                                Statistic  p-value
 Jarque-Bera Test   R    Chi^2
 Shapiro-Wilk Test  R    W                 e-08
 Ljung-Box Test     R    Q(10)
 Ljung-Box Test     R    Q(15)
 Ljung-Box Test     R    Q(20)
 Ljung-Box Test     R^2  Q(10)
 Ljung-Box Test     R^2  Q(15)
 Ljung-Box Test     R^2  Q(20)
 LM Arch Test       R    TR^2

Information Criterion Statistics:
  AIC   BIC   SIC  HQIC

8.6 The GARCH Process

The GARCH process is defined as:

    X_t = \sigma_t Z_t,    {Z_t} ~ IID N(0,1),
    \sigma_t^2 = \alpha_0 + \alpha_1 X_{t-1}^2 + ... + \alpha_p X_{t-p}^2 + \beta_1 \sigma_{t-1}^2 + ... + \beta_q \sigma_{t-q}^2,
    \alpha_0 > 0,    \alpha_j \ge 0,  j = 1,...,p,    \beta_k \ge 0,  k = 1,...,q.

The equation for \sigma_t^2 may be written, in terms of generating polynomials, as

    \sigma_t^2 = \alpha_0 + \alpha(B) X_t^2 + \beta(B) \sigma_t^2,

where

    \alpha(z) = \alpha_1 z + ... + \alpha_p z^p,    \beta(z) = \beta_1 z + ... + \beta_q z^q.

Using E[X_t^2] = E[\sigma_t^2] together with stationarity gives:

    E[X_t^2] = \alpha_0 + (\alpha(1) + \beta(1)) E[X_t^2],    so that    E[X_t^2] = \frac{\alpha_0}{1 - \alpha(1) - \beta(1)}.

Again using \tilde{Z}_t = X_t^2 - \sigma_t^2 = \sigma_t^2 (Z_t^2 - 1), so that \tilde{Z}_t ~ WN(0, \tilde{\sigma}^2) provided that E[\sigma_t^4] < +\infty, it follows that

    X_t^2 - \tilde{Z}_t = \alpha_0 + \alpha(B) X_t^2 + \beta(B) (X_t^2 - \tilde{Z}_t).

This may be written as

    X_t^2 - (\alpha(B) + \beta(B)) X_t^2 = \alpha_0 + (1 - \beta(B)) \tilde{Z}_t.

Provided E[\tilde{Z}_t^2] < +\infty, this is an ARMA(max(p,q), q) process with mean \mu = E[X_t^2] given by:

    \mu - (\alpha(1) + \beta(1)) \mu = \alpha_0,    so    \mu = \frac{\alpha_0}{1 - (\alpha(1) + \beta(1))}.

8.7 An Illustrative Example

Specifying the order of a GARCH process is not so easy; only low order GARCH processes are used in most applications. Consider the monthly excess returns of the S&P 500 index starting from 1926, for 792 observations. The acf of the return series is shown in Figure 8.7, while the pacf of the squared excess returns is shown in Figure 8.8. The series of returns has some serial correlations at lags 1 and 3, and hence an MA(3) model may seem appropriate for the returns, while the pacf of x_t^2 shows strong linear dependence. An AR(3) model has the same number of parameters; it is considered here. The fitted AR(3) model has the form

    x_t - 0.088 x_{t-1} - \hat\phi_2 x_{t-2} - \hat\phi_3 x_{t-3} = a_t,

with an estimated residual variance \hat\sigma_a^2. A joint estimation of the AR(3) - GARCH(1,1) model

    x_t = \phi_0 + \phi_1 x_{t-1} + \phi_2 x_{t-2} + \phi_3 x_{t-3} + a_t,    \sigma_t^2 = \alpha_0 + \alpha_1 a_{t-1}^2 + \beta_1 \sigma_{t-1}^2

gives the estimates shown in the output below. From the volatility equation, the implied (unconditional) variance of a_t is \alpha_0 / (1 - \alpha_1 - \beta_1),

which is in line with the variance estimate for the AR(3) model. In the AR(3) - GARCH(1,1) model, however, the three AR coefficients are insignificant at the 5% level. It therefore seems appropriate to simplify the model by dropping all AR parameters. The redefined model is:

    x_t = \mu + a_t,    \sigma_t^2 = \alpha_0 + \alpha_1 a_{t-1}^2 + \beta_1 \sigma_{t-1}^2.

The Ljung-Box statistics suggest that the model is adequate. The analysis is as follows:

> sp500 <- read.table("~/data/sp500.dat", quote="\"")
> View(sp500)
> m1 = garchFit(~arma(3,0)+garch(1,1), data=sp500, trace=F)
> summary(m1)

Title:
 GARCH Modelling

Call:
 garchFit(formula = ~arma(3, 0) + garch(1, 1), data = sp500, trace = F)

Mean and Variance Equation:
 data ~ arma(3, 0) + garch(1, 1)
<environment: 0xac52190>
 [data = sp500]

Conditional Distribution:
 norm

Coefficient(s):
    mu    ar1    ar2    ar3    omega    alpha1    beta1

Std. Errors:
 based on Hessian

Error Analysis:
        Estimate  Std. Error  t value  Pr(>|t|)
mu      7.708e    e                    e-06   ***
ar1     e         e
ar2     e         e
ar3     e         e

omega   7.975e    e                           **
alpha1  e         e                    e-08   ***
beta1   e         e                  < 2e-16  ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Log Likelihood:
 normalized:

Description:
 Thu May 23 17:09: by user: john

Standardised Residuals Tests:
                                Statistic  p-value
 Jarque-Bera Test   R    Chi^2             e-16
 Shapiro-Wilk Test  R    W                 e-07
 Ljung-Box Test     R    Q(10)
 Ljung-Box Test     R    Q(15)
 Ljung-Box Test     R    Q(20)
 Ljung-Box Test     R^2  Q(10)
 Ljung-Box Test     R^2  Q(15)
 Ljung-Box Test     R^2  Q(20)
 LM Arch Test       R    TR^2

Information Criterion Statistics:
  AIC   BIC   SIC  HQIC

Now fit a GARCH(1,1) model with Student t distribution.

> m2 = garchFit(~garch(1,1), data=sp5, trace=F, cond.dist="std")
> summary(m2)

Title:
 GARCH Modelling

Call:
 garchFit(formula = ~garch(1, 1), data = sp5, cond.dist = "std", trace = F)

Mean and Variance Equation:
 data ~ garch(1, 1)
<environment: 0xb13e2d0>
 [data = sp5]

Conditional Distribution:
 std

Coefficient(s):
    mu    omega    alpha1    beta1    shape

Std. Errors:
 based on Hessian

Error Analysis:
        Estimate  Std. Error  t value  Pr(>|t|)
mu      8.455e    e                    e-08   ***
omega   1.248e    e                           **
alpha1  e         e                    e-05   ***
beta1   e         e                  < 2e-16  ***
shape   7.003e    e                    e-05   ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Log Likelihood:
 normalized:

Description:
 Thu May 23 17:11: by user: john

Standardised Residuals Tests:
                                Statistic  p-value
 Jarque-Bera Test   R    Chi^2
 Shapiro-Wilk Test  R    W                 e-08
 Ljung-Box Test     R    Q(10)
 Ljung-Box Test     R    Q(15)
 Ljung-Box Test     R    Q(20)
 Ljung-Box Test     R^2  Q(10)

 Ljung-Box Test     R^2  Q(15)
 Ljung-Box Test     R^2  Q(20)
 LM Arch Test       R    TR^2

Information Criterion Statistics:
  AIC   BIC   SIC  HQIC

> # obtain standardised residuals
> stresi = residuals(m2, standardize=T)
> plot(stresi, type="l")
> Box.test(stresi, 10, type="Ljung")

        Box-Ljung test

data:  stresi
X-squared = , df = 10, p-value =

> predict(m2,5)
  meanForecast  meanError  standardDeviation

Figure 8.6: Standard and Poor standardised residuals
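As a small follow-up (not part of the original session), the implied unconditional variance \alpha_0 / (1 - \alpha_1 - \beta_1) from Section 8.6 can be computed from the fitted coefficients of the GARCH(1,1) model above; here sp5 is assumed to be the numeric return series passed to garchFit, and m2 the fitted model.

cf <- coef(m2)                                   # named coefficients of the fGarch fit above
cf["omega"] / (1 - cf["alpha1"] - cf["beta1"])   # implied unconditional variance
var(sp5)                                         # compare with the sample variance of the series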

Figure 8.7: Standard and Poor acf for (excess) return series

Figure 8.8: Standard and Poor partial autocorrelation for squared excess return series

8.8 Further Extensions of ARCH / GARCH

There are numerous associated models.

Non-linear ARCH (NARCH)

    \sigma_t^\gamma = \alpha_0 + \alpha(B) |X_t|^\gamma + \beta(B) \sigma_t^\gamma.

The choice \gamma = 2 yields the GARCH model; \gamma = 1 is sometimes used.

The standard approach for constructing variations on ARCH / GARCH is as follows: some financial data sets are considered. For each set, the hypothesis H_0: \gamma = 2 (GARCH) is tested against the alternative H_1: \gamma \ne 2 (or rather: the null hypothesis is that the data may be modelled by one of the current models; the alternative is that the wider class of models is more appropriate). The null

hypothesis is generally rejected. The creator of the generalisation is then happy and some papers are written. The typical situation is that we have some further parameters to play with in H_0 \cup H_1. Since most (in fact all) models are simplifications of reality, a null hypothesis H_0 which is much smaller than H_0 \cup H_1 is (almost) always rejected if the data set is large enough.

Asymmetric ARCH Processes

One drawback with ARCH, GARCH and NARCH is that positive and negative values have a symmetric effect on the volatility. Many financial time series are strongly asymmetric; negative returns are followed by a larger increase in the volatility than positive returns. Typical examples of this may be prices of gasoline or the interest on a loan. Let

    X_t^+ = \max(X_t, 0),    X_t^- = \min(X_t, 0).

There are several models which address this problem; for example, the exponential GARCH (EGARCH) and the quadratic GARCH (QGARCH). The EGARCH is:

    \ln \sigma_t^2 = \alpha_0 + \beta \ln \sigma_{t-1}^2 + \lambda Z_{t-1} + \phi (|Z_{t-1}| - E[|Z_{t-1}|]).

Other examples may be found in the literature.
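To illustrate the asymmetry, the following R sketch (not from the original notes; the parameter values are purely illustrative) simulates the EGARCH recursion above with \lambda < 0, so that negative shocks increase the volatility by more than positive shocks of the same size.

set.seed(3)
n      <- 2000
alpha0 <- -0.5; beta <- 0.9; lambda <- -0.2; phi <- 0.25   # illustrative values
z      <- rnorm(n)
lns2   <- numeric(n)
x      <- numeric(n)
lns2[1] <- alpha0 / (1 - beta)                 # unconditional mean of ln sigma_t^2
x[1]    <- exp(lns2[1] / 2) * z[1]
for (t in 2:n) {
  lns2[t] <- alpha0 + beta * lns2[t-1] +
             lambda * z[t-1] + phi * (abs(z[t-1]) - sqrt(2 / pi))  # E|Z| = sqrt(2/pi) for N(0,1)
  x[t]    <- exp(lns2[t] / 2) * z[t]
}
plot(x, type = "l")    # volatility responds more strongly after negative shocks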

Tutorial 7

1. Consider the monthly simple returns of Intel stock from January 1973 to December 2008 (file m-intc7308.txt). Transform the returns into log returns. Build a GARCH model for the transformed series and compute 1-step to 5-step ahead volatility forecasts at the forecast origin December 2008.

2. The file m-mrk4608.txt contains monthly simple returns of Merck stock from June 1946 to December 2008. The file has two columns denoting date and simple return. Transform the simple returns into log returns.

   (a) Is there any evidence of serial correlations in the log returns? Use autocorrelations and a 5% significance level to answer the question. If yes, remove the serial correlations. This is done by fitting an ARMA model. One is then interested in ARCH-GARCH effects in the residuals.

   (b) Is there any evidence of ARCH effects in the log returns? Use the residual series if there are serial correlations in part (a). Use Ljung-Box statistics for the squared returns (or residuals) with 6 and 12 lags of autocorrelations and a 5% significance level to answer the question.

   (c) Identify an ARCH model for the data and fit the identified model. Write down the fitted model.

3. The file m-3m4608.txt contains two columns: date and the monthly simple return for 3M stock. Transform the returns to log returns.

   (a) Is there any evidence of ARCH effects in the log returns? Use Ljung-Box statistics with 6 and 12 lags of autocorrelations and a 5% significance level to answer the question.

   (b) Use the PACF of the squared returns to identify an ARCH model. What is the fitted model?

   (c) There are 755 data points. Refit the model using the first 750 observations and use the fitted model to predict the volatilities for times 751 to 755 (the forecast origin is 750).

   (d) Build an EGARCH model for the log return series of 3M stocks using the first 750 observations. Use the fitted model to compute the 1-step to 5-step-ahead volatility forecasts at the forecast origin h = 750. For EGARCH, you may find the package betategarch useful. Look at the documentation.

4. The file m-gmsp5008.txt contains the dates and monthly simple returns of General Motors stock and the S&P index.

   (a) Build a GARCH model with Gaussian innovations for the log returns of GM stock. Check the model and write down the fitted model.

   (b) Build a GARCH model with Student-t distribution for the log returns of GM stock, including estimation of the degrees of freedom. Write down the fitted model. Let \nu be the degrees of freedom. Test the hypothesis H_0: \nu = 6 versus H_1: \nu \ne 6 at a significance level of 5%.

   (c) Build an EGARCH model for the log returns of GM stock. What is the fitted model?

   (d) Obtain 1-step to 6-step-ahead volatility forecasts for all models obtained. Compare the forecasts.

5. Consider again the data in m-gmsp5008.txt.

   (a) Build a Gaussian GARCH model for the monthly log returns of the S&P 500 index.

   (b) Is there a summer effect on the volatility of the index return? This requires writing an R script. Make a variable

       u_t = 1 if the month is June, July or August; u_t = 0 for other months.

   Try fitting the model:

       \sigma^2(t) = \alpha_0 + \alpha_1 X^2(t) + \beta_1 \sigma^2(t-1) + u_t (\alpha_{00} + \alpha_{01} X^2(t) + \beta_{01} \sigma^2(t-1)).

   Then the coefficients are (\alpha_0, \alpha_1, \beta_1) for September to May and (\alpha_0 + \alpha_{00}, \alpha_1 + \alpha_{01}, \beta_1 + \beta_{01}) for June, July and August. Are \alpha_{00}, \alpha_{01}, \beta_{01}, which represent the differences, significant?

   (c) Are the lagged returns of GM stock useful for modelling the index volatility? Use your GARCH model as a baseline for comparison.

6. The purpose of this exercise is to simulate a GARCH(1,1) process

       u(t) = v(t) \sqrt{h(t)},    h(t) = a_0 + a_1 u(t-1)^2 + b_1 h(t-1),
       a_0 = 0.1,  a_1 = 0.4,  b_1 = 0.2.

   > v <- rnorm(1000)
   > u <- array(0,1000)
   > h <- array(0,1000)
   > a0 = 0.1; a1 = 0.4; b1 = 0.2
   > for(t in 2:1000){
   +   h[t] <- a0 + a1*(u[t-1]^2) + b1*h[t-1]
   +   u[t] <- v[t]*sqrt(h[t])
   + }
   > plot(u, type="l")
   > acf(u)
   > acf(u^2)
   > library("fGarch")
   > u.garch <- garchFit(~garch(1,1), u, trace=F)
   > u.garch@fit$matcoef

   How do the estimated coefficients compare with the true coefficients?

Exercises (These exercises should be considered at home)

1. Derive multistep-ahead forecasts of the volatility for a GARCH(1,2) model at the forecast origin h. Do this by noting that \sigma^2_h(j) = E[\sigma^2_{h+j} | F_h], using E[E[X|Y]] = E[X] and that E[X_t^2 | F_{t-1}] = \sigma_t^2. Obtain a recursive formula by computing \sigma^2_h(1), \sigma^2_h(2) and \sigma^2_h(j) for j > 2. Compute \sigma^2_h(\infty), assuming it exists, and state conditions such that the answer is well defined.

2. Derive multistep-ahead forecasts of the volatility for a GARCH(2,1) model at the origin h. Compute \sigma^2_h(\infty), assuming it exists, and state conditions such that the answer is well defined.

3. Suppose that r_1,...,r_n are observations of a return series that follows the AR(1)-GARCH(1,1) model:

       X_t = \sigma_t Z_t,    {Z_t} ~ IID N(0,1),
       \sigma_t^2 = \alpha_0 + \alpha_1 X_{t-1}^2 + \beta_1 \sigma_{t-1}^2,
       R_t = \mu + \phi_1 R_{t-1} + X_t.

   Derive the conditional log-likelihood function of the data. Base this on r_3,...,r_n and use as an initialisation r_1 = r_0 = \sigma_0^2 = 0.

4. Consider the AR(1)-GARCH(1,1) model of the previous exercise, with the difference that {Z_t} is IID t_\nu. Derive the conditional log-likelihood function of the data.

Answers

1. For the GARCH(1,2),

       X_t = \sigma_t Z_t,    {Z_t} ~ IID N(0,1),
       \sigma_t^2 = \alpha_0 + \alpha_1 X_{t-1}^2 + \beta_1 \sigma_{t-1}^2 + \beta_2 \sigma_{t-2}^2.

   It follows that E[X_t^2 | F_{t-1}] = \sigma_t^2, so that

       \sigma^2_h(1) = \alpha_0 + \alpha_1 X_h^2 + \beta_1 \sigma_h^2 + \beta_2 \sigma_{h-1}^2,
       \sigma^2_h(2) = \alpha_0 + \alpha_1 \sigma^2_h(1) + \beta_1 \sigma^2_h(1) + \beta_2 \sigma_h^2
                     = \alpha_0 + (\alpha_1 + \beta_1) \sigma^2_h(1) + \beta_2 \sigma_h^2
                     = \alpha_0 + (\alpha_1 + \beta_1)(\alpha_0 + \alpha_1 X_h^2 + \beta_2 \sigma_{h-1}^2) + (\beta_1 (\alpha_1 + \beta_1) + \beta_2) \sigma_h^2.

   For j \ge 3,

       \sigma^2_h(j) = \alpha_0 + \alpha_1 E[X^2_{h+j-1} | F_h] + \beta_1 \sigma^2_h(j-1) + \beta_2 \sigma^2_h(j-2)
                     = \alpha_0 + (\alpha_1 + \beta_1) \sigma^2_h(j-1) + \beta_2 \sigma^2_h(j-2).

   Provided \alpha_1 + \beta_1 + \beta_2 < 1,

       \sigma^2_h(\infty)(1 - \alpha_1 - \beta_1 - \beta_2) = \alpha_0,    so    \sigma^2_h(\infty) = \frac{\alpha_0}{1 - \alpha_1 - \beta_1 - \beta_2}.

2. For the GARCH(2,1),

       \sigma^2_h(1) = \alpha_0 + \alpha_1 X_h^2 + \alpha_2 X_{h-1}^2 + \beta_1 \sigma_h^2,
       \sigma^2_h(2) = \alpha_0 + \alpha_1 \sigma^2_h(1) + \alpha_2 X_h^2 + \beta_1 \sigma^2_h(1),
       \sigma^2_h(j) = \alpha_0 + (\alpha_1 + \beta_1) \sigma^2_h(j-1) + \alpha_2 \sigma^2_h(j-2),    j \ge 3,

   and (if and only if \alpha_1 + \alpha_2 + \beta_1 < 1)

       \sigma^2_h(\infty) = \frac{\alpha_0}{1 - \alpha_1 - \alpha_2 - \beta_1}.

3. Let \theta = (\alpha_0, \alpha_1, \beta_1, \mu, \phi_1) denote the vector of parameters and let f(r_1,...,r_n) be the joint density; then

       f(r_1,...,r_n | \theta) = f(r_1, r_2 | \theta) \prod_{j=3}^n f(r_j | \theta, r_1,...,r_{j-1}).

   Take L(\theta) = \ln f(r_3,...,r_n | \theta, r_1, r_2); then

       L(\theta) = -\frac{n-2}{2} \ln(2\pi) - \frac{1}{2} \sum_{j=3}^n \ln \sigma_j^2 - \frac{1}{2} \sum_{j=3}^n \frac{(r_j - \mu - \phi_1 r_{j-1})^2}{\sigma_j^2},

   where

       \sigma_t^2 = \beta_1 \sigma_{t-1}^2 + \alpha_0 + \alpha_1 (r_{t-1} - \mu - \phi_1 r_{t-2})^2.

   Use an initialisation of \sigma_0^2 = 0, r_0 = r_1 = 0.
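The conditional log-likelihood of Answer 3 can be coded directly for numerical maximisation. The following R sketch is a hypothetical implementation of the negative log-likelihood (with a simplified initialisation of the variance recursion rather than the one stated above), suitable for optim().

negloglik <- function(theta, r) {          # theta = (alpha0, alpha1, beta1, mu, phi1)
  alpha0 <- theta[1]; alpha1 <- theta[2]; beta1 <- theta[3]
  mu     <- theta[4]; phi1   <- theta[5]
  n  <- length(r)
  s2 <- numeric(n)
  s2[2] <- alpha0                          # simplified initialisation of the recursion
  for (t in 3:n) {
    s2[t] <- alpha0 + alpha1 * (r[t-1] - mu - phi1 * r[t-2])^2 + beta1 * s2[t-1]
  }
  e <- r[3:n] - mu - phi1 * r[2:(n-1)]     # innovations a_t = r_t - mu - phi1 * r_{t-1}
  0.5 * sum(log(2 * pi * s2[3:n]) + e^2 / s2[3:n])
}
# e.g. optim(c(0.01, 0.1, 0.8, 0, 0), negloglik, r = r)   # 'r' is the observed return series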

4. The Student t distribution on \nu degrees of freedom has density proportional to:

       f(x) \propto \Big(1 + \frac{x^2}{\nu}\Big)^{-(\nu+1)/2},

   and the model is that

       X_t := \frac{R_t - \mu - \phi_1 R_{t-1}}{\sigma_t} \sim t_\nu,

   so that

       L(\theta) = const - \sum_{j=3}^n \ln \sigma_j - \frac{\nu + 1}{2} \sum_{j=3}^n \ln\Big(1 + \frac{(r_j - \mu - \phi_1 r_{j-1})^2}{\nu \sigma_j^2}\Big),

   where

       \sigma_t^2 = \beta_1 \sigma_{t-1}^2 + \alpha_0 + \alpha_1 (r_{t-1} - \mu - \phi_1 r_{t-2})^2.

   Use an initialisation of \sigma_0^2 = 0, r_0 = r_1 = 0.


More information

STAT 520 FORECASTING AND TIME SERIES 2013 FALL Homework 05

STAT 520 FORECASTING AND TIME SERIES 2013 FALL Homework 05 STAT 520 FORECASTING AND TIME SERIES 2013 FALL Homework 05 1. ibm data: The random walk model of first differences is chosen to be the suggest model of ibm data. That is (1 B)Y t = e t where e t is a mean

More information

Time Series 2. Robert Almgren. Sept. 21, 2009

Time Series 2. Robert Almgren. Sept. 21, 2009 Time Series 2 Robert Almgren Sept. 21, 2009 This week we will talk about linear time series models: AR, MA, ARMA, ARIMA, etc. First we will talk about theory and after we will talk about fitting the models

More information

ESSE Mid-Term Test 2017 Tuesday 17 October :30-09:45

ESSE Mid-Term Test 2017 Tuesday 17 October :30-09:45 ESSE 4020 3.0 - Mid-Term Test 207 Tuesday 7 October 207. 08:30-09:45 Symbols have their usual meanings. All questions are worth 0 marks, although some are more difficult than others. Answer as many questions

More information

Forecasting using R. Rob J Hyndman. 2.4 Non-seasonal ARIMA models. Forecasting using R 1

Forecasting using R. Rob J Hyndman. 2.4 Non-seasonal ARIMA models. Forecasting using R 1 Forecasting using R Rob J Hyndman 2.4 Non-seasonal ARIMA models Forecasting using R 1 Outline 1 Autoregressive models 2 Moving average models 3 Non-seasonal ARIMA models 4 Partial autocorrelations 5 Estimation

More information

Comment about AR spectral estimation Usually an estimate is produced by computing the AR theoretical spectrum at (ˆφ, ˆσ 2 ). With our Monte Carlo

Comment about AR spectral estimation Usually an estimate is produced by computing the AR theoretical spectrum at (ˆφ, ˆσ 2 ). With our Monte Carlo Comment aout AR spectral estimation Usually an estimate is produced y computing the AR theoretical spectrum at (ˆφ, ˆσ 2 ). With our Monte Carlo simulation approach, for every draw (φ,σ 2 ), we can compute

More information

Note: The primary reference for these notes is Enders (2004). An alternative and more technical treatment can be found in Hamilton (1994).

Note: The primary reference for these notes is Enders (2004). An alternative and more technical treatment can be found in Hamilton (1994). Chapter 4 Analysis of a Single Time Series Note: The primary reference for these notes is Enders (4). An alternative and more technical treatment can be found in Hamilton (994). Most data used in financial

More information

University of Oxford. Statistical Methods Autocorrelation. Identification and Estimation

University of Oxford. Statistical Methods Autocorrelation. Identification and Estimation University of Oxford Statistical Methods Autocorrelation Identification and Estimation Dr. Órlaith Burke Michaelmas Term, 2011 Department of Statistics, 1 South Parks Road, Oxford OX1 3TG Contents 1 Model

More information

Time Series Analysis

Time Series Analysis Time Series Analysis Christopher Ting http://mysmu.edu.sg/faculty/christophert/ christopherting@smu.edu.sg Quantitative Finance Singapore Management University March 3, 2017 Christopher Ting Week 9 March

More information

Problem Set 2 Solution Sketches Time Series Analysis Spring 2010

Problem Set 2 Solution Sketches Time Series Analysis Spring 2010 Problem Set 2 Solution Sketches Time Series Analysis Spring 2010 Forecasting 1. Let X and Y be two random variables such that E(X 2 ) < and E(Y 2 )

More information

Estimating AR/MA models

Estimating AR/MA models September 17, 2009 Goals The likelihood estimation of AR/MA models AR(1) MA(1) Inference Model specification for a given dataset Why MLE? Traditional linear statistics is one methodology of estimating

More information

Non-Stationary Time Series and Unit Root Testing

Non-Stationary Time Series and Unit Root Testing Econometrics II Non-Stationary Time Series and Unit Root Testing Morten Nyboe Tabor Course Outline: Non-Stationary Time Series and Unit Root Testing 1 Stationarity and Deviation from Stationarity Trend-Stationarity

More information

Stationary Stochastic Time Series Models

Stationary Stochastic Time Series Models Stationary Stochastic Time Series Models When modeling time series it is useful to regard an observed time series, (x 1,x,..., x n ), as the realisation of a stochastic process. In general a stochastic

More information