Modeling and Forecasting Volatility: Autoregressive Conditional Heteroskedasticity Models. Economic Forecasting. Anthony Tay. Slide 1


Modeling and Forecasting Volatility: Autoregressive Conditional Heteroskedasticity Models. Anthony Tay Slide 1

smpl @all
line(m) stii dl_stii

[Figure: line graphs of STII (levels) and DL_STII (log differences), 1986-2010.]

Anthony Tay Slide 2

Autoregressive Conditional Heteroskedasticity (ARCH) Model

Pure ARCH(1):
[Mean equation]     $y_t = h_t^{1/2} u_t$,  $u_t \sim \text{iid } N(0,1)$
[Variance equation] $h_t = \alpha_0 + \alpha_1 y_{t-1}^2$

Anthony Tay Slide 3
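
The slides do everything in EViews; purely as an illustration of how the pure ARCH(1) process above behaves, here is a minimal simulation sketch in Python (the parameter values $\alpha_0 = 0.1$, $\alpha_1 = 0.5$ and the seed are arbitrary choices, not taken from the slides):

```python
import numpy as np

# Simulate a pure ARCH(1): y_t = sqrt(h_t) * u_t,  h_t = a0 + a1 * y_{t-1}^2
# (a0, a1 and the seed are arbitrary illustrative choices)
a0, a1 = 0.1, 0.5
T = 1000
rng = np.random.default_rng(0)

y = np.zeros(T)
h = np.zeros(T)
h[0] = a0 / (1 - a1)                 # start at the unconditional variance
y[0] = np.sqrt(h[0]) * rng.standard_normal()
for t in range(1, T):
    h[t] = a0 + a1 * y[t - 1] ** 2   # conditional variance given the past
    y[t] = np.sqrt(h[t]) * rng.standard_normal()

# Crude look at volatility clustering: |y_t| tends to be bigger after a big |y_{t-1}|
big_yesterday = np.abs(y[:-1]) > np.median(np.abs(y))
print("mean |y_t| after large |y_{t-1}|:", np.abs(y[1:][big_yesterday]).mean())
print("mean |y_t| after small |y_{t-1}|:", np.abs(y[1:][~big_yesterday]).mean())
```

Quiet spells and turbulent spells alternate in the simulated path, which is the volatility clustering discussed on the later slides.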

$y_t = h_t^{1/2} u_t$,  $u_t \sim \text{iid } N(0,1)$,  $h_t = \alpha_0 + \alpha_1 y_{t-1}^2$

$E[y_t \mid y_{t-1}, y_{t-2}, \ldots] = E[h_t^{1/2} u_t \mid y_{t-1}, y_{t-2}, \ldots] = h_t^{1/2} E[u_t \mid y_{t-1}, y_{t-2}, \ldots] = 0$

This result implies:
- the unconditional mean is zero: $E[y_t] = E[E[y_t \mid y_{t-1}, \ldots]] = E[0] = 0$
- $y_t$ is a serially uncorrelated process: $E[y_t y_{t-1}] = E[y_{t-1} E[y_t \mid y_{t-1}, \ldots]] = E[y_{t-1} \cdot 0] = 0$

The pure ARCH(1) model therefore does not (and cannot) describe any sort of cyclicality or predictability in the mean.

Anthony Tay Slide 4

$y_t = h_t^{1/2} u_t$,  $u_t \sim \text{iid } N(0,1)$,  $h_t = \alpha_0 + \alpha_1 y_{t-1}^2$

$\text{var}[y_t \mid y_{t-1}, y_{t-2}, \ldots] = \text{var}[h_t^{1/2} u_t \mid y_{t-1}, y_{t-2}, \ldots] = h_t \, \text{var}[u_t \mid y_{t-1}, y_{t-2}, \ldots] = h_t$

i.e., $h_t$ is the conditional variance of $y_t$.

The conditional variance depends on the past values of $y_t$. (We usually assume $\alpha_0$ and $\alpha_1$ to be positive to ensure positive variances.)

Anthony Tay Slide 5

Incidentally, the unconditional variance is:

$\text{Var}[y_t] = E[y_t^2] = E[h_t]$
(since $E[y_t^2] = E[h_t u_t^2] = E[E[h_t u_t^2 \mid y_{t-1}, \ldots]] = E[h_t E[u_t^2 \mid y_{t-1}, \ldots]] = E[h_t]$)

$E[y_t^2] = \alpha_0 + \alpha_1 E[y_{t-1}^2] = \alpha_0 + \alpha_1(\alpha_0 + \alpha_1 E[y_{t-2}^2]) = \cdots = \alpha_0(1 + \alpha_1 + \alpha_1^2 + \cdots) = \dfrac{\alpha_0}{1 - \alpha_1}$

which is therefore constant and finite if $\alpha_1 < 1$.

Anthony Tay Slide 6
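
A quick Monte Carlo check of this result, under the same illustrative parameter values as the earlier sketch (not part of the slides): with $\alpha_1 < 1$ the sample variance of a long simulated path should settle near $\alpha_0 / (1 - \alpha_1)$.

```python
import numpy as np

a0, a1 = 0.1, 0.5          # illustrative values with a1 < 1
T = 200_000                # long sample so the sample variance is precise
rng = np.random.default_rng(1)

y = np.empty(T)
y_prev = 0.0
for t in range(T):
    h = a0 + a1 * y_prev ** 2                   # conditional variance
    y_prev = np.sqrt(h) * rng.standard_normal()
    y[t] = y_prev

print("sample variance:        ", y.var())
print("theoretical a0/(1 - a1):", a0 / (1 - a1))   # = 0.2
```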

[Mean equation]     $y_t = h_t^{1/2} u_t$,  $u_t \sim \text{iid } N(0,1)$
[Variance equation] $h_t = \alpha_0 + \alpha_1 y_{t-1}^2$

We can have a non-trivial mean equation. AR(1)-ARCH(1) model:

$y_t = \beta_0 + \beta_1 y_{t-1} + \varepsilon_t$
$\varepsilon_t = h_t^{1/2} u_t$,  $u_t \sim \text{iid } N(0,1)$
$h_t = \alpha_0 + \alpha_1 \varepsilon_{t-1}^2$

Anthony Tay Slide 7

$y_t = \beta_0 + \beta_1 y_{t-1} + \varepsilon_t$,  $\varepsilon_t = h_t^{1/2} u_t$,  $u_t \sim \text{iid } N(0,1)$,  $h_t = \alpha_0 + \alpha_1 \varepsilon_{t-1}^2$

Can show:
$E[y_t \mid y_{t-1}, \ldots] = \beta_0 + \beta_1 y_{t-1}$   (since $E[\varepsilon_t \mid y_{t-1}, \ldots] = 0$)
$\text{var}[y_t \mid y_{t-1}, \ldots] = \text{var}[\varepsilon_t \mid y_{t-1}, \ldots] = h_t$

Notice that $\varepsilon_t = y_t - \beta_0 - \beta_1 y_{t-1}$, so in your computations
$E[\,\cdot \mid y_{t-1}, y_{t-2}, \ldots] = E[\,\cdot \mid \varepsilon_{t-1}, \varepsilon_{t-2}, \ldots]$
Same with the conditional variance.

Anthony Tay Slide 8

How the ARCH model captures volatility clustering

$y_t = h_t^{1/2} u_t$,  $u_t \sim \text{iid } N(0,1)$,  $h_t = \alpha_0 + \alpha_1 y_{t-1}^2$

Focus on volatility. Suppose $y_t$ happens to be small. Then $h_{t+1}$ is small, which means that $\text{var}[y_{t+1} \mid y_t, \ldots]$ is small, so $y_{t+1}$ tends to be small; if so, then the cycle continues.

If $y_t$ is large, then $h_{t+1}$ is large, which means that $\text{var}[y_{t+1} \mid y_t, \ldots]$ is large, so $y_{t+1}$ tends to be large; if so, then the cycle continues.

Anthony Tay Slide 9

Another example:

$y_t = \beta_0 + \beta_1 y_{t-1} + \varepsilon_t$,  $\varepsilon_t = h_t^{1/2} u_t$,  $u_t \sim \text{iid } N(0,1)$,  $h_t = \alpha_0 + \alpha_1 \varepsilon_{t-1}^2$

Suppose $\varepsilon_t$ happens to be small. Then $h_{t+1}$ is small, which means that $\text{var}[\varepsilon_{t+1} \mid y_t, \ldots]$ is small, so $\varepsilon_{t+1}$ tends to be small; if so, then the cycle continues.

If $\varepsilon_t$ is large, then $h_{t+1}$ is large, which means that $\text{var}[\varepsilon_{t+1} \mid y_t, \ldots]$ is large, so $\varepsilon_{t+1}$ tends to be large; if so, then the cycle continues.

Anthony Tay Slide 10

Volatility clustering can also account, at least partially, for excess kurtosis or "fat tails".

Kurtosis: $K = \dfrac{E[(Y - E[Y])^4]}{(\text{Var}[Y])^2}$

[Histogram of DL_STII, sample 1/04/1985 to 10/26/2011, 6993 observations: std. dev. 0.013707, skewness -1.546, kurtosis 44.6; the Jarque-Bera test rejects normality with p-value 0.000000.]

This is partially due to the presence of volatility clustering (large observations cluster, so you get too many of them). ARCH processes imply excess kurtosis even when they are conditionally normal.

Anthony Tay Slide 11
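
As a hedged numerical check of this claim (same illustrative ARCH(1) parameters as before, not estimates from the slides), the sketch below computes the kurtosis $K = E[(Y - E[Y])^4] / (\text{Var}[Y])^2$ of a simulated, conditionally normal ARCH(1) series; it comes out well above the Gaussian value of 3.

```python
import numpy as np

a0, a1 = 0.1, 0.5
T = 500_000
rng = np.random.default_rng(2)

y = np.empty(T)
y_prev = 0.0
for t in range(T):
    h = a0 + a1 * y_prev ** 2
    y_prev = np.sqrt(h) * rng.standard_normal()   # conditionally normal draw
    y[t] = y_prev

dev = y - y.mean()
kurtosis = np.mean(dev ** 4) / np.mean(dev ** 2) ** 2   # K = E[(Y-EY)^4] / Var[Y]^2
print("sample kurtosis:", kurtosis)    # in the vicinity of 9 for these parameters, far above 3
```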

Another feature of series with volatility clustering: the squares of the residuals often show ARMA-type properties.

Ignoring the MA(1), which is negligible in the dl_stii series, regress dl_stii on a constant, and compute the correlogram of the squared residuals:

equation eq1.ls dl_stii c
eq1.correlsq

Anthony Tay Slide 12

ARCH(1)-type errors possess this property:

$\varepsilon_t = h_t^{1/2} u_t$,  $h_t = \alpha_0 + \alpha_1 \varepsilon_{t-1}^2$

Rewrite the variance equation as

$\varepsilon_t^2 = \alpha_0 + \alpha_1 \varepsilon_{t-1}^2 + (\varepsilon_t^2 - h_t)$

which takes the form of an AR(1) in the squares of $\varepsilon_t$. (You can view $\varepsilon_t^2 - h_t$ as a zero-mean error term.)

Anthony Tay Slide 13

Application: Forecasting STII without ARCH

You can show: log(STII) is clearly I(1), so work with DL_STII.
The correlogram suggests an MA(1)?

Anthony Tay Slide 14
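
A rough sketch of the same pre-modelling steps in Python (statsmodels for the autocorrelations; the synthetic `stii` series below is only a stand-in for the actual STII index levels):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import acf

# Synthetic random-walk "price" series standing in for the STII levels.
rng = np.random.default_rng(3)
stii = pd.Series(1000 * np.exp(np.cumsum(0.01 * rng.standard_normal(2000))))

dl_stii = np.log(stii).diff().dropna()    # DL_STII: first difference of log levels
print(acf(dl_stii, nlags=10))             # an MA(1) shows up as a single spike at lag 1
```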

smpl @first 12/31/2007
equation eq1.ls dl_stii c ma(1)
eq1.correl

Dependent Variable: DL_STII (with residual correlogram)
Sample (adjusted): 1/07/1985 12/31/2007
Included observations: 5996 after adjustments
Convergence achieved after 6 iterations
MA Backcast: 1/04/1985

Variable   Coefficient   Std. Error   t-Statistic   Prob.
C          0.00086       0.000        1.4313        0.15
MA(1)      0.165015      0.0174       1.9516        0

R-squared 0.0449

Anthony Tay Slide 15

In-sample fit as well as the one-step ahead forecasts:

[Left: residuals, actual and fitted values of DL_STII, 1986-2007. Right: Y with one-step ahead forecasts Y_F and interval bounds Y_UP, Y_DOWN, 2004-2011.]

Anthony Tay Slide 16

The standardized residuals also show evidence of volatility clustering.

In addition to the correlogram of squared residuals, we can test for ARCH errors using Engle's LM test: estimate the mean equation, calculate the residuals $\hat{\varepsilon}_t$, regress $\hat{\varepsilon}_t^2$ on a constant and $\hat{\varepsilon}_{t-1}^2, \ldots, \hat{\varepsilon}_{t-m}^2$, and calculate $TR^2$ (the sample size times the $R^2$ of this auxiliary regression).

Anthony Tay Slide 17

This statistic follows a chi-square distribution with m degrees of freedom under the null of no conditional heteroskedasticity in the errors.

eq1.archtest

Heteroskedasticity Test: ARCH
F-statistic     574.499   Prob. F(1,5993)       0.000
Obs*R-squared   54.419    Prob. Chi-Square(1)   0.000

Test Equation: Dependent Variable: RESID^2
Variable       Coefficient   Std. Error   t-Statistic   Prob.
C              0.000         1.49E-05     8.305         0.000
RESID^2(-1)    0.96          0.0134       3.969         0.000

The test clearly rejects the null of no conditional heteroskedasticity.

Anthony Tay Slide 18
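
A sketch of Engle's LM test in Python, assuming statsmodels and scipy are available (the function name arch_lm_test and the simulated residuals are illustrative, not from the slides). Statsmodels also ships a ready-made version of this test (het_arch in statsmodels.stats.diagnostic).

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

def arch_lm_test(ehat, m=1):
    """Engle's LM test: regress e_t^2 on a constant and e_{t-1}^2, ..., e_{t-m}^2,
    then compare T*R^2 with a chi-square(m) distribution."""
    e2 = np.asarray(ehat) ** 2
    y = e2[m:]
    X = sm.add_constant(np.column_stack([e2[m - j:-j] for j in range(1, m + 1)]))
    r2 = sm.OLS(y, X).fit().rsquared
    lm_stat = len(y) * r2                          # T * R^2
    return lm_stat, stats.chi2.sf(lm_stat, df=m)

# Illustrative use on simulated ARCH(1) residuals:
rng = np.random.default_rng(4)
e, e_prev = np.empty(5000), 0.0
for t in range(5000):
    e_prev = np.sqrt(0.1 + 0.5 * e_prev ** 2) * rng.standard_normal()
    e[t] = e_prev
print(arch_lm_test(e, m=1))    # large statistic, p-value essentially zero
```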

Forecasting Volatility

Forecasting an ARCH process is not as straightforward as forecasting an AR process.

$y_t = h_t^{1/2} u_t$,  $u_t \sim \text{iid } N(0,1)$,  $h_t = \alpha_0 + \alpha_1 y_{t-1}^2$

The one-step ahead forecast of $y_{T+1}$ is simply $E[y_{T+1} \mid y_T, \ldots] = 0$.

To compute the one-step ahead forecast of the variance of $y_{T+1}$, use

$\text{var}[y_{T+1} \mid y_T, \ldots] = h_{T+1} = \alpha_0 + \alpha_1 y_T^2$

Anthony Tay Slide 19

To get the two-step ahead forecast for the variance, note that

$\text{var}[y_{T+2} \mid y_T, \ldots] = E[y_{T+2}^2 \mid y_T, \ldots] = E[h_{T+2} u_{T+2}^2 \mid y_T, \ldots] = E[h_{T+2} \mid y_T, \ldots] \, E[u_{T+2}^2] = E[h_{T+2} \mid y_T, \ldots]$

Since $h_{T+2} = \alpha_0 + \alpha_1 y_{T+1}^2$, we have

$E[h_{T+2} \mid y_T, \ldots] = \alpha_0 + \alpha_1 E[y_{T+1}^2 \mid y_T, \ldots] = \alpha_0 + \alpha_1 h_{T+1}$

so

$\text{var}[y_{T+2} \mid y_T, \ldots] = \alpha_0 + \alpha_1(\alpha_0 + \alpha_1 y_T^2) = \alpha_0(1 + \alpha_1) + \alpha_1^2 y_T^2$

Anthony Tay Slide 20

In general,

$\text{var}[y_{T+k} \mid y_T, \ldots] = \alpha_0(1 + \alpha_1 + \cdots + \alpha_1^{k-1}) + \alpha_1^k y_T^2$

As $k \to \infty$, $\text{var}[y_{T+k} \mid y_T, \ldots] \to \dfrac{\alpha_0}{1 - \alpha_1}$ as long as $\alpha_1 < 1$.

That is, the k-step ahead forecast of the variance of $y$ converges to the unconditional variance as the forecast horizon increases toward infinity.

Anthony Tay Slide 21
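
The recursion behind this formula is easy to code. A minimal sketch (illustrative parameter values; y_T stands for the last observed value):

```python
import numpy as np

def arch1_variance_forecasts(y_T, a0, a1, k_max):
    """k-step ahead variance forecasts for a pure ARCH(1):
    Var[y_{T+k} | y_T, ...] = a0*(1 + a1 + ... + a1**(k-1)) + a1**k * y_T**2."""
    out = np.empty(k_max)
    h = a0 + a1 * y_T ** 2            # k = 1
    out[0] = h
    for k in range(1, k_max):
        h = a0 + a1 * h               # Var[y_{T+k}|.] = a0 + a1 * Var[y_{T+k-1}|.]
        out[k] = h
    return out

forecasts = arch1_variance_forecasts(y_T=1.5, a0=0.1, a1=0.5, k_max=12)
print(forecasts)                       # converges toward a0/(1 - a1) = 0.2
```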

Back to DL_STII: Fitting MA(1)-ARCH(1) and Forecasting

equation eq2.arch(1,0) dl_stii c ma(1)

Dependent Variable: DL_STII
Method: ML - ARCH (Marquardt) - Normal distribution
Sample (adjusted): 1/07/1985 12/31/2007
Included observations: 5996 after adjustments

Variable   Coefficient   Std. Error   z-Statistic   Prob.
C          0.000519      0.000156     3.3519        0.0009
MA(1)      0.17774       0.006786     26.1911       0.0000

Variance Equation
C             0.000107   7.65E-07   140.959   0.0000
RESID(-1)^2   0.347641   0.011511   30.0077   0.0000

R-squared 0.0413    Mean dependent var 0.00086    Adjusted R-squared 0.0396    S.D. dependent var 0.013433    S.E. of regression 0.01371    Akaike info criterion -6.0319    Sum squared resid 1.055603    Schwarz criterion -6.0683    Log likelihood 18085.81    Hannan-Quinn criter. -6.0974    Durbin-Watson stat 2.03671    Inverted MA Roots -0.18

Anthony Tay Slide 22

There are no important changes to the estimates of the mean equation. The coefficient on $\hat{\varepsilon}_{t-1}^2$ in the variance equation is clearly significant.

The fitted standard deviation of $y_t$ is shown on the left. The fitted and actual values are shown on the right with the +/- 2 standard deviation bands.

eq2.makegarch h_hat
line h_hat^0.5
genr dl_stii_hat = dl_stii - resid
genr upp = dl_stii_hat + 2*h_hat^0.5
genr low = dl_stii_hat - 2*h_hat^0.5
line dl_stii dl_stii_hat upp low

Anthony Tay Slide 23
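
Outside EViews, roughly the same objects can be produced with the third-party Python `arch` package. The sketch below is a loose analogue rather than a replication: it uses a constant mean instead of the MA(1) used in the slides (the `arch` package does not offer an MA mean), and a synthetic return series stands in for DL_STII.

```python
import numpy as np
import pandas as pd
from arch import arch_model            # third-party package: pip install arch

# Synthetic daily-return-like series standing in for DL_STII.
rng = np.random.default_rng(5)
returns = pd.Series(0.01 * rng.standard_normal(3000), name="dl_stii")

# Constant-mean ARCH(1); returns are scaled by 100, a common arch-package convention.
res = arch_model(100 * returns, mean="Constant", vol="ARCH", p=1).fit(disp="off")

cond_sd = res.conditional_volatility / 100      # fitted sqrt(h_t), scaling undone
fitted = (100 * returns - res.resid) / 100      # fitted values of the mean equation
bands = pd.DataFrame({
    "dl_stii": returns,
    "fitted": fitted,
    "upper": fitted + 2 * cond_sd,              # analogue of genr upp = ... + 2*h_hat^0.5
    "lower": fitted - 2 * cond_sd,              # analogue of genr low = ... - 2*h_hat^0.5
})
print(bands.head())
```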

[Left: fitted conditional standard deviation H_HAT^0.5, 1986-2007. Right: DL_STII with fitted values and +/- 2 standard deviation bands.]

The one-step ahead forecasts are shown below.

[Y with one-step ahead forecasts Y_F and interval bounds Y_UP, Y_DOWN, 2004-2011.]

Anthony Tay Slide 24

There is some improvement in that we are capturing some of the changes in volatility, but clearly the model still does not properly describe and forecast volatility: there does not seem to be enough flexibility, and for long periods the intervals are still far too wide.

Anthony Tay Slide 25

How do we evaluate the fit of an ARCH model?

The $R^2$ says nothing about the volatility fit; it only tells us how well the mean equation fits the data. The $R^2$ here is actually smaller than in the pure MA(1) estimation output. With constant-plus-pure-ARCH models we will usually get negative $R^2$s (why?).

Anthony Tay Slide 26

To evaluate how well the ARCH model fits the volatility, we make use of

$\varepsilon_t = h_t^{1/2} u_t$,  $u_t \sim \text{iid } N(0,1)$

Our ARCH model produces estimates of $h_t^{1/2}$ (h_hat^0.5). We also have the residuals $\hat{\varepsilon}_t$ from the MA(1) model that was fit to the data. Compute

$\hat{u}_t = \hat{\varepsilon}_t / \hat{h}_t^{1/2}$   ("standardized residuals")

If the $\hat{h}_t$ are good estimates of the true $h_t$ and the mean equation is correctly specified, then the standardized residuals should be iid. The correlograms of the standardized residuals and of their squares should not show any dynamics:

Anthony Tay Slide 27
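
A hedged sketch of these checks in Python, using statsmodels' Ljung-Box test in place of the EViews Q-statistics (the function and the synthetic inputs are illustrative; in practice `resid` and `cond_sd` would be the mean-equation residuals and the fitted conditional standard deviation, e.g. from the earlier `arch`-package sketch):

```python
import numpy as np
import pandas as pd
from statsmodels.stats.diagnostic import acorr_ljungbox

def standardized_residual_checks(resid, cond_sd, lags=10):
    """u_hat = resid / cond_sd; Ljung-Box tests on u_hat and u_hat^2.
    If the ARCH model is adequate, neither should show significant autocorrelation."""
    u_hat = np.asarray(resid) / np.asarray(cond_sd)
    lb_levels = acorr_ljungbox(u_hat, lags=[lags], return_df=True)        # leftover mean dynamics?
    lb_squares = acorr_ljungbox(u_hat ** 2, lags=[lags], return_df=True)  # leftover volatility dynamics?
    return pd.concat({"u_hat": lb_levels, "u_hat^2": lb_squares})

# Illustrative call with synthetic stand-ins for the EViews objects:
rng = np.random.default_rng(6)
resid = 0.01 * rng.standard_normal(1000)
cond_sd = np.full(1000, 0.01)
print(standardized_residual_checks(resid, cond_sd))
```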

eq2.correl     (correlogram of standardized residuals)
eq2.correlsq   (correlogram of squared standardized residuals)

It appears the mean equation has done a good job (Q-stats on the left), whereas the volatility equation could use some improvement (again Q-stats, on the right-hand figure), although the correlations appear small.

Anthony Tay Slide 28

The histogram of the standardized residuals also tells us something useful about the model.

eq2.hist

[Histogram of the standardized residuals, sample 1/07/1985 to 12/31/2007, 5996 observations: mean -0.017997, std. dev. 0.999893, skewness -0.104807, kurtosis 18.3639; the Jarque-Bera test rejects normality with p-value 0.000000.]

The kurtosis is much smaller than for the raw data: some of the excess kurtosis was removed by standardization by $\hat{h}_t^{1/2}$. But the standardized residuals are still not normally distributed.

Anthony Tay Slide 29

GARCH Models

The ARCH(1) specification is often not rich enough.

$\varepsilon_t = h_t^{1/2} u_t$,  $u_t \sim \text{iid } N(0,1)$
$h_t = \alpha_0 + \alpha_1 \varepsilon_{t-1}^2 + \gamma_1 h_{t-1}$

Consider the variance equation for the GARCH(1,1) process. Recursively substituting backwards gives

$h_t = \alpha_0 + \alpha_1 \varepsilon_{t-1}^2 + \gamma_1 h_{t-1}$
$\;\; = \alpha_0 + \alpha_1 \varepsilon_{t-1}^2 + \gamma_1(\alpha_0 + \alpha_1 \varepsilon_{t-2}^2 + \gamma_1 h_{t-2})$
$\;\; = \cdots$
$\;\; = \alpha_0(1 + \gamma_1 + \gamma_1^2 + \cdots) + \alpha_1(\varepsilon_{t-1}^2 + \gamma_1 \varepsilon_{t-2}^2 + \gamma_1^2 \varepsilon_{t-3}^2 + \cdots)$

so a GARCH(1,1) behaves like an ARCH model with infinitely many lags and geometrically declining weights.

Anthony Tay Slide 30
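
To see how quickly those geometrically declining weights die out, here is a tiny sketch using rounded values close to the GARCH(1,1) estimates reported on the next slide (purely for illustration):

```python
import numpy as np

# Implied ARCH(infinity) weights of a GARCH(1,1): the weight on eps_{t-1-j}^2 is alpha1 * gamma1**j.
alpha1, gamma1 = 0.14, 0.84        # rounded, illustrative values
j = np.arange(12)
weights = alpha1 * gamma1 ** j
print(np.round(weights, 4))        # geometrically declining, but dying out only slowly
```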

equation eq3.arch(1,1) dl_stii c ma(1)
eq3.correlsq
eq3.hist

Dependent Variable: DL_STII
Method: ML - ARCH (Marquardt) - Normal distribution
Sample (adjusted): 1/07/1985 12/31/2007
Included observations: 5996 after adjustments
Convergence achieved after 36 iterations

Variable   Coefficient   Std. Error   z-Statistic   Prob.
C          0.001         0.000        4.71          0.000
MA(1)      0.145         0.014        10.079        0.000

Variance Equation
C             5.35E-06   2.51E-07   21.34     0.000
RESID(-1)^2   0.139      0.004      34.735    0.000
GARCH(-1)     0.838      0.004      206.811   0.000

R-squared 0.04    Adjusted R-squared 0.03    S.D. dependent var 0.013    S.E. of regression 0.013    Akaike info criterion -6.181    Sum squared resid 1.056    Schwarz criterion -6.175    Log likelihood 18534.360    Hannan-Quinn criter. -6.179    Durbin-Watson stat 1.973    Inverted MA Roots -0.150

[Histogram of the standardized residuals, 5996 observations: std. dev. 0.999539, skewness -0.987643, kurtosis 16.31053; Jarque-Bera rejects normality.]

Anthony Tay Slide 31
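
For comparison, a rough Python analogue using the third-party `arch` package (constant mean rather than MA(1), and synthetic data standing in for DL_STII, so the numbers will not match the EViews output):

```python
import numpy as np
import pandas as pd
from arch import arch_model            # third-party package: pip install arch

rng = np.random.default_rng(7)
returns = pd.Series(0.01 * rng.standard_normal(3000), name="dl_stii")   # stand-in data

# Constant-mean GARCH(1,1) on returns scaled by 100 (common arch-package convention).
res = arch_model(100 * returns, mean="Constant", vol="GARCH", p=1, q=1).fit(disp="off")
print(res.params)                       # mu, omega, alpha[1], beta[1]

# Multi-step variance forecasts from the end of the sample (undo the 100x scaling).
fc = res.forecast(horizon=10)
print(fc.variance.iloc[-1] / 100**2)
```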

The one-step ahead forecasts over the forecast sample show a substantial improvement:

[Y with one-step ahead forecasts Y_F and interval bounds Y_UP, Y_DOWN, 2004-2011.]

Anthony Tay Slide 32

GARCH-in-Mean

$y_t = \beta_0 + \beta_1 h_t + \varepsilon_t$
$\varepsilon_t = h_t^{1/2} u_t$,  $u_t \sim \text{iid } N(0,1)$
$h_t = \alpha_0 + \alpha_1 \varepsilon_{t-1}^2 + \cdots + \alpha_p \varepsilon_{t-p}^2 + \gamma_1 h_{t-1} + \cdots + \gamma_q h_{t-q}$

This allows the conditional mean of $y_t$ to be correlated with the conditional variance. In the following, the standard deviation $h_t^{1/2}$ is included as a regressor in the mean equation:

Anthony Tay Slide 33

equation eq3.arch(1,1, archm=sd) dl_stii c ma(1)

Dependent Variable: DL_STII
Method: ML - ARCH (Marquardt) - Normal distribution
Sample: 1/01/2004 10/26/2011
Included observations: 2040

Variable        Coefficient   Std. Error   z-Statistic   Prob.
@SQRT(GARCH)    0.000         0.060        0.008         0.994
C               0.001         0.001        1.37          0.185
MA(1)           0.008         0.05         0.309         0.758

Variance Equation
C             0.000   0.000   3.851    0.000
RESID(-1)^2   0.106   0.010   10.176   0.000
GARCH(-1)     0.889   0.010   86.50    0.000

R-squared -0.001    Adjusted R-squared -0.00    S.D. dependent var 0.013    S.E. of regression 0.013    Akaike info criterion -6.380    Sum squared resid 0.31    Schwarz criterion -6.363    Log likelihood 6513.38    Hannan-Quinn criter. -6.374    Durbin-Watson stat 1.968    Inverted MA Roots -0.010

Anthony Tay Slide 34