Nonlinear Parameter Estimation for State-Space ARCH Models with Missing Observations


SEBASTIÁN OSSANDÓN
Pontificia Universidad Católica de Valparaíso, Instituto de Matemáticas
Blanco Viel 596, Cerro Barón, Valparaíso, CHILE
sebastian.ossandon@ucv.cl

NATALIA BAHAMONDE
Pontificia Universidad Católica de Valparaíso, Instituto de Estadística
Blanco Viel 596, Cerro Barón, Valparaíso, CHILE
natalia.bahamonde@ucv.cl

Abstract: A new mathematical representation, based on a discrete time nonlinear state space formulation, is presented to characterize AutoRegressive Conditional Heteroskedasticity (ARCH) models. A novel parameter estimation procedure for state-space ARCH models with missing observations, based on an Extended Kalman Filter (EKF) approach, is described and successfully evaluated herein. Finally, through a comparison between the proposed estimation method and a Quasi-Maximum Likelihood Estimation (QMLE) technique based on different imputation methods, numerical results with simulated data are given which make evident the effectiveness and relevance of the proposed nonlinear estimation technique.

Key Words: ARCH models, Missing observations, Nonlinear state space model, Nonlinear estimation, Extended Kalman Filter.

1 Introduction

Autoregressive conditionally heteroskedastic (ARCH) models, introduced by [4], are often used in finance because their properties are close to the observed properties of empirical financial data, such as heavy tails, volatility clustering, white noise behavior and autocorrelation of the squared series. Financial time series often exhibit a conditional variance that changes over time, namely heteroskedasticity. The ARCH family is a class of nonlinear time series models; its generalization, the GARCH model, was introduced by [1]. The ARCH(p) model with normal errors is

    r_k = σ_k ε_k,    σ_k^2 = α_0 + Σ_{i=1}^p α_i r_{k-i}^2,    (1)

where r_k and σ_k^2 (> 0) are, respectively, the return and the volatility at discrete time k ∈ Z associated with a financial process, and {ε_k}_{k∈Z} is an i.i.d. Gaussian sequence with E(ε_k) = 0, E(ε_k ε_j) = Q δ_{kj}, and parameters α_0 > 0, α_i ≥ 0 for 1 ≤ i ≤ p, and Σ_{i=1}^p α_i < 1. Moreover, r_k is assumed to be independent of the sequence {ε_j}_{j>k}.

The goal of this research is to develop a novel nonlinear parameter estimation procedure, based on an EKF approach, for ARCH models with missing observations. The proposed EKF technique is derived from a nonlinear state space formulation of the discrete time ARCH equation (1). This method is adequate for obtaining initial conditions for a maximum-likelihood iteration, or for providing the final estimate of the parameters and the states when maximum-likelihood estimation is considered inadequate or costly.
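For concreteness, the following sketch simulates a path of the ARCH(1) special case of equation (1) and randomly flags a fraction of the observations as missing. It is illustrative only: the function name, the seed, the sample size and the use of the unconditional variance α_0/(1 − α_1) as the starting value are our assumptions, not part of the paper; the parameter values (α_0, α_1) = (0.3, 0.5) and the 5% missing rate anticipate the simulation study of Section 5.

```python
import numpy as np

def simulate_arch1(n, alpha0, alpha1, q=1.0, seed=0):
    """Simulate r_k = sigma_k * eps_k with sigma_k^2 = alpha0 + alpha1 * r_{k-1}^2 (eq. (1), p = 1)."""
    rng = np.random.default_rng(seed)
    eps = rng.normal(0.0, np.sqrt(q), size=n)   # i.i.d. Gaussian noise with variance Q = q
    r = np.empty(n)
    sigma2 = np.empty(n)
    sigma2[0] = alpha0 / (1.0 - alpha1)         # start at the unconditional variance (our choice)
    r[0] = np.sqrt(sigma2[0]) * eps[0]
    for k in range(1, n):
        sigma2[k] = alpha0 + alpha1 * r[k - 1] ** 2
        r[k] = np.sqrt(sigma2[k]) * eps[k]
    return r, sigma2

# Parameter values of the simulation study in Section 5, with 5% of observations missing at random.
r, sigma2 = simulate_arch1(n=200, alpha0=0.3, alpha1=0.5)
observed = np.random.default_rng(1).random(r.size) > 0.05   # False marks a missing observation
```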

2 Nonlinear state space formulation for ARCH models

Let us consider the following general discrete time nonlinear state space model:

    X_k = f(X_{k-1}, u_{k-1}, θ) + σ(u_{k-1}, θ) w_k,
    Y_k = h(X_k, u_k, θ) + ν_k,    (2)

where X_k ∈ R^n is the unknown state vector, u_k ∈ R^r is the known input vector, Y_k ∈ R^m is the noisy observation (output) vector of the stochastic process, and w_k ∈ R^n and ν_k ∈ R^m are, respectively, the process noise (due mainly to disturbances and modelling inaccuracies of the process) and the measurement noise (due mainly to sensor inaccuracy). Moreover, θ ∈ R^l is the parameter vector, which is generally unknown, and f(·) ∈ R^n, σ(·) ∈ R^{n×n} and h(·) ∈ R^m are nonlinear functions that characterize the stochastic system. With respect to the noises of the process, we make the following assumptions:

i) the vector w_k is Gaussian, zero-mean (E(w_k) = 0) white noise with covariance matrix E(w_k w_j^T) = Q_k δ_{kj};

ii) the vector ν_k is Gaussian, zero-mean (E(ν_k) = 0) white noise with covariance matrix E(ν_k ν_j^T) = R_k δ_{kj};

where δ_{kj} denotes the identity matrix when k = j and the zero matrix otherwise.

2.1 Case 1: State space formulation for ARCH(1)

In this case a possible state space representation of equation (1) when p = 1 is

    X_k := (X_k^{(1)}, X_k^{(2)})^T = f(X_{k-1}, θ) + B w_k,    Y_k := r_k = (X_k^{(1)})^{1/2} X_k^{(2)},    (3)

where θ = (α_0, α_1) ∈ R^2, B = (0, 1)^T ∈ R^2, X_k^{(1)} = σ_k^2 ∈ R (> 0, ∀k), X_k^{(2)} = r_k/σ_k ∈ R, w_k = ε_k, and

    f(X_{k-1}, θ) := (α_0 + α_1 X_{k-1}^{(1)} (X_{k-1}^{(2)})^2, 0)^T.    (4)

For details see [6]. The construction leading to the model of equations (3) and (4) is easily extended to ARCH models of higher order, yielding a different state space formulation that is studied in the next subsection.
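Before turning to the general case, the following minimal sketch writes the ARCH(1) representation of equations (3)-(4) as plain functions, with the state taken as x = (σ_k^2, r_k/σ_k) as above. The function names, the example values and the explicit noise loading B are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def f_arch1(x_prev, theta):
    """Transition f of eq. (4): first component is sigma_k^2, second is left to the noise."""
    alpha0, alpha1 = theta
    sigma2_prev, eps_prev = x_prev              # x = (sigma^2, r/sigma)
    return np.array([alpha0 + alpha1 * sigma2_prev * eps_prev**2, 0.0])

def h_arch1(x):
    """Observation of eq. (3): Y_k = r_k = sqrt(sigma_k^2) * (r_k / sigma_k)."""
    sigma2, eps = x
    return np.array([np.sqrt(sigma2) * eps])

B = np.array([0.0, 1.0])                        # noise loading: w_k = eps_k drives the second state

# One step of the state recursion (3): X_k = f(X_{k-1}, theta) + B * w_k.
theta = (0.3, 0.5)
x_prev, w_k = np.array([0.6, 0.2]), 0.1
x_next = f_arch1(x_prev, theta) + B * w_k
```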

2.2 Case 2: State space formulation for ARCH(p), p > 1

The following formulation is a possible state space representation of equation (1) when p > 1:

    X_k := (X_k^{(1)T}, X_k^{(2)}, X_k^{(3)})^T = A_θ X_{k-1} + f(X_{k-1}, θ) + B w_k,    Y_k := r_k = (X_k^{(2)})^{1/2} X_k^{(3)},    (5)

where θ = (α_0, α_1, ..., α_p) ∈ R^{p+1}, B = (0, 0, ..., 0, 1)^T ∈ R^{p+2}, w_k = ε_k,

    X_k^{(1)} := (r_{k-p}^2, r_{k-p+1}^2, ..., r_{k-1}^2)^T ∈ R^p,    (6)

X_k^{(2)} = σ_k^2 ∈ R (> 0, ∀k) and X_k^{(3)} = r_k/σ_k ∈ R. The nonlinear term is

    f(X_{k-1}, θ) := (0_{1×(p-1)}, X_{k-1}^{(2)} (X_{k-1}^{(3)})^2, α_0 + α_1 X_{k-1}^{(2)} (X_{k-1}^{(3)})^2, 0)^T ∈ R^{p+2}.    (7)

Moreover,

    A_θ = [ A_{p×p}     0_{p×2} ]
          [ (0, ᾱ_p)    0_{1×2} ]  ∈ R^{(p+2)×(p+2)},    (8)
          [ 0_{1×p}     0_{1×2} ]

where A_{p×p} ∈ R^{p×p} is the shift matrix with ones on the superdiagonal and zeros elsewhere,

    ᾱ_p = (α_p, α_{p-1}, ..., α_2) ∈ R^{p-1},    (9)

and 0_{m×n}, 1 ≤ m, n ≤ p, denotes the m × n matrix of zeros. Let us notice the obvious nonlinearity of all the state space representations presented in this section, due clearly to the nonlinearity of the process and observation equations.

3 The Extended Kalman Filter

Let Y_N = [y_0, y_1, y_2, ..., y_k, ..., y_N] be a known sequence of measurements or observations. The functions f and h (see equation (2)) are used to compute the predicted state and the predicted measurement from the previous state estimate. The following equation shows the computation of the predicted state from the previous estimate:

    x̂_{k|k-1} = f(x̂_{k-1|k-1}, u_{k-1}, θ).    (10)

To compute the predicted estimate covariance, a matrix A_k of partial derivatives (the Jacobian matrix) is computed first. This matrix is evaluated with the predicted states at each discrete time step and used in the KF equations; in other words, A_k is a linearized version of the nonlinear function f around the current estimate:

    P_{k|k-1} = A_{k-1} P_{k-1|k-1} A_{k-1}^T + Q_k.    (11)

After the prediction stage, we need the update equations. We have the measurement residual (innovation)

    ỹ_k = y_k − h(x̂_{k|k-1}, u_k, θ)    (12)

and the innovation covariance

    S_{k|k-1} = C_k P_{k|k-1} C_k^T + R_k,    (13)

where C_k is a linearized version of the nonlinear function h around the current estimate. The Kalman gain is given by

    K_k = P_{k|k-1} C_k^T S_{k|k-1}^{-1},    (14)

and the corresponding updates by

    x̂_{k|k} = x̂_{k|k-1} + K_k ỹ_k    (15)

and

    P_{k|k} = (I − K_k C_k) P_{k|k-1}.    (16)

The state transition and observation matrices (the linearized versions of f and h) are defined, respectively, by

    A_{k-1} = ∂f/∂X |_{x̂_{k-1|k-1}, u_{k-1}}    (17)

and

    C_k = ∂h/∂X |_{x̂_{k|k-1}}.    (18)
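To make the recursion of equations (10)-(18) concrete, here is a sketch of one EKF prediction/update cycle. It uses finite-difference Jacobians in place of the analytic matrices A_k and C_k, drops the exogenous input u_k, and reuses the illustrative helpers defined above; none of these choices is prescribed by the paper.

```python
import numpy as np

def numerical_jacobian(fun, x, eps=1e-6):
    """Finite-difference stand-in for the Jacobians of eqs. (17)-(18)."""
    fx = np.atleast_1d(fun(x))
    J = np.zeros((fx.size, x.size))
    for i in range(x.size):
        dx = np.zeros_like(x, dtype=float)
        dx[i] = eps
        J[:, i] = (np.atleast_1d(fun(x + dx)) - fx) / eps
    return J

def ekf_step(x_est, P_est, y, f, h, Q, R):
    """One EKF cycle: prediction (10)-(11), then update (12)-(16)."""
    A = numerical_jacobian(f, x_est)                          # A_{k-1}, eq. (17)
    x_pred = f(x_est)                                         # eq. (10)
    P_pred = A @ P_est @ A.T + Q                              # eq. (11)
    C = numerical_jacobian(h, x_pred)                         # C_k, eq. (18)
    y_tilde = np.atleast_1d(y) - np.atleast_1d(h(x_pred))     # innovation, eq. (12)
    S = C @ P_pred @ C.T + R                                  # innovation covariance, eq. (13)
    K = P_pred @ C.T @ np.linalg.inv(S)                       # Kalman gain, eq. (14)
    x_upd = x_pred + K @ y_tilde                              # eq. (15)
    P_upd = (np.eye(P_pred.shape[0]) - K @ C) @ P_pred        # eq. (16)
    return x_upd, P_upd, y_tilde, S
```

Applied to the ARCH(1) helpers above, one would take Q = diag(0, q), since the process noise only drives the second state, and set R to zero or a small positive value, because the observation equation (3) carries no measurement noise.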

4 Nonlinear estimation with missing observations

Given a sequence of measurements or observations Y_N, the likelihood function is given by the following joint probability density function:

    L(θ; Y_N) = p(y_0 | θ) ∏_{k=1}^{N} p(y_k | Y_{k-1}, θ).    (19)

By the central limit theorem, when the number of experiment replications becomes large, the distribution of the mean becomes approximately normal and centered at the mean of each original observation. It therefore seems reasonable to assume that, for a large number of experiment replications, the probability density functions can be approximated by Gaussian probability densities, so equation (19) can be rewritten as

    L(θ; Y_N) = p(y_0 | θ) (2π)^{-mN/2} ∏_{k=1}^{N} g(k) det(S_{k|k-1})^{-1/2},    (20)

where g(k) = exp{-0.5 ỹ_k^T S_{k|k-1}^{-1} ỹ_k}, ỹ_k is the measurement residual (innovation) defined in equation (12), ŷ_{k|k-1} = E(y_k | Y_{k-1}, θ) is the conditional mean of y_k given y_0, y_1, ..., y_{k-1} and θ, and S_{k|k-1} is the innovation covariance, defined in equation (13), conditional on y_0, y_1, ..., y_{k-1} and θ.

Conditioning on y_0, and considering the function

    l(θ) = −ln L(θ; Y_N | y_0),    (21)

the maximum likelihood estimator of θ can be obtained by solving the following nonlinear optimization problem:

    θ̂ = arg min_θ l(θ).    (22)

The state space formulation and the EKF provide a powerful tool for the analysis of data in the context of maximum likelihood estimation. Let us remark that, for a fixed θ, the values of ỹ_k and S_{k|k-1} at each discrete time step can be obtained from the Kalman filter equations described in Section 3 and subsequently used in the construction of the log-likelihood function. In this context, the success of the optimization of the log-likelihood function depends strictly on the behavior of the designed EKF.
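As a sketch of how the filter output feeds equations (20)-(22), the following function accumulates the negative log-likelihood from the EKF innovations. It also anticipates the missing-data treatment described next: time steps flagged as missing contribute only a prediction step, which matches the modified observation equation (23) up to the additive constant of equation (26). The helpers ekf_step, numerical_jacobian, f_arch1 and h_arch1 are the illustrative ones sketched earlier, and the interface is our own.

```python
import numpy as np

def neg_log_likelihood(theta, y, observed, x0, P0, Q, R, f_builder, h_fun):
    """-ln L(theta; Y_N) from eqs. (20)-(22), built from the EKF innovations."""
    f = lambda x: f_builder(x, theta)
    x_est, P_est, nll = np.asarray(x0, float), np.asarray(P0, float), 0.0
    for k in range(len(y)):
        if observed[k]:
            x_est, P_est, y_tilde, S = ekf_step(x_est, P_est, y[k], f, h_fun, Q, R)
            nll += 0.5 * (y_tilde.size * np.log(2.0 * np.pi)
                          + np.log(np.linalg.det(S))
                          + y_tilde @ np.linalg.solve(S, y_tilde))
        else:
            # Missing observation: prediction only, cf. the dummy observation of eq. (23).
            A = numerical_jacobian(f, x_est)
            x_est = f(x_est)
            P_est = A @ P_est @ A.T + Q
    return nll

# Example (using r, observed, f_arch1 and h_arch1 from the earlier sketches):
# nll = neg_log_likelihood((0.3, 0.5), r, observed, x0=np.array([0.6, 0.0]), P0=np.eye(2),
#                          Q=np.diag([0.0, 1.0]), R=np.array([[1e-6]]),
#                          f_builder=f_arch1, h_fun=h_arch1)
# Minimizing neg_log_likelihood over theta, e.g. with scipy.optimize.minimize, realizes eq. (22).
```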

First, we consider the evaluation of the Gaussian log-likelihood function based on Y_{I_r} = [y_{i_0}, y_{i_1}, ..., y_{i_r}], where I_r = [i_0, i_1, ..., i_r] are positive integers such that i_0 < i_1 < ... < i_r ≤ N. This allows for observation of the process at irregular intervals, or equivalently for the possibility that (N − r) observations are missing from the sequence Y_N. To deal with possibly irregularly spaced observations or data with missing values (see [5]), we introduce a new series {Y_k*}, related to the state {X_k} by the modified observation equation (cf. equation (2)):

    Y_k* = h_k*(X_k, u_k, θ) + ν_k*,    k = 0, 1, ...,    (23)

where

    h_k*(X_k, u_k, θ) = h(X_k, u_k, θ) if k ∈ I_r, and 0 otherwise,
    ν_k* = ν_k if k ∈ I_r, and η_k otherwise,

and η_k is an i.i.d. Gaussian, zero-mean white noise sequence with covariance matrix E(η_k η_s^T) = I_{w×w} δ_{ks} (w is the dimension of the noises ν_k and η_k at each discrete time step k). Moreover, η_s is independent of X_k, of w_k and of ν_k (s, k = 0, ±1, ...).

Let us consider the joint Gaussian probability density function L_1(θ; Y_{I_r}) and the related Gaussian log-likelihood function l_1(θ) = ln L_1(θ; Y_{I_r}) based on the measured values Y_{I_r}. From these measured values, let us define the sequence Y_N* = [y_0*, y_1*, ..., y_N*] as follows:

    y_k* = y_k if k ∈ I_r, and 0 otherwise.    (24)

Let L_2(θ; Y_N*) be the joint Gaussian probability density function based on the values Y_N* and let l_2(θ) = ln L_2(θ; Y_N*) be the related log-likelihood function. It is then clear that L_1(θ; Y_{I_r}) and L_2(θ; Y_N*) are related by

    L_1(θ; Y_{I_r}) = (2π)^{(N−r)w/2} L_2(θ; Y_N*),    (25)

and consequently l_1(θ) and l_2(θ) are related by

    l_1(θ) = ((N − r)w/2) ln(2π) + l_2(θ).    (26)

We can therefore compute the required L_1(θ; Y_{I_r}) and the corresponding l_1(θ) of the realized sequence Y_{I_r} using the Kalman approach described above and applying equations (25) and (26).

5 Numerical results

Let us notice that the treatment of missing data with the QMLE technique requires replacing the unobserved data by other values, which is known as imputation. Two probabilities 1 − q (5% and 20%) of missing observations are considered. All the experiments are based on replications of an ARCH(1) model with parameters θ = (α_0, α_1) = (0.3, 0.5). We also assume a process noise covariance Q = I_{1×1}. The sample mean and standard error of the EKF estimator θ̂ = (α̂_0, α̂_1) (the newly proposed estimator) are compared to those of estimators based on a complete data set obtained after filling in the missing observations by some imputation procedure. Different imputation methods can be applied, and we consider the following estimators obtained with established methods (see [2] and [3]): θ̂_b = (α̂_0^b, α̂_1^b), θ̂_m = (α̂_0^m, α̂_1^m) and θ̂_p = (α̂_0^p, α̂_1^p) are the QMLE estimators of θ obtained when each missing value is, respectively, (i) deleted and the observed data are bound together, (ii) replaced by the sample mean of the observed values, and (iii) replaced by the first non-missing value prior to it.
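For the comparison baselines, the three imputation schemes just described can be sketched as follows; the function and method names are our own labels, and the handling of a missing first observation in the carry-forward scheme is an assumption.

```python
import numpy as np

def impute(y, observed, method):
    """Illustrative versions of the three imputation schemes compared in Table 1."""
    y = np.asarray(y, dtype=float)
    observed = np.asarray(observed, dtype=bool)
    if method == "bind":           # (i) delete missing values and bind the observed data together
        return y[observed]
    out = y.copy()
    if method == "mean":           # (ii) replace by the sample mean of the observed values
        out[~observed] = y[observed].mean()
    elif method == "previous":     # (iii) replace by the first non-missing value prior to it
        last = y[observed][0]      # assumes at least the start of the series is observed
        for k in range(y.size):
            last = y[k] if observed[k] else last
            out[k] = last
    return out
```

Each imputed series would then be fed to a standard QMLE routine to produce the estimators θ̂_b, θ̂_m and θ̂_p that Table 1 compares against the EKF-based estimator θ̂.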

Table 1 shows the results for the simulated observations: the estimates of θ = (α_0, α_1) with the corresponding standard errors in parentheses. The proposed nonlinear estimation method performs better than the classical imputation techniques considered here. Our methodology is also readily extended to other nonlinear time series models and to nonstationary and asymmetric cases.

    1 − q                  5%                    20%
    θ̂       α̂_0          .31 (.3)              .31 (.33)
             α̂_1          .4991 (.135)          .4994 (.16)
    θ̂_b     α̂_0          .38 (1.591e-)         .334 (1.991e-)
             α̂_1          .485 (4.65e-)         .44 (5.97e-)
    θ̂_m     α̂_0          .316 (.41e-)          .33 (.6e-)
             α̂_1          .43 (4.57e-)          .33 (4.41e-)
    θ̂_p     α̂_0          .93 (1.51e-)          .67 (1.616e-)
             α̂_1          .511 (4.397e-)        .554 (4.39e-)

    Table 1: Sample mean (standard error in parentheses) of the different estimators.

6 Conclusion

This article introduces a new nonlinear state space representation to characterize ARCH(p) (p ≥ 1) time series models. An efficient numerical method, based on an EKF approach, for the nonlinear parameter estimation of ARCH processes has also been presented. The framework that we propose is valuable when the processes have unobserved values, and the estimation strategy associated with this representation allows computationally efficient parameter estimation. Let us notice that quasi-maximum likelihood methods are very accurate for estimating ARCH parameters in a complete time series; in those cases the EKF involves a more complex estimation procedure and the gain is not really significant. On the other hand, it is well known that time series with missing values present a serious problem for conventional parameter estimation methodologies such as QMLE. Studies show that, for linear time series models, the standard KF approach can easily be modified to obtain an efficient method for dealing with missing observations. A natural extension to nonlinear time series models has been proposed in this work to treat problems with missing values. The numerical results presented herein demonstrate the effectiveness of this methodology and show that it is more appropriate than the other QMLE-imputation methods used in practice (see Table 1). In conclusion, the methodology proposed in this article is an innovative and effective way to solve the problem of parameter estimation in ARCH processes with missing observations.

Acknowledgements: This research was supported by projects DII 37.397/1 and DII 37.395/1, Dirección de Investigación e Innovación, Pontificia Universidad Católica de Valparaíso.

References:

[1] Bollerslev, T. (1986), Generalized autoregressive conditional heteroskedasticity, J. Econometrics 31(3), 307-327.

[2] Bondon, P. and Bahamonde, N. (2012), Least squares estimation of ARCH models with missing observations, J. Time Ser. Anal. In press.

[3] Bose, A. and Mukherjee, K. (2003), Estimating the ARCH parameters by solving linear equations, J. Time Ser. Anal. 24(2), 127-136.

[4] Engle, R. (1982), Autoregressive conditional heteroscedasticity with estimates of the variance of United Kingdom inflation, Econometrica 50(4), 987-1007.

[5] Little, R. J. A. and Rubin, D. B. (2002), Statistical Analysis with Missing Data, Wiley Series in Probability and Statistics, Wiley-Interscience, Hoboken, NJ, second edition.

[6] Ossandón, S. and Bahamonde, N. (2011), On the Nonlinear Estimation of GARCH Models Using an Extended Kalman Filter, Lecture Notes in Engineering and Computer Science: Proceedings of The World Congress on Engineering 2011, WCE 2011, 6-8 July 2011, London, U.K., pp. 148-151.