Generalized Autoregressive Score Smoothers

Giuseppe Buccheri (Scuola Normale Superiore, Italy), Giacomo Bormetti (University of Bologna, Italy), Fulvio Corsi (University of Pisa, Italy, and City University of London, UK) and Fabrizio Lillo (University of Bologna, Italy)

Very Preliminary and Incomplete.

Abstract

Motivated by the observation that Generalized Autoregressive Score (GAS) models can be viewed as approximate filters, we introduce a new class of simple approximate smoothers for nonlinear non-Gaussian state-space models, which we name Generalized Autoregressive Score Smoothers (sGAS). The newly proposed sGAS improves on GAS filtered estimates because it uses all available observations when reconstructing the time-varying parameters. In contrast to complex and computationally demanding simulation-based methods, the sGAS has the same structure as the Kalman backward smoothing recursions but uses the score of the non-Gaussian observation density. Through an extensive Monte Carlo study, we provide evidence that the performance of the approximation is very close to that of simulation-based techniques (with average differences in mean square error below 2.5%), while requiring a significantly lower computational burden.

Keywords: GAS models, Smoothing, Kalman filter, State-space models, GARCH, MEM

1 Introduction

Observation-driven models like the GARCH model of Bollerslev (1986), where time-varying parameters are driven by functions of lagged observations, are typically viewed as data generating processes. As such, all relevant information is encoded in previous observations and there is no room for using current and future observations. However, they can also be viewed as predictive filters, since their time-varying parameters are one-step-ahead measurable. This idea was largely exploited by Daniel B. Nelson, who explored the asymptotic properties of conditional covariance estimates generated by GARCH processes under the assumption that the true data generating process is a diffusion¹; see e.g. Nelson (1992), Nelson and Foster (1994), Nelson and Foster (1995) and Nelson (1996). In particular, Nelson (1996) showed how to efficiently use information in both lagged and led GARCH residuals to estimate the unobserved components of a stochastic volatility process. However, despite the large number of observation-driven models, which allow one to filter out the time-varying parameters of the true data generating process, the literature lacks observation-driven smoothers that estimate those parameters using all available information.

In this paper we aim at filling this gap by introducing a smoothing method for a general class of observation-driven models, namely the Generalized Autoregressive Score (GAS) models of Creal et al. (2013) and Harvey (2013), also known as score-driven models. We start from the observation that the Kalman filtering and smoothing recursions for time-invariant linear Gaussian models can be re-written in terms of the score of the predictive likelihood and a set of static parameters. Since the predictive filtering recursion is known to have the same form as GAS models, the latter can be viewed as approximate filters for nonlinear non-Gaussian models. Thanks to the use of the score of the non-Gaussian density, GAS filters provide forecasting performance similar to that of correctly specified parameter-driven models, as shown by Koopman et al. (2016). Based on the same logic, we build a new class of smoothers that maintain the same simple form as the Kalman backward smoothing recursions but use the score of the non-Gaussian density. The resulting smoothing method is very general, as it can be applied to any observation density, in a similar fashion to GAS filters.

¹ The interpretation of GARCH processes as filters is well described in this statement by Nelson (1992): "Note that our use of the term estimate corresponds to its use in the filtering literature rather than the statistics literature; that is, an ARCH model with (given) fixed parameters produces estimates of the true underlying conditional covariance matrix at each point in time in the same sense that a Kalman filter produces estimates of unobserved state variables in a linear system."

We name the newly proposed methodology the Generalized Autoregressive Score Smoother (sGAS). Smoothing with the sGAS requires running a backward recursion after the standard GAS forward recursion that filters out the time-varying parameters. While going backward, the sGAS updates the filtered GAS estimates by including the effect of current and future observations, and thus leads to a more efficient reconstruction of the time-varying parameters. Since the likelihood of observation-driven models can typically be written down in closed form, sGAS smoothing is particularly advantageous from a computational point of view. In contrast, the classical theory of filtering and smoothing for nonlinear non-Gaussian models requires the use of computationally demanding simulation-based techniques (see e.g. Durbin and Koopman 2012).

GAS models have been successfully applied in the recent econometric literature. For instance, Creal et al. (2011) developed a multivariate dynamic model for volatilities and correlations using fat-tailed distributions, Oh and Patton (2017) introduced high-dimensional factor copula models based on GAS dynamics for systemic risk assessment, and Harvey and Luati (2014) described a new framework for filtering with heavy tails. Compared to other observation-driven models, GAS models are locally optimal from an information-theoretic perspective, as shown by Blasques et al. (2015).

For any GAS model, a companion sGAS can be devised to improve the estimation of time-varying parameters. The sGAS is therefore useful for offline signal reconstruction and analysis. In particular, we examine in detail the companion sGAS of some of the most popular observation-driven models, namely the GARCH model, the MEM model of Engle (2002) and Engle and Gallo (2006), and an AR(1) model with a time-varying autoregressive coefficient. By performing extensive Monte Carlo simulations of nonlinear non-Gaussian state-space models, we compare the performance of the sGAS to that of correctly specified parameter-driven models. In particular, we consider two stochastic volatility models and a stochastic intensity model. Importance sampling methods make it possible to evaluate the full likelihood of these models, and increasing the number of simulations from the importance density leads to precise parameter estimates and accurate smoothed estimates of the time-varying parameters. We also use the Quasi Maximum Likelihood (QML) method of Harvey et al. (1994) to estimate the two stochastic volatility models. Compared to correctly specified models, the losses incurred by the sGAS are very small in all the simulated scenarios and are always lower on average than 2.5% in mean square error. Moreover, the sGAS systematically outperforms the QML.

Computational times are decisively in favour of the sGAS. For instance, for the first stochastic volatility model used in the simulation, we found that smoothing with the sGAS is on average faster than smoothing with importance sampling, and similar computational advantages are observed for the remaining models.

The rest of the paper is organized as follows: Section 2 introduces the sGAS and conveys the main theoretical ideas; Section 3 describes in detail three examples of sGAS models; Section 4 shows the results of the Monte Carlo study; Section 5 concludes.

2 sGAS

In this section we discuss in detail the main theoretical ideas leading to the formulation of the sGAS. We start by observing that the classical Kalman filter and smoothing recursions for time-invariant linear Gaussian models can be re-written in an equivalent form that involves only the score of the conditional likelihood and a set of static parameters. Based on the same logic as GAS filters, which have the same form as the Kalman forward filtering recursions, we introduce the sGAS as having the same form as the Kalman backward smoothing recursions. In the nonlinear non-Gaussian case, sGAS models provide robust estimates since they use the score of the non-Gaussian density.

2.1 Kalman filtering and smoothing

Let us consider a linear Gaussian state-space representation:

y_t = Z α_t + ε_t,   ε_t ∼ NID(0, H)   (1)
α_{t+1} = c + T α_t + η_t,   η_t ∼ NID(0, Q)   (2)

where α_t is an n-dimensional column vector of state variables and y_t is an m-dimensional column vector of observations. The parameters Z, H, T and Q are system matrices of appropriate dimensions. Let F_t denote the set of observations up to time t, namely F_t = {y_1, ..., y_t}. We are interested in updating our knowledge of the underlying state variable α_t when a new observation y_t becomes available, and in predicting α_{t+1} based on the observations y_1, ..., y_t.

Thus, we define:

a_{t|t} := E[α_t | F_t],   P_{t|t} := Var[α_t | F_t]   (3)
a_{t+1} := E[α_{t+1} | F_t],   P_{t+1} := Var[α_{t+1} | F_t]   (4)

The Kalman filter computes a_{t|t}, P_{t|t}, a_{t+1} and P_{t+1} recursively. Assuming α_1 ∼ N(a_1, P_1), where a_1 and P_1 are known, for t = 1, ..., N we have (see e.g. Durbin and Koopman 2012):

v_t = y_t − Z a_t,   F_t = Z P_t Z' + H   (5)
a_{t|t} = a_t + P_t Z' F_t^{-1} v_t,   P_{t|t} = P_t − P_t Z' F_t^{-1} Z P_t   (6)
a_{t+1} = c + T a_t + K_t v_t,   P_{t+1} = T P_t (T − K_t Z)' + Q   (7)

where K_t = T P_t Z' F_t^{-1}. The log-likelihood can be computed in prediction error decomposition form, namely:

log p(y_t | F_{t−1}) = const − (1/2) ( log|F_t| + v_t' F_t^{-1} v_t )   (8)

Smoothed estimates α̂_t := E[α_t | F_N] and P̂_t := Var[α_t | F_N], N > t, can be computed through the following backward recursions:

r_{t−1} = Z' F_t^{-1} v_t + L_t' r_t,   N_{t−1} = Z' F_t^{-1} Z + L_t' N_t L_t   (9)
α̂_t = a_t + P_t r_{t−1},   P̂_t = P_t − P_t N_{t−1} P_t   (10)

where L_t = T − K_t Z, r_N = 0, N_N = 0 and t = N, ..., 1.
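For concreteness, recursions (5)-(10) can be coded in a few lines. The following Python/NumPy sketch is our own illustration for the univariate case with scalar system matrices; it is not the implementation used in the paper, and the function name and interface are ours.

```python
import numpy as np

def kalman_filter_smoother(y, Z, H, c, T, Q, a1, P1):
    """Univariate Kalman filter (eqs 5-7) and smoother (eqs 9-10)."""
    n = len(y)
    a = np.empty(n + 1); P = np.empty(n + 1)      # predicted a_t, P_t
    att = np.empty(n); Ptt = np.empty(n)          # filtered a_{t|t}, P_{t|t}
    v = np.empty(n); F = np.empty(n); K = np.empty(n)
    a[0], P[0] = a1, P1
    for t in range(n):
        v[t] = y[t] - Z * a[t]                    # prediction error, eq (5)
        F[t] = Z * P[t] * Z + H                   # prediction error variance
        att[t] = a[t] + P[t] * Z * v[t] / F[t]    # updating step, eq (6)
        Ptt[t] = P[t] - P[t] * Z * Z * P[t] / F[t]
        K[t] = T * P[t] * Z / F[t]                # Kalman gain
        a[t + 1] = c + T * a[t] + K[t] * v[t]     # prediction step, eq (7)
        P[t + 1] = T * P[t] * (T - K[t] * Z) + Q
    # backward smoothing recursions, eqs (9)-(10)
    alpha_hat = np.empty(n); V_hat = np.empty(n)
    r, N = 0.0, 0.0
    for t in range(n - 1, -1, -1):
        L = T - K[t] * Z
        r = Z * v[t] / F[t] + L * r
        N = Z * Z / F[t] + L * N * L
        alpha_hat[t] = a[t] + P[t] * r
        V_hat[t] = P[t] - P[t] * N * P[t]
    return att, a[:-1], alpha_hat, V_hat
```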

2.2 A more general representation

We now re-write the Kalman filter and smoothing recursions in an equivalent but more general representation. To this end, let us consider the score of the log-likelihood (8) with respect to a_t:

∇_t = [ ∂ log p(y_t | F_{t−1}) / ∂a_t ]'   (11)

By computing the derivative we obtain:

∇_t = [ ∂ log p(y_t | F_{t−1}) / ∂v_t · ∂v_t / ∂a_t ]' = [ v_t' F_t^{-1} Z ]' = Z' F_t^{-1} v_t   (12)

The information matrix is computed as:

I_{t|t−1} = E_{t−1}[ ∇_t ∇_t' ] = Z' F_t^{-1} Z   (13)

Thus, we can re-write the recursions for a_{t|t} and a_{t+1} as:

a_{t|t} = a_t + P_t ∇_t   (14)
a_{t+1} = c + T a_t + T P_t ∇_t   (15)

and the backward recursion for α̂_t as:

r_{t−1} = ∇_t + L_t' r_t   (16)
α̂_t = a_t + P_t r_{t−1}   (17)

where L_t = T − T P_t I_{t|t−1}. Since the system matrices are constant, a steady state exists and P_t converges to the fixed point P̄ of its iteration in a few steps (see e.g. Durbin and Koopman 2012). By defining R := T P̄ and Ī := Z' (Z P̄ Z' + H)^{-1} Z, we can re-write the Kalman filter and smoother recursions for the mean in the steady state as:

a_{t|t} = a_t + T^{-1} R ∇_t   (18)
a_{t+1} = c + T a_t + R ∇_t   (19)

and

r_{t−1} = ∇_t + L̄' r_t   (20)
α̂_t = a_t + T^{-1} R r_{t−1}   (21)

where L̄ = T − R Ī. The new Kalman filter and smoothing recursions for the mean are re-parametrized in terms of the score ∇_t. This representation is equivalent to the one in equations (5)-(7) and (9)-(10). However, it is more general in the sense that it relies only on the predictive density p(y_t | F_{t−1}). In principle, the forward recursions (18)-(19) and the backward recursions (20)-(21) can be applied to any observation-driven model for which a predictive density p(y_t | F_{t−1}) is defined.
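To make the score-based representation concrete, the sketch below (again our own illustration, under the same univariate scalar-matrix assumptions) obtains the steady-state variance P̄ by iterating the Riccati recursion in (7) and then runs the forward recursions (18)-(19) and the backward recursions (20)-(21). Once P_t has converged, the filtered and smoothed means coincide with those of the standard recursions in the previous sketch.

```python
import numpy as np

def score_form_recursions(y, Z, H, c, T, Q, a1):
    """Steady-state Kalman recursions re-parametrized in terms of the
    score grad_t = Z * v_t / F (eqs 18-21), univariate case."""
    # obtain the steady-state variance P_bar by iterating eq (7)
    P = 1.0
    for _ in range(1000):
        F = Z * P * Z + H
        K = T * P * Z / F
        P = T * P * (T - K * Z) + Q
    P_bar = P
    F_bar = Z * P_bar * Z + H
    R = T * P_bar                       # R := T * P_bar
    I_bar = Z * Z / F_bar               # steady-state information matrix
    L_bar = T - R * I_bar               # L_bar := T - R * I_bar
    n = len(y)
    a = np.empty(n + 1); att = np.empty(n); grad = np.empty(n)
    a[0] = a1
    for t in range(n):                  # forward score-driven recursions
        grad[t] = Z * (y[t] - Z * a[t]) / F_bar      # score of eq (8)
        att[t] = a[t] + P_bar * grad[t]              # eq (18): T^{-1} R = P_bar
        a[t + 1] = c + T * a[t] + R * grad[t]        # eq (19)
    alpha_hat = np.empty(n)
    r = 0.0
    for t in range(n - 1, -1, -1):      # backward recursions, eqs (20)-(21)
        r = grad[t] + L_bar * r
        alpha_hat[t] = a[t] + P_bar * r
    return att, a[:-1], alpha_hat
```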

2.3 sGAS specification

Note that the predictive filter (19) has an autoregressive structure and is driven by the score of the conditional likelihood. Thus, if one looks at GAS models as filters, it turns out that the GAS is correctly specified in the case of linear Gaussian state-space models. In the case of nonlinear non-Gaussian state-space models, it can be regarded as an approximate filter that provides robust estimates, as it uses the score of the non-Gaussian observation density. Indeed, as shown by Koopman et al. (2016), the GAS has predictive accuracy similar to that of correctly specified parameter-driven models while providing large computational gains. The main advantage is that the likelihood can be written in closed form and standard quasi-Newton techniques can be employed for optimization.

Based on the same principle, we introduce an approximate smoother that estimates time-varying parameters using all the available observations. As the GAS filter is correctly specified in the case of linear Gaussian state-space models, we define our smoother so that it has the same structure as the Kalman smoother in the linear Gaussian case, while in the case of nonlinear non-Gaussian state-space models it maintains the same simple form but uses the score of the non-Gaussian observation density.

Let us assume that the observations y_t ∈ R^n, t = 1, ..., N, are generated by the following predictive density:

y_t | f_t ∼ p(y_t | f_t, Θ)   (22)

where f_t ∈ R^k is a vector of time-varying parameters and Θ is a vector of static parameters. We generalize the filtering and smoothing recursions (18)-(21) to the prediction density p(y_t | f_t, Θ) as:

f_{t|t} = f_t + B^{-1} A s_t   (23)
f_{t+1} = ω + A s_t + B f_t   (24)

for t = 1, ..., N, and:

r_{t−1} = s_t + (B − A)' r_t   (25)
f̂_t = f_t + B^{-1} A r_{t−1}   (26)

where r_N = 0 and t = N, ..., 1. The predictive filter in equation (24) has the same form as a GAS filter. The term s_t := S_t ∇_t, with ∇_t := ∂ log p(y_t | f_t, Θ) / ∂f_t, is the scaled score of the predictive likelihood.

As discussed by Creal et al. (2013), the most common choice for the scaling matrix is S_t = I_{t|t−1}^{-1}, where I_{t|t−1} := E_{t−1}[ ∇_t ∇_t' ] is the information matrix. The vector ω ∈ R^k and the two matrices A, B ∈ R^{k×k} are static parameters included in Θ, which are estimated by maximizing the log-likelihood, namely:

Θ̂ = argmax_Θ Σ_{t=1}^{N} log p(y_t | f_t, Θ)   (27)

Thus, one can run the backward smoothing recursions (25)-(26) after estimating Θ and computing the forward filtering recursions (23)-(24), in a similar fashion to the Kalman filter and its backward recursions. Compared to the latter, we use s_t in place of ∇_t in order to correct for the curvature of the log-likelihood, as typically done in GAS models. The score in equation (24) is now multiplied by A I_{t|t−1}^{-1}, and therefore the term L̄ = T − R Ī in equation (20) generalizes to B − (A I_{t|t−1}^{-1}) I_{t|t−1} = B − A. That is, the information matrix I_{t|t−1} disappears because its effect is already taken into account when scaling the score. We term the approximate smoother obtained through recursions (25)-(26) the Generalized Autoregressive Score Smoother (sGAS).
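The following Python sketch illustrates the generic forward recursion (24) and backward recursions (25)-(26) for a univariate time-varying parameter, given a user-supplied scaled-score function; it is an illustration of ours rather than the authors' code, and all names and parameter values are hypothetical.

```python
import numpy as np

def sgas_filter_smoother(y, omega, A, B, f1, scaled_score):
    """Forward GAS recursion (24) and backward sGAS recursions (25)-(26),
    univariate case.  scaled_score(y_t, f_t) must return s_t = S_t * grad_t."""
    n = len(y)
    f = np.empty(n + 1); s = np.empty(n)
    f[0] = f1
    for t in range(n):                       # forward filtering
        s[t] = scaled_score(y[t], f[t])
        f[t + 1] = omega + A * s[t] + B * f[t]      # eq (24)
    f_hat = np.empty(n)
    r = 0.0                                  # r_N = 0
    for t in range(n - 1, -1, -1):           # backward smoothing
        r = s[t] + (B - A) * r               # eq (25)
        f_hat[t] = f[t] + (A / B) * r        # eq (26), B^{-1} A r_{t-1}
    return f[:-1], f_hat

# example: Gaussian predictive density with time-varying mean f_t and unit
# variance, for which the scaled score is simply the prediction error
filtered, smoothed = sgas_filter_smoother(
    y=np.random.randn(500), omega=0.0, A=0.3, B=0.95, f1=0.0,
    scaled_score=lambda y_t, f_t: y_t - f_t)
```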

3 Examples of sGAS models

In this section we show some examples of sGAS models. We compare filtered estimates obtained through the GAS to the corresponding smoothed estimates obtained through the companion sGAS model. We focus on three time-varying parameter models that are quite popular in the econometric literature, namely the GARCH model of Bollerslev (1986), the multiplicative error model (MEM) of Engle (2002) and Engle and Gallo (2006), and an AR(1) model with a time-varying autoregressive coefficient. In all three cases we first estimate the static parameters, compute filtered estimates through the forward recursion (24), and finally compute smoothed estimates by running the backward recursions (25)-(26).

Example 1: sGARCH. Consider the model:

y_t = σ_t ε_t,   ε_t ∼ NID(0, 1)   (28)

The predictive density is thus:

p(y_t | σ_t²) = (1 / √(2π σ_t²)) exp( −y_t² / (2σ_t²) )   (29)

Setting f_t = σ_t² and S_t = I_{t|t−1}^{-1}, equation (24) reduces to the GARCH(1,1) model:

f_{t+1} = c + A (y_t² − f_t) + B f_t   (30)

while the smoothing recursions (25)-(26) reduce to:

r_{t−1} = y_t² − f_t + (B − A) r_t   (31)
f̂_t = f_t + B^{-1} A r_{t−1}   (32)

for t = N, ..., 1.
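A compact sketch of the sGARCH recursions (30)-(32) on simulated data follows; this is our own illustrative code, and the parameter values (as well as the sinusoidal variance pattern) are arbitrary choices, not those used for Figure 1.

```python
import numpy as np

def sgarch_filter_smoother(y, c, A, B, f1):
    """GARCH(1,1) forward filter (30) and sGAS backward smoother (31)-(32)."""
    n = len(y)
    f = np.empty(n + 1)
    f[0] = f1
    for t in range(n):                           # forward: filtered variances
        f[t + 1] = c + A * (y[t] ** 2 - f[t]) + B * f[t]
    f_hat = np.empty(n)
    r = 0.0
    for t in range(n - 1, -1, -1):               # backward: smoothed variances
        r = y[t] ** 2 - f[t] + (B - A) * r       # eq (31)
        f_hat[t] = f[t] + (A / B) * r            # eq (32)
    return f[:-1], f_hat

# simulate y_t = sigma_t * eps_t with a slowly varying variance pattern and
# compare filtered and smoothed mean square errors (arbitrary parameters)
rng = np.random.default_rng(0)
n = 4000
sigma2_true = 1.0 + 0.5 * np.sin(2 * np.pi * np.arange(n) / 500)
y = np.sqrt(sigma2_true) * rng.standard_normal(n)
filtered, smoothed = sgarch_filter_smoother(y, c=0.03, A=0.05, B=0.97, f1=1.0)
mse_filtered = np.mean((filtered - sigma2_true) ** 2)
mse_smoothed = np.mean((smoothed - sigma2_true) ** 2)
```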

Example 2: sMEM. Consider the model:

y_t = µ_t ε_t   (33)

where ε_t has a gamma distribution with density p(ε_t | α) = Γ(α)^{-1} ε_t^{α−1} α^α e^{−α ε_t}. The predictive density is thus given by:

p(y_t | µ_t, α) = Γ(α)^{-1} y_t^{α−1} α^α µ_t^{−α} e^{−α y_t / µ_t}   (34)

Setting f_t = µ_t and S_t = I_{t|t−1}^{-1}, equation (24) reduces to the MEM(1,1) model:

f_{t+1} = c + A (y_t − f_t) + B f_t   (35)

while the smoothing recursions (25)-(26) reduce to:

r_{t−1} = y_t − f_t + (B − A) r_t   (36)
f̂_t = f_t + B^{-1} A r_{t−1}   (37)

for t = N, ..., 1.

Example 3: sAR(1). Consider the model:

y_t = c + α_t y_{t−1} + ε_t,   ε_t ∼ N(0, q²)   (38)

The predictive density is thus given by:

p(y_t | α_t) = (1 / √(2π q²)) exp[ −(1/2) ( (y_t − c − α_t y_{t−1}) / q )² ]   (39)

Setting f_t = α_t and S_t = I_{t|t−1}^{-1}, equation (24) reduces to:

f_{t+1} = c + A (y_t − c − f_t y_{t−1}) / y_{t−1} + B f_t   (40)

while the smoothing recursions (25)-(26) reduce to:

r_{t−1} = (y_t − c − f_t y_{t−1}) / y_{t−1} + (B − A) r_t   (41)
f̂_t = f_t + B^{-1} A r_{t−1}   (42)

for t = N, ..., 1.
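The time-varying AR(1) case can be sketched analogously; the snippet below is our own illustration of recursions (40)-(42), with our indexing conventions (the recursion starts from the second observation, since the score at time t involves y_{t−1}).

```python
import numpy as np

def sar1_filter_smoother(y, c, A, B, f1):
    """Time-varying AR(1) coefficient: forward recursion (40) and
    backward smoothing recursions (41)-(42).  The scaled score at time t
    involves a division by y_{t-1}, so it is undefined when y_{t-1} = 0."""
    n = len(y)
    f = np.full(n + 1, f1)          # f[t] is the filtered coefficient alpha_t
    s = np.zeros(n)
    for t in range(1, n):           # forward pass, starting at the second obs
        s[t] = (y[t] - c - f[t] * y[t - 1]) / y[t - 1]
        f[t + 1] = c + A * s[t] + B * f[t]
    f_hat = f[:n].copy()
    r = 0.0
    for t in range(n - 1, 0, -1):   # backward pass
        r = s[t] + (B - A) * r                  # eq (41)
        f_hat[t] = f[t] + (A / B) * r           # eq (42)
    return f[:n], f_hat
```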

We simulate N = 4000 observations with different dynamic patterns for σ_t², µ_t and α_t. Figures 1, 2 and 3 show filtered and smoothed estimates of the time-varying parameters obtained through the three models. As expected, sGAS estimates are less noisy than filtered GAS estimates and are characterized by lower mean square errors (MSE). For instance, the average MSE of sGARCH estimates in Figure 1 is times lower than that of GARCH estimates, and similar values are obtained for the other models.

4 Monte Carlo analysis

In this section we perform extensive Monte Carlo simulations to test the performance of the sGAS under different dynamic specifications for the time-varying parameters. Since we interpret the sGAS as an approximate smoother for parameter-driven models, we compare its performance to that of correctly specified parameter-driven models. The main idea is to examine the extent to which the approximation leads to results similar to those of correctly specified parameter-driven models. In this case, the use of the sGAS would be particularly advantageous, as the likelihood can be written in closed form and smoothing is performed through a simple backward recursion. Thus, the computational burden would be much lower than that required for parameter-driven models, where computationally demanding simulation-based techniques are employed for both estimation and smoothing. This analysis is similar to that of Koopman et al. (2016), who compared the GAS to correctly specified parameter-driven models and found that the two classes of models have similar predictive accuracy, with very small average losses. We find a similar result for sGAS models.

4.1 Linear Gaussian models

As a first step, we consider an AR(1) plus noise model:

y_t = α_t + ε_t,   ε_t ∼ N(0, H)   (43)
α_{t+1} = γ + T α_t + η_t,   η_t ∼ N(0, Q)   (44)

The signal-to-noise ratio is defined as δ := Q/H and the constant is chosen as γ = . The model is linear and Gaussian, and thus the classical Kalman recursions can be applied to obtain smoothed estimates. To apply the sGAS, we consider a Gaussian prediction density:

p(y_t | f_t; σ²) = (1 / √(2π σ²)) exp[ −(y_t − f_t)² / (2σ²) ]   (45)

Setting S_t = I_{t|t−1}^{-1}, equation (24) reduces to:

f_{t+1} = c + A (y_t − f_t) + B f_t   (46)

while the smoothing recursions (25)-(26) reduce to:

r_{t−1} = y_t − f_t + (B − A) r_t   (47)
f̂_t = f_t + B^{-1} A r_{t−1}   (48)

for t = N, ..., 1. We generate 1000 time series of 4000 observations. We use the first 2000 observations for estimation and the last 2000 for testing purposes. Since the sGAS has the same form as the Kalman smoother backward recursions, we expect the two methods to provide very similar results. Indeed, this is confirmed by the results in Table 1, which compares, for a wide range of autoregressive coefficients T and signal-to-noise ratios δ, the average MSE and MAE of GAS and sGAS estimates to those obtained through the Kalman filtering and smoothing recursions. The GAS provides the same results as the Kalman filter and the sGAS provides the same results as the Kalman smoother, confirming that the two methods are equivalent for linear Gaussian models.
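The static parameters of the Gaussian observation-driven model (45)-(48) can be estimated by maximizing the closed-form likelihood in equation (27) with a standard quasi-Newton optimizer. The following sketch is our own illustration of this step (not the authors' code); it uses scipy.optimize.minimize, and the starting values and simulated parameters are arbitrary.

```python
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, y):
    """Negative Gaussian log-likelihood of the filter (46), cf. eq (27)."""
    c, A, B, log_sigma2 = params
    sigma2 = np.exp(log_sigma2)          # keep the variance positive
    f = np.mean(y)                       # initialize the filter at the sample mean
    ll = 0.0
    for y_t in y:
        ll += -0.5 * (np.log(2 * np.pi * sigma2) + (y_t - f) ** 2 / sigma2)
        f = c + A * (y_t - f) + B * f    # forward recursion (46)
    return -ll

def estimate_sgas_gaussian(y, start=(0.0, 0.2, 0.9, 0.0)):
    """Quasi-Newton maximization of the closed-form likelihood."""
    res = minimize(neg_loglik, np.array(start), args=(y,), method="L-BFGS-B")
    return res.x, -res.fun

# usage on a simulated AR(1)-plus-noise series (arbitrary parameter values)
rng = np.random.default_rng(1)
n, T_coef, Q, H = 2000, 0.95, 0.1, 1.0
alpha = np.zeros(n); y = np.zeros(n)
for t in range(n - 1):
    y[t] = alpha[t] + rng.normal(0.0, np.sqrt(H))
    alpha[t + 1] = 0.01 + T_coef * alpha[t] + rng.normal(0.0, np.sqrt(Q))
y[-1] = alpha[-1] + rng.normal(0.0, np.sqrt(H))
theta_hat, loglik = estimate_sgas_gaussian(y)
```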

12 4.2 Linear non-gaussian models We add non-gaussianity to the previous model by considering a t-distributed measurement error. The new model reads: y t = α t + ɛ t, ɛ t t(0, H, ν) (49) α t+1 = γ + T α t + η t, η t N(0, Q) (50) We choose γ = 0.01 and T = The corresponding observation driven model has a t-distributed predictive density: p(y t f t ; ϕ, β) = Γ[(β + 1)/2] Γ(β/2)ϕ πβ [1 + (y ] t f t ) 2 (β+1)/2 (51) βϕ 2 Setting S t = I 1 t t 1, equation (24) reduces to (see e.g. Harvey 2013): while the smoothing recursions (25), (26) reduce to: f t+1 = c + A(β + 3) y t f t ( ) 2 + Bf t (52) β + y t f t ϕ r t 1 = (β + 3) y t f t ( ) 2 + (B A) r t (53) β + y t f t ϕ ˆf t = f t + B 1 Ar t 1 (54) t = N,..., 1. We compare standard Kalman filtered and smoothed estimates with GAS and sgas estimates. The simulation setting is the same as the one in paragraph (4.1). Table (2) shows relative MSE and MAE for different values of ν. In contrast to the previous case, GAS and sgas provide now better estimates than standard Kalman filter and smoother. In particular, we observe large differences for low values of ν, where the t-distribution strongly deviates from the Gaussian and for low values of δ, at which accounting for the non-gaussianity of the measurement error becomes more important. Note that the gains of sgas over Kalman smoother estimates are larger than the gains of GAS over the Kalman filter for low ν and δ. These results confirm the ability of the sgas to provide robust smoothed estimates of time-varying parameters to the same extent as the GAS provides robust filtered estimates of time-varying parameters in presence of a non-gaussian prediction density. 12

13 4.3 Nonlinear non-gaussian models We now examine the behaviour of the sgas in presence of nonlinear non-gaussian parameter-driven models. In particular, we consider the following three specifications, which are quite popular in the econometric literature: 1. Stochastic volatility model with Gaussian measurement density: r t = σe 0.5θt ɛ t, ɛ t N(0, 1) θ t+1 = γ + φθ t + η t, η t N(0, σ 2 η) 2. Stochastic volatility with non-gaussian measurement density: r t = σe 0.5θt ɛ t, ɛ t t(0, 1, ν) θ t+1 = γ + φθ t + η t, η t N(0, σ 2 η) 3. Stochastic intensity model with Poisson measurement density: t e λt p(y t λ t ) = λyt y t!, θ t = log λ t θ t+1 = γ + φθ t + η t, η t N(0, σ 2 η) Harvey et al. (1994) proposed a Quasi Maximum Likelihood method (QML) to estimate the stochastic volatility model 1 and 2 based on a Gaussian quasi-likelihood that is obtained through linearization. As the linearized model is a assumed to be normal, it is susceptible of treatment with the Kalman filter and smoother and thus the method can be viewed as providing approximate filtered and smoothed estimates. As such, it is interesting to compare the performance of the QML to that of the sgas. Sandmann and Koopman (1998) devised a Monte-Carlo approach based on importance sampling to evaluate the full likelihood function. We estimate the two stochastc volatility models by employing the same importance sampling approach but use the recently developed Numerically Accelerated Importance Sampling (NAIS) technique of Koopman et al. (2015) to choose the parameters of the importance density. This method has been shown to provide several efficiency gains compared to existing approaches. The stochastic intesity model can also be estimated through importance sampling, as described e.g. by Durbin and Koopman (1997). Similarly to the previous 13

More details on importance sampling techniques for nonlinear non-Gaussian state-space models can be found in Durbin and Koopman (2012). We choose the predictive densities of the corresponding observation-driven models as indicated below.

1. For the two stochastic volatility models:

   p(y_t | f_t) = Γ[(β + 1)/2] / ( Γ(β/2) √(π β ϕ² e^{f_t}) ) [ 1 + y_t² / (β ϕ² e^{f_t}) ]^{−(β+1)/2}   (55)

2. For the stochastic intensity model:

   p(y_t | f_t) = e^{−e^{f_t}} e^{f_t y_t} / y_t!   (56)

The use of a t distribution for the first model is due to the fact that even Gaussian stochastic volatility models are able to generate a predictive density with fat tails and overdispersion (see e.g. Carnero et al. 2004). Thus, in order for the observation-driven model to capture these features, we adopt a more flexible specification for the predictive density. This is in line with Koopman et al. (2016), who compared GAS models to correctly specified parameter-driven models.

In the case of the predictive density in equation (55), setting S_t = I_{t|t−1}^{-1}, the filtering recursion (24) reduces to (see e.g. Harvey 2013):

f_{t+1} = c + A ((β + 3)/β) [ ((β + 1)/β) ( y_t / (ϕ e^{0.5 f_t}) )² / ( 1 + (1/β) ( y_t / (ϕ e^{0.5 f_t}) )² ) − 1 ] + B f_t   (57)

while the smoothing recursions (25)-(26) reduce to:

r_{t−1} = ((β + 3)/β) [ ((β + 1)/β) ( y_t / (ϕ e^{0.5 f_t}) )² / ( 1 + (1/β) ( y_t / (ϕ e^{0.5 f_t}) )² ) − 1 ] + (B − A) r_t   (58)
f̂_t = f_t + B^{-1} A r_{t−1}   (59)

In the case of the predictive density in equation (56), setting S_t = I_{t|t−1}^{-1}, the filtering recursion (24) reduces to:

f_{t+1} = c + A ( e^{−f_t} y_t − 1 ) + B f_t   (60)

while the smoothing recursions (25)-(26) reduce to:

r_{t−1} = e^{−f_t} y_t − 1 + (B − A) r_t   (61)
f̂_t = f_t + B^{-1} A r_{t−1}   (62)
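For the stochastic intensity case, the scaled score e^{−f_t} y_t − 1 in (60)-(61) yields a particularly simple pair of recursions; the following Python fragment is our own illustration with arbitrary parameter values.

```python
import numpy as np

def poisson_sgas(y, c, A, B, f1):
    """Stochastic-intensity sGAS: forward filter (60), smoother (61)-(62).
    f_t is the filtered log-intensity, so exp(f_t) estimates lambda_t."""
    n = len(y)
    f = np.full(n + 1, f1); s = np.zeros(n)
    for t in range(n):
        s[t] = np.exp(-f[t]) * y[t] - 1.0        # scaled score of eq (56)
        f[t + 1] = c + A * s[t] + B * f[t]       # eq (60)
    f_hat = np.empty(n); r = 0.0
    for t in range(n - 1, -1, -1):
        r = s[t] + (B - A) * r                   # eq (61)
        f_hat[t] = f[t] + (A / B) * r            # eq (62)
    return f[:n], f_hat

# usage on counts simulated from the stochastic intensity model of Section 4.3
rng = np.random.default_rng(7)
n, gamma, phi, sigma_eta = 4000, 0.0, 0.95, 0.2
theta = np.zeros(n)
for t in range(n - 1):
    theta[t + 1] = gamma + phi * theta[t] + sigma_eta * rng.standard_normal()
y = rng.poisson(np.exp(theta))
filtered, smoothed = poisson_sgas(y, c=0.0, A=0.1, B=0.95, f1=0.0)
```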

Importance sampling is implemented with S = 200 simulations and we also use control variates, as described by Koopman et al. (2015). In all experiments we generate 1000 time series of 4000 observations. The first 2000 observations are used for estimation, while the last 2000 are used for testing purposes. Figure 4 compares the smoothed estimates obtained through the NAIS and the sGAS for the three models at hand. A simple visual inspection shows that the estimates provided by the two methods are very close. In order to examine the differences between sGAS and NAIS estimates in more detail, Tables 3, 4 and 5 show the results of the Monte Carlo experiments for the three models. In the case of the stochastic volatility models, we consider different scenarios where we vary the autoregressive coefficient φ and the coefficient of variation CV, defined as in Sandmann and Koopman (1998):

CV := exp( σ²_η / (1 − φ²) ) − 1   (63)

Note that, as CV increases, the signal-to-noise ratio increases. The values of φ, CV and the remaining parameters are chosen to be close to those estimated on real financial time series, as discussed in Durbin and Koopman (1997). For the stochastic intensity model, we consider scenarios characterized by different autoregressive coefficients φ and different values of the signal variance σ²_η.

In the case of the two stochastic volatility models, the sGAS largely outperforms the QML in all scenarios. The performance of the latter tends to worsen as CV decreases, consistent with the fact that the non-Gaussianity of the measurement equation becomes more important. Compared to the NAIS, the relative MSE loss of the sGAS is very small. In particular, it is always less than 2% in the Gaussian case and always lower than 2.5% in the non-Gaussian case. In contrast to the QML, larger losses are observed for large values of CV, where the signal-to-noise ratio is larger and non-Gaussianity is less relevant. It is worth emphasizing that further increasing the number of simulations S in the NAIS does not lead to significant improvements over the sGAS. In the case of the stochastic intensity model, we observe a similar behaviour, but the relative MSE loss is slightly larger for φ = 0.98 and σ²_η = 0.01, where it is found to be around 5%. Overall, average MSE losses are less than 2.5% when averaged across all scenarios.

This result is in agreement with what Koopman et al. (2016) found by comparing the prediction performance of the GAS to that of correctly specified parameter-driven models. Finally, it is interesting to look at computational times. In the case of the stochastic volatility model with Gaussian measurement density, the estimation time of the NAIS with S = 200 simulations is on average times larger than that of the corresponding observation-driven model, while the smoothing time is times larger. Similar values are obtained for the remaining models. Thus, using the sGAS makes it possible to obtain smoothed estimates which are very close to those of correctly specified parameter-driven models while considerably reducing the required computational burden.

5 Conclusions

In this paper we have introduced the sGAS, a new class of approximate smoothers for nonlinear non-Gaussian state-space models. The sGAS is based on interpreting observation-driven models, and in particular GAS models, as filters rather than data generating processes. As such, current and future observations can be used to improve GAS filtered estimates of the time-varying parameters. The sGAS has the same structure as the Kalman backward smoothing recursions for linear Gaussian state-space models but uses the score of the observation density at hand. Thus, it is particularly advantageous from a computational point of view and can be applied to any observation density, in a similar fashion to GAS models. We have examined three examples of sGAS models corresponding to popular observation-driven models, namely the GARCH, the MEM and an AR(1) model with a time-varying autoregressive coefficient. The sGAS updates GAS estimates based on all available observations and thus leads to more efficient estimates. As such, it is useful for signal reconstruction and analysis. Extensive Monte Carlo simulations of nonlinear non-Gaussian state-space models show that sGAS estimates are very similar to those of correctly specified parameter-driven models; indeed, the losses are always smaller on average than 2.5%. At the same time, the sGAS is more appealing from a computational point of view, being much faster than the simulation-based techniques that are employed to estimate nonlinear non-Gaussian state-space models.

Table 1: Average MSE and MAE of GAS filtered estimates (relative to the Kalman filter) and sGAS smoothed estimates (relative to the Kalman smoother), for autoregressive coefficients T = 0.90, 0.95, 0.98 and a range of signal-to-noise ratios δ.

Table 2: Average MSE and MAE of GAS filtered estimates (relative to the Kalman filter) and sGAS smoothed estimates (relative to the Kalman smoother), for degrees of freedom ν = 3, 5, 8 and a range of signal-to-noise ratios δ.

Table 3: Average MSE and MAE of NAIS, sGAS and QML smoothed estimates, normalized by the NAIS loss, for the stochastic volatility model with Gaussian measurement density; rows correspond to φ = 0.98, 0.95, 0.90 and columns to different values of CV.

Table 4: Average MSE and MAE of NAIS, sGAS and QML smoothed estimates, normalized by the NAIS loss, for the stochastic volatility model with non-Gaussian measurement density; rows correspond to φ = 0.98, 0.95, 0.90 (with ν = 3) and columns to different values of CV.

Table 5: Average MSE and MAE of NAIS and sGAS smoothed estimates, normalized by the NAIS loss, for the stochastic intensity model with Poisson measurement density; rows correspond to φ = 0.98, 0.95, 0.90 and columns to different values of σ²_η.

Figure 1: Comparison among the simulated (black line), filtered (blue dotted) and smoothed (red line) variance σ_t² of the GARCH(1,1) model.

Figure 2: Comparison among the simulated (black line), filtered (blue dotted) and smoothed (red line) mean µ_t of the MEM(1,1) model.

Figure 3: Comparison among the simulated (black line), filtered (blue dotted) and smoothed (red line) autoregressive coefficient α_t of the AR(1) model.

Figure 4: Comparison among simulated unobserved components (black dotted), NAIS smoothed estimates (red dashed) and sGAS smoothed estimates (blue dotted and dashed) for the Gaussian SV model, the non-Gaussian SV model and the stochastic intensity model.

References

Blasques, F., Koopman, S. J., Lucas, A., 2015. Information-theoretic optimality of observation-driven time series models for continuous responses. Biometrika 102 (2).

Bollerslev, T., 1986. Generalized autoregressive conditional heteroskedasticity. Journal of Econometrics 31 (3).

Carnero, M. A., Peña, D., Ruiz, E., 2004. Persistence and kurtosis in GARCH and stochastic volatility models. Journal of Financial Econometrics 2 (2).

Creal, D., Koopman, S. J., Lucas, A., 2011. A dynamic multivariate heavy-tailed model for time-varying volatilities and correlations. Journal of Business & Economic Statistics 29 (4).

Creal, D., Koopman, S. J., Lucas, A., 2013. Generalized autoregressive score models with applications. Journal of Applied Econometrics 28 (5).

Durbin, J., Koopman, S. J., 1997. Monte Carlo maximum likelihood estimation for non-Gaussian state space models. Biometrika 84 (3).

Durbin, J., Koopman, S. J., 2012. Time Series Analysis by State Space Methods, second edition. Oxford Statistical Science Series. Oxford University Press, Oxford.

Engle, R., 2002. New frontiers for ARCH models. Journal of Applied Econometrics 17 (5).

Engle, R. F., Gallo, G. M., 2006. A multiple indicators model for volatility using intra-daily data. Journal of Econometrics 131 (1).

Harvey, A., Luati, A., 2014. Filtering with heavy tails. Journal of the American Statistical Association 109 (507).

Harvey, A., Ruiz, E., Shephard, N., 1994. Multivariate stochastic variance models. The Review of Economic Studies 61 (2).

Harvey, A. C., 2013. Dynamic Models for Volatility and Heavy Tails: With Applications to Financial and Economic Time Series. Econometric Society Monographs. Cambridge University Press.

Koopman, S. J., Lucas, A., Scharth, M., 2015. Numerically accelerated importance sampling for nonlinear non-Gaussian state-space models. Journal of Business & Economic Statistics 33 (1).

Koopman, S. J., Lucas, A., Scharth, M., 2016. Predicting time-varying parameters with parameter-driven and observation-driven models. The Review of Economics and Statistics 98 (1).

Nelson, D. B., 1992. Filtering and forecasting with misspecified ARCH models I: Getting the right variance with the wrong model. Journal of Econometrics 52 (1).

Nelson, D. B., 1996. Asymptotically optimal smoothing with ARCH models. Econometrica 64 (3).

Nelson, D. B., Foster, D. P., 1994. Asymptotic filtering theory for univariate ARCH models. Econometrica 62 (1).

Nelson, D. B., Foster, D. P., 1995. Filtering and forecasting with misspecified ARCH models II: Making the right forecast with the wrong model. Journal of Econometrics 67 (2).

Oh, D. H., Patton, A. J., 2017. Time-varying systemic risk: Evidence from a dynamic copula model of CDS spreads. Journal of Business & Economic Statistics.

Sandmann, G., Koopman, S. J., 1998. Estimation of stochastic volatility models via Monte Carlo maximum likelihood. Journal of Econometrics 87 (2).


Hidden Markov Models for precipitation Hidden Markov Models for precipitation Pierre Ailliot Université de Brest Joint work with Peter Thomson Statistics Research Associates (NZ) Page 1 Context Part of the project Climate-related risks for

More information

A Gaussian state-space model for wind fields in the North-East Atlantic

A Gaussian state-space model for wind fields in the North-East Atlantic A Gaussian state-space model for wind fields in the North-East Atlantic Julie BESSAC - Université de Rennes 1 with Pierre AILLIOT and Valï 1 rie MONBET 2 Juillet 2013 Plan Motivations 1 Motivations 2 Context

More information

Asymptotic quasi-likelihood based on kernel smoothing for nonlinear and non-gaussian statespace

Asymptotic quasi-likelihood based on kernel smoothing for nonlinear and non-gaussian statespace University of Wollongong Research Online Faculty of Informatics - Papers (Archive) Faculty of Engineering and Information Sciences 2007 Asymptotic quasi-likelihood based on kernel smoothing for nonlinear

More information

A Guide to Modern Econometric:

A Guide to Modern Econometric: A Guide to Modern Econometric: 4th edition Marno Verbeek Rotterdam School of Management, Erasmus University, Rotterdam B 379887 )WILEY A John Wiley & Sons, Ltd., Publication Contents Preface xiii 1 Introduction

More information

Inference and estimation in probabilistic time series models

Inference and estimation in probabilistic time series models 1 Inference and estimation in probabilistic time series models David Barber, A Taylan Cemgil and Silvia Chiappa 11 Time series The term time series refers to data that can be represented as a sequence

More information

LESLIE GODFREY LIST OF PUBLICATIONS

LESLIE GODFREY LIST OF PUBLICATIONS LESLIE GODFREY LIST OF PUBLICATIONS This list is in two parts. First, there is a set of selected publications for the period 1971-1996. Second, there are details of more recent outputs. SELECTED PUBLICATIONS,

More information

Calibration Estimation of Semiparametric Copula Models with Data Missing at Random

Calibration Estimation of Semiparametric Copula Models with Data Missing at Random Calibration Estimation of Semiparametric Copula Models with Data Missing at Random Shigeyuki Hamori 1 Kaiji Motegi 1 Zheng Zhang 2 1 Kobe University 2 Renmin University of China Econometrics Workshop UNC

More information

FaMIDAS: A Mixed Frequency Factor Model with MIDAS structure

FaMIDAS: A Mixed Frequency Factor Model with MIDAS structure FaMIDAS: A Mixed Frequency Factor Model with MIDAS structure Frale C., Monteforte L. Computational and Financial Econometrics Limassol, October 2009 Introduction After the recent financial and economic

More information

Do Markov-Switching Models Capture Nonlinearities in the Data? Tests using Nonparametric Methods

Do Markov-Switching Models Capture Nonlinearities in the Data? Tests using Nonparametric Methods Do Markov-Switching Models Capture Nonlinearities in the Data? Tests using Nonparametric Methods Robert V. Breunig Centre for Economic Policy Research, Research School of Social Sciences and School of

More information

Kalman Filter and its Economic Applications

Kalman Filter and its Economic Applications Kalman Filter and its Economic Applications Gurnain Kaur Pasricha University of California Santa Cruz, CA 95060 E-mail: gpasrich@ucsc.edu October 15, 2006 Abstract. The paper is an eclectic study of the

More information

When is a copula constant? A test for changing relationships

When is a copula constant? A test for changing relationships When is a copula constant? A test for changing relationships Fabio Busetti and Andrew Harvey Bank of Italy and University of Cambridge November 2007 usetti and Harvey (Bank of Italy and University of Cambridge)

More information

Marginal Specifications and a Gaussian Copula Estimation

Marginal Specifications and a Gaussian Copula Estimation Marginal Specifications and a Gaussian Copula Estimation Kazim Azam Abstract Multivariate analysis involving random variables of different type like count, continuous or mixture of both is frequently required

More information

Model-based trend-cycle decompositions. with time-varying parameters

Model-based trend-cycle decompositions. with time-varying parameters Model-based trend-cycle decompositions with time-varying parameters Siem Jan Koopman Kai Ming Lee Soon Yip Wong s.j.koopman@ klee@ s.wong@ feweb.vu.nl Department of Econometrics Vrije Universiteit Amsterdam

More information

Labor-Supply Shifts and Economic Fluctuations. Technical Appendix

Labor-Supply Shifts and Economic Fluctuations. Technical Appendix Labor-Supply Shifts and Economic Fluctuations Technical Appendix Yongsung Chang Department of Economics University of Pennsylvania Frank Schorfheide Department of Economics University of Pennsylvania January

More information

Simulation Smoothing for State-Space Models: A. Computational Efficiency Analysis

Simulation Smoothing for State-Space Models: A. Computational Efficiency Analysis Simulation Smoothing for State-Space Models: A Computational Efficiency Analysis William J. McCausland Université de Montréal, CIREQ and CIRANO Shirley Miller Université de Montréal Denis Pelletier North

More information

Time Series Models for Measuring Market Risk

Time Series Models for Measuring Market Risk Time Series Models for Measuring Market Risk José Miguel Hernández Lobato Universidad Autónoma de Madrid, Computer Science Department June 28, 2007 1/ 32 Outline 1 Introduction 2 Competitive and collaborative

More information

TAKEHOME FINAL EXAM e iω e 2iω e iω e 2iω

TAKEHOME FINAL EXAM e iω e 2iω e iω e 2iω ECO 513 Spring 2015 TAKEHOME FINAL EXAM (1) Suppose the univariate stochastic process y is ARMA(2,2) of the following form: y t = 1.6974y t 1.9604y t 2 + ε t 1.6628ε t 1 +.9216ε t 2, (1) where ε is i.i.d.

More information

Lecture 4: Dynamic models

Lecture 4: Dynamic models linear s Lecture 4: s Hedibert Freitas Lopes The University of Chicago Booth School of Business 5807 South Woodlawn Avenue, Chicago, IL 60637 http://faculty.chicagobooth.edu/hedibert.lopes hlopes@chicagobooth.edu

More information

Estimation and Inference on Dynamic Panel Data Models with Stochastic Volatility

Estimation and Inference on Dynamic Panel Data Models with Stochastic Volatility Estimation and Inference on Dynamic Panel Data Models with Stochastic Volatility Wen Xu Department of Economics & Oxford-Man Institute University of Oxford (Preliminary, Comments Welcome) Theme y it =

More information

If we want to analyze experimental or simulated data we might encounter the following tasks:

If we want to analyze experimental or simulated data we might encounter the following tasks: Chapter 1 Introduction If we want to analyze experimental or simulated data we might encounter the following tasks: Characterization of the source of the signal and diagnosis Studying dependencies Prediction

More information

1 Phelix spot and futures returns: descriptive statistics

1 Phelix spot and futures returns: descriptive statistics MULTIVARIATE VOLATILITY MODELING OF ELECTRICITY FUTURES: ONLINE APPENDIX Luc Bauwens 1, Christian Hafner 2, and Diane Pierret 3 October 13, 2011 1 Phelix spot and futures returns: descriptive statistics

More information

Generalized Dynamic Panel Data Models with Random Effects for Cross-Section and Time

Generalized Dynamic Panel Data Models with Random Effects for Cross-Section and Time Generalized Dynamic Panel Data Models with Random Effects for Cross-Section and Time G. Mesters (a,b,c) and S.J. Koopman (b,c) (a) Netherlands Institute for the Study of Crime and Law Enforcement, (b)

More information

Constructing a Coincident Index of Business Cycles Without Assuming a One-Factor Model

Constructing a Coincident Index of Business Cycles Without Assuming a One-Factor Model Constructing a Coincident Index of Business Cycles Without Assuming a One-Factor Model Roberto S Mariano Singapore Management University and the University of Pennsylvania Yasutomo Murasawa Osaka Prefecture

More information

DEPARTMENT OF ECONOMICS AND FINANCE COLLEGE OF BUSINESS AND ECONOMICS UNIVERSITY OF CANTERBURY CHRISTCHURCH, NEW ZEALAND

DEPARTMENT OF ECONOMICS AND FINANCE COLLEGE OF BUSINESS AND ECONOMICS UNIVERSITY OF CANTERBURY CHRISTCHURCH, NEW ZEALAND DEPARTMENT OF ECONOMICS AND FINANCE COLLEGE OF BUSINESS AND ECONOMICS UNIVERSITY OF CANTERBURY CHRISTCHURCH, NEW ZEALAND Discussion of Principal Volatility Component Analysis by Yu-Pin Hu and Ruey Tsay

More information

Online appendix to On the stability of the excess sensitivity of aggregate consumption growth in the US

Online appendix to On the stability of the excess sensitivity of aggregate consumption growth in the US Online appendix to On the stability of the excess sensitivity of aggregate consumption growth in the US Gerdie Everaert 1, Lorenzo Pozzi 2, and Ruben Schoonackers 3 1 Ghent University & SHERPPA 2 Erasmus

More information

Symmetric btw positive & negative prior returns. where c is referred to as risk premium, which is expected to be positive.

Symmetric btw positive & negative prior returns. where c is referred to as risk premium, which is expected to be positive. Advantages of GARCH model Simplicity Generates volatility clustering Heavy tails (high kurtosis) Weaknesses of GARCH model Symmetric btw positive & negative prior returns Restrictive Provides no explanation

More information

Dynamic Adaptive Mixture Models

Dynamic Adaptive Mixture Models Dynamic Adaptive Mixture Models Leopoldo Catania 1, a Department of Economics and Finance, University of Rome, Tor Vergata, Rome, Italy Abstract In this paper we propose a new class of Dynamic Mixture

More information

New Introduction to Multiple Time Series Analysis

New Introduction to Multiple Time Series Analysis Helmut Lütkepohl New Introduction to Multiple Time Series Analysis With 49 Figures and 36 Tables Springer Contents 1 Introduction 1 1.1 Objectives of Analyzing Multiple Time Series 1 1.2 Some Basics 2

More information

M-estimators for augmented GARCH(1,1) processes

M-estimators for augmented GARCH(1,1) processes M-estimators for augmented GARCH(1,1) processes Freiburg, DAGStat 2013 Fabian Tinkl 19.03.2013 Chair of Statistics and Econometrics FAU Erlangen-Nuremberg Outline Introduction The augmented GARCH(1,1)

More information