A Gini Autocovariance Function for Heavy Tailed Time Series Modeling


Marcel Carcea¹ and Robert Serfling²

University of Texas at Dallas

December

¹ Department of Mathematics, University of Texas at Dallas, Richardson, Texas, USA. Email: mdc92@utdallas.edu.
² Department of Mathematics, University of Texas at Dallas, Richardson, Texas, USA. Email: serfling@utdallas.edu. Website: serfling.

Support by a Society of Actuaries Grant, NSF Grants DMS and DMS, and NSA Grant H is gratefully acknowledged.

Abstract

A key conceptual and methodological tool in time series modeling is the autocovariance function, which, however, presupposes finite variances and excludes heavy tailed distributions and data. To allow for the latter, which are increasingly of interest in modern statistics, this paper introduces a Gini autocovariance function defined under merely first order moment assumptions. Playing roles similar to those of the usual autocovariance function, it provides a new fundamental tool for nonparametric description and modeling of time series. It is seen how to fit autoregressive, moving average, and ARMA time series models using Gini autocovariances. Also, the Gini autocovariance function for a nonlinear heavy-tailed (Pareto) autoregressive model is developed and sheds new light on this model. A straightforward sample Gini autocovariance function is formulated and its properties discussed.

AMS 2000 Subject Classification: Primary 62M10; Secondary 62N2.

Key words and phrases: Autocovariance function; Time series; Heavy tails; Autoregressive; Nonlinear; Pareto.

1 Introduction

For time series modeling a key conceptual and methodological tool is the autocovariance function. However, the use of covariances presupposes that the variables have finite variances, a restriction that excludes accommodation of heavy tailed distributions and data. Yet the latter are becoming increasingly relevant in modern statistical modeling. Economic data, financial networks, telecommunication networks, and many other settings often involve data having heavy tailed marginal distributions. Although present second order approaches are successful up to a point in treating heavy tailed modeling situations, it is desirable to develop alternative concepts and methods that appropriately impose only first order (or lower) moment assumptions on the variables. Particularly desirable is a version of the autocovariance function that is conceptually valid in the setting of heavy tailed modeling and is effective in applications.

To address this problem, we introduce a Gini autocovariance function defined under merely first order moment assumptions. Conceptually, it plays roles similar to those of the usual autocovariance function and thus represents a new fundamental tool in time series modeling. This extends to the time series setting the Gini covariance of Schechtman and Yitzhaki (1987), which has been applied in the setting of a sample of independent bivariate observations.

We formulate the Gini autocovariance function in Section 2, and in Section 3 examine its basic features and some closely related functions. Methods for fitting linear time series models using Gini autocovariances are discussed in Section 4. In particular, systems of equations based on Gini autocovariances for fitting autoregressive (AR), moving average (MA), and ARMA models are developed. The Gini autocovariance function for a heavy tailed Pareto type nonlinear autoregressive model, ARP(1), is treated in Section 5. A straightforward sample Gini autocovariance function is formulated in Section 6. Concluding remarks in Section 7 summarize the roles of the Gini autocovariance function and mention related and further work.

Throughout the paper, well-known facts will be invoked as needed without citing particular sources. In such cases, suitable sources are Box and Jenkins (1976) and Brockwell and Davis (1991). For simplicity, we confine attention to the case of strictly stationary time series, although some results will be obtained under weaker forms of stationarity.

2 The Gini autocovariance function

Let us first consider a time series {X_t} that is 2nd order stationary: means, variances, and covariances are finite and are invariant under time shifts. A standard tool under such assumptions is the autocovariance function, which consists of the lag k covariances,

    γ(k) = Cov(X_{1+k}, X_1), k = 0, ±1, ±2, ....

Clearly, γ(−k) and γ(+k) are equal and it suffices to define this function just for k = 0, 1, 2, .... However, it is technically convenient to define it over all k = 0, ±1, ±2, ....

Our purpose here is to introduce an alternative notion of autocovariance function which is available under merely 1st order (or even lower) moment assumptions. This is the Gini autocovariance function, which we formulate building upon quantities already in the literature: the Gini mean difference and the Gini covariance. This new autocovariance function provides a solid theoretical foundation for carrying out standard time series correlational analysis in the setting of heavy-tailed modeling and data.

2.1 The Gini mean difference

For a random variable X having a univariate distribution F, an alternative to the usual standard deviation as a measure of spread was introduced by Gini (1912):

    α(X) = E|X_1 − X_2| = E(X_{2:2} − X_{1:2}),    (1)

where X_1 and X_2 are independent conceptual observations having distribution F and X_{1:2} ≤ X_{2:2} denote their ordered values. Now known as the Gini mean difference (GMD), α(X) is finite if F has finite mean. An important representation is

    α(X) = 2 Cov(X, 2F(X) − 1) = 4 Cov(X, F(X)),    (2)

facilitating an illuminating interpretation: α(X) is 4 times the covariance of X and its rank in the distribution F, or, more precisely, twice the covariance of X and the classical centered rank function 2F(X) − 1. The GMD may also be expressed as an L-functional (weighted integral of quantiles),

    α(X) = 2 ∫₀¹ F^{−1}(u)(2u − 1) du,

with F^{−1}(u) = inf{x : F(x) ≥ u}, 0 < u < 1, the usual quantile function of F. This representation and also (1) yield still another useful expression,

    α(X) = 2 ∫ x (2F(x) − 1) dF(x).    (3)

This enables estimation of α(X) by substitution of a sample version of F. For treatment of the GMD within the more general context of L-moments (descriptive measures of all orders under just a first moment assumption), see Hosking (1990) and Serfling and Xiao (2007).
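As a concrete illustration of (2) and (3), the following minimal Python sketch (our own illustration, not code from the paper) computes the sample GMD of a data vector by plugging the empirical cdf into (3):

    import numpy as np

    def gini_mean_difference(x):
        """Sample GMD via (3): alpha = 2 * integral of x(2F(x)-1) dF(x),
        with F replaced by the empirical cdf. For ordered values
        x_{1:T} <= ... <= x_{T:T} this reduces to
        (2/T^2) * sum_t (2t - T) x_{t:T} (cf. (40) in Section 6)."""
        x = np.sort(np.asarray(x, dtype=float))
        T = len(x)
        t = np.arange(1, T + 1)
        return 2.0 / T**2 * np.sum((2 * t - T) * x)

    # Example: for a standard normal sample the population GMD is
    # 2/sqrt(pi), about 1.1284, which the estimate should approach.
    rng = np.random.default_rng(0)
    print(gini_mean_difference(rng.standard_normal(100_000)))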

2.2 The Gini covariance

For a bivariate random vector (X, Y) with joint distribution F and respective marginal distributions F_X and F_Y having finite means, two alternatives to the usual covariance for measuring the dependence of X and Y were introduced by Schechtman and Yitzhaki (1987):

    β(X, Y) = 2 Cov(X, 2F_Y(Y) − 1) = 4 Cov(X, F_Y(Y)),    (4)

the Gini covariance of X with respect to Y, and

    β(Y, X) = 2 Cov(Y, 2F_X(X) − 1) = 4 Cov(Y, F_X(X)),    (5)

the Gini covariance of Y with respect to X. For treatment within the more general context of L-comoments, see Serfling and Xiao (2007).

Note that β(X, Y) is proportional to the covariance of X and the F_Y-rank of Y, while β(Y, X) is proportional to the covariance of Y and the F_X-rank of X. Thus β(Y, X) and β(X, Y) need not be equal. Together, they provide complementary pieces of information about the dependence of X and Y. The definitions (4) and (5) parallel the covariance representation (2) for α(X) and reduce to it when X = Y. Also, paralleling (1), we have

    β(X, Y) = E(X_{[2:2]} − X_{[1:2]}),    (6)

where (X_1, Y_1) and (X_2, Y_2) are independent observations on F, and, for i = 1, 2, X_{[i:2]} denotes the X-value or concomitant matched with Y_{i:2}, with Y_{1:2} ≤ Y_{2:2} the ordered values of Y_1 and Y_2 (see David and Nagaraja, 2003). Further, paralleling (3), we have

    β(X, Y) = 2 ∫∫ x (2F_Y(y) − 1) dF(x, y).    (7)

A scale-free Gini correlation (of X with respect to Y) is given by

    ρ^(G)(X, Y) = β(X, Y)/α(X) = Cov(X, F_Y(Y))/Cov(X, F_X(X)),    (8)

and the companion ρ^(G)(Y, X) is defined analogously. These compare with the usual Pearson correlation ρ(X, Y) = Cov(X, Y)/(σ_X σ_Y) defined under second order moment assumptions. In particular, ρ^(G)(X, Y) and ρ(X, Y) both take values in the interval [−1, +1]. The correlation ρ(X, Y) measures the degree of linearity in the relationship between X and Y and takes values ±1 if and only if X is a linear function of Y. On the other hand, ρ^(G)(X, Y) measures the degree of monotonicity in the relationship between X and Y and takes values ±1 if and only if X is a monotone function of Y. When ρ^(G)(X, Y) and ρ(X, Y) are both defined, they coincide under some conditions, which are fulfilled for bivariate normal distributions and for certain bivariate Pareto distributions, for example. See Schechtman and Yitzhaki (1987) and Serfling and Xiao (2007) for discussion.
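The two Gini covariances (4)-(5) and the Gini correlation (8) are easy to compute on a sample; the sketch below (again Python/NumPy, our own illustration with hypothetical function names, ties handled only crudely) uses marginal ranks as empirical cdf values:

    import numpy as np

    def ecdf_values(v):
        """Empirical cdf of v evaluated at each v_i (ranks divided by n)."""
        v = np.asarray(v, dtype=float)
        return (np.argsort(np.argsort(v)) + 1.0) / len(v)

    def gini_cov(x, y):
        """beta(X, Y) = 4 Cov(X, F_Y(Y)), per (4), with F_Y replaced
        by the empirical marginal cdf of the y-sample."""
        return 4.0 * np.cov(x, ecdf_values(y), bias=True)[0, 1]

    def gini_corr(x, y):
        """rho_G(X, Y) = beta(X, Y)/alpha(X), per (8)."""
        return gini_cov(x, y) / gini_cov(x, x)

    rng = np.random.default_rng(1)
    x = rng.standard_normal(10_000)
    y = x + rng.standard_normal(10_000)
    print(gini_corr(x, y), gini_corr(y, x))  # need not be equal in general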

6 = β(x 1, X 1+k, each t =, ±1, ±2,..., and β(x 1, X 1+k = β(x 1 k, X 1, each k 1. Thus, under just 1st order moment assumptions, the Gini covariance structure of the time series {X t } may be characterized succinctly by the Gini autocovariance function γ (G (k = β(x 1+k, X 1, k =, ±1, ±2,.... (9 For k =, we have γ (G ( = α(x = β(x, X. For lag k, γ (G ( k and γ (G ( k provide separate measures of dependence not necessarily equal. On this basis, it is sometimes convenient for practical applications to exhibit the Gini autocovariance function (9 as two separate versions, each indexed by k : and γ (A (k = β(x 1+k, X 1, k =, 1, 2,..., γ (B (k = β(x 1, X 1+k, k =, 1, 2,.... Each summarizes lag k covariances, but in different ways. The A series provides (with factor of 4 the covariance of an observation and the rank of the lag k previous observation, while the B series provides (with factor of 4 the covariance of an observation and the rank of the lag k future observation. For theoretical treatments, however, it is more convenient to work with the function γ (G (k as a whole. We may also convert to correlations. Analogous to the autocorrelation function ρ(k = γ(k/γ(, the Gini autocorrelation functions are given by ρ ( (k = γ ( (k/γ ( (, for = G, A, or B. 3 Key Features of Gini Autocovariance and Related Functions for Linear Time Series Let us now assume that {X t } is strictly stationary, that the variables X t are centered by subtracting a common mean µ and thus have mean, and that the time series is linear, that is, represented as a linear filter applied to a series of independent shocks or innovations {ξ t }, with X t a weighted sum of the innovations from times s t, X t = ψ i ξ t i, (1 i= and with ψ = 1. Under the condition j= ψ j <, which we assume, this representation can be made meaningful and precise in certain senses of convergence. Specifically, we have Theorem 1 For any sequence of random variables {ξ t } such that sup t E ξ t <, and for any sequence of constants {ψ i } such that i= ψ i <, the series (1 converges absolutely with probability 1. If also sup t E ξ t <, then the partial sums of (1 converge in the mean to the same limit. 4

This is proved under 2nd order moment assumptions as Proposition 3.1.1 of Brockwell and Davis (1991). With but minor modifications, their proof extends to our 1st order setting. The assumption on {ξ_t} in this theorem follows from our basic stationarity and first order moment assumptions.

Following Brockwell and Davis (1991), a time series {X_t} given by (10) with Σ_{j=0}^∞ |ψ_j| < ∞ is called a causal function of the innovations {ξ_t}. In our case, this causality holds under only 1st order assumptions.

3.1 Expressions for Gini autocovariances in terms of Gini cross-covariances

Under 2nd order assumptions, the usual xξ cross-covariance of lag k is defined as γ_xξ(k) = Cov(X_{t+k}, ξ_t). We have

    γ_xξ(k) = Cov(X_{t+k}, ξ_t) = Cov(Σ_{i=0}^∞ ψ_i ξ_{t+k−i}, ξ_t) = { 0, k < 0; ψ_k σ_ξ², k ≥ 0. }    (11)

Reversing the roles of ξ and x, a ξx cross-covariance of lag k is defined as γ_ξx(k) = Cov(ξ_{t+k}, X_t). However, noting that γ_ξx(k) = γ_xξ(−k), this version of cross-covariance yields nothing new.

Under just 1st order assumptions, we have Gini analogues. Two notions of Gini xξ cross-covariances of lag k are given by β(X_{t+k}, ξ_t) and β(ξ_t, X_{t+k}). We will consider just the first and denote it by γ^(G)_xξ(k). For this, a striking parallel to (11) is readily obtained:

    γ^(G)_xξ(k) = β(X_{t+k}, ξ_t) = β(Σ_{i=0}^∞ ψ_i ξ_{t+k−i}, ξ_t) = { 0, k < 0; ψ_k α(ξ), k ≥ 0. }    (12)

Reversing the roles of ξ and x, we have two Gini ξx cross-covariances of lag k given by β(ξ_{t+k}, X_t) and β(X_t, ξ_{t+k}). We have special use for the first, which we denote by γ^(G)_ξx(k). In terms of this Gini ξx cross-covariance function, we obtain an expression for the Gini autocovariance function as follows:

    γ^(G)(k) = β(X_{1+k}, X_1) = β(Σ_{i=0}^∞ ψ_i ξ_{1+k−i}, X_1) = Σ_{i=max{0,k}}^∞ ψ_i γ^(G)_ξx(k − i).    (13)

As will be illustrated in Section 4, this expression plays useful roles in fitting linear time series models by obtaining equations for the {ψ_i} in terms of the Gini autocovariances {γ^(G)(k)}.

3.2 Expressions for Gini cross-covariances in terms of Gini autocovariances

Let us now suppose also that the linear stationary model is invertible, that is, {ξ_t} can be represented as

    ξ_t = Σ_{i=0}^∞ π_i X_{t−i},    (14)

with π_0 = 1 and Σ_{i=0}^∞ |π_i| < ∞. The coefficients {π_i} and {ψ_i} are related via π(z) = ψ(z)^{−1}, where π(z) = Σ_{i=0}^∞ π_i z^i and ψ(z) = Σ_{i=0}^∞ ψ_i z^i, provided that π(z) and ψ(z) have no common zeros. It follows that the {π_i} may be obtained recursively from the {ψ_i} via π_0 = ψ_0 = 1 and

    π_i = −(ψ_i + π_1 ψ_{i−1} + ⋯ + π_{i−1} ψ_1), i ≥ 1,    (15)

yielding π_1 = −ψ_1, π_2 = ψ_1² − ψ_2, π_3 = −ψ_1³ + 2ψ_1ψ_2 − ψ_3, etc. In this case the ξx cross-covariances have expressions in terms of {π_i} and the autocovariance function, as follows. For the 2nd order case and the usual autocovariance, we have using (14) that

    γ_ξx(k) = γ_xξ(−k) = Cov(ξ_{t+k}, X_t) = { 0, k > 0; Σ_{i=0}^∞ π_i γ(k − i), k ≤ 0. }    (16)

By similar steps using (14), we obtain for the 1st order case the nice parallel

    γ^(G)_ξx(k) = β(ξ_{t+k}, X_t) = { 0, k > 0; Σ_{i=0}^∞ π_i γ^(G)(k − i), k ≤ 0. }    (17)

4 Linear Time Series Models and Gini Autocovariances

Some widely used linear time series models formulate the model in terms of a finite set of parameters and, under 2nd order assumptions, represent the parameters in terms of the usual autocovariance function. Then the model parameters may be estimated using sample estimates of the autocovariance function. Here we introduce similar methods that are based on the Gini autocovariance function and thus available under just 1st order assumptions. In particular, we show how to obtain Gini systems of equations for model parameters in three important cases: moving average, autoregressive, and ARMA models.

4.1 Moving Average Processes

An immediate special case of the linear stationary model is the MA(q) model given by

    X_t = ξ_t + θ_1 ξ_{t−1} + ⋯ + θ_q ξ_{t−q}    (18)

for some choice of q ≥ 1. Putting θ_0 = 1, (10) holds with ψ_i = θ_i for i = 0, 1, ..., q and ψ_i = 0 for i > q. We assume invertibility. Then, for the usual autocovariance function under 2nd order assumptions, and using that X_{1+k} and X_1 are independent for |k| > q, we have

    γ(k) = { σ_ξ² Σ_{i=0}^{q−|k|} θ_i θ_{i+|k|}, |k| ≤ q; 0, |k| > q. }    (19)

Likewise, under 1st order assumptions and using (13), we obtain for the Gini autocovariance function

    γ^(G)(k) = { Σ_{i=max{0,k}}^{q} θ_i γ^(G)_ξx(k − i), |k| ≤ q; 0, |k| > q. }    (20)

Here we have used again that X_{1+k} and X_1 are independent for |k| > q and also that ξ_{1+k} and X_1 are independent for k > 0 and k < −q.
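The recursion (15) is immediately programmable; the following short Python sketch (an illustration under our own naming, not code from the paper) recovers the inverse-filter coefficients {π_i} from the {ψ_i}:

    def pi_from_psi(psi, n):
        """Coefficients pi_0, ..., pi_n of pi(z) = 1/psi(z) via (15),
        given psi = [psi_0, psi_1, ...] with psi[0] = 1."""
        psi = list(psi) + [0.0] * max(0, n + 1 - len(psi))
        pi = [1.0]
        for i in range(1, n + 1):
            pi.append(-sum(pi[j] * psi[i - j] for j in range(i)))
        return pi

    # MA(2) with theta_1 = 0.5, theta_2 = 0.3, so psi = (1, 0.5, 0.3, 0, ...):
    # expect pi_1 = -0.5 and pi_2 = psi_1^2 - psi_2 = -0.05.
    print(pi_from_psi([1.0, 0.5, 0.3], 5))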

4.1.1 Solving for θ_1, ..., θ_q in terms of Gini autocovariances

We start with equations (20), which more explicitly may be written

    γ^(G)(0) = γ^(G)_ξx(0) + θ_1 γ^(G)_ξx(−1) + θ_2 γ^(G)_ξx(−2) + ⋯ + θ_q γ^(G)_ξx(−q)
    γ^(G)(1) = θ_1 γ^(G)_ξx(0) + θ_2 γ^(G)_ξx(−1) + ⋯ + θ_q γ^(G)_ξx(−q + 1)
    γ^(G)(2) = θ_2 γ^(G)_ξx(0) + ⋯ + θ_q γ^(G)_ξx(−q + 2)
    ⋮
    γ^(G)(q) = θ_q γ^(G)_ξx(0).

From the (q + 1)th equation, we obtain γ^(G)_ξx(0) = θ_q^{−1} γ^(G)(q) and substitute this into each of the q preceding equations. Also, in these q equations, we substitute for the quantities γ^(G)_ξx(−1), ..., γ^(G)_ξx(−q) using (15) and further reduce using (17). This leads to a nonlinear system of q equations for the q quantities θ_1, ..., θ_q in terms of the Gini autocovariances γ^(G)(−q), ..., γ^(G)(q).

Illustration for q = 1. We have γ^(G)(0) = γ^(G)_ξx(0) + θ_1 γ^(G)_ξx(−1) and γ^(G)(1) = θ_1 γ^(G)_ξx(0). From the 2nd equation we obtain γ^(G)_ξx(0) = θ_1^{−1} γ^(G)(1), which substituted into the 1st equation yields γ^(G)(0) = θ_1^{−1} γ^(G)(1) + θ_1 γ^(G)_ξx(−1). Now γ^(G)_ξx(−1) = π_0 γ^(G)(−1) = γ^(G)(−1), and our equation for θ_1 now becomes γ^(G)(0) = θ_1^{−1} γ^(G)(1) + θ_1 γ^(G)(−1), or

    γ^(G)(−1) θ_1² − γ^(G)(0) θ_1 + γ^(G)(1) = 0,

a quadratic equation with solutions

    θ_1 = [γ^(G)(0) ± √(γ^(G)(0)² − 4 γ^(G)(−1) γ^(G)(1))] / (2 γ^(G)(−1)).

There are multiple solutions, and in practice we adopt rules for selecting one of them, just as is done with equations in terms of the usual autocovariance function under 2nd order assumptions.

Illustration for q = 2. Similar steps yield the following system of equations for θ_1 and θ_2:

    γ^(G)(0) = θ_2^{−1} γ^(G)(2) + θ_1 (γ^(G)(−1) − θ_1 γ^(G)(−2)) + θ_2 γ^(G)(−2)
    γ^(G)(1) = θ_1 θ_2^{−1} γ^(G)(2) + θ_2 (γ^(G)(−1) − θ_1 γ^(G)(−2)).
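For the q = 1 illustration, the quadratic can be solved directly once the three Gini autocovariances are available; a small Python helper (our sketch, with an invertibility-based selection rule offered as one plausible choice rather than the paper's prescription) might look like:

    import math

    def ma1_theta(g0, g_pos1, g_neg1):
        """Roots of gamma_G(-1)*theta^2 - gamma_G(0)*theta + gamma_G(1) = 0,
        the MA(1) equation of Section 4.1.1."""
        disc = g0 * g0 - 4.0 * g_neg1 * g_pos1
        if disc < 0:
            raise ValueError("no real root for these Gini autocovariances")
        r = math.sqrt(disc)
        roots = ((g0 - r) / (2.0 * g_neg1), (g0 + r) / (2.0 * g_neg1))
        # One practical selection rule: prefer an invertible fit, |theta| < 1.
        invertible = [t for t in roots if abs(t) < 1.0]
        return invertible[0] if invertible else roots[0]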

4.2 AR(p) Processes

Another special case of the linear stationary model is the causal AR(p) model given by

    X_t = φ_1 X_{t−1} + ⋯ + φ_p X_{t−p} + ξ_t    (21)

for some choice of p ≥ 1 and with ξ_t having finite mean zero, each t. The causality assumption means that the process may be represented as a linear process in the form (10) with Σ |ψ_i| < ∞. An additional assumption of invertibility yields the recursion

    ψ_j = { Σ_{0<k≤j} φ_k ψ_{j−k}, j < p; Σ_{0<k≤p} φ_k ψ_{j−k}, j ≥ p, }

yielding ψ_0 = 1, ψ_1 = φ_1, ψ_2 = φ_1² + φ_2, etc.

To obtain a linear system of p equations for φ_1, ..., φ_p in terms of Gini autocovariances, we apply a general covariance approach having possibly wider interest.

4.2.1 A general covariance approach

Consider a linear structure

    η = Σ_{j=1}^p φ_j α_j + ε,    (22)

with ε independent of α_1, ..., α_p, as we are assuming in the AR model. We seek a linear system

    a_i = Σ_{j=1}^p b_ij φ_j, i = 1, ..., p,    (23)

or in matrix form

    a = Bφ,    (24)

with a = (a_1, ..., a_p)^T, φ = (φ_1, ..., φ_p)^T, and B = (b_ij)_{p×p} (here M^T is the transpose of matrix M). We now introduce a simple covariance approach for obtaining suitable a and B. For any function Q(α_1, ..., α_p) of α_1, ..., α_p, we have

    Cov(η, Q(α_1, ..., α_p)) = Σ_{j=1}^p φ_j Cov(α_j, Q(α_1, ..., α_p)),    (25)

provided that these covariances are finite. Using p different functions Q_i(α_1, ..., α_p), 1 ≤ i ≤ p, in (25), we obtain a linear system of equations of form (24), with

    a_i = Cov(η, Q_i(α_1, ..., α_p)), 1 ≤ i ≤ p,
    b_ij = Cov(α_j, Q_i(α_1, ..., α_p)), 1 ≤ i, j ≤ p.

We shall choose Q_i(α_1, ..., α_p) = g(α_i), 1 ≤ i ≤ p, for different choices of function g. In this case, for any such g, we obtain

    a_i = Cov(η, g(α_i)), 1 ≤ i ≤ p,    (26)
    b_ij = Cov(α_j, g(α_i)), 1 ≤ i, j ≤ p.    (27)

We apply the foregoing device with (η, α_1, ..., α_p) = (X_t, X_{t−1}, ..., X_{t−p}), as per the AR(p) model (21). Both the usual least squares approach under 2nd order assumptions and a new Gini approach under 1st order assumptions are obtained.

The standard least squares method. Use g(α_i) = α_i and take Q_i(X_{t−1}, ..., X_{t−p}) = X_{t−i}, 1 ≤ i ≤ p, yielding, under 2nd order moment assumptions, a_i = Cov(X_t, X_{t−i}) = γ(i), 1 ≤ i ≤ p, and b_ij = Cov(X_{t−j}, X_{t−i}) = γ(|i − j|), 1 ≤ i, j ≤ p. In this case, (24) gives the usual Yule-Walker equations for φ_1, ..., φ_p. For p = 1 the least squares solution is simply φ_1 = γ(1)/γ(0).

The Gini approach. Now adopting only 1st order moment assumptions and using g(α_i) = 2(2F_X(α_i) − 1) along with Q_i(X_{t−1}, ..., X_{t−p}) = 2(2F_{X_{t−i}}(X_{t−i}) − 1), we obtain a_i = β(X_t, X_{t−i}) = γ^(G)(i), 1 ≤ i ≤ p, and b_ij = β(X_{t−j}, X_{t−i}) = γ^(G)(i − j), 1 ≤ i, j ≤ p. Then (24) gives Gini-Yule-Walker equations for φ_1, ..., φ_p. For p = 1 the Gini solution is simply φ_1 = γ^(G)(1)/γ^(G)(0).

This opens up an attractive new Gini-Yule-Walker methodology for AR modeling with heavy tailed innovations. It has the computational advantages of least squares but under merely 1st order moment assumptions. Although the least squares estimators may still be used under just 1st order assumptions, the sample estimators φ̂^(G)_1, ..., φ̂^(G)_p defined via the sample Gini autocovariances have meaningful population analogues defined in terms of the population Gini autocovariances.

Other approaches permitting just 1st order (or even lower) moment assumptions have been developed, and these are successful from various perspectives. However, these do not provide explicit closed-form estimates or solve a simple linear system. The least absolute deviations approach (Bloomfield and Steiger, 1983, and Ling, 2005) requires only a first order moment assumption, but the solution must be obtained numerically. An approach of Brockwell and Davis (1991, §13.3) avoids moment assumptions by using an analogue of the usual autocorrelation function having a consistent sample estimator as the sample length increases, but this autocorrelation function lacks easy interpretation and has other limitations (see Resnick). In comparison, the Gini approach provides a systematic foundational approach toward heavy tailed autoregressive modeling.
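To make the Gini-Yule-Walker recipe concrete, here is a minimal NumPy sketch (our own illustration, not the authors' code; gini_autocov is a simple plug-in sample version of γ^(G)(k), closely related to the estimators formalized in Section 6, with ties handled only crudely):

    import numpy as np

    def gini_autocov(x, k):
        """Plug-in sample version of gamma_G(k) = 4 Cov(X_{1+k}, F(X_1)),
        using the empirical marginal cdf (cf. Section 6)."""
        x = np.asarray(x, dtype=float)
        T = len(x)
        F = (np.argsort(np.argsort(x)) + 1.0) / T   # empirical cdf at each X_t
        if k >= 0:
            lead, lag_rank = x[k:], F[:T - k]
        else:
            lead, lag_rank = x[:T + k], F[-k:]
        return 4.0 * np.cov(lead, lag_rank, bias=True)[0, 1]

    def gini_yule_walker(x, p):
        """Solve the Gini-Yule-Walker system a = B phi of Section 4.2:
        a_i = gamma_G(i), B_ij = gamma_G(i - j)."""
        a = np.array([gini_autocov(x, i) for i in range(1, p + 1)])
        B = np.array([[gini_autocov(x, i - j) for j in range(1, p + 1)]
                      for i in range(1, p + 1)])
        return np.linalg.solve(B, a)

Note that B is not symmetric in general, since γ^(G)(i − j) and γ^(G)(j − i) need not be equal; this is exactly the feature that distinguishes the Gini system from the classical Yule-Walker system.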

4.3 ARMA(p, q) Processes

Another special case of the linear stationary model is the ARMA(p, q) model given by

    X_t = φ_1 X_{t−1} + ⋯ + φ_p X_{t−p} + ξ_t + θ_1 ξ_{t−1} + ⋯ + θ_q ξ_{t−q}    (28)

for some choices of p, q ≥ 1 and putting θ_0 = 1. Assuming causality and invertibility, the coefficients {ψ_i} in (10) satisfy the recursion

    ψ_j = { θ_j + Σ_{0<k≤j} φ_k ψ_{j−k}, j < max{p, q + 1}; Σ_{0<k≤p} φ_k ψ_{j−k}, j ≥ max{p, q + 1}, }

yielding ψ_0 = θ_0 = 1, ψ_1 = θ_1 + φ_1, etc.

4.3.1 Solving for φ_1, ..., φ_p and θ_1, ..., θ_q in terms of {γ^(G)(k)}

We will illustrate for ARMA(1, 1). In this case we have ψ_0 = 1 and ψ_j = (θ_1 + φ_1) φ_1^{j−1}, j ≥ 1, and then π_0 = 1 and π_j = (−1)^j (θ_1 + φ_1) θ_1^{j−1}, j ≥ 1. This yields

    γ^(G)_ξx(0) = Σ_{i=0}^∞ π_i γ^(G)(−i) = γ^(G)(0) + (θ_1 + φ_1) Σ_{i=1}^∞ (−1)^i θ_1^{i−1} γ^(G)(−i).    (29)

Next we derive the equations γ^(G)(1) = φ_1 γ^(G)(0) + θ_1 γ^(G)_ξx(0) and γ^(G)(2) = φ_1 γ^(G)(1). The 2nd of these yields the solution for φ_1:

    φ_1 = γ^(G)(2)/γ^(G)(1).    (30)

Using (29), the 1st equation then becomes

    γ^(G)(1) = φ_1 γ^(G)(0) + θ_1 [γ^(G)(0) + (θ_1 + φ_1) Σ_{i=1}^∞ (−1)^i θ_1^{i−1} γ^(G)(−i)].    (31)

Since φ_1 is now given by (30) in terms of γ^(G)(2) and γ^(G)(1), this represents a quadratic equation for θ_1 in terms of the Gini autocovariances. Since the terms in the infinite sum in (31) decrease rapidly in magnitude, only a few terms are needed.
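A numerical version of this two-step recipe is straightforward: obtain φ_1 from (30), then solve a truncated version of (31) for θ_1 with a root finder. The sketch below is our own illustration (the truncation level and the sign-change scan over the invertibility range are our choices, not a method prescribed in the paper); it assumes the Gini autocovariances for lags 2, 1, 0, −1, ..., −K are supplied as a dict:

    def arma11_fit(gamma, K=10, grid=2000):
        """gamma: dict mapping lag k -> gamma_G(k), for k = 2, 1, 0, -1, ..., -K.
        Returns (phi1, theta1) from (30) and a truncated version of (31)."""
        phi1 = gamma[2] / gamma[1]                      # equation (30)

        def resid(theta):                               # truncated equation (31)
            s = sum((-1) ** i * theta ** (i - 1) * gamma[-i]
                    for i in range(1, K + 1))
            return phi1 * gamma[0] + theta * (gamma[0] + (theta + phi1) * s) - gamma[1]

        # crude bracketing scan over the invertibility range |theta| < 1
        ts = [-0.999 + 1.998 * j / grid for j in range(grid + 1)]
        for t0, t1 in zip(ts, ts[1:]):
            if resid(t0) * resid(t1) <= 0:              # sign change: bisect
                for _ in range(60):
                    tm = 0.5 * (t0 + t1)
                    if resid(t0) * resid(tm) <= 0:
                        t1 = tm
                    else:
                        t0 = tm
                return phi1, 0.5 * (t0 + t1)
        raise ValueError("no root found in (-1, 1)")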

5 A Nonlinear Time Series Model and Gini Autocovariances

Here we examine the role of the Gini autocovariance function for a particular nonlinear type of autoregressive process with heavy tails. Following Yeh, Arnold, and Robertson (1988), an autoregressive Pareto process is given (with some changes in notation) by

    X_t = { p^{−1/α} X_{t−1}, with probability p; min{p^{−1/α} X_{t−1}, ε_t}, with probability 1 − p, }

where 0 < p < 1 and {ε_t} is an i.i.d. sequence of Pareto P(σ, α) random variables having survival function

    F̄(x) = [1 + ((x − µ)/σ)^α]^{−1}, x ≥ µ,

with µ a location parameter (which we shall set = 0 for convenience), σ > 0 a scale parameter, and α > 0 a tail index parameter. (For this distribution, only moments of order k < α are finite.) It is understood that ε_t is independent of {X_s, s ≤ t − 1}. If the series is initiated at time t = 0 with X_0 distributed as P(σ, α), then the series {X_t, t ≥ 0} is strictly stationary with marginal distribution P(σ, α), and in any case X_t converges in distribution to P(σ, α). For each choice of the parameters σ, α, and p we have such a process, which is a nonlinear variant of the usual AR(1) process and will be denoted by ARP(1). Its sample paths exhibit frequent rapid increases to peaks followed by sudden drops to lower levels (Figure 1). The ARP(1) processes are especially appealing as tractable models for heavy tailed modeling (α < 2). See Ferreira (2012) for recent discussion of ARP(1) along with asymptotically normal estimates of the parameters p and α.

Characterization of the dependence structure of ARP(1) poses challenges, however. It is evident, of course, from considering the above definition that the process exhibits weak dependence for p near 0 and increasing dependence as p → 1. In fact, the probability that an observation X_t is given not by p^{−1/α} X_{t−1} but rather by the innovation ε_t decreases with p as follows (proved in the Appendix):

    P(X_t = ε_t) = 1 + (p log p)/(1 − p), 0 < p < 1,    (32)

independently of σ, α, and t. This indicates how often the ARP(1) series starts afresh with an innovation (conditional on the innovation not being too large). In particular, we have P(X_t = ε_t) → 1 as p → 0 and P(X_t = ε_t) → 0 as p → 1.

Another desirable approach to characterization of dependence structure is correlation-type analysis. However, for heavy tailed modeling, the standard covariance analysis does not apply, since second order moment assumptions are not available. Even under second order assumptions (α > 2), the usual autocovariance function is problematic for ARP(1). Yeh, Arnold, and Robertson (1988) develop formulas for just the variance γ(0) and the lag 1 autocovariance γ(1). However, the latter involves an incomplete beta integral. They do not treat γ(k), |k| ≥ 2. Certain other approaches have been developed to explore dependence in ARP(1). Ferreira (2012) establishes that ARP(1) has unit upcrossing index, which means that upcrossings of high levels do not cluster, and that the lag k tail dependence coefficient is p^k, which means that the probability that X_{t+k} is extreme, given that X_t is extreme, decays geometrically with k.

Here we develop, in fact, an effective correlation-type analysis for ARP(1) assuming only α > 1, based on the Gini autocovariance function, thus including the heavy tailed setting.
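For intuition about these sample path dynamics, ARP(1) is easy to simulate; the following Python sketch (our own illustration, with hypothetical function names and σ = 1 as the default) generates a path by inverting the Pareto survival function:

    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_arp1(T, p, alpha, sigma=1.0):
        """Simulate the ARP(1) process of Yeh, Arnold, and Robertson (1988):
        X_t = p^{-1/alpha} X_{t-1}            with probability p,
            = min(p^{-1/alpha} X_{t-1}, e_t)  with probability 1 - p,
        where e_t are i.i.d. Pareto P(sigma, alpha)."""
        def rpareto(n):
            # inverse survival: Fbar(x) = 1/(1 + (x/sigma)^alpha) = u
            u = rng.uniform(size=n)
            return sigma * (1.0 / u - 1.0) ** (1.0 / alpha)
        x = np.empty(T)
        x[0] = rpareto(1)[0]                 # start at the stationary marginal
        eps = rpareto(T)
        keep = rng.uniform(size=T) < p
        c = p ** (-1.0 / alpha)
        for t in range(1, T):
            x[t] = c * x[t - 1] if keep[t] else min(c * x[t - 1], eps[t])
        return x

Repeated multiplication by p^{−1/α} > 1 produces the runs upward, and acceptance of a small innovation via the min produces the sudden drops seen in Figure 1.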

We obtain explicit formulas for γ^(G)(0), γ^(G)(−1), and γ^(G)(+1) in terms of the model parameters σ, p, and α. These formulas suffice for a straightforward interpretation of the covariance-type dependence structure within ARP(1).

For this purpose, the computation of γ^(G)(k) = 4 Cov(X_{1+k}, F(X_1)), k = 0, ±1, ..., proceeds as follows. As shown in Yeh, Arnold, and Robertson (1988),

    E X_n = σ Γ(1 − 1/α) Γ(1 + 1/α) = σ (π/α) csc(π/α).    (33)

Also, since F is continuous, E F(X_1) = 1/2. Hence our target may be expressed as

    γ^(G)(k) = 4 [E(X_{1+k} F(X_1)) − (σ/2) Γ(1 − 1/α) Γ(1 + 1/α)].    (34)

Thus the problem reduces to computation of E(X_{1+k} F(X_1)) for k = 0, ±1, ±2, .... We defer these details to the Appendix and here just comment on the solution and its interpretations relative to the parameters α and p of ARP(1). For k = 0 we obtain

    γ^(G)(0) = (2σ/α) Γ(1 − 1/α) Γ(1 + 1/α) = (2σ/α)(π/α) csc(π/α),    (35)

which, of course, is the Gini mean difference of X_1 and does not depend upon the parameter p. For k = ±1 we obtain

    γ^(G)(−1) = 2σ [p(1 − p^{1/α})/(1 − p)] (π/α) csc(π/α),    (36)
    γ^(G)(+1) = 2σ [p(p^{−1/α} − 1)/(1 − p)] (π/α) csc(π/α),    (37)

which depend upon all three parameters σ, α, and p. Consequently, the lag ±1 Gini autocorrelations are

    ρ^(G)(−1) = αp(1 − p^{1/α})/(1 − p),    (38)
    ρ^(G)(+1) = αp(p^{−1/α} − 1)/(1 − p) = p^{−1/α} ρ^(G)(−1).    (39)

These explicit functions of p and α provide simple measures showing how the correlation-type dependence in the ARP(1) process varies with the parameters p and α. Such information is not available using the Pearson autocovariances and autocorrelations, which lack explicit formulas even when well-defined (α > 2).
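Formulas (38)-(39) are a one-liner to evaluate; the short sketch below (ours, for illustration) computes the lag ±1 Gini autocorrelations, and a plug-in estimate from a long simulated path should be in reasonable agreement:

    def arp1_gini_autocorr(p, alpha):
        """Lag -1 and lag +1 Gini autocorrelations of ARP(1), per (38)-(39).
        Valid for 0 < p < 1 and tail index alpha > 1."""
        rho_m1 = alpha * p * (1.0 - p ** (1.0 / alpha)) / (1.0 - p)
        rho_p1 = p ** (-1.0 / alpha) * rho_m1
        return rho_m1, rho_p1

    # e.g. alpha = 1.5, p = 0.5: a heavy tailed case with infinite variance,
    # where Pearson autocorrelations are not even defined.
    print(arp1_gini_autocorr(0.5, 1.5))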

For |k| ≥ 2, explicit formulas for γ^(G)(k) are not available, but an algorithm for numerical evaluation of γ^(G)(k) for any k and any desired p and α has been developed. To indicate the computational complexity, the algorithm entails numerical evaluation of 3 integrals for lags ±1 with a computing time of < 1 second, 11 integrals for lags ±2 with a computing time of about 5 minutes, 49 integrals for lags ±3 with a computing time of about 9 hours, 261 integrals for lags ±4, and 1631 integrals for lags ±5. Thus autocovariances and autocorrelations for lags |k| ≤ 3 are readily available, and in applications these provide quite substantial enough information on correlation-type dependence.

Views of the lag ±1 Gini autocorrelations ρ^(G)(±1) of ARP(1) as functions of 0 < p < 1 are provided in Figure 2 for α = 1.1, 1.5, 1.75, 2.0, and 2.5. It is seen that the lag +1 Gini autocorrelation changes significantly with α but considerably less with p, whereas the lag −1 Gini autocorrelation changes very significantly with p (approximately as the identity function) but considerably less with α. Thus ρ^(G)(−1) and ρ^(G)(+1) provide complementary pieces of information which, taken together, characterize nicely how the correlation-type dependence structure of ARP(1) relates to α and p.

6 Sample Gini Autocovariance Functions

Here we introduce a suitable sample Gini autocovariance function, in the case of observations {X_1, X_2, ..., X_T} over times t = 1 to t = T from a strictly stationary time series {X_t} under 1st order assumptions. Our estimators are defined by using the representations of the Gini autocovariances as functionals of relevant cdfs and substituting empirical cdfs. The estimators also have L-statistic structure, and variants have U-statistic structure.

6.1 Estimation of the marginal distribution function F

Let F denote the common cdf of X_t. An unbiased estimator is given by the usual sample distribution function,

    F̂_T(x) = T^{−1} Σ_{t=1}^T I{X_t ≤ x}, −∞ < x < ∞,

where I(·) is the usual indicator function, defined as I(A) = 1 or 0 according as event A holds or not. Further, assuming that the stationary time series {X_t} is ergodic, the estimator F̂_T converges to F in suitable senses (in probability or almost surely) as T → ∞.

6.2 Estimation of γ^(G)(0)

Denote the ordered values by X_{1:T} ≤ X_{2:T} ≤ ⋯ ≤ X_{T:T}. A natural approach is to estimate γ^(G)(0) = α(F) by its sample analogue estimator α(F̂_T), obtained by substituting F̂_T for F in (3).

This yields

    γ̂^(G)_T(0) = α(F̂_T) = 2 ∫ x (2F̂_T(x) − 1) dF̂_T(x) = (2/T²) Σ_{t=1}^T (2t − T) X_{t:T}.    (40)

Proof. We have

    2 ∫ x (2F̂_T(x) − 1) dF̂_T(x) = (2/T) Σ_{t=1}^T X_{t:T} (2t/T − 1) = (2/T²) Σ_{t=1}^T (2t − T) X_{t:T}.

6.3 Estimation of γ^(G)(k), k ≠ 0

For estimation of γ^(G)(k), k ≠ 0, let us use (7) to write

    γ^(G)(k) = 2 ∫∫ x (2F_{X_1}(y) − 1) dF_{X_{1+k}, X_1}(x, y).    (41)

For k ≥ +1, let F̂ be the sample version of F_{X_{1+k}, X_1} based on the T − k lag k bivariate observations

    S = {(X_{1+k}, X_1), (X_{2+k}, X_2), ..., (X_T, X_{T−k})}.

For k ≤ −1, let F̃ be the sample version of F_{X_{1+k}, X_1} based on the T − |k| lag k bivariate observations

    S̃ = {(X_1, X_{1+|k|}), (X_2, X_{2+|k|}), ..., (X_{T−|k|}, X_T)},

which are the same pairs as in S but with the components of each pair reversed in order. These sample bivariate distribution functions along with F̂_T again yield estimators of γ^(G)(k) by substitution into (41):

    γ̂^(G)_T(k) = 2 ∫∫ x (2F̂_T(y) − 1) dF̂_{X_{1+k}, X_1}(x, y),    (42)

where F̂_{X_{1+k}, X_1} denotes F̂ for k ≥ +1 and F̃ for k ≤ −1. For k ≥ +1, this yields

    γ̂^(G)_T(k) = (2/(T − k)) Σ_{t=1}^{T−k} (2F̂_T(X_{t:T−k}) − 1) X_{[t:T−k],1,2}, k ≥ 1,    (43)

where X_{t:T−k} is the tth ordered 2nd component value, and X_{[t:T−k],1,2} is the 1st component value concomitant to the tth ordered 2nd component value, relative to the bivariate pairs in S. Similarly, for k ≤ −1, we obtain

    γ̂^(G)_T(k) = (2/(T − |k|)) Σ_{t=1}^{T−|k|} (2F̂_T(X_{[t:T−|k|],2,1}) − 1) X_{t:T−|k|}, k ≤ −1,

where X_{t:T−|k|} is the tth ordered 1st component value, and X_{[t:T−|k|],2,1} is the 2nd component value that is concomitant to the tth ordered 1st component value, relative to the bivariate pairs in S̃.

However, relative to the set S of bivariate pairs, this equation may be expressed

    γ̂^(G)_T(k) = (2/(T − |k|)) Σ_{t=1}^{T−|k|} (2F̂_T(X_{[t:T−|k|],1,2}) − 1) X_{t:T−|k|}, k ≤ −1,    (44)

where X_{t:T−|k|} denotes the tth ordered 2nd component value and X_{[t:T−|k|],1,2} denotes the 1st component value that is concomitant to the tth ordered 2nd component value. Equations (43) and (44) conveniently express γ̂^(G)_T(k), k = ±1, ±2, ..., in a single notation and relative to the single set of bivariate pairs S.

As discussed earlier, for practical use with data it is helpful to express these as separate Gini autocovariance functions. For this purpose, we express these formulas as

    γ̂^(A)_T(k) = (2/(T − k)) Σ_{t=1}^{T−k} (2F̂_T(X_{t:T−k}) − 1) X_{[t:T−k],1,2}, k ≥ 1,    (45)
    γ̂^(B)_T(k) = (2/(T − k)) Σ_{t=1}^{T−k} (2F̂_T(X_{[t:T−k],1,2}) − 1) X_{t:T−k}, k ≥ 1,    (46)

where X_{t:T−k} is the tth ordered 2nd component value, and X_{[t:T−k],1,2} is the 1st component value concomitant to the tth ordered 2nd component value, relative to the bivariate pairs in S.
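The estimators (40) and (45)-(46) translate directly into code. Because the sums in (43)-(46) are invariant under reordering of the pairs, the concomitant bookkeeping disappears in code: one simply sums over the pairs in S directly. The NumPy sketch below is our own minimal rendering (ties and edge effects handled only crudely):

    import numpy as np

    def _ecdf(x):
        """F_hat_T evaluated at each observation: rank/T."""
        return (np.argsort(np.argsort(x)) + 1.0) / len(x)

    def sample_gini_autocov(x, k):
        """gamma_hat^(A)_T(k) for k >= 1 per (45), the k <= -1 series
        per (44) (equivalently gamma_hat^(B)_T per (46)), and
        gamma_hat^(G)_T(0) per (40)."""
        x = np.asarray(x, dtype=float)
        T = len(x)
        F = _ecdf(x)
        if k == 0:
            t = np.arange(1, T + 1)
            return 2.0 / T**2 * np.sum((2 * t - T) * np.sort(x))
        m = abs(k)
        lead, lag = x[m:], x[:T - m]        # pairs (X_{t+|k|}, X_t) in S
        if k > 0:                            # A series: rank the lagged value
            return 2.0 / (T - m) * np.sum((2 * F[:T - m] - 1.0) * lead)
        else:                                # B series: rank the leading value
            return 2.0 / (T - m) * np.sum((2 * F[m:] - 1.0) * lag)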

6.4 Asymptotic behavior of the sample estimators

Consistency and asymptotic normality for the sample Gini autocovariances can be established in a straightforward fashion, although the details are a bit cumbersome. Here we provide a sketch. We may apply Gâteaux differentiation of functionals of distribution functions and related influence function methods, combined with ergodic theory and central limit theory for means and U-statistics based on time series.

For γ̂^(G)_T(0), we use its representation as a statistical functional via (3) and apply Gâteaux differentiation as in Serfling (1980, Chapter 6) to obtain the following influence function for γ̂^(G)_T(0):

    IF(z; γ^(G)(0), F) = 2z(2F(z) − 1) − γ^(G)(0), z ∈ ℝ.

This leads to the following first order approximation to the estimation error:

    γ̂^(G)_T(0) − γ^(G)(0) = T^{−1} Σ_{t=1}^T IF(X_t; γ^(G)(0), F) + remainder.    (47)

Under suitable ergodic assumptions, the remainder converges to 0 almost surely. Then the first term in (47) may be treated by classical convergence theory for sums of dependent random variables, in this case with the dependent summands coming from a time series. Under suitably high moment assumptions, we obtain asymptotic normality. Otherwise, more complicated limit distributions arise.

A similar approach may be carried out for γ̂^(G)_T(k), based on the representation (7), which expresses γ^(G)(k) as a functional of two distributions, F and the joint distribution of X_1 and X_{1+|k|}, and developing two corresponding partial influence functions. Specifically, we may apply partial Gâteaux differentiation as in Pires and Branco (2002) and obtain a first order approximation to the estimation error of form

    γ̂^(G)_T(k) − γ^(G)(k) = T^{−1} Σ_{t=1}^T IF^(1)(X_t; γ^(G)(k), F, F_{X_{1+k},X_1})
        + (T − |k|)^{−1} Σ_{t=1}^{T−|k|} IF^(2)((X_t, X_{t+|k|}); γ^(G)(k), F, F_{X_{1+k},X_1}) + remainder.    (48)

Again, under ergodicity and other assumptions, we may ignore the remainder term and obtain asymptotic convergence results based on the leading terms in (48).

In view of the considerable variability in heavy tailed data and modeling, however, the asymptotic distributions, even if suitably identified, would not be applicable except for enormous sample lengths. Rather, a bootstrap approach would be recommended in lieu of asymptotic theory. However, in the main we view the sample Gini autocovariance function as a descriptive tool rather than an inference procedure. Therefore, convergence theory is primarily of technical interest.

7 Concluding remarks

7.1 Roles of the Gini autocovariance function

The population Gini autocovariance function provides a characterization of covariance-type dependence in a time series that exists under just first moment assumptions. Thus it offers an alternative to the usual Pearson autocovariance function and is available when the latter does not exist. The sample version provides an analogous descriptive tool.

The Gini autocovariance function also can facilitate model-fitting, as seen in Section 4 for AR, MA, and ARMA models, which have representations in terms of the Gini autocovariances. Again, this provides an alternative to using Pearson autocovariances.

The sample Gini autocovariance function is nonparametric in nature, not being based on any particular form of time series model. However, in some cases alternative model-based estimates of the relevant Gini autocovariance function become available as well. For example, in fitting ARP(1), one first obtains estimates of its parameters σ, p, and α (straightforward, efficient estimators are provided in Ferreira, 2012).

Using the explicit expressions for the Gini autocovariances in terms of these, or the numerical algorithms, one can develop model-based estimates of γ^(G)(k) conveniently up to lags |k| ≤ 3 (which suffice for characterization of the correlation-type dependence pattern in the time series). Empirical investigation (Carcea and Serfling, 2012) indicates that these improve upon the nonparametric estimates, as of course they should, being based on additional information.

7.2 Related work and further studies

1. This paper updates in part work in Serfling (2010). Related work emphasizing a Gini partial autocovariance function has been carried out by Shelef and Schechtman (2011).

2. As discussed especially with the ARP(1) model in Section 5, the Gini autocovariance function has two informative components associated with each lag of magnitude k. Their separate roles in practical applications are of interest. An application of this feature is developed in Shelef and Schechtman (2011).

3. Robust versions of the sample Gini autocovariance function are desired, especially in the case of heavy tailed data.

4. It is desirable to relax the strict stationarity assumption and extend the sample methods to the generality of 1st order Gini stationarity.

5. Efficient numerical algorithms are desired for the nonlinear Gini systems of equations for fitting MA(q) and ARMA(p, q) models.

6. Detailed results on almost sure convergence and asymptotic distribution theory are desired for the sample Gini autocovariance function in general and for the Gini-Yule-Walker estimates for AR(1).

7. For a range of fixed sample sizes, and a range of scenarios for heavy tails and for outliers, simulation studies of the sample Gini autocovariance function are being carried out (Carcea and Serfling, 2012).

8. A standard approach with the usual Yule-Walker estimates is to recursively fit AR models using the Durbin-Levinson algorithm. Extension to the Gini-Yule-Walker estimates is desired.

9. In the AR(1) model, nonparametric unit root and cointegration tests based on the usual estimator of φ_1 can be extended to allow for heavy tails by using the Gini estimator of φ_1.

10. The notion of Gini covariance for bivariate random variables has been extended by Serfling and Xiao (2007) to a notion of Gini covariance matrix for a random vector. Likewise, a Gini autocovariance matrix for vector time series is of interest.

Acknowledgments

The authors thank G. L. Thompson for encouragement and special insights. They also are very grateful to Edna Schechtman and Amit Shelef for helpful and constructive remarks and a preprint of their paper. Further, support under a Society of Actuaries grant, National Science Foundation Grants DMS and DMS, and National Security Agency Grant H is gratefully acknowledged.

References

[1] Bloomfield, P. and Steiger, W. S. (1983). Least Absolute Deviations: Theory, Applications and Algorithms. Birkhäuser, Boston.

[2] Box, G. E. P. and Jenkins, G. M. (1976). Time Series Analysis: Forecasting and Control, Revised Edition. Prentice Hall.

[3] Brockwell, P. J. and Davis, R. A. (1991). Time Series: Theory and Methods, 2nd edition. Springer.

[4] Carcea, M. and Serfling, R. (2012). Fitting linear and nonlinear autoregressive models allowing heavy tails and outliers. In preparation.

[5] David, H. A. and Nagaraja, H. N. (2003). Order Statistics, 3rd edition. Wiley, New York.

[6] Ferreira, M. (2012). On the extremal behavior of the Pareto process: an alternative for ARMAX modeling. Kybernetika.

[7] Gini, C. (1912). Variabilità e mutabilità, contributo allo studio delle distribuzioni e delle relazioni statistiche. Studi Economico-Giuridici della Reale Università di Cagliari.

[8] Ling, S. (2005). Self-weighted least absolute deviation estimation for infinite variance autoregressive models. Journal of the Royal Statistical Society, Series B.

[9] Pires, A. M. and Branco, J. A. (2002). Partial influence functions. Journal of Multivariate Analysis.

[10] Schechtman, E. and Yitzhaki, S. (1987). A measure of association based on Gini's mean difference. Communications in Statistics - Theory and Methods.

[11] Shelef, A. and Schechtman, E. (2011). A Gini-based methodology for identifying and analyzing time series with non-normal innovations. Preprint.

[12] Serfling, R. (1980). Approximation Theorems of Mathematical Statistics. John Wiley & Sons, New York.

[13] Serfling, R. (2010). Fitting autoregressive models via Yule-Walker equations allowing heavy tail innovations. Preprint. Available at serfling.

[14] Serfling, R. and Xiao, P. (2007). A contribution to multivariate L-moments: L-comoment matrices. Journal of Multivariate Analysis.

[15] Yeh, H. C., Arnold, B. C., and Robertson, C. A. (1988). Pareto processes. Journal of Applied Probability.

Appendix

Proof of (32). Put F(x) = F₀(x/σ), where F₀(x) = x^α/(1 + x^α), and put f₀(x) = F₀′(x) = αx^{α−1}/(1 + x^α)². Then it is easily checked that

    P(X_t = ε_t) = (1 − p) α ∫₀^∞ x^{2α−1} / [(p + x^α)(1 + x^α)²] dx,

which using Mathematica yields (32).

Proofs of (35), (36), and (37). (i) For k = 0, we obtain E(X_1 F(X_1)) as follows. We have

    E(X_1 F(X_1)) = σ E((X_1/σ) F₀(X_1/σ)) = σ ∫₀^∞ x F₀(x) f₀(x) dx = σα ∫₀^∞ x^{2α} / (1 + x^α)³ dx
        = σα · Γ((2α+1)/α) Γ(3 − (2α+1)/α) / (α Γ(3)) = (σ/2) Γ(2 + 1/α) Γ(1 − 1/α).

Here we have used the standard integral formula

    ∫₀^∞ u^a / (m + u^b)^c du = (m^{(a+1−bc)/b} / b) · Γ((a+1)/b) Γ(c − (a+1)/b) / Γ(c),    (49)

provided that a > −1, b > 0, m > 0, and c > (a+1)/b. Here we are applying (49) with a = 2α, b = α, c = 3, and m = 1, and these substitutions do satisfy the constraints provided that α > 1, which we are assuming throughout. (Also, with the substitutions a = b = α, c = 2, and m = 1, which too satisfy the constraints provided that α > 1, we obtain (33).) Now returning to (34) for k = 0, we readily obtain

    γ^(G)(0) = 4 [E(X_1 F(X_1)) − (σ/2) Γ(1 + 1/α) Γ(1 − 1/α)]
        = (2σ/α) Γ(1 + 1/α) Γ(1 − 1/α) = (2σ/α)(π/α) csc(π/α).

Of course, γ^(G)(0) is the Gini mean difference of X_1.

(ii) Turning now to the case k = −1, we evaluate E(X_0 F(X_1)), which may be expressed as σ E((X_0/σ) F₀(X_1/σ)) = σ E(X F₀(Y)), where

    Y = { p^{−1/α} X, with probability p; min{p^{−1/α} X, Z}, with probability 1 − p, }

where X and Z are independent with distribution F₀. Letting W be a Bernoulli(p) random variable independent of X and Z, we may represent Y and X F₀(Y) as

    Y = p^{−1/α} X I(W = 1) + min{p^{−1/α} X, Z} I(W = 0)

and likewise

    X F₀(Y) = X F₀(p^{−1/α} X) I(W = 1) + X F₀(min{p^{−1/α} X, Z}) I(W = 0).

By the independence assumption regarding W,

    E(X F₀(Y)) = p E(X F₀(p^{−1/α} X)) + (1 − p) E(X F₀(min{p^{−1/α} X, Z})).    (50)

Note that

    F₀(p^{−1/α} x) = p^{−1} x^α / (1 + p^{−1} x^α) = x^α / (p + x^α).

For the first expectation on the righthand side of (50), we have

    E(X F₀(p^{−1/α} X)) = ∫₀^∞ x F₀(p^{−1/α} x) f₀(x) dx = α A(p, α),    (51)

where

    A(p, α) = ∫₀^∞ x^{2α} / [(p + x^α)(1 + x^α)²] dx.

For the second expectation on the righthand side of (50), we have

    E(X F₀(min{p^{−1/α} X, Z})) = ∫₀^∞ [∫₀^{zp^{1/α}} x F₀(p^{−1/α} x) f₀(x) dx] f₀(z) dz
        + ∫₀^∞ [F₀(z) ∫_{zp^{1/α}}^∞ x f₀(x) dx] f₀(z) dz = α² [B(p, α) + C(p, α)],    (52)

where

    B(p, α) = ∫₀^∞ [∫₀^{zp^{1/α}} x^{2α} / ((p + x^α)(1 + x^α)²) dx] z^{α−1} / (1 + z^α)² dz

and

    C(p, α) = ∫₀^∞ [∫_{zp^{1/α}}^∞ x^α / (1 + x^α)² dx] z^{2α−1} / (1 + z^α)³ dz.

We thus have

    E(X F₀(Y)) = pα A(p, α) + (1 − p) α² (B(p, α) + C(p, α)).    (53)

(iii) For the case k = +1, we evaluate E(F(X_0) X_1) = σ E(F₀(X_0/σ)(X_1/σ)) = σ E(F₀(X) Y), with X and Y as above. Similar to the steps for the case k = −1, we obtain

    E(F₀(X) Y) = p E(F₀(X) p^{−1/α} X) + (1 − p) E(F₀(X) min{p^{−1/α} X, Z}).    (54)

For the first expectation on the righthand side of (54), we have from previous steps

    E(F₀(X) p^{−1/α} X) = p^{−1/α} ∫₀^∞ x F₀(x) f₀(x) dx = (p^{−1/α}/2) Γ(2 + 1/α) Γ(1 − 1/α)
        = (p^{−1/α}/2)(1 + 1/α)(π/α) csc(π/α).    (55)

For the second expectation on the righthand side of (54), we have

    E(F₀(X) min{p^{−1/α} X, Z}) = ∫₀^∞ [p^{−1/α} ∫₀^{zp^{1/α}} x F₀(x) f₀(x) dx] f₀(z) dz
        + ∫₀^∞ [z ∫_{zp^{1/α}}^∞ F₀(x) f₀(x) dx] f₀(z) dz
        = α² p^{−1/α} D(p, α) + α² E(p, α),    (56)

where

    D(p, α) = ∫₀^∞ [∫₀^{zp^{1/α}} x^{2α} / (1 + x^α)³ dx] z^{α−1} / (1 + z^α)² dz

and

    E(p, α) = ∫₀^∞ [∫_{zp^{1/α}}^∞ x^{2α−1} / (1 + x^α)³ dx] z^α / (1 + z^α)² dz.

We thus have

    E(F₀(X) Y) = (p^{1−1/α}/2)(1 + 1/α)(π/α) csc(π/α) + (1 − p) α² (p^{−1/α} D(p, α) + E(p, α)).    (57)

To complete the derivations of γ^(G)(k) for k = ±1, we need to evaluate the integrals A(p, α), B(p, α), C(p, α), D(p, α), and E(p, α). With the help of Mathematica, the following formulas are obtained:

    A(p, α) = {2α²(1 − p)[(1 − p) − αp(1 − p^{1/α})] / [2α³(1 − p)³]} (π/α) csc(π/α)

    B(p, α) = {2αp[(α − 1) + (α + 1)p − (α + 1)p^{1/α} − (α − 1)p^{1+1/α}] / [2α³(1 − p)³]} (π/α) csc(π/α)

    C(p, α) = {α[(1 − p)(1 + p^{1+1/α}) − 2αp(1 − p^{1/α})] / [2α³(1 − p)³]} (π/α) csc(π/α)

    D(p, α) = {p[α(1 − p²) − (1 − p)² − 2α²p(1 − p^{1/α})] / [2α³(1 − p)³]} (π/α) csc(π/α)

    E(p, α) = {αp^{−1/α}[2αp²(1 − p^{1/α}) + p^{1/α}(1 − p)(1 − 2p) − p(1 − p)] / [2α³(1 − p)³]} (π/α) csc(π/α).

Using these in (53) and (57), and combining with (34), we obtain (36) and (37).

Figure 1: Sample paths of an ARP(1) process of time length 200, for α = 1.5 and p = 0.2 (upper left), p = 0.5 (upper right), and p = 0.8 (lower). Note that the vertical scales differ.

Figure 2: Lag +1 (upper curve) and lag −1 (lower curve) Gini autocorrelations of ARP(1) processes, for α = 1.1 (upper left), 1.5 (upper right), 1.75 (middle left), 2.0 (middle right), and 2.5 (lower), and 0 < p < 1.

Figure 3: Lag +2 (upper curve) and lag −2 (lower curve) Gini autocorrelations of ARP(1) processes, for α = 1.1 (upper left), 1.5 (upper right), 1.75 (middle left), 2.0 (middle right), and 2.5 (lower), and 0 < p < 1.

Figure 4: Gini autocorrelation functions of ARP(1) processes, for lags −1, −2, −3 (left column) and lags +1, +2, +3 (right column), for α = 1.5 and p = 0.2 (upper row), p = 0.5 (middle row), and p = 0.8 (lower row).


More information

MA 575 Linear Models: Cedric E. Ginestet, Boston University Revision: Probability and Linear Algebra Week 1, Lecture 2

MA 575 Linear Models: Cedric E. Ginestet, Boston University Revision: Probability and Linear Algebra Week 1, Lecture 2 MA 575 Linear Models: Cedric E Ginestet, Boston University Revision: Probability and Linear Algebra Week 1, Lecture 2 1 Revision: Probability Theory 11 Random Variables A real-valued random variable is

More information

6.3 Forecasting ARMA processes

6.3 Forecasting ARMA processes 6.3. FORECASTING ARMA PROCESSES 123 6.3 Forecasting ARMA processes The purpose of forecasting is to predict future values of a TS based on the data collected to the present. In this section we will discuss

More information

Chapter 6: Model Specification for Time Series

Chapter 6: Model Specification for Time Series Chapter 6: Model Specification for Time Series The ARIMA(p, d, q) class of models as a broad class can describe many real time series. Model specification for ARIMA(p, d, q) models involves 1. Choosing

More information

7. Forecasting with ARIMA models

7. Forecasting with ARIMA models 7. Forecasting with ARIMA models 309 Outline: Introduction The prediction equation of an ARIMA model Interpreting the predictions Variance of the predictions Forecast updating Measuring predictability

More information

Introduction to Time Series Analysis. Lecture 11.

Introduction to Time Series Analysis. Lecture 11. Introduction to Time Series Analysis. Lecture 11. Peter Bartlett 1. Review: Time series modelling and forecasting 2. Parameter estimation 3. Maximum likelihood estimator 4. Yule-Walker estimation 5. Yule-Walker

More information

Time Series 3. Robert Almgren. Sept. 28, 2009

Time Series 3. Robert Almgren. Sept. 28, 2009 Time Series 3 Robert Almgren Sept. 28, 2009 Last time we discussed two main categories of linear models, and their combination. Here w t denotes a white noise: a stationary process with E w t ) = 0, E

More information

Lecture 2: ARMA(p,q) models (part 2)

Lecture 2: ARMA(p,q) models (part 2) Lecture 2: ARMA(p,q) models (part 2) Florian Pelgrin University of Lausanne, École des HEC Department of mathematics (IMEA-Nice) Sept. 2011 - Jan. 2012 Florian Pelgrin (HEC) Univariate time series Sept.

More information

Autoregressive Moving Average (ARMA) Models and their Practical Applications

Autoregressive Moving Average (ARMA) Models and their Practical Applications Autoregressive Moving Average (ARMA) Models and their Practical Applications Massimo Guidolin February 2018 1 Essential Concepts in Time Series Analysis 1.1 Time Series and Their Properties Time series:

More information

Some Time-Series Models

Some Time-Series Models Some Time-Series Models Outline 1. Stochastic processes and their properties 2. Stationary processes 3. Some properties of the autocorrelation function 4. Some useful models Purely random processes, random

More information

A BIVARIATE MARSHALL AND OLKIN EXPONENTIAL MINIFICATION PROCESS

A BIVARIATE MARSHALL AND OLKIN EXPONENTIAL MINIFICATION PROCESS Faculty of Sciences Mathematics, University of Niš, Serbia Available at: http://wwwpmfniacyu/filomat Filomat 22: (2008), 69 77 A BIVARIATE MARSHALL AND OLKIN EXPONENTIAL MINIFICATION PROCESS Miroslav M

More information

AR, MA and ARMA models

AR, MA and ARMA models AR, MA and AR by Hedibert Lopes P Based on Tsay s Analysis of Financial Time Series (3rd edition) P 1 Stationarity 2 3 4 5 6 7 P 8 9 10 11 Outline P Linear Time Series Analysis and Its Applications For

More information

Stochastic Processes

Stochastic Processes Stochastic Processes Stochastic Process Non Formal Definition: Non formal: A stochastic process (random process) is the opposite of a deterministic process such as one defined by a differential equation.

More information

Thomas J. Fisher. Research Statement. Preliminary Results

Thomas J. Fisher. Research Statement. Preliminary Results Thomas J. Fisher Research Statement Preliminary Results Many applications of modern statistics involve a large number of measurements and can be considered in a linear algebra framework. In many of these

More information

Commentary on Basu (1956)

Commentary on Basu (1956) Commentary on Basu (1956) Robert Serfling University of Texas at Dallas March 2010 Prepared for forthcoming Selected Works of Debabrata Basu (Anirban DasGupta, Ed.), Springer Series on Selected Works in

More information

Applied Time. Series Analysis. Wayne A. Woodward. Henry L. Gray. Alan C. Elliott. Dallas, Texas, USA

Applied Time. Series Analysis. Wayne A. Woodward. Henry L. Gray. Alan C. Elliott. Dallas, Texas, USA Applied Time Series Analysis Wayne A. Woodward Southern Methodist University Dallas, Texas, USA Henry L. Gray Southern Methodist University Dallas, Texas, USA Alan C. Elliott University of Texas Southwestern

More information

KALMAN-TYPE RECURSIONS FOR TIME-VARYING ARMA MODELS AND THEIR IMPLICATION FOR LEAST SQUARES PROCEDURE ANTONY G AU T I E R (LILLE)

KALMAN-TYPE RECURSIONS FOR TIME-VARYING ARMA MODELS AND THEIR IMPLICATION FOR LEAST SQUARES PROCEDURE ANTONY G AU T I E R (LILLE) PROBABILITY AND MATHEMATICAL STATISTICS Vol 29, Fasc 1 (29), pp 169 18 KALMAN-TYPE RECURSIONS FOR TIME-VARYING ARMA MODELS AND THEIR IMPLICATION FOR LEAST SQUARES PROCEDURE BY ANTONY G AU T I E R (LILLE)

More information

Time Series Analysis Fall 2008

Time Series Analysis Fall 2008 MIT OpenCourseWare http://ocw.mit.edu 14.384 Time Series Analysis Fall 008 For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms. Introduction 1 14.384 Time

More information

Residuals in Time Series Models

Residuals in Time Series Models Residuals in Time Series Models José Alberto Mauricio Universidad Complutense de Madrid, Facultad de Económicas, Campus de Somosaguas, 83 Madrid, Spain. (E-mail: jamauri@ccee.ucm.es.) Summary: Three types

More information

Difference equations. Definitions: A difference equation takes the general form. x t f x t 1,,x t m.

Difference equations. Definitions: A difference equation takes the general form. x t f x t 1,,x t m. Difference equations Definitions: A difference equation takes the general form x t fx t 1,x t 2, defining the current value of a variable x as a function of previously generated values. A finite order

More information

Econometrics of financial markets, -solutions to seminar 1. Problem 1

Econometrics of financial markets, -solutions to seminar 1. Problem 1 Econometrics of financial markets, -solutions to seminar 1. Problem 1 a) Estimate with OLS. For any regression y i α + βx i + u i for OLS to be unbiased we need cov (u i,x j )0 i, j. For the autoregressive

More information

Multivariate Time Series Analysis and Its Applications [Tsay (2005), chapter 8]

Multivariate Time Series Analysis and Its Applications [Tsay (2005), chapter 8] 1 Multivariate Time Series Analysis and Its Applications [Tsay (2005), chapter 8] Insights: Price movements in one market can spread easily and instantly to another market [economic globalization and internet

More information

Stat 248 Lab 2: Stationarity, More EDA, Basic TS Models

Stat 248 Lab 2: Stationarity, More EDA, Basic TS Models Stat 248 Lab 2: Stationarity, More EDA, Basic TS Models Tessa L. Childers-Day February 8, 2013 1 Introduction Today s section will deal with topics such as: the mean function, the auto- and cross-covariance

More information

On a simple construction of bivariate probability functions with fixed marginals 1

On a simple construction of bivariate probability functions with fixed marginals 1 On a simple construction of bivariate probability functions with fixed marginals 1 Djilali AIT AOUDIA a, Éric MARCHANDb,2 a Université du Québec à Montréal, Département de mathématiques, 201, Ave Président-Kennedy

More information

Time Series I Time Domain Methods

Time Series I Time Domain Methods Astrostatistics Summer School Penn State University University Park, PA 16802 May 21, 2007 Overview Filtering and the Likelihood Function Time series is the study of data consisting of a sequence of DEPENDENT

More information

Vector autoregressions, VAR

Vector autoregressions, VAR 1 / 45 Vector autoregressions, VAR Chapter 2 Financial Econometrics Michael Hauser WS17/18 2 / 45 Content Cross-correlations VAR model in standard/reduced form Properties of VAR(1), VAR(p) Structural VAR,

More information

STA 6857 Autocorrelation and Cross-Correlation & Stationary Time Series ( 1.4, 1.5)

STA 6857 Autocorrelation and Cross-Correlation & Stationary Time Series ( 1.4, 1.5) STA 6857 Autocorrelation and Cross-Correlation & Stationary Time Series ( 1.4, 1.5) Outline 1 Announcements 2 Autocorrelation and Cross-Correlation 3 Stationary Time Series 4 Homework 1c Arthur Berg STA

More information

Multivariate Distributions

Multivariate Distributions IEOR E4602: Quantitative Risk Management Spring 2016 c 2016 by Martin Haugh Multivariate Distributions We will study multivariate distributions in these notes, focusing 1 in particular on multivariate

More information

Chapter 3 - Temporal processes

Chapter 3 - Temporal processes STK4150 - Intro 1 Chapter 3 - Temporal processes Odd Kolbjørnsen and Geir Storvik January 23 2017 STK4150 - Intro 2 Temporal processes Data collected over time Past, present, future, change Temporal aspect

More information

University of Oxford. Statistical Methods Autocorrelation. Identification and Estimation

University of Oxford. Statistical Methods Autocorrelation. Identification and Estimation University of Oxford Statistical Methods Autocorrelation Identification and Estimation Dr. Órlaith Burke Michaelmas Term, 2011 Department of Statistics, 1 South Parks Road, Oxford OX1 3TG Contents 1 Model

More information

γ 0 = Var(X i ) = Var(φ 1 X i 1 +W i ) = φ 2 1γ 0 +σ 2, which implies that we must have φ 1 < 1, and γ 0 = σ2 . 1 φ 2 1 We may also calculate for j 1

γ 0 = Var(X i ) = Var(φ 1 X i 1 +W i ) = φ 2 1γ 0 +σ 2, which implies that we must have φ 1 < 1, and γ 0 = σ2 . 1 φ 2 1 We may also calculate for j 1 4.2 Autoregressive (AR) Moving average models are causal linear processes by definition. There is another class of models, based on a recursive formulation similar to the exponentially weighted moving

More information

Covariance Stationary Time Series. Example: Independent White Noise (IWN(0,σ 2 )) Y t = ε t, ε t iid N(0,σ 2 )

Covariance Stationary Time Series. Example: Independent White Noise (IWN(0,σ 2 )) Y t = ε t, ε t iid N(0,σ 2 ) Covariance Stationary Time Series Stochastic Process: sequence of rv s ordered by time {Y t } {...,Y 1,Y 0,Y 1,...} Defn: {Y t } is covariance stationary if E[Y t ]μ for all t cov(y t,y t j )E[(Y t μ)(y

More information

A TIME SERIES PARADOX: UNIT ROOT TESTS PERFORM POORLY WHEN DATA ARE COINTEGRATED

A TIME SERIES PARADOX: UNIT ROOT TESTS PERFORM POORLY WHEN DATA ARE COINTEGRATED A TIME SERIES PARADOX: UNIT ROOT TESTS PERFORM POORLY WHEN DATA ARE COINTEGRATED by W. Robert Reed Department of Economics and Finance University of Canterbury, New Zealand Email: bob.reed@canterbury.ac.nz

More information

ITSM-R Reference Manual

ITSM-R Reference Manual ITSM-R Reference Manual George Weigt February 11, 2018 1 Contents 1 Introduction 3 1.1 Time series analysis in a nutshell............................... 3 1.2 White Noise Variance.....................................

More information

Part III Example Sheet 1 - Solutions YC/Lent 2015 Comments and corrections should be ed to

Part III Example Sheet 1 - Solutions YC/Lent 2015 Comments and corrections should be  ed to TIME SERIES Part III Example Sheet 1 - Solutions YC/Lent 2015 Comments and corrections should be emailed to Y.Chen@statslab.cam.ac.uk. 1. Let {X t } be a weakly stationary process with mean zero and let

More information

July 31, 2009 / Ben Kedem Symposium

July 31, 2009 / Ben Kedem Symposium ing The s ing The Department of Statistics North Carolina State University July 31, 2009 / Ben Kedem Symposium Outline ing The s 1 2 s 3 4 5 Ben Kedem ing The s Ben has made many contributions to time

More information

CONTAGION VERSUS FLIGHT TO QUALITY IN FINANCIAL MARKETS

CONTAGION VERSUS FLIGHT TO QUALITY IN FINANCIAL MARKETS EVA IV, CONTAGION VERSUS FLIGHT TO QUALITY IN FINANCIAL MARKETS Jose Olmo Department of Economics City University, London (joint work with Jesús Gonzalo, Universidad Carlos III de Madrid) 4th Conference

More information

1 Introduction to Generalized Least Squares

1 Introduction to Generalized Least Squares ECONOMICS 7344, Spring 2017 Bent E. Sørensen April 12, 2017 1 Introduction to Generalized Least Squares Consider the model Y = Xβ + ɛ, where the N K matrix of regressors X is fixed, independent of the

More information

A Functional Central Limit Theorem for an ARMA(p, q) Process with Markov Switching

A Functional Central Limit Theorem for an ARMA(p, q) Process with Markov Switching Communications for Statistical Applications and Methods 2013, Vol 20, No 4, 339 345 DOI: http://dxdoiorg/105351/csam2013204339 A Functional Central Limit Theorem for an ARMAp, q) Process with Markov Switching

More information

Time Series Analysis -- An Introduction -- AMS 586

Time Series Analysis -- An Introduction -- AMS 586 Time Series Analysis -- An Introduction -- AMS 586 1 Objectives of time series analysis Data description Data interpretation Modeling Control Prediction & Forecasting 2 Time-Series Data Numerical data

More information

9. Multivariate Linear Time Series (II). MA6622, Ernesto Mordecki, CityU, HK, 2006.

9. Multivariate Linear Time Series (II). MA6622, Ernesto Mordecki, CityU, HK, 2006. 9. Multivariate Linear Time Series (II). MA6622, Ernesto Mordecki, CityU, HK, 2006. References for this Lecture: Introduction to Time Series and Forecasting. P.J. Brockwell and R. A. Davis, Springer Texts

More information

APPLIED ECONOMETRIC TIME SERIES 4TH EDITION

APPLIED ECONOMETRIC TIME SERIES 4TH EDITION APPLIED ECONOMETRIC TIME SERIES 4TH EDITION Chapter 2: STATIONARY TIME-SERIES MODELS WALTER ENDERS, UNIVERSITY OF ALABAMA Copyright 2015 John Wiley & Sons, Inc. Section 1 STOCHASTIC DIFFERENCE EQUATION

More information

FE570 Financial Markets and Trading. Stevens Institute of Technology

FE570 Financial Markets and Trading. Stevens Institute of Technology FE570 Financial Markets and Trading Lecture 5. Linear Time Series Analysis and Its Applications (Ref. Joel Hasbrouck - Empirical Market Microstructure ) Steve Yang Stevens Institute of Technology 9/25/2012

More information

Extremogram and ex-periodogram for heavy-tailed time series

Extremogram and ex-periodogram for heavy-tailed time series Extremogram and ex-periodogram for heavy-tailed time series 1 Thomas Mikosch University of Copenhagen Joint work with Richard A. Davis (Columbia) and Yuwei Zhao (Ulm) 1 Zagreb, June 6, 2014 1 2 Extremal

More information

Ch. 15 Forecasting. 1.1 Forecasts Based on Conditional Expectations

Ch. 15 Forecasting. 1.1 Forecasts Based on Conditional Expectations Ch 15 Forecasting Having considered in Chapter 14 some of the properties of ARMA models, we now show how they may be used to forecast future values of an observed time series For the present we proceed

More information

MAT 3379 (Winter 2016) FINAL EXAM (SOLUTIONS)

MAT 3379 (Winter 2016) FINAL EXAM (SOLUTIONS) MAT 3379 (Winter 2016) FINAL EXAM (SOLUTIONS) 15 April 2016 (180 minutes) Professor: R. Kulik Student Number: Name: This is closed book exam. You are allowed to use one double-sided A4 sheet of notes.

More information

STAT 248: EDA & Stationarity Handout 3

STAT 248: EDA & Stationarity Handout 3 STAT 248: EDA & Stationarity Handout 3 GSI: Gido van de Ven September 17th, 2010 1 Introduction Today s section we will deal with the following topics: the mean function, the auto- and crosscovariance

More information

MAT 3379 (Winter 2016) FINAL EXAM (PRACTICE)

MAT 3379 (Winter 2016) FINAL EXAM (PRACTICE) MAT 3379 (Winter 2016) FINAL EXAM (PRACTICE) 15 April 2016 (180 minutes) Professor: R. Kulik Student Number: Name: This is closed book exam. You are allowed to use one double-sided A4 sheet of notes. Only

More information

Econometría 2: Análisis de series de Tiempo

Econometría 2: Análisis de series de Tiempo Econometría 2: Análisis de series de Tiempo Karoll GOMEZ kgomezp@unal.edu.co http://karollgomez.wordpress.com Segundo semestre 2016 IX. Vector Time Series Models VARMA Models A. 1. Motivation: The vector

More information

Asymptotic inference for a nonstationary double ar(1) model

Asymptotic inference for a nonstationary double ar(1) model Asymptotic inference for a nonstationary double ar() model By SHIQING LING and DONG LI Department of Mathematics, Hong Kong University of Science and Technology, Hong Kong maling@ust.hk malidong@ust.hk

More information

ECON/FIN 250: Forecasting in Finance and Economics: Section 6: Standard Univariate Models

ECON/FIN 250: Forecasting in Finance and Economics: Section 6: Standard Univariate Models ECON/FIN 250: Forecasting in Finance and Economics: Section 6: Standard Univariate Models Patrick Herb Brandeis University Spring 2016 Patrick Herb (Brandeis University) Standard Univariate Models ECON/FIN

More information

Extremogram and Ex-Periodogram for heavy-tailed time series

Extremogram and Ex-Periodogram for heavy-tailed time series Extremogram and Ex-Periodogram for heavy-tailed time series 1 Thomas Mikosch University of Copenhagen Joint work with Richard A. Davis (Columbia) and Yuwei Zhao (Ulm) 1 Jussieu, April 9, 2014 1 2 Extremal

More information

Contents. 1 Time Series Analysis Introduction Stationary Processes State Space Modesl Stationary Processes 8

Contents. 1 Time Series Analysis Introduction Stationary Processes State Space Modesl Stationary Processes 8 A N D R E W T U L L O C H T I M E S E R I E S A N D M O N T E C A R L O I N F E R E N C E T R I N I T Y C O L L E G E T H E U N I V E R S I T Y O F C A M B R I D G E Contents 1 Time Series Analysis 5

More information

Class 1: Stationary Time Series Analysis

Class 1: Stationary Time Series Analysis Class 1: Stationary Time Series Analysis Macroeconometrics - Fall 2009 Jacek Suda, BdF and PSE February 28, 2011 Outline Outline: 1 Covariance-Stationary Processes 2 Wold Decomposition Theorem 3 ARMA Models

More information

LINEAR STOCHASTIC MODELS

LINEAR STOCHASTIC MODELS LINEAR STOCHASTIC MODELS Let {x τ+1,x τ+2,...,x τ+n } denote n consecutive elements from a stochastic process. If their joint distribution does not depend on τ, regardless of the size of n, then the process

More information

Stochastic Processes: I. consider bowl of worms model for oscilloscope experiment:

Stochastic Processes: I. consider bowl of worms model for oscilloscope experiment: Stochastic Processes: I consider bowl of worms model for oscilloscope experiment: SAPAscope 2.0 / 0 1 RESET SAPA2e 22, 23 II 1 stochastic process is: Stochastic Processes: II informally: bowl + drawing

More information

Gaussian, Markov and stationary processes

Gaussian, Markov and stationary processes Gaussian, Markov and stationary processes Gonzalo Mateos Dept. of ECE and Goergen Institute for Data Science University of Rochester gmateosb@ece.rochester.edu http://www.ece.rochester.edu/~gmateosb/ November

More information