Stochastic volatility models: tails and memory


Rafał Kulik and Philippe Soulier
Conference in honour of Prof. Murad Taqqu, 19 April 2012

Plan

Model assumptions; limit theorems for partial sums and sample covariances (effect of leverage); tail empirical process and asymptotic normality of the Hill estimator; limiting conditional laws (regular variation on subcones); future work, open problems, etc.

Stochastic volatility with LRD

Let $\{Z_i, i \in \mathbb{Z}\}$ be an i.i.d. sequence with regularly varying tails:
$$\lim_{x \to +\infty} \frac{P(Z_0 > x)}{x^{-\alpha} L(x)} = \beta, \qquad \lim_{x \to +\infty} \frac{P(Z_0 < -x)}{x^{-\alpha} L(x)} = 1 - \beta, \tag{1}$$
where $\alpha > 0$, $L$ is slowly varying at infinity, and $\beta \in [0, 1]$. Under (1), if moreover $E[|Z_0|^\alpha] = \infty$, then $Z_1 Z_2$ is regularly varying and (Davis and Resnick, 1986)
$$\lim_{x \to +\infty} \frac{P(Z_0 > x)}{P(Z_0 Z_1 > x)} = 0, \qquad \lim_{x \to +\infty} \frac{P(Z_1 Z_2 > x)}{P(|Z_1 Z_2| > x)} = \beta^2 + (1 - \beta)^2.$$

We will further assume that $\{X_i\}$ is a stationary, zero-mean, unit-variance Gaussian process which admits a linear representation with respect to an i.i.d. Gaussian white noise $\{\eta_i\}$ with zero mean and unit variance, i.e.
$$X_i = \sum_{j=1}^{\infty} c_j \eta_{i-j}, \qquad \sum_{j=1}^{\infty} c_j^2 = 1. \tag{2}$$
We assume that the process $\{X_i\}$ either has short memory, in the sense that its covariance function is absolutely summable, or exhibits long memory with Hurst index $H \in (1/2, 1)$, i.e. its covariance function $\{\rho_n\}$ satisfies
$$\rho_n = \operatorname{cov}(X_0, X_n) = \sum_{j=1}^{\infty} c_j c_{j+n} = n^{2H-2}\, l(n), \tag{3}$$
where $l$ is a slowly varying function.
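For readers who want to experiment, here is a minimal simulation sketch of (2)-(3). The coefficient choice $c_j \propto j^{H-3/2}$ is our own illustrative example of a square-summable sequence whose covariances decay like $n^{2H-2}$; the function name, truncation lag $J$ and parameter values are not from the talk.

```python
import numpy as np

def simulate_lrd_gaussian(n, H=0.8, J=5000, rng=None):
    """Truncated linear process X_i = sum_{j=1}^J c_j eta_{i-j} with c_j
    proportional to j^(H-3/2), normalized so that sum c_j^2 = 1, which gives
    cov(X_0, X_n) ~ const * n^(2H-2) for lags n much smaller than J."""
    rng = np.random.default_rng(rng)
    j = np.arange(1, J + 1)
    c = j ** (H - 1.5)
    c /= np.sqrt(np.sum(c ** 2))          # enforce sum c_j^2 = 1 (unit variance)
    eta = rng.standard_normal(n + J)      # i.i.d. N(0,1) noise
    return np.convolve(eta, c, mode="valid")[:n]

# Quick check of the hyperbolic covariance decay:
X = simulate_lrd_gaussian(50_000, H=0.8, rng=1)
for lag in (10, 50, 250):
    emp = np.mean((X[:-lag] - X.mean()) * (X[lag:] - X.mean()))
    # the ratio emp / lag**(2H-2) should be roughly constant across lags
    print(lag, emp, lag ** (2 * 0.8 - 2.0))
```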

Let $\sigma$ be a deterministic, nonnegative and continuous function defined on $\mathbb{R}$. Define $\sigma_i = \sigma(X_i)$ and the stochastic volatility process $\{Y_i\}$ by
$$Y_i = \sigma_i Z_i = \sigma(X_i) Z_i. \tag{4}$$
In the Long Memory Stochastic Volatility (LMSV) model, $\{\eta_i\}$ and $\{Z_i\}$ are independent. In the model with leverage, $\{(\eta_i, Z_i)\}$ is a sequence of i.i.d. random vectors: for fixed $i$, $Z_i$ and $X_i$ are independent, but $X_i$ may not be independent of the past $\{Z_j, j < i\}$. By Breiman's lemma, if $E[\sigma^{\alpha+\epsilon}(X_0)] < \infty$, then
$$\lim_{x \to +\infty} \frac{P(Y_0 > x)}{P(Z_0 > x)} = \lim_{x \to +\infty} \frac{P(Y_0 < -x)}{P(Z_0 < -x)} = E[\sigma^\alpha(X_0)]. \tag{5}$$
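Building on the previous sketch, one hedged way to simulate an LMSV path without leverage: $\sigma(x) = e^x$ and symmetric Pareto($\alpha$) innovations are hypothetical choices, not the talk's, but they satisfy the assumptions, since (1) holds with $\beta = 1/2$ and $E[e^{(\alpha+\epsilon)X_0}] < \infty$, so Breiman's lemma (5) applies.

```python
import numpy as np

def simulate_lmsv(n, H=0.8, alpha=2.0, J=5000, rng=None):
    """LMSV sketch without leverage: Y_i = sigma(X_i) * Z_i with sigma(x) = exp(x),
    X_i the long-memory Gaussian linear process of the previous sketch, and Z_i
    i.i.d. symmetric Pareto with tail index alpha (so (1) holds with beta = 1/2).
    The Gaussian noise eta and the innovations Z are drawn independently."""
    rng = np.random.default_rng(rng)
    j = np.arange(1, J + 1)
    c = j ** (H - 1.5)
    c /= np.sqrt(np.sum(c ** 2))
    eta = rng.standard_normal(n + J)
    X = np.convolve(eta, c, mode="valid")[:n]
    # symmetric Pareto(alpha): |Z| = U^(-1/alpha) >= 1, with an independent random sign
    Z = rng.uniform(size=n) ** (-1.0 / alpha) * rng.choice([-1.0, 1.0], size=n)
    return np.exp(X) * Z, X, Z
```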

Partial sums

Define
$$S_{p,n}(t) = \sum_{i=1}^{[nt]} |Y_i|^p, \qquad a_n = \inf\{x : P(|Y_0| > x) < 1/n\}.$$
For any function $g$ such that $E[g^2(\eta_0)] < \infty$ and any integer $q \ge 1$, define $J_q(g) = E[H_q(\eta_0) g(\eta_0)]$, where $H_q$ is the $q$-th Hermite polynomial. The Hermite rank $\tau(g)$ of the function $g$ is the smallest positive integer $\tau$ such that $J_\tau(g) \ne 0$. Let $R_{\tau,H}$ be the so-called Hermite process of order $\tau$ with self-similarity index $1 - \tau(1-H)$. Let $\tau_p = \tau(\sigma^p)$ be the Hermite rank of the function $\sigma^p$.
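Because the Hermite rank drives both the normalization and the limit below, a numerical check is sometimes useful. The following sketch approximates $J_q(g) = E[H_q(\eta_0) g(\eta_0)]$ by Gauss-Hermite quadrature (probabilists' convention) and returns the first $q$ with $J_q(g) \ne 0$; the function name, degree and tolerance are illustrative choices of ours.

```python
import numpy as np
from numpy.polynomial import hermite_e as He   # probabilists' Hermite polynomials He_q

def hermite_rank(g, qmax=8, deg=120, tol=1e-8):
    """Smallest q >= 1 with J_q(g) = E[He_q(eta) g(eta)] != 0 for eta ~ N(0, 1),
    computed by Gauss-Hermite(E) quadrature (weight exp(-x^2/2))."""
    x, w = He.hermegauss(deg)
    w = w / np.sqrt(2.0 * np.pi)               # turn quadrature weights into N(0,1) expectations
    for q in range(1, qmax + 1):
        coeffs = np.zeros(q + 1)
        coeffs[q] = 1.0                        # select He_q in the HermiteE basis
        Jq = float(np.sum(w * He.hermeval(x, coeffs) * g(x)))
        if abs(Jq) > tol:
            return q, Jq
    return None

print(hermite_rank(np.exp))                    # sigma(x) = e^x, p = 1: rank 1, J_1 = e^{1/2}
print(hermite_rank(lambda x: x ** 2))          # g(x) = x^2: rank 2, J_2 = 2
```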

Theorem 1. Under the above assumptions:

1. If $p < \alpha < 2p$ and $1 - \tau_p(1-H) < p/\alpha$, then
$$a_n^{-p}\big(S_{p,n} - n E[|Y_0|^p]\big) \stackrel{D}{\longrightarrow} L_{\alpha/p}, \tag{6}$$
where $L_{\alpha/p}$ is a totally skewed to the right $\alpha/p$-stable Lévy process.

2. If $p < \alpha < 2p$ and $1 - \tau_p(1-H) > p/\alpha$, then
$$n^{-1}\rho_n^{-\tau_p/2}\big(S_{p,n} - n E[|Y_0|^p]\big) \stackrel{D}{\longrightarrow} \frac{J_{\tau_p}(\sigma^p)\, E[|Z_1|^p]}{\tau_p!}\, R_{\tau_p,H}. \tag{7}$$

3. If $p > \alpha$, then $a_n^{-p} S_{p,n} \stackrel{D}{\longrightarrow} L_{\alpha/p}$, where $L_{\alpha/p}$ is a positive $\alpha/p$-stable Lévy process.

Sample covariances

Define $\bar Y_{p,n} = n^{-1}\sum_{i=1}^{n} |Y_i|^p$ and the sample covariances
$$\hat\gamma_{p,n}(s) = \frac{1}{n}\sum_{i=1}^{n} \big(|Y_i|^p - \bar Y_{p,n}\big)\big(|Y_{i+s}|^p - \bar Y_{p,n}\big).$$
Furthermore, define the function $K_{p,s}$ by
$$K_{p,s}(x, y) = m_p\, \sigma^p(x)\, E\big[\sigma^p(\kappa_s \zeta + c_s \eta_0 + \varsigma_s y)\, |Z_0|^p\big] - m_p\, E\big[\sigma^p(X_0)\,\sigma^p(X_s)\, |Z_0|^p\big],$$
where $m_p = E[|Z_0|^p]$, $\kappa_s^2 = \sum_{j=1}^{s-1} c_j^2$, $\varsigma_s^2 = \sum_{j=s+1}^{\infty} c_j^2$, and $\zeta$ and $\eta_0$ are two independent standard Gaussian random variables. Also, define $b_n = \inf\{x : P(|Y_0 Y_1| > x) \le 1/n\}$. The form of $b_n$ is clear in the LMSV case, but may be more involved in the case of leverage.
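The empirical quantity $\hat\gamma_{p,n}(s)$ is straightforward to compute; here is a small sketch (the helper name is ours, and the population objects $K_{p,s}$ and $b_n$ are model-specific and not computed here).

```python
import numpy as np

def sample_cov_power(Y, p, s):
    """Sample covariance of |Y_i|^p at lag s >= 1, as in hat{gamma}_{p,n}(s);
    the average is taken over the indices for which both terms are observed."""
    A = np.abs(np.asarray(Y, dtype=float)) ** p
    Abar = A.mean()
    return np.mean((A[:-s] - Abar) * (A[s:] - Abar))

# e.g. lag-1 sample covariance of squared observations (p = 2) of a simulated LMSV path:
# Y, X, Z = simulate_lmsv(100_000, H=0.8, alpha=4.0, rng=0)
# print(sample_cov_power(Y, p=2, s=1))
```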

Theorem 2. Let $\tau_p(s)$ be the Hermite rank of the bivariate function $K_{p,s}$, with respect to a bivariate Gaussian vector with standard marginal distributions and correlation $\varsigma_s^{-1}\rho_s$.

1. If $p < \alpha < 2p$ and $1 - \tau_p(s)(1-H) < p/\alpha$, then $n b_n^{-p}\big(\hat\gamma_{p,n}(s) - \gamma_p(s)\big) \stackrel{d}{\longrightarrow} L_s$, where $L_s$ is an $\alpha/p$-stable random variable.

2. If $p < \alpha < 2p$ and $1 - \tau_p(s)(1-H) > p/\alpha$, then $\rho_n^{-\tau_p(s)/2}\big(\hat\gamma_{p,n}(s) - \gamma_p(s)\big) \stackrel{d}{\longrightarrow} G_s$, where the random vector $G_s$ is Gaussian if $\tau_p(s) = 1$.

Comments: The Hermite rank of $K_{p,s}$ may be different in the LMSV situation and in the case of leverage. The results are also formulated in a multivariate set-up. There is a dichotomous behaviour according to the interplay between tails and memory. For partial sums and infinite-variance transformations of LRD Gaussian sequences, see Davis (1983), Sly and Heyde (2008). For sample covariances of LRD linear processes with heavy-tailed innovations, see Kokoszka and Taqqu (1996), Horváth and Kokoszka (2008). For weakly dependent heavy-tailed stochastic volatility models without leverage, see Davis and Mikosch (2001). The proofs use point process techniques and a decomposition into a martingale part and a long memory part, together with several new technical results on the tail behaviour of products, beyond Breiman's lemma. We refer to Kulik and Soulier (2012a).

Tail empirical process

Let $u_n$, $n \ge 1$, be a sequence such that $u_n \to \infty$. Under our model assumptions,
$$T_n(x) := \frac{\bar F(u_n + u_n x)}{\bar F(u_n)}, \quad x \ge 0,\ n \ge 1, \tag{8}$$
satisfies $T_n(x) \to T(x) = (1+x)^{-\alpha}$. Define (see Rootzén (2009), Drees (2000))
$$\tilde T_n(s) = \frac{1}{n \bar F(u_n)} \sum_{j=1}^{n} 1_{\{Y_j > u_n + u_n s\}},$$
and
$$e_n(s) = \tilde T_n(s) - T_n(s), \quad s \in [0, \infty). \tag{9}$$
Recall that $\tau_p = \tau(\sigma^p)$.

Theorem 3. Consider the model without leverage.

(i) If $n\bar F(u_n)\,\rho_n^{\tau_1} \to 0$ as $n \to \infty$, then $\sqrt{n\bar F(u_n)}\, e_n(\cdot)$ converges weakly in $D([0,\infty))$ to the Gaussian process $W \circ T(\cdot)$, where $W$ is the standard Brownian motion.

(ii) If $n\bar F(u_n)\,\rho_n^{\tau_1} \to \infty$ as $n \to \infty$, then $\rho_n^{-\tau_1/2}\, e_n(\cdot)$ converges weakly in $D([0,\infty))$ to the process $(E[\sigma^\alpha(X_1)])^{-1} J_{\tau_p}(\sigma^p)\, T(\cdot)\, R_{\tau_p,H}(1)$.

Comments: For large $u_n$, long memory does not play any role; however, if $u_n$ is small, long memory comes into play and the limit is degenerate. One can replace $T_n$ with $T$ in the definition of the tail empirical process, provided a second order condition is fulfilled. If the dependence is strong enough, then the limit is degenerate (cf. Dehling and Taqqu 1989).

Tail empirical process with random levels

Let $U(t) = F^{\leftarrow}(1 - 1/t)$, where $F^{\leftarrow}$ is the left-continuous inverse of $F$. Define $u_n = U(n/k)$. If $F$ is continuous, then $n\bar F(u_n) = k$. Define
$$\hat T_n(s) = \frac{1}{k}\sum_{j=1}^{n} 1_{\{Y_j > Y_{n-k:n}(1+s)\}}, \qquad \hat e_n(s) = \hat T_n(s) - T(s), \quad s \in [0, \infty).$$

Theorem 4. Under the conditions of Theorem 3, together with a second order assumption, $\sqrt{k}\,\hat e_n$ converges weakly in $D([0,\infty))$ to $B \circ T$, where $B$ is the Brownian bridge (regardless of the behaviour of $k\rho_n^q$).
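A short sketch of $\hat T_n$ with random levels, evaluated on a user-chosen grid of $s$ values (the function name and the grid are ours).

```python
import numpy as np

def tail_empirical_process(Y, k, s_grid):
    """hat{T}_n(s) = (1/k) * #{ j <= n : Y_j > Y_{n-k:n} * (1 + s) },
    where Y_{n-k:n} is the (n-k)-th increasing order statistic."""
    Y = np.asarray(Y, dtype=float)
    u = np.sort(Y)[-(k + 1)]                  # Y_{n-k:n}
    return np.array([(Y > u * (1.0 + s)).sum() / k for s in s_grid])
```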

Applications to the Hill estimator

A natural application of the asymptotic results for the tail empirical process $\hat e_n$ is the asymptotic normality of the Hill estimator of the extreme value index $\gamma$, defined by
$$\hat\gamma_n = \frac{1}{k}\sum_{i=1}^{k} \log\left(\frac{Y_{n-i+1:n}}{Y_{n-k:n}}\right) = \int_0^{\infty} \frac{\hat T_n(s)}{1+s}\, ds.$$

Corollary 5. Under the assumptions of Theorem 4, $\sqrt{k}(\hat\gamma_n - \gamma)$ converges weakly to the centered Gaussian distribution with variance $\gamma^2$.

For estimation of the tail index in long memory models, see also Abry, Pipiras, Taqqu (2007), Beran, Das (2012).
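A sketch of the Hill estimator in the order-statistics form above, with a sanity check on i.i.d. Pareto data (our own illustrative example); plotting it against $k$ reproduces Hill plots such as those in Figures 1 and 2.

```python
import numpy as np

def hill_estimator(Y, k):
    """Hill estimator of the extreme value index gamma = 1/alpha based on the
    k upper order statistics: (1/k) * sum_i log(Y_{n-i+1:n} / Y_{n-k:n})."""
    Ys = np.sort(np.asarray(Y, dtype=float))
    return np.mean(np.log(Ys[-k:])) - np.log(Ys[-(k + 1)])

# Sanity check on i.i.d. Pareto(alpha = 2) data, for which gamma = 1/alpha = 0.5:
rng = np.random.default_rng(0)
pareto = rng.uniform(size=10_000) ** (-1.0 / 2.0)
print(hill_estimator(pareto, k=200))          # should be close to 0.5
```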

Nice Hill plot

[Figure 1: Hill estimator with α = 2; estimated tail index against the number of extremes (0 to 1000), for i.i.d. Pareto data (left panel) and σ = 0.05 (right panel).]

Less nice Hill plot

[Figure 2: Hill estimator with α = 2; estimated tail index against the number of extremes (0 to 1000), for i.i.d. Pareto data (left panel) and σ = 1 (right panel).]

Limiting conditional laws

In the SV model, exceedances over a large threshold are asymptotically independent. More precisely, for any positive integer $m$ and positive real numbers $x$, $y$,
$$\lim_{n\to\infty} n P(Y_0 > a_n x,\, Y_m > a_n y) = 0. \tag{10}$$
However, there is a spillover from past extreme observations onto future values, in the sense that
$$\lim_{t\to\infty} P(Y_m \le y \mid Y_0 > t) = \frac{E[\sigma^\alpha(X_0)\, F_Z(y/\sigma(X_m))]}{E[\sigma^\alpha(X_0)]}. \tag{11}$$
We may also be interested in other extremal conditioning events.
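A Monte Carlo sketch of the spillover limit (11), reusing the hypothetical simulate_lmsv function from the earlier sketch, for which $\sigma(x) = e^x$ and $F_Z$ is the symmetric Pareto($\alpha$) distribution function; the threshold, lag and sample size are arbitrary illustrative choices, so the two printed numbers should only be roughly comparable.

```python
import numpy as np

# Assumes simulate_lmsv from the earlier sketch is in scope.
alpha, m, y = 2.0, 5, 2.0
Y, X, Z = simulate_lmsv(300_000, H=0.8, alpha=alpha, J=2000, rng=0)

t = np.quantile(Y, 0.995)                      # a "large" threshold
cond = Y[:-m] > t
print("empirical  P(Y_m <= y | Y_0 > t):", np.mean(Y[m:][cond] <= y))

# Right-hand side of (11), approximated along the same path (ergodic average);
# F_Z is the c.d.f. of the symmetric Pareto(alpha) innovations.
def F_Z(u):
    u = np.asarray(u, dtype=float)
    return np.where(u >= 1, 1 - 0.5 * u ** (-alpha),
                    np.where(u <= -1, 0.5 * np.abs(u) ** (-alpha), 0.5))

num = np.mean(np.exp(alpha * X[:-m]) * F_Z(y / np.exp(X[m:])))
den = np.mean(np.exp(alpha * X))
print("limit in (11), Monte Carlo approx.:", num / den)
```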

To give a general framework for these conditional distributions, we introduce a modified version of the extremogram of Davis and Mikosch (2009). For fixed integers $h \ge 1$, $m > h$ and $h' \ge 0$, and Borel sets $A \subseteq \mathbb{R}^h$ and $B \subseteq \mathbb{R}^{h'+1}$, we are interested in the limit, denoted by $\gamma(A, B, m)$:
$$\gamma(A, B, m) = \lim_{t\to\infty} P\big((Y_m, \dots, Y_{m+h'}) \in B \mid (Y_1, \dots, Y_h) \in tA\big). \tag{12}$$
The set $A$ represents the type of conditioning events considered. For instance, if we choose $A = \{(x, y, z) \in [0, \infty)^3 : x + y + z > 1\}$, then for large $t$, $\{(Y_{-2}, Y_{-1}, Y_0) \in tA\}$ is the event that the sum of the last three observations was extremely large. The set $B$ represents the type of future events of interest.

We will consider regular variation on subcones, using the concept of hidden regular variation (Resnick 2007, 2008). We are interested in cones $C$ such that there exist an integer $\beta_C$ and a Radon measure $\nu_C$ on $C$ such that, for all relatively compact subsets $A$ of $C$ with $\nu_C(\partial A) = 0$,
$$\lim_{t\to\infty} \frac{P\big((Z_1, \dots, Z_h) \in tA\big)}{(\bar F_Z(t))^{\beta_C}} = \nu_C(A). \tag{13}$$
The number $\beta_C$ corresponds to the number of components of a point of a relatively compact subset $A$ of the cone $C$ that must be separated from zero.

Proposition 6. Consider the model without leverage. Under general conditions on the cones and moments, there exist an integer $\beta_C$ and a Radon measure $\nu_C$ on $C$ such that (13) holds and, for all relatively compact sets $A \subset C$ with $\nu_C(\partial A) = 0$, for $m > h$ and $h' \ge 0$, and for any Borel measurable set $B \subseteq \mathbb{R}^{h'+1}$, we have
$$\lim_{t\to\infty} \frac{P\big(Y_{1,h} \in tA,\; Y_{m,m+h'} \in B\big)}{(\bar F_Z(t))^{\beta_C}} = E\big[\nu_C\big(\sigma(X_{1,h})^{-1}A\big)\, P\big(Y_{m,m+h'} \in B \mid \mathbf{X}\big)\big],$$
where $\mathbf{X}$ denotes the Gaussian process $\{X_i\}$. It follows that the extremogram can be expressed as
$$\gamma(A, B, m) = \frac{E\big[\nu_C\big(\sigma(X_{1,h})^{-1}A\big)\, P\big(Y_{m,m+h'} \in B \mid \mathbf{X}\big)\big]}{E\big[\nu_C\big(\sigma(X_{1,h})^{-1}A\big)\big]}. \tag{14}$$

Example 1

Fix some positive integer $h$ and consider the cone $C = (0, \infty]^h$. Then $\beta_C = h$ and the measure $\nu_C$ is defined by
$$\nu_C(dz_1, \dots, dz_h) = \alpha^h \prod_{i=1}^{h} z_i^{-\alpha-1}\, dz_i.$$
Consider the set $A$ defined by $A = \{(z_1, \dots, z_h) \in \mathbb{R}_+^h : z_1 > 1, \dots, z_h > 1\}$. Then for $m > h$ and $B \subseteq \mathbb{R}^{h'+1}$, Proposition 6 yields
$$\lim_{t\to\infty} P\big(Y_{m,m+h'} \in B \mid Y_1 > t, \dots, Y_h > t\big) = \frac{E\big[\prod_{i=1}^{h} \sigma^\alpha(X_i)\, P\big(Y_{m,m+h'} \in B \mid \mathbf{X}\big)\big]}{E\big[\prod_{i=1}^{h} \sigma^\alpha(X_i)\big]}.$$

Example 2

Consider $C = [0, \infty] \times [0, \infty] \setminus \{0\}$. Then $\beta_C = 1$ and the measure $\nu_C$ is
$$\nu_C(dz) = \alpha\big\{ z_1^{-\alpha-1}\, dz_1\, \delta_0(dz_2) + \delta_0(dz_1)\, z_2^{-\alpha-1}\, dz_2 \big\},$$
where $\delta_0$ is the Dirac point mass at $0$. Consider the set $A$ defined by $A = \{(z_1, z_2) \in \mathbb{R}_+^2 : z_1 + z_2 > 1\}$. If $E[\sigma^{\alpha+\epsilon}(X_1)] < \infty$ for some $\epsilon > 0$, then Proposition 6 yields
$$\lim_{t\to\infty} P\big(Y_{m,m+h'} \in B \mid Y_1 + Y_2 > t\big) = \frac{E\big[P\big(Y_{m,m+h'} \in B \mid \mathbf{X}\big)\,\big(\sigma^\alpha(X_1) + \sigma^\alpha(X_2)\big)\big]}{E[\sigma^\alpha(X_1)] + E[\sigma^\alpha(X_2)]}.$$

Estimation

An estimator $\hat\gamma_n(A, B, m)$ is naturally defined by
$$\hat\gamma_n(A, B, m) = \frac{\sum_{r=1}^{n} 1_{\{Y_{r,r+h-1} \in Y_{(n:n-k)} A\}}\, 1_{\{Y_{r+m,r+m+h'} \in B\}}}{\sum_{r=1}^{n} 1_{\{Y_{r,r+h-1} \in Y_{(n:n-k)} A\}}},$$
where $k$ is a user-chosen threshold and $Y_{(n:1)} \le \cdots \le Y_{(n:n)}$ are the increasing order statistics of the observations $Y_1, \dots, Y_n$.
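A sketch of $\hat\gamma_n(A, B, m)$ in which the sets $A$ and $B$ are passed as indicator functions; the names and the example choice of $A$ and $B$ in the comment are ours, and the sum is restricted to indices $r$ for which all required observations are available.

```python
import numpy as np

def extremogram_hat(Y, k, m, h, hprime, in_A, in_B):
    """Sketch of hat{gamma}_n(A, B, m): in_A(v) and in_B(w) are user-supplied
    indicators of the sets A (on R^h) and B (on R^{h'+1}); the scaling level
    is the (n-k)-th increasing order statistic Y_{(n:n-k)}."""
    Y = np.asarray(Y, dtype=float)
    n = len(Y)
    u = np.sort(Y)[n - k - 1]                      # Y_{(n:n-k)}
    num = den = 0
    for r in range(n - m - hprime):                # keep all indices within the sample
        if in_A(Y[r:r + h] / u):                   # {Y_{r,r+h-1} in Y_{(n:n-k)} A}
            den += 1
            num += in_B(Y[r + m:r + m + hprime + 1])
    return num / den if den > 0 else np.nan

# Example 1 with h = 2: condition on two consecutive exceedances of the level,
# ask whether Y_{r+m} exceeds a fixed value (hypothetical choices of A and B):
# gamma_hat = extremogram_hat(Y, k=200, m=5, h=2, hprime=0,
#                             in_A=lambda v: bool(np.all(v > 1.0)),
#                             in_B=lambda w: bool(w[0] > 10.0))
```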

Estimation, continued

Theorem 7. Consider the model without leverage. Assume that $A$ is a relatively compact subcone of $C$. Let $k/n \to 0$, $n(k/n)^{\beta_C} \to \infty$, and assume that the bias is negligible.

1. If $n(k/n)^{\beta_C}\rho_n^{\tau(A,B)} \to 0$, then $\sqrt{n(k/n)^{\beta_C}\mu_C(A)}\,\{\hat\gamma_n(A, B, m) - \gamma(A, B, m)\}$ converges to a centered Gaussian distribution.

2. If $n(k/n)^{\beta_C}\rho_n^{\tau(A,B)} \to \infty$, then $\rho_n^{-\tau(A,B)/2}\{\hat\gamma_n(A, B, m) - \gamma(A, B, m)\}$ converges weakly to a distribution which is non-Gaussian except if $\tau(A, B) = 1$.

Open problems

Estimation of the memory parameter for volatility models using wavelet methods (cf. Moulines, Roueff, Taqqu 2008-2011). Inference in traffic models (Faÿ, Mikosch and Samorodnitsky 2006; Faÿ, Roueff and Soulier 2007). Inference in continuous-time models, such as (FI)CARMA processes driven by a Lévy noise (cf. Brockwell, Davis, Stelzer, Fasen, Klüppelberg, ...). Semiparametric methods (log-periodogram regression and local Whittle estimation): Robinson (1995) for long memory Gaussian processes, Moulines, Hurvich, Soulier (2003) for stochastic volatility models; nothing is known for infinite variance linear processes with long memory.