GARCH processes: continuous counterparts (Part 2)

Alexander Lindner
Centre of Mathematical Sciences, Technical University of Munich
D-85747 Garching, Germany
lindner@ma.tum.de
http://www-m1.ma.tum.de/m4/pers/lindner/

MaPhySto Workshop "Non-Linear Time Series Modeling", Copenhagen, September 27-30, 2004
Contents

1. Why processes in continuous time?
2. The GARCH diffusion limit of Nelson
3. The COGARCH process: definition and properties
Why continuous-time models?

- Observations are quite often irregularly spaced.
- Observations quite often come in at a very high frequency. A continuous-time model may then provide a better approximation to the discrete data than a discrete-time model.

Aim: construct continuous-time models with the features of GARCH.
The diffusion approximation of Nelson

Recall the GARCH(1,1) process: with (ε_n)_{n∈ℕ} i.i.d.,

  σ_n² = ω + λσ_{n−1}² ε_{n−1}² + δσ_{n−1}² = ω + (λε_{n−1}² + δ)σ_{n−1}²,
  Y_n = σ_n ε_n.

Set

  G_n := Σ_{i=0}^{n} Y_i,  n ∈ ℕ₀.

Then G_n − G_{n−1} = Y_n, so the increments of (G_n)_{n∈ℕ₀} are a GARCH process, i.e. (G_n) is "accumulated GARCH".

Question: Can we find a sequence of processes, whose increments on finer and finer grids are GARCH processes, such that the processes converge in distribution to a non-trivial limit process?
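The recursion and the accumulated process can be simulated in a few lines; the following Python sketch uses illustrative parameter values and a hypothetical helper name (neither is from the talk):

```python
import numpy as np

def simulate_garch_accumulated(n, omega, lam, delta, sigma2_0=1.0, seed=0):
    """Simulate Y_n = sigma_n * eps_n with
    sigma_n^2 = omega + (lam * eps_{n-1}^2 + delta) * sigma_{n-1}^2,
    together with the accumulated process G_n = Y_0 + ... + Y_n."""
    rng = np.random.default_rng(seed)
    eps = rng.standard_normal(n)
    sigma2 = np.empty(n)
    sigma2[0] = sigma2_0
    for i in range(1, n):
        sigma2[i] = omega + (lam * eps[i - 1] ** 2 + delta) * sigma2[i - 1]
    Y = np.sqrt(sigma2) * eps
    G = np.cumsum(Y)  # accumulated GARCH
    return Y, sigma2, G

Y, sigma2, G = simulate_garch_accumulated(10_000, omega=0.1, lam=0.05, delta=0.9)
```

By construction the increments of G recover the GARCH series Y exactly.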
Setup: take grid width h > 0:

  _hG_{nh} = _hG_{(n−1)h} + _hσ_{nh} · _hε_{nh},  n ∈ ℕ,
  _hσ_{(n+1)h}² = ω_h + (h^{−1}λ_h · _hε_{nh}² + δ_h) · _hσ_{nh}²,  n ∈ ℕ₀,

where (_hε_{nh})_{n∈ℕ₀} is i.i.d. N(0, h), (_hσ_0², _hG_0) is independent of (_hε_{nh})_{n∈ℕ}, and ω_h > 0, λ_h ≥ 0, δ_h ≥ 0.

Then (_hG_{nh} − _hG_{(n−1)h})_{n∈ℕ} is a GARCH(1,1) process, and

  _hG_t := _hG_{nh},  _hσ_t² := _hσ_{nh}²,  nh ≤ t < (n+1)h,

defines (_hG_t, _hσ_t²) for all t ∈ ℝ₊.

Question: When does (_hG_t, _hσ_t²)_{t≥0} converge weakly to a process (G, σ²) as h ↓ 0? (Weak convergence is in the space D(ℝ₊, ℝ²) of càdlàg functions, endowed with the Borel sets of the Skorokhod topology; weak convergence of processes implies in particular convergence of the finite-dimensional distributions.)
Theorem (Nelson, 1990): Suppose that

  (_hG_0, _hσ_0²) →_d (G_0, σ_0²) as h ↓ 0, with P(σ_0² > 0) = 1,
  lim_{h↓0} h^{−1}ω_h = ω ≥ 0,
  lim_{h↓0} h^{−1}(1 − δ_h − λ_h) = θ,
  lim_{h↓0} 2h^{−1}λ_h² = λ² > 0.

Then (_hG, _hσ²) converges weakly as h ↓ 0 to the unique solution (G, σ²) of the diffusion equations

  dG_t = σ_t dB_t^{(1)},  (1)
  dσ_t² = (ω − θσ_t²) dt + λσ_t² dB_t^{(2)},  (2)

with starting value (G_0, σ_0²), where (B_t^{(1)})_{t≥0} and (B_t^{(2)})_{t≥0} are independent Brownian motions, independent of (G_0, σ_0²).

If 2θ/λ² > −1 and ω > 0, then the solution (σ_t²)_{t≥0} is strictly stationary iff σ_0² has the inverse gamma distribution Γ^{−1}(1 + 2θ/λ², 2ω/λ²).

Example: ω_h = ωh, δ_h = 1 − λ√(h/2) − θh, λ_h = λ√(h/2).
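The example parameterisation can be checked numerically; the sketch below (with illustrative target values for ω, θ, λ) verifies that the three rescaled quantities in the theorem approach their limits as h ↓ 0:

```python
import math

omega, theta, lam = 1.0, 0.2, 0.5  # illustrative diffusion parameters

def nelson_params(h):
    """Nelson's example parameterisation: omega_h = omega*h,
    lambda_h = lam*sqrt(h/2), delta_h = 1 - lam*sqrt(h/2) - theta*h."""
    return omega * h, lam * math.sqrt(h / 2), 1 - lam * math.sqrt(h / 2) - theta * h

for h in (1e-1, 1e-3, 1e-5):
    omega_h, lam_h, delta_h = nelson_params(h)
    print(omega_h / h,                # -> omega
          (1 - delta_h - lam_h) / h,  # -> theta
          2 * lam_h ** 2 / h)         # -> lambda^2
```

For this parameterisation the three ratios are in fact constant in h, so the limit conditions hold trivially.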
Interpretation: the solution of (1) and (2) can be approximated by a GARCH process in discrete time.

Observe:
- The stationary limiting process σ² has Pareto-like tails.
- The limit (G, σ²) is driven by two independent Brownian motions, whereas the GARCH process has only one source of randomness!
- The processes G and σ² are continuous, but empirical volatility can exhibit jumps.
- Estimating the parameters of the diffusion limit and of the discrete GARCH processes may lead to significantly different results (Wang, 2002).

Further literature: Drost and Werker (1996), Duan (1997).
The COGARCH(1,1) process (Klüppelberg, Lindner, Maller, 2004)

Recall the discrete-time GARCH(1,1) process:

  Y_n = σ_n ε_n,
  σ_n² = ω + λY_{n−1}² + δσ_{n−1}² = ω + (δ + λε_{n−1}²)σ_{n−1}².

Iterating the recursion gives

  σ_n² = ω Σ_{i=0}^{n−1} Π_{j=i+1}^{n−1} (δ + λε_j²) + σ_0² Π_{j=0}^{n−1} (δ + λε_j²)
       = ( ω ∫_0^n exp{ −Σ_{j=0}^{⌊s⌋} log(δ + λε_j²) } ds + σ_0² ) exp{ Σ_{j=0}^{n−1} log(δ + λε_j²) },

where the sums Σ_j log(δ + λε_j²) are values of a random walk. Moreover,

  Y_n = σ_n ε_n = σ_n ( Σ_{j=0}^{n} ε_j − Σ_{j=0}^{n−1} ε_j ),

so the innovation ε_n is an increment of a random walk. Both appearing random walks are linked: they are built from the same sequence (ε_j).
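The iterated product form can be checked directly against the recursion; a small numerical sketch with illustrative parameter values:

```python
import numpy as np

rng = np.random.default_rng(1)
omega, lam, delta, sigma2_0, n = 0.1, 0.05, 0.9, 1.0, 50  # illustrative values
eps = rng.standard_normal(n)
a = delta + lam * eps ** 2  # a_j = delta + lam * eps_j^2

# recursion sigma_n^2 = omega + a_{n-1} * sigma_{n-1}^2, iterated n times
s2 = sigma2_0
for j in range(n):
    s2 = omega + a[j] * s2

# closed form: omega * sum_{i=0}^{n-1} prod_{j=i+1}^{n-1} a_j
#              + sigma2_0 * prod_{j=0}^{n-1} a_j
closed = omega * sum(np.prod(a[i + 1:]) for i in range(n)) + sigma2_0 * np.prod(a)
assert np.isclose(s2, closed)
```

The empty product (for i = n−1) is 1, matching the ω term that enters the recursion last.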
Idea: replace the appearing random walks by Lévy processes (the continuous-time analogue of random walks), i.e. replace the ε_j by the jumps of a Lévy process L.

Recall: a stochastic process (L_t)_{t≥0} is a Lévy process iff
- it has independent increments: for 0 ≤ a < b ≤ c < d, L_d − L_c and L_b − L_a are independent;
- it has stationary increments: the distribution of L_{t+s} − L_t does not depend on t;
- it is stochastically continuous;
- with probability one it has right-continuous paths with finite left limits (càdlàg paths);
- L_0 = 0 a.s.
Examples

- Brownian motion (has normal increments).
- Compound Poisson process: let (ε_n)_{n∈ℕ} be an i.i.d. sequence, independent of (v_n)_{n∈ℕ} i.i.d. with exponential distribution with mean 1/c (rate c), and set

  T_n := Σ_{j=1}^{n} v_j,  N_t := max{n ∈ ℕ₀ : T_n ≤ t},  L_t := Σ_{j=1}^{N_t} ε_j.

Note: all Lévy processes apart from Brownian motion have jumps.
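A compound Poisson path is straightforward to sample from this description. A Python sketch (the standard normal jump sizes and the function names are illustrative choices):

```python
import numpy as np

def compound_poisson_path(T, c, rng):
    """Jump times T_n (exponential inter-arrivals v_n with mean 1/c, i.e.
    rate c) and i.i.d. standard normal jump sizes eps_n on [0, T]."""
    times, jumps = [], []
    t = rng.exponential(1.0 / c)
    while t <= T:
        times.append(t)
        jumps.append(rng.standard_normal())
        t += rng.exponential(1.0 / c)
    return np.array(times), np.array(jumps)

def L(t, times, jumps):
    """L_t = sum_{j=1}^{N_t} eps_j with N_t = max{n : T_n <= t}."""
    return jumps[times <= t].sum()

rng = np.random.default_rng(2)
times, jumps = compound_poisson_path(T=100.0, c=1.0, rng=rng)
```

Between jump times the path is constant, and L_T sums all jumps up to T.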
Lévy-Khintchine formula

Let (L_t)_{t≥0} be a Lévy process. Then

  E e^{isL_t} = e^{tχ_L(s)},  s ∈ ℝ,
  χ_L(s) = iγ_L s − τ_L² s²/2 + ∫_ℝ ( e^{isx} − 1 − isx 1_{|x|<1} ) Π_L(dx),  s ∈ ℝ.

Here (γ_L, τ_L², Π_L) is the characteristic triplet: τ_L² ≥ 0 is the Brownian part, and Π_L is the Lévy measure, satisfying

  ∫_{|x|<1} x² Π_L(dx) < ∞,  ∫_{|x|≥1} Π_L(dx) < ∞.

If ∫_{|x|<1} |x| Π_L(dx) < ∞ and τ_L² = 0, we are in the finite variation case, and

  χ_L(s) = iγ_{L,0} s + ∫_ℝ ( e^{isx} − 1 ) Π_L(dx),  s ∈ ℝ,

where γ_{L,0} is the drift of L.
Jumps of Lévy processes

ΔL_t := L_t − L_{t−}. The jumps are totally described by the Lévy measure Π_L:

- Π_L infinite (L not compound Poisson) ⟹ almost surely, (L_t)_{t≥0} has infinitely many jumps in finite time intervals;
- Σ_{0<s≤t} |ΔL_s| < ∞ almost surely ⟺ ∫_{|x|<1} |x| Π_L(dx) < ∞;
- always: Σ_{0<s≤t} (ΔL_s)² < ∞ almost surely.
COGARCH(1,1): definition

For Π_L ≠ 0, δ > 0, λ ≥ 0 define the auxiliary Lévy process

  X_t = −t log δ − Σ_{0<s≤t} log( 1 + (λ/δ)(ΔL_s)² ),  t ≥ 0.

For ω > 0 and a finite random variable σ_0² independent of (L_t)_{t≥0}, define the volatility process

  σ_t² = ( ω ∫_0^t e^{X_s} ds + σ_0² ) e^{−X_{t−}},  t ≥ 0.

Define the COGARCH(1,1) process (G_t)_{t≥0} by

  G_0 = 0,  dG_t = σ_t dL_t,  t ≥ 0.

Note: G jumps at the same times as L, with jump size ΔG_t = σ_t ΔL_t.
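For a compound Poisson driver, (σ_t², G_t) can be simulated without discretisation error: between jumps X grows linearly, so σ² solves the linear ODE dσ² = (ω + (log δ)σ²) dt, which is solved exactly below; at each jump, G and σ² are updated. This Python sketch uses a hypothetical function name, standard normal jumps, and parameter values chosen to mirror Figure 1 (reading the β there as the ω here, which is an assumption):

```python
import numpy as np

def simulate_cogarch(T, c, omega, delta, lam, sigma2_0, seed=3):
    """COGARCH(1,1) driven by a compound Poisson process (rate c, standard
    normal jumps). Between jumps, sigma^2 follows the exact ODE solution
    sigma^2(dt) = (sigma^2(0) + omega/log(delta)) * delta**dt - omega/log(delta);
    at a jump: Delta G = sigma_{t-} * Delta L (pre-jump volatility) and
    sigma^2 increases by (lam/delta) * sigma_{t-}^2 * (Delta L)^2."""
    rng = np.random.default_rng(seed)
    t, G, s2 = 0.0, 0.0, sigma2_0
    log_d = np.log(delta)
    path = [(t, G, s2)]
    while True:
        dt = rng.exponential(1.0 / c)
        if t + dt > T:
            break
        t += dt
        s2 = (s2 + omega / log_d) * np.exp(log_d * dt) - omega / log_d
        dL = rng.standard_normal()
        G += np.sqrt(s2) * dL                 # jump of G, pre-jump volatility
        s2 += (lam / delta) * s2 * dL ** 2    # volatility jump
        path.append((t, G, s2))
    return np.array(path)

path = simulate_cogarch(T=1000.0, c=1.0, omega=1.0, delta=0.95, lam=0.045,
                        sigma2_0=100.0)
```

Since log δ < 0 and the jumps only increase σ², the volatility stays positive along the whole path.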
Properties

(1) (X_t)_{t≥0} is a spectrally negative Lévy process of finite variation, with Brownian part τ_X² = 0, drift γ_{X,0} = −log δ, and Lévy measure Π_X given by

  Π_X([0, ∞)) = 0,  Π_X((−∞, −x]) = Π_L({ y : y² ≥ (e^x − 1)δ/λ }),  x > 0.

Proof: By definition, γ_{X,0} = −log δ, τ_X² = 0, and for x > 0,

  Π_X((−∞, −x]) = E Σ_{0<s≤1} 1_{{ log(1 + (λ/δ)(ΔL_s)²) ≥ x }} = E Σ_{0<s≤1} 1_{{ (ΔL_s)² ≥ (e^x − 1)δ/λ }}.

Finite variation holds since

  ∫_{|x|<1} |x| Π_X(dx) = ∫_{{ y : log(1 + (λ/δ)y²) < 1 }} log( 1 + (λ/δ)y² ) Π_L(dy) < ∞.
(2) (σ_t²)_{t≥0} satisfies the SDE

  dσ_t² = ω dt + σ_{t−}² e^{X_{t−}} d(e^{−X_t}),  t ≥ 0,

and

  σ_t² = σ_0² + ωt + log δ ∫_0^t σ_s² ds + (λ/δ) Σ_{0<s≤t} σ_{s−}² (ΔL_s)²,  t ≥ 0.  (3)

Proof: Itô's lemma.

Compare (3) to the discrete-time GARCH(1,1) recursion:

  σ_n² − σ_{n−1}² = ω − (1 − δ)σ_{n−1}² + λσ_{n−1}² ε_{n−1}²
  ⟹ σ_n² = σ_0² + ωn − (1 − δ) Σ_{i=0}^{n−1} σ_i² + λ Σ_{i=0}^{n−1} σ_i² ε_i².
(3) Stationarity: suppose

  ∫_ℝ log( 1 + (λ/δ)x² ) Π_L(dx) < −log δ.  (4)

Then σ_t² →_d σ² as t → ∞, where σ² =_d ω ∫_0^∞ e^{−X_t} dt. Otherwise, σ_t² →_P ∞.

Proof: Erickson and Maller (2004).

Example: L compound Poisson with rate c and jump distribution ε, i.e. Π_L = cP_ε. Then

  (4) ⟺ c E log( 1 + (λ/δ)ε² ) < −log δ ⟺ c ( E log(δ + λε²) − log δ ) < −log δ.

If c = 1, then (4) ⟺ E log(δ + λε²) < 0.
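Condition (4) is easy to check by Monte Carlo for a concrete driver. The sketch below does this for standard normal jumps and the parameter values of Figure 1 (rate, δ, λ as there; the sample size is an arbitrary choice):

```python
import numpy as np

# Condition (4) for a compound Poisson driver with rate c and jump
# distribution eps: c * E log(1 + (lam/delta) * eps^2) < -log(delta).
rng = np.random.default_rng(4)
c, delta, lam = 1.0, 0.95, 0.045
eps = rng.standard_normal(1_000_000)
lhs = c * np.log1p((lam / delta) * eps ** 2).mean()
rhs = -np.log(delta)
print(lhs, rhs, lhs < rhs)  # condition (4) holds: stationary volatility exists
```

For these values lhs ≈ 0.044 < 0.051 ≈ rhs, so the volatility process of Figure 1 has a stationary version.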
(4) (σ_t²)_{t≥0} is a Markov process; hence (σ_t²)_{t≥0} is strictly stationary for σ_0² =_d σ².
(5) (σ_t², G_t)_{t≥0} is a bivariate Markov process.
(6) If (σ_t²)_{t≥0} is stationary, then (G_t)_{t≥0} has stationary increments.
Second order properties of (σ_t²)_{t≥0}

Recall that

  X_t = −t log δ − Σ_{0<s≤t} log( 1 + (λ/δ)(ΔL_s)² ),  t ≥ 0,

is a Lévy process of finite variation, and

  σ_t² = ( ω ∫_0^t e^{X_s} ds + σ_0² ) e^{−X_{t−}},  t ≥ 0.

Define E e^{−cX_t} = e^{tΨ_X(c)}; then

  Ψ_X(c) = log E e^{−cX_1} = c log δ + ∫_ℝ ( (1 + (λ/δ)y²)^c − 1 ) Π_L(dy).
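For a compound Poisson driver with Π_L = rate · P_ε, the integral in Ψ_X becomes rate · E[(1 + (λ/δ)ε²)^c − 1], which can be estimated by Monte Carlo. A sketch (function name and sample size are illustrative; parameters again mirror Figure 1):

```python
import numpy as np

def psi_X(cexp, delta, lam, rate, eps_sample):
    """Monte Carlo estimate of Psi_X(c) = log E exp(-c X_1) for a compound
    Poisson driver: c*log(delta) + rate * E[(1 + (lam/delta)*eps^2)^c - 1]."""
    return (cexp * np.log(delta)
            + rate * np.mean((1 + (lam / delta) * eps_sample ** 2) ** cexp - 1))

rng = np.random.default_rng(5)
eps = rng.standard_normal(1_000_000)
delta, lam, rate = 0.95, 0.045, 1.0
print(psi_X(1.0, delta, lam, rate, eps))  # < 0: stationarity, finite E sigma^2
print(psi_X(2.0, delta, lam, rate, eps))  # sign decides finiteness of E sigma^4
```

For these values both Ψ_X(1) and Ψ_X(2) are (just barely) negative, so the second and fourth stationary moments below are finite.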
For c > 0 and (σ_t²)_{t≥0} stationary:

(1) E e^{−cX_1} < ∞ ⟺ E|L_1|^{2c} < ∞ ⟺ Ψ_X(c) < ∞.
(2) EL_1² < ∞ and Ψ_X(1) < 0 ⟹ σ_t² →_d σ² (a finite random variable).
(3) E(σ²)^k < ∞ ⟺ E|L_1|^{2k} < ∞ and Ψ_X(k) < 0. In that case,

  Eσ^{2k} = k! ω^k Π_{l=1}^{k} ( −Ψ_X(l) )^{−1},
  cov(σ_t², σ_{t+h}²) = ω² ( 2/(Ψ_X(1)Ψ_X(2)) − 1/(Ψ_X(1))² ) e^{−h|Ψ_X(1)|},  h ≥ 0.

(4) 0 < δ < 1, λ > 0 ⟹ σ always has some infinite moments.
Second order properties of (G_t)_{t≥0}

dG_t = σ_t dL_t ⟹ G jumps at the same times as L does: ΔG_t = σ_t ΔL_t.

For fixed t > 0: ΔG_t = 0 a.s. (a Lévy process a.s. has no jump at any fixed time), so E(ΔG_t)^k = 0.

Define, for fixed r > 0, the increment process

  G_t^{(r)} := G_{t+r} − G_t = ∫_{(t,t+r]} σ_s dL_s.

If (σ_t²)_{t≥0} is stationary, then (G_t^{(r)})_{t≥0} is stationary.
Theorem (ACF of (G_t^{(r)})): Let (L_t)_{t≥0} be a pure jump process (τ_L² = 0) with EL_1² < ∞, EL_1 = 0, and Ψ_X(1) < 0. Then

  E G_t^{(r)} = 0,
  E (G_t^{(r)})² = ( ωr / |Ψ_X(1)| ) EL_1²,
  cov( G_t^{(r)}, G_{t+h}^{(r)} ) = 0,  h ≥ r.

If EL_1⁴ < ∞ and Ψ_X(2) < 0, then for h ≥ r,

  cov( (G_t^{(r)})², (G_{t+h}^{(r)})² ) = ( (1 − e^{−r|Ψ_X(1)|}) / |Ψ_X(1)| ) EL_1² cov(σ_r², G_r²) e^{−h|Ψ_X(1)|}.  (5)

If EL_1⁸ < ∞, Ψ_X(4) < 0 and ∫_ℝ x³ Π_L(dx) = 0, then (5) > 0.
Theorem (tail behaviour): Suppose that for D := { d ∈ [0, ∞) : E|L_1|^{2d} < ∞ } we have sup D ∉ D. Then there exist C > 0 and κ ∈ D such that

  lim_{x→∞} x^κ P(σ² > x) = C.

If furthermore there is α > 4κ with E|L_1|^α < ∞, and (L_t)_{t≥0} is of finite variation and not the negative of a subordinator, then for every t > 0 there is C_{1,t} > 0 such that

  lim_{x→∞} x^{2κ} P(G_t > x) = C_{1,t}.
Proof: Write

  σ_1² = e^{−X_1} σ_0² + ω ∫_0^1 e^{X_s − X_1} ds,

where σ_0² is independent of ( e^{−X_1}, ω ∫_0^1 e^{X_s − X_1} ds ). Hence the stationary solution σ² satisfies a random fixed point equation

  σ² =_d Mσ² + Q,  M =_d e^{−X_1},  Q =_d ω ∫_0^1 e^{−X_s} ds.

Then apply the theorem of Goldie (1991).
References

Drost, F. and Werker, B. (1996) Closing the GARCH gap: continuous time GARCH modelling. J. Econometrics 74, 31–57.

Duan, J.-C. (1997) Augmented GARCH(p, q) process and its diffusion limit. J. Econometrics 79, 97–127.

Erickson, B. and Maller, R. (2004) Generalised Ornstein-Uhlenbeck processes and the convergence of Lévy integrals. Séminaire de Probabilités, to appear.

Goldie, C.M. (1991) Implicit renewal theory and tails of solutions of random equations. Ann. Appl. Probab. 1, 126–166.

Klüppelberg, C., Lindner, A. and Maller, R. (2004a) A continuous time GARCH process driven by a Lévy process: stationarity and second order behaviour. J. Appl. Probab. 41, 601–622.

Klüppelberg, C., Lindner, A. and Maller, R. (2004b) Continuous time volatility modelling: COGARCH versus Ornstein-Uhlenbeck models. Invited paper for the Bachelier Colloquium 2005.

Nelson, D. (1990) ARCH models as diffusion approximations. J. Econometrics 45, 7–38.

Wang, Y. (2002) Asymptotic nonequivalence of GARCH models and diffusions. Ann. Statist. 30, 754–783.
[Figure 1 (plots omitted): Simulated compound Poisson process (L_t)_{0≤t≤10000} with rate 1 and standard normally distributed jump sizes (first panel), with corresponding COGARCH process (G_t) (second), volatility process (σ_t) (third), and differenced COGARCH process (G_t^{(1)}) of order 1, where G_t^{(1)} = G_{t+1} − G_t (last). The parameters were β = 1, δ = 0.95 and λ = 0.045. The starting value was chosen as σ_0 = 10.]
[Figure 2 (plots omitted): Sample autocorrelation functions of σ_t (top left), σ_t² (top right), G_t^{(1)} (bottom left) and (G_t^{(1)})² (bottom right), for the process simulated in Figure 1. The dashed lines in the bottom graphs show the confidence bounds ±1.96/√9999.]