GARCH processes continuous counterparts (Part 2)


Alexander Lindner
Centre of Mathematical Sciences
Technical University of Munich
D-85747 Garching, Germany
lindner@ma.tum.de
http://www-m1.ma.tum.de/m4/pers/lindner/

Maphysto Workshop Non-Linear Time Series Modeling
Copenhagen, September 27-30, 2004

Contents

1. Why processes in continuous time?
2. The GARCH diffusion limit of Nelson
3. The COGARCH process: definition and properties

Why continuous time models?

Observations are quite often irregularly spaced.
Observations quite often come in at a very high frequency.
Then a continuous time model may provide a better approximation to the discrete data than a discrete model.

Aim: Construct continuous time models with features of GARCH.

The diffusion approximation of Nelson

GARCH(1,1) process:
$\sigma_n^2 = \omega + \lambda\sigma_{n-1}^2\varepsilon_{n-1}^2 + \delta\sigma_{n-1}^2 = \omega + (\lambda\varepsilon_{n-1}^2 + \delta)\,\sigma_{n-1}^2, \qquad Y_n = \sigma_n\varepsilon_n,$
where $(\varepsilon_n)_{n\in\mathbb{N}}$ is i.i.d.

Set
$G_n := \sum_{i=0}^{n} Y_i, \qquad n \in \mathbb{N}_0.$
Then $G_n - G_{n-1} = Y_n$, so the increments of $(G_n)_{n\in\mathbb{N}_0}$ are a GARCH process, i.e. $(G_n)$ is accumulated GARCH.

Question: Can we find a sequence of processes, whose increments on ever finer grids are GARCH processes, such that the processes converge in distribution to a non-trivial limit process?
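
To make the accumulated-GARCH construction concrete, here is a minimal simulation sketch. It is not part of the original slides; the parameter values and the use of standard normal innovations are illustrative assumptions.

```python
import numpy as np

def simulate_garch11(n, omega, lam, delta, sigma2_0=1.0, rng=None):
    """Simulate a GARCH(1,1) process Y_n = sigma_n * eps_n with
    sigma_n^2 = omega + (lam * eps_{n-1}^2 + delta) * sigma_{n-1}^2,
    and return the accumulated process G_n = Y_0 + ... + Y_n."""
    rng = np.random.default_rng(rng)
    eps = rng.standard_normal(n + 1)          # i.i.d. N(0,1) innovations (assumption)
    sigma2 = np.empty(n + 1)
    sigma2[0] = sigma2_0
    for k in range(1, n + 1):
        sigma2[k] = omega + (lam * eps[k - 1] ** 2 + delta) * sigma2[k - 1]
    Y = np.sqrt(sigma2) * eps                 # GARCH(1,1) observations
    G = np.cumsum(Y)                          # accumulated GARCH
    return Y, sigma2, G

# illustrative parameter values, not taken from the slides
Y, sigma2, G = simulate_garch11(10_000, omega=0.1, lam=0.05, delta=0.9, rng=1)
```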

Setup: Take grid width $h > 0$:
${}_hG_{nh} = {}_hG_{(n-1)h} + {}_h\sigma_{nh}\,{}_h\varepsilon_{nh}, \qquad n \in \mathbb{N},$
${}_h\sigma^2_{(n+1)h} = \omega_h + \bigl(h^{-1}\lambda_h\,{}_h\varepsilon^2_{nh} + \delta_h\bigr)\,{}_h\sigma^2_{nh}, \qquad n \in \mathbb{N}_0,$
$({}_h\varepsilon_{nh})_{n\in\mathbb{N}_0}$ i.i.d. $N(0,h)$,
$({}_h\sigma^2_0, {}_hG_0)$ independent of $({}_h\varepsilon_{nh})_{n\in\mathbb{N}}$,
$\omega_h > 0, \quad \lambda_h \ge 0, \quad \delta_h \ge 0.$

Then $({}_hG_{nh} - {}_hG_{(n-1)h})_{n\in\mathbb{N}}$ is a GARCH(1,1) process.

${}_hG_t := {}_hG_{nh}, \quad {}_h\sigma^2_t := {}_h\sigma^2_{nh}, \qquad nh \le t < (n+1)h,$
defines $({}_hG_t, {}_h\sigma^2_t)$ for all $t \in \mathbb{R}_+$.

Question: When does $({}_hG_t, {}_h\sigma^2_t)_{t\ge 0}$ converge weakly to a process $(G, \sigma^2)$ as $h \downarrow 0$? (Weak convergence is in the space $D(\mathbb{R}_+, \mathbb{R}^2)$ of càdlàg functions, endowed with the Borel sets of the Skorohod topology; weak convergence of processes implies in particular convergence of finite dimensional distributions.)

Theorem (Nelson, 1990): Suppose
$({}_hG_0, {}_h\sigma^2_0) \xrightarrow{d} (G_0, \sigma^2_0)$ as $h \downarrow 0$, with $P(\sigma^2_0 > 0) = 1$,
$\lim_{h\downarrow 0} h^{-1}\omega_h = \omega \ge 0,$
$\lim_{h\downarrow 0} h^{-1}(1 - \delta_h - \lambda_h) = \theta,$
$\lim_{h\downarrow 0} 2h^{-1}\lambda_h^2 = \lambda^2 > 0.$
Then $({}_hG, {}_h\sigma^2)$ converges weakly as $h \downarrow 0$ to the unique solution $(G, \sigma^2)$ of the diffusion equations
$dG_t = \sigma_t\, dB^{(1)}_t, \qquad (1)$
$d\sigma^2_t = (\omega - \theta\sigma^2_t)\, dt + \lambda\sigma^2_t\, dB^{(2)}_t, \qquad (2)$
with starting value $(G_0, \sigma^2_0)$, where $(B^{(1)}_t)_{t\ge 0}$ and $(B^{(2)}_t)_{t\ge 0}$ are independent Brownian motions, independent of $(G_0, \sigma^2_0)$.

If $2\theta/\lambda^2 > -1$ and $\omega > 0$, then the solution $(\sigma^2_t)_{t\ge 0}$ is strictly stationary iff $\sigma_0^{-2} \stackrel{d}{=} \Gamma(1 + 2\theta/\lambda^2,\, 2\omega/\lambda^2)$, i.e. iff $\sigma_0^2$ is inverse gamma distributed.

Example: $\omega_h = \omega h$, $\quad \delta_h = 1 - \lambda\sqrt{h/2} - \theta h$, $\quad \lambda_h = \lambda\sqrt{h/2}$.
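
The following sketch implements the grid-$h$ scheme with the example parameterisation $\omega_h = \omega h$, $\delta_h = 1 - \lambda\sqrt{h/2} - \theta h$, $\lambda_h = \lambda\sqrt{h/2}$. It is an illustrative implementation of the setup above, not code from the talk, and the parameter values are arbitrary.

```python
import numpy as np

def nelson_scheme(T, h, omega, theta, lam, G0=0.0, sigma2_0=1.0, rng=None):
    """GARCH(1,1) scheme on a grid of width h:
        sigma^2_{(n+1)h} = omega_h + (h^{-1} lam_h eps_{nh}^2 + delta_h) sigma^2_{nh},
        G_{(n+1)h}       = G_{nh} + sigma_{(n+1)h} * eps_{(n+1)h},
    with eps_{nh} i.i.d. N(0, h) and the example parameterisation
        omega_h = omega*h, lam_h = lam*sqrt(h/2), delta_h = 1 - lam_h - theta*h."""
    rng = np.random.default_rng(rng)
    n_steps = int(T / h)
    omega_h = omega * h
    lam_h = lam * np.sqrt(h / 2.0)
    delta_h = 1.0 - lam_h - theta * h
    eps = rng.normal(0.0, np.sqrt(h), n_steps + 1)   # eps[n] ~ N(0, h) at time nh
    G = np.empty(n_steps + 1)
    sigma2 = np.empty(n_steps + 1)
    G[0], sigma2[0] = G0, sigma2_0
    for n in range(n_steps):
        sigma2[n + 1] = omega_h + (lam_h * eps[n] ** 2 / h + delta_h) * sigma2[n]
        G[n + 1] = G[n] + np.sqrt(sigma2[n + 1]) * eps[n + 1]
    return G, sigma2

# finer grids approximate the diffusion limit (1)-(2); parameters are illustrative
for h in (0.1, 0.01, 0.001):
    G, sigma2 = nelson_scheme(T=10.0, h=h, omega=0.5, theta=1.0, lam=0.8, rng=0)
    print(h, sigma2[-1])
```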

Interpretation: The solution of (1) and (2) can be approximated by a GARCH process in discrete time.

Observe:
The stationary limiting process $\sigma^2$ has Pareto-like tails.
The limit $(G, \sigma^2)$ is driven by two independent Brownian motions. The GARCH process has only one source of randomness!
The processes $G$ and $\sigma^2$ are continuous. But empirical volatility can exhibit jumps.
Estimation of the parameters of the diffusion limit and of the discrete GARCH processes may lead to significantly different results (Wang, 2002).

Further literature: Drost and Werker (1996), Duan (1997)

The COGARCH(1,1) process
Klüppelberg, Lindner, Maller (2004)

Recall discrete GARCH(1,1): $Y_n = \sigma_n\varepsilon_n$ and
$\sigma_n^2 = \omega + \lambda Y_{n-1}^2 + \delta\sigma_{n-1}^2 = \omega + (\delta + \lambda\varepsilon_{n-1}^2)\,\sigma_{n-1}^2$
$= \omega \sum_{i=0}^{n-1} \prod_{j=i+1}^{n-1} (\delta + \lambda\varepsilon_j^2) + \sigma_0^2 \prod_{j=0}^{n-1} (\delta + \lambda\varepsilon_j^2)$
$= \omega \int_0^n \exp\Bigl\{\sum_{j=\lceil s\rceil}^{n-1} \log(\delta + \lambda\varepsilon_j^2)\Bigr\}\, ds + \sigma_0^2 \exp\Bigl\{\sum_{j=0}^{n-1} \log(\delta + \lambda\varepsilon_j^2)\Bigr\},$
where the sums $\sum_j \log(\delta + \lambda\varepsilon_j^2)$ are random walks, and
$Y_n = \sigma_n\varepsilon_n = \sigma_n \Bigl(\sum_{j=0}^{n}\varepsilon_j - \sum_{j=0}^{n-1}\varepsilon_j\Bigr)$
is $\sigma_n$ times an increment of a random walk.

Both appearing random walks are linked.
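
As a quick sanity check of the solved recursion (again not from the slides), the following snippet verifies numerically that the product representation coincides with iterating the recursion; the parameter values and standard normal innovations are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(42)
omega, lam, delta, sigma2_0 = 0.1, 0.05, 0.9, 1.5   # arbitrary illustrative values
n = 8
eps = rng.standard_normal(n)

# recursion: sigma_k^2 = omega + (delta + lam*eps_{k-1}^2) * sigma_{k-1}^2
sigma2 = sigma2_0
for k in range(1, n + 1):
    sigma2 = omega + (delta + lam * eps[k - 1] ** 2) * sigma2

# solved form: omega * sum_{i=0}^{n-1} prod_{j=i+1}^{n-1} (delta + lam*eps_j^2)
#              + sigma_0^2 * prod_{j=0}^{n-1} (delta + lam*eps_j^2)
a = delta + lam * eps ** 2
solved = omega * sum(np.prod(a[i + 1:n]) for i in range(n)) + sigma2_0 * np.prod(a[:n])

assert np.isclose(sigma2, solved)
```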

Idea: Replace the appearing random walks by Lévy processes (= continuous time analogue of random walks). Replace $\varepsilon_j$ by jumps of a Lévy process $L$.

Recall: A stochastic process $(L_t)_{t\ge 0}$ is a Lévy process iff
it has independent increments: for $0 \le a < b \le c < d$, $L_d - L_c$ and $L_b - L_a$ are independent,
it has stationary increments: the distribution of $L_{t+s} - L_t$ does not depend on $t$,
it is stochastically continuous,
with probability one it has right-continuous paths with finite left limits (càdlàg paths),
$L_0 = 0$ a.s.

Examples

Brownian motion (has normal increments).

Compound Poisson process: $(\varepsilon_n)_{n\in\mathbb{N}}$ i.i.d. sequence, independent of $(v_n)_{n\in\mathbb{N}}$ i.i.d. with exponential distribution with mean $c$;
$T_n := \sum_{j=1}^{n} v_j, \qquad N_t := \max\{n \in \mathbb{N}_0 : T_n \le t\}, \qquad L_t := \sum_{j=1}^{N_t} \varepsilon_j.$

Note: All Lévy processes apart from Brownian motion have jumps.
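
A possible implementation of this construction (illustrative, not from the slides). It is parameterised by the jump rate, so the mean waiting time is the reciprocal of the rate, and standard normal jump sizes are assumed, as in the simulation of Figure 1 below.

```python
import numpy as np

def compound_poisson(t_max, rate, rng=None):
    """Compound Poisson path via the construction on the slide:
    exponential waiting times v_j (mean 1/rate), jump times T_n, counting
    process N_t, and L_t = sum of the first N_t i.i.d. jump sizes."""
    rng = np.random.default_rng(rng)
    jump_times, jumps, t = [], [], 0.0
    while True:
        t += rng.exponential(1.0 / rate)        # waiting time v_j
        if t > t_max:
            break
        jump_times.append(t)
        jumps.append(rng.standard_normal())     # jump size eps_j (assumption)
    return np.array(jump_times), np.array(jumps)

def evaluate(jump_times, jumps, t):
    """L_t = sum_{j <= N_t} eps_j."""
    return jumps[jump_times <= t].sum()

T_jumps, sizes = compound_poisson(t_max=100.0, rate=1.0, rng=3)
print(evaluate(T_jumps, sizes, 50.0))
```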

Lévy-Khintchine formula

$(L_t)_{t\ge 0}$ Lévy process:
$E e^{isL_t} = e^{t\chi_L(s)}, \qquad s \in \mathbb{R},$
$\chi_L(s) = i\gamma_L s - \frac{\tau_L^2 s^2}{2} + \int_{\mathbb{R}} \bigl(e^{isx} - 1 - isx\,\mathbf{1}_{\{|x|<1\}}\bigr)\,\Pi_L(dx), \qquad s \in \mathbb{R}.$

$(\gamma_L, \tau_L^2, \Pi_L)$ characteristic triplet;
$\tau_L^2 \ge 0$ Brownian part;
$\Pi_L$ Lévy measure: $\int_{|x|<1} x^2\,\Pi_L(dx) < \infty$, $\int_{|x|\ge 1} \Pi_L(dx) < \infty$.

If $\int_{|x|<1} |x|\,\Pi_L(dx) < \infty$ and $\tau_L^2 = 0$: finite variation case,
$\chi_L(s) = i\gamma_{L,0}\, s + \int_{\mathbb{R}} (e^{isx} - 1)\,\Pi_L(dx), \qquad s \in \mathbb{R},$
with $\gamma_{L,0}$ the drift of $L$.
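
As a worked example (not stated on the slide, but immediate from the finite variation form above): for a compound Poisson process with jump intensity $c$ and jump distribution $P_\varepsilon$ one has $\tau_L^2 = 0$, drift $\gamma_{L,0} = 0$ and finite Lévy measure $\Pi_L = c\, P_\varepsilon$, so
$\chi_L(s) = \int_{\mathbb{R}} \bigl(e^{isx} - 1\bigr)\, c\, P_\varepsilon(dx) = c\,\bigl(E e^{is\varepsilon} - 1\bigr), \qquad s \in \mathbb{R}.$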

Jumps of Lévy processes

$\Delta L_t := L_t - L_{t-}$; the jumps are totally described by the Lévy measure $\Pi_L$.

$\Pi_L$ infinite (not compound Poisson) $\Longrightarrow$ almost surely, $(L_t)_{t\ge 0}$ has infinitely many jumps in finite time intervals.

$\sum_{0<s\le t} |\Delta L_s| < \infty \iff \int_{|x|<1} |x|\,\Pi_L(dx) < \infty.$

Always: $\sum_{0<s\le t} |\Delta L_s|^2 < \infty$ almost surely.

COGARCH(1,1) - definition

For $\Pi_L \ne 0$, $\delta > 0$, $\lambda \ge 0$ define the auxiliary Lévy process
$X_t = -t\log\delta - \sum_{0<s\le t} \log\Bigl(1 + \frac{\lambda}{\delta}(\Delta L_s)^2\Bigr), \qquad t \ge 0.$

For $\omega > 0$ and a finite random variable $\sigma_0^2$ independent of $(L_t)_{t\ge 0}$ define the volatility process
$\sigma_t^2 = \Bigl(\omega \int_0^t e^{X_s}\, ds + \sigma_0^2\Bigr)\, e^{-X_t}, \qquad t \ge 0.$

Define the COGARCH(1,1) process $(G_t)_{t\ge 0}$ by
$G_0 = 0, \qquad dG_t = \sigma_t\, dL_t, \qquad t \ge 0.$

Note: $G$ jumps at the same times as $L$, with jump size $\Delta G_t = \sigma_t\,\Delta L_t$.
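
Below is a simulation sketch of a compound-Poisson-driven COGARCH(1,1) path (my own illustrative code, not from the talk). Between jumps the volatility follows the deterministic equation $d\sigma_t^2 = (\omega + \log\delta\,\sigma_t^2)\,dt$ (see (3) below), and at a jump the volatility is multiplied by $1 + (\lambda/\delta)(\Delta L_t)^2$; using the pre-jump volatility for the jump of $G$ is a convention assumed in this sketch.

```python
import numpy as np

def simulate_cogarch_cp(t_max, rate, omega, delta, lam, sigma2_0, rng=None):
    """Sketch of a COGARCH(1,1) path driven by a compound Poisson process with
    jump intensity `rate` and standard normal jump sizes.  Between jumps the
    volatility solves d(sigma_t^2) = (omega + log(delta)*sigma_t^2) dt; at a
    jump of size eps the volatility is multiplied by (1 + (lam/delta)*eps^2)
    and G jumps by sigma_{t-} * eps (pre-jump volatility convention)."""
    rng = np.random.default_rng(rng)
    t, sigma2, G = 0.0, sigma2_0, 0.0
    times, sig2_path, G_path = [0.0], [sigma2_0], [0.0]
    while True:
        dt = rng.exponential(1.0 / rate)          # waiting time to next jump
        if t + dt > t_max:
            break
        t += dt
        # exact solution of the linear ODE between jumps
        sigma2 = sigma2 * delta ** dt + omega * (1.0 - delta ** dt) / (-np.log(delta))
        eps = rng.standard_normal()               # jump size of the driver L
        G += np.sqrt(sigma2) * eps                # jump of G, pre-jump volatility
        sigma2 *= 1.0 + (lam / delta) * eps ** 2  # volatility jump
        times.append(t); sig2_path.append(sigma2); G_path.append(G)
    return np.array(times), np.array(sig2_path), np.array(G_path)

# parameters as in Figure 1 below (its beta is taken to correspond to omega here)
times, sig2, G = simulate_cogarch_cp(10_000.0, rate=1.0, omega=1.0,
                                     delta=0.95, lam=0.045, sigma2_0=100.0, rng=7)
```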

Properties

(1) $(X_t)_{t\ge 0}$ is a spectrally negative Lévy process of finite variation, with Brownian part $0$, drift $(-\log\delta)$ and Lévy measure $\Pi_X$ given by
$\Pi_X([0,\infty)) = 0, \qquad \Pi_X\bigl((-\infty, -x]\bigr) = \Pi_L\bigl(\{y : |y| \ge \sqrt{(e^x - 1)\delta/\lambda}\}\bigr), \qquad x > 0.$

Proof: By definition, $\gamma_X = -\log\delta$, $\tau_X^2 = 0$,
$\Pi_X\bigl((-\infty, -x]\bigr) = E\Bigl[\sum_{0<s\le 1} \mathbf{1}_{\{\log(1 + (\lambda/\delta)(\Delta L_s)^2) \ge x\}}\Bigr] = E\Bigl[\sum_{0<s\le 1} \mathbf{1}_{\{|\Delta L_s| \ge \sqrt{(e^x-1)\delta/\lambda}\}}\Bigr], \qquad x > 0,$
and
$\int_{|x|<1} |x|\,\Pi_X(dx) = \int_{\{y:\,\log(1+(\lambda/\delta)y^2) < 1\}} \log\Bigl(1 + \frac{\lambda}{\delta}y^2\Bigr)\,\Pi_L(dy) < \infty.$

(2) $(\sigma_t^2)_{t\ge 0}$ satisfies the SDE
$d\sigma_t^2 = \omega\, dt + \sigma_{t-}^2\, e^{X_{t-}}\, d\bigl(e^{-X_t}\bigr), \qquad t \ge 0,$
and
$\sigma_t^2 = \sigma_0^2 + \omega t + \log\delta \int_0^t \sigma_s^2\, ds + \frac{\lambda}{\delta}\sum_{0<s\le t} \sigma_{s-}^2 (\Delta L_s)^2, \qquad t \ge 0. \qquad (3)$

Proof: Itô's Lemma.

Compare (3) to discrete-time GARCH(1,1):
$\sigma_n^2 - \sigma_{n-1}^2 = \omega - (1-\delta)\sigma_{n-1}^2 + \lambda\sigma_{n-1}^2\varepsilon_{n-1}^2$
$\Longrightarrow \quad \sigma_n^2 = \sigma_0^2 + \omega n - (1-\delta)\sum_{i=0}^{n-1}\sigma_i^2 + \lambda\sum_{i=0}^{n-1}\sigma_i^2\varepsilon_i^2.$
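
Since the Itô step is left implicit on the slide, here is one way to see (3) from the SDE, using that $X$ has finite variation and no Brownian part (a sketch of the argument, not the slide's own proof). Between jumps $dX_t = -\log\delta\, dt$, so $d(e^{-X_t}) = (\log\delta)\, e^{-X_t}\, dt$; at a jump time,
$\Delta e^{-X_t} = e^{-X_{t-}}\bigl(e^{-\Delta X_t} - 1\bigr) = e^{-X_{t-}}\,\frac{\lambda}{\delta}(\Delta L_t)^2.$
Hence
$\sigma_{t-}^2\, e^{X_{t-}}\, d\bigl(e^{-X_t}\bigr) = (\log\delta)\,\sigma_{t-}^2\, dt + \frac{\lambda}{\delta}\,\sigma_{t-}^2\,(\Delta L_t)^2,$
and integrating the SDE from $0$ to $t$ gives (3).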

(3) Stationarity: Suppose
$\int_{\mathbb{R}} \log\Bigl(1 + \frac{\lambda}{\delta}x^2\Bigr)\,\Pi_L(dx) < -\log\delta. \qquad (4)$
Then $\sigma_t^2 \xrightarrow{d} \sigma_\infty^2$ as $t \to \infty$, where $\sigma_\infty^2 \stackrel{d}{=} \omega\int_0^\infty e^{-X_t}\,dt$. Otherwise, $\sigma_t^2 \xrightarrow{P} \infty$.

Proof: Erickson & Maller (2004).

Example: $L$ compound Poisson with rate $c$ and jump distribution $\varepsilon$, i.e. $\Pi_L = c\,P_\varepsilon$:
$(4) \iff c\, E\log\Bigl(1 + \frac{\lambda}{\delta}\varepsilon^2\Bigr) < -\log\delta \iff \frac{1}{c}\log\delta + E\log(\delta + \lambda\varepsilon^2) < \log\delta.$
If $c = 1$, then $(4) \iff E\log(\delta + \lambda\varepsilon^2) < 0$.
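
For the concrete case $c = 1$ with standard normal jumps and the Figure 1 parameters $\delta = 0.95$, $\lambda = 0.045$ (my choice, for illustration), condition (4) can be checked by a quick Monte Carlo estimate of $E\log(\delta + \lambda\varepsilon^2)$:

```python
import numpy as np

# Monte Carlo check of the stationarity condition (4) for the compound Poisson
# example with rate c = 1 and standard normal jump sizes: is
# E[log(delta + lam * eps^2)] < 0 for the illustrative Figure 1 parameters?
rng = np.random.default_rng(0)
delta, lam = 0.95, 0.045
eps = rng.standard_normal(1_000_000)
estimate = np.mean(np.log(delta + lam * eps ** 2))
print(estimate)   # a clearly negative value indicates (4) holds for these parameters
```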

(4) $(\sigma_t^2)_{t\ge 0}$ is a Markov process; hence $(\sigma_t^2)_{t\ge 0}$ is strictly stationary for $\sigma_0^2 \stackrel{d}{=} \sigma_\infty^2$.

(5) $(\sigma_t^2, G_t)_{t\ge 0}$ is a bivariate Markov process.

(6) If $(\sigma_t^2)_{t\ge 0}$ is stationary, then $(G_t)_{t\ge 0}$ has stationary increments.

Second order properties of $(\sigma_t^2)_{t\ge 0}$

$X_t = -t\log\delta - \sum_{0<s\le t} \log\Bigl(1 + \frac{\lambda}{\delta}(\Delta L_s)^2\Bigr), \quad t \ge 0,$ is a Lévy process of finite variation,
$\sigma_t^2 = \Bigl(\omega\int_0^t e^{X_s}\,ds + \sigma_0^2\Bigr)e^{-X_t}, \quad t \ge 0.$

Define $E e^{-cX_t} = e^{t\Psi_X(c)}$; then
$\Psi_X(c) = \log E e^{-cX_1} = c\log\delta + \int_{\mathbb{R}} \Bigl(\bigl(1 + \tfrac{\lambda}{\delta}y^2\bigr)^c - 1\Bigr)\,\Pi_L(dy).$

For $c > 0$ and, where a stationary version appears, $(\sigma_t^2)_{t\ge 0}$ stationary:

(1) $E e^{-cX_1} < \infty \iff E|L_1|^{2c} < \infty \iff \Psi_X(c) < \infty.$

(2) $EL_1^2 < \infty$ and $\Psi_X(1) < 0$ $\Longrightarrow$ $\sigma_t^2 \xrightarrow{d} \sigma_\infty^2$ (a finite random variable).

(3) $E(\sigma_\infty^2)^k < \infty \iff EL_1^{2k} < \infty$ and $\Psi_X(k) < 0$. In that case
$E\sigma_\infty^{2k} = \frac{k!\,\omega^k}{\prod_{l=1}^{k} \bigl(-\Psi_X(l)\bigr)},$
$\mathrm{cov}(\sigma_t^2, \sigma_{t+h}^2) = \omega^2\Bigl(\frac{2}{\Psi_X(1)\Psi_X(2)} - \frac{1}{(\Psi_X(1))^2}\Bigr)\, e^{h\Psi_X(1)}, \qquad h \ge 0.$

(4) $0 < \delta < 1$, $\lambda > 0$ $\Longrightarrow$ $\sigma_\infty$ always has infinite moments: $E\sigma_\infty^{2k} = \infty$ for $k$ sufficiently large.
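
The sketch below evaluates $\Psi_X$ by Monte Carlo for the compound Poisson driver with rate 1 and standard normal jumps, and then the stationary moments from the formula above; the parameters are the illustrative Figure 1 values, not prescribed by the slides.

```python
import numpy as np
from math import factorial

# Psi_X(c) = c*log(delta) + E[(1 + (lam/delta)*eps^2)^c - 1] for the compound
# Poisson driver with rate 1 and N(0,1) jumps (Monte Carlo), and the stationary
# moments E[sigma^{2k}] = k! * omega^k / prod_{l=1}^k (-Psi_X(l)) when they exist.
rng = np.random.default_rng(1)
delta, lam, omega = 0.95, 0.045, 1.0          # illustrative (Figure 1) parameters
eps2 = rng.standard_normal(2_000_000) ** 2

def psi_X(c):
    return c * np.log(delta) + np.mean((1.0 + (lam / delta) * eps2) ** c - 1.0)

for k in (1, 2):
    if psi_X(k) < 0:                           # moment of order k exists
        moment = factorial(k) * omega ** k / np.prod([-psi_X(l) for l in range(1, k + 1)])
        print(k, moment)
```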

Second order properties of $(G_t)_{t\ge 0}$

$dG_t = \sigma_t\, dL_t$ $\Longrightarrow$ $G$ jumps at the same times as $L$ does: $\Delta G_t = \sigma_t\,\Delta L_t$
$\Longrightarrow$ for fixed $t > 0$: $E(\Delta G_t)^k = 0.$

Define, for fixed $r > 0$, the increment process
$G_t^{(r)} := G_{t+r} - G_t = \int_{(t, t+r]} \sigma_s\, dL_s.$
Take $(\sigma_t^2)_{t\ge 0}$ stationary $\Longrightarrow$ $(G_t^{(r)})_{t\ge 0}$ is stationary.

Theorem (ACF of $(G_t^{(r)})$): Let $(L_t)_{t\ge 0}$ be a pure jump process ($\tau_L^2 = 0$) with $EL_1^2 < \infty$, $EL_1 = 0$ and $\Psi_X(1) < 0$. Then, for $t \ge 0$ and $h \ge r > 0$,
$EG_t^{(r)} = 0,$
$E(G_t^{(r)})^2 = \frac{\omega r}{-\Psi_X(1)}\, EL_1^2,$
$\mathrm{cov}\bigl(G_t^{(r)}, G_{t+h}^{(r)}\bigr) = 0.$
If $EL_1^4 < \infty$ and $\Psi_X(2) < 0$, then
$\mathrm{cov}\bigl((G_t^{(r)})^2, (G_{t+h}^{(r)})^2\bigr) = \bigl(e^{-r\Psi_X(1)} - 1\bigr)\,\frac{EL_1^2}{-\Psi_X(1)}\;\mathrm{cov}(\sigma_r^2, G_r^2)\; e^{h\Psi_X(1)}. \qquad (5)$
If $EL_1^8 < \infty$, $\Psi_X(4) < 0$ and $\int_{\mathbb{R}} x^3\,\Pi_L(dx) = 0$, then $(5) > 0$.

Theorem (Tail behaviour): Suppose that for
$D := \{d \in [0,\infty) : E|L_1|^{2d} < \infty\}$
we have $\sup D \notin D$. Then there exist $C > 0$ and $\kappa \in D$ such that
$\lim_{x\to\infty} x^{\kappa}\, P(\sigma_\infty^2 > x) = C.$
If furthermore there is $\alpha > 4\kappa$ with $E|L_1|^{\alpha} < \infty$, and $(L_t)_{t\ge 0}$ is of finite variation and not the negative of a subordinator, then for every $t > 0$ there is a constant $C_{1,t} > 0$ such that
$\lim_{x\to\infty} x^{2\kappa}\, P(G_t > x) = C_{1,t}.$

Proof: Write
$\sigma_1^2 = e^{-X_1}\sigma_0^2 + \omega\int_0^1 e^{X_s - X_1}\, ds,$
where $\sigma_0^2$ is independent of $\bigl(e^{-X_1},\ \omega\int_0^1 e^{X_s - X_1}\, ds\bigr)$. Hence the stationary solution $\sigma_\infty^2$ satisfies a random fixed point equation
$\sigma_\infty^2 \stackrel{d}{=} M\sigma_\infty^2 + Q, \qquad M \stackrel{d}{=} e^{-X_1}, \qquad Q \stackrel{d}{=} \omega\int_0^1 e^{-X_s}\, ds.$
Then apply the theorem of Goldie (1991).
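
Goldie's condition $EM^{\kappa} = 1$ with $M \stackrel{d}{=} e^{-X_1}$ reads, in terms of the Laplace exponent defined earlier, $\Psi_X(\kappa) = 0$. The following sketch locates $\kappa$ numerically for the compound Poisson driver with standard normal jumps and the Figure 1 parameters (an illustration under these assumptions, not part of the slides).

```python
import numpy as np
from scipy.optimize import brentq

# Under sigma^2 =_d M*sigma^2 + Q with M = e^{-X_1}, Goldie's condition
# E[M^kappa] = 1 is Psi_X(kappa) = 0.  Locate kappa by root-finding, with
# Psi_X evaluated by Monte Carlo for the rate-1 compound Poisson driver
# with N(0,1) jumps and the illustrative Figure 1 parameters.
rng = np.random.default_rng(2)
delta, lam = 0.95, 0.045
eps2 = rng.standard_normal(2_000_000) ** 2

def psi_X(c):
    return c * np.log(delta) + np.mean((1.0 + (lam / delta) * eps2) ** c - 1.0)

# Psi_X is negative for small c > 0 and eventually positive, so a root exists
kappa = brentq(psi_X, 1e-6, 50.0)
print(kappa)   # tail index: P(sigma^2 > x) ~ C * x^{-kappa}
```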

References

Drost, F. and Werker, B. (1996) Closing the GARCH gap: continuous time GARCH modelling. J. Econometrics 74, 31-57.

Duan, J.-C. (1997) Augmented GARCH(p,q) process and its diffusion limit. J. Econometrics 79, 97-127.

Erickson, B. and Maller, R. (2004) Generalised Ornstein-Uhlenbeck processes and the convergence of Lévy integrals. Séminaire de Probabilités, to appear.

Goldie, C.M. (1991) Implicit renewal theory and tails of solutions of random equations. Ann. Appl. Probab. 1, 126-166.

Klüppelberg, C., Lindner, A. and Maller, R. (2004a) A continuous time GARCH process driven by a Lévy process: stationarity and second order behaviour. J. Appl. Probab. 41, 601-622.

Klüppelberg, C., Lindner, A. and Maller, R. (2004b) Continuous time volatility modelling: COGARCH versus Ornstein-Uhlenbeck models. Invited paper for the Bachelier Colloquium 2005.

Nelson, D. (1990) ARCH models as diffusion approximations. J. Econometrics 45, 7-38.

Wang, Y. (2002) Asymptotic nonequivalence of GARCH models and diffusions. Ann. Statist. 30, 754-783.

Figure 1: Simulated compound Poisson process $(L_t)_{0 \le t \le 10\,000}$ with rate 1 and standard normally distributed jump sizes (first panel), with corresponding COGARCH process $(G_t)$ (second panel), volatility process $(\sigma_t)$ (third panel) and differenced COGARCH process $(G_t^{(1)})$ of order 1, where $G_t^{(1)} = G_{t+1} - G_t$ (last panel). The parameters were $\beta = 1$, $\delta = 0.95$ and $\lambda = 0.045$. The starting value was chosen as $\sigma_0 = 10$.

Figure 2: Sample autocorrelation functions of $\sigma_t$ (top left), $\sigma_t^2$ (top right), $G_t^{(1)}$ (bottom left) and $(G_t^{(1)})^2$ (bottom right), for the process simulated in Figure 1. The dashed lines in the bottom graphs show the confidence bounds $\pm 1.96/\sqrt{9999}$.