On a class of stochastic differential equations in a financial network model


On a class of stochastic differential equations in a financial network model. Tomoyuki Ichiba, Department of Statistics & Applied Probability, Center for Financial Mathematics and Actuarial Research, University of California, Santa Barbara. Part of this research is joint work with Nils Detering & Jean-Pierre Fouque. Thira, May 2017, Thera Stochastics.

Motivation: On $(\Omega, \mathcal{F}, (\mathcal{F}_t), \mathbb{P})$ let us consider an $\mathbb{R}^N$-valued diffusion process $(X_1(t), \dots, X_N(t))$, $0 \le t < \infty$, induced by the following random graph structure. Suppose that at time $0$ we have a random graph on $N$ vertices $\{1, \dots, N\}$ and define the strength of the connection between vertices $i$ and $j$ by an $\mathcal{F}_0$-measurable random variable $a_{i,j}$ (whose distribution may depend on $N$) for every $1 \le i \ne j \le N$, and fix $a_{i,i} = 0$ for $1 \le i \le N$. We shall consider
$$dX_i(t) = -\frac{1}{N}\sum_{j=1}^{N} a_{i,j}\,\bigl(X_i(t) - X_j(t)\bigr)\,dt + dB_i(t), \qquad i = 1, \dots, N,\ t \ge 0,$$

where $(B_1(t), \dots, B_N(t))$, $t \ge 0$, is a standard $N$-dimensional Brownian motion, independent of $(X_1(0), \dots, X_N(0))$ and of the random variables $(a_{i,j})_{1\le i,j\le N}$. The randomness determined at time $0$ affects the diffusion process $(X_1(\cdot), \dots, X_N(\cdot))$. Deterministic $A$ in the context of financial networks: Carmona, Fouque, Sun ('13), Fouque & Ichiba ('13), ... The system is solvable as a linear stochastic system for $X(\cdot) := (X_1(\cdot), \dots, X_N(\cdot))$. Let $A^{(N)} := (a_{i,j})_{1\le i,j\le N}$ be the $(N\times N)$ random matrix and $B(\cdot)$ the $(N\times 1)$-vector-valued standard Brownian motion.

Then the system can be rewritten as
$$dX(t) = A^{(N)} X(t)\,dt + dB(t), \qquad A^{(N)} := -\frac{1}{N}\Bigl(\mathrm{Diag}\bigl(A^{(N)}\mathbf{1}_N\bigr) - A^{(N)}\Bigr),$$
with a slight abuse of notation, redefining $A^{(N)}$ from the weight matrix; here $\mathbf{1}_N$ is the $(N\times 1)$ vector of ones and $\mathrm{Diag}(c)$ is the diagonal matrix whose diagonal elements are the elements of the vector $c$. Note that each row sum of the matrix $A^{(N)}$ is zero by definition, i.e., $a^{(N)}_{i,i} = -\sum_{j\ne i} a^{(N)}_{i,j}$ for each $i = 1, \dots, N$, where $a^{(N)}_{i,j}$ is the $(i,j)$ element of the random matrix $A^{(N)}$.
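As a sanity check, here is a minimal NumPy sketch of the finite-$N$ system: it builds the zero-row-sum drift matrix from the weights and runs an Euler-Maruyama discretization. The uniform weight law below is an assumption made only for this illustration; the slides leave the distribution of the $a_{i,j}$ unspecified.

```python
import numpy as np

def simulate_network(N=50, T=5.0, n_steps=1000, seed=0):
    """Euler-Maruyama simulation of dX = A X dt + dB, where
    A = (a - Diag(a 1)) / N has zero row sums."""
    rng = np.random.default_rng(seed)
    a = rng.uniform(0.0, 1.0, size=(N, N))    # illustrative weight law (assumption)
    np.fill_diagonal(a, 0.0)                  # a_{i,i} = 0
    A = (a - np.diag(a.sum(axis=1))) / N      # zero-row-sum drift matrix
    dt = T / n_steps
    X = np.zeros((n_steps + 1, N))            # X_i(0) = 0 for simplicity
    for k in range(n_steps):
        dB = rng.normal(scale=np.sqrt(dt), size=N)
        X[k + 1] = X[k] + A @ X[k] * dt + dB
    return X

paths = simulate_network()
print(paths[-1, :5])   # terminal values of the first five coordinates
```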

The solution to this linear equation is given by
$$X(t) = e^{tA^{(N)}} X(0) + \int_0^t e^{(t-s)A^{(N)}}\,dB(s), \qquad t \ge 0.$$
Here $e^{tA^{(N)}}$ is understood as the $(N\times N)$ matrix exponential. Given the initial value $X(0)$ and $A^{(N)}$, the law of $X(t)$ is conditionally an $N$-dimensional Gaussian law with mean $e^{tA^{(N)}} X(0)$ and variance-covariance matrix $\mathrm{Var}(X(t)\,|\,A^{(N)})$.

Q. How should we understand the case $N\to\infty$ of a large network, i.e., what happens as $N\to\infty$?
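A sketch of this conditional Gaussian law, assuming the drift matrix and the initial vector are given as NumPy arrays; the covariance integral $\int_0^t e^{sA}e^{sA^{\top}}\,ds$ is approximated with a simple trapezoidal rule.

```python
import numpy as np
from scipy.linalg import expm

def conditional_moments(A, x0, t, n_quad=200):
    """Conditional moments of X(t) given X(0) = x0 and the matrix A:
    mean = e^{tA} x0, cov = integral_0^t e^{sA} e^{sA^T} ds (trapezoidal rule)."""
    mean = expm(t * A) @ x0
    s_grid = np.linspace(0.0, t, n_quad)
    integrand = np.array([expm(s * A) @ expm(s * A).T for s in s_grid])
    cov = np.trapz(integrand, s_grid, axis=0)
    return mean, cov

# tiny usage example with an arbitrary 3x3 zero-row-sum matrix
A = np.array([[-1.0, 1.0, 0.0], [0.0, -1.0, 1.0], [1.0, 0.0, -1.0]])
m, V = conditional_moments(A, np.zeros(3), 2.0)
print(m, np.diag(V))
```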

For example, if $A^{(N)} \xrightarrow{\text{a.s.}} A^{(\infty)}$ as $N\to\infty$, and if $a^{(\infty)}_{i,j} = 1$ for $j = i+1$, $a^{(\infty)}_{i,i} = -1$, and $a^{(\infty)}_{i,j} = 0$ otherwise, i.e.,
$$A^{(\infty)} := \begin{pmatrix} -1 & 1 & & \\ & -1 & 1 & \\ & & \ddots & \ddots \end{pmatrix},$$
then how can we solve
$$dX_1(t) = (X_2(t) - X_1(t))\,dt + dW_1(t), \qquad dX_2(t) = (X_3(t) - X_2(t))\,dt + dW_2(t), \qquad \dots\ ?$$
Finite $N$ case: Fernholz & Karatzas ('08-'09) studied flows, filtering and pseudo-Brownian motion processes in equity markets.

Example as $N\to\infty$. For simplicity let us set $X_i(0) = 0$. Given $X_2(\cdot)$, we have
$$X_1(t) = \int_0^t e^{-(t-s)} X_2(s)\,ds + \int_0^t e^{-(t-s)}\,dB_1(s),$$
and also, given $X_3(\cdot)$, we have
$$X_2(s) = \int_0^s e^{-(s-u)} X_3(u)\,du + \int_0^s e^{-(s-u)}\,dB_2(u)$$
for $0 \le s \le t$, and hence, substituting $X_2(\cdot)$ into the first equation,
$$X_1(t) = \int_0^t e^{-(t-s)} \int_0^s e^{-(s-u)} X_3(u)\,du\,ds + \int_0^t e^{-(t-s)}\,dB_1(s) + \int_0^t\!\int_0^s e^{-(t-u)}\,dB_2(u)\,ds$$
for $t \ge 0$.

By the product rule for semimartingales, we observe
$$\int_0^t\!\int_0^s e^{u}\,(s-u)^{k-1}\,dB(u)\,ds = \int_0^t e^{u}\,\frac{(t-u)^k}{k}\,dB(u)$$
for $k\in\mathbb{N}$, $t\ge0$, and hence
$$\int_0^t\!\int_0^s e^{u}\,dB(u)\,ds = \int_0^t e^{u}\,(t-u)\,dB(u), \qquad \int_0^t\!\int_0^{s_k}\!\cdots\int_0^{s_1} e^{u}\,dB(u)\,ds_1\cdots ds_k = \int_0^t e^{u}\,\frac{(t-u)^k}{k!}\,dB(u)$$
for $k\in\mathbb{N}$, $t\ge0$. Thus for the above example we have, for $t\ge0$,
$$X_1(t) = \sum_{k=0}^{\infty}\int_0^t e^{-(t-u)}\,\frac{(t-u)^k}{k!}\,dB_{k+1}(u).$$
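The $k=1$ identity above can be checked pathwise on a discretized Brownian path; a small sketch (the horizon and grid size are arbitrary choices made for this check):

```python
import numpy as np

rng = np.random.default_rng(1)
t, n = 2.0, 100_000
u = np.linspace(0.0, t, n + 1)[:-1]             # left endpoints of the partition
dB = rng.normal(scale=np.sqrt(t / n), size=n)   # Brownian increments

inner = np.cumsum(np.exp(u) * dB)               # s -> int_0^s e^u dB(u) on the grid
lhs = np.sum(inner) * (t / n)                   # int_0^t ( int_0^s e^u dB(u) ) ds
rhs = np.sum(np.exp(u) * (t - u) * dB)          # int_0^t e^u (t - u) dB(u)
print(lhs, rhs)                                 # agree up to discretization error
```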

$$X_1(t) = \sum_{k=0}^{\infty}\int_0^t e^{-(t-u)}\,\frac{(t-u)^k}{k!}\,dB_{k+1}(u)$$
is a centered Gaussian process with covariances
$$\mathbb{E}[X_1(s)\,X_1(t)] = e^{-(s+t)}\sum_{k=0}^{\infty}\int_0^s \frac{e^{2u}}{(k!)^2}\,(s-u)^k (t-u)^k\,du = e^{-(t-s)}\int_0^s e^{-2v}\,I_0\bigl(2\sqrt{(t-s+v)\,v}\bigr)\,dv$$
for $0\le s\le t$, where $I_\nu(\cdot)$ is the modified Bessel function of the first kind with parameter $\nu$, i.e.,
$$I_\nu(x) := \sum_{k=0}^{\infty}\Bigl(\frac{x}{2}\Bigr)^{2k+\nu}\,\frac{1}{\Gamma(k+1)\,\Gamma(\nu+k+1)}$$
for $x>0$, $\nu > -1$.
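The series form and the Bessel form of this covariance can be compared numerically; a sketch using SciPy quadrature and `scipy.special.iv` (the truncation level `K` is an arbitrary choice):

```python
import math
from scipy.integrate import quad
from scipy.special import iv

def cov_series(s, t, K=50):
    """E[X_1(s)X_1(t)] from e^{-(s+t)} sum_k int_0^s e^{2u}(s-u)^k(t-u)^k/(k!)^2 du."""
    total = 0.0
    for k in range(K):
        f = lambda u, k=k: math.exp(2*u) * (s - u)**k * (t - u)**k / math.factorial(k)**2
        total += quad(f, 0.0, s)[0]
    return math.exp(-(s + t)) * total

def cov_bessel(s, t):
    """Same covariance via e^{-(t-s)} int_0^s e^{-2v} I_0(2 sqrt((t-s+v) v)) dv."""
    f = lambda v: math.exp(-2*v) * iv(0, 2.0*math.sqrt((t - s + v)*v))
    return math.exp(-(t - s)) * quad(f, 0.0, s)[0]

print(cov_series(1.0, 2.0), cov_bessel(1.0, 2.0))   # the two values should agree
```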

In particular,
$$\mathrm{Var}(X_1(t)) = \int_0^t e^{-2v}\,I_0(2v)\,dv = t\,e^{-2t}\bigl(I_0(2t) + I_1(2t)\bigr) < \infty$$
(it grows as $O(t^{1/2})$ for large $t$; also, $\mathbb{E}[X_1(s)\,X_1(s+t)] = O\bigl(e^{-(t - 2\sqrt{(t+s)s})}\,t^{-1/4}\bigr)$). Thus $X_1(\cdot)$ is not stationary. The (marginal) distribution of $X_k(\cdot)$, $k\in\mathbb{N}$, is the same as that of $X_1(\cdot)$, and hence we may compute (at least numerically)
$$\mathbb{E}[X_1(t)\,X_2(u)] = \int_0^t e^{-(t-s)}\,\mathbb{E}\bigl[X_2(s)\,X_2(u)\bigr]\,ds = \int_0^t e^{-(t-s)}\,\mathbb{E}\bigl[X_1(s)\,X_1(u)\bigr]\,ds$$
and, recursively, $\mathbb{E}[X_1(t)\,X_k(u)]$, $k\in\mathbb{N}$, for $0\le t, u < \infty$.
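A quick numerical check of the variance formula and of its $O(t^{1/2})$ growth (the $\sqrt{t/\pi}$ column is only an illustrative large-$t$ benchmark, not a formula from the slides):

```python
import math
from scipy.integrate import quad
from scipy.special import iv

def var_X1(t):
    """Var(X_1(t)) by quadrature of int_0^t e^{-2v} I_0(2v) dv and by the closed form."""
    integral = quad(lambda v: math.exp(-2*v) * iv(0, 2*v), 0.0, t)[0]
    closed_form = t * math.exp(-2*t) * (iv(0, 2*t) + iv(1, 2*t))
    return integral, closed_form

for t in (1.0, 5.0, 20.0):
    i, c = var_X1(t)
    print(t, i, c, math.sqrt(t / math.pi))   # quadrature, closed form, sqrt(t/pi)
```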

Sample path of $(X_1(\cdot), X_2(\cdot))$ generated from the covariance structure:
$$X_1(t) = \int_0^t e^{-(t-s)} X_2(s)\,ds + \int_0^t e^{-(t-s)}\,dB_1(s), \qquad t\ge0,$$
with $\mathrm{Law}(X_1(\cdot)) = \mathrm{Law}(X_2(\cdot))$. [Figure: simulated sample paths of $X_1$ and $X_2$.]
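One way such a joint path could be generated is to evaluate the covariance kernels on a time grid and draw a Gaussian vector by Cholesky factorization; a sketch (the grid, horizon and jitter are arbitrary choices, and the nested quadrature is slow but simple):

```python
import math
import numpy as np
from scipy.integrate import quad
from scipy.special import iv

def k11(s, t):
    """E[X_1(s) X_1(t)] in the Bessel form, for any s, t >= 0."""
    s, t = min(s, t), max(s, t)
    if s == 0.0:
        return 0.0
    f = lambda v: math.exp(-2.0*v) * iv(0, 2.0*math.sqrt((t - s + v)*v))
    return math.exp(-(t - s)) * quad(f, 0.0, s)[0]

def k12(t, u):
    """E[X_1(t) X_2(u)] = int_0^t e^{-(t-s)} E[X_1(s) X_1(u)] ds."""
    if t == 0.0:
        return 0.0
    return quad(lambda s: math.exp(-(t - s)) * k11(s, u), 0.0, t)[0]

grid = np.linspace(0.0, 3.0, 16)[1:]              # avoid t = 0, where the paths vanish
n = len(grid)
K = np.empty((2*n, 2*n))
for i, t in enumerate(grid):
    for j, u in enumerate(grid):
        K[i, j] = k11(t, u)                       # Cov(X_1(t), X_1(u))
        K[n + i, n + j] = K[i, j]                 # Cov(X_2(t), X_2(u)), same law
        K[i, n + j] = k12(t, u)                   # Cov(X_1(t), X_2(u))
        K[n + j, i] = K[i, n + j]
L = np.linalg.cholesky(K + 1e-6*np.eye(2*n))      # small jitter for numerical safety
z = np.random.default_rng(0).standard_normal(2*n)
sample = L @ z
X1_path, X2_path = sample[:n], sample[n:]         # one joint sample path on the grid
```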

Weighted by Poisson probabilities. An interpretation, for $t\ge0$:
$$X_1(t) = \sum_{k=0}^{\infty}\int_0^t e^{-(t-u)}\,\frac{(t-u)^k}{k!}\,dB_{k+1}(u) =: \sum_{k=0}^{\infty}\int_0^t p_k(t-u)\,dB_{k+1}(u).$$
Suppose $N(s)$, $0\le s\le t$, is a Poisson process with rate $1$, independent of $(B_k(\cdot),\ k\in\mathbb{N})$. Then
$$X_1(t) = \mathbb{E}\Bigl[\,\sum_{k=0}^{\infty}\int_0^t \mathbf{1}\{N(t-u) = k\}\,dB_{k+1}(u)\ \Big|\ \mathcal{F}(t)\Bigr],$$
where $\mathcal{F}(t) := \sigma\bigl(B_k(s);\ 0\le s\le t,\ k\in\mathbb{N}\bigr)$, $t\ge0$.

If we replace the Poisson probabilities by compound Poisson probabilities, i.e.,
$$\widetilde{N}(t) := \sum_{k=1}^{N(t)} \eta_k,$$
where $(\eta_k,\ k\in\mathbb{N})$ are i.i.d. integer-valued random variables with $\mathbb{P}(\eta_1 = i) = p_i$, $1\le i\le q$, $\sum_{i=1}^{q} p_i = 1$ for some $q\in\mathbb{N}$, independent of $N(\cdot)$ and $(B_k(\cdot),\ k\in\mathbb{N})$, then
$$\widetilde{X}_1(t) := \mathbb{E}\Bigl[\,\sum_{k=0}^{\infty}\int_0^t \mathbf{1}\{\widetilde{N}(t-u) = k\}\,dB_{k+1}(u)\ \Big|\ \mathcal{F}(t)\Bigr] = \sum_{k=0}^{\infty}\int_0^t \widetilde{p}_k(t-u)\,dB_{k+1}(u),$$
where
$$\widetilde{p}_k(t) := \frac{1}{k!}\,\frac{\partial^k}{\partial z^k}\exp\Bigl(\sum_{i=1}^{q} p_i\,t\,(z^i - 1)\Bigr)\Big|_{z=0}$$
for $k = 0, 1, 2, \dots$ and $t\ge0$.

This corresponds to the modified matrix
$$A^{(\infty)} := \begin{pmatrix} -1 & p_1 & p_2 & \cdots & p_q & & \\ & -1 & p_1 & p_2 & \cdots & p_q & \\ & & \ddots & \ddots & & & \ddots \end{pmatrix}$$
and
$$dX_k(t) = \Bigl(-X_k(t) + \sum_{i=1}^{q} p_i\,X_{k+i}(t)\Bigr)\,dt + dB_k(t)$$
with $X_k(0) = 0$ for $k\in\mathbb{N}$, $t\ge0$. In particular, if $q = 2$, $p_1 = p_2 = 1/2$, then
$$\widetilde{p}_k(t) = \sum_{j=0}^{\lfloor k/2\rfloor}\frac{e^{-t}\,t^{k-j}}{2^{k-j}\,(k-2j)!\,j!}, \qquad t\ge0,\ k\in\mathbb{N}.$$
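A small check of this explicit formula against the compound Poisson interpretation above (rate $1$, jumps of size $1$ or $2$ with probability $1/2$ each); the Monte Carlo sample size is an arbitrary choice:

```python
import math
import numpy as np

def p_tilde(k, t):
    """P(N~(t) = k) for a rate-1 compound Poisson with jumps 1 or 2, each with prob. 1/2."""
    return sum(
        math.exp(-t) * t**(k - j) / (2**(k - j) * math.factorial(k - 2*j) * math.factorial(j))
        for j in range(k // 2 + 1)
    )

t = 1.5
print(sum(p_tilde(k, t) for k in range(60)))     # the weights sum to ~1
rng = np.random.default_rng(0)
jumps = rng.poisson(t, size=100_000)             # number of Poisson jumps per sample
totals = np.array([rng.integers(1, 3, size=m).sum() for m in jumps])
print(p_tilde(3, t), np.mean(totals == 3))       # explicit formula vs Monte Carlo
```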

Another modification:
$$dX_1(t) = (X_1(t) - X_2(t))\,dt + dB_1(t), \qquad dX_2(t) = (X_2(t) - X_3(t))\,dt + dB_2(t), \qquad \dots$$
We may use the same reasoning in this case to obtain
$$X_1(t) = \sum_{k=0}^{\infty}\int_0^t e^{t-s}\,\frac{(-1)^k\,(t-s)^k}{k!}\,dB_{k+1}(s)$$
with exponentially growing variance
$$\mathrm{Var}(X_1(t)) = t\,e^{2t}\bigl(I_0(2t) - I_1(2t)\bigr), \qquad t\ge0.$$

A formulation of the equation with identical distribution. Let us consider $(\Omega, \mathcal{F}, \mathbb{P}, \{\mathcal{F}(t),\ t\ge0\})$ on which $X(\cdot)$ is an adapted stochastic process which is a weak solution to
$$dX(t) = b(t, X(t), X_1(t))\,dt + \sigma(t, X(t), X_1(t))\,dB(t), \qquad t\ge0,$$
where the $r$-dimensional standard Brownian motion $B(\cdot)$ is independent of the $d$-dimensional process $X_1(\cdot)$, which has the same distribution as $X(\cdot)$ on $[0, T]$, i.e.,
$$\mathrm{Law}(X(s),\ 0\le s\le T) = \mathrm{Law}(X_1(s),\ 0\le s\le T),$$
and also, $\mathbb{P}$-a.s., $X(0) = x\in\mathbb{R}^d$ and
$$\int_0^T\Bigl(\|b(t, X(t), X_1(t))\| + \|\sigma_{i,j}(t, X(t), X_1(t))\|^2\Bigr)\,dt < +\infty$$
for $1\le i\le d$, $1\le j\le r$ and $T\ge0$. Here we assume

that $b: \mathbb{R}_+\times\mathbb{R}^d\times\mathbb{R}^d \to \mathbb{R}^d$ and $\sigma: \mathbb{R}_+\times\mathbb{R}^d\times\mathbb{R}^d \to \mathbb{R}^{d\times r}$ are Lipschitz continuous with at most linear growth, i.e., there exists a constant $K>0$ such that
$$\|b(t,x,y) - b(t,\tilde{x},\tilde{y})\| + \|\sigma(t,x,y) - \sigma(t,\tilde{x},\tilde{y})\| \le K\,\bigl(\|x-\tilde{x}\| + \|y-\tilde{y}\|\bigr)$$
and
$$\|b(t,x,y)\|^2 + \|\sigma(t,x,y)\|^2 \le K^2\,\bigl(1 + \|x\|^2 + \|y\|^2\bigr)$$
for every $(t,x,y)\in\mathbb{R}_+\times\mathbb{R}^d\times\mathbb{R}^d$. We also assume that $X_1(\cdot)$ is adapted to the filtration $\{\mathcal{F}_1(t),\ t\ge0\}$ generated by the Brownian motions $(B_k(t),\ k\ge1,\ t\ge0)$, augmented by the $\mathbb{P}$-null sets. We shall solve this system with the distributional identity.

When $b(t,x,y) = y - x$ and $\sigma(t,x,y) \equiv 1$, it reduces to the first example
$$X_1(t) = \int_0^t e^{-(t-s)} X_2(s)\,ds + \int_0^t e^{-(t-s)}\,dB_1(s), \qquad t\ge0.$$
In the linear case we may consider the corresponding $A^{(\infty)}$ of block matrix form
$$A^{(\infty)} = \begin{pmatrix} A_{1,1} & A_{1,2} & & \\ & A_{1,1} & A_{1,2} & \\ & & \ddots & \ddots \end{pmatrix}.$$
It looks similar to the nonlinear diffusion
$$dX(t) = b\bigl(X(t), \mathbb{E}[X(t)]\bigr)\,dt + \sigma\bigl(X(t), \mathbb{E}[X(t)]\bigr)\,dB(t), \qquad t\ge0,$$
of mean-field type, which appears as the McKean-Vlasov limit of interacting particles (McKean ('67), Kac ('73), Sznitman ('89), Tanaka ('84), Shiga & Tanaka ('85), ...), but it is different.

Proposition. On some probability space $(\Omega, \mathcal{F}, \mathbb{P}, \{\mathcal{F}(t),\ t\ge0\})$ there is a unique weak solution to
$$dX(t) = b(t, X(t), X_1(t))\,dt + \sigma(t, X(t), X_1(t))\,dB(t), \qquad t\ge0,$$
with $\mathrm{Law}(X_1(t),\ 0\le t\le T) = \mathrm{Law}(X(t),\ 0\le t\le T)$, and the function
$$\phi(t) := \mathbb{E}\Bigl[\sup_{0\le s\le t}\|X(s) - X_1(s)\|^2\Bigr]$$
satisfies
$$\int_0^t \phi(s)\,ds + \int_0^t e^{-(t-s)}\int_0^s \phi(u)\,du\,ds \;\le\; c_1\,\phi(t), \qquad 0\le t\le T,\ T>0,$$
where $\beta := 4K^2(\Lambda_1 + T)$, $c := 9\max\bigl(1,\ K^2(\Lambda_1 + T)(1\vee T)\bigr)$, $c_1 := \dfrac{1 - e^{-(c-\beta)T}}{c - \beta}$, and $\Lambda_1$ is a universal constant from the Burkholder-Davis-Gundy inequality.

Idea of proof: Given $B(\cdot)$ and $X_1(\cdot)$, we may construct $X(\cdot)$ by the method of Picard iteration, i.e., there exists a map $\Phi: C([0,\infty); \mathbb{R}^d)\times C([0,\infty); \mathbb{R}^r) \to C([0,\infty); \mathbb{R}^d)$ with $X(t) = \Phi_t(X_1, B)$ for $t\ge0$. We shall find a fixed point of $\Phi(\,\cdot\,; B)$, i.e.,
$$\mathrm{Law}(X_1(\cdot)) = \mathrm{Law}(\Phi(X_1, B)) = \mathrm{Law}(X(\cdot)),$$
by evaluating the Wasserstein distance
$$W_{2,T}(\mu, \widetilde{\mu}) := \inf\ \mathbb{E}\Bigl[\sup_{0\le t\le T}\|\xi(t) - \widetilde{\xi}(t)\|^2\Bigr],$$
where $\mu = \mathrm{Law}(\xi(\cdot))$, $\widetilde{\mu} = \mathrm{Law}(\widetilde{\xi}(\cdot))$ and the infimum is taken over the joint laws of $(\xi(\cdot), \widetilde{\xi}(\cdot))$, and using the Banach fixed point theorem.
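A heuristic Monte Carlo sketch of this fixed-point-in-law idea for the linear example $b(t,x,y) = y - x$, $\sigma \equiv 1$: at each iteration the neighbour path is resampled from the previous iteration's empirical law, independently of fresh Brownian noise. The resampling scheme, sample sizes and number of iterations below are illustrative choices, not part of the proof.

```python
import numpy as np

def drift(t, x, y):
    """Example coefficients: b(t, x, y) = y - x, sigma = 1 (the linear chain example)."""
    return y - x

def fixed_point_in_law(n_paths=2000, n_steps=200, T=1.0, n_iter=8, x0=0.0, seed=0):
    """At each iteration the 'neighbour' path X_1 is drawn from the previous iteration's
    empirical law, independently of the new noise, and dX = b(t, X, X_1) dt + dB is
    advanced by Euler-Maruyama."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    neighbour = np.full((n_paths, n_steps + 1), x0, dtype=float)   # initial guess for Law(X_1)
    for _ in range(n_iter):
        Y = neighbour[rng.permutation(n_paths)]                    # resample from the current law
        X = np.full((n_paths, n_steps + 1), x0, dtype=float)
        for k in range(n_steps):
            dB = rng.normal(scale=np.sqrt(dt), size=n_paths)
            X[:, k + 1] = X[:, k] + drift(k * dt, X[:, k], Y[:, k]) * dt + dB
        neighbour = X                                              # Law(X) becomes the next input
    return neighbour

paths = fixed_point_in_law()
print(paths[:, -1].mean(), paths[:, -1].var())   # moments of X(T) at the approximate fixed point
```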

$$S := \Bigl\{\,\psi(\cdot)\ :\ \int_0^t \psi(s)\,ds + \int_0^t e^{-(t-s)}\int_0^s \psi(u)\,du\,ds \;\le\; c_1\,\psi(t),\ \ 0\le t\le T\Bigr\}.$$
Note that $c > \beta$, $(1 - e^{-(c-\beta)T})/(c-\beta) < 1$, and $\phi_1(t) := e^{ct}$ satisfies
$$\int_0^t \phi_1(s)\,ds + \int_0^t e^{-(t-s)}\int_0^s \phi_1(u)\,du\,ds \;=\; c_1\,\phi_1(t), \qquad 0\le t\le T,$$
and so $\phi_1(\cdot)\in S$ and $S$ is non-empty. If $f\in S$, then $c_0 f\in S$ for every positive constant $c_0 > 0$.

Also, if $f, g\in S$, then $a f + (1-a) g\in S$ for every $a\in[0,1]$, and hence $S$ is convex. Moreover, since $(1 - e^{-(x-\beta)T})/(x-\beta)$ is a non-decreasing function of $x$ for $x > \beta$, $\phi_2(t) := e^{c_2 t}$ with $0 < c_2 \le c$ satisfies
$$\int_0^t \phi_2(s)\,ds + \int_0^t e^{-(t-s)}\int_0^s \phi_2(u)\,du\,ds \;=\; \frac{1 - e^{-(c_2-\beta)T}}{c_2 - \beta}\,\phi_2(t) \;\le\; c_1\,\phi_2(t), \qquad 0\le t\le T,$$
and hence $\phi_2(\cdot)\in S$ for every $0 < c_2 \le c$.

We may extend this to the case of the form
$$dX(t) = b\bigl(t, X(t), \dots, X_q(t)\bigr)\,dt + \sigma\bigl(t, X(t), \dots, X_q(t)\bigr)\,dB(t)$$
with $\mathrm{Law}(X(\cdot)) = \mathrm{Law}(X_1(\cdot)) = \cdots = \mathrm{Law}(X_q(\cdot))$ for some $q\in\mathbb{N}$ and with Lipschitz coefficients, where $X_i(\cdot)$ is adapted to the filtration generated by $(B_k(\cdot),\ k\ge i)$ for $i\in\mathbb{N}$.

Coming back to the diffusions on the graph. We say the infinite-dimensional matrix $x = (x_{i,j})_{(i,j)\in\mathbb{N}^2}$ is row-finite if for each $i\in\mathbb{N}$ there is $k(i)\in\mathbb{N}$ such that $x_{i,j} = 0$ for every $j\ge k(i)$. We say the infinite-dimensional matrix $x = (x_{i,j})_{(i,j)\in\mathbb{N}^2}$ is uniformly row-finite if there is $n\in\mathbb{N}$ such that $x_{i,j} = 0$ for every $i\in\mathbb{N}$ and every $j$ with $|i - j|\ge n$. We also say the infinite-dimensional matrix $(x_{i,j})_{(i,j)\in\mathbb{N}^2}$ is (uniformly) column-finite if its transpose $(x_{j,i})_{(i,j)\in\mathbb{N}^2}$ is (uniformly) row-finite. Let us denote by $\mathcal{A}$ the class of uniformly positive definite, bounded, infinite-dimensional matrices which are both uniformly row-finite and uniformly column-finite.

Suppose that there exist $\lambda_u > \lambda_d > 0$ such that all the eigenvalues of $A^{(N)}$ are bounded above by $\lambda_u$ and below by $\lambda_d$ for every $N$, and, as $N\to\infty$, each $(i,j)$ element $a^{(N)}_{i,j}$ of $A^{(N)}$ converges almost surely to the $(i,j)$ element $a^{(\infty)}_{i,j}$ of a fixed matrix $A^{(\infty)}\in\mathcal{A}$ for every $(i,j)\in\mathbb{N}^2$, i.e.,
$$\lim_{N\to\infty} a^{(N)}_{i,j} = a^{(\infty)}_{i,j}.$$
Assume that the first $k$ elements $X^{(k,N)}(0) = (X_1(0), \dots, X_k(0))$ of the initial random variables $X^{(N)}(0)$ converge weakly to an $\mathbb{R}^k$-valued random vector $\xi^{(k)}$ for every $k\in\mathbb{N}$. We also assume that $\sup_N \mathbb{E}\bigl[\|X^{(k,N)}(0)\|^4\bigr] < \infty$ for every $k\in\mathbb{N}$.

Then for every $k\in\mathbb{N}$ and $T>0$, as $N\to\infty$, the law of the first $k$ elements $X^{(k,N)}(\cdot) = (X_1(\cdot), \dots, X_k(\cdot))$ of $X^{(N)}(\cdot) = (X_1(\cdot), \dots, X_N(\cdot))$ converges weakly in $C([0,T])$ to the law of the first $k$-dimensional stochastic process $Y^{(k)}(\cdot) := (Y_1(\cdot), \dots, Y_k(\cdot))$ of $Y(\cdot) := (Y_i(\cdot))_{i\in\mathbb{N}}$ defined by
$$Y(t) = e^{tA^{(\infty)}}\,Y(0) + \int_0^t e^{(t-s)A^{(\infty)}}\,dW(s), \qquad t\ge0,$$
where $\mathrm{Law}(Y^{(k)}(0)) = \mathrm{Law}(\xi^{(k)})$ for every $k\in\mathbb{N}$ and $W(\cdot)$ is an $\mathbb{R}^\infty$-valued standard Brownian motion.
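As an illustration of this convergence, one can compare a finite-$N$ Monte Carlo estimate of $\mathrm{Var}(X_1(T))$ for the chain example with the limiting Bessel formula. The cyclic closure of the chain and all sizes below are choices made only for this experiment.

```python
import math
import numpy as np
from scipy.special import iv

def chain_var_mc(N=200, T=3.0, n_steps=600, n_paths=2000, seed=0):
    """Monte Carlo Var(X_1(T)) for dX_i = (X_{i+1} - X_i) dt + dB_i with cyclic closure,
    an illustrative finite-N stand-in for the infinite directed chain."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    X = np.zeros((n_paths, N))
    for _ in range(n_steps):
        dB = rng.normal(scale=math.sqrt(dt), size=(n_paths, N))
        X = X + (np.roll(X, -1, axis=1) - X) * dt + dB
    return X[:, 0].var()

T = 3.0
limit_var = T * math.exp(-2*T) * (iv(0, 2*T) + iv(1, 2*T))
print(chain_var_mc(T=T), limit_var)              # the two values should be close
```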

Summary: examples of linear systems on an infinite graph; a class of stochastic differential equations with restrictions on their distributions. Thank you all for your attention, and Happy Birthday! Part of this research is supported by grants NSF DMS-13-13373 and DMS-16-15229.