ITO OLOGY

Thomas Wieting
Reed College, 2000

1 Random Processes
2 The Ito Integral
3 Ito Processes
4 Stochastic Differential Equations

1 Random Processes

1  Let $(\Omega, \mathcal{F}, P)$ be a probability space. By definition, $\Omega$ is a set, $\mathcal{F}$ is a $\sigma$-algebra of subsets of $\Omega$, and $P$ is a normalized finite nonnegative measure on $\mathcal{F}$. Let $L^2(\Omega)$ be the real Hilbert space comprised of the square integrable real-valued measurable functions defined on $\Omega$. We will employ the following notations:

(1)  $[F, G] := \int_\Omega F(\omega)\,G(\omega)\,P(d\omega)$   $(F \in L^2(\Omega),\ G \in L^2(\Omega))$

and:

(2)  $[H, 1] = \int_\Omega H(\omega)\,P(d\omega)$   $(H \in L^2(\Omega))$

For any functions $F$ and $G$ in $L^2(\Omega)$, one regards $F$ and $G$ as indistinguishable iff:

(3)  $[F - G,\ F - G] = 0$

2  Let $J$ be the interval in $\mathbf{R}$ comprised of the nonnegative real numbers. Let $\lambda$ be Lebesgue measure on the $\sigma$-algebra of Borel subsets of $J$. By a real-valued random process on $J$, one means a measurable mapping $X$ carrying the product space $J \times \Omega$ to $\mathbf{R}$. For such a mapping, we will employ the following notation:

(4)  $X(s)(\omega) = X_s(\omega) = X(s, \omega) = X_\omega(s) = X(\omega)(s)$   $(s \in J,\ \omega \in \Omega)$

We will require that:

(5)  $X_s \in L^2(\Omega)$   $(0 \le s)$

In effect, then, one may regard the random process $X$ as a mapping carrying $J$ to $L^2(\Omega)$:

$J \xrightarrow{\ X\ } L^2(\Omega)$   (jointly measurable in $s$ and $\omega$)

One says that $X$ is continuous in the mean iff, as a mapping carrying $J$ to $L^2(\Omega)$, $X$ is continuous. One says that $X$ has continuous trajectories iff, for each $\omega$ in $\Omega$, $X_\omega$ is continuous (as a real-valued function defined on $J$). In this case, one may regard $X$ as a mapping carrying $\Omega$ to $C(J)$:

$\Omega \xrightarrow{\ X\ } C(J)$   (jointly measurable in $s$ and $\omega$)

By $C(J)$, one denotes the set comprised of all continuous real-valued functions defined on $J$.
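As an illustrative aside (not part of Wieting's notes), one may represent $\Omega$ by a finite batch of Monte Carlo samples and a random process by an array indexed jointly by a time grid and a sample index; the inner products (1) and (2) then become sample averages. A minimal Python sketch, assuming NumPy, with the purely hypothetical choice $X(s, \omega) := \sin(s)\,G(\omega)$ for a standard normal $G$:

    import numpy as np

    rng = np.random.default_rng(0)

    n_omega = 100_000                       # finite stand-in for the sample space Omega
    grid = np.linspace(0.0, 5.0, 51)        # finitely many time points s in J

    G = rng.standard_normal(n_omega)        # one square-integrable function on Omega
    X = np.sin(grid)[:, None] * G[None, :]  # X[i, j] plays the role of X(s_i, omega_j)

    def inner(F, H):
        """Monte Carlo estimate of [F, H] = int F(omega) H(omega) P(d omega)."""
        return float(np.mean(F * H))

    # Requirement (5): each X_s lies in L^2(Omega); estimate [X_s, X_s] along the grid.
    second_moments = [inner(X[i], X[i]) for i in range(len(grid))]
    print(max(second_moments))              # finite, close to max_s sin(s)^2 = 1
    print(inner(X[10], np.ones(n_omega)))   # the mean [X_s, 1] at s = grid[10], close to 0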

3  Let $B$ be a real-valued random process on $J$ which defines a Brownian motion, with start state 0:

(6)  $B_0 = 0$

The various basic properties of $B$ will emerge in due course. For each $r$ in $J$, let $\mathcal{E}_r$ be the $\sigma$-subalgebra of $\mathcal{F}$ comprised of all sets of the form $B_r^{-1}(A)$, where $A$ is any Borel subset of $\mathbf{R}$. However, with regard to our subsequent description of the Ito Integral, let us take $\mathcal{E}_0$ to be the $\sigma$-subalgebra of $\mathcal{F}$ comprised of all sets $N$ in $\mathcal{F}$ for which either $P(N) = 0$ or $P(N) = 1$. In turn, let $\mathcal{F}_s$ be the $\sigma$-subalgebra of $\mathcal{F}$ generated by the union of the various $\sigma$-subalgebras $\mathcal{E}_r$, where $0 \le r \le s$:

(7)  $\mathcal{F}_s := \bigvee_{0 \le r \le s} \mathcal{E}_r$   $(0 \le s)$

Clearly:

(8)  $\mathcal{F}_s \subseteq \mathcal{F}_t$   $(0 \le s < t)$

We will assume that the $\sigma$-subalgebra of $\mathcal{F}$ generated by the union of the various $\sigma$-subalgebras $\mathcal{F}_s$ is $\mathcal{F}$ itself:

(9)  $\mathcal{F} = \bigvee_{0 \le s} \mathcal{F}_s$

We obtain a filtration of $\mathcal{F}$:

(10)  $\mathcal{F}_s \subseteq \mathcal{F}$   $(0 \le s)$

It may happen that:

(11)  $\mathcal{F}_t = \bigvee_{0 \le s < t} \mathcal{F}_s$   $(0 < t)$

or that:

(12)  $\mathcal{F}_s = \bigcap_{s < t} \mathcal{F}_t$   $(0 \le s)$

In the former case, one says that the filtration of $\mathcal{F}$ is left continuous; in the latter case, right continuous. In general, neither condition holds. However, Brownian motion has continuous trajectories, so both conditions hold.
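As a further aside (again not part of the notes), Brownian paths with start state 0 can be simulated on a grid by accumulating independent centered Gaussian increments whose variance equals the time step; the $\sigma$-algebra $\mathcal{F}_s$ then corresponds, informally, to knowledge of the simulated path up to time $s$. A sketch assuming NumPy:

    import numpy as np

    rng = np.random.default_rng(1)

    n_paths, n_steps, dt = 20_000, 500, 0.01
    t = np.arange(n_steps + 1) * dt                  # grid 0 = t_0 < t_1 < ... in J

    # Independent increments B_{t_{k+1}} - B_{t_k} ~ N(0, dt), with B_0 = 0 as in (6).
    dB = rng.standard_normal((n_paths, n_steps)) * np.sqrt(dt)
    B = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(dB, axis=1)], axis=1)

    # Sample check of the defining variance E[(B_s - B_r)^2] = s - r (relation (20) below).
    r_idx, s_idx = 100, 400
    print(np.mean((B[:, s_idx] - B[:, r_idx]) ** 2), t[s_idx] - t[r_idx])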

4  For each $s$ in $J$, let $L^2_s(\Omega)$ be the closed linear subspace of $L^2(\Omega)$ comprised of all functions in $L^2(\Omega)$ which are measurable with respect to $\mathcal{F}_s$. Clearly:

(13)  $L^2_s(\Omega) \subseteq L^2_t(\Omega)$   $(0 \le s < t)$

Relation (9) entails that the closure of (the linear span of) the union of the various closed linear subspaces $L^2_s(\Omega)$ is $L^2(\Omega)$ itself. We obtain a filtration of $L^2(\Omega)$:

(14)  $L^2_s(\Omega) \subseteq L^2(\Omega)$   $(0 \le s)$

5  For each $s$ in $J$, let $\Pi_s$ be the orthogonal projection operator carrying $L^2(\Omega)$ to $L^2_s(\Omega)$. For each function $H$ in $L^2(\Omega)$, $\Pi_s(H)$ is the conditional expectation of $H$ with respect to $\mathcal{F}_s$:

$\int_A \Pi_s(H)(\omega)\,P(d\omega) = [\Pi_s(H),\ 1_A] = [H,\ \Pi_s(1_A)] = [H,\ 1_A] = \int_A H(\omega)\,P(d\omega)$   $(A \in \mathcal{F}_s)$

since $\Pi_s$ is selfadjoint and $\Pi_s(1_A) = 1_A$ for $A$ in $\mathcal{F}_s$.

6  One refers to a random process $X$ as a martingale with respect to the given filtration of $\mathcal{F}$ iff:

(15)  $\Pi_r(X_s) = X_r$   $(0 \le r < s)$

It follows that:

(16)  $[X_s,\ 1] = [X_0,\ 1]$   $(s \in J)$

so one may say that the martingale $X$ has mean $m := [X_0, 1]$. Moreover:

(17)  $(X_s - X_r) \perp L^2_r(\Omega)$   $(0 \le r < s)$
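As a numerical aside (not from the notes), one can watch the projection $\Pi_s$ at work on the Brownian filtration: for $H = B_t^2$ with $s < t$, the standard closed form is $\Pi_s(H) = B_s^2 + (t - s)$, and the display of Article 5 asserts that $\int_A \Pi_s(H)\,dP = \int_A H\,dP$ for every $A$ in $\mathcal{F}_s$. A Monte Carlo sketch, assuming NumPy, with the hypothetical choice $A = \{B_s > 0\}$:

    import numpy as np

    rng = np.random.default_rng(2)

    n = 500_000
    s, t = 1.0, 3.0

    B_s = rng.standard_normal(n) * np.sqrt(s)              # B_s ~ N(0, s)
    B_t = B_s + rng.standard_normal(n) * np.sqrt(t - s)    # add an independent increment

    H = B_t ** 2                     # a function in L^2(Omega)
    proj_H = B_s ** 2 + (t - s)      # standard closed form for Pi_s(H) = E[H | F_s]

    A = B_s > 0.0                    # an event in F_s (it depends only on B_s)

    # Defining property of the conditional expectation (Article 5): the two agree.
    print(np.mean(np.where(A, proj_H, 0.0)))   # estimate of int_A Pi_s(H) dP
    print(np.mean(np.where(A, H, 0.0)))        # estimate of int_A H dP

    # Martingale property (15) for B itself (cf. (18) below), tested by binning on B_s.
    bins = np.digitize(B_s, np.linspace(-2.0, 2.0, 9))
    for b in (2, 5, 8):
        sel = bins == b
        print(np.mean(B_t[sel]), np.mean(B_s[sel]))        # approximately equal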

7  By definition, $B$ is a martingale (having mean 0) with respect to the given filtration of $\mathcal{F}$, so:

(18)  $\Pi_r(B_s) = B_r$   $(0 \le r < s)$

and:

(19)  $(B_s - B_r) \perp L^2_r(\Omega)$   $(0 \le r < s)$

By definition:

(20)  $[B_s - B_r,\ B_s - B_r] = (s - r)$   $(0 \le r < s)$

This relation entails that $B$ is (uniformly) continuous in the mean. Moreover:

(21)  $[H(B_s - B_r),\ H(B_s - B_r)] = [H^2 (B_s - B_r)^2,\ 1] = [H^2,\ 1]\,[(B_s - B_r)^2,\ 1] = [H,\ H]\,(s - r)$   $(0 \le r < s,\ H \in L_r(\Omega))$

because, by definition, $H^2$ and $(B_s - B_r)^2$ are independent. By $L_r(\Omega)$, one denotes the real algebra of real-valued functions $H$ defined on $\Omega$, measurable with respect to $\mathcal{F}_r$, and bounded (modulo $P$).

2 The Ito Integral

8  Let $t'$ and $t''$ be any real numbers for which $0 \le t' < t''$. Let $\Delta := [t', t'']$. Let $W_\Delta$ be the real linear space comprised of all measurable mappings $X$ carrying the product space $\Delta \times \Omega$ to $\mathbf{R}$, which meet the requirements that:

(22)  $X_s \in L^2_s(\Omega)$   $(t' \le s \le t'')$

and:

(23)  $\int_\Delta [X_s,\ X_s]\,\lambda(ds) < \infty$

Let $W_\Delta$ be supplied with the following inner product:

(24)  $[X, Y]_\Delta := \int_\Delta [X_s,\ Y_s]\,\lambda(ds)$   $(X \in W_\Delta,\ Y \in W_\Delta)$

For any mappings $X$ and $Y$ in $W_\Delta$, one regards $X$ and $Y$ as indistinguishable iff:

(25)  $[X - Y,\ X - Y]_\Delta = 0$

By applying Fubini's Theorem, one can readily show that $X$ and $Y$ are indistinguishable iff there exists a set $N$ in $\mathcal{F}_{t''}$ such that $P(N) = 0$ and such that, for each $\omega$ in $\Omega \setminus N$, there is a Borel subset $M_\omega$ of $\Delta$ such that $\lambda(M_\omega) = 0$ and, for each $t$ in $\Delta \setminus M_\omega$, $X_\omega(t) = Y_\omega(t)$.

Now one may define the real linear mapping $I_\Delta$ carrying $W_\Delta$ to $L^2_{t''}(\Omega)$ as follows. For mappings in $W_\Delta$ of the form:

(26)  $H \cdot 1_{[r,s)}$   $(t' \le r < s \le t'',\ H \in L_r(\Omega))$

one defines:

(27)  $I_\Delta(H \cdot 1_{[r,s)}) := H\,(B_s - B_r)$

One applies relation (21) to show that, on the linear span $W^0_\Delta$ of mappings of the foregoing form, $I_\Delta$ preserves inner products; and one applies elementary arguments to show that $W^0_\Delta$ is dense in $W_\Delta$. One completes the definition of $I_\Delta$ by passing to the limit in the mean, obtaining the following fundamental relation:

(28)  $[I_\Delta(X),\ I_\Delta(Y)] = [X,\ Y]_\Delta$   $(X \in W_\Delta,\ Y \in W_\Delta)$

One refers to this relation as Ito's Relation of Isometry. It entails that $I_\Delta$ is injective modulo the relation of indistinguishability on $W_\Delta$.

9  We will employ the following notation for the Ito Integral:

(29)  $I_\Delta(X)(\omega) = \int_\Delta X(s, \omega)\,B(ds, \omega)$   $(X \in W_\Delta)$

where $\Delta := [t', t'']$.
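To make (26), (27), and (28) concrete (an aside, not part of the notes), take $\Delta = [0, 1]$, a partition $0 = r_0 < r_1 < \cdots < r_m = 1$, and an elementary mapping $X = \sum_j H_j \cdot 1_{[r_j, r_{j+1})}$ with each $H_j$ bounded and measurable with respect to $\mathcal{F}_{r_j}$; by linearity its Ito integral is $\sum_j H_j (B_{r_{j+1}} - B_{r_j})$, and (28) can then be checked by simulation. A sketch assuming NumPy, with the hypothetical choice $H_j := \operatorname{sign}(B_{r_j})$:

    import numpy as np

    rng = np.random.default_rng(3)

    n_paths, m = 200_000, 50
    r = np.linspace(0.0, 1.0, m + 1)                       # partition of Delta = [0, 1]
    dr = np.diff(r)

    dB = rng.standard_normal((n_paths, m)) * np.sqrt(dr)   # Brownian increments on Delta
    B = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(dB, axis=1)], axis=1)

    # Elementary integrand: H_j = sign(B_{r_j}) is bounded and uses only information
    # available at time r_j, i.e. it is measurable with respect to F_{r_j}.
    H = np.sign(B[:, :-1])

    # (27), extended by linearity: I_Delta(X) = sum_j H_j (B_{r_{j+1}} - B_{r_j}).
    I_X = np.sum(H * dB, axis=1)

    # Ito's Relation of Isometry (28): [I_Delta(X), I_Delta(X)] = [X, X]_Delta.
    lhs = np.mean(I_X ** 2)
    rhs = np.sum(np.mean(H ** 2, axis=0) * dr)             # sum_j [H_j, H_j] (r_{j+1} - r_j)
    print(lhs, rhs)                                        # the two agree (about 0.98 here)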

10  Now let $W$ be the real linear space comprised of all real-valued random processes $X$ on $J$ which meet the requirements:

(30)  $X_s \in L^2_s(\Omega)$   $(0 \le s)$

and:

(31)  $\int_0^t [X_s,\ X_s]\,\lambda(ds) < \infty$   $(0 \le t)$

Let $W$ be supplied with the following (pseudo-)inner products:

(32)  $[X, Y]_t := \int_0^t [X_s,\ Y_s]\,\lambda(ds)$   $(0 \le t,\ X \in W,\ Y \in W)$

For any random processes $X$ and $Y$ in $W$, one regards $X$ and $Y$ as indistinguishable iff:

(33)  $[X - Y,\ X - Y]_t = 0$   $(0 \le t)$

By applying Fubini's Theorem, one can readily show that $X$ and $Y$ are indistinguishable iff there exists a set $N$ in $\mathcal{F}$ such that $P(N) = 0$ and such that, for each $\omega$ in $\Omega \setminus N$, there is a Borel subset $M_\omega$ of $J$ such that $\lambda(M_\omega) = 0$ and, for each $t$ in $J \setminus M_\omega$, $X_\omega(t) = Y_\omega(t)$.

11  Assembling the foregoing terms, we may describe the Ito Integral $I$ as the linear mapping carrying $W$ to $W$, defined and uniquely characterized by the conditions that:

(34)  $I(H \cdot 1_{[r,s)})_t := \begin{cases} 0 & \text{if } 0 \le t < r \\ H\,(B_t - B_r) & \text{if } r \le t < s \\ H\,(B_s - B_r) & \text{if } s \le t \end{cases}$   $(0 \le r < s,\ H \in L_r(\Omega))$

and:

(35)  $[I(X)_t,\ I(Y)_t] = [X,\ Y]_t$   $(0 \le t,\ X \in W,\ Y \in W)$

One presumes to define the Ito Integral $I$ as follows:

(36)  $I(X)_t := I_t(X \,|\, W_t)$   $(0 \le t,\ X \in W)$

where:

(37)  $W_t := W_{[0,t]}$ and $I_t := I_{[0,t]}$

Here $X \,|\, W_t$ denotes the restriction of $X$ to $[0, t] \times \Omega$, regarded as a member of $W_t$. Clearly, $I(X)$ is a mapping carrying $J \times \Omega$ to $\mathbf{R}$ and it meets requirements (30) and (31). However, it may not be jointly measurable in $t$ and $\omega$. Nevertheless, one can show that $I(X)$ is a martingale (the definition of which does not require that $I(X)$ be jointly measurable in $t$ and $\omega$). One may then apply Doob's Theorem to adjust $I(X)$ so that it has continuous trajectories. For each $t$ in $J$, the old $I(X)_t$ and the new $I(X)_t$ are indistinguishable in $L^2(\Omega)$.

At this point, one should recall our specification of $\mathcal{E}_0$. (See Article 3.) By this specification, it is plain that the new $I(X)_t$ must lie in $L^2_t(\Omega)$.
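As a small illustration of formula (34) (an aside, not from the notes): along a single simulated trajectory, the process $t \mapsto I(H \cdot 1_{[r,s)})_t$ vanishes before $r$, follows $H\,(B_t - B_r)$ on $[r, s)$, and stays frozen at $H\,(B_s - B_r)$ afterwards, so its trajectory is continuous. A sketch assuming NumPy, with the hypothetical choice $H := |B_r| \wedge 1$ (bounded and $\mathcal{F}_r$-measurable):

    import numpy as np

    rng = np.random.default_rng(4)

    dt, n_steps = 0.01, 300
    t = np.arange(n_steps + 1) * dt
    dB = rng.standard_normal(n_steps) * np.sqrt(dt)
    B = np.concatenate([[0.0], np.cumsum(dB)])      # one Brownian trajectory

    r, s = 1.0, 2.0
    i_r, i_s = int(round(r / dt)), int(round(s / dt))
    H = min(abs(B[i_r]), 1.0)                       # bounded, F_r-measurable

    # Formula (34), evaluated along the trajectory, case by case in t.
    I_t = np.where(t < r, 0.0,
          np.where(t < s, H * (B - B[i_r]),
                          H * (B[i_s] - B[i_r])))

    print(I_t[i_r - 1], I_t[i_r])                   # zero just before r and at r itself
    print(np.max(np.abs(np.diff(I_t))))             # increments stay small along the path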

12  One can readily show that, for any mapping $X$ carrying $J \times \Omega$ to $\mathbf{R}$, if, for each $t$ in $J$, $X_t$ is measurable in $\omega$, and if, for each $\omega$ in $\Omega$, $X_\omega$ is continuous in $t$, then $X$ is jointly measurable in $t$ and $\omega$. It follows that the new $I(X)$ is jointly measurable in $t$ and $\omega$.

13  In this context, one should note that, for any random processes $X$ and $Y$ in $W$, if $X$ and $Y$ have continuous trajectories then $X$ and $Y$ are indistinguishable iff there exists a set $N$ in $\mathcal{F}$ such that $P(N) = 0$ and such that, for each $\omega$ in $\Omega \setminus N$, $X_\omega = Y_\omega$. Hence, modulo $P$, one can specify $I(X)$ precisely as a mapping carrying $J \times \Omega$ to $\mathbf{R}$.

14  The range of $I$ proves to be the real linear subspace of $W$ comprised of all martingales which have mean 0. This result is the Martingale Representation Theorem.

15  We will employ the following notation:

(38)  $I_t(X)(\omega) = \int_{[0,t]} X(s, \omega)\,B(ds, \omega)$   $(0 \le t,\ X \in W)$

The range of $I_t$ proves to be the closed linear subspace of $L^2_t(\Omega)$ comprised of the functions $H$ for which $[H, 1] = 0$. One refers to this fact as Ito's Representation Theorem.

3 Ito Processes

16  Let $U$ and $V$ be any random processes in $W$. Let $X_0$ be any function in $L^2_0(\Omega)$. Such a function must in fact be constant modulo $P$. In terms of $U$, $V$, and $X_0$, one defines the random process $X$ in $W$ as follows:

(39)  $X(t, \omega) := X_0(\omega) + \int_0^t U(s, \omega)\,ds + \int_{[0,t]} V(s, \omega)\,B(ds, \omega)$

where $(t, \omega)$ is any ordered pair in $J \times \Omega$. In the foregoing relation, the second integral is Ito's integral $I_t$. Of course, one must verify that the first integral defines a random process in $W$. One refers to $X$ as the Ito Process defined by $U$, $V$, and $X_0$.
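As a computational aside (not part of the notes), the Ito Process (39) can be approximated on a grid by replacing the two integrals with left-endpoint sums, the second taken against the Brownian increments. A sketch assuming NumPy, with the hypothetical choices $U(s, \omega) := \cos(s)$, $V(s, \omega) := |B_s| \wedge 2$ (both adapted and square integrable), and $X_0 := 1$:

    import numpy as np

    rng = np.random.default_rng(5)

    n_paths, n_steps, dt = 100_000, 400, 0.005
    t = np.arange(n_steps + 1) * dt

    dB = rng.standard_normal((n_paths, n_steps)) * np.sqrt(dt)
    B = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(dB, axis=1)], axis=1)

    X0 = 1.0                                   # a function in L^2_0(Omega): constant mod P
    U = np.cos(t)[None, :]                     # U(s, omega) = cos s (deterministic, in W)
    V = np.minimum(np.abs(B), 2.0)             # V(s, omega) = min(|B_s|, 2), adapted, bounded

    # Relation (39) with left-endpoint sums for both integrals:
    #   X_t ~ X_0 + sum_k U(t_k) dt + sum_k V(t_k) (B_{t_{k+1}} - B_{t_k}).
    X = X0 + np.concatenate(
        [np.zeros((n_paths, 1)),
         np.cumsum(U[:, :-1] * dt + V[:, :-1] * dB, axis=1)],
        axis=1)

    # The dB-part is a mean-zero martingale (Article 14), so E[X_t] = X_0 + int_0^t cos(s) ds.
    k = n_steps                                # the terminal grid time t = 2.0
    print(np.mean(X[:, k]), X0 + np.sin(t[k])) # agree up to discretization and MC error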

17  For clarity, let us note that relation (39) (and all such relations to follow) must be interpreted modulo $\lambda \times P$. However, for each $\omega$ in $\Omega$, the first integral in relation (39) is necessarily continuous in $t$. By design of the Ito Integral, the second integral is also continuous in $t$. Hence, one may (implicitly) augment relation (39) by requiring that the random process $X$ have continuous trajectories. Therefore, modulo $P$, one can specify $X$ precisely as a mapping carrying $J \times \Omega$ to $\mathbf{R}$. (See Article 13.)

18  Now let $M$ be the family comprised of all real-valued functions $L$ defined and continuous on $J \times \mathbf{R}$, which meet the requirement that, for each $\tau$ in $J$, there is a nonnegative real number $\beta$ such that:

(40)  $|L(t, x) - L(t, y)| \le \beta\,|x - y|$   $(0 \le t \le \tau,\ x \in \mathbf{R},\ y \in \mathbf{R})$

It follows that:

(41)  $|L(t, x)| \le \gamma\,(1 + |x|)$   $(0 \le t \le \tau,\ x \in \mathbf{R})$

where:

$\gamma := \beta \vee \sup_{0 \le t \le \tau} |L(t, 0)|$

Let $L$ be any function in $M$. For any random process $X$ in $W$, one may form the mapping $\hat{X}$ carrying $J \times \Omega$ to $\mathbf{R}$ as follows:

(42)  $\hat{X}(t, \omega) := L(t, X(t, \omega))$   $((t, \omega) \in J \times \Omega)$

One can readily show that $\hat{X}$ is a random process in $W$. To this end, one needs only requirement (41).

4 Stochastic Differential Equations

19  Let $X_0$ be any function in $L^2_0(\Omega)$ and let $K$ and $L$ be real-valued functions in $M$. For each random process $X$ in $W$, we may form the random process $Y$ in $W$ as follows:

(43)  $Y(t, \omega) := X_0(\omega) + \int_0^t K(s, X(s, \omega))\,ds + \int_{[0,t]} L(s, X(s, \omega))\,B(ds, \omega)$

where $(t, \omega)$ is any ordered pair in $J \times \Omega$. In this way, we obtain a mapping $T$ carrying $W$ to itself:

$T(X) := Y$   $(X \in W)$

We plan to show that (in a certain sense) $T$ is a contraction mapping on $W$ and that, as a result, it admits a unique fixed point $Z$:

(44)  $Z(t, \omega) := X_0(\omega) + \int_0^t K(s, Z(s, \omega))\,ds + \int_{[0,t]} L(s, Z(s, \omega))\,B(ds, \omega)$

where $(t, \omega)$ is any ordered pair in $J \times \Omega$. One interprets this random process $Z$ as the solution of the stochastic differential equation:

(45)  $\dfrac{dZ}{dt}(t, \omega) = K(t, Z(t, \omega)) + L(t, Z(t, \omega))\,w(t, \omega)$   $((t, \omega) \in J \times \Omega)$

uniquely determined by the initial condition:

(46)  $Z(0, \omega) = X_0(\omega)$   $(\omega \in \Omega)$

By $w$, one denotes the fictitious random process called white noise. One imagines that:

(47)  $B(ds, \omega) = w(s, \omega)\,ds$
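Before turning to the existence proof, a standard numerical aside (not part of the notes): the integral equation (44) is commonly simulated by the Euler-Maruyama scheme, which advances $Z$ along a grid using $K$ for the $ds$-integral and $L$ against the Brownian increments. A sketch assuming NumPy, with the hypothetical Lipschitz choices $K(t, x) := -x$ and $L(t, x) := 1$ (so $\beta = 1$) and $X_0 := 0$; for this pair the exact solution is the Ornstein-Uhlenbeck process, whose variance $(1 - e^{-2t})/2$ gives a convenient check:

    import numpy as np

    rng = np.random.default_rng(6)

    def K(t, x):                  # drift coefficient, Lipschitz in x with beta = 1
        return -x

    def L(t, x):                  # noise coefficient, Lipschitz in x (here constant)
        return np.ones_like(x)

    n_paths, n_steps, dt = 50_000, 500, 0.004
    X0 = 0.0

    Z = np.full(n_paths, X0)
    for k in range(n_steps):
        t_k = k * dt
        dB = rng.standard_normal(n_paths) * np.sqrt(dt)
        # Euler-Maruyama step, mirroring the integrands of (44) on [t_k, t_{k+1}):
        Z = Z + K(t_k, Z) * dt + L(t_k, Z) * dB

    T = n_steps * dt              # T = 2.0
    print(np.var(Z), (1.0 - np.exp(-2.0 * T)) / 2.0)   # both close to 0.49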

20  Let us show that there is precisely one solution $Z$ to the integral form (44) of the stochastic differential equation (45). Let $\tau$ be any positive real number; we may and do assume that $\tau \ge 1$. Let $\beta$ be a nonnegative real number for which:

(48)  $|K(t, x) - K(t, y)| \vee |L(t, x) - L(t, y)| \le \beta\,|x - y|$

where $t$ is any real number for which $0 \le t \le \tau$ and where $x$ and $y$ are any real numbers. Let $t'$ and $t''$ be any real numbers for which $0 \le t' < t'' \le \tau$ and let $\Delta := [t', t'']$. Let $X_{t'}$ be any function in $L^2_{t'}(\Omega)$. Let $T$ be the mapping carrying $W_\Delta$ to itself, defined as follows:

$T(X)(t, \omega) := X_{t'}(\omega) + \int_{t'}^{t} K(s, X(s, \omega))\,ds + \int_{[t', t]} L(s, X(s, \omega))\,B(ds, \omega)$

where $X$ is any mapping in $W_\Delta$ and where $(t, \omega)$ is any ordered pair in $\Delta \times \Omega$. The second of the foregoing integrals is Ito's Integral $I_{[t', t]}$. We will prove that:

(50)  $[T(X') - T(X''),\ T(X') - T(X'')]_\Delta \le 2\beta^2 (1 + \tau^2)(t'' - t')\,[X' - X'',\ X' - X'']_\Delta$

where $X'$ and $X''$ are any mappings in $W_\Delta$. Hence, if $t'' - t'$ is sufficiently small then $T$ is a contraction mapping carrying $W_\Delta$ to itself.
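To see the contraction at work (a numerical aside, not part of the notes), one can discretize the mapping of relation (43) on a short interval, fix one batch of Brownian increments, and iterate: the successive iterates then settle down rapidly, which is the content of (50). A sketch assuming NumPy, with the same hypothetical coefficients $K(t, x) := -x$ and $L(t, x) := 1$ as in the previous sketch:

    import numpy as np

    rng = np.random.default_rng(7)

    def K(t, x):
        return -x

    def L(t, x):
        return np.ones_like(x)

    n_paths, n_steps, dt = 20_000, 100, 0.002      # Delta = [0, 0.2]
    t = np.arange(n_steps + 1) * dt
    X0 = 1.0

    dB = rng.standard_normal((n_paths, n_steps)) * np.sqrt(dt)   # one fixed batch of increments

    def T_map(X):
        """Discretized version of (43): X -> X0 + int K(s, X_s) ds + int L(s, X_s) B(ds)."""
        incr = K(t[:-1], X[:, :-1]) * dt + L(t[:-1], X[:, :-1]) * dB
        return X0 + np.concatenate([np.zeros((n_paths, 1)), np.cumsum(incr, axis=1)], axis=1)

    def dist2(X, Y):
        """Discretized squared distance [X - Y, X - Y] over Delta."""
        return float(np.sum(np.mean((X - Y) ** 2, axis=0) * dt))

    X = np.full((n_paths, n_steps + 1), X0)        # start the iteration at the constant process
    for n in range(6):
        Y = T_map(X)
        print(n, dist2(Y, X))                      # successive distances shrink rapidly
        X = Y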

21  Let us assume for the moment that we have proved relation (50). Let:

$0 = t_0 < t_1 < t_2 < \cdots < t_{k-1} < t_k = \tau$

be a partition of $[0, \tau]$ for which:

$2\beta^2 (1 + \tau^2)(t_{j+1} - t_j) < 1$   $(0 \le j < k)$

By repeated application of the Contraction Mapping Principle, we can design a mapping $Z^\tau$ in $W_\tau$ such that:

(51)  $Z^\tau(t, \omega) := X_0(\omega) + \int_0^t K(s, Z^\tau(s, \omega))\,ds + \int_{[0,t]} L(s, Z^\tau(s, \omega))\,B(ds, \omega)$

where $(t, \omega)$ is any ordered pair in $[0, \tau] \times \Omega$. Letting $\tau$ tend to $\infty$, we can obtain the random process $Z$ in $W$ satisfying (and uniquely determined by) relation (44).

22  Let us prove relation (50). Let $X'$ and $X''$ be any mappings in $W_\Delta$. Let us adopt the following notational compressions:

$F(s, \omega) := K(s, X'(s, \omega)) - K(s, X''(s, \omega))$
$G(s, \omega) := L(s, X'(s, \omega)) - L(s, X''(s, \omega))$   $((s, \omega) \in \Delta \times \Omega)$

We have:

$[T(X') - T(X''),\ T(X') - T(X'')]_\Delta$
$\quad = \int_\Delta \int_\Omega \Bigl( \int_{[t',t]} F(s, \omega)\,\lambda(ds) + \int_{[t',t]} G(s, \omega)\,B(ds, \omega) \Bigr)^2 P(d\omega)\,\lambda(dt)$
$\quad \le 2 \int_\Delta \Bigl\{ \int_\Omega \Bigl( \int_{[t',t]} F(s, \omega)\,\lambda(ds) \Bigr)^2 P(d\omega) + \int_\Omega \Bigl( \int_{[t',t]} G(s, \omega)\,B(ds, \omega) \Bigr)^2 P(d\omega) \Bigr\}\,\lambda(dt)$
$\quad \le 2 \int_\Delta \Bigl\{ (t - t') \int_\Delta \int_\Omega F(s, \omega)^2\,\lambda(ds)\,P(d\omega) + \int_\Delta \int_\Omega G(s, \omega)^2\,P(d\omega)\,\lambda(ds) \Bigr\}\,\lambda(dt)$
$\quad \le 2 \int_\Delta \Bigl\{ (t - t')\,\beta^2 \int_\Delta \int_\Omega |X'(s, \omega) - X''(s, \omega)|^2\,P(d\omega)\,\lambda(ds) + \beta^2 \int_\Delta \int_\Omega |X'(s, \omega) - X''(s, \omega)|^2\,P(d\omega)\,\lambda(ds) \Bigr\}\,\lambda(dt)$
$\quad \le \int_\Delta 2\beta^2 (1 + \tau^2)\,[X' - X'',\ X' - X'']_\Delta\,\lambda(dt)$
$\quad = 2\beta^2 (1 + \tau^2)(t'' - t')\,[X' - X'',\ X' - X'']_\Delta$

In the second step, we have applied the elementary inequality $(a + b)^2 \le 2a^2 + 2b^2$; in the third, the Cauchy-Schwarz Inequality (to the first integral) and Ito's Relation of Isometry (28) (to the second); in the fourth, condition (48); and in the fifth, the relations $t - t' \le \tau \le \tau^2$. This proves relation (50).

23  Let us emphasize that the random process $Z = Z'$ which appears on the left side of relation (44) and the random process $Z = Z''$ which appears (twice) on the right side of relation (44) are, though indistinguishable, not identically the same as mappings. However, with reference to Articles 11, 12, and 13, we may arrange that $Z'$ have continuous trajectories and we may infer that, modulo $P$, the random process $Z$ in $W$ which satisfies relation (44) is uniquely determined as a mapping carrying $J \times \Omega$ to $\mathbf{R}$.