Exercises in stochastic analysis

Franco Flandoli, Mario Maurelli, Dario Trevisan

The exercises with a (P) are those which have been done (totally or partially) in the previous lectures; the exercises with an (N) could be done (totally or partially) in the next classes. Of course, this is a pre-selection; one more selection and changes are possible.

1 Laws and marginals

Exercise 1 (P). Let τ be a random variable (r.v.) with values in [0, +∞[ and define the (continuous-time) waiting process X as X_t = 1_{τ<t}. Compute the m-dimensional distributions of X, for any integer m.

Exercise 2 (P). Exhibit two processes which are not equivalent but have the same 1- and 2-dimensional distributions. Exhibit a family of compatible 2-dimensional distributions which does not admit a process having those distributions as 2-dimensional marginals.

Exercise 3. Let (F, F) be a measurable space, G a subset (not necessarily measurable) of F, and define the restriction of F to G as F|_G := {A ∩ G : A ∈ F}; it is easy to see that F|_G is a σ-algebra on G. Now consider a measurable space (E, E) (morally, the state space for some continuous process) and call C = C([0, T]; E). Show that B(C) = (E^[0,T])|_C. This allows one to define the law of a continuous process on C directly from the definition of the law of a generic process.

2 Trajectories and filtrations

Exercise 4. Let X be a continuous process. Prove that its natural filtration (F_t)_{t≥0}, F_t = σ(X_s, s ≤ t), is left-continuous, that is, F_t = σ( ⋃_{s<t} F_s ).

Exercise 5. Let X, Y be two processes, a.e. continuous and modifications of each other. Show that they are indistinguishable. Let X be a process such that a.e. trajectory is continuous. Show that there exists a process U, indistinguishable from X, which is everywhere continuous, i.e. every trajectory is continuous.

Exercise 6. Let X be a progressively measurable bounded process. Show that the process Y_t = ∫_0^t X_s ds is progressively measurable (with respect to the same filtration).
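
As a sanity check for Exercise 1, one can simulate the waiting process and compare the empirical m-dimensional marginals with those predicted by the law of τ: for t_1 < … < t_m, the vector (X_{t_1}, …, X_{t_m}) takes only nondecreasing 0/1 values, with probabilities determined by P(τ < t_1), …, P(τ < t_m). A minimal sketch in Python (the exponential law for τ and the chosen times are illustrative assumptions):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 200_000
    tau = rng.exponential(scale=1.0, size=n)      # illustrative law for tau
    times = np.array([0.5, 1.0, 2.0])             # t_1 < t_2 < t_3

    # X_t = 1_{tau < t} sampled at the chosen times: shape (n, 3)
    X = (tau[:, None] < times[None, :]).astype(int)

    # Empirical 3-dimensional marginal: only the nondecreasing patterns
    # (0,0,0), (0,0,1), (0,1,1), (1,1,1) should appear.
    patterns, counts = np.unique(X, axis=0, return_counts=True)
    for p, c in zip(patterns, counts):
        print(p, c / n)

    # Probabilities predicted from the law of tau alone (same pattern order):
    F = 1 - np.exp(-times)                        # P(tau < t_i)
    print("predicted:", 1 - F[2], F[2] - F[1], F[1] - F[0], F[0])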

Exercise 7. Let X, Y be two processes, modifications of each other. Suppose that X is adapted to a filtration F = (F_t)_{t≥0} and that F is completed, that is, it contains all the P-negligible sets. Prove that Y is also adapted to F.

Exercise 8 (P). Let X be a process, adapted to a filtration F. Fix t and a sequence (t_n)_n of non-negative times converging to t. Show that the r.v.'s limsup_n X_{t_n} and liminf_n X_{t_n} are measurable with respect to F_{t+} = ⋂_{s>t} F_s; in particular, the set {lim_n X_{t_n} = X_t} is in F_{t+}. Is it true that the r.v. limsup_{r→t} X_r is F_{t+}-measurable? What if X is right-continuous (or left-continuous)?

3 Gaussian r.v.'s

Exercise 9 (P). Let (X_n)_n be a family of d-dimensional Gaussian r.v.'s, converging in L² to a r.v. X. Show that X is Gaussian and find its mean and covariance matrix. A stronger result holds: the thesis remains true if the convergence is only in law (see Baldi). [Recall that Y is Gaussian with mean m and covariance matrix A if and only if its characteristic function is Ŷ(ξ) = exp[i m·ξ - (1/2) ξ·Aξ].]

Exercise 10. Let X be an R^d-valued non-degenerate Gaussian r.v., with mean m and covariance matrix A. Find an expression for its moments.

4 Conditional expectation and conditional law

Exercise 11 (P). Prove the conditional versions of the following theorems: Fatou, Lebesgue, Jensen.

Exercise 12. Let X_1, …, X_n be a family of i.i.d. r.v.'s and call S = ∑_{k=1}^n X_k. Prove that E[X_k | S] does not depend on k, and hence equals (1/n) S.

Exercise 13. Let X be a r.v. and G, H two σ-algebras such that X is independent of H. Is it true that E[X | G ∨ H] = E[X | G]?

Exercise 14 (P). Let X, Y be two r.v.'s with values in R^d. We know, from a measurability theorem by Doob (see the notes and Baldi), that, for every Borel bounded function f on R^d, there exists a measurable function g = g_f on R^d such that E[f(X) | Y] = g(Y) a.s. Notice that the statement remains true if we change g outside a negligible set with respect to the law of Y. Prove that, for every f, g_f depends only on the law of (X, Y). Conversely, prove that the law of (X, Y) depends only on the law of Y and on such a family (g_f)_f.

Exercise 15 (P). Here is an example where the g of the previous exercise (and so the conditional expectation) can be computed. Let X, Y be two r.v.'s with values in R^d, suppose that (X, Y) has an absolutely continuous law (with respect to the Lebesgue measure), and call ρ_Y, ρ_{X,Y} the densities of the laws of Y and (X, Y) respectively (which is the relation

between them?). Prove that, for every Borel bounded function f on R^d, a version of E[f(X) | Y] = g(Y) is given by

g(y) = ∫_{R^d} f(x) h(x, y) dx,    (1)

h(x, y) = (ρ_{X,Y}(x, y) / ρ_Y(y)) 1_{ρ_Y(y)>0}.    (2)

Notice that the statement remains true if we change h outside the set {ρ_Y > 0}.

Exercise 16. With the notation of the previous exercise, compute h when (X, Y) is a non-degenerate Gaussian r.v. with mean m and covariance matrix A. Notice that, even if h is defined up to Y_#P-negligible sets (and so up to Lebesgue-negligible sets, since Y is Gaussian non-degenerate), the expression (2) gives the unique continuous representative of h (we will call it h as well). Notice also that, for fixed y, h(·, y) is a Gaussian density. For such h, we will denote the corresponding g_f as g_f(y) = E[f(X) | Y = y].

5 Markov processes

Exercise 17. Show that the Markov property, with respect to the natural filtration, depends only on the law of a process: if X is a Markov process with respect to F_t = σ(X_s, s ≤ t) and Y has the same law as X, then Y is also Markov with respect to G_t = σ(Y_s, s ≤ t). Show also that, if p is a transition function for X, then p is a transition function for Y as well.

Exercise 18. Show that, if X is a Markov process with respect to a filtration F, then it is Markov also with respect to any subfiltration G (such that X is adapted to G).

Exercise 19 (P). Let S_n be a symmetric random walk and call M_n = max_{j≤n} S_j. Show that M is not a Markov process.

Exercise 20. Find a Markov process X such that |X| is not Markov with respect to the same filtration as X (or even with respect to the natural filtration of |X|). Can you guess a sufficient condition (for example on the joint law of (X_s, X_t) for s < t) which makes |X| Markov?

Exercise 21 (P). Let X be a process with independent increments. Show that X is a Markov process.
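
The formulas (1)-(2) of Exercise 15 can be checked numerically in the Gaussian setting of Exercise 16, where the conditional law of X given Y = y is again Gaussian. The sketch below (a 2-dimensional example; all numerical values are illustrative assumptions) compares a Monte Carlo estimate of E[f(X) | Y ≈ y] with ∫ f(x) h(x, y) dx computed from the conditional Gaussian density:

    import numpy as np

    rng = np.random.default_rng(1)

    # Illustrative 2-dimensional Gaussian (X, Y): means, standard deviations, correlation
    mX, mY, sX, sY, rho = 0.0, 1.0, 1.0, 2.0, 0.6
    cov = [[sX**2, rho*sX*sY], [rho*sX*sY, sY**2]]
    X, Y = rng.multivariate_normal([mX, mY], cov, size=1_000_000).T

    f = lambda x: np.cos(x)          # any bounded Borel f
    y0 = 1.5                         # point at which we evaluate g_f(y)

    # Monte Carlo estimate of E[f(X) | Y = y0] via a small window around y0
    window = np.abs(Y - y0) < 0.02
    print("empirical:", f(X[window]).mean())

    # g_f(y0) = integral of f(x) h(x, y0) dx, where h(., y0) is the conditional
    # Gaussian density N(mX + rho (sX/sY)(y0 - mY), sX^2 (1 - rho^2))
    m_c = mX + rho * (sX / sY) * (y0 - mY)
    s_c = sX * np.sqrt(1 - rho**2)
    xs = np.linspace(m_c - 8*s_c, m_c + 8*s_c, 4001)
    h = np.exp(-(xs - m_c)**2 / (2*s_c**2)) / (s_c * np.sqrt(2*np.pi))
    print("formula  :", np.sum(f(xs) * h) * (xs[1] - xs[0]))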

Exercise 22. Let X be a process with independent increments; call µ_0 the law of X_0 and µ_{s,t} the law of X_t - X_s, for s < t. Show that these laws determine uniquely the law of X and that, for every s < r < t, it holds µ_{s,t} = µ_{s,r} ∗ µ_{r,t}, that is

∫_{R^d} ∫_{R^d} f(x + y) µ_{s,r}(dx) µ_{r,t}(dy) = ∫_{R^d} f(z) µ_{s,t}(dz).    (3)

Conversely, let µ_0, µ_{s,t}, s < t, be a family of probability measures on R^d such that, for every s < r < t, it holds µ_{s,t} = µ_{s,r} ∗ µ_{r,t}. Show that there exists a process X such that the law of X_0 is µ_0 and, for every s < t, the law of X_t - X_s is µ_{s,t}. Is there a filtration which makes X a process with independent increments?

Exercise 23. Let X be a process with independent increments, with values in R^d. Suppose that, for some s, X_s has a law which is absolutely continuous with respect to the Lebesgue measure (resp. non-atomic). Prove that, for every t ≥ s, X_t has an absolutely continuous (resp. non-atomic) law.

6 Brownian motion

Exercise 24. Show that a Brownian motion is unique in law.

Exercise 25 (P). Verify the scaling invariance of the Brownian motion: if W is a real Brownian motion with respect to a filtration F, then

1. for every r > 0, (W_{s+r} - W_r)_{s≥0} is a Brownian motion with respect to (F_{s+r})_{s≥0};
2. -W is a Brownian motion with respect to (F_s)_{s≥0};
3. for every c > 0, ((1/√c) W_{cs})_{s≥0} is a Brownian motion with respect to (F_{cs})_{s≥0};
4. the process Z, defined as Z_0 = 0, Z_s = s W_{1/s} for s > 0, is a Brownian motion with respect to its natural completed filtration.

A remark on property 3: if we put s = t/c, property 3 implies that W̃_s := (1/√c) W_t is a Brownian motion; in other words, we can say that the space scales as the square root of the time, in law. Think about the links between this and the Hölder and non-BV properties of the trajectories.

Exercise 26 (P). Let W be a real Brownian motion. Show that, for a.e. trajectory,

lim_{t→+∞} W_t / t = 0.    (4)

Hint: use Exercise 25.

Exercise 27 (P). Let W be a real Brownian motion. For T > 0 and ω in Ω, define the random set S_T(ω) = {t ∈ [0, T] : W_t(ω) > 0}. Show that the law of the r.v.

L¹(S_T) / T    (5)

does not depend on T (L¹ is the 1-dimensional Lebesgue measure). This law is in fact the arcsine law: P(L¹(S_T) ≤ αT) = (2/π) arcsin(√α).
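
A quick Monte Carlo check of the arcsine law stated in Exercise 27 (grid size and number of paths are illustrative choices; the occupation time is approximated on a discrete grid):

    import numpy as np

    rng = np.random.default_rng(2)
    T, n_steps, n_paths = 1.0, 2000, 20_000
    dt = T / n_steps

    # Brownian paths on [0, T] via cumulative sums of increments
    dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
    W = np.cumsum(dW, axis=1)

    # Fraction of time spent above 0: discrete approximation of L^1(S_T)/T
    frac = (W > 0).mean(axis=1)

    # Empirical CDF versus the arcsine law P(L^1(S_T)/T <= a) = (2/pi) arcsin(sqrt(a))
    for a in (0.1, 0.25, 0.5, 0.75, 0.9):
        print(a, (frac <= a).mean(), 2 / np.pi * np.arcsin(np.sqrt(a)))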

Exercise 28. Let W be a real Brownian motion. Prove the following facts:

- For a.e. ω in Ω, the trajectory W(ω) is not monotone on any interval [a, b] with a < b; in particular, there exists a dense set S in [0, +∞[ of local maximum points for W(ω).
- For a.e. ω in Ω, the trajectory W(ω) attains different maxima on every pair of (non-trivial) disjoint compact intervals with rational endpoints; in particular, every local maximum of W(ω) is strict.

Exercise 29. Let W be a real Brownian motion. Prove that, if A is a negligible subset of R (with respect to L¹), then, for a.e. ω, the random set C(ω) = {t ∈ [0, T] : W_t(ω) ∈ A} is negligible (again with respect to L¹).

Exercise 30. Let W be a real Brownian motion. Show that, on every interval [0, δ], δ > 0, W passes through 0 infinitely many times, with probability 1.

Exercise 31. A d-dimensional Brownian motion, with respect to a filtration F = (F_t)_{t≥0}, is a process B, adapted to F, with values in R^d, such that:

- B_0 = 0 a.s.;
- B has independent increments (with respect to the filtration F);
- for every s < t, the law of B_t - B_s is centered Gaussian with covariance matrix (t - s) I_d (I_d is the d × d identity matrix);
- B has continuous trajectories.

In case F is the natural completed filtration of B, we call B a standard Brownian motion. Show that a d-dimensional process X is a standard Brownian motion if and only if its components X^j are independent real standard Brownian motions.

Exercise 32. Let W be a real Brownian motion. Prove that, for every T > 0 and α < 1/2, there exists c > 0 such that the event {|W_t| ≤ c t^α for all t ∈ [0, T]} occurs with positive probability. Hint: prove first that, for any fixed c, there exists a (small) T > 0, depending only on c, α and the law of W, such that the above event is not negligible.

7 Stopping times

Exercise 33 (P). Let X be a continuous process, with values in a metric space E, adapted to a right-continuous completed filtration F; let G, F be an open, resp. closed, set in E. Define τ_G = inf{t ≥ 0 : X_t ∉ G}, ρ_F = inf{t ≥ 0 : X_t ∈ F}; notice that ρ_F = τ_{F^c}. Prove that τ_G, ρ_F are stopping times with respect to F. Show that the laws of τ_G and of ρ_F depend only on the law of X. Prove also that the laws of (τ_G, X) and (ρ_F, X), as r.v.'s with values in [0, +∞] × C([0, T]; E), depend only on the law of X. Convention: unless otherwise stated, if the set {t ≥ 0 : X_t ∉ G} is empty, we define τ_G = +∞; an analogous convention will be used in the sequel for similar cases.

Exercise 34. Let X be a real process, adapted to F, and let τ be a random time such that {τ < t} ∈ F_t for every t. Show that τ is a stopping time with respect to the augmented filtration F^+ = (F_{t+})_{t≥0}, where F_{t+} = ⋂_{s>t} F_s.

Exercise 35. Prove the stopping theorem for Brownian motion: if W is a real Brownian motion with respect to a filtration (F_t)_{t≥0} and τ is a finite stopping time (with respect to the same filtration), then (W_{t+τ} - W_τ)_{t≥0} is a standard Brownian motion, independent of F_τ.

Exercise 36. Let X be a continuous process with values in R², such that X_0 = 0 and the law of X is invariant under rotation (i.e. the processes (RX_t)_{t≥0} and (X_t)_{t≥0} are equivalent, for every orthogonal matrix R). Let τ = inf{t ≥ 0 : |X_t| = 1} and suppose τ < +∞ a.s. (see the convention in Exercise 33). Show that τ and X_τ are independent r.v.'s and that the law of X_τ is λ_{S¹}, the renormalized Lebesgue measure on the sphere S¹. We recall that λ_{S¹} is the unique probability measure on S¹ which is invariant under rotation and satisfies λ_{S¹}([cα, cβ]) = c λ_{S¹}([α, β]) for every c > 0 and all angles α, β. Find examples of R²-valued continuous processes X (with X_0 = 0) such that:

- τ and X_τ are independent, but the law of X_τ is not λ_{S¹};
- the law of X_τ is λ_{S¹}, but τ and X_τ are not independent;
- for every t > 0, P(X_t = 0) = 0 and the law of X_t/|X_t| is λ_{S¹}, but the law of X_τ is not λ_{S¹}.

8 Martingales: examples, stopping theorem, Doob and maximal inequalities

Exercise 37. Let M be a (continuous- or discrete-time) martingale with respect to a filtration F; let G be a subfiltration of F (i.e. G_t ⊆ F_t for every t), such that M is still adapted to G. Verify that M is a martingale with respect to G.

Exercise 38 (P). Let (ξ_j)_j be a sequence of i.i.d. r.v.'s and let f be a measurable function such that E[|f(ξ_1)|] < +∞. Prove that the following processes are martingales:

- X_n = ∑_{j=1}^n f(ξ_j) - n E[f(ξ_1)];
- Y_n = E[f(ξ_1)]^{-n} ∏_{j=1}^n f(ξ_j), if E[f(ξ_1)] ≠ 0.

Deduce that the following are martingales: ...

Exercise 39. Find an example of a martingale which is not a Markov process.

Exercise 40 (P). Let X be a process with independent increments (with respect to a filtration F). Prove that X_t - E[X_t] is a martingale (with respect to F). Let Y be a process with centered independent increments. Prove that Y_t² - E[Y_t²] is a martingale.
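
In Exercise 36, planar Brownian motion is the prototypical rotation-invariant process. A short simulation (step size and number of paths are illustrative assumptions) suggests that the exit point from the unit circle has a uniform angle and is essentially uncorrelated with the exit time, in line with the statement:

    import numpy as np

    rng = np.random.default_rng(3)
    dt, n_paths, max_steps = 1e-3, 5_000, 100_000

    pos = np.zeros((n_paths, 2))
    tau = np.full(n_paths, np.nan)
    exit_pt = np.zeros((n_paths, 2))
    alive = np.ones(n_paths, dtype=bool)

    for k in range(1, max_steps + 1):
        pos[alive] += np.sqrt(dt) * rng.normal(size=(alive.sum(), 2))
        out = alive & (np.sum(pos**2, axis=1) >= 1.0)   # crossed the unit circle
        tau[out] = k * dt
        exit_pt[out] = pos[out]
        alive &= ~out
        if not alive.any():
            break

    angle = np.arctan2(exit_pt[:, 1], exit_pt[:, 0])
    # A uniform law on (-pi, pi] has mean 0 and standard deviation pi/sqrt(3) ~ 1.814
    print("exit angle mean/std:", angle.mean(), angle.std())
    print("corr(angle, tau)   :", np.corrcoef(angle, tau)[0, 1])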

Exercise 41 (N). Let W be a Brownian motion. Prove that W_t, W_t² - t, exp[λW_t - λ²t/2], λ ∈ C, are martingales. Let X = (X_t)_{t≥0} be a continuous process, adapted to F, with X_0 = 0. Suppose that exp[λX_t - λ²t/2] is a martingale for every real λ. Prove that X is a Brownian motion with respect to F.

Exercise 42 (P). Let M be a discrete-time martingale and let τ be a finite stopping time. Prove that the process M^τ, defined by M^τ_n = M_{τ∧n}, is a martingale (with respect to the same filtration as M). Let M be a continuous-time martingale, let τ be a stopping time, and suppose that the hypotheses of the stopping theorem are satisfied. Prove that M^τ, defined as M^τ_t = M_{τ∧t}, is a martingale (again with respect to the same filtration).

Exercise 43 (N). Let W be a Brownian motion; for a, b > 0, call τ_{a,b} = inf{t ≥ 0 : W_t ∉ ]-a, b[}. Compute E[τ_{a,b}] and P(W_{τ_{a,b}} = b).

Exercise 44. Let W be a real Brownian motion; for γ > 0, a > 0, let ρ_{γ,a} = inf{t ≥ 0 : W_t = a + γt}. Prove that P(ρ_{γ,a} < +∞) = exp[-2γa]. Hint: consider the martingale M_t = exp[2γW_t - 2γ²t] and remember the behaviour of the Brownian motion at infinity.

Exercise 45 (N). Let X be a continuous sub-martingale and let h : R → R be a positive convex nondecreasing function, θ > 0. Prove that

P( sup_{[0,T]} X_t ≥ λ ) ≤ E[h(θX_T)] / h(θλ).    (6)

If X is a Brownian motion, taking h(x) = e^x and taking the infimum over θ > 0, we get the exponential bound for the Brownian motion.

9 Martingales: convergence and Doob-Meyer decomposition

Exercise 46 (N). Let M be a continuous martingale in L², with M_0 = 0. We recall that, since the process A = ⟨M⟩ is nonnegative and nondecreasing, it converges to some finite or infinite limit as t → +∞. Prove the following dichotomy:

- on {lim_{t→+∞} A_t < +∞}, M converges a.s. as t → +∞;
- on {lim_{t→+∞} A_t = +∞}, it holds for a.e. ω: for every a, the trajectory M(ω) reaches a at least once; in particular, a.e. trajectory does not converge.

Exercise 47. Let (M_t)_{t≥0} be a positive (i.e. M_t ≥ 0) continuous supermartingale and let T = inf{t ≥ 0 : M_t = 0}. Prove that, if T(ω) < ∞, then for any t ≥ T(ω), M_t(ω) = 0.
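
For Exercise 43, optional stopping applied to the martingales W_t and W_t² - t gives P(W_{τ_{a,b}} = b) = a/(a + b) and E[τ_{a,b}] = ab. A rough simulation (step size and sample size are illustrative; the discretization slightly biases the results) makes these values plausible:

    import numpy as np

    rng = np.random.default_rng(4)
    a, b = 1.0, 2.0
    dt, n_paths, max_steps = 1e-3, 20_000, 400_000

    W = np.zeros(n_paths)
    tau = np.full(n_paths, np.nan)
    hit_b = np.zeros(n_paths, dtype=bool)
    alive = np.ones(n_paths, dtype=bool)

    for k in range(1, max_steps + 1):
        W[alive] += np.sqrt(dt) * rng.normal(size=alive.sum())
        out_hi = alive & (W >= b)      # exit through b
        out_lo = alive & (W <= -a)     # exit through -a
        tau[out_hi | out_lo] = k * dt
        hit_b[out_hi] = True
        alive &= ~(out_hi | out_lo)
        if not alive.any():
            break

    print("P(W_tau = b):", hit_b.mean(), "   expected a/(a+b) =", a / (a + b))
    print("E[tau]      :", np.nanmean(tau), "   expected ab =", a * b)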

10 Stochastic integrals

Throughout this section, let (B_t)_{t≥0} be a real continuous Brownian motion, starting from 0, defined on a filtered probability space (Ω, A, P, F = (F_t)_{t≥0}), where F satisfies the usual assumptions.

Exercise 48. Prove that, for α ≤ β < ∞, the space M²[α, β] of (P ⊗ L-equivalence classes of) progressively measurable real processes (X_r)_{α≤r≤β} such that

E[ ∫_α^β |X_r|² dr ] < ∞,

with the scalar product ⟨X, Y⟩ = E[ ∫_α^β X_r Y_r dr ], is a Hilbert space. (Hint: it is a closed subspace of L²(Ω × [α, β], A ⊗ B([α, β]), P ⊗ L).)

Exercise 49. (see Baldi, 6.1) Prove that, for α ≤ β, the map M²[α, β] ∋ H ↦ ∫_α^β H_r dB_r is homogeneous with respect to the product of bounded F_α-measurable r.v.'s, i.e. ∫_α^β Y H_r dB_r = Y ∫_α^β H_r dB_r if Y ∈ L^∞(P, F_α).

Exercise 50. (see Baldi, 6.9) If X ∈ M²[α, β] is a continuous process such that sup_{α≤r≤β} |X_r| ∈ L²(P), then, for every sequence (π_n)_n of partitions α = t_{n,0} < … < t_{n,m_n} = β, with |π_n| → 0, it holds

lim_n ∑_{k=0}^{m_n - 1} X_{t_{n,k}} (B_{t_{n,k+1}} - B_{t_{n,k}}) = ∫_α^β X_r dB_r,

where the limit is taken in L²(P).

Exercise 51. Assuming that H has adapted C¹ trajectories on [α, β], prove that a.s.

∫_α^β H_s dB_s = H_β B_β - H_α B_α - ∫_α^β H'_s B_s ds.

Exercise 52. On a probability space (Ω, A, P), let X be a real random variable, independent of a σ-algebra F ⊆ A. Assuming that the law of X is N(0, σ²), and that Y is an F-measurable real r.v. with |Y| = 1 a.s., prove that XY has the same law as X and is independent of F.

Exercise 53. Assume that (H_s)_{s≥0} is adapted and |H_s| = 1 for any s. Prove that the process Z_s = ∫_0^s H_r dB_r is a Brownian motion.

Exercise 54 (Wiener integrals). Let f_1, …, f_n ∈ L²((0, T), L¹). Prove that the random variables F_i = ∫_0^T f_i(s) dB_s ∈ L²(Ω, P) (i = 1, …, n) have a joint Gaussian law; compute its mean vector and its covariance matrix. Let f be a locally bounded Borel function defined on [0, ∞). Prove that the process

Z_t = ∫_0^t f(s) dB_s

is Gaussian with independent increments.
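
To see Exercise 50 at work, take X = B on [0, T]: Itô's formula (Section 11 below) gives ∫_0^T B_r dB_r = (B_T² - T)/2, and the left-point Riemann sums should approach this value in L² as the mesh goes to 0. A rough numerical sketch (grid sizes and number of paths are illustrative):

    import numpy as np

    rng = np.random.default_rng(5)
    T, n_fine, n_paths = 1.0, 2**13, 1_000
    dt = T / n_fine

    # Fine Brownian paths; coarser partitions are taken as sub-grids of the fine one.
    dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_fine))
    B = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(dB, axis=1)], axis=1)

    target = 0.5 * (B[:, -1]**2 - T)       # = integral of B dB over [0, T] by Ito's formula

    for m in (2**4, 2**7, 2**10, 2**13):   # number of intervals of the partition
        step = n_fine // m
        Bc = B[:, ::step]                  # B evaluated at the partition points
        riemann = np.sum(Bc[:, :-1] * np.diff(Bc, axis=1), axis=1)
        err = np.sqrt(np.mean((riemann - target)**2))
        print("m =", m, "  L2 error ~", round(err, 4))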

Exercise 55 (Multiple Wiener integrals). Fix T > 0, let ∆_2 = {(s, t) ∈ R² : 0 ≤ s ≤ t ≤ T} and call rectangle any set R = ]a, α] × ]b, β] ⊆ ∆_2. Prove the following statements.

1. For any rectangle R = ]a, α] × ]b, β],

(I_R ∗ B)_t := ∫_0^t I_R(s, t) dB_s = B_α - B_a if t ∈ ]b, β], and = 0 otherwise,

defines a process I_R ∗ B ∈ M²[0, T].

2. For R, R' rectangles, ⟨I_R ∗ B, I_{R'} ∗ B⟩_{M²} = L²(R ∩ R').

3. Given a function f = ∑_{i=1}^n α_i I_{R_i}, where α_i ∈ R and the R_i are rectangles,

f ∗ B = ∑_{i=1}^n α_i (I_{R_i} ∗ B) ∈ M²[0, T]

does not depend on the representation of f.

4. The following isometry holds:

⟨f ∗ B, g ∗ B⟩_{M²} = ∫_{∆_2} f(s, t) g(s, t) ds dt.

5. The map f ↦ f ∗ B extends to an isometry of L²(∆_2, L²) into M²[0, T]; in particular, f ∗ B is well-defined for any f ∈ L²(∆_2, L²) as follows: given f_n → f in L²(∆_2, L²), put f ∗ B = lim_n (f_n ∗ B).

6. Define the multiple Wiener integral

I_2(f) = ∫_0^T dB_t ∫_0^t f(s, t) dB_s = ∫_0^T (f ∗ B)_t dB_t ∈ L²(P)

and prove that

E[ I_2(f) I_2(g) ] = ∫_{∆_2} f(s, t) g(s, t) ds dt.

Exercise 56. With the notation of the above exercise, is the process

( ∫_0^r dB_t ∫_0^t f(s, t) dB_s )_{r≤T} ⊂ L²(P)

adapted? Is it a martingale?

Exercise 57. Repeat the construction of the exercise above for any n > 2 and provide a precise definition of

I_n(f) = ∫_0^T dB_{t_n} ∫_0^{t_n} dB_{t_{n-1}} ⋯ ∫_0^{t_2} dB_{t_1} f(t_1, t_2, …, t_n) ∈ L²(P),

for any f ∈ L²(∆_n, L^n), where ∆_n = {(t_1, …, t_n) ∈ R^n : 0 ≤ t_1 ≤ … ≤ t_n ≤ T}.

Exercise 58. With the notation of the above exercises, prove the following orthogonality relations for multiple Wiener integrals: given 1 ≤ n, m, f ∈ L²(∆_n, L^n), g ∈ L²(∆_m, L^m), it holds

E[ I_n(f) I_m(g) ] = ⟨f, g⟩_{L²(∆_n)} if n = m, and = 0 otherwise.

Moreover, E[I_n(f)] = 0.

11 Itô's Formula

As in the previous section, we assume that we are given (B_t)_{t≥0}, a real continuous Brownian motion, starting from 0, defined on a filtered probability space (Ω, A, P, F = (F_t)_{t≥0}), where F satisfies the usual assumptions.

Exercise 59. Using Itô's formula, find alternative expressions for

1. ∫_0^t B_s dB_s, ∫_0^t B_s ds;
2. B_t², ∫_0^t B_s² dB_s;
3. cos(B_t), ∫_0^t sin(B_s) dB_s;
4. ∫_0^t exp{B_s} dB_s.

Exercise 60 (Stochastic exponential). (see Baldi, sec. 7.2) For H ∈ Λ²([0, ∞[), put X_t = ∫_0^t H_s dB_s and prove that

E(X)_t = exp{ X_t - (1/2) ∫_0^t H_s² ds }

is a positive local martingale.
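
Taking H ≡ 1 in Exercise 60 gives E(B)_t = exp{B_t - t/2}, the object of Exercise 61 below: its expectation stays equal to 1 while almost every trajectory tends to 0, since B_t - t/2 → -∞ a.s. A small simulation of the marginals (times and sample size are illustrative) shows the two behaviours side by side:

    import numpy as np

    rng = np.random.default_rng(6)
    n = 200_000
    for t in (0.5, 2.0, 6.0):
        B_t = rng.normal(0.0, np.sqrt(t), size=n)   # marginal of Brownian motion at time t
        E_t = np.exp(B_t - 0.5 * t)                 # E(B)_t = exp(B_t - t/2)
        print("t =", t, "  mean ~", round(E_t.mean(), 3), "  median ~", round(np.median(E_t), 3))
    # The mean stays near 1 for every t (Exercise 61), while the median exp(-t/2),
    # and in fact a.e. trajectory, goes to 0 as t grows.  (For large t the empirical
    # mean becomes noisy: the heavy lognormal tail is rarely sampled.)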

Exercise 61. Prove that, for any t > 0, E[E(B)_t] = 1, and lim_{t→∞} E(B)_t = 0 a.s.

Exercise 62. Define, for t ∈ [0, 1[,

Z_t = (1/√(1 - t)) exp{ -B_t² / (2(1 - t)) }.

Show that t ↦ Z_t is a positive martingale for t ∈ [0, 1[, with E[Z_t] = 1 and lim_{t→1} Z_t = 0.

Exercise 63 (Exponential inequality). (see Baldi, prop. 7.5) Let H ∈ Λ²([0, T]) and fix k > 0. Then

P( sup_{t≤T} |∫_0^t H_s dB_s| > c, ∫_0^T H_s² ds ≤ kT ) ≤ 2 exp{ -c²/(2kT) }.

Exercise 64. Prove that, given p ≥ 2 and X_t = ∫_0^t H_s dB_s, for H ∈ Λ²([0, T]), it holds

|X_t|^p = p ∫_0^t |X_s|^{p-1} sgn(X_s) H_s dB_s + (1/2) p(p - 1) ∫_0^t |X_s|^{p-2} H_s² ds.

Exercise 65 (A first step towards local times). Given ε > 0, set f_ε(x) = (x² + ε²)^{1/2}; use Itô's formula to show that, for H ∈ M²([0, T]) and X_t = ∫_0^t H_s dB_s, it holds

E[ (X_T² + ε²)^{1/2} ] - ε = E[ (ε²/2) ∫_0^T (X_t² + ε²)^{-3/2} H_t² dt ].

Deduce that, for some absolute constant C > 0, it holds

limsup_{ε→0} E[ (1/(2ε)) ∫_0^T I_{|X_t| ≤ ε} H_t² dt ] ≤ C E[|X_T|].

Exercise 66 (Burkholder-Davis-Gundy inequalities). (see Baldi, prop. 7.9) Fix p ≥ 2 and show that there exists some constant C_p > 0 such that, for any T > 0 and any H ∈ M²([0, T]), it holds

E[ sup_{t≤T} |∫_0^t H_s dB_s|^p ] ≤ C_p E[ ( ∫_0^T H_s² ds )^{p/2} ].

In particular, this entails that

E[ sup_{t≤T} |∫_0^t H_s dB_s|^p ] ≤ C_p T^{(p-2)/2} E[ ∫_0^T |H_s|^p ds ].

(Actually, in the first inequality above, the opposite inequality holds too, with a different constant.)

Exercise 67. Given H ∈ Λ²([0, T]), define recursively I_n ∈ Λ²([0, T]) as follows:

I_0(t) = 1,    I_n(t) = ∫_0^t I_{n-1}(s) H_s dB_s for n ≥ 1.

Prove that, for n ≥ 2, it holds

n I_n(t) = I_{n-1}(t) ∫_0^t H_s dB_s - I_{n-2}(t) ∫_0^t H_s² ds.

11.1 Multidimensional Itô's Formula

In what follows, we assume that n ≥ 1 is fixed and that we are given (B_t)_{t≥0}, an n-dimensional Brownian motion, starting from the origin, defined on (Ω, A, P, F = (F_t)_{t≥0}), where F satisfies the usual assumptions.

Exercise 68. Let f ∈ C²_b(R^n), i.e. f and D²f uniformly bounded. Prove that:

1. if ∆f = 0, then t ↦ f(B_t) is a martingale;
2. if t ↦ f(B_t) is a martingale, then ∆f = 0.

Exercise 69. Let n = 2 and define (X_t, Y_t) = (exp(B¹_t) cos(B²_t), exp(B¹_t) sin(B²_t)). Compute the stochastic differential of the process (X_t - 1)² + Y_t².

Exercise 70. Let R > 0, x_0 ∈ R^n with |x_0| < R, and let τ_R = inf{t ≥ 0 : |x_0 + B_t| = R}. Prove that E[τ_R] = (R² - |x_0|²)/n.

Exercise 71 (Exit time from a spherical shell). Fix 0 < r < R, T > 0 and x_0 ∈ R^n, with r < |x_0| < R. Let

τ_r = inf{t ≥ 0 : |x_0 + B_t| = r},    τ_R = inf{t ≥ 0 : |x_0 + B_t| = R},    τ = τ_r ∧ τ_R.

Assume n ≥ 3; write down and fully justify Itô's formula for f(x_0 + B_{τ∧t}), where f(x) = |x|^{2-n}. Taking expectations and letting t → ∞, deduce that

P(τ_r < τ_R) = (|x_0|^{2-n} - R^{2-n}) / (r^{2-n} - R^{2-n}).

Letting R → ∞, prove that P(τ_r < ∞) = (r/|x_0|)^{n-2} < 1. Deduce the following assertions.

1. Points are polar for B_t, i.e. P(∃ t > 0 : B_t = x) = 0.
2. The process leaves every compact set with strictly positive probability, but in fact it holds lim_{t→∞} |B_t(ω)| = ∞ for a.e. ω. (Hint: strong Markov property and Borel-Cantelli.)

Adapt the steps above to the case n = 2, using f(x) = log|x|, and discuss the validity of the corresponding conclusions.
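
Exercise 70 is easy to test numerically: for an n-dimensional Brownian motion started at x_0 inside the ball of radius R, the mean exit time should be (R² - |x_0|²)/n. A rough Euler sketch (dimension, radius, starting point and step size are illustrative; the time discretization slightly overestimates the exit time):

    import numpy as np

    rng = np.random.default_rng(7)
    n, R = 3, 1.0
    x0 = np.array([0.5, 0.0, 0.0])
    dt, n_paths, max_steps = 1e-3, 20_000, 200_000

    pos = np.tile(x0, (n_paths, 1))
    tau = np.full(n_paths, np.nan)
    alive = np.ones(n_paths, dtype=bool)

    for k in range(1, max_steps + 1):
        pos[alive] += np.sqrt(dt) * rng.normal(size=(alive.sum(), n))
        out = alive & (np.sum(pos**2, axis=1) >= R**2)   # exited the ball of radius R
        tau[out] = k * dt
        alive &= ~out
        if not alive.any():
            break

    print("E[tau_R] ~", np.nanmean(tau), "   theory (R^2 - |x0|^2)/n =", (R**2 - x0 @ x0) / n)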

Exercise 72. Let X¹, …, X^d be Itô processes (defined on the same space, fixed above). Using the differential notation dX^i, d[X^i, X^j], write as Itô processes:

1. ∏_{i=1}^d X^i;
2. X¹/X² (assume X² > ε for some ε > 0);
3. (X¹)^k, where k ≥ 2;
4. log(X¹) (assume X¹ > ε for some ε > 0);
5. (X¹)^{X²} (assume X¹ > ε for some ε > 0);
6. sin(X¹) cos(X²) + cos(X¹) sin(X²).

11.2 Lévy theorem

Exercise 73. Given an n-dimensional Wiener process B, let H_s ∈ Λ²_B([0, ∞[) be matrix-valued with H_s H_s* = Id for a.e. ω, for all s ∈ [0, ∞[. Prove that

t ↦ ∫_0^t H_s dB_s

is a Brownian motion.

Exercise 74. Given a real-valued Wiener process B, let H_s ∈ Λ²_B([0, ∞[) and assume that ∫_0^t H_s² ds = f(t) is a deterministic process. Prove that

t ↦ ∫_0^t H_s dB_s

is a Gaussian process.

Exercise 75 (Reflection principle). Let B be a real-valued Wiener process and let a > 0. Write T_a = inf{t ≥ 0 : B_t ≥ a}. Define

W_t = ∫_0^t ( I_{s<T_a} - I_{s≥T_a} ) dB_s

and prove that W is a Wiener process. Use this fact and the identity

{T_a < t, B_t < a} = {W_t > a}

to conclude that

P(T_a < t) = P(B_t > a) + P(W_t > a) = 2 P(B_t > a).
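
The conclusion of Exercise 75 is easy to test by simulation: the proportion of discretized paths that reach level a before time t should be close to 2 P(B_t > a) (step size and number of paths are illustrative; the discretization slightly underestimates the hitting probability, since excursions above a between grid points are missed):

    import numpy as np
    from math import erf, sqrt

    rng = np.random.default_rng(8)
    a, t, n_steps, n_paths = 1.0, 1.0, 5_000, 50_000
    dt = t / n_steps

    B = np.zeros(n_paths)
    hit = np.zeros(n_paths, dtype=bool)
    for _ in range(n_steps):
        B += np.sqrt(dt) * rng.normal(size=n_paths)
        hit |= (B >= a)                        # has the path reached level a yet?

    p_tail = 0.5 * (1 - erf(a / sqrt(2 * t)))  # P(B_t > a) for B_t ~ N(0, t)
    print("P(T_a < t) ~", hit.mean(), "   2 P(B_t > a) =", 2 * p_tail)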

12 SDEs

Exercise 76. Consider the matrix

A = (  0  1 )
    ( -1  0 )

and the two-dimensional SDE, with additive noise,

dX_t = A X_t dt + σ dB_t,

with X_0 = (p_0, q_0), σ = (0, σ_0), where dB_t is the stochastic differential of a real (1-dimensional) Brownian motion. Recall the existence and uniqueness results for such an equation and prove that

E[ |X_t|² ] = |X_0|² + σ_0² t.

Exercise 77 (Ornstein-Uhlenbeck process). Consider a real-valued Wiener process B and two real numbers θ, σ. Consider the following SDE,

dX_t = -θ X_t dt + σ dB_t,

with X_0 = x_0, and prove the following assertions.

1. It holds X_t = e^{-θt} x_0 + e^{-θt} ∫_0^t e^{θs} σ dB_s.
2. The process X_t is Gaussian.
3. If θ > 0, then lim_{t→∞} X_t exists in law but not in L²(Ω, P). Compute its mean and covariance.

Exercise 78 (Variation of constants formula). Consider a real-valued Wiener process B, two real numbers θ, σ, and a continuous function f : [0, ∞[ × R → R. Consider the following SDE,

dX_t = (-θ X_t + f(t, X_t)) dt + σ dB_t,

with X_0 = x_0. Prove the following assertions.

1. X_t admits the representation

X_t = e^{-θt} x_0 + e^{-θt} ∫_0^t e^{θs} f(s, X_s) ds + e^{-θt} ∫_0^t e^{θs} σ dB_s.

2. Assume that f(t, x) = g(t) does not depend on x and prove that X_t is Gaussian.
3. Assuming f(t, x) = g(t) and that lim_{t→∞} e^{-θt} ∫_0^t e^{θs} g(s) ds = µ, prove that lim_{t→∞} X_t exists in law. Compute its mean and covariance.
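
For Exercise 77 with θ > 0, the explicit solution in item 1 shows that X_t is Gaussian with mean e^{-θt} x_0 and variance σ²(1 - e^{-2θt})/(2θ), so the limit law is N(0, σ²/(2θ)). A short Euler-Maruyama sketch (parameter values, horizon and step size are illustrative):

    import numpy as np

    rng = np.random.default_rng(9)
    theta, sigma, x0 = 1.5, 0.8, 2.0
    T, n_steps, n_paths = 10.0, 2_000, 50_000
    dt = T / n_steps

    X = np.full(n_paths, x0)
    for _ in range(n_steps):
        # Euler-Maruyama step for dX = -theta X dt + sigma dB
        X += -theta * X * dt + sigma * np.sqrt(dt) * rng.normal(size=n_paths)

    print("empirical mean/variance at T   :", X.mean(), X.var())
    print("limit law N(0, sigma^2/(2 theta)):", 0.0, sigma**2 / (2 * theta))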

Exercise 79. Let µ, σ be real numbers, with σ ≠ 0, and let B be a real-valued Wiener process, starting from 0. Consider the following SDE:

dS_t = S_t (µ dt + σ dB_t),    S_0 = s_0 ∈ ]0, ∞[.

Prove the following assertions.

1. Strong existence and uniqueness hold for S_t, which admits the following expression:

S_t = s_0 exp{ (µ - σ²/2) t + σ B_t } > 0.

2. The law of log S_t is Gaussian, i.e. the law of S_t is log-normal. Compute its mean and variance.
3. Discuss the behaviour of S_t as t → ∞.

Exercise 80. Fix T, K > 0 and let µ, σ, B and S be as in the exercise above. Write Φ(x) = ∫_{-∞}^x exp(-t²/2) dt / √(2π) for the repartition function of the standard normal distribution.

1. Show the following Black-Scholes formula for the price at time 0 of a European call option with expiration time T and strike price K:

C_0 = e^{-µT} E[(S_T - K)^+] = s_0 Φ(c + σ√T) - K e^{-µT} Φ(c),

where

c = ( log(s_0/K) + (µ - σ²/2) T ) / (σ√T).

2. Show the Black-Scholes formula for the corresponding European put option:

P_0 = e^{-µT} E[(K - S_T)^+] = K e^{-µT} Φ(p + σ√T) - s_0 Φ(p),

where

p = ( log(K/s_0) - (µ + σ²/2) T ) / (σ√T).

3. Deduce the put-call parity for the European options: C_0 - P_0 = s_0 - K e^{-µT}.

Exercise 81. Let µ, µ', σ, σ' be real numbers, and let B be a real-valued Wiener process, starting from 0. Consider the following SDEs:

dS_t = S_t (µ dt + σ dB_t),    S_0 = s_0 ∈ ]0, ∞[,
dS'_t = S'_t (µ' dt + σ' dB_t),    S'_0 = s'_0 ∈ ]0, ∞[,

where s_0, s'_0 are real numbers, with s_0, s'_0 > 0. Prove that if, for some t_1 < t_2, it holds

S_{t_1} = S'_{t_1},    S_{t_2} = S'_{t_2},

then s_0 = s'_0, µ = µ' and σ = σ'.
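
The call price formula of Exercise 80 is straightforward to check by Monte Carlo against the closed expression (the parameter values below are illustrative assumptions):

    import numpy as np
    from math import log, sqrt, exp, erf

    def Phi(x):                      # repartition function of the standard normal
        return 0.5 * (1 + erf(x / sqrt(2)))

    s0, K, T, mu, sigma = 1.0, 1.1, 2.0, 0.05, 0.3

    # Closed formula: C_0 = s0 Phi(c + sigma sqrt(T)) - K e^{-mu T} Phi(c)
    c = (log(s0 / K) + (mu - sigma**2 / 2) * T) / (sigma * sqrt(T))
    C_formula = s0 * Phi(c + sigma * sqrt(T)) - K * exp(-mu * T) * Phi(c)

    # Monte Carlo: S_T = s0 exp((mu - sigma^2/2) T + sigma B_T), with B_T ~ N(0, T)
    rng = np.random.default_rng(10)
    B_T = rng.normal(0.0, sqrt(T), size=2_000_000)
    S_T = s0 * np.exp((mu - sigma**2 / 2) * T + sigma * B_T)
    C_mc = exp(-mu * T) * np.maximum(S_T - K, 0.0).mean()

    print("formula:", C_formula, "   Monte Carlo:", C_mc)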

13 Itô processes and stopping times

Exercise 82. On a filtered probability space (Ω, A, P, F = (F_t)_{t≥0}), with the usual assumptions, let τ be a stopping time, i.e. a [0, ∞]-valued r.v. such that, for every t ≥ 0, {τ ≤ t} ∈ F_t. Prove that there exists a decreasing sequence of stopping times τ_n, converging uniformly to τ on {τ < ∞}, such that τ_n(ω) ∈ N/2^n ∪ {∞}.

Exercise 83. Let H ∈ M²_B([0, T]) and let τ be a stopping time (with respect to the natural filtration of a Brownian motion B). Prove that s ↦ H_s I_{[0,τ[}(s) is a process in M²_B([0, T]) and that, if we write X_t = ∫_0^t H_s dB_s, then

∫_0^T H_s I_{[0,τ[}(s) dB_s = X_{T∧τ}.

(Hint: assume first that the range of τ is discrete and use the locality of the stochastic integral, then use the exercise above.)
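
One standard choice for the approximating sequence in Exercise 82 is τ_n = ⌈2^n τ⌉ / 2^n (set equal to +∞ where τ = +∞): each τ_n is a stopping time with values in N/2^n ∪ {∞}, the sequence is nonincreasing, and 0 ≤ τ_n - τ ≤ 2^{-n} on {τ < ∞}. A tiny numerical illustration (the sample of values of τ is an arbitrary assumption):

    import numpy as np

    def dyadic_approx(tau, n):
        # tau_n = ceil(2^n tau) / 2^n, and +inf where tau = +inf
        return np.where(np.isinf(tau), np.inf, np.ceil(tau * 2.0**n) / 2.0**n)

    tau = np.array([0.3, np.pi, 2.0, np.inf])   # arbitrary sample of stopping-time values
    for n in (1, 3, 6, 10):
        print(n, dyadic_approx(tau, n))
    # Each column decreases to the corresponding value of tau, with error at most 2^{-n}.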