Homework 2

2.7. (a) The relation $T_n - T_{n-1} = T(B_n)$ shows that $\{T_n - T_{n-1}\}$ is an identically distributed sequence with the same law as $T$. Notice that for any $n \ge 1$, by Theorem 2.16 the Brownian motion $B_n(t)$ is independent of $\mathcal{F}^+(T_{n-1})$. Consequently, $T_n - T_{n-1} = T(B_n)$ is independent of $T_1, \dots, T_{n-1}$. Hence, $\{T_n - T_{n-1}\}$ is an i.i.d. sequence. By the law of large numbers,
\[ \lim_{n\to\infty} \frac{T_n}{n} = ET \quad \text{a.s.} \]
In particular $T_n \to \infty$ a.s., so by the fact that $\lim_{t\to\infty} B(t)/t = 0$ a.s.,
\[ \lim_{n\to\infty} \frac{B(T_n)}{T_n} = 0 \quad \text{a.s.} \qquad\text{and}\qquad \lim_{n\to\infty} \frac{B(T_n)}{n} = \lim_{n\to\infty} \frac{B(T_n)}{T_n}\cdot\frac{T_n}{n} = 0 \quad \text{a.s.} \]

(b) By independence and stationarity of the Brownian increments, $\{B(T_n) - B(T_{n-1})\}$ is an i.i.d. sequence with the same distribution as $B(T)$. By the conclusion of part (a),
\[ \lim_{n\to\infty} \frac{B(T_n) - B(T_{n-1})}{n} = 0 \quad \text{a.s.} \]
Hence the independent events $\{|B(T_n) - B(T_{n-1})| \ge n\}$ occur only finitely often a.s., so by the (second) Borel–Cantelli lemma,
\[ \sum_{n=1}^\infty P\big\{|B(T_n) - B(T_{n-1})| \ge n\big\} < \infty, \]
or
\[ \sum_{n=1}^\infty P\big\{|B(T)| \ge n\big\} < \infty. \]
This leads to $E|B(T)| < \infty$. By the representation
\[ B(T_n) = \sum_{k=1}^n \big(B(T_k) - B(T_{k-1})\big) \]
(under the convention $B(T_0) = 0$), the law of large numbers (applicable since $E|B(T)| < \infty$) gives
\[ \lim_{n\to\infty} \frac{B(T_n)}{n} = EB(T) \quad \text{a.s.} \]
Comparing this to part (a), $EB(T) = 0$.

2.10. Here $X(t) = e^{-t}B(e^{2t})$ is the Ornstein–Uhlenbeck process of the exercise. Define the filtration $\{\mathcal{A}(t);\ t \ge 0\}$ by $\mathcal{A}(t) = \mathcal{F}^+(e^{2t})$. Notice that for any $s, t > 0$,
\[ X(s+t) = e^{-(s+t)}\big(B(e^{2(s+t)}) - B(e^{2s})\big) + e^{-t}X(s). \]

Conditioning on $\mathcal{A}(s)$, $X(s)$ is a constant and $B(e^{2(s+t)}) - B(e^{2s})$ is a random variable with the same distribution as $B(e^{2(s+t)} - e^{2s})$, with $B$ starting at $0$. Our job reduces to finding the distribution of
\[ e^{-(s+t)}B(e^{2(s+t)} - e^{2s}) + e^{-t}x \stackrel{d}{=} e^{-(s+t)}\sqrt{e^{2(s+t)} - e^{2s}}\, U + e^{-t}x, \]
where $U \sim N(0,1)$. It is easy (by checking the expectation and the variance) to see that this distribution is $N(e^{-t}x,\ 1 - e^{-2t})$. Therefore, the transition density is
\[ p(t,x,y) = \frac{1}{\sqrt{2\pi(1 - e^{-2t})}} \exp\Big\{-\frac{(y - e^{-t}x)^2}{2(1 - e^{-2t})}\Big\}, \qquad x, y \in \mathbb{R}. \]
To prove the second part, consider the time-inverted Brownian motion
\[ \widetilde{B}(t) = \begin{cases} 0, & t = 0, \\ tB(1/t), & t > 0. \end{cases} \]
We have
\[ X(-t) = e^{t}B(e^{-2t}) = e^{-t}\widetilde{B}(e^{2t}), \qquad t \ge 0. \]
By what has been proved (with $B(t)$ replaced by $\widetilde{B}(t)$), $\{X(-t);\ t \ge 0\}$ has the same distribution as $\{X(t);\ t \ge 0\}$.

2.15. (a) Write
\[ X(t) = \exp\Big\{\sigma B(t) - \frac{\sigma^2 t}{2}\Big\}, \qquad t \ge 0. \]
Clearly, $X(t)$ is adapted to the filtration $\mathcal{F}^+(t)$. Notice that $X(t) \ge 0$ and
\[ EX(t) = \exp\Big\{-\frac{\sigma^2 t}{2}\Big\}\, E\exp\{\sigma B(t)\} = 1 < \infty. \]
For any $0 \le s < t$,
\[
\begin{aligned}
E\big[X(t) \mid \mathcal{F}^+(s)\big]
&= \exp\Big\{-\frac{\sigma^2 t}{2}\Big\}\, E\big[\exp\{\sigma B(t)\} \mid \mathcal{F}^+(s)\big] \\
&= \exp\Big\{-\frac{\sigma^2 t}{2}\Big\} \exp\{\sigma B(s)\}\, E\big[\exp\{\sigma(B(t) - B(s))\} \mid \mathcal{F}^+(s)\big] \\
&= \exp\Big\{-\frac{\sigma^2 t}{2}\Big\} \exp\{\sigma B(s)\} \exp\Big\{\frac{\sigma^2(t-s)}{2}\Big\} = X(s).
\end{aligned}
\]
So $X(t)$ is a martingale.
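As a quick numerical sanity check of part (a) (an illustration, not part of the solution; the parameter values `sigma`, `t`, `n_paths` below are arbitrary choices), one can verify by Monte Carlo that $EX(t) = 1$:

```python
import numpy as np

# Sanity check: for the exponential martingale X(t) = exp(sigma*B(t) - sigma^2*t/2)
# of part (a), E[X(t)] = 1 for every fixed t.
rng = np.random.default_rng(0)
sigma, t, n_paths = 0.8, 2.0, 200_000

# B(t) ~ N(0, t), so X(t) can be sampled directly without simulating paths.
b_t = rng.normal(0.0, np.sqrt(t), size=n_paths)
x_t = np.exp(sigma * b_t - 0.5 * sigma**2 * t)

mean_x = x_t.mean()  # should be close to 1
```

Sampling $B(t)$ directly from $N(0,t)$ avoids any path discretization, so the only error here is Monte Carlo noise.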

(b) Due to similarity, we only show that $B(t)^2 - t$ is a martingale. For $s < t$, by part (a),
\[ E\big[\exp\{\sigma B(t) - \sigma^2 t/2\} \mid \mathcal{F}^+(s)\big] = \exp\{\sigma B(s) - \sigma^2 s/2\}. \]
Taking second derivatives with respect to $\sigma$ on both sides we have
\[ E\Big[\big((B(t) - \sigma t)^2 - t\big)\exp\{\sigma B(t) - \sigma^2 t/2\} \,\Big|\, \mathcal{F}^+(s)\Big] = \big((B(s) - \sigma s)^2 - s\big)\exp\{\sigma B(s) - \sigma^2 s/2\}. \]
To justify this, a usual dominated convergence argument is needed (as the differentiation is defined by a limit). Taking $\sigma = 0$ on both sides,
\[ E\big[B(t)^2 - t \mid \mathcal{F}^+(s)\big] = B(s)^2 - s. \]
This, together with the obvious integrability, shows that $B(t)^2 - t$ is a martingale.

(c) We consider the martingale $X(t) = B(t)^4 - 6tB(t)^2 + 3t^2$. Given $N > 0$, write $T_N = T \wedge N$. By the fact that for any $t > 0$, $|B(t \wedge T_N)| \le \max\{|a|, b\}$, we have that
\[ |X(t \wedge T_N)| \le \max\{a^4, b^4\} + 6N\max\{a^2, b^2\} + 3N^2. \]
By Proposition 2.4 (given the condition $a < 0 < b$, I believe the authors consider the case $B(0) = 0$), $EX(T_N) = E_0 X(0) = 0$, or
\[ EB(T_N)^4 - 6E\big[T_N B(T_N)^2\big] + 3ET_N^2 = 0. \]
We now let $N \to \infty$. By the fact that $|B(T_N)| \le \max\{|a|, b\}$ and dominated convergence,
\[ \lim_{N\to\infty} EB(T_N)^4 = EB(T)^4 = a^4 P\{B(T) = a\} + b^4 P\{B(T) = b\} = a^4\frac{b}{|a| + b} + b^4\frac{|a|}{|a| + b} = \frac{|a|b(|a|^3 + b^3)}{|a| + b} = |a|b(a^2 - |a|b + b^2), \]
where the third step follows from the calculation in the proof of Theorem 2.49. By the bound $0 \le T_N B(T_N)^2 \le T\max\{a^2, b^2\}$, the fact that $ET < \infty$ (why?) and dominated convergence,
\[ \lim_{N\to\infty} E\big[T_N B(T_N)^2\big] = E\big[TB(T)^2\big] = a^2 E\big[T\mathbf{1}_{\{B(T)=a\}}\big] + b^2 E\big[T\mathbf{1}_{\{B(T)=b\}}\big] = a^2 ET + (b^2 - a^2)E\big[T\mathbf{1}_{\{B(T)=b\}}\big] = |a|^3 b + (b^2 - a^2)E\big[T\mathbf{1}_{\{B(T)=b\}}\big], \]

where the last step follows from Theorem 2.49. By monotone convergence,
\[ \lim_{N\to\infty} ET_N^2 = ET^2. \]
Therefore, we conclude that
\[ |a|b(a^2 - |a|b + b^2) - 6\Big(|a|^3 b + (b^2 - a^2)E\big[T\mathbf{1}_{\{B(T)=b\}}\big]\Big) + 3ET^2 = 0. \tag{$*$} \]
With the same argument applied to the martingale $B(t)^3 - 3tB(t)$, we have
\[ EB(T)^3 - 3E\big[TB(T)\big] = 0. \]
Notice that
\[ EB(T)^3 = a^3 P\{B(T) = a\} + b^3 P\{B(T) = b\} = \frac{a^3 b + b^3 |a|}{|a| + b} = |a|b(b - |a|), \]
\[ E\big[TB(T)\big] = aE\big[T\mathbf{1}_{\{B(T)=a\}}\big] + bE\big[T\mathbf{1}_{\{B(T)=b\}}\big] = aET + (b - a)E\big[T\mathbf{1}_{\{B(T)=b\}}\big] = -a^2 b + (b + |a|)E\big[T\mathbf{1}_{\{B(T)=b\}}\big]. \]
Combining our computations,
\[ E\big[T\mathbf{1}_{\{B(T)=b\}}\big] = \frac{|a|b(b + 2|a|)}{3(b + |a|)}. \]
Bringing this back to ($*$),
\[ ET^2 = \frac{|a|b^3 + |a|^3 b + 3a^2 b^2}{3}. \]

2.16. Define the stopping time
\[ T = \inf\{s > 0;\ B(s) = a + bs\} \qquad (a, b > 0). \]
When Brownian motion starts at $0$, there is positive chance that $T = \infty$. By continuity of the Brownian curve,
\[ P_0\{B(t) = a + bt \text{ for some } t > 0\} = P_0\{T < \infty\}. \]
Let $N > 0$, write $T_N = T \wedge N$, and consider the process
\[ X(t) = \exp\{2bB(t) - 2b^2 t\}, \qquad t \ge 0. \]
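The exit-time moments derived above, $ET = |a|b$ and $ET^2 = (|a|b^3 + |a|^3 b + 3a^2b^2)/3$, can be checked by simulation. The following sketch (an illustration, not part of the solution; the choices $a = -1$, $b = 1$, the step size and path count are arbitrary) approximates Brownian motion by a scaled simple random walk:

```python
import numpy as np

# Monte Carlo check of the exit-time moments for T = exit time of (a, b),
# a < 0 < b, started at 0:  E[T] = |a|*b,  E[T^2] = (|a|b^3 + |a|^3 b + 3a^2 b^2)/3.
# Brownian motion is approximated by a random walk with steps +-h, with h
# chosen so the walk hits a and b exactly (no overshoot); dt = h^2 keeps the
# diffusive scaling.
rng = np.random.default_rng(1)
a, b = -1.0, 1.0
h = 1.0 / 20.0
dt = h * h
n_paths, max_steps = 5_000, 100_000

pos = np.zeros(n_paths)
exit_time = np.zeros(n_paths)
active = np.ones(n_paths, dtype=bool)
for step in range(1, max_steps + 1):
    n_active = int(active.sum())
    if n_active == 0:
        break
    pos[active] += h * rng.choice([-1.0, 1.0], size=n_active)
    done = active & ((pos <= a) | (pos >= b))
    exit_time[done] = step * dt
    active &= ~done

mean_T = exit_time.mean()        # theory: |a|*b = 1
mean_T2 = (exit_time**2).mean()  # theory: 5/3 when a = -1, b = 1
```

For the symmetric case $a = -1$, $b = 1$ the formulas give $ET = 1$ and $ET^2 = 5/3$, up to Monte Carlo noise and a small discretization bias.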

Taking $\sigma = 2b$ in Problem 2.15, $X(t)$ is a martingale. For any $t \ge 0$, $B(t \wedge T_N) \le a + b(t \wedge T_N)$. Consequently,
\[ 0 \le X(t \wedge T_N) \le \exp\big\{2b\big(a + b(t \wedge T_N)\big) - 2b^2(t \wedge T_N)\big\} = \exp\{2ab\}. \]
By Proposition 2.4, therefore,
\[ E_0 \exp\{2bB(T_N) - 2b^2 T_N\} = E_0 X(T_N) = E_0 X(0) = 1. \tag{$**$} \]
Write
\[ E_0 \exp\{2bB(T_N) - 2b^2 T_N\} = E_0\big[\exp\{2bB(T_N) - 2b^2 T_N\}\mathbf{1}_{\{T<\infty\}}\big] + E_0\big[\exp\{2bB(T_N) - 2b^2 T_N\}\mathbf{1}_{\{T=\infty\}}\big]. \]
On the event $\{T < \infty\}$,
\[ \lim_{N\to\infty}\big(2bB(T_N) - 2b^2 T_N\big) = 2bB(T) - 2b^2 T = 2ab \quad \text{a.s.} \]
By the bound $\exp\{2bB(T_N) - 2b^2 T_N\} \le \exp\{2ab\}$ and dominated convergence,
\[ \lim_{N\to\infty} E_0\big[\exp\{2bB(T_N) - 2b^2 T_N\}\mathbf{1}_{\{T<\infty\}}\big] = \exp\{2ab\}\, P_0\{T < \infty\}. \]
In addition,
\[ E_0\big[\exp\{2bB(T_N) - 2b^2 T_N\}\mathbf{1}_{\{T=\infty\}}\big] = E_0\big[\exp\{2bB(N) - 2b^2 N\}\mathbf{1}_{\{T=\infty\}}\big]. \]
According to the law of large numbers,
\[ \lim_{N\to\infty} \frac{B(N)}{N} = 0 \quad \text{a.s.} \]
Consequently,
\[ \lim_{N\to\infty} \exp\{2bB(N) - 2b^2 N\} = 0 \quad \text{a.s.} \]
On the event $\{T = \infty\}$ we have the bound
\[ \exp\{2bB(N) - 2b^2 N\} \le \exp\{2ab\}. \]
Hence, by dominated convergence,
\[ \lim_{N\to\infty} E_0\big[\exp\{2bB(N) - 2b^2 N\}\mathbf{1}_{\{T=\infty\}}\big] = 0. \tag{$***$} \]

Summarizing our argument from ($**$), we have
\[ \exp\{2ab\}\, P_0\{T < \infty\} = 1, \qquad\text{i.e.}\qquad P_0\{T < \infty\} = \exp\{-2ab\}. \]
Warning: the argument for ($***$) collapses without the indicator of $\{T = \infty\}$, as
\[ E_0 \exp\{2bB(N) - 2b^2 N\} = 1 \qquad \text{for all } N > 0. \]

2.19. (a) Under the interpretation of this exercise, $B(t) = B_1(t) + iB_2(t)$, where $B_1(t)$ and $B_2(t)$ are two independent linear Brownian motions with $B_1(0) = 0$ and $B_2(0) = 1$. Notice that the functions $f(x,y) = e^{-\lambda y}\cos\lambda x$ and $g(x,y) = e^{-\lambda y}\sin\lambda x$ satisfy $\Delta f(x,y) = 0$ and $\Delta g(x,y) = 0$. By Corollary 2.53 the processes $e^{-\lambda B_2(t)}\cos\lambda B_1(t)$ and $e^{-\lambda B_2(t)}\sin\lambda B_1(t)$ are martingales. Hence, the requested conclusion follows from the relation
\[ e^{i\lambda B(t)} = e^{-\lambda B_2(t)}\cos\lambda B_1(t) + ie^{-\lambda B_2(t)}\sin\lambda B_1(t). \]

(b) Extra assumption: we have to assume that $\lambda \ge 0$. Indeed, $B(T) = B_1(T)$ is a real random variable, and what we try to prove is that the characteristic function of $B_1(T)$ equals $e^{-\lambda}$, which would be greater than $1$ when $\lambda < 0$.

By definition $T = \inf\{s > 0;\ B_2(s) = 0\}$. If we are allowed to use the optional stopping theorem,
\[ Ee^{i\lambda B(T)} = Ee^{i\lambda B(0)} = e^{-\lambda}, \]
where the last step follows from the fact that $B(0) = i$. We now justify the use of Proposition 2.4 (optional stopping rule). First notice that for any $t > 0$,
\[ \big|e^{i\lambda B(T \wedge t)}\big| = e^{-\lambda B_2(T \wedge t)} \le 1, \]
where the last step follows from the assumption that $\lambda \ge 0$ and the fact that $B_2(T \wedge t) \ge 0$. Hence, Proposition 2.4 applies.

Remark. First, $B(T) = B_1(T)$. Second, $B_1(0) = 0$ implies that $B_1(t)$ is symmetric. Third, by definition $T$ is independent of $B_1(t)$. Therefore, $B_1(T)$ is symmetric, so the characteristic function of $B_1(T)$ is a real and even function. Thus, for any $\lambda \in \mathbb{R}$,
\[ Ee^{i\lambda B(T)} = Ee^{i\lambda B_1(T)} = e^{-|\lambda|}. \]
This result shows that the real random variable $B_1(T)$ obeys the Cauchy distribution, as pointed out in Theorem 2.37.
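The remark can be illustrated numerically (a sketch, not part of the proof). The hitting time $T$ of the real axis by $B_2$ started at $1$ satisfies $T \stackrel{d}{=} 1/Z^2$ for $Z \sim N(0,1)$ (reflection principle), and $T$ is independent of $B_1$, so one can sample $B_1(T) \stackrel{d}{=} \sqrt{T}\,G = G/|Z|$ directly and compare with the standard Cauchy CDF:

```python
import numpy as np

# Illustration: B_1(T) is standard Cauchy.  T = inf{s > 0: B_2(s) = 0} with
# B_2(0) = 1 has the law of 1/Z^2 for Z ~ N(0,1), and T is independent of
# B_1, so B_1(T) can be sampled as sqrt(T)*G = G/|Z| with G, Z independent
# standard normals (a ratio of normals, i.e. standard Cauchy).
rng = np.random.default_rng(2)
n = 500_000
g = rng.standard_normal(n)
z = rng.standard_normal(n)
samples = g / np.abs(z)  # should follow the standard Cauchy law

# Standard Cauchy CDF: F(x) = 1/2 + arctan(x)/pi; compare at a few points.
def cauchy_cdf(x):
    return 0.5 + np.arctan(x) / np.pi

max_err = max(abs((samples <= x).mean() - cauchy_cdf(x))
              for x in (-3.0, -1.0, 0.0, 0.5, 2.0))
```

This avoids simulating paths of the planar Brownian motion altogether; only the two normal samples per realization are needed.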