Exercises

Set #1

1. Construct an example of a sequence of probability measures $P_n$ on $\mathbb{R}$ which converges weakly to a probability measure $P$ but such that the first moments $m_n = \int x\,dP_n$ do not converge to $m = \int x\,dP$. Show that if $\sup_n \int |x|^k\,dP_n < +\infty$ then $\int x^j\,dP_n \to \int x^j\,dP$ for $j < k$.

2. Prove that a sequence of random variables $S_n$ which is Cauchy in probability converges in probability to some random variable $S$.

3. Let $X_1, X_2, \ldots$ be i.i.d. random variables. Prove that $E|X_1| < +\infty$ if and only if $P(|X_n| > n \text{ infinitely often}) = 0$.

4. Let $X_1, X_2, \ldots$ be i.i.d. random variables with $P(X_1 = 1) = p$, $P(X_1 = -1) = 1 - p$, and set $S_n = X_1 + \cdots + X_n$. Show that if $p \neq 1/2$ then $P(S_n = 0 \text{ infinitely often}) = 0$, while if $p = 1/2$ then $P(S_n = 0 \text{ infinitely often}) = 1$.

5. Prove that Lyapunov's condition implies the Lindeberg condition.

6. Let a probability measure $\mu$ satisfy $\mu\{a\} = \mu\{b\} = 0$. Show that then
$$\mu(a, b] = \lim_{T \to \infty} \frac{1}{2\pi} \int_{-T}^{T} \frac{e^{-ita} - e^{-itb}}{it}\,\varphi(t)\,dt,$$
where $\varphi(t)$ is the characteristic function of $\mu$. This shows, in particular, that $\varphi$ determines $\mu$.

7. Use a similar argument to show that
$$\mu\{a\} = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} e^{-ita}\,\varphi(t)\,dt.$$

8. Let $x_1, x_2, \ldots$ be the atoms of a measure $\mu$ with characteristic function $\varphi(t)$. Let also $X$ and $Y$ be two independent random variables which have characteristic function $\varphi$.
(i) Show that
$$P(X - Y = 0) = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} |\varphi(t)|^2\,dt.$$
(ii) Show that
$$P(X - Y = 0) = \int P(X = y)\,\mu(dy) = \sum_k (\mu\{x_k\})^2.$$
Conclude that
$$\lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} |\varphi(t)|^2\,dt = \sum_k (\mu\{x_k\})^2.$$
(iii) Show that $\mu$ has no point masses if $\varphi(t)$ is in $L^2$.
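The limit in problem 8 can be checked numerically for a concrete atomic measure. The sketch below is only an illustration, not part of the exercises; it assumes NumPy, and the Bernoulli test measure, the grid, and the horizon $T$ are arbitrary choices.

\begin{verbatim}
import numpy as np

# Check problem 8 for a Bernoulli(p) measure: atoms at 0 and 1 with
# masses 1-p and p, so sum_k (mu{x_k})^2 = (1-p)^2 + p^2, while
# phi(t) = (1-p) + p e^{it}.
p, T = 0.3, 2000.0
t = np.linspace(-T, T, 400_001)
phi = (1 - p) + p * np.exp(1j * t)
avg = np.mean(np.abs(phi) ** 2)    # = (1/2T) int |phi|^2 dt on a uniform grid
print(avg, (1 - p) ** 2 + p ** 2)  # both approximately 0.58
\end{verbatim}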
9. Suppose that $X$ is irrational with probability one. Let $\mu_n$ be the distribution of the fractional part $\{nX\}$. Show that $\frac{1}{n}\sum_{k=1}^{n} \mu_k$ converges weakly to the uniform distribution on $[0, 1]$.

10. Suppose that $|X_{nk}| \le M_n$ for all $k \le r_n$ in the Lévy-Lindeberg theorem. Assume that $M_n / s_n \to 0$. Verify that Lyapunov's condition holds.

11. Suppose the independent $X_n$ have density $|x|^{-3}$ outside $[-1, 1]$ (and vanishing inside). Show that $S_n / \sqrt{n \log n}$ converges in distribution to a normal random variable.

12. Let $\Omega = [0, 1]$ and let $d_n(\omega)$ be the $n$-th digit in the binary expansion of $\omega \in [0, 1]$. Let $l_n(\omega)$ be the length of the run of zeros starting at $d_n$; that is, $l_n = 0$ if $d_n(\omega) = 1$, and $l_n(\omega) = k$ if $d_n(\omega) = \cdots = d_{n+k-1}(\omega) = 0$ while $d_{n+k}(\omega) = 1$. Think of $\omega$ as the result of an infinite sequence of independent coin tosses with $p = 1/2$ for $d_n = 0$; this turns $\Omega$ into a probability space. Show that $l_n(\omega)$ is an $\alpha$-mixing sequence with $\alpha_n = 4/2^n$.

Set #2

13. Use the martingale convergence theorem to show that for independent random variables $X_j$ with $EX_k = 0$ for all $k \in \mathbb{N}$, if $\sum_{k=1}^{\infty} E(X_k^2) < +\infty$ then the series $\sum_{k=1}^{\infty} X_k$ converges almost surely.

14. Consider a process $X_1, X_2, \ldots$ taking values in $[0, +\infty)$. Assume that $x = 0$ is an absorbing state in the sense that if $X_n = 0$ then $X_{n+m} = 0$ for all $m$. Let $D$ be the event that the process is eventually absorbed at zero, that is, $D = [\text{there exists } n \text{ such that } X_n = 0]$. Assume that for every $x$ there exists $\delta > 0$ so that $P(D \mid X_1, X_2, \ldots, X_n) \ge \delta$ whenever $X_n \le x$, $n = 1, 2, \ldots$. Prove that almost surely either $X_n$ is eventually absorbed or $X_n \to +\infty$.

15. Let $Y_j$ be independent random variables with $P(Y_j = \pm 1) = 1/2$ and set $S_n = Y_1 + \cdots + Y_n$. Define $N$ as the time until the first positive sum: $N = \min\{n : S_n > 0\}$. Show that $E(N) = +\infty$. Generalize to the i.i.d. case with $EY_j = 0$. (A simulation illustrating the heavy tail of $N$ follows problem 16.)

16. Let $\mu$ be a finite Borel measure on $[0, 1]$ which is preserved by the map $Tx = 2x \bmod 1$. Show that $\mu$ is singular with respect to the Lebesgue measure.
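The divergence $E(N) = +\infty$ in problem 15 is visible in simulation: $P(N > n)$ decays like $n^{-1/2}$, so the sample mean of a truncated $N$ keeps growing with the truncation level while the median stays small. The sketch below is only an illustration; NumPy is assumed, and the helper first_passage, the cutoff, and the sample size are arbitrary choices.

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)

def first_passage(cutoff=100_000):
    # First n with S_n > 0 for a +-1 random walk, truncated at `cutoff`.
    s = np.cumsum(rng.choice((-1, 1), size=cutoff))
    hits = np.flatnonzero(s > 0)
    return hits[0] + 1 if hits.size else cutoff

samples = np.array([first_passage() for _ in range(1000)])
# Heavy tail: the mean is dominated by a few very long excursions and
# grows as the cutoff is raised, consistent with E(N) = +infinity.
print(samples.mean(), np.median(samples))
\end{verbatim}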
17. Show that if $B_t$ is the standard Brownian motion then
$$E[B_t^{2k}] = \frac{(2k)!}{2^k\,k!}\,t^k, \quad \text{for all } k \in \mathbb{N}.$$

18. Let $B(t) = (B_1(t), \ldots, B_n(t))$ be an $n$-dimensional Brownian motion (the $B_j(t)$ are independent standard one-dimensional Brownian motions) and let $K \subset \mathbb{R}^n$ have Lebesgue measure zero. Show that the expected total time that $B(t)$ spends in $K$ is zero.

19. Let $X_1, X_2, \ldots$ be a martingale and assume that $X_1(\omega)$ and the increments $X_n(\omega) - X_{n-1}(\omega)$ are bounded by a constant independent of $n$ and $\omega$. Let $\tau$ be a stopping time with a finite mean. Show that $X_\tau$ is integrable and $E(X_\tau) = E(X_1)$.

20. Let $X(t)$ be independent standard normal variables, one for each dyadic rational point $t$. Let $W(0) = 0$ and $W(n) = \sum_{k=1}^{n} X(k)$ for integer $n$. Suppose that $W(t)$ is already defined for dyadic rationals of the form $k/2^n$ and put
$$W\left(\frac{2k+1}{2^{n+1}}\right) = \frac{1}{2}\left[W\left(\frac{k}{2^n}\right) + W\left(\frac{k+1}{2^n}\right)\right] + \frac{1}{2^{1+n/2}}\,X\left(\frac{2k+1}{2^{n+1}}\right).$$
Prove by induction that the process $W(t)$ for dyadic $t$ has the finite-dimensional distributions of Brownian motion. Now construct Brownian motion with continuous paths by extension. This avoids using the Kolmogorov extension theorem. (A numerical sketch of this construction follows problem 22.)

21. Let $\tau_x$ be the first time the Brownian motion hits a point $x > 0$: $\tau_x = \inf\{t : W(t) \ge x\}$. Show that the distribution of $\tau_x$ has a density
$$p(t, x) = \frac{x}{\sqrt{2\pi}}\,t^{-3/2}\,e^{-x^2/2t}.$$
Relate this to the heat equation.

22. Let $\rho(s, t)$ be the probability that the Brownian motion has at least one zero in the interval $(s, t)$. Use problem 21 and the Markov property to show that
$$\rho(s, t) = \frac{2}{\pi}\arccos\sqrt{\frac{s}{t}}.$$
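A minimal numerical sketch of the dyadic refinement in problem 20, referenced there: each pass halves the dyadic grid and adds an independent midpoint correction with the standard deviation $2^{-1-n/2}$ prescribed above. This is only an illustration; NumPy is assumed and the number of levels is an arbitrary choice.

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(1)

# Dyadic (Levy) construction of problem 20 restricted to [0, 1]:
# start from W(0) = 0, W(1) = X(1), then refine midpoints level by level.
levels = 12
W = np.array([0.0, rng.normal()])          # values at the grid k / 2^0
for n in range(levels):
    mid = (0.5 * (W[:-1] + W[1:])
           + 2.0 ** (-1 - n / 2) * rng.normal(size=W.size - 1))
    refined = np.empty(2 * W.size - 1)
    refined[0::2], refined[1::2] = W, mid  # interleave old points, midpoints
    W = refined
# W now holds one path sampled at k / 2^levels; over many runs the
# increments are independent with Var(W(t) - W(s)) = t - s.
\end{verbatim}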
23. Let $Y(t)$ be the jump process constructed as follows: $Y(0) = y_0$ and $Y(t) = y_0$ for $t < \tau_1$. The random time $\tau_1$ has the distribution $P(\tau_1 > t) = e^{-\sigma t}$ with $\sigma > 0$. At the time $\tau_1$ the process $Y(t)$ jumps to a random value $y_1$ with probability density $p(y_1, y_0)$. This continues: $Y(t)$ jumps at a time $\tau_2$ such that $P(\tau_2 - \tau_1 > t) = e^{-\sigma t}$, and at that time it jumps to a value $y_2$ with probability density $p(y_2, y_1)$, and so on.
(i) Find the generator of $Y(t)$.
(ii) Consider the family of processes $Y_N(t)$ as above, with $\sigma_N = N\sigma$. Find $\alpha$ and $\beta$ so that the generators of the processes $Z_N(t) = N^{-\beta} Y_N(N^{\alpha} t)$ converge to the generator of the one-dimensional Brownian motion.
(iii) Introduce also the process $X(t) = \int_0^t Y(s)\,ds$. Write down the joint generator of $X(t)$ and $Y(t)$. Find a rescaling $X_N(t) = N^{-\beta} X(N^{\alpha} t)$ so that the generator of $X_N(t)$ converges to that of the Brownian motion.

Set #3

24. Let a function $f(s, \omega)$ vary smoothly in $t$ in the sense that
$$E\left(|f(s, \omega) - f(t, \omega)|^2\right) \le K|s - t|^{1+\varepsilon}$$
for all $0 \le s, t \le T$ and some $\varepsilon > 0$. Prove that then all stochastic integrals coincide, in the sense that
$$\int_0^T f(t, \omega)\,dB_t = \lim_{\max_j \Delta t_j \to 0} \sum_j f(t_j^*, \omega)\,\Delta B_j, \qquad \Delta B_j = B_{t_{j+1}} - B_{t_j},$$
for any choice of $t_j^* \in [t_j, t_{j+1}]$. In particular, the Itô and Stratonovich integrals coincide for such functions.

25. The notation $\int f \circ dB$ means that we are talking about the Stratonovich integral.
(i) Compute $\int_0^t B_s \circ dB_s$ and $\int_0^t B_s^2 \circ dB_s$.
(ii) Let $X_t$ satisfy the Stratonovich SDE $dX_t = rX_t\,dt + \alpha X_t \circ dB_t$; rewrite it as an Itô equation and solve for $X_t$.

26. Use Itô's formula to write $dX_t = u(t, \omega)\,dt + v(t, \omega)\,dB_t$ for the following processes: $X_t = B_t^3$, $Y_t = e^{tX_t}$, $Z_t = B_1(t)^2 + B_2(t)^2$ (with $(B_1, B_2)$ a two-dimensional Brownian motion).

27. Let $X_t$ and $Y_t$ be Itô processes. Show that
$$d(X_t Y_t) = X_t\,dY_t + Y_t\,dX_t + dX_t\,dY_t.$$
What is the integration by parts formula for Itô integrals?

28. Let $\theta(t, \omega)$ be $\mathcal{F}_t$-adapted and square integrable. Show that
$$Z(t, \omega) = \exp\left[\int_0^t \theta(s, \omega)\,dB_s - \frac{1}{2}\int_0^t \theta^2(s, \omega)\,ds\right]$$
is a martingale.
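A quick Monte Carlo sanity check of problem 28, only an illustration: NumPy is assumed, and $\theta(s, \omega) = \cos B_s$ is an arbitrary bounded adapted choice; the martingale property forces $E[Z(t)] = Z(0) = 1$.

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(2)

# Euler check of problem 28 with theta(s, w) = cos(B_s):
# Z(t) = exp( int theta dB - (1/2) int theta^2 ds ) should have mean 1.
dt, n, paths = 1e-3, 500, 10_000
dB = rng.normal(scale=np.sqrt(dt), size=(paths, n))
B_left = np.cumsum(dB, axis=1) - dB   # B at the left endpoint of each step
theta = np.cos(B_left)                # adapted: uses only the past of B
logZ = np.sum(theta * dB - 0.5 * theta ** 2 * dt, axis=1)
print(np.exp(logZ).mean())            # approximately 1.0
\end{verbatim}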
29. Let $\beta_k(t) = E(B_t^k)$. Use Itô's formula to show that
$$\beta_k(t) = \frac{k(k-1)}{2}\int_0^t \beta_{k-2}(s)\,ds.$$

30. Let $X_t$ be an Itô process with $dX_t = v(t, \omega)\,dB_t$.
(a) Show that in general $X_t^2$ is not a martingale.
(b) Show that if $v$ is bounded then the process
$$M_t = X_t^2 - \int_0^t v_s^2\,ds$$
is a martingale. The process $\langle X, X\rangle_t := \int_0^t v_s^2\,ds$ is called the quadratic variation of the martingale $X_t$.

31. Define a smooth approximation of $g(x) = |x|$ as
$$g_\varepsilon(x) = \begin{cases} |x| & \text{if } |x| \ge \varepsilon, \\[2pt] \dfrac{1}{2}\left(\varepsilon + \dfrac{x^2}{\varepsilon}\right) & \text{if } |x| < \varepsilon. \end{cases}$$
(a) Show that Itô's formula can still be applied to $g_\varepsilon(x)$ even though it is not $C^2$.
(b) Use Itô's formula to deduce that
$$g_\varepsilon(B_t) = g_\varepsilon(B_0) + \int_0^t g_\varepsilon'(B_s)\,dB_s + \frac{1}{2\varepsilon}\,\big|\{s \in [0, t] : B_s \in (-\varepsilon, \varepsilon)\}\big|.$$
(c) Prove that
$$\int_0^t g_\varepsilon'(B_s)\,\chi(B_s \in (-\varepsilon, \varepsilon))\,dB_s = \int_0^t \frac{B_s}{\varepsilon}\,\chi(B_s \in (-\varepsilon, \varepsilon))\,dB_s \to 0$$
in $L^2(P)$ as $\varepsilon \to 0$.
(d) Let $\varepsilon \to 0$ and conclude that
$$|B_t| = |B_0| + \int_0^t \mathrm{sgn}(B_s)\,dB_s + L_t(\omega),$$
where $L_t(\omega)$ is the local time defined as
$$L_t = \lim_{\varepsilon \to 0} \frac{1}{2\varepsilon}\,\big|\{s \in [0, t] : B_s \in (-\varepsilon, \varepsilon)\}\big|$$
and
$$\mathrm{sgn}(x) = \begin{cases} -1 & \text{if } x \le 0, \\ 1 & \text{if } x > 0. \end{cases}$$
(e) Show that $Y_t = \int_0^t \mathrm{sgn}(B_s)\,dB_s$ is $\mathcal{M}_t$-measurable, where $\mathcal{M}_t$ is the $\sigma$-algebra generated by $|B_s|$, $s \le t$.
(f) Show that if $X_t$ is a strong solution of $dX_t = \mathrm{sgn}(X_t)\,dB_t$ then $X_t$ is a Brownian motion. Observe that then $dB_t = \mathrm{sgn}(X_t)\,dX_t$. Use (e) to conclude that a strong solution of $dX_t = \mathrm{sgn}(X_t)\,dB_t$ may not exist.
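The occupation-time limit defining $L_t$ in problem 31(d) can be watched numerically. The sketch below is only an illustration (NumPy assumed; the step size, horizon, and values of $\varepsilon$ are arbitrary choices, with $\varepsilon$ kept well above $\sqrt{dt}$): it compares $\frac{1}{2\varepsilon}|\{s \le t : B_s \in (-\varepsilon, \varepsilon)\}|$ with the Tanaka value $|B_t| - \int_0^t \mathrm{sgn}(B_s)\,dB_s$ on one discretized path.

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(3)

# One discretized Brownian path on [0, 10] with B_0 = 0.
dt, n = 1e-5, 1_000_000
dB = rng.normal(scale=np.sqrt(dt), size=n)
B = np.cumsum(dB)
B_left = B - dB                            # left endpoints for the Ito sums
sgn = np.where(B_left > 0, 1.0, -1.0)      # sgn as defined in problem 31
tanaka = np.abs(B[-1]) - np.sum(sgn * dB)  # |B_t| - int sgn(B) dB, i.e. L_t
for eps in (0.1, 0.05, 0.02):              # keep eps well above sqrt(dt)
    occupation = dt * np.count_nonzero(np.abs(B) < eps) / (2 * eps)
    print(eps, occupation, tanaka)         # occupation approaches tanaka
\end{verbatim}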
32. (a) Let $Y(t) = (\cos B_t, \sin B_t)$. Find the generator of the process $Y(t)$; it has the form
$$Lf(x_1, x_2) = \frac{1}{2}\sum_{i,j=1}^{2} a_{ij}(x_1, x_2)\,\frac{\partial^2 f}{\partial x_i \partial x_j} + \sum_{j=1}^{2} d_j(x)\,\frac{\partial f}{\partial x_j}.$$
Show that if $f(x_1, x_2) = f(|x|)$, $|x| = \sqrt{x_1^2 + x_2^2}$, then $Lf = 0$, and explain why.
(b) Define the Brownian motion on an ellipse, find its generator and the kernel.

33. Solve the Ornstein-Uhlenbeck equation
$$dX_t = \mu X_t\,dt + \sigma\,dB_t;$$
find $E(X_t)$ and $E(X_t - E(X_t))^2$. Do the same for the mean-reverting Ornstein-Uhlenbeck equation
$$dX_t = (m - X_t)\,dt + \sigma\,dB_t.$$
(A numerical check follows problem 34.)

34. Find the solution of the stochastic Lotka-Volterra model
$$dX_t = rX_t(K - X_t)\,dt + \beta X_t\,dB_t.$$
Discuss it in terms of population dynamics.
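For the mean-reverting equation of problem 33 the moments work out to $E(X_t) = m + (X_0 - m)e^{-t}$ and $E(X_t - E(X_t))^2 = \sigma^2(1 - e^{-2t})/2$, which gives a target for a numerical check. The Euler-Maruyama sketch below is only an illustration; NumPy is assumed and all numerical parameters are arbitrary choices.

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(4)

# Euler-Maruyama for dX_t = (m - X_t) dt + sigma dB_t, checked against
# E X_t = m + (x0 - m) e^{-t} and Var X_t = sigma^2 (1 - e^{-2t}) / 2.
m, sigma, x0, t = 1.0, 0.5, 3.0, 2.0
dt, paths = 1e-3, 50_000
X = np.full(paths, x0)
for _ in range(int(t / dt)):
    X += (m - X) * dt + sigma * np.sqrt(dt) * rng.normal(size=paths)
print(X.mean(), m + (x0 - m) * np.exp(-t))             # approx 1.2707
print(X.var(), sigma ** 2 * (1 - np.exp(-2 * t)) / 2)  # approx 0.1227
\end{verbatim}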