Exercises


Set #1

1. Construct an example of a sequence of probability measures P_n on R which converge weakly to a probability measure P but so that the first moments m_{1,n} = ∫ x dP_n do not converge to m_1 = ∫ x dP. Show that if sup_n ∫ |x|^k dP_n < +∞ then ∫ x^j dP_n → ∫ x^j dP for j < k.

2. Prove that a sequence of random variables S_n which is Cauchy in probability converges in probability to some random variable S.

3. Let X_1, X_2, ... be i.i.d. random variables. Prove that E|X_1| < +∞ if and only if P(|X_n| > n infinitely often) = 0.

4. Let X_1, X_2, ... be i.i.d. random variables with P(X_1 = 1) = p, P(X_1 = -1) = 1 - p, and set S_n = X_1 + ... + X_n. Show that if p ≠ 1/2 then P(S_n = 0 infinitely often) = 0, while if p = 1/2 then P(S_n = 0 infinitely often) = 1.

5. Prove that Lyapunov's condition implies the Lindeberg condition.

6. Let a probability measure µ satisfy µ{a} = µ{b} = 0. Show that then

   µ(a, b] = lim_{T→∞} (1/2π) ∫_{-T}^{T} [(e^{-ita} - e^{-itb})/(it)] φ(t) dt,

where φ(t) is the characteristic function of µ. This shows, in particular, that φ determines µ.

7. Use a similar argument to show that

   µ{a} = lim_{T→∞} (1/2T) ∫_{-T}^{T} e^{-ita} φ(t) dt.

8. Let x_1, x_2, ... be the atoms of a measure µ with characteristic function φ(t). Let also X and Y be two independent random variables which both have the characteristic function φ.

(i) Show that

   P(X - Y = 0) = lim_{T→∞} (1/2T) ∫_{-T}^{T} |φ(t)|² dt.

(ii) Show that

   P(X - Y = 0) = ∫ P(X = y) µ(dy) = Σ_k (µ{x_k})².

Conclude that lim_{T→∞} (1/2T) ∫_{-T}^{T} |φ(t)|² dt = Σ_k (µ{x_k})².

(iii) Show that µ has no point masses if φ(t) is in L².

9. Suppose that X is irrational with probability one. Let µ_n be the distribution of the fractional part {nX}. Show that (1/n) Σ_{k=1}^{n} µ_k converges weakly to the uniform distribution on [0, 1].

10. Suppose that |X_{nk}| ≤ M_n for all k ≤ r_n in the Lévy–Lindeberg theorem. Assume that M_n/s_n → 0. Verify that Lyapunov's condition holds.

11. Suppose the independent X_n have density |x|^{-3} outside [-1, 1] (and vanishing inside). Show that S_n/√(n log n) converges to a normal random variable.

12. Let Ω = [0, 1] and let d_n(ω) be the n-th digit in the binary expansion of ω ∈ [0, 1]. Let l_n(ω) be the length of the run of zeros starting at d_n, that is, l_n = 0 if d_n(ω) = 1, and l_n(ω) = k if d_n(ω) = ... = d_{n+k-1}(ω) = 0 while d_{n+k}(ω) = 1. Think of ω as the result of an infinite sequence of independent coin tosses with p = 1/2 for d_n = 0; this turns Ω into a probability space. Show that l_n(ω) is an α-mixing sequence with α_n = 4/2^n.

Set #2

13. Use the martingale convergence theorem to show that for independent random variables X_j with EX_k = 0 for all k ∈ N, if Σ_{k=1}^{∞} E(X_k²) < +∞ then the series Σ_{k=1}^{∞} X_k converges almost surely.

14. Consider a process X_1, X_2, ..., taking values in [0, +∞). Assume that x = 0 is an absorbing state in the sense that if X_n = 0 then X_{n+m} = 0 for all m. Let D be the event that the process is eventually absorbed at zero, that is, D = [there exists n such that X_n = 0]. Assume that for every x there exists δ > 0 so that P(D | X_1, X_2, ..., X_n) ≥ δ when X_n ≤ x, for n = 1, 2, .... Prove that almost surely either X_n is eventually absorbed or X_n → +∞.

15. Let Y_j be independent random variables with P(Y_j = ±1) = 1/2 and set S_n = Y_1 + ... + Y_n. Define N as the time until the first positive sum: N = min{n : S_n > 0}. Show that E(N) = +∞. Generalize to the i.i.d. case with EY_j = 0.

16. Let µ be a finite Borel measure on [0, 1] such that the map Tx = 2x mod 1 preserves µ. Show that µ is singular with respect to the Lebesgue measure.
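Problem 9 can be explored numerically: for a fixed irrational x the measures µ_k are point masses at {kx}, so (1/n) Σ_{k≤n} µ_k is exactly the empirical measure of the first n fractional parts. A minimal sketch (not part of the original set; the helper name and the choice x = √2 are mine):

```python
import math

# Illustration for problem 9 (deterministic case, x = sqrt(2) assumed):
# the fractional parts {k*x} equidistribute on [0, 1], so the averaged
# measures (1/n) sum_{k<=n} mu_k approach the uniform distribution.
def empirical_cdf(x, n, t):
    """Fraction of the points {k*x}, k = 1..n, that lie in [0, t]."""
    return sum(1 for k in range(1, n + 1) if (k * x) % 1.0 <= t) / n

x = math.sqrt(2.0)
for t in (0.1, 0.5, 0.9):
    # the uniform distribution on [0, 1] has CDF F(t) = t
    assert abs(empirical_cdf(x, 20000, t) - t) < 0.01
```

For √2 (bounded continued-fraction coefficients) the discrepancy decays like (log n)/n, so the empirical CDF is already very close to t at n = 20000.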

17. Show that if B_t is the standard Brownian motion then

   E[B_t^{2k}] = [(2k)!/(2^k k!)] t^k, for all k ∈ N.

18. Let B(t) = (B_1(t), ..., B_n(t)) be an n-dimensional Brownian motion (the B_j(t) are independent standard one-dimensional Brownian motions) and let K ⊂ R^n have Lebesgue measure zero. Show that the expected total time that B(t) spends in K is zero.

19. Let X_1, X_2, ... be a martingale and assume that X_1(ω) and the increments X_n(ω) - X_{n-1}(ω) are bounded by a constant independent of n and ω. Let τ be a stopping time with a finite mean. Show that X_τ is integrable and E(X_τ) = E(X_1).

20. Let X(t) be independent standard normal variables, one for each dyadic rational point t. Let W(0) = 0 and W(n) = Σ_{k=1}^{n} X(k). Suppose that W(t) is already defined for dyadic rationals of the form k/2^n and put

   W((2k+1)/2^{n+1}) = (1/2)[W(k/2^n) + W((k+1)/2^n)] + 2^{-(n/2)-1} X((2k+1)/2^{n+1}).

Prove by induction that the process W(t) for dyadic t has the finite-dimensional distributions of Brownian motion. Now construct Brownian motion with continuous paths by extension. This avoids using the Kolmogorov extension theorem.

21. Let τ_x be the first time the Brownian motion hits a point x > 0: τ_x = inf[t : W(t) ≥ x]. Show that the distribution of τ_x has the density

   p(t, x) = [x/√(2π)] t^{-3/2} e^{-x²/2t}.

Relate this to the heat equation.

22. Let ρ(s, t) be the probability that the Brownian motion has at least one zero in the interval (s, t). Use problem 21 and the Markov property to show that

   ρ(s, t) = (2/π) arccos √(s/t).

23. Let Y(t) be the jump process constructed as follows: Y(0) = y_0 and Y(t) = y_0 for t < τ_1. The random time τ_1 has the distribution function P(τ_1 > t) = e^{-σt} with σ > 0. At the time τ_1 the process Y(t) jumps to a random value y_1 with the probability density p(y_1, y_0). This continues: Y(t) jumps at a time τ_2 so that P(τ_2 - τ_1 > t) = e^{-σt} and at that time it

jumps to a value y_2 with probability density p(y_2, y_1), and so on.

(i) Find the generator for Y(t).

(ii) Consider the family of processes Y_N(t) as above, with σ_N = Nσ. Find α and β so that the generators for the processes Z_N(t) = N^{-β} Y_N(N^α t) converge to the generator of the one-dimensional Brownian motion.

(iii) Introduce also the process X(t) = ∫_0^t Y(s) ds. Write down the joint generator for X(t) and Y(t). Find a rescaling X_N(t) = N^{-β} X(N^α t) so that the generator for X_N(t) converges to that of the Brownian motion.

Set #3

24. Let a function f(s, ω) vary smoothly in time in the sense that

   E(|f(s, ω) - f(t, ω)|²) ≤ K|s - t|^{1+ε}

for all s, t ≤ T and some ε > 0. Prove that then all stochastic integrals coincide in the sense that

   ∫_0^T f(t, ω) dB_t = lim Σ_j f(t_j*, ω) ΔB_j

for any choice of t_j* ∈ [t_j, t_{j+1}]. In particular, the Itô and Stratonovich integrals coincide for such functions.

25. The notation ∫ f ∘ dB means that we are talking about the Stratonovich integral.

(i) Compute ∫_0^t B_s ∘ dB_s and ∫_0^t B_s² ∘ dB_s.

(ii) Let X_t satisfy the Stratonovich SDE dX_t = rX_t dt + αX_t ∘ dB_t; rewrite it as an Itô equation and solve for X_t.

26. Use the Itô formula to write dX_t = u(t, ω)dt + v(t, ω)dB_t for the following processes: X_t = B_t³, Y_t = e^{tX_t}, Z_t = B_1(t)² + B_2(t)².

27. Let X_t and Y_t be Itô processes. Show that

   d(X_t Y_t) = X_t dY_t + Y_t dX_t + dX_t dY_t.

What is the integration by parts formula for Itô integrals?

28. Let θ(t, ω) be F_t-adapted and square integrable. Show that

   Z(t, ω) = exp[ ∫_0^t θ(s, ω) dB_s - (1/2) ∫_0^t θ²(s, ω) ds ]

is a martingale.

29. Let β_k(t) = E(B_t^k); use Itô's formula to show that

   β_k(t) = [k(k-1)/2] ∫_0^t β_{k-2}(s) ds.

30. Let X_t be an Itô process with dX_t = v(t, ω) dB_t. (a) Show that in general X_t² is not a martingale. (b) Show that if v is bounded then the process

   M_t = X_t² - ∫_0^t v_s² ds

is a martingale. The process ⟨X, X⟩_t := ∫_0^t v_s² ds is called the quadratic variation of the martingale X_t.

31. Define a smooth approximation of g(x) = |x| as

   g_ε(x) = |x| if |x| ≥ ε,   g_ε(x) = (1/2)(ε + x²/ε) if |x| < ε.

(a) Show that Itô's formula can still be applied to g_ε even though it is not C².

(b) Use Itô's formula to deduce that

   g_ε(B_t) = g_ε(B_0) + ∫_0^t g_ε′(B_s) dB_s + (1/2ε) |{s ∈ [0, t] : B_s ∈ (-ε, ε)}|.

(c) Prove that

   ∫_0^t g_ε′(B_s) χ(B_s ∈ (-ε, ε)) dB_s = ∫_0^t (B_s/ε) χ(B_s ∈ (-ε, ε)) dB_s → 0

in L²(P) as ε → 0.

(d) Let ε → 0 and conclude that

   |B_t| = |B_0| + ∫_0^t sgn(B_s) dB_s + L_t(ω),

where L_t(ω) is the local time defined as

   L_t = lim_{ε→0} (1/2ε) |{s ∈ [0, t] : B_s ∈ (-ε, ε)}|

and sgn(x) = -1 if x ≤ 0, sgn(x) = 1 if x > 0.

(e) Show that Y_t = ∫_0^t sgn(B_s) dB_s is M_t-measurable, where M_t is the σ-algebra generated by |B_s|, s ≤ t.

(f) Show that if X_t is a strong solution of dX_t = sgn(X_t) dB_t then X_t is a Brownian motion. Observe that then dB_t = sgn(X_t) dX_t. Use (e) to conclude that a strong solution of dX_t = sgn(X_t) dB_t may not exist.

32. (a) Let Y(t) = (cos B_t, sin B_t) and find the generator of the process Y(t); it has the form

   Lf(x_1, x_2) = (1/2) Σ_{i,j=1}^{2} a_{ij}(x_1, x_2) ∂²f/(∂x_i ∂x_j) + Σ_{j=1}^{2} d_j(x) ∂f/∂x_j.

Show that if f(x_1, x_2) = f(|x|), |x| = √(x_1² + x_2²), then Lf = 0, and explain why.

(b) Define the Brownian motion on an ellipse, find its generator and the kernel.

33. Solve the Ornstein–Uhlenbeck equation

   dX_t = µX_t dt + σ dB_t;

find E(X_t) and E(X_t - E(X_t))². Do the same for the mean-reverting Ornstein–Uhlenbeck equation

   dX_t = (m - X_t) dt + σ dB_t.

34. Find the solution of the stochastic Lotka–Volterra model

   dX_t = rX_t(K - X_t) dt + βX_t dB_t.

Discuss it in terms of population dynamics.
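The moment formulas asked for in problem 33 can be cross-checked numerically. The sketch below (parameter values and function names are my own assumptions, not part of the problem set) compares the closed-form mean and variance of dX_t = µX_t dt + σ dB_t, namely E X_t = X_0 e^{µt} and E(X_t - E X_t)² = σ²(e^{2µt} - 1)/(2µ), with an Euler–Maruyama Monte Carlo estimate:

```python
import math
import random

# Monte Carlo sketch for problem 33 (illustration only; parameters assumed).
# Closed-form moments of dX_t = mu X_t dt + sigma dB_t with X_0 = x0:
#   E X_t = x0 exp(mu t),  Var X_t = sigma^2 (exp(2 mu t) - 1) / (2 mu).
def ou_mean(x0, mu, t):
    return x0 * math.exp(mu * t)

def ou_var(sigma, mu, t):
    return sigma * sigma * (math.exp(2.0 * mu * t) - 1.0) / (2.0 * mu)

def euler_maruyama(x0, mu, sigma, t, n_steps, rng):
    """One Euler-Maruyama path of the linear SDE, returning X_t."""
    dt = t / n_steps
    x = x0
    for _ in range(n_steps):
        x += mu * x * dt + sigma * rng.gauss(0.0, math.sqrt(dt))
    return x

rng = random.Random(3)
x0, mu, sigma, t = 1.0, -0.5, 0.4, 1.0
paths = [euler_maruyama(x0, mu, sigma, t, 200, rng) for _ in range(10000)]
m = sum(paths) / len(paths)
v = sum((p - m) ** 2 for p in paths) / len(paths)
assert abs(m - ou_mean(x0, mu, t)) < 0.02   # sample mean vs x0 e^{mu t}
assert abs(v - ou_var(sigma, mu, t)) < 0.02  # sample variance vs closed form
```

The same check applies to the mean-reverting form dX_t = (m - X_t)dt + σ dB_t, whose mean is m + (X_0 - m)e^{-t} and whose variance is σ²(1 - e^{-2t})/2.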