
3.11 Exercises

Exercise 3.1 Consider the Ornstein-Uhlenbeck process in Example 3.1.7 (B). Show that the defined process is a Markov process which converges in distribution to an N(0, σ²/2α)-distributed random variable. If X_0 =_d N(0, σ²/2α), show that X_t =_d N(0, σ²/2α) (in other words: the N(0, σ²/2α) distribution is an invariant distribution for the Markov process). Show that X_t is a Gaussian process with the given mean and covariance functions.

Exercise 3.2 Complete the proof of Lemma 3.1.9.

Exercise 3.3 Let W be a BM. Show that the reflected Brownian motion defined by X = |X_0 + W| is a Markov process with respect to its natural filtration and compute its transition function. (Hint: calculate the conditional probability P_ν{X_t ∈ B | F^X_s} by conditioning further on F^W_s.)

Exercise 3.4 Let X be a Markov process with state space E and transition function (P_t)_{t≥0}. Show that for every bounded, measurable function f on E and for all t ≥ 0, the process (P_{t−s}f(X_s))_{s∈[0,t]} is a martingale.

Exercise 3.5 Prove that the measures µ_{t_1,...,t_n} defined in the proof of Corollary 3.2.2 are probability measures that form a consistent system. Hint: for showing that they are probability measures, look at the proof of the Fubini theorem.

Exercise 3.6 Work out the details of the proof of Lemma 3.2.3.

Exercise 3.7 Show for the Poisson process X with initial distribution ν = δ_x in Example 3.1.8 that X is a Markov process w.r.t. the natural filtration, with the transition function specified in the example.

Exercise 3.8 Show (Corollary 3.3.6) that canonical Brownian motion has the strong Markov property.

Exercise 3.9 Prove Lemma 3.4.4 and Theorem 3.4.5.

Exercise 3.10 Let X be a canonical, right-continuous Markov process with state space (E, 𝓔), and let x ∈ E. Consider the stopping time σ_x = inf{t > 0 : X_t ≠ x}.
i) Using the Markov property, show that for every x ∈ E and all s, t ≥ 0,
P_x{σ_x > t + s} = P_x{σ_x > t} · P_x{σ_x > s}.
ii) Conclude that there exists an a ∈ [0, ∞], possibly depending on x, such that P_x(σ_x > t) = e^{−at}.
Remark: this leads to a classification of the points in the state space of a right-continuous canonical Markov process. A point for which a = 0 is called an absorption point or a trap. If a ∈ (0, ∞), the point is called a holding point. Points for which a = ∞ are called regular.
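The exponential form P_x(σ_x > t) = e^{−at} from ii) can be sanity-checked with a short simulation. In a jump process that attempts jumps at rate λ but stays at x with probability p at each attempt (λ and p below are illustrative choices, not taken from the notes), the time to leave x is a geometric sum of Exp(λ) waiting times, hence again exponential, with rate a = λ(1 − p):

```python
import random, math

def exit_time(lam, p, rng):
    """Time to leave state x: Exp(lam) waits; each attempted jump stays with prob p."""
    t = 0.0
    while True:
        t += rng.expovariate(lam)   # waiting time until the next attempted jump
        if rng.random() >= p:       # with probability 1 - p the jump actually leaves x
            return t

rng = random.Random(42)
lam, p = 2.0, 0.25                  # illustrative; exit rate a = lam * (1 - p) = 1.5
a = lam * (1 - p)
samples = [exit_time(lam, p, rng) for _ in range(20000)]

mean = sum(samples) / len(samples)
print(round(mean, 3), round(1 / a, 3))          # empirical vs. theoretical mean 1/a

t0 = 1.0                                        # empirical survival vs. e^{-a t0}
surv = sum(s > t0 for s in samples) / len(samples)
print(round(surv, 3), round(math.exp(-a * t0), 3))
```

The two printed pairs should agree up to Monte Carlo error, illustrating that a holding point is left after an exponentially distributed time.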

iii) Determine a for the Markov jump process (in terms of λ and the stochastic matrix P) (cf. Example 3.2.5) and for the Poisson process. Hint: compute E_x σ_x.
iv) Given that the process starts in state x, what is the probability that the new state is y after time σ_x, for the Markov jump process?

Exercise 3.11 Consider the situation of Exercise 3.10. Suppose in addition that X has the strong Markov property. Suppose that x ∈ E is a holding point, i.e. a point for which a ∈ (0, ∞).
i) Observe that σ_x < ∞, P_x-a.s., and that {X_{σ_x} = x, σ_x < ∞} ⊂ {σ_x ∘ θ_{σ_x} = 0, σ_x < ∞}.
ii) Using the strong Markov property, show that P_x{X_{σ_x} = x, σ_x < ∞} = P_x{X_{σ_x} = x, σ_x < ∞} · P_x{σ_x = 0}.
iii) Conclude that P_x{X_{σ_x} = x, σ_x < ∞} = 0, i.e. a canonical Markov process with right-continuous paths, satisfying the strong Markov property, can only leave a holding point by a jump.

Exercise 3.12 Prove Corollary 3.3.7 and Lemma 3.3.8.

Exercise 3.13 Show for Example 3.4.3 that X is a Markov process, and show the validity of the assertions stated. Explain which condition of Theorem 3.3.4 fails in this example.

Exercise 3.14 Show that the maps φ and ψ in the proof of Theorem 3.4.1 are Borel measurable.

Exercise 3.15 Derive the expression for the joint density of BM and its running maximum given in Corollary 3.4.2.

Exercise 3.16 Let W be a standard BM and S_t its running maximum. Show that for all t ≥ 0 and x > 0,
P{S_t ≥ x} = P{τ_x ≤ t} = 2 P{W_t ≥ x} = P{|W_t| ≥ x}.

Exercise 3.17 Prove Corollary 3.4.6.

Exercise 3.18 Consider the Poisson process. Define an appropriate Banach space such that the associated transition function is a strongly continuous semigroup. Show that the generator of the Poisson process is given by
Af(x) = λf(x + 1) − λf(x), x ∈ Z_+.
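The generator formula of Exercise 3.18 lends itself to a direct numerical check: writing P_t f(x) = Σ_k e^{−λt}(λt)^k/k! · f(x + k), the difference quotient (P_t f(x) − f(x))/t should converge to λ(f(x + 1) − f(x)) as t ↓ 0. A short sketch (the rate λ and the test function f are arbitrary illustrative choices):

```python
import math

def poisson_semigroup(f, x, t, lam, kmax=60):
    """P_t f(x) = E[f(x + N_t)] for a Poisson process N with rate lam (series truncated)."""
    return sum(math.exp(-lam * t) * (lam * t) ** k / math.factorial(k) * f(x + k)
               for k in range(kmax))

lam = 1.7
f = lambda x: 1.0 / (1.0 + x * x)    # a function on Z_+ vanishing at infinity
x = 3

for t in (1e-1, 1e-2, 1e-3):
    diff_quot = (poisson_semigroup(f, x, t, lam) - f(x)) / t
    print(t, diff_quot)              # approaches the generator value as t shrinks

gen = lam * (f(x + 1) - f(x))        # A f(x) = lam (f(x+1) - f(x))
print(gen)
```

As t decreases, the difference quotient settles on λ(f(x+1) − f(x)), consistent with strong continuity of the semigroup.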

Exercise 3.19 Show for the generator A of the Ornstein-Uhlenbeck process (cf. Example 3.1.7 (B) and 3.5.7) that
Af(x) = ½ σ² f''(x) − αx f'(x), x ∈ R, f ∈ {g : R → R | g, g'', ĝ ∈ C_0(R), where ĝ(x) = x g'(x)}.
You may use the expression for the generator of Brownian motion derived in Example 3.6.5. Hint: denote by P^X_t and P^W_t the transition functions of the Ornstein-Uhlenbeck process and BM respectively. Show that P^X_t f(x) = P^W_{g(t)} f(e^{−αt} x), where g(t) = σ²(1 − e^{−2αt})/2α.

Exercise 3.20 Prove the Integration Lemma.

Exercise 3.21 Prove the claim made in Example 3.5.13. Hint: to derive the explicit expression for the resolvent kernel it is necessary to calculate integrals of the form
∫_0^∞ e^{−a²t − b²/t} dt/√t.
To this end, first perform the substitution t = (b/a)s². Next, make the change of variables u = s − 1/s and observe that u(s) = s − 1/s is a continuously differentiable bijective function from (0, ∞) to R, the inverse u^{−1} : R → (0, ∞) of which satisfies u^{−1}(t) − u^{−1}(−t) = t, whence (u^{−1})'(t) + (u^{−1})'(−t) = 1.

Exercise 3.22 Prove the validity of the expression for the resolvent of the Markov jump process in Example 3.5.14.

Exercise 3.23 Show that the Markov process from Example 3.2.5 is a Feller-Dynkin process if P C_0(E) ⊂ C_0(E). Give an example of a Markov jump process that is not a Feller-Dynkin process.

Exercise 3.24 Prove Lemma 3.6.6. Use this lemma to show the validity of the expression for the generator of W_t², with W_t a standard BM, given in Example 3.6.7.

Exercise 3.25 In the proof of Lemma 3.6.11, show that P{η_r ≥ nt} ≤ p̂^n for n = 0, 1, ...

Exercise 3.26 (Branching model in continuous time) Let E = Z_+ = {0, 1, 2, ...}. Let λ, µ > 0. Cells in a certain population either split or die (independently of other cells in the population) after an exponentially distributed time with parameter λ + µ. With probability λ/(λ + µ) the cell then splits, and with probability µ/(λ + µ) it dies. Denote by X_t the number of living cells at time t.
This is an (E, 𝓔)-valued stochastic process, where 𝓔 is the collection of all subsets of E. Assume that it is a Markov jump process.
i) Argue that the generator Q is given by
Q(i, j) = λi,          j = i + 1,
Q(i, j) = −(λ + µ)i,   j = i,
Q(i, j) = µi,          j = i − 1, i > 0.
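As a sanity check on these rates, one can truncate the state space to {0, ..., N}, integrate the Kolmogorov forward equation dp/dt = pQ numerically, and compare Σ_j z^j p_j(t) with the closed-form generating function given in part ii) (case µ ≠ λ). This is only a rough numerical sketch; the truncation level N, the step size, and the rates are illustrative:

```python
import math

lam, mu, N = 1.0, 2.0, 80        # illustrative birth/death rates and truncation level

def forward_rhs(p):
    """Kolmogorov forward equation dp_j/dt = sum_i p_i Q(i, j), truncated at N."""
    dp = [0.0] * (N + 1)
    for j in range(N + 1):
        d = -(lam + mu) * j * p[j]          # Q(j, j) = -(lam + mu) j
        if j >= 1:
            d += lam * (j - 1) * p[j - 1]   # births: Q(j-1, j) = lam (j-1)
        if j < N:
            d += mu * (j + 1) * p[j + 1]    # deaths: Q(j+1, j) = mu (j+1)
        dp[j] = d
    return dp

def rk4_step(p, h):
    k1 = forward_rhs(p)
    k2 = forward_rhs([pi + h / 2 * ki for pi, ki in zip(p, k1)])
    k3 = forward_rhs([pi + h / 2 * ki for pi, ki in zip(p, k2)])
    k4 = forward_rhs([pi + h * ki for pi, ki in zip(p, k3)])
    return [pi + h / 6 * (a + 2 * b + 2 * c + d)
            for pi, a, b, c, d in zip(p, k1, k2, k3, k4)]

h, steps = 0.002, 500            # integrate up to t = 1.0
t = h * steps
p = [0.0] * (N + 1)
p[1] = 1.0                       # X_0 = 1 a.s.
for _ in range(steps):
    p = rk4_step(p, h)

z = 0.5
G_num = sum(z ** j * pj for j, pj in enumerate(p))
# closed-form generating function from part ii), case mu != lam
num = mu * (1 - z) * math.exp(-mu * t) - (mu - lam * z) * math.exp(-lam * t)
den = lam * (1 - z) * math.exp(-mu * t) - (mu - lam * z) * math.exp(-lam * t)
print(G_num, num / den)          # the two values should nearly coincide
```

If the truncation level is large enough, the two printed numbers nearly coincide, supporting both the sign pattern of Q and the closed form.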

ii) Suppose X_0 = 1 a.s. We would like to compute the generating function G(z, t) = Σ_j z^j P_1{X_t = j}. Show (using the Kolmogorov forward equations) that G satisfies the partial differential equation
∂G/∂t = (λz − µ)(z − 1) ∂G/∂z,
with boundary condition G(z, 0) = z. Show that this PDE has solution
G(z, t) = (λt(1 − z) + z)/(λt(1 − z) + 1) if µ = λ,
G(z, t) = (µ(1 − z)e^{−µt} − (µ − λz)e^{−λt})/(λ(1 − z)e^{−µt} − (µ − λz)e^{−λt}) if µ ≠ λ.
iii) Compute E_1 X_t by differentiating G appropriately. Compute lim_{t→∞} E_1 X_t.
iv) Compute the extinction probability P_1{X_t = 0}, as well as lim_{t→∞} P_1{X_t = 0} (use G). What conditions on λ and µ ensure that the cell population dies out a.s.?

Exercise 3.27 Suppose that X is a real-valued canonical continuous Feller-Dynkin process with generator
Af(x) = α(x)f'(x) + ½ f''(x), x ∈ R,
for f ∈ D = {g : R → R | g, g', g'' ∈ C_0(R)}, where α is an arbitrary but fixed continuous, bounded function on R. Suppose that there exists a function f ∈ D, f ≠ 0, such that
Af(x) = 0, x ∈ R. (3.11.1)
Then the martingale M^f_t has a simpler structure, namely M^f_t = f(X_t) − f(X_0).
i) Show that for f ∈ D(A) satisfying (3.11.1), Dynkin's formula E_x f(X_τ) = f(x) holds for all x ∈ E. Hence the requirement that E_x τ < ∞ is not necessary!
Let (a, b) ⊂ R, a < b. Put τ = inf{t > 0 : X_t ∈ (−∞, a] ∪ [b, ∞)}. Define p_x = P_x{X_τ = b}.
ii) Assume that τ < ∞, P_x-a.s., for all x ∈ (a, b). Prove that
p_x = (f(x) − f(a))/(f(b) − f(a)), x ∈ (a, b),
provided that f(b) ≠ f(a).
iii) Let X be a real-valued canonical, right-continuous Feller-Dynkin process such that X_t = X_0 + βt + σW_t, where X_0 and (W_t)_t are independent and (W_t)_t is a standard BM. Show for the generator A that D(A) ⊃ D = {g : R → R | g, g', g'' ∈ C_0(R)} and that A is given by Af = βf' + ½σ²f'' for f ∈ D (you may use the generator of BM). Show that τ < ∞, P_x-a.s., for x ∈ (a, b). Determine p_x for x ∈ (a, b). Hint: you have to solve a simple differential equation to find f with βf' + σ²f''/2 = 0. This f is not a C²_0(R) function. Explain why this is no problem, since X_t only lives on [a, b] until the stopping time.
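For part iii), one solution of βf' + σ²f''/2 = 0 is f(x) = e^{−2βx/σ²}, and part ii) then predicts p_x = (f(x) − f(a))/(f(b) − f(a)). This prediction can be checked, roughly, by Monte Carlo with an Euler discretization (all numerical values below are illustrative, and the discretization introduces a small bias at the boundaries):

```python
import math, random

beta, sigma = 0.5, 1.0            # illustrative drift and volatility
a, b, x0 = 0.0, 1.0, 0.5
f = lambda x: math.exp(-2 * beta * x / sigma ** 2)   # candidate scale function
p_theory = (f(x0) - f(a)) / (f(b) - f(a))

rng = random.Random(7)
dt, sqdt = 1e-3, math.sqrt(1e-3)
n_paths, hits_b = 4000, 0
for _ in range(n_paths):
    x = x0
    while a < x < b:              # run until the path leaves (a, b)
        x += beta * dt + sigma * sqdt * rng.gauss(0.0, 1.0)
    if x >= b:
        hits_b += 1

print(hits_b / n_paths, p_theory)
```

The empirical exit frequency at b should match the formula up to Monte Carlo and discretization error.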

iv) Let X be the Ornstein-Uhlenbeck process (cf. Example 3.1.5 (B) and 3.3.14). Show that τ < ∞, P_x-a.s., and determine p_x for x ∈ (a, b). You may use the result of Exercise 3.19 on the generator of the Ornstein-Uhlenbeck process. See also the hint of (iii). Notice that the solution can only be represented as an integral.

Exercise 3.28 We want to construct a standard BM in R^d (d < ∞): this is an R^d-valued process W = (W^1, ..., W^d), where W^1, ..., W^d are independent standard BMs in R.
i) Sketch how to construct d-dimensional BM.
ii) Show that W has stationary, independent increments.
iii) Show that W is a Feller-Dynkin process with respect to the natural filtration, with transition function
P_t f(x) = (2πt)^{−d/2} ∫_{R^d} f(y) e^{−‖y−x‖²/2t} dy,
where y = (y_1, ..., y_d), x = (x_1, ..., x_d) ∈ R^d and ‖y − x‖ = (Σ_{i=1}^d (y_i − x_i)²)^{1/2} is the Euclidean (L²) norm.

Exercise 3.29 (Continuation of Exercise 3.28) Let X be an R^d-valued canonical continuous Feller-Dynkin process such that X_t = X_0 + W_t, where X_0 is an R^d-valued random variable and (W_t)_t a standard d-dimensional BM that is independent of X_0. Notice that X is strong Markov. We would like to show that the generator is defined by
Af(x) = ½ Δf(x), (3.11.2)
where Δf(x) = Σ_{i=1}^d ∂²f(x)/∂x_i² is the Laplacian of f, for f ∈ D = {f : R^d → R | f, ∂f/∂x_i, ∂²f/∂x_i∂x_j ∈ C_0(R^d), i, j = 1, ..., d}. We again want to use the characteristic operator. To this end, define for r > 0
τ_r = inf{t ≥ 0 : ‖X_t − X_0‖ ≥ r}.
i) Argue that τ_r is a finite (F_t)_t-stopping time. Show that E_x τ_r = r²/d (by using optional stopping). Argue that X_{τ_r} has the uniform distribution on {y : ‖y − x‖ = r}.
ii) Show the validity of (3.11.2) for f ∈ D (use the characteristic operator). Argue that this implies D(A) ⊃ D.
iii) For 0 < a < ‖x‖ < b, show that
P_x{T_a < T_b} = (log b − log ‖x‖)/(log b − log a) if d = 2,
P_x{T_a < T_b} = (‖x‖^{2−d} − b^{2−d})/(a^{2−d} − b^{2−d}) if d ≥ 3,
where T_a = inf{t ≥ 0 : ‖X_t‖ ≤ a} and T_b = inf{t ≥ 0 : ‖X_t‖ ≥ b}. Hint: use a similar procedure as in Exercise 3.27.
iv) Compute P_x{T_a < ∞} for x with a < ‖x‖.
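The expressions in Exercise 3.29 iii) are driven by the functions h(x) = log‖x‖ (d = 2) and h(x) = ‖x‖^{2−d} (d ≥ 3), which are harmonic away from the origin (Δh = 0), so ½Δh = 0 and the exit-probability argument of Exercise 3.27 applies coordinate-free. Harmonicity is easy to verify with central finite differences (the evaluation points and step size below are arbitrary):

```python
import math

def laplacian(h, x, eps=1e-3):
    """Central finite-difference approximation of the Laplacian of h at the point x."""
    d = len(x)
    total = 0.0
    for i in range(d):
        xp = list(x); xp[i] += eps
        xm = list(x); xm[i] -= eps
        total += (h(xp) - 2 * h(x) + h(xm)) / eps ** 2
    return total

norm = lambda x: math.sqrt(sum(c * c for c in x))

h2 = lambda x: math.log(norm(x))          # d = 2: log |x|
h3 = lambda x: norm(x) ** (2 - 3)         # d = 3: |x|^{2-d} = 1/|x|

print(laplacian(h2, [0.7, -1.2]))         # approximately 0
print(laplacian(h3, [0.7, -1.2, 0.4]))    # approximately 0
```

Both printed values vanish up to finite-difference error, away from the singularity at the origin.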

Exercise 3.30 Prove (3.7.1) in the proof of Lemma 3.7.1.

Exercise 3.31 Suppose that E ⊂ R^d. Show that every countable, dense subset H of the space C_0^+(E) of non-negative functions in C_0(E) separates the points of E_δ. This means that for all x ≠ y in E there exists a function h ∈ H such that h(x) ≠ h(y), and for all x ∈ E there exists a function h ∈ H such that h(x) ≠ h(δ) = 0.

Exercise 3.32 Let (X, d) be a compact metric space (with metric d). Let H be a class of non-negative, continuous functions on X that separates the points of X. Prove that d(x_n, x) → 0 if and only if h(x_n) → h(x) for all h ∈ H. Hint: suppose that H = {h_1, h_2, ...}, endow R^∞ with the product topology and consider the map A(x) = (h_1(x), h_2(x), ...).

Exercise 3.33 Let X, Y be two random variables defined on the same probability space, taking values in the Polish space E equipped with the Borel σ-algebra. Show that X = Y a.s. if and only if Ef(X)g(Y) = Ef(X)g(X) for all C_0(E)-functions f and g on E. Hint: use the monotone class theorem (see BN) and consider the class H = {h : E × E → R | h is 𝓔 ⊗ 𝓔-measurable, ‖h‖_∞ < ∞, Eh(X, Y) = Eh(X, X)}.

Exercise 3.34 Let (F_t)_t be the usual augmentation of the natural filtration of a canonical, cadlag Feller-Dynkin process. Show that for every non-negative, F_t-measurable random variable Z and every finite stopping time τ, the random variable Z ∘ θ_τ is F_{τ+t}-measurable. Hint: first prove it for Z = 1_A, A ∈ F^X_t. Next, prove it for Z = 1_A, A ∈ F_t, and use the fact that A ∈ F^ν_t if and only if there exist B ∈ F^X_t and C, D ∈ N_ν such that B \ C ⊂ A ⊂ B ∪ D (this follows from Problem 10.1 in BN). Finally, prove it for arbitrary Z.

Exercise 3.35 Let X be a Feller-Dynkin canonical cadlag process and let (F_t)_t be the usual augmentation. Suppose that we have (F_t)_t-stopping times τ_n ↑ τ a.s. Show that lim_n X_{τ_n} = X_τ a.s. on {τ < ∞}. This is called the quasi-left continuity of Feller-Dynkin processes.
Hint: first argue that it is sufficient to show the result for bounded τ. Next, put Y = lim_n X_{τ_n} and explain why this limit exists. Use the strong Markov property to show for f, g ∈ C_0(E_δ) that
E_x f(Y)g(X_τ) = lim_{t↓0} lim_n E_x f(X_{τ_n})g(X_{τ_n + t}) = E_x f(Y)g(Y).
The claim then follows from Exercise 3.33.

Exercise 3.36 Let X be a progressively measurable canonical process w.r.t. the natural filtration. Let σ be a finite F^X_t-stopping time. Suppose that Z is an F^X_t-measurable random variable. Show that Z ∘ θ_σ is F^X_{σ+t}-measurable.