Markov processes. Course note 2: Martingale problems, recurrence properties of discrete time chains.


Institute for Applied Mathematics, WS 17/18
Massimiliano Gubinelli

Markov processes. Course note 2: Martingale problems, recurrence properties of discrete time chains. [version 1, 2017.11.1]

We introduce the notion of martingale problem for Markov chains and use it to discuss stochastic stability and recurrence properties of discrete state chains. See also Chapter 1 of Prof. Eberle's Markov Processes course.

1 Martingale problem

Let $(X_n)_n$ be a Markov chain on $E$ and let $f: E \to \mathbb{R}$ be a bounded function. One way to describe the dynamics of the process $(X_n)_n$ is to use the decomposition of the process $f(X_n)$ into a martingale and a predictable process:

$$ f(X_n) = M_n^f + A_{n-1}^f, \qquad n \geq 1. $$

There is only one such decomposition, and it follows that $A_{n-1}^f = \sum_{k=0}^{n-1} (Lf)(X_k)$, where $Lf = (\Pi - I)f$, with $\Pi$ the one-step transition kernel and $I$ the identity kernel. The operator $L: F_b(E) \to F_b(E)$ is called the generator of the discrete time chain. The martingale property of $(M_n^f)_n$ together with the generator characterises the Markov chain completely.

Theorem 1. $(X_n)_n$ is a Markov chain with one-step transition kernel $\Pi$ iff for any $f \in F_b(E)$ the process

$$ M_n^f := f(X_n) - \sum_{k=0}^{n-1} (Lf)(X_k), \qquad n \geq 0, $$

is a martingale.

The proof is left as an exercise. We postpone the continuous time formulation of the martingale problem, which requires some analytical sophistication. For now we concentrate on illustrating the connection between martingale and Markov properties, and we use martingales to establish basic criteria for stability and recurrence of Markov chains.

The exterior boundary $\partial D$ of a set $D \in \mathcal{E}$ w.r.t. the Markov chain is given by

$$ \partial D = \bigcup_{x \in D} \operatorname{supp} \pi(x, \cdot) \setminus D, $$
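As a quick illustration of Theorem 1 (a numerical sketch, not part of the notes: the three-state kernel and the test function below are made up), we can check by simulation that the compensated process $M_n^f$ has constant expectation, as the martingale property demands:

```python
import random

# Hypothetical 3-state chain (states 0, 1, 2) with one-step kernel Pi.
PI = [[0.5, 0.3, 0.2],
      [0.1, 0.6, 0.3],
      [0.4, 0.4, 0.2]]
f = [1.0, 5.0, 2.0]              # a bounded test function f: E -> R

def Lf(x):
    # generator: (Lf)(x) = (Pi f)(x) - f(x)
    return sum(PI[x][y] * f[y] for y in range(3)) - f[x]

def step(x, rng):
    return rng.choices(range(3), weights=PI[x])[0]

def M_trajectory(x0, n, rng):
    # M_n^f = f(X_n) - sum_{k<n} (Lf)(X_k): a martingale by Theorem 1
    x, comp = x0, 0.0
    for _ in range(n):
        comp += Lf(x)
        x = step(x, rng)
    return f[x] - comp

rng = random.Random(42)
samples = [M_trajectory(0, 20, rng) for _ in range(20000)]
est = sum(samples) / len(samples)
print(round(est, 2))             # should stay near E[M_0^f] = f(0) = 1.0
```

The empirical mean of $M_{20}^f$ stays close to $f(X_0)$, since a martingale has constant expectation.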

where the support $\operatorname{supp} \mu$ of a measure $\mu$ is defined as the smallest closed set $S$ such that $|\mu|(S^c) = 0$. Open sets contained in $(D \cup \partial D)^c$ cannot be reached by the chain in one step starting from $D$.

Example 2. For the simple random walk on $\mathbb{Z}^d$: $\partial D = \{ y \in \mathbb{Z}^d \setminus D : \exists x \in D,\ |x - y| = 1 \}$.

Recall that we denote by $T_D$ the hitting time of $D$: $T_D = \inf\{ n \geq 0 : X_n \in D \}$. Interesting quantities attached to a Markov chain are:

a) the exit probability from $D$ starting at $x \in D$: $P_x(T_{D^c} < \infty)$;

b) the law of the exit point: $P_x(X_{T_{D^c}} \in B,\ T_{D^c} < \infty)$;

c) the mean exit time: $E_x[T_{D^c}]$;

d) the average occupation of $B$ before exiting $D$ (the Green kernel of $D$):

$$ G_D(x, B) = E_x\Big[ \sum_{k=0}^{T_{D^c}-1} I_B(X_k) \Big] = \sum_{k \geq 0} P_x(X_k \in B,\ k < T_{D^c}); $$

e) the Laplace transform of the exit time: $E_x[e^{-\lambda T_{D^c}}]$;

f) the Laplace transform of the occupation time of $B$: $E_x\big[ e^{-\lambda \sum_{k=0}^{T_{D^c}-1} I_B(X_k)} \big]$.

Let $v, c \in F_+(E)$ and consider the process

$$ M_n := v(X_n) + \sum_{k=0}^{n-1} c(X_k) = M_n^v + \sum_{k=0}^{n-1} (Lv + c)(X_k), $$

where we made explicit its Doob decomposition. If $Lv + c \leq 0$ we deduce that $(M_n)_n$ is a non-negative supermartingale; the same applies to the stopped process $M_n^T = M_{n \wedge T}$, where $T$ is a stopping time. By the supermartingale convergence theorem the limit $M_\infty := \lim_n M_n$ exists almost surely, and in particular $M_{n \wedge T} \to M_T$ a.s. everywhere, even on $\{T = +\infty\}$.

Let now $D \in \mathcal{E}$, $T = T_{D^c}$, and $v, f \in F_+(E)$ be such that $Lv + c \leq 0$ on $D$ and $v \geq f$ on $\partial D$. Then

$$ M_T \geq v(X_T) I_{T < \infty} + \sum_{k=0}^{T-1} c(X_k) \geq f(X_T) I_{T < \infty} + \sum_{k=0}^{T-1} c(X_k), $$

and by Fatou we conclude that

$$ u(x) := E_x\Big[ f(X_T) I_{T < \infty} + \sum_{k=0}^{T-1} c(X_k) \Big] \leq E_x[M_T] \leq E_x[M_0] = v(x), \qquad x \in D. $$
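The Green kernel in (d) can be computed explicitly for a finite chain: $G_D = \sum_{k \ge 0} \Pi_D^k$, where $\Pi_D$ is the kernel restricted to $D$ (killed upon exit). A small sketch for the simple random walk on $\{0, \dots, N\}$, with the truncation level of the series chosen ad hoc; the row sums of $G_D$ then recover the mean exit time (c):

```python
# Green kernel of D = {1,...,N-1} for simple random walk on Z, killed on
# exiting D: G_D = sum_k Pi_D^k, where Pi_D is the kernel restricted to D.
# The row sums of G_D give the mean exit time E_x[T_{D^c}] = x(N-x).
N = 10
D = list(range(1, N))            # interior states
m = len(D)

P = [[0.0] * m for _ in range(m)]
for i, x in enumerate(D):
    for j, y in enumerate(D):
        if abs(x - y) == 1:
            P[i][j] = 0.5        # moves to 0 or N leave D (mass is killed)

G = [[1.0 if i == j else 0.0 for j in range(m)] for i in range(m)]  # k = 0 term
Pk = [row[:] for row in G]
for _ in range(5000):            # truncated Neumann series sum_k Pi_D^k
    Pk = [[sum(Pk[i][l] * P[l][j] for l in range(m)) for j in range(m)]
          for i in range(m)]
    for i in range(m):
        for j in range(m):
            G[i][j] += Pk[i][j]

mean_exit = [sum(G[i]) for i in range(m)]   # E_x[T_{D^c}] = sum_y G_D(x,{y})
print([round(t, 3) for t in mean_exit])     # should approximate x*(N-x)
```

The series converges geometrically (the spectral radius of $\Pi_D$ is $\cos(\pi/N) < 1$), so the truncation error is negligible here.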

Then all non-negative solutions to

$$ Lv + c = 0 \ \text{ on } D, \qquad v = f \ \text{ on } \partial D \qquad (1) $$

dominate $u$. Let us now assume that $T < \infty$ a.s. Then by the Markov property we have, for all $x \in D \setminus \partial D$,

$$ E_x\Big[ f(X_T) + \sum_{k=0}^{T-1} c(X_k) \,\Big|\, \mathcal{F}_1 \Big] = E_x\Big[ f(X_T) + \sum_{k=1}^{T-1} c(X_k) \,\Big|\, \mathcal{F}_1 \Big] + c(X_0) $$
$$ = E_x\Big[ \Big( f(X_T) + \sum_{k=0}^{T-1} c(X_k) \Big) \circ \theta_1 \,\Big|\, \mathcal{F}_1 \Big] + c(X_0) = E_{X_1}\Big[ f(X_T) + \sum_{k=0}^{T-1} c(X_k) \Big] + c(X_0) = u(X_1) + c(X_0), $$

since $T \geq 1$ $P_x$-a.s. if $x \in D \setminus \partial D$. From this we conclude that

$$ 0 = E_x[u(X_1)] + c(x) - u(x) = (Lu + c)(x), \qquad x \in D \setminus \partial D, $$

and moreover $u(x) = f(x)$ if $x \in \partial D$. So $u$ is a solution of problem (1). On the other hand, if $v$ is a solution to (1) and $f$ is a bounded function, then $u = v$, since all the inequalities in the above supermartingale argument become equalities. That is, the solution of the above problem is unique.

Remark 3. Extensions to absorbed chains are discussed in Eberle's notes.

2 Recurrence for countable Markov chains

These results hint at a link between superharmonic functions (i.e. $v$ with $Lv \leq 0$) and the asymptotic behaviour of the process. In order to understand this connection better, we consider the case of Markov chains on a countable state space $E$. Introduce the first return time to $A \in \mathcal{E}$,

$$ T_A^+ := \inf\{ n \geq 1 : X_n \in A \}, $$

and let $T_x^+ = T_{\{x\}}^+$ for $x \in E$.

Definition 4. A state $x \in E$ is recurrent if $P_x(T_x^+ < \infty) = 1$, otherwise it is transient. It is positive recurrent if $E_x[T_x^+] < +\infty$.

Let $N_x = \#\{ n \geq 1 : X_n = x \} = \sum_{n \geq 1} I_{X_n = x}$ be the number of visits to $x \in E$. By the strong Markov property, recurrence is equivalent to requiring that $P_x(N_x = +\infty) = 1$. If $x$ is transient we have $P_x(N_x = +\infty) = 0$ (see below for the proof).
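For the simple random walk on $\{0,\dots,N\}$ the boundary problem (1) can be solved numerically by iteration, and uniqueness lets us compare with the probabilistic representation of $u$: with $c = 0$ and $f = 1_{\{N\}}$ the solution is the exit-point probability $u(x) = P_x(X_T = N) = x/N$ (a sketch, parameters made up):

```python
# Discrete Dirichlet problem (1) for the simple random walk on {0,...,N}:
# Lu = 0 on D = {1,...,N-1}, u = f on the boundary {0, N}, with c = 0 and
# f = 1_{N}.  Probabilistically u(x) = P_x(exit at N) = x/N.
N = 20
u = [0.0] * (N + 1)
u[N] = 1.0                       # boundary data f
for _ in range(20000):           # Gauss-Seidel sweeps for u = Pi u on D
    for x in range(1, N):
        u[x] = 0.5 * (u[x - 1] + u[x + 1])

print(round(u[7], 4))            # compare with 7/N = 0.35
```

The iteration is just repeated averaging over neighbours, i.e. enforcing harmonicity $Lu = 0$ at every interior point; it converges geometrically to the unique solution.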

Introduce the sequence of passage times at $x \in E$: $T_x^0 = 0$, $T_x^n = \inf\{ k > T_x^{n-1} : X_k = x \}$ for $n \geq 1$. For $n \geq 1$, if $T_x^{n-1} < +\infty$, let $\tau_x^n := T_x^n - T_x^{n-1}$.

Proposition 5. (Regeneration) Let $x \in E$ and $n \geq 1$. Conditionally on $\{ T_x^n < +\infty \}$, the law of $\tau_x^{n+1}$ is independent of $(T_x^1, \dots, T_x^n)$ and

$$ P(\tau_x^{n+1} = k \mid T_x^n < +\infty) = P_x(T_x^+ = k), \qquad k \in \mathbb{N} \cup \{+\infty\}. $$

Proof. Exercise.

Lemma 6. For $n \geq 0$ we have $P_x(N_x \geq n) = f_x^n$, with $f_x := P_x(T_x^+ < +\infty)$.

Proof. Use the strong Markov property.

Remark 7. For any random variable $\tau: \Omega \to \mathbb{N} \cup \{+\infty\}$ we have

$$ E[\tau] = E\Big[ \sum_{k \geq 1} 1_{k \leq \tau} \Big] = \sum_{k \geq 1} P(\tau \geq k). \qquad (2) $$

Theorem 8. There is the following dichotomy:

i. $P_x(T_x^+ < \infty) = 1 \Rightarrow P_x(N_x = \infty) = 1$ and $\sum_{n \geq 1} \pi^n(x, x) = +\infty$;

ii. $P_x(T_x^+ < \infty) < 1 \Rightarrow P_x(N_x = \infty) = 0$ and $\sum_{n \geq 1} \pi^n(x, x) < +\infty$.

Proof. If $f_x = P_x(T_x^+ < \infty) = 1$ then by Lemma 6 we have

$$ P_x(N_x = +\infty) = \lim_{n \to \infty} P_x(N_x \geq n) = \lim_{n \to \infty} f_x^n = 1, $$

so $P_x(N_x = \infty) = 1$ and $\infty = E_x[N_x] = E_x[\sum_{n \geq 1} 1_{X_n = x}] = \sum_{n \geq 1} \pi^n(x, x)$. On the other hand, if $f_x < 1$, then by eq. (2) and Lemma 6,

$$ \sum_{n \geq 1} \pi^n(x, x) = E_x[N_x] = \sum_{n \geq 1} P_x(N_x \geq n) = \sum_{n \geq 1} f_x^n = \frac{f_x}{1 - f_x} < +\infty, $$

which implies that $P_x(N_x = +\infty) = 0$.

Introduce a transitive relation $x \to y$ on states $x, y \in E$, which holds when one of the following equivalent conditions does:

a) $P_x(T_y < +\infty) > 0$;

b) $\pi^n(x, y) > 0$ for some $n \geq 0$;

c) there exists a sequence of states $(x_k)_{k=0,\dots,n}$ such that $x_0 = x$, $x_n = y$ and $\pi(x_k, \{x_{k+1}\}) > 0$.
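The dichotomy of Theorem 8 can be checked numerically on a small transient example (a hypothetical three-state chain with a cemetery state): here $f_0 = b$ and the partial sums of $\pi^n(0,0)$ should approach $f_0/(1 - f_0)$:

```python
# Numerical check of the dichotomy of Theorem 8 at a transient state.
# Hypothetical chain: 0 -> 1 surely; from 1 back to 0 w.p. b, absorbed in
# a cemetery state 2 w.p. 1-b.  Then f_0 = P_0(T_0^+ < oo) = b and
# sum_{n>=1} pi^n(0,0) should equal f_0 / (1 - f_0).
b = 0.6
PI = [[0.0, 1.0, 0.0],
      [b,   0.0, 1.0 - b],
      [0.0, 0.0, 1.0]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

total, Pn = 0.0, PI
for n in range(1, 500):          # partial sum of pi^n(0,0), n >= 1
    total += Pn[0][0]
    Pn = matmul(Pn, PI)

print(round(total, 6))           # f_0/(1 - f_0) = 0.6/0.4 = 1.5
```

Returns to $0$ happen only at even times with probability $b^k$ for $k$ round trips, so the series is geometric and the truncation at $n = 500$ is far beyond convergence.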

Let $x \leftrightarrow y$ be the equivalence relation given by $x \to y$ and $y \to x$. This equivalence relation induces a partition of $E$ into equivalence classes, usually called communication classes. When there is only one class we say that the chain is irreducible.

Theorem 9. All the states in the same class are of the same type (either transient or recurrent).

Proof. If $x \leftrightarrow y$ then there are $N, M$ such that $\pi^N(x, y) > 0$ and $\pi^M(y, x) > 0$. A simple bound gives

$$ \pi^{2N + n + 2M}(x, x) \geq \pi^N(x, y)\, \pi^{M + n + N}(y, y)\, \pi^M(y, x) \geq [\pi^N(x, y)\, \pi^M(y, x)]^2\, \pi^n(x, x) $$

for all $n \geq 1$. Let $\theta = \pi^N(x, y)\, \pi^M(y, x) > 0$; then

$$ \sum_{k \geq 0} \pi^k(x, x) \geq \sum_{k \geq 2N + 2M} \pi^k(x, x) \geq \theta \sum_{k \geq N + M} \pi^k(y, y) \geq \theta^2 \sum_{k \geq 0} \pi^k(x, x), $$

so the series $\sum_k \pi^k(x, x)$ and $\sum_k \pi^k(y, y)$ converge or diverge together, and by Theorem 8 the states $x, y$ are both either transient or recurrent.

Remark 10. In particular, an irreducible chain is either recurrent or transient.

Proposition 11. A finite set $A \subseteq E$ such that $\pi(x, A^c) = 0$ for all $x \in A$ (a closed set) contains at least one recurrent state. In particular, a finite irreducible chain is recurrent.

Proof. Let $\#A < +\infty$ and assume, by contradiction, that $P_z(N_z = +\infty) = 0$ for all $z \in A$. Fix $x \in A$. For all $z \in A$ and $r \geq 1$, the strong Markov property gives

$$ P_x(N_z \geq r) = P_x(T_z < +\infty)\, P_z(N_z \geq r - 1). $$

Taking limits as $r \to +\infty$ we get

$$ P_x(N_z = +\infty) = P_x(T_z < +\infty)\, P_z(N_z = +\infty) = 0 $$

for all $z \in A$, and as a consequence

$$ 1 = P_x\Big( \bigcap_{z \in A} \{ N_z < +\infty \} \Big) = P_x\Big( \sum_{z \in A} N_z < +\infty \Big) = P_x\Big( \sum_{n \geq 1} 1_{X_n \in A} < +\infty \Big), $$

since $\sum_{z \in A} N_z = \sum_{z \in A} \sum_{n \geq 1} 1_{X_n = z} = \sum_{n \geq 1} 1_{X_n \in A}$ is the time spent in $A$ by the chain. But since the set $A$ is closed we have $P_x(X_n \in A) = 1$ for all $n \geq 0$, which implies that the time spent in $A$ is infinite $P_x$-a.s. A contradiction.

3 Foster-Lyapunov criteria for recurrence

We assume in this section that the chain is irreducible.

Theorem 12. A discrete Markov chain is

a) transient iff there exist $V \in F_+(E)$, a set $A \subseteq E$ and a state $y \in E$ such that $LV \leq 0$ on $A^c$ and $V(y) < \inf_A V$;
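Communication classes can be computed mechanically from condition (b) of the relation $x \to y$, via a transitive closure on the directed graph of positive one-step transitions (a sketch on a made-up four-state kernel):

```python
# Communication classes from condition (b): x -> y iff pi^n(x,y) > 0 for
# some n >= 0.  Reachability is computed by transitive closure on the
# directed graph of positive one-step transitions (hypothetical chain).
PI = [[0.5, 0.5, 0.0, 0.0],
      [1.0, 0.0, 0.0, 0.0],
      [0.0, 0.3, 0.3, 0.4],
      [0.0, 0.0, 0.0, 1.0]]
n = len(PI)

reach = [[PI[i][j] > 0 or i == j for j in range(n)] for i in range(n)]
for k in range(n):               # Floyd-Warshall style closure
    for i in range(n):
        for j in range(n):
            reach[i][j] = reach[i][j] or (reach[i][k] and reach[k][j])

classes = []
for i in range(n):
    # the class of i: all j with i -> j and j -> i
    cls = frozenset(j for j in range(n) if reach[i][j] and reach[j][i])
    if cls not in classes:
        classes.append(cls)
print(sorted(sorted(c) for c in classes))
```

For this kernel the classes are $\{0,1\}$ (closed and, being finite and closed, recurrent by Proposition 11), $\{2\}$ and $\{3\}$.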

b) recurrent iff there exists $V \in F_+(E)$ such that $\#\{ LV > 0 \} < \infty$ and $\#\{ V \leq N \} < \infty$ for all $N > 0$;

c) positive recurrent iff there exists $V \in F_+(E)$ such that $\#\{ LV > -1 \} < \infty$.

Proof. Case (a). The process $M_n = V(X_{n \wedge T_A})$ is a non-negative supermartingale. By optional stopping,

$$ V(y) \geq E_y[V(X_{n \wedge T_A})] \geq E_y[V(X_{T_A}) I_{T_A < \infty}] \geq \big( \inf_A V \big)\, P_y(T_A < \infty), $$

so if $V(y) < \inf_A V$ we obtain $P_y(T_A < \infty) < 1$, that is, the chain is transient. Conversely, if the chain is transient, we can take $V(x) = P_x(T_A < +\infty)$.

Case (b). Let $A = \{ x : LV(x) > 0 \}$ and let $B_N = \{ x : V(x) > N \}$, which by assumption has finite complement for all $N$. Then $P_x(T_{B_N} < \infty) = 1$ for all $x \in E$ (if the chain never entered $B_N$ it would visit some state of the finite set $B_N^c$ infinitely often; such a state would be recurrent and, by irreducibility, would lead to $B_N$). By optional stopping, for all $x \in E$,

$$ V(x) \geq E_x[V(X_{T_{B_N} \wedge T_A})] \geq E_x[V(X_{T_{B_N}}) I_{T_{B_N} \leq T_A}] \geq N\, P_x(T_{B_N} \leq T_A) \geq N\, P_x(T_A = +\infty), $$

and taking $N \to \infty$ we obtain $P_x(T_A = +\infty) = 0$, so the chain is recurrent.

Now assume that the chain is recurrent, consider a finite set $A$ and a decreasing sequence of sets $(B_N)_N$ with finite complements such that $\cap_N B_N = \emptyset$, and let $V_N(x) = P_x(T_{B_N} < T_A)$, which is a harmonic function on $E \setminus (A \cup B_N)$ such that $V_N = 1$ on $B_N$ and $V_N = 0$ on $A$. By recurrence we have $V_N(x) \searrow P_x(T_A = +\infty) = 0$ for all $x \in E$, and we can find a sequence $(N_k)_k$ such that $V(x) = \sum_k V_{N_k}(x) < \infty$ for every $x$ (by a diagonal argument). This function satisfies the required conditions: $V(x) \geq k$ on $B_{N_k}$, whose complement is finite, so $\#\{ V \leq N \} < \infty$; and if $x \in B_N$ we have $LV_N(x) \leq 1 - 1 = 0$, so $LV \leq 0$ on $A^c$.

Case (c). Let $A = \{ LV > -1 \}$, which is finite, so that $LV \leq -1$ on $A^c$ and $M_n = V(X_{n \wedge T_A}) + (n \wedge T_A)$ is a non-negative supermartingale. Then

$$ V(x) \geq \lim_n E_x[ V(X_{n \wedge T_A}) + (n \wedge T_A) ] \geq E_x[T_A], $$

which implies that the chain is positive recurrent. Conversely, if the chain is positive recurrent, let $V(x) = E_x[T_A]$ for an arbitrary finite set $A$ and check that $V$ satisfies the assumptions.

3.1 Recurrence and transience of random walks

The simple random walk on $\mathbb{Z}^d$ has generator

$$ Lf(x) = \frac{1}{2d} \sum_{y : y \sim x} [f(y) - f(x)], $$

where $y \sim x \iff |x - y| = 1$. The chain is irreducible. We want to show that this chain is recurrent if and only if $d \leq 2$.
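A minimal sketch of criterion (c) on a reflected biased walk on $\{0, 1, 2, \dots\}$ (the parameters are made up): with $V(x) = x$ the drift $LV$ is constant and negative away from $0$, and a simulation of the return time to $0$ agrees with the stationarity prediction $E_0[T_0^+] = 1/\pi(0)$:

```python
import random

# Sketch of Theorem 12(c) on a reflected biased walk on {0, 1, 2, ...}:
# up with prob. p = 0.3, down with prob. q = 0.7 (at 0: stay with prob. q).
# With V(x) = x we get LV(x) = p - q = -0.4 for x >= 1, so the set
# {LV > -1} = {0} is finite and the chain is positive recurrent.
p = 0.3

def LV(x):
    if x == 0:
        return p * 1 + (1 - p) * 0 - 0          # Pi V(0) - V(0)
    return p * (x + 1) + (1 - p) * (x - 1) - x

assert abs(LV(0) - p) < 1e-12
assert all(abs(LV(x) - (2 * p - 1)) < 1e-12 for x in range(1, 50))

def return_time(rng):
    # sample T_0^+ starting from 0
    x, t = 0, 0
    while True:
        t += 1
        x = x + 1 if rng.random() < p else max(x - 1, 0)
        if x == 0:
            return t

rng = random.Random(1)
est = sum(return_time(rng) for _ in range(40000)) / 40000
print(round(est, 2))   # stationarity predicts E_0[T_0^+] = 1/pi(0) = 7/4
```

Here $\pi(x) = (1 - p/q)(p/q)^x$ is the stationary distribution, so $1/\pi(0) = 7/4$ for $p = 0.3$.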
If $f$ varies slowly then $Lf(x) \simeq \frac{1}{2d} \Delta f(x)$, and if we let $V(x) = |x|^{2\alpha}$ we have

$$ LV(x) \simeq \frac{1}{2d} (2\alpha)(d + 2\alpha - 2)\, |x|^{2\alpha - 2}, $$
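The sign of $LV$ can also be checked directly on the lattice (a numerical sketch for $d = 3$, with the exponent $\alpha = -1/4$ and the test radii chosen ad hoc):

```python
import itertools, math

# Numerical check (d = 3): for V(x) = |x|^(2a) with a = -1/4 the discrete
# generator LV(x) = (1/2d) sum_{y~x} [V(y) - V(x)] is negative away from
# the origin, matching the continuum prediction
# LV(x) ~ (1/2d)(2a)(d + 2a - 2)|x|^(2a-2) < 0, which yields transience
# via Theorem 12(a).
a = -0.25

def V(x):
    return sum(c * c for c in x) ** a      # |x|^(2a)

def LV(x):
    d = len(x)
    s = 0.0
    for i in range(d):
        for e in (1, -1):
            y = list(x)
            y[i] += e
            s += V(y) - V(x)
    return s / (2 * d)

points = [p for p in itertools.product(range(-12, 13), repeat=3)
          if 8 <= math.sqrt(sum(c * c for c in p)) <= 12]
worst = max(LV(p) for p in points)
print(worst < 0)
```

Note that the choice of exponent matters: for $\alpha = -1/2$ (i.e. $V(x) = |x|^{-1}$, harmonic in the continuum for $d = 3$) the lattice correction can flip the sign along the axes, so an exponent with a strict margin is needed.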

so for $d > 2$ we can find $\alpha < 0$ (any $\alpha \in (-(d-2)/2, 0)$) such that $LV \leq 0$ outside a ball $B(0, r_0)$ (taking into account the errors in the above approximation), and since $V$ is decreasing at infinity, for a suitable $y$ outside a large enough ball we can arrange to have $V(y) < \inf_{B(0, r_0)} V$. By Theorem 12(a), the walk is transient for $d \geq 3$.

When $d = 1$ we need to choose $\alpha \in (0, 1/2)$: then $(2\alpha)(d + 2\alpha - 2) < 0$ and $V(x) \to +\infty$ as $|x| \to +\infty$, so the recurrence criterion of Theorem 12(b) applies. For $d = 2$ the right function to look at is of the form $V(x) = (\log |x|^2)^{\alpha}$, for which

$$ \Delta V(x) \simeq 4 \alpha (\alpha - 1)\, |x|^{-2} (\log |x|^2)^{\alpha - 2}, $$

and taking $\alpha \in (0, 1)$ gives a function which satisfies the recurrence criterion.

3.2 Harris recurrence of sets in general state spaces

We consider now the situation where the state space $E$ is a general Polish space (and $\mathcal{E}$ the Borel $\sigma$-algebra). In this case the Foster-Lyapunov conditions are not as tight. However, the existence of certain superharmonic functions still implies Harris recurrence.

Definition 13. A set $A \in \mathcal{E}$ is Harris recurrent if $P_x(T_A^+ < \infty) = 1$ for all $x \in A$. It is positive recurrent if $E_x[T_A^+] < \infty$ for all $x \in A$.

Proposition 14.

a) If there exists a function $V \in F_+(E)$ such that $LV \leq 0$ on $A^c$ and $T_{\{V > c\}} < \infty$ $P_x$-a.s. for all $x \in E$ and $c > 0$, then the set $A$ is Harris recurrent.

b) If there exists a function $V \in F_+(E)$ such that $LV \leq -1$ on $A^c$ and $\Pi V < \infty$ on $A$, then the set $A$ is positive recurrent.

Proof. For (a) the proof goes as in the discrete setting. For (b) we deduce first that $E_x[T_A] \leq V(x)$, and then by a one-step computation,

$$ E_x[T_A^+] = 1 + E_x[ E_{X_1}[T_A] ] \leq 1 + (\Pi V)(x) < \infty, \qquad x \in A. $$

Example 15. (State space model on $\mathbb{R}^d$) Consider the following Markov chain on $\mathbb{R}^d$:

$$ X_{n+1} = X_n + b(X_n) + W_n, $$

where $(W_n)_n$ are i.i.d. with mean zero and covariance $\mathrm{Cov}(W^i, W^j) = \delta_{ij}$. We consider the function $V(x) = |x|^2 / \varepsilon$. Then

$$ \varepsilon (LV)(x) = E[ |x + b(x) + W_1|^2 ] - |x|^2 = 2 \langle x, b(x) \rangle + |b(x)|^2 + d. $$

If the drift is such that $2 \langle x, b(x) \rangle + |b(x)|^2 + d \leq -\delta < 0$ for $|x|$ large, then by choosing $\varepsilon$ small enough we see that the sufficient condition for positive recurrence holds for $A = B(0, r)$ with $r$ sufficiently large.
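A sketch of Example 15 (with a made-up drift $b(x) = -x/2$ in $d = 2$, and, for simplicity, uniform noise instead of unit-covariance noise): the drift expression is checked to be $\leq -1$ outside a ball, and the simulated chain indeed stabilises:

```python
import random

# Sketch of Example 15 with the hypothetical drift b(x) = -x/2 in d = 2.
# Then 2<x,b(x)> + |b(x)|^2 + d = -(3/4)|x|^2 + 2, which is <= -1 once
# |x|^2 >= 4, so the Foster-Lyapunov drift condition holds outside a ball.
d = 2

def drift_expr(x):
    b = [-c / 2 for c in x]
    return 2 * sum(c * bc for c, bc in zip(x, b)) + sum(bc * bc for bc in b) + d

assert drift_expr([2.0, 0.0]) <= -1 and drift_expr([3.0, 3.0]) <= -1

# Simulate the chain; positive recurrence shows up as a stable time average
# of |X_n|^2 (here W_n has uniform(-1, 1) components, variance 1/3 each).
rng = random.Random(7)
x = [10.0, -10.0]
sq = []
for n in range(20000):
    x = [c - c / 2 + rng.uniform(-1, 1) for c in x]
    if n >= 1000:                       # discard the transient
        sq.append(sum(c * c for c in x))
avg = sum(sq) / len(sq)
print(round(avg, 2))   # stationary E|X|^2 = 2*(1/3)/(1 - 1/4) = 8/9
```

With this linear drift the chain is an AR(1)-type recursion $X_{n+1} = X_n/2 + W_n$, so the stationary second moment can be computed in closed form and compared with the time average.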