Notes 15 : UI Martingales

Math 733 - Fall 2013
Lecturer: Sebastien Roch
References: [Wil91, Chapters 13, 14], [Dur10, Sections 5.5, 5.6, 5.7].

1 Uniform Integrability

We give a characterization of $L^1$ convergence. First note:

LEM 15.1 Let $Y \in L^1$. Then $\forall \varepsilon > 0$, $\exists K > 0$ s.t. $E[|Y|; |Y| > K] < \varepsilon$.

Proof: Immediate by (MON) applied to $E[|Y|; |Y| \le K]$.

What we need is for this condition to hold uniformly over the sequence:

DEF 15.2 (Uniform Integrability) A collection $\mathcal{C}$ of RVs on $(\Omega, \mathcal{F}, \mathbb{P})$ is uniformly integrable (UI) if: $\forall \varepsilon > 0$, $\exists K > 0$ s.t. $E[|X|; |X| > K] < \varepsilon$, $\forall X \in \mathcal{C}$.

THM 15.3 (Necessary and Sufficient Condition for $L^1$ Convergence) Let $\{X_n\} \subseteq L^1$ and $X \in L^1$. Then $X_n \to X$ in $L^1$ if and only if: $X_n \to X$ in probability and $\{X_n\}$ is UI.

Before giving the proof, we look at a few examples.

EX 15.4 ($L^1$-boundedness is not sufficient) Let $\mathcal{C}$ be UI and $X \in \mathcal{C}$. Note that
$$E|X| \le E[|X|; |X| > K] + E[|X|; |X| \le K] \le \varepsilon + K < +\infty,$$
so UI implies $L^1$-boundedness. But the converse is not true, by our last example (take $n^2 > K$).
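To see the gap between $L^1$-boundedness and UI numerically, here is a minimal Python sketch (an added illustration; the classic choice $X_n = n \mathbf{1}_{\{U \le 1/n\}}$ with $U$ uniform on $[0,1]$ is an assumption for the demo, not taken from the notes). Each $X_n$ has $E|X_n| = 1$, so the sequence is $L^1$-bounded and converges to $0$ in probability, yet it does not converge to $0$ in $L^1$; correspondingly, $E[|X_n|; |X_n| > K] = 1$ for every $n > K$, so no single $K$ works uniformly and the family is not UI.

```python
import numpy as np

rng = np.random.default_rng(0)
U = rng.uniform(size=10**6)                     # U ~ Uniform[0, 1]
K = 50.0

for n in [10, 100, 1000, 10000]:
    X_n = n * (U <= 1 / n)                      # X_n = n * 1{U <= 1/n}, so E|X_n| = 1 for all n
    l1_norm = np.abs(X_n).mean()                # stays ~1: no L^1 convergence to the limit 0
    ui_tail = np.where(np.abs(X_n) > K, np.abs(X_n), 0.0).mean()  # E[|X_n|; |X_n| > K]
    print(n, round(l1_norm, 3), round(ui_tail, 3))   # tail term is ~1 whenever n > K: not UI
```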

EX 15.5 ($L^p$-bdd RVs) But $L^p$-boundedness works for $p > 1$. Let $\mathcal{C}$ be $L^p$-bounded, say $E|X|^p \le A < +\infty$ for all $X \in \mathcal{C}$, and let $X \in \mathcal{C}$. Then
$$E[|X|; |X| > K] \le E[K^{1-p} |X|^p; |X| > K] \le K^{1-p} A \to 0,$$
as $K \to +\infty$.

EX 15.6 (Dominated RVs) Assume $\exists Y \in L^1$ s.t. $|X| \le Y$, $\forall X \in \mathcal{C}$. Then
$$E[|X|; |X| > K] \le E[Y; |X| > K] \le E[Y; Y > K],$$
and apply the lemma above.

2 Proof of main theorem

Proof: We start with the "if" part. Fix $\varepsilon > 0$. We want to show that for $n$ large enough: $E|X_n - X| \le \varepsilon$. It is natural to truncate at $K$ in order to apply the UI property. Let $\varphi_K(x) = \mathrm{sgn}(x)[|x| \wedge K]$. Then,
$$E|X_n - X| \le E|\varphi_K(X_n) - X_n| + E|\varphi_K(X) - X| + E|\varphi_K(X_n) - \varphi_K(X)|$$
$$\le E[|X_n|; |X_n| > K] + E[|X|; |X| > K] + E|\varphi_K(X_n) - \varphi_K(X)|.$$
The 1st term is $\le \varepsilon/3$ by UI and the 2nd term is $\le \varepsilon/3$ by the lemma above, for $K$ large enough. Check, by case analysis, that $|\varphi_K(x) - \varphi_K(y)| \le |x - y|$, so $\varphi_K(X_n) \to \varphi_K(X)$ in probability. By bounded convergence for convergence in probability, the claim is proved.

LEM 15.7 (Bounded convergence theorem (convergence in probability version)) Let $|X_n| \le K < +\infty$ $\forall n$ and $X_n \to X$ in probability. Then $E|X_n - X| \to 0$.

Proof: (Sketch) By
$$\mathbb{P}[|X| \ge K + m^{-1}] \le \mathbb{P}[|X_n - X| \ge m^{-1}],$$
it follows that $\mathbb{P}[|X| \le K] = 1$. Fix $\varepsilon > 0$. Then
$$E|X_n - X| = E[|X_n - X|; |X_n - X| > \varepsilon/2] + E[|X_n - X|; |X_n - X| \le \varepsilon/2]$$
$$\le 2K \, \mathbb{P}[|X_n - X| > \varepsilon/2] + \varepsilon/2 < \varepsilon,$$
for $n$ large enough.
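The two elementary facts about the truncation $\varphi_K$ used in the proof above are the 1-Lipschitz property $|\varphi_K(x) - \varphi_K(y)| \le |x - y|$ and the tail bound $|\varphi_K(x) - x| \le |x| \mathbf{1}_{\{|x| > K\}}$. A small Python check of both by brute force on random inputs (an added illustration; the normal test points and the value of $K$ are arbitrary):

```python
import numpy as np

def phi_K(x, K):
    """Truncation from the proof of THM 15.3: phi_K(x) = sgn(x) * min(|x|, K)."""
    return np.sign(x) * np.minimum(np.abs(x), K)

rng = np.random.default_rng(1)
x, y = 10 * rng.normal(size=(2, 10**5))
K = 3.0

# 1-Lipschitz: lets convergence in probability pass through the truncation.
assert np.all(np.abs(phi_K(x, K) - phi_K(y, K)) <= np.abs(x - y) + 1e-12)
# Truncation error is supported on {|x| > K} and bounded by |x| there.
assert np.all(np.abs(phi_K(x, K) - x) <= np.abs(x) * (np.abs(x) > K) + 1e-12)
print("both inequalities hold on the random sample")
```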

Proof of the "only if" part. Suppose $X_n \to X$ in $L^1$. We know that $L^1$ convergence implies convergence in probability, so the first claim follows. For the second claim, pick $N$ large enough so that
$$E|X_n - X| \le \varepsilon, \quad \forall n > N.$$
We can choose $K$ large enough so that
$$E[|X_n|; |X_n| > K] < \varepsilon, \quad \forall n \le N.$$
(Because there is only a finite number of them.) So we only need to worry about $n > N$. To use $L^1$ convergence, it is natural to write
$$E[|X_n|; |X_n| > K] \le E[|X_n - X|; |X_n| > K] + E[|X|; |X_n| > K].$$
The first term is $\le \varepsilon$. The issue with the second term is that we cannot apply the lemma above, because the event involves $X_n$ rather than $X$. In fact, a stronger version exists:

LEM 15.8 (Absolute continuity) Let $X \in L^1$. Then $\forall \varepsilon > 0$, $\exists \delta > 0$ s.t. $\mathbb{P}[F] < \delta$ implies $E[|X|; F] < \varepsilon$.

Proof: Argue by contradiction. Suppose there are $\varepsilon > 0$ and events $F_n$ s.t. $\mathbb{P}[F_n] \le 2^{-n}$ and
$$E[|X|; F_n] \ge \varepsilon.$$
By (BC),
$$\mathbb{P}[H] := \mathbb{P}[F_n \text{ i.o.}] = 0.$$
By reverse Fatou (applied to $|X| \mathbf{1}_H = \limsup_n |X| \mathbf{1}_{F_n}$),
$$E[|X|; H] \ge \varepsilon,$$
a contradiction.

To conclude, note that
$$\mathbb{P}[|X_n| > K] \le \frac{E|X_n|}{K} \le \frac{1}{K} \max\Big\{ \sup_{n \le N} E|X_n|, \; \sup_{n > N} \big( E|X| + E|X_n - X| \big) \Big\} < \delta,$$
uniformly in $n$ for $K$ large enough, so that $E[|X|; |X_n| > K] < \varepsilon$ by the absolute continuity lemma. We are done.
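LEM 15.8 can also be seen numerically: even for an unbounded, heavy-tailed $X$ with $E|X| < +\infty$, the contribution $E[|X|; F]$ of a small event vanishes as $\mathbb{P}[F] \to 0$. A short Python sketch (an added illustration; the Pareto-type distribution and the choice of worst-case events $F = \{X > \text{quantile}\}$ are assumptions for the demo):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.pareto(a=2.5, size=10**6) + 1.0        # heavy-tailed but integrable: E|X| < +infinity

# Among events of probability delta, F = {X > (1 - delta)-quantile} maximizes E[X; F];
# the lemma says even this worst case tends to 0 with delta.
for delta in [1e-1, 1e-2, 1e-3, 1e-4]:
    K = np.quantile(X, 1 - delta)
    print(delta, round(np.mean(np.where(X > K, X, 0.0)), 5))   # E[X; F] decreases with delta
```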

3 UI MGs

THM 15.9 (Convergence of UI MGs) Let $M$ be a UI MG. Then $M_n \to M_\infty$, a.s. and in $L^1$. Moreover, $M_n = E[M_\infty \mid \mathcal{F}_n]$, $\forall n$.

Proof: UI implies $L^1$-boundedness, so we have $M_n \to M_\infty$ a.s. By the necessary and sufficient condition, we also have $L^1$ convergence. Now note that for all $r \ge n$ and $F \in \mathcal{F}_n$, we know $E[M_r \mid \mathcal{F}_n] = M_n$, or
$$E[M_r; F] = E[M_n; F],$$
by definition of CE. We can take a limit by $L^1$ convergence. More precisely,
$$|E[M_r; F] - E[M_\infty; F]| \le E[|M_r - M_\infty|; F] \le E|M_r - M_\infty| \to 0,$$
as $r \to \infty$. So, plugging above,
$$E[M_\infty; F] = E[M_n; F],$$
and $E[M_\infty \mid \mathcal{F}_n] = M_n$.

4 Applications I

THM 15.10 (Lévy's upward thm) Let $Z \in L^1$ and define $M_n = E[Z \mid \mathcal{F}_n]$. Then $M$ is a UI MG and
$$M_n \to M_\infty = E[Z \mid \mathcal{F}_\infty],$$
a.s. and in $L^1$.

Proof: $M$ is a MG by (TOWER). We first show it is UI:

LEM 15.11 Let $X \in L^1(\Omega, \mathcal{F}, \mathbb{P})$. Then
$$\{E[X \mid \mathcal{G}] : \mathcal{G} \text{ is a sub-}\sigma\text{-field of } \mathcal{F}\}$$
is UI.

Proof: We use the absolute continuity lemma again. Let $Y = E[X \mid \mathcal{G}] \in \mathcal{G}$. Since $\{|Y| > K\} \in \mathcal{G}$, by (JENSEN)
$$E[|Y|; |Y| > K] = E[|E[X \mid \mathcal{G}]|; |Y| > K] \le E[E[|X| \mid \mathcal{G}]; |Y| > K] = E[|X|; |Y| > K].$$
By Markov and (JENSEN),
$$\mathbb{P}[|Y| > K] \le \frac{E|Y|}{K} \le \frac{E|X|}{K} < \delta,$$
for $K$ large enough (uniformly in $\mathcal{G}$), so that $E[|X|; |Y| > K] < \varepsilon$ by LEM 15.8. And we are done.

In particular, we have convergence a.s. and in $L^1$ to some $M_\infty \in \mathcal{F}_\infty$. Let $Y = E[Z \mid \mathcal{F}_\infty] \in \mathcal{F}_\infty$. By dividing into negative and positive parts, we may assume $Z \ge 0$. We want to show, for $F \in \mathcal{F}_\infty$,
$$E[Z; F] = E[M_\infty; F].$$
By the Uniqueness Lemma, it suffices to prove the equality for all $F \in \bigcup_n \mathcal{F}_n$. If $F \in \mathcal{F}_n \subseteq \mathcal{F}_\infty$, then by (TOWER)
$$E[Z; F] = E[Y; F] = E[M_n; F] = E[M_\infty; F].$$
The first equality is by definition of $Y$; the second equality is by definition of $M_n$; the third equality is from our main theorem.

THM 15.12 (Lévy's 0-1 law) Let $A \in \mathcal{F}_\infty$. Then
$$\mathbb{P}[A \mid \mathcal{F}_n] \to \mathbf{1}_A.$$

Proof: Immediate.

COR 15.13 (Kolmogorov's 0-1 law) Let $X_1, X_2, \ldots$ be iid RVs. Recall that the tail $\sigma$-field is $\mathcal{T} = \bigcap_n \mathcal{T}_n = \bigcap_n \sigma(X_{n+1}, X_{n+2}, \ldots)$. If $A \in \mathcal{T}$ then $\mathbb{P}[A] \in \{0, 1\}$.

Proof: Since $A \in \mathcal{T}_n$ is independent of $\mathcal{F}_n$,
$$\mathbb{P}[A \mid \mathcal{F}_n] = \mathbb{P}[A], \quad \forall n.$$
By Lévy's law, $\mathbb{P}[A] = \mathbf{1}_A \in \{0, 1\}$.
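A concrete instance of Lévy's upward theorem (THM 15.10), added here as an illustration (the binary-digit construction is an assumption for the demo): take iid fair bits $B_1, B_2, \ldots$, let $\mathcal{F}_n = \sigma(B_1, \ldots, B_n)$ and $Z = \sum_k B_k 2^{-k}$. Then $M_n = E[Z \mid \mathcal{F}_n] = \sum_{k \le n} B_k 2^{-k} + 2^{-(n+1)}$, since the unseen bits average to $1/2$, and $M_n \to Z$ a.s. and in $L^1$.

```python
import numpy as np

rng = np.random.default_rng(3)
n_max, n_paths = 30, 1000
B = rng.integers(0, 2, size=(n_paths, n_max))          # iid fair bits B_1, ..., B_{n_max}
w = 2.0 ** -np.arange(1, n_max + 1)
Z = B @ w                                              # Z = sum_k B_k 2^{-k} (truncated at n_max)

for n in [1, 5, 10, 20]:
    M_n = B[:, :n] @ w[:n] + 2.0 ** -(n + 1)           # M_n = E[Z | F_n]
    print(n, np.abs(M_n - Z).mean())                   # E|M_n - Z| is of order 2^{-n}: L^1 convergence
```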

5 Applications II

THM 15.14 (Lévy's Downward Thm) Let $Z \in L^1(\Omega, \mathcal{F}, \mathbb{P})$ and let $\{\mathcal{G}_n\}_{n \ge 1}$ be a collection of $\sigma$-fields s.t.
$$\mathcal{G}_\infty = \bigcap_k \mathcal{G}_k \subseteq \cdots \subseteq \mathcal{G}_n \subseteq \cdots \subseteq \mathcal{G}_1 \subseteq \mathcal{F}.$$
Define $M_n = E[Z \mid \mathcal{G}_n]$. Then
$$M_n \to M_\infty = E[Z \mid \mathcal{G}_\infty],$$
a.s. and in $L^1$.

Proof: We apply the same argument as in the Martingale Convergence Thm. Let $\alpha < \beta \in \mathbb{Q}$ and
$$\Lambda_{\alpha,\beta} = \{\omega : \liminf_n M_n(\omega) < \alpha < \beta < \limsup_n M_n(\omega)\}.$$
Note that
$$\Lambda := \{\omega : M_n(\omega) \text{ does not converge}\} = \{\omega : \liminf_n M_n < \limsup_n M_n\} = \bigcup_{\alpha < \beta \in \mathbb{Q}} \Lambda_{\alpha,\beta}.$$
Let $U_N[\alpha, \beta]$ be the number of upcrossings of $[\alpha, \beta]$ between time $N$ and $1$. Then, by the Upcrossing Lemma applied to the MG $M_N, \ldots, M_1$,
$$(\beta - \alpha) \, E U_N[\alpha, \beta] \le |\alpha| + E|M_1| \le |\alpha| + E|Z|.$$
By (MON),
$$U_N[\alpha, \beta] \uparrow U_\infty[\alpha, \beta],$$
and
$$(\beta - \alpha) \, E U_\infty[\alpha, \beta] \le |\alpha| + E|Z| < +\infty,$$
so that
$$\mathbb{P}[U_\infty[\alpha, \beta] = \infty] = 0.$$
Since
$$\Lambda_{\alpha,\beta} \subseteq \{U_\infty[\alpha, \beta] = \infty\},$$
we have $\mathbb{P}[\Lambda_{\alpha,\beta}] = 0$. By countability, $\mathbb{P}[\Lambda] = 0$. Therefore we have convergence a.s.

By the lemma above (LEM 15.11), $M$ is UI and hence we have $L^1$ convergence as well. Finally, for all $G \in \mathcal{G}_\infty \subseteq \mathcal{G}_n$,
$$E[Z; G] = E[M_n; G].$$
Take the limit $n \to +\infty$ and use $L^1$ convergence.

5.1 Law of large numbers

An application:

THM 15.15 (Strong Law; Martingale Proof) Let $X_1, X_2, \ldots$ be iid RVs with $E[X_1] = \mu$ and $E|X_1| < +\infty$. Let $S_n = \sum_{i \le n} X_i$. Then
$$n^{-1} S_n \to \mu,$$
a.s. and in $L^1$.

Proof: Let
$$\mathcal{G}_n = \sigma(S_n, S_{n+1}, S_{n+2}, \ldots) = \sigma(S_n, X_{n+1}, X_{n+2}, \ldots),$$
and note that, for $1 \le i \le n$,
$$E[X_1 \mid \mathcal{G}_n] = E[X_1 \mid S_n] = E[X_i \mid S_n],$$
by symmetry, so that, averaging over $i$,
$$E[X_1 \mid \mathcal{G}_n] = E[n^{-1} S_n \mid S_n] = n^{-1} S_n.$$
By Lévy's Downward Thm,
$$n^{-1} S_n \to E[X_1 \mid \mathcal{G}_\infty],$$
a.s. and in $L^1$. But the limit is measurable w.r.t. the tail $\sigma$-field, so it must be trivial (a.s. constant) by Kolmogorov's 0-1 law; by the $L^1$ convergence, the constant is $\mu$, i.e. $E[X_1 \mid \mathcal{G}_\infty] = \mu$.
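A quick numeric check of the key symmetry step $E[X_1 \mid S_n] = n^{-1} S_n$, and of the conclusion $n^{-1} S_n \to \mu$ (an added illustration; the Exp(1) increments, the small $n = 5$, and the quantile binning are arbitrary choices for the demo):

```python
import numpy as np

rng = np.random.default_rng(4)
n, n_samples = 5, 10**6
X = rng.exponential(scale=1.0, size=(n_samples, n))    # iid Exp(1), so mu = 1
S = X.sum(axis=1)

# Conditioning on S_n approximately, via quantile bins: E[X_1 | S_n] should match S_n / n.
edges = np.quantile(S, np.linspace(0, 1, 11))
bins = np.digitize(S, edges[1:-1])
for j in range(10):
    in_bin = bins == j
    print(round(X[in_bin, 0].mean(), 3), round((S[in_bin] / n).mean(), 3))  # two columns agree

# The strong law itself: the average of a single long path is close to mu = 1.
path = rng.exponential(scale=1.0, size=10**6)
print(path.mean())
```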

5.2 Hewitt-Savage

DEF 15.16 Let $X_1, X_2, \ldots$ be iid RVs. Let $\mathcal{E}_n$ be the $\sigma$-field generated by the events invariant under permutations of the $X$'s that leave $X_{n+1}, X_{n+2}, \ldots$ unchanged. The exchangeable $\sigma$-field is $\mathcal{E} = \bigcap_m \mathcal{E}_m$.

THM 15.17 (Hewitt-Savage 0-1 law) Let $X_1, X_2, \ldots$ be iid RVs. If $A \in \mathcal{E}$ then $\mathbb{P}[A] \in \{0, 1\}$.

Proof: The idea of the proof is to show that $A$ is independent of itself. Indeed, we then have
$$0 = \mathbb{P}[A] - \mathbb{P}[A \cap A] = \mathbb{P}[A] - \mathbb{P}[A]\mathbb{P}[A] = \mathbb{P}[A](1 - \mathbb{P}[A]).$$
Since $A \in \mathcal{E}$ and $A \in \mathcal{F}_\infty$, it suffices to show that $\mathcal{E}$ is independent of $\mathcal{F}_n$ for every $n$ (by the $\pi$-$\lambda$ theorem). WTS: for every bounded $\varphi$ and every $B \in \mathcal{E}$,
$$E[\varphi(X_1, \ldots, X_k); B] = E[\varphi(X_1, \ldots, X_k)] \, \mathbb{P}[B] = E[E[\varphi(X_1, \ldots, X_k)]; B],$$
or equivalently
$$Y := E[\varphi(X_1, \ldots, X_k) \mid \mathcal{E}] = E[\varphi(X_1, \ldots, X_k)].$$
It suffices to show that $Y$ is independent of $\mathcal{F}_k$. Indeed, by the $L^2$ characterization of conditional expectation and independence,
$$0 = E[(\varphi(X_1, \ldots, X_k) - Y) Y] = E[\varphi(X_1, \ldots, X_k)] E[Y] - E[Y^2] = -\mathrm{Var}[Y],$$
using $E[Y] = E[\varphi(X_1, \ldots, X_k)]$, so $Y$ is a.s. constant.

1. Since $\varphi$ is bounded, it is integrable, and Lévy's Downward Thm implies
$$E[\varphi(X_1, \ldots, X_k) \mid \mathcal{E}_n] \to E[\varphi(X_1, \ldots, X_k) \mid \mathcal{E}].$$

2. We make $\varphi$ exchangeable by averaging over all configurations and taking a limit as $n \to +\infty$. Define
$$A_n(\varphi) = \frac{1}{(n)_k} \sum_{1 \le i_1, \ldots, i_k \le n \text{ distinct}} \varphi(X_{i_1}, \ldots, X_{i_k}),$$
where $(n)_k = n(n-1)\cdots(n-k+1)$. Note by symmetry
$$A_n(\varphi) = E[A_n(\varphi) \mid \mathcal{E}_n] = E[\varphi(X_1, \ldots, X_k) \mid \mathcal{E}_n] \to E[\varphi(X_1, \ldots, X_k) \mid \mathcal{E}].$$

3. The reason we did this is that now the first $k$ $X$'s have little influence on this quantity, and therefore the limit does not depend on them. Indeed, the terms involving $X_1$ satisfy
$$\frac{1}{(n)_k} \Big| \sum_{\text{distinct } i_1, \ldots, i_k :\, 1 \in \{i_1, \ldots, i_k\}} \varphi(X_{i_1}, \ldots, X_{i_k}) \Big| \le \frac{k (n-1)_{k-1}}{(n)_k} \sup|\varphi| = \frac{k}{n} \sup|\varphi| \to 0,$$
so that the limit of $A_n(\varphi)$ does not depend on $X_1$, i.e.
$$E[\varphi(X_1, \ldots, X_k) \mid \mathcal{E}] \in \sigma(X_2, X_3, \ldots),$$
and by induction
$$Y = E[\varphi(X_1, \ldots, X_k) \mid \mathcal{E}] \in \sigma(X_{k+1}, X_{k+2}, \ldots).$$
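The mechanism in steps 2 and 3 can also be seen numerically: the exchangeable average $A_n(\varphi)$ concentrates around $E[\varphi(X_1, \ldots, X_k)]$, and changing $X_1$ moves it by at most $(k/n) \sup|\varphi|$. A small Python sketch for $k = 2$ (an added illustration; $\varphi(x_1, x_2) = \mathbf{1}_{\{x_1 + x_2 > 0\}}$ and the standard normal inputs are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(5)

def A_n(x):
    """Exchangeable average A_n(phi) over ordered pairs i != j, with phi(a, b) = 1{a + b > 0}."""
    pair_sums = x[:, None] + x[None, :]
    off_diag = ~np.eye(len(x), dtype=bool)
    return (pair_sums[off_diag] > 0).mean()

for n in [10, 100, 1000]:
    x = rng.normal(size=n)
    x_mod = x.copy()
    x_mod[0] = 100.0                        # drastically change X_1
    # A_n(phi) ~ E[phi] = 1/2, and the change from modifying X_1 is at most k/n = 2/n.
    print(n, round(A_n(x), 3), round(abs(A_n(x_mod) - A_n(x)), 4), "<=", round(2 / n, 4))
```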

6 Optional Sampling

6.1 Review: Stopping times

Recall:

DEF 15.18 A random variable $T : \Omega \to \bar{\mathbb{Z}}_+ = \{0, 1, \ldots, +\infty\}$ is called a stopping time if
$$\{T = n\} \in \mathcal{F}_n, \quad \forall n \in \bar{\mathbb{Z}}_+.$$

EX 15.19 Let $\{A_n\}$ be an adapted process and $B \in \mathcal{B}$. Then
$$T = \inf\{n \ge 0 : A_n \in B\}$$
is a stopping time.

THM 15.20 Let $\{M_n\}$ be a MG and $T$ be a stopping time. Then $M_T$ is integrable and
$$E[M_T] = E[M_0],$$
if one of the following holds:

1. $T$ is bounded.
2. $M$ is bounded and $T$ is a.s. finite.
3. $E[T] < +\infty$ and $M$ has bounded increments.
4. $M$ is UI. (This one is new. The proof follows from the Optional Sampling Theorem below.)

DEF 15.21 ($\mathcal{F}_T$) Let $T$ be a stopping time. Denote by $\mathcal{F}_T$ the set of all events $F$ such that $\forall n \in \bar{\mathbb{Z}}_+$,
$$F \cap \{T = n\} \in \mathcal{F}_n.$$

6.2 More on the $\sigma$-field $\mathcal{F}_T$

The following lemmas help clarify the definition of $\mathcal{F}_T$:

LEM 15.22 $\mathcal{F}_T = \mathcal{F}_n$ if $T \equiv n$, $\mathcal{F}_T = \mathcal{F}_\infty$ if $T \equiv \infty$, and $\mathcal{F}_T \subseteq \mathcal{F}_\infty$ for any stopping time $T$.

Proof: In the first case, note that $F \cap \{T = k\}$ is empty if $k \ne n$ and is $F$ if $k = n$. So if $F \in \mathcal{F}_T$ then $F = F \cap \{T = n\} \in \mathcal{F}_n$; and if $F \in \mathcal{F}_n$ then $F \cap \{T = n\} = F \in \mathcal{F}_n$ while $F \cap \{T = k\} = \emptyset \in \mathcal{F}_k$ for $k \ne n$, so we have proved both inclusions. This works also for $n = \infty$. For the third claim, note
$$F = \bigcup_{n \in \bar{\mathbb{Z}}_+} F \cap \{T = n\} \in \mathcal{F}_\infty.$$

LEM 15.23 If $X$ is adapted and $T$ is a stopping time then $X_T \in \mathcal{F}_T$ (where we assume that $X_\infty \in \mathcal{F}_\infty$ is defined, e.g., $X_\infty = \liminf_n X_n$).

Proof: For $B \in \mathcal{B}$,
$$\{X_T \in B\} \cap \{T = n\} = \{X_n \in B\} \cap \{T = n\} \in \mathcal{F}_n.$$

LEM 15.24 If $S, T$ are stopping times then $\mathcal{F}_{S \wedge T} \subseteq \mathcal{F}_T$.

Proof: Let $F \in \mathcal{F}_{S \wedge T}$. Note that
$$F \cap \{T = n\} = \bigcup_{k \le n} \big[ (F \cap \{S \wedge T = k\}) \cap \{T = n\} \big] \in \mathcal{F}_n.$$
Indeed, the expression in parentheses is in $\mathcal{F}_k \subseteq \mathcal{F}_n$ and $\{T = n\} \in \mathcal{F}_n$.

6.3 Optional Sampling Theorem (OST)

THM 15.25 (Optional Sampling Theorem) If $M$ is a UI MG and $S, T$ are stopping times with $S \le T$ a.s., then $E|M_T| < +\infty$ and
$$E[M_T \mid \mathcal{F}_S] = M_S.$$

Proof: Since $M$ is UI, $\exists M_\infty \in L^1$ s.t. $M_n \to M_\infty$ a.s. and in $L^1$. We prove a more general claim:

LEM 15.26 $E[M_\infty \mid \mathcal{F}_T] = M_T$.

Indeed, since $S \le T$ gives $\mathcal{F}_S = \mathcal{F}_{S \wedge T} \subseteq \mathcal{F}_T$ by LEM 15.24, we then get the theorem by (TOWER) and (JENSEN).

Proof: (Lemma) Wlog we assume $M_\infty \ge 0$, so that $M_n = E[M_\infty \mid \mathcal{F}_n] \ge 0$ $\forall n$. Let $F \in \mathcal{F}_T$. Then (trivially)
$$E[M_\infty; F \cap \{T = \infty\}] = E[M_T; F \cap \{T = \infty\}],$$

so it STS
$$E[M_\infty; F \cap \{T < +\infty\}] = E[M_T; F \cap \{T < +\infty\}].$$
In fact, by (MON), STS
$$E[M_\infty; F \cap \{T \le k\}] = E[M_T; F \cap \{T \le k\}] = E[M_{T \wedge k}; F \cap \{T \le k\}], \quad \forall k.$$
To conclude we make two observations:

1. $F \cap \{T \le k\} \in \mathcal{F}_{T \wedge k}$. Indeed, if $n \le k$,
$$F \cap \{T \le k\} \cap \{T \wedge k = n\} = F \cap \{T = n\} \in \mathcal{F}_n,$$
and if $n > k$ the intersection is $\emptyset \in \mathcal{F}_n$.

2. $E[M_\infty \mid \mathcal{F}_{T \wedge k}] = M_{T \wedge k}$. Since $E[M_\infty \mid \mathcal{F}_k] = M_k$, STS $E[M_k \mid \mathcal{F}_{T \wedge k}] = M_{T \wedge k}$. But note that if $G \in \mathcal{F}_{T \wedge k}$,
$$E[M_k; G] = \sum_{l \le k} E[M_k; G \cap \{T \wedge k = l\}] = \sum_{l \le k} E[M_l; G \cap \{T \wedge k = l\}] = E[M_{T \wedge k}; G],$$
since $G \cap \{T \wedge k = l\} \in \mathcal{F}_l$.

6.4 Example: Biased RW

DEF 15.27 The asymmetric simple RW with parameter $1/2 < p < 1$ is the process $\{S_n\}_{n \ge 0}$ with $S_0 = 0$ and $S_n = \sum_{k \le n} X_k$, where the $X_k$'s are iid in $\{-1, +1\}$ s.t. $\mathbb{P}[X_1 = 1] = p$. Let $q = 1 - p$. Let
$$\phi(x) = (q/p)^x \quad \text{and} \quad \psi_n(x) = x - (p - q) n.$$

THM 15.28 Let $\{S_n\}$ be as above. Let $a < 0 < b$. Define $T_x = \inf\{n \ge 0 : S_n = x\}$. Then:

1. We have
$$\mathbb{P}[T_a < T_b] = \frac{\phi(b) - \phi(0)}{\phi(b) - \phi(a)}.$$
In particular, $\mathbb{P}[T_a < +\infty] = 1/\phi(a)$ and $\mathbb{P}[T_b < +\infty] = 1$.

2. We have
$$E[T_b] = \frac{b}{2p - 1}.$$
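Before the proof, here is a quick Monte Carlo check of both formulas in THM 15.28 (an added illustration; the parameters $p = 0.7$, $a = -3$, $b = 5$ and the finite simulation horizon are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(6)
p, a, b = 0.7, -3, 5
q = 1 - p

def phi(x):
    return (q / p) ** x

n_paths, n_steps = 10000, 400                   # horizon long enough that T_b <= n_steps w.h.p.
steps = np.where(rng.random((n_paths, n_steps)) < p, 1, -1)
S = steps.cumsum(axis=1)                        # S[:, t-1] is the walk at time t

hit_a = (S <= a).any(axis=1)
hit_b = (S >= b).any(axis=1)
T_a = np.where(hit_a, (S <= a).argmax(axis=1) + 1.0, np.inf)
T_b = np.where(hit_b, (S >= b).argmax(axis=1) + 1.0, np.inf)

print(np.mean(T_a < T_b), (phi(b) - phi(0)) / (phi(b) - phi(a)))   # P[T_a < T_b]: simulation vs formula
print(T_b[np.isfinite(T_b)].mean(), b / (2 * p - 1))               # E[T_b]: simulation vs formula
```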

Proof: There are two MGs here:
$$E[\phi(S_n) \mid \mathcal{F}_{n-1}] = p (q/p)^{S_{n-1} + 1} + q (q/p)^{S_{n-1} - 1} = \phi(S_{n-1}),$$
and
$$E[\psi_n(S_n) \mid \mathcal{F}_{n-1}] = p [S_{n-1} + 1 - (p - q) n] + q [S_{n-1} - 1 - (p - q) n] = \psi_{n-1}(S_{n-1}).$$
Let $N = T_a \wedge T_b$. Now note that $\phi(S_{N \wedge n})$ is a bounded MG, and therefore, applying the MG property at time $n$ and taking limits as $n \to \infty$ (using (DOM)),
$$\phi(0) = E[\phi(S_N)] = \mathbb{P}[T_a < T_b] \, \phi(a) + \mathbb{P}[T_a > T_b] \, \phi(b),$$
where we need to prove that $N < +\infty$ a.s. Indeed, since $(b - a)$ consecutive $+1$-steps always take us out of $(a, b)$,
$$\mathbb{P}[N > n(b - a)] \le (1 - p^{b - a})^n,$$
so that
$$E[N] = \sum_{k \ge 0} \mathbb{P}[N > k] \le \sum_{n \ge 0} (b - a)(1 - p^{b - a})^n < +\infty.$$
In particular $N < +\infty$ a.s. Rearranging the formula above gives the first result. (For the second part of the first result, take $b \to +\infty$, respectively $a \to -\infty$, and use monotonicity; in particular $T_b < +\infty$ a.s.)

For the second result, note that $T_b \wedge n$ is bounded, so that
$$0 = E[S_{T_b \wedge n} - (p - q)(T_b \wedge n)].$$
By (MON), $E[T_b \wedge n] \uparrow E[T_b]$. Finally, using
$$\mathbb{P}[\inf_n S_n \le a] = \mathbb{P}[T_a < +\infty] = 1/\phi(a),$$
and the fact that $\inf_n S_n \le 0$, we get $E[-\inf_n S_n] < +\infty$. Hence, we can use (DOM) with
$$|S_{T_b \wedge n}| \le \max\{b, -\inf_n S_n\},$$
which gives $(p - q) E[T_b] = \lim_n E[S_{T_b \wedge n}] = E[S_{T_b}] = b$, that is, $E[T_b] = b/(2p - 1)$.

References

[Dur10] Rick Durrett. Probability: Theory and Examples. Cambridge Series in Statistical and Probabilistic Mathematics. Cambridge University Press, Cambridge, 2010.

[Wil91] David Williams. Probability with Martingales. Cambridge Mathematical Textbooks. Cambridge University Press, Cambridge, 1991.