Notes 18 : Optional Sampling Theorem

Math 733-734: Theory of Probability
Lecturer: Sebastien Roch
References: [Wil91, Chapter 14], [Dur10, Section 5.7].

Recall:

DEF 18.1 (Uniform Integrability) A collection $\mathcal{C}$ of RVs on $(\Omega, \mathcal{F}, \mathbb{P})$ is uniformly integrable (UI) if: $\forall \varepsilon > 0$, $\exists K > 0$ s.t.
$$E[|X|; |X| > K] < \varepsilon, \quad \forall X \in \mathcal{C}.$$

THM 18.2 (Necessary and Sufficient Condition for $L^1$ Convergence) Let $\{X_n\} \subseteq L^1$ and $X \in L^1$. Then $X_n \to X$ in $L^1$ if and only if the following two conditions hold:
- $X_n \to X$ in probability
- $\{X_n\}$ is UI

THM 18.3 (Convergence of UI MGs) Let $\{M_n\}$ be a UI MG. Then $M_n \to M_\infty$ a.s. and in $L^1$. Moreover, $M_\infty \in \mathcal{F}_\infty = \sigma(\cup_n \mathcal{F}_n)$ and
$$M_n = E[M_\infty \mid \mathcal{F}_n], \quad \forall n.$$

THM 18.4 (Lévy's upward theorem) Let $Z \in L^1$ and define $M_n = E[Z \mid \mathcal{F}_n]$. Then $\{M_n\}$ is a UI MG and
$$M_n \to M_\infty = E[Z \mid \mathcal{F}_\infty],$$
a.s. and in $L^1$.
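
To make DEF 18.1 concrete, here is a small numerical illustration (a sketch, not part of the original notes; the two families are my own standard choices): with $U$ uniform on $[0,1]$, the family $X_n = U/n$ is UI (it is dominated by the integrable RV $U$), while $X_n = n \cdot 1_{\{U \leq 1/n\}}$ is not, since $E[|X_n|; |X_n| > K] = 1$ whenever $n > K$.

```python
# Sketch: checking sup_n E[|X_n|; |X_n| > K] for two families of RVs, U ~ Uniform[0,1].
# Family A: X_n = U / n            (UI: dominated by the integrable RV U).
# Family B: X_n = n * 1{U <= 1/n}  (not UI: the tail contribution never vanishes).

def tail_expectation_A(n, K):
    # E[X_n; X_n > K] with X_n = U/n: nonzero only if K < 1/n, and then it equals
    # (1/n) * integral_{nK}^{1} u du = (1 - (nK)^2) / (2n).
    if K >= 1.0 / n:
        return 0.0
    return (1.0 - (n * K) ** 2) / (2.0 * n)

def tail_expectation_B(n, K):
    # E[X_n; X_n > K] with X_n = n * 1{U <= 1/n}: equals n * P[U <= 1/n] = 1
    # whenever n > K, and 0 otherwise.
    return 1.0 if n > K else 0.0

for K in [0.5, 0.9, 0.99, 2.0]:
    sup_A = max(tail_expectation_A(n, K) for n in range(1, 10_001))
    sup_B = max(tail_expectation_B(n, K) for n in range(1, 10_001))
    print(f"K={K:5.2f}  sup_n E[X_n; X_n>K]:  family A = {sup_A:.5f}   family B = {sup_B:.1f}")
# Family A's supremum tends to 0 as K grows (UI); family B's stays at 1 (not UI).
```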

1 Optional Sampling Theorem

1.1 Review: Stopping times

Recall:

DEF 18.5 A random variable $T : \Omega \to \bar{\mathbb{Z}}_+ = \{0, 1, \ldots, +\infty\}$ is called a stopping time if
$$\{T = n\} \in \mathcal{F}_n, \quad \forall n \in \bar{\mathbb{Z}}_+.$$

EX 18.6 Let $\{A_n\}$ be an adapted process and $B \in \mathcal{B}$. Then
$$T = \inf\{n \geq 0 : A_n \in B\}$$
is a stopping time.

LEM 18.7 (Stopping Time Lemma) Let $\{M_n\}$ be a MG and $T$ be a stopping time. Then the stopped process $\{M_{T \wedge n}\}$ is a MG and in particular $E[M_{T \wedge n}] = E[M_0]$.

THM 18.8 Let $\{M_n\}$ be a MG and $T$ be a stopping time. Then $M_T \in L^1$ and
$$E[M_T] = E[M_0]$$
if any of the following conditions holds:
1. $T$ is bounded
2. $\{M_n\}$ is bounded and $T$ is a.s. finite
3. $E[T] < +\infty$ and $\{M_n\}$ has bounded increments
4. $\{M_n\}$ is UI. (This one is new. The proof follows from the Optional Sampling Theorem below.)

Proof: From the Stopping Time Lemma, we have
$$E[M_{T \wedge n} - M_0] = 0. \quad (*)$$
1. Take $n = N$ in $(*)$, where $T \leq N$ a.s.
2. Take $n$ to $+\infty$ and use (DOM).

3. Note that
$$|M_{T \wedge n} - M_0| = \Big|\sum_{i \leq T \wedge n} (M_i - M_{i-1})\Big| \leq KT,$$
where $|M_n - M_{n-1}| \leq K$ a.s. Use (DOM).

DEF 18.9 ($\mathcal{F}_T$) Let $T$ be a stopping time. Denote by $\mathcal{F}_T$ the set of all events $F$ such that $\forall n \in \bar{\mathbb{Z}}_+$,
$$F \cap \{T = n\} \in \mathcal{F}_n.$$

1.2 More on the σ-field $\mathcal{F}_T$

The following lemmas help clarify the definition of $\mathcal{F}_T$:

LEM 18.10 $\mathcal{F}_T = \mathcal{F}_n$ if $T \equiv n$, $\mathcal{F}_T = \mathcal{F}_\infty$ if $T \equiv \infty$, and $\mathcal{F}_T \subseteq \mathcal{F}_\infty$ for any $T$.

Proof: In the first case, note $F \cap \{T = k\}$ is empty if $k \neq n$ and is $F$ if $k = n$. So if $F \in \mathcal{F}_T$ then $F = F \cap \{T = n\} \in \mathcal{F}_n$; and if $F \in \mathcal{F}_n$ then $F \cap \{T = n\} = F \in \mathcal{F}_n$, while $F \cap \{T = k\} = \emptyset \in \mathcal{F}_k$ for $k \neq n$. So we have proved both inclusions. This works also for $n = \infty$. For the third claim note $F = \cup_{n \in \bar{\mathbb{Z}}_+} F \cap \{T = n\} \in \mathcal{F}_\infty$.

LEM 18.11 If $\{X_n\}$ is adapted and $T$ is a stopping time then $X_T \in \mathcal{F}_T$ (where we assume that $X_\infty \in \mathcal{F}_\infty$, e.g., $X_\infty = \liminf_n X_n$).

Proof: For $B \in \mathcal{B}$,
$$\{X_T \in B\} \cap \{T = n\} = \{X_n \in B\} \cap \{T = n\} \in \mathcal{F}_n.$$

LEM 18.12 If $S, T$ are stopping times, then $S \wedge T$ is a stopping time and $\mathcal{F}_{S \wedge T} \subseteq \mathcal{F}_T$.

Proof: We first show that $S \wedge T$ is a stopping time. Note that
$$\{S \wedge T = k\} = [\{S = k\} \cap \{T \geq k\}] \cup [\{S \geq k\} \cap \{T = k\}] \in \mathcal{F}_k,$$
since all events above are in $\mathcal{F}_k$ by the fact that $S$ and $T$ are themselves stopping times. For the second claim, let $F \in \mathcal{F}_{S \wedge T}$. Note that
$$F \cap \{T = n\} = \cup_{k \leq n} \big[(F \cap \{S \wedge T = k\}) \cap \{T = n\}\big] \in \mathcal{F}_n.$$
Indeed, the expression in parentheses is in $\mathcal{F}_k \subseteq \mathcal{F}_n$ and $\{T = n\} \in \mathcal{F}_n$.

1.3 Optional Sampling Theorem (OST)

We show that the MG property extends to stopping times under UI MGs.

THM 18.13 (Optional Sampling Theorem) If $\{M_n\}$ is a UI MG and $S, T$ are stopping times with $S \leq T$ a.s., then $E|M_T| < +\infty$ and
$$E[M_T \mid \mathcal{F}_S] = M_S.$$

Proof: Since $\{M_n\}$ is UI, $\exists M_\infty \in L^1$ s.t. $M_n \to M_\infty$ a.s. and in $L^1$. We prove a more general claim:

LEM 18.14 $E[M_\infty \mid \mathcal{F}_T] = M_T$.

Indeed, we then get the theorem by (TOWER) (and (JENSEN) for the integrability claim).

Proof: (of the lemma) We divide $M_\infty = M_\infty^+ - M_\infty^- \equiv X_\infty - Y_\infty$ into positive and negative parts and write
$$M_n = E[M_\infty \mid \mathcal{F}_n] = E[X_\infty \mid \mathcal{F}_n] - E[Y_\infty \mid \mathcal{F}_n] \equiv X_n - Y_n,$$
by linearity. We show that $E[X_\infty \mid \mathcal{F}_T] = X_T$. The same argument holds for $\{Y_n\}$, which then implies $E[M_\infty \mid \mathcal{F}_T] = X_T - Y_T = M_T$, as claimed.

We have of course that $X_\infty \geq 0$, and hence $X_n = E[X_\infty \mid \mathcal{F}_n] \geq 0$ $\forall n$. Let $F \in \mathcal{F}_T$. Then
$$E[X_\infty; F \cap \{T = \infty\}] = E[X_T; F \cap \{T = \infty\}],$$
since $X_n = E[X_\infty \mid \mathcal{F}_n] \to X_\infty$ a.s. by Lévy's Upward Theorem. Therefore it suffices to show
$$E[X_\infty; F \cap \{T < +\infty\}] = E[X_T; F \cap \{T < +\infty\}].$$
In fact, by (MON), it suffices to show
$$\sum_{i \geq 0} E[X_\infty; F \cap \{T = i\}] = \sum_{i \geq 0} E[X_T; F \cap \{T = i\}].$$
But note that $F \cap \{T = i\} \in \mathcal{F}_i$, so that
$$E[X_T; F \cap \{T = i\}] = E[X_i; F \cap \{T = i\}] = E[X_\infty; F \cap \{T = i\}],$$
since $X_i = E[X_\infty \mid \mathcal{F}_i]$. That concludes the proof of the stronger claim.
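
Before moving on, a quick Monte Carlo sanity check of LEM 18.7 and THM 18.8 (condition 1), which also agrees with the OST since bounded stopping times satisfy $E[M_T] = E[M_S] = E[M_0]$. This is only a sketch, and the particular martingale and stopping rules are illustrative choices, not from the notes: the MG is the fraction of red balls in a Pólya urn started with one red and one blue ball, $T$ is the first time the fraction reaches $3/4$ (capped, hence bounded), and $S = T \wedge 10$. Both sample means should be close to $E[M_0] = 1/2$.

```python
import random

# Sketch: Monte Carlo check of E[M_S] = E[M_T] = E[M_0] for bounded stopping
# times S <= T (LEM 18.7 / THM 18.8, condition 1).  The martingale used here is
# the fraction of red balls in a Polya urn (start: 1 red, 1 blue; each drawn ball
# is returned together with one extra ball of the same color), a bounded MG with
# M_0 = 1/2.  The urn and the stopping rules are illustrative choices.

random.seed(0)
N = 200                      # cap making the stopping times bounded
REPS = 20_000

sum_MS = sum_MT = 0.0
for _ in range(REPS):
    red, total = 1, 2
    M_at_S = None
    for n in range(1, N + 1):
        if random.random() < red / total:
            red += 1
        total += 1
        M = red / total
        if M_at_S is None and (M >= 0.75 or n == 10 or n == N):
            M_at_S = M                 # S = T ∧ 10
        if M >= 0.75 or n == N:        # T = inf{n : M_n >= 3/4}, capped at N
            sum_MT += M
            break
    sum_MS += M_at_S

print("E[M_0] = 0.5,  E[M_S] ~", round(sum_MS / REPS, 3),
      ",  E[M_T] ~", round(sum_MT / REPS, 3))
```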

2 Wald's identities

Often additional properties of $T$ hold (typically $E[T] < +\infty$), which can be taken advantage of by considering instead the MG $\{M_{T \wedge n}\}$ in the OST above (noticing of course that $M_{T \wedge S} = M_S$ and $M_{T \wedge T} = M_T$ whenever $S \leq T$ a.s.). In that context, the following is useful.

LEM 18.15 Suppose $\{M_n\}$ is a MG such that $E[|M_{n+1} - M_n| \mid \mathcal{F}_n] \leq B$ a.s. for all $n$. Suppose $T$ is a stopping time with $E[T] < +\infty$. Then the stopped MG $\{M_{T \wedge n}\}$ is UI.

Proof: Assume WLOG that $M_0 = 0$, to simplify. Observe first that
$$|M_{T \wedge n}| \leq \sum_{m \geq 0} |M_{m+1} - M_m| \, 1_{\{T > m\}}, \quad \forall n.$$
Taking expectations of the RHS (which does not depend on $n$), we get
$$\sum_{m \geq 0} E\big[|M_{m+1} - M_m| \, 1_{\{T > m\}}\big] = \sum_{m \geq 0} E\big[E[|M_{m+1} - M_m| \, 1_{\{T > m\}} \mid \mathcal{F}_m]\big] = \sum_{m \geq 0} E\big[E[|M_{m+1} - M_m| \mid \mathcal{F}_m] \, 1_{\{T > m\}}\big]$$
$$\leq \sum_{m \geq 0} E[B \, 1_{\{T > m\}}] = B \sum_{m \geq 0} P[T > m] = B\, E[T] < +\infty,$$
where we used that $\{T > m\} \in \mathcal{F}_m$. Hence $\{M_{T \wedge n}\}$ is dominated by an integrable RV and is therefore UI.

As an application, we recover Wald's first identity. For $X_1, X_2, \ldots \in \mathbb{R}$, let $S_n = \sum_{i=1}^n X_i$.

THM 18.16 (Wald's first identity) Let $X_1, X_2, \ldots \in L^1$ be i.i.d. with $\mu = E[X_1]$ and let $T \in L^1$ be a stopping time. Then
$$E[S_T] = \mu\, E[T].$$

Proof: Recall that $M_n = S_n - n\mu$ is a MG. By LEM 18.15 and the assumption $T \in L^1$, the MG $\{M_{T \wedge n}\}$ is UI. Indeed
$$E[|M_{n+1} - M_n| \mid \mathcal{F}_n] = E[|X_{n+1} - \mu| \mid \mathcal{F}_n] \leq |\mu| + E|X_1| \equiv B < +\infty,$$
by the triangle inequality and the Role of Independence lemma. Apply THM 18.8 to $\{M_{T \wedge n}\}$.

We also recall Wald's second identity. We give a MG-based proof (but argue about convergence directly rather than using THM 18.8).

THM 18.17 (Wald's second identity) Let $X_1, X_2, \ldots \in L^2$ be i.i.d. with $E[X_1] = 0$ and $\sigma^2 = \mathrm{Var}[X_1]$, and let $T \in L^1$ be a stopping time. Then
$$E[S_T^2] = \sigma^2 E[T].$$

Proof: Recall that $M_n = S_n^2 - n\sigma^2$ is a MG. Hence so is $M_{T \wedge n}$ and
$$0 = E[M_{T \wedge n}] = E[S_{T \wedge n}^2 - (T \wedge n)\sigma^2] = E[S_{T \wedge n}^2] - \sigma^2 E[T \wedge n]. \quad (1)$$
We have that $E[T \wedge n] \to E[T]$ as $n \to +\infty$ by (MON). To argue about the convergence of $E[S_{T \wedge n}^2]$ we note that, by the assumption $E[X_1] = 0$, it follows that $\{S_n\}$ is a MG and hence so is $\{S_{T \wedge n}\}$. The latter is bounded in $L^2$ since, by (1), we have
$$E[S_{T \wedge n}^2] = \sigma^2 E[T \wedge n] \leq \sigma^2 E[T] < +\infty,$$
for all $n$. Hence $S_{T \wedge n}$ converges a.s. and in $L^2$ to $S_T$ (since $T < +\infty$ a.s. by assumption). Convergence in $L^2$ also implies convergence of the second moment. Indeed, by the triangle inequality,
$$\big| \, \|S_{T \wedge n}\|_2 - \|S_T\|_2 \, \big| \leq \|S_{T \wedge n} - S_T\|_2 \to 0.$$
Hence,
$$0 = E[S_{T \wedge n}^2] - \sigma^2 E[T \wedge n] \to E[S_T^2] - \sigma^2 E[T],$$
which concludes the proof.
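
Both identities are easy to test numerically. The sketch below uses my own illustrative setup (not from the notes): i.i.d. steps uniform on $\{-1, 0, 1, 2\}$, so $\mu = 1/2$ and $\sigma^2 = 3/2 - 1/4 = 5/4$, with the bounded stopping time $T = \inf\{n : S_n \geq 10\} \wedge 500$. Wald's first identity gives $E[S_T] = \mu E[T]$, and Wald's second identity, applied to the centered steps $X_i - \mu$, gives $E[(S_T - T\mu)^2] = \sigma^2 E[T]$.

```python
import random

# Sketch: Monte Carlo check of Wald's identities (THM 18.16, and THM 18.17
# applied to the centered steps X_i - mu).  The step distribution and stopping
# time below are illustrative choices, not from the notes.
#   X_i uniform on {-1, 0, 1, 2}:  mu = 1/2,  sigma^2 = 3/2 - 1/4 = 5/4.
#   T = inf{n : S_n >= 10}, capped at 500 so that T is bounded (hence T in L^1).

random.seed(0)
MU, SIGMA2 = 0.5, 1.25
REPS, CAP, LEVEL = 100_000, 500, 10

sum_T = sum_ST = sum_centered_sq = 0.0
for _ in range(REPS):
    s, n = 0, 0
    while s < LEVEL and n < CAP:
        s += random.choice((-1, 0, 1, 2))
        n += 1
    sum_T += n
    sum_ST += s
    sum_centered_sq += (s - n * MU) ** 2

ET = sum_T / REPS
print("E[S_T]          ~", round(sum_ST / REPS, 3), " vs  mu*E[T]      =", round(MU * ET, 3))
print("E[(S_T-T*mu)^2] ~", round(sum_centered_sq / REPS, 3), " vs  sigma^2*E[T] =", round(SIGMA2 * ET, 3))
```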

To establish $E[T] < +\infty$, the following lemma can be used.

LEM 18.18 (Waiting for the inevitable) Let $T$ be a stopping time. Assume there are $N \in \mathbb{Z}_+$ and $\varepsilon > 0$ such that for every $n$
$$P[T \leq n + N \mid \mathcal{F}_n] > \varepsilon \quad \text{a.s.}$$
Then $E[T] < +\infty$.

Proof: For any integer $m \geq 1$,
$$P[T > mN \mid T > (m-1)N] \leq 1 - \varepsilon,$$
by assumption. (Indeed, with $Z \equiv P[T > n + N \mid \mathcal{F}_n] \leq 1 - \varepsilon$ and $n = (m-1)N$, we have for $F = \{T > (m-1)N\} \in \mathcal{F}_n$
$$P[T > mN] = E[1_{\{T > n+N\}}; F] = E[Z; F] \leq (1 - \varepsilon) P[F],$$
and apply the definition of conditional probability.) By the multiplication rule (i.e., the undergraduate rule $P[A_1 \cap \cdots \cap A_n] = \prod_{i=1}^n P[A_i \mid A_1 \cap \cdots \cap A_{i-1}]$) and the monotonicity of the events $\{T > mN\}$, we have
$$P[T > mN] \leq (1 - \varepsilon)^m.$$
We conclude using $E[T] = \sum_{k \geq 1} P[T \geq k]$.

3 Application I: Simple RW

DEF 18.19 Simple RW on $\mathbb{Z}$ is the process $\{S_n\}_{n \geq 0}$ with $S_0 = 0$ and $S_n = \sum_{k \leq n} X_k$, where the $X_k$'s are i.i.d. in $\{-1, +1\}$ s.t. $P[X_1 = 1] = 1/2$.

THM 18.20 Let $\{S_n\}$ be as above. Let $a < 0 < b$. Define $T_x = \inf\{n \geq 0 : S_n = x\}$ and $T = T_a \wedge T_b$. Then we have
1. $T < +\infty$ a.s.
2. $P[T_a < T_b] = \frac{b}{b - a}$
3. $E[T] = -ab$
4. $T_a < +\infty$ a.s. but $E[T_a] = +\infty$

Proof: 1) Apply the Waiting for the inevitable lemma with $N = b - a$ and $\varepsilon = (1/2)^{b-a}$ (corresponding to moving right $b - a$ times in a row, which takes you to $b$ no matter where you are within the $\{a, \ldots, b\}$ interval). That shows $E[T] < +\infty$, from which the claim follows.

2) By Wald's first identity, $E[S_T] = 0$, or
$$a\, P[S_T = a] + b\, P[S_T = b] = 0,$$
that is (substituting $P[S_T = b] = 1 - P[S_T = a]$)
$$P[T_a < T_b] = \frac{b}{b - a},$$
and $P[T_a < +\infty] \geq P[T_a < T_b] \to 1$ as $b \to +\infty$.

3) Wald's second identity says that $E[S_T^2] = E[T]$ (since $\sigma^2 = 1$). Also
$$E[S_T^2] = \frac{b}{b-a}\, a^2 + \frac{-a}{b-a}\, b^2 = -ab,$$
so that $E[T] = -ab$.

4) Taking $b \to +\infty$ above shows that $E[T_a] = +\infty$ by monotone convergence. (Note that this case shows that the $L^1$ condition on the stopping time is necessary in Wald's second identity.)
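
A short simulation makes THM 18.20 concrete (a sketch; the parameter values $a = -4$, $b = 6$ are my own choices): the estimates of $P[T_a < T_b]$ and $E[T]$ should be close to $b/(b-a) = 0.6$ and $-ab = 24$.

```python
import random

# Sketch: Monte Carlo check of THM 18.20 for simple symmetric RW on Z.
# With a = -4, b = 6 the theorem gives P[T_a < T_b] = b/(b-a) = 0.6 and
# E[T] = -a*b = 24.  The parameter values are illustrative choices.

random.seed(0)
a, b = -4, 6
REPS = 100_000

hits_a = 0
total_T = 0
for _ in range(REPS):
    s, n = 0, 0
    while a < s < b:           # run until the walk exits (a, b), i.e. until T
        s += random.choice((-1, 1))
        n += 1
    hits_a += (s == a)
    total_T += n

print("P[T_a < T_b] ~", hits_a / REPS, " (theory:", b / (b - a), ")")
print("E[T]         ~", total_T / REPS, " (theory:", -a * b, ")")
```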

4 Application II: Biased RW

DEF 18.21 Biased simple RW on $\mathbb{Z}$ with parameter $1/2 < p < 1$ is the process $\{S_n\}_{n \geq 0}$ with $S_0 = 0$ and $S_n = \sum_{k \leq n} X_k$, where the $X_k$'s are i.i.d. in $\{-1, +1\}$ s.t. $P[X_1 = 1] = p$. Let $q = 1 - p$. Let $\phi(x) = (q/p)^x$ and $\psi_n(x) = x - (p - q)n$.

THM 18.22 Let $\{S_n\}$ be as above. Let $a < 0 < b$. Define $T_x = \inf\{n \geq 0 : S_n = x\}$ and $T = T_a \wedge T_b$. Then we have
1. $T < +\infty$ a.s.
2. $P[T_a < T_b] = \frac{\phi(0) - \phi(b)}{\phi(a) - \phi(b)}$
3. $P[T_a < +\infty] = 1/\phi(a) < 1$ and $P[T_b = +\infty] = 0$
4. $E[T_b] = \frac{b}{2p - 1}$

Proof: There are two MGs here:
$$E[\phi(S_n) \mid \mathcal{F}_{n-1}] = p (q/p)^{S_{n-1}+1} + q (q/p)^{S_{n-1}-1} = \phi(S_{n-1}),$$
(noting that $|\phi(S_n)| \leq (p/q)^n$ a.s.) and
$$E[\psi_n(S_n) \mid \mathcal{F}_{n-1}] = p[S_{n-1} + 1 - (p-q)n] + q[S_{n-1} - 1 - (p-q)n] = \psi_{n-1}(S_{n-1}),$$
(noting that $|\psi_n(S_n)| \leq (1 + p - q)n$ a.s.).

1) Follows by the same argument as in the unbiased case.

2) Now note that $\{\phi(S_{T \wedge n})\}$ is a bounded MG and, therefore, by THM 18.8, we get
$$\phi(0) = E[\phi(S_T)] = P[T_a < T_b]\, \phi(a) + P[T_a > T_b]\, \phi(b),$$
or
$$P[T_a < T_b] = \frac{\phi(0) - \phi(b)}{\phi(a) - \phi(b)}$$
(where we used 1)).

3) By 2), taking $b \to +\infty$, by monotonicity
$$P[T_a < +\infty] = \frac{1}{\phi(a)} < 1,$$
so $T_a = +\infty$ with positive probability. Similarly take $a \to -\infty$.

4) By LEM 18.7 applied to $\{\psi_n(S_n)\}$,
$$0 = E[S_{T_b \wedge n} - (p - q)(T_b \wedge n)].$$
(We cannot use Wald's first identity directly because it is not immediately clear whether $T_b$ is integrable.) By (MON) and the fact that $T_b < +\infty$ a.s. from 3), $E[T_b \wedge n] \to E[T_b]$. Finally, $\inf_n S_n \leq 0$ a.s. and, for $x \geq 0$,
$$P[-\inf_n S_n \geq x] = P[T_{-x} < +\infty] = \left(\frac{q}{p}\right)^x,$$
so that $E[-\inf_n S_n] = \sum_{x \geq 1} P[-\inf_n S_n \geq x] < +\infty$. Hence, we can use (DOM) with $|S_{T_b \wedge n}| \leq \max\{b, -\inf_n S_n\}$ to deduce that
$$E[T_b] = \frac{E[S_{T_b}]}{p - q} = \frac{b}{2p - 1}.$$
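
As before, these formulas are easy to check by simulation. The sketch below (illustrative parameters, not from the notes) takes $p = 0.6$, $a = -3$, $b = 5$; the targets are $P[T_a < +\infty] = (q/p)^{|a|} = (2/3)^3 \approx 0.296$ and $E[T_b] = b/(2p-1) = 25$. The event $\{T_a < +\infty\}$ is approximated by $\{T_a \leq \text{HORIZON}\}$, which is accurate here because the upward drift makes late visits to $a$ extremely unlikely.

```python
import random

# Sketch: Monte Carlo check of THM 18.22 for biased simple RW with p = 0.6.
# Targets: P[T_a < infinity] = (q/p)^|a| = (2/3)^3 ~ 0.296 for a = -3, and
# E[T_b] = b/(2p-1) = 25 for b = 5.  Parameters are illustrative; {T_a < infinity}
# is approximated by {T_a <= HORIZON}, accurate because of the upward drift.

random.seed(0)
p, a, b = 0.6, -3, 5
REPS, HORIZON = 50_000, 300

hit_a = 0
total_Tb = 0
for _ in range(REPS):
    # estimate P[T_a < infinity] via a long finite horizon
    s = 0
    for _ in range(HORIZON):
        s += 1 if random.random() < p else -1
        if s == a:
            hit_a += 1
            break
    # estimate E[T_b]  (T_b is a.s. finite since p > 1/2)
    s, n = 0, 0
    while s < b:
        s += 1 if random.random() < p else -1
        n += 1
    total_Tb += n

q = 1 - p
print("P[T_a < inf] ~", hit_a / REPS, " (theory:", round((q / p) ** abs(a), 4), ")")
print("E[T_b]       ~", total_Tb / REPS, " (theory:", b / (2 * p - 1), ")")
```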

References

[Dur10] Rick Durrett. Probability: Theory and Examples. Cambridge Series in Statistical and Probabilistic Mathematics. Cambridge University Press, Cambridge, 2010.

[Wil91] David Williams. Probability with Martingales. Cambridge Mathematical Textbooks. Cambridge University Press, Cambridge, 1991.