Notes 18 : Optional Sampling Theorem

Math 733-734: Theory of Probability
Lecturer: Sebastien Roch

References: [Wil91, Chapter 14], [Dur10, Section 5.7].

Recall:

DEF 18.1 (Uniform Integrability) A collection $\mathcal{C}$ of RVs on $(\Omega, \mathcal{F}, \mathbb{P})$ is uniformly integrable (UI) if: $\forall \varepsilon > 0$, $\exists K > 0$ s.t. $E[|X|; |X| > K] < \varepsilon$ for all $X \in \mathcal{C}$.

THM 18.2 (Necessary and Sufficient Condition for $L^1$ Convergence) Let $\{X_n\} \subseteq L^1$ and $X \in L^1$. Then $X_n \to X$ in $L^1$ if and only if the following two conditions hold:
- $X_n \to X$ in probability;
- $\{X_n\}$ is UI.

THM 18.3 (Convergence of UI MGs) Let $\{M_n\}$ be a UI MG. Then $M_n \to M_\infty$ a.s. and in $L^1$, where $M_\infty \in \mathcal{F}_\infty = \sigma(\cup_n \mathcal{F}_n)$. Moreover,
$$M_n = E[M_\infty \mid \mathcal{F}_n], \quad \forall n.$$

THM 18.4 (Lévy's upward theorem) Let $Z \in L^1$ and define $M_n = E[Z \mid \mathcal{F}_n]$. Then $\{M_n\}$ is a UI MG and
$$M_n \to M_\infty = E[Z \mid \mathcal{F}_\infty],$$
a.s. and in $L^1$.
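
To see THM 18.2 concretely, here is a minimal numerical sketch (our illustration, not part of the notes) of the classical non-UI example $X_n = n \, 1_{\{U \leq 1/n\}}$ with $U \sim \mathrm{Unif}[0,1]$: $X_n \to 0$ in probability, yet $E|X_n| = 1$ for every $n$, so $L^1$ convergence fails and the family cannot be UI.

```python
import numpy as np

rng = np.random.default_rng(0)
U = rng.uniform(size=1_000_000)  # one draw of U per sample path

for n in [10, 100, 1000]:
    X_n = n * (U <= 1 / n)           # X_n = n on {U <= 1/n}, else 0
    tail = np.mean(X_n * (X_n > 5))  # E[X_n; X_n > K] for K = 5
    print(f"n={n:4d}  P[X_n != 0] ~ {np.mean(X_n != 0):.4f}  "
          f"E|X_n| ~ {np.mean(X_n):.3f}  E[X_n; X_n > 5] ~ {tail:.3f}")

# P[X_n != 0] -> 0 (convergence in probability to 0), but E|X_n| stays near 1:
# the mass escapes above any fixed K, so {X_n} is not UI and X_n does not
# converge to 0 in L^1.
```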

1 Optional Sampling Theorem

1.1 Review: Stopping times

Recall:

DEF 18.5 A random variable $T : \Omega \to \bar{\mathbb{Z}}_+ := \{0, 1, \ldots, +\infty\}$ is called a stopping time if
$$\{T = n\} \in \mathcal{F}_n, \quad \forall n \in \bar{\mathbb{Z}}_+.$$

EX 18.6 Let $\{A_n\}$ be an adapted process and $B \in \mathcal{B}$. Then
$$T = \inf\{n \geq 0 : A_n \in B\}$$
is a stopping time.

LEM 18.7 (Stopping Time Lemma) Let $\{M_n\}$ be a MG and $T$ be a stopping time. Then the stopped process $\{M_{T \wedge n}\}$ is a MG and in particular $E[M_{T \wedge n}] = E[M_0]$.

THM 18.8 Let $\{M_n\}$ be a MG and $T$ be a stopping time. Then $M_T \in L^1$ and
$$E[M_T] = E[M_0]$$
if any of the following conditions holds (a simulation illustrating why such conditions are needed appears after the proof):
1. $T$ is bounded;
2. $\{M_n\}$ is bounded and $T$ is a.s. finite;
3. $E[T] < +\infty$ and $\{M_n\}$ has bounded increments;
4. $\{M_n\}$ is UI. (This one is new. The proof follows from the Optional Sampling Theorem below.)

Proof: From the Stopping Time Lemma (LEM 18.7), we have
$$E[M_{T \wedge n} - M_0] = 0. \qquad (*)$$
1. Take $n = N$ in $(*)$, where $T \leq N$ a.s.
2. Take $n \to +\infty$ and use (DOM).
3. Note that
$$|M_{T \wedge n} - M_0| = \Big| \sum_{i \leq T \wedge n} (M_i - M_{i-1}) \Big| \leq K\, T,$$
where $|M_n - M_{n-1}| \leq K$ a.s. Use (DOM).
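
Here is the promised simulation, a minimal sketch (our illustration, with arbitrary sample sizes) for simple random walk $M_n = S_n$ and $T = \inf\{n : S_n = 1\}$: the stopped mean $E[M_{T \wedge n}]$ stays at $E[M_0] = 0$ for every $n$, as LEM 18.7 predicts, yet $M_T = 1$ a.s., so $E[M_T] \neq E[M_0]$. None of the conditions of THM 18.8 holds in this example ($T$ is a.s. finite, but $E[T] = +\infty$ and the walk is unbounded).

```python
import numpy as np

rng = np.random.default_rng(1)
n_paths, n_steps = 5_000, 2_000
steps = rng.choice([-1, 1], size=(n_paths, n_steps))
S = np.cumsum(steps, axis=1)                # S_1, ..., S_n along each path
hit_one = (S == 1).any(axis=1)              # did the path reach 1 by time n?
M_stopped = np.where(hit_one, 1, S[:, -1])  # M_{T ∧ n}: frozen at 1 once T is hit

print("E[M_{T∧n}] ~", M_stopped.mean())    # ~ 0 = E[M_0], as LEM 18.7 predicts
print("P[T <= n]  ~", hit_one.mean())      # close to 1, since T < ∞ a.s.
# Yet M_T = 1 a.s., so E[M_T] = 1 != 0 = E[M_0]: optional stopping fails here
# because none of the conditions of THM 18.8 holds.
```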

DEF 18.9 ($\mathcal{F}_T$) Let $T$ be a stopping time. Denote by $\mathcal{F}_T$ the set of all events $F$ such that
$$F \cap \{T = n\} \in \mathcal{F}_n, \quad \forall n \in \bar{\mathbb{Z}}_+.$$

1.2 More on the σ-field $\mathcal{F}_T$

The following lemmas help clarify the definition of $\mathcal{F}_T$:

LEM 18.10 $\mathcal{F}_T = \mathcal{F}_n$ if $T \equiv n$; $\mathcal{F}_T = \mathcal{F}_\infty$ if $T \equiv \infty$; and $\mathcal{F}_T \subseteq \mathcal{F}_\infty$ for any stopping time $T$.

Proof: In the first case, note that $F \cap \{T = k\}$ is empty if $k \neq n$ and equals $F$ if $k = n$. So if $F \in \mathcal{F}_T$ then $F = F \cap \{T = n\} \in \mathcal{F}_n$; conversely, if $F \in \mathcal{F}_n$ then $F \cap \{T = n\} = F \in \mathcal{F}_n$ and $F \cap \{T = k\} = \emptyset \in \mathcal{F}_k$ for $k \neq n$. This proves both inclusions, and the same argument works for $n = \infty$. For the third claim, note that
$$F = \bigcup_{n \in \bar{\mathbb{Z}}_+} F \cap \{T = n\} \in \mathcal{F}_\infty.$$

LEM 18.11 If $\{X_n\}$ is adapted and $T$ is a stopping time, then $X_T \in \mathcal{F}_T$ (where we assume that $X_\infty \in \mathcal{F}_\infty$, e.g., $X_\infty = \liminf_n X_n$).

Proof: For $B \in \mathcal{B}$,
$$\{X_T \in B\} \cap \{T = n\} = \{X_n \in B\} \cap \{T = n\} \in \mathcal{F}_n.$$

LEM 18.12 If $S, T$ are stopping times, then $S \wedge T$ is a stopping time and $\mathcal{F}_{S \wedge T} \subseteq \mathcal{F}_T$.

Proof: We first show that $S \wedge T$ is a stopping time. Note that
$$\{S \wedge T = k\} = [\{S = k\} \cap \{T \geq k\}] \cup [\{S \geq k\} \cap \{T = k\}] \in \mathcal{F}_k,$$
since all events above are in $\mathcal{F}_k$ by the fact that $S$ and $T$ are themselves stopping times. For the second claim, let $F \in \mathcal{F}_{S \wedge T}$. Note that
$$F \cap \{T = n\} = \bigcup_{k \leq n} \big[(F \cap \{S \wedge T = k\}) \cap \{T = n\}\big] \in \mathcal{F}_n.$$
Indeed, the expression in parentheses is in $\mathcal{F}_k \subseteq \mathcal{F}_n$ and $\{T = n\} \in \mathcal{F}_n$.

1.3 Optional Sampling Theorem (OST)

We show that the MG property extends to stopping times for UI MGs.

THM 18.13 (Optional Sampling Theorem) If $\{M_n\}$ is a UI MG and $S, T$ are stopping times with $S \leq T$ a.s., then $E|M_T| < +\infty$ and
$$E[M_T \mid \mathcal{F}_S] = M_S.$$

Proof: Since $\{M_n\}$ is UI, there is $M_\infty \in L^1$ s.t. $M_n \to M_\infty$ a.s. and in $L^1$. We prove a more general claim:

LEM 18.14 For any stopping time $T$, $E[M_\infty \mid \mathcal{F}_T] = M_T$.

Indeed, we then get the theorem by (TOWER) (and (JENSEN) for the integrability claim): since $\mathcal{F}_S = \mathcal{F}_{S \wedge T} \subseteq \mathcal{F}_T$ by LEM 18.12,
$$E[M_T \mid \mathcal{F}_S] = E\big[E[M_\infty \mid \mathcal{F}_T] \mid \mathcal{F}_S\big] = E[M_\infty \mid \mathcal{F}_S] = M_S.$$

Proof (of the lemma): We divide $M_\infty = M_\infty^+ - M_\infty^- \equiv X - Y$ into positive and negative parts and write
$$M_n = E[M_\infty \mid \mathcal{F}_n] = E[X \mid \mathcal{F}_n] - E[Y \mid \mathcal{F}_n] \equiv X_n - Y_n,$$
by linearity. We show that $E[X \mid \mathcal{F}_T] = X_T$. The same argument applies to $\{Y_n\}$, which then implies
$$E[M_\infty \mid \mathcal{F}_T] = X_T - Y_T = M_T,$$
as claimed.

We have of course that $X \geq 0$, and hence $X_n = E[X \mid \mathcal{F}_n] \geq 0$ for all $n$. Let $F \in \mathcal{F}_T$. Then
$$E[X; F \cap \{T = \infty\}] = E[X_T; F \cap \{T = \infty\}],$$
since $X_n = E[X \mid \mathcal{F}_n] \to X$ a.s. by Lévy's upward theorem, so that $X_T = X_\infty = X$ on $\{T = \infty\}$. Therefore it suffices to show
$$E[X; F \cap \{T < +\infty\}] = E[X_T; F \cap \{T < +\infty\}].$$
In fact, by (MON), it suffices to show
$$\sum_{i \geq 0} E[X; F \cap \{T = i\}] = \sum_{i \geq 0} E[X_T; F \cap \{T = i\}].$$
But note that $F \cap \{T = i\} \in \mathcal{F}_i$, so that
$$E[X_T; F \cap \{T = i\}] = E[X_i; F \cap \{T = i\}] = E[X; F \cap \{T = i\}],$$
since $X_i = E[X \mid \mathcal{F}_i]$. That concludes the proof of the stronger claim.
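
As a quick numerical illustration of optional sampling for a UI MG (our sketch; the urn example and all parameters are our own choices), consider the Pólya urn proportion, a classical MG that is bounded in $[0,1]$ and hence UI: stopping it when the red count first reaches a target (capped at a finite horizon, so the capped time is a bounded stopping time) leaves the mean at $E[M_0] = 1/2$.

```python
import numpy as np

rng = np.random.default_rng(5)

def polya_stopped(rng, target=5, horizon=500):
    """Polya urn (start: 1 red, 1 black; add a copy of each drawn ball).
    Returns M_{T ∧ horizon}, where M_n = proportion of red (a bounded MG)
    and T = first time the red count reaches `target`."""
    red, black = 1, 1
    for _ in range(horizon):
        if red == target:
            break
        if rng.random() < red / (red + black):
            red += 1
        else:
            black += 1
    return red / (red + black)

samples = [polya_stopped(rng) for _ in range(20_000)]
# Optional sampling for a bounded (hence UI) MG: E[M_{T ∧ horizon}] = 1/2.
print("E[M_T] ~", float(np.mean(samples)))
```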

2 Wald's identities

Often additional properties of $T$ hold (typically $E[T] < +\infty$), which can be taken advantage of by applying the OST above to the stopped MG $\{M_{T \wedge n}\}$ instead (noting of course that $M_{T \wedge S} = M_S$ and $M_{T \wedge T} = M_T$ whenever $S \leq T$ a.s.). In that context, the following lemma is useful.

LEM 18.15 Suppose $\{M_n\}$ is a MG such that $E[|M_{n+1} - M_n| \mid \mathcal{F}_n] \leq B$ a.s. for all $n$. Suppose $T$ is a stopping time with $E[T] < +\infty$. Then the stopped MG $\{M_{T \wedge n}\}$ is UI.

Proof: Assume WLOG that $M_0 = 0$, to simplify. Observe first that
$$|M_{T \wedge n}| \leq \sum_{m \geq 0} |M_{m+1} - M_m| \, 1_{\{T > m\}}, \quad \forall n.$$
Taking expectations of the RHS (which does not depend on $n$), we get
$$\sum_{m \geq 0} E\big[|M_{m+1} - M_m| \, 1_{\{T > m\}}\big] = \sum_{m \geq 0} E\Big[E\big[|M_{m+1} - M_m| \, 1_{\{T > m\}} \mid \mathcal{F}_m\big]\Big] = \sum_{m \geq 0} E\Big[E\big[|M_{m+1} - M_m| \mid \mathcal{F}_m\big] \, 1_{\{T > m\}}\Big] \leq B \sum_{m \geq 0} P[T > m] = B\, E[T] < +\infty,$$
where we used that $\{T > m\} \in \mathcal{F}_m$. Hence $\{M_{T \wedge n}\}$ is dominated by a fixed integrable RV, and is therefore UI.

As an application, we recover Wald's first identity. For $X_1, X_2, \ldots \in \mathbb{R}$, let $S_n = \sum_{i=1}^n X_i$.

THM 18.16 (Wald's first identity) Let $X_1, X_2, \ldots \in L^1$ be i.i.d. with $\mu = E[X_1]$ and let $T \in L^1$ be a stopping time. Then
$$E[S_T] = \mu\, E[T].$$

Proof: Recall that $M_n = S_n - n\mu$ is a MG. By LEM 18.15 and the assumption $T \in L^1$, the stopped MG $\{M_{T \wedge n}\}$ is UI. Indeed,
$$E[|M_{n+1} - M_n| \mid \mathcal{F}_n] = E[|X_{n+1} - \mu| \mid \mathcal{F}_n] \leq |\mu| + E|X_1| \equiv B < +\infty,$$
by the triangle inequality and the Role of Independence Lemma. Apply THM 18.8 (condition 4) to $\{M_{T \wedge n}\}$.

We also recall Wald's second identity. We give a MG-based proof (but argue about convergence directly rather than using THM 18.8).

THM 18.17 (Wald's second identity) Let $X_1, X_2, \ldots \in L^2$ be i.i.d. with $E[X_1] = 0$ and $\sigma^2 = \mathrm{Var}[X_1]$, and let $T \in L^1$ be a stopping time. Then
$$E[S_T^2] = \sigma^2 E[T].$$

Proof: Recall that $M_n = S_n^2 - n\sigma^2$ is a MG. Hence so is $\{M_{T \wedge n}\}$ and
$$0 = E[M_{T \wedge n}] = E[S_{T \wedge n}^2 - (T \wedge n)\sigma^2] = E[S_{T \wedge n}^2] - \sigma^2 E[T \wedge n]. \qquad (1)$$
We have that $E[T \wedge n] \uparrow E[T]$ as $n \to +\infty$ by (MON). To argue about the convergence of $E[S_{T \wedge n}^2]$, we note that, by the assumption $E[X_1] = 0$, the process $\{S_n\}$ is a MG and hence so is $\{S_{T \wedge n}\}$. The latter is bounded in $L^2$ since, by (1), we have
$$E[S_{T \wedge n}^2] = \sigma^2 E[T \wedge n] \leq \sigma^2 E[T] < +\infty,$$
for all $n$. Hence $S_{T \wedge n}$ converges a.s. and in $L^2$ to $S_T$ (since $T < +\infty$ a.s. by assumption). Convergence in $L^2$ also implies convergence of the second moment: indeed, by the triangle inequality for the $L^2$ norm,
$$\big| \, \|S_{T \wedge n}\|_2 - \|S_T\|_2 \, \big| \leq \|S_{T \wedge n} - S_T\|_2 \to 0.$$
Hence, letting $n \to +\infty$ in (1),
$$E[S_T^2] = \sigma^2 E[T],$$
which concludes the proof.
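
Here is a small simulation sketch of Wald's first identity (our example; the die and the threshold are arbitrary choices): roll a fair die until the running total first reaches 50. This $T$ is a stopping time with $E[T] < +\infty$ and the increments are bounded, so THM 18.16 applies and $E[S_T] = 3.5\, E[T]$.

```python
import numpy as np

rng = np.random.default_rng(2)

def one_run(rng, level=50):
    """Roll a fair die until the running sum S_n first reaches `level`.
    Returns (S_T, T); T is a stopping time with E[T] <= level < infinity."""
    s, t = 0, 0
    while s < level:
        s += int(rng.integers(1, 7))  # X_{t+1}, uniform on {1,...,6}
        t += 1
    return s, t

runs = [one_run(rng) for _ in range(100_000)]
S_T = np.array([s for s, _ in runs], dtype=float)
T = np.array([t for _, t in runs], dtype=float)
print("E[S_T]   ~", S_T.mean())
print("3.5 E[T] ~", 3.5 * T.mean())  # Wald's first identity: the two agree
```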

To establish $E[T] < +\infty$, the following lemma can be used.

LEM 18.18 (Waiting for the inevitable) Let $T$ be a stopping time. Assume there are $N \in \mathbb{Z}_+$ and $\varepsilon > 0$ such that for every $n$,
$$P[T \leq n + N \mid \mathcal{F}_n] > \varepsilon \quad \text{a.s.}$$
Then $E[T] < +\infty$.

Proof: For any integer $m \geq 1$,
$$P[T > mN \mid T > (m-1)N] \leq 1 - \varepsilon,$$
by assumption. (Indeed, letting $Z := P[T > n + N \mid \mathcal{F}_n] \leq 1 - \varepsilon$ with $n = (m-1)N$, we have for $F = \{T > (m-1)N\} \in \mathcal{F}_n$
$$P[T > mN] = E[1\{T > n + N\}; F] = E[Z; F] \leq (1 - \varepsilon)\, P[F],$$
and we apply the definition of conditional probability.) By the multiplication rule (i.e., the undergraduate rule $P[A_1 \cap \cdots \cap A_n] = \prod_{i=1}^n P[A_i \mid A_1 \cap \cdots \cap A_{i-1}]$) and the monotonicity of the events $\{T > mN\}$, we have
$$P[T > mN] \leq (1 - \varepsilon)^m.$$
We conclude using $E[T] = \sum_{k \geq 1} P[T \geq k] < +\infty$.

3 Application I: Simple RW

DEF 18.19 Simple RW on $\mathbb{Z}$ is the process $\{S_n\}_{n \geq 0}$ with $S_0 = 0$ and $S_n = \sum_{k \leq n} X_k$, where the $X_k$'s are i.i.d. in $\{-1, +1\}$ with $P[X_1 = 1] = 1/2$.

THM 18.20 Let $\{S_n\}$ be as above. Let $a < 0 < b$. Define $T_x = \inf\{n \geq 0 : S_n = x\}$ and $T = T_a \wedge T_b$. Then we have:
1. $T < +\infty$ a.s.;
2. $P[T_a < T_b] = \frac{b}{b - a}$;
3. $E[T] = |a|\, b$;
4. $T_a < +\infty$ a.s., but $E[T_a] = +\infty$.

Proof: 1) Apply the Waiting-for-the-inevitable lemma (LEM 18.18) with $N = b - a$ and $\varepsilon = (1/2)^{b-a}$ (corresponding to moving right $b - a$ times in a row, which takes you to $b$ no matter where you are within the interval $\{a, \ldots, b\}$). That shows $E[T] < +\infty$, from which the claim follows.

2) By Wald's first identity, $E[S_T] = 0$, i.e.,
$$a\, P[S_T = a] + b\, P[S_T = b] = 0,$$
that is, combining with $P[S_T = a] + P[S_T = b] = 1$,
$$P[T_a < T_b] = P[S_T = a] = \frac{b}{b - a}.$$
Note also that, taking $b \to +\infty$,
$$P[T_a < +\infty] \geq P[T_a < T_b] = \frac{b}{b - a} \to 1.$$
3) Wald's second identity says that $E[S_T^2] = E[T]$ (since $\sigma^2 = 1$). Also
$$E[S_T^2] = \frac{b}{b-a}\, a^2 + \frac{-a}{b-a}\, b^2 = -ab,$$
so that $E[T] = |a|\, b$.
4) The first claim was proved in 2). Taking $b \to +\infty$ in 3) shows that $E[T_a] = +\infty$ by monotone convergence, since $T_a \wedge T_b \uparrow T_a$. (Note that this case shows that the $L^1$ condition on the stopping time is necessary in Wald's second identity.)
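
The exit probability and exit time of THM 18.20 are easy to check by simulation; here is a minimal sketch (our code; $a$, $b$ and the sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(3)
a, b = -3, 5
n_paths = 20_000

hits_a, total_T = 0, 0
for _ in range(n_paths):
    s, t = 0, 0
    while a < s < b:  # run until the walk exits (a, b)
        s += 1 if rng.random() < 0.5 else -1
        t += 1
    hits_a += (s == a)
    total_T += t

print("P[T_a < T_b] ~", hits_a / n_paths, " theory:", b / (b - a))
print("E[T]         ~", total_T / n_paths, " theory:", -a * b)
```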

4 Application II: Biased RW

DEF 18.21 Biased simple RW on $\mathbb{Z}$ with parameter $1/2 < p < 1$ is the process $\{S_n\}_{n \geq 0}$ with $S_0 = 0$ and $S_n = \sum_{k \leq n} X_k$, where the $X_k$'s are i.i.d. in $\{-1, +1\}$ with $P[X_1 = 1] = p$. Let $q = 1 - p$, and let $\varphi(x) = (q/p)^x$ and $\psi_n(x) = x - (p - q)n$.

THM 18.22 Let $\{S_n\}$ be as above. Let $a < 0 < b$. Define $T_x = \inf\{n \geq 0 : S_n = x\}$ and $T = T_a \wedge T_b$. Then we have:
1. $T < +\infty$ a.s.;
2. $P[T_a < T_b] = \frac{\varphi(0) - \varphi(b)}{\varphi(a) - \varphi(b)}$;
3. $P[T_a < +\infty] = 1/\varphi(a) < 1$ and $P[T_b = +\infty] = 0$;
4. $E[T_b] = \frac{b}{2p - 1}$.

Proof: There are two MGs here:
$$E[\varphi(S_n) \mid \mathcal{F}_{n-1}] = p\, (q/p)^{S_{n-1}+1} + q\, (q/p)^{S_{n-1}-1} = (q + p)\, (q/p)^{S_{n-1}} = \varphi(S_{n-1})$$
(noting that $|\varphi(S_n)| \leq (p/q)^n$ a.s.), and
$$E[\psi_n(S_n) \mid \mathcal{F}_{n-1}] = p\, [S_{n-1} + 1 - (p-q)n] + q\, [S_{n-1} - 1 - (p-q)n] = S_{n-1} - (p-q)(n-1) = \psi_{n-1}(S_{n-1})$$
(noting that $|\psi_n(S_n)| \leq (1 + p)\, n$ a.s.).

1) Follows by the same argument as in the unbiased case.

2) Now note that $\{\varphi(S_{T \wedge n})\}$ is a bounded MG and, therefore, by THM 18.8, we get
$$\varphi(0) = E[\varphi(S_T)] = P[T_a < T_b]\, \varphi(a) + P[T_a > T_b]\, \varphi(b),$$
or
$$P[T_a < T_b] = \frac{\varphi(b) - \varphi(0)}{\varphi(b) - \varphi(a)}$$
(where we used 1)).

3) By 2), taking $b \to +\infty$, by monotonicity,
$$P[T_a < +\infty] = \frac{1}{\varphi(a)} < 1,$$
so $T_a = +\infty$ with positive probability. Similarly, taking $a \to -\infty$ gives $P[T_b = +\infty] = 0$.

4) By LEM 18.7 applied to $\{\psi_n(S_n)\}$,
$$0 = E[S_{T_b \wedge n} - (p - q)(T_b \wedge n)].$$
(We cannot use Wald's first identity directly because it is not immediately clear whether $T_b$ is integrable.) By (MON) and the fact that $T_b < +\infty$ a.s. from 3),
$$E[T_b \wedge n] \uparrow E[T_b].$$
Finally, $\inf_n S_n \leq 0$ a.s. and, for $x \geq 0$,
$$P[-\inf_n S_n \geq x] = P[T_{-x} < +\infty] = \left(\frac{q}{p}\right)^x,$$
so that
$$E[-\inf_n S_n] = \sum_{x \geq 1} P[-\inf_n S_n \geq x] < +\infty.$$
Hence, we can use (DOM) with $|S_{T_b \wedge n}| \leq \max\{b, -\inf_n S_n\}$ to deduce that
$$E[T_b] = \frac{E[S_{T_b}]}{p - q} = \frac{b}{2p - 1}.$$
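
Again, a quick simulation sketch (ours; $p$, $a$, $b$, the horizon and sample sizes are arbitrary choices) matching THM 18.22: the mean hitting time of $b$ and the probability of ever reaching $a < 0$.

```python
import numpy as np

rng = np.random.default_rng(4)
p, q = 0.6, 0.4
a, b = -4, 4
n_paths = 20_000

# E[T_b]: the upward drift gives T_b < infinity a.s., so a plain loop works.
times = []
for _ in range(n_paths):
    s, t = 0, 0
    while s < b:
        s += 1 if rng.random() < p else -1
        t += 1
    times.append(t)
print("E[T_b]     ~", float(np.mean(times)), " theory:", b / (2 * p - 1))

# P[T_a < infinity] = 1/phi(a) = (q/p)^{|a|}: truncate at a long horizon;
# the estimate is biased slightly low since late visits to a are missed.
horizon = 1_000
steps = (rng.random((n_paths, horizon)) < p).astype(np.int8) * 2 - 1
S = np.cumsum(steps, axis=1, dtype=np.int32)
print("P[T_a < ∞] ~", float((S <= a).any(axis=1).mean()),
      " theory:", (q / p) ** abs(a))
```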

References

[Dur10] Rick Durrett. Probability: Theory and Examples. Cambridge Series in Statistical and Probabilistic Mathematics. Cambridge University Press, Cambridge, 2010.

[Wil91] David Williams. Probability with Martingales. Cambridge Mathematical Textbooks. Cambridge University Press, Cambridge, 1991.
