1 Martingales

January 8, 2016. Debdeep Pati.

(Ω, B, P) is a probability space.

Definition 1. (Filtration) A filtration F = {F_n}_{n≥0} is a collection of increasing sub-σ-fields, i.e. for m ≤ n we have F_m ⊆ F_n.

Definition 2. (Adaptation) A sequence of random variables X = {X_n} is adapted if for all n, X_n is F_n-measurable.

Definition 3. (Martingales) An adapted pair (X_n, F_n) is called a sub- / super- / martingale if
1. X_n ∈ L_1(P) for all n,
2. E[X_{n+1} | F_n] ≥ / ≤ / = X_n a.e. [P].

Remark 1.
a. If X_n is a non-negative L_2 sub-martingale, then X_n^2 is a sub-martingale.
b. If X_n is a martingale and φ is a convex function (with φ(X_n) ∈ L_1), then φ(X_n) is a sub-martingale.
c. If X_n is a sub-martingale and φ is convex and non-decreasing (say x ↦ x^+), then φ(X_n) is a sub-martingale.

Example:
1. {ξ_n} i.i.d. with mean 0, S_n = ∑_{i=1}^n ξ_i, F_n = σ(ξ_1, ..., ξ_n) = σ(S_0, ..., S_n); then (S_n, F_n) is a martingale. (For a sequence of random variables {X_n}, F_n = σ(X_0, X_1, ..., X_n) is called the natural filtration.)

Theorem 1. Let (X_n, F_n) be a martingale and φ : R → R convex with φ(X_n) ∈ L_1 for all n. Then (φ(X_n), F_n) is a sub-martingale.

Proof. Use Jensen's inequality for conditional expectations.

Properties:
1. If {X_n} is a martingale, then {|X_n|} is a sub-martingale.
2. If {X_n} is an L_p-martingale for p ≥ 1, then {|X_n|^p} is a sub-martingale.
3. If {X_n} is a sub-martingale, then {X_n^+} is a sub-martingale.
4. If {X_n} is a non-negative L_p sub-martingale, p > 1, then {X_n^p} is a sub-martingale.
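The random-walk example above is easy to check by simulation. The following minimal Python sketch (an illustration, not part of the original notes; the Rademacher increments are just one convenient mean-zero choice) verifies empirically that E[S_n] stays constant in n, as a martingale requires, while E[|S_n|] is non-decreasing, consistent with Property 1.

    import numpy as np

    rng = np.random.default_rng(0)
    n_paths, n_steps = 100_000, 20

    # Rademacher increments: +1/-1 with probability 1/2 each, so E[xi] = 0
    xi = rng.choice([-1, 1], size=(n_paths, n_steps))
    S = np.cumsum(xi, axis=1)                  # S_n = xi_1 + ... + xi_n along each path

    print(np.round(S.mean(axis=0), 3))         # E[S_n]: stays ~0 for every n (martingale)
    print(np.round(np.abs(S).mean(axis=0), 3)) # E[|S_n|]: non-decreasing (sub-martingale)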

Example:
1. Suppose (X_n, F_n) is an adapted sequence with E(X_{n+1} | F_n) = a_n X_n. How do we convert it into a martingale? Let Z_n = c_n X_n be a martingale. Then E(Z_{n+1} | F_n) = c_{n+1} a_n X_n, which should be equal to c_n X_n. Hence we should have c_{n+1}/c_n = 1/a_n for all n. Taking c_0 = 1, we have c_n = (∏_{i=0}^{n−1} a_i)^{−1}. Hence X_n / ∏_{i=0}^{n−1} a_i is a martingale. (Discussed in class. The case of E(X_{n+1} | F_n) = a_n X_n + b_n is left as an exercise.)
2. Suppose ([0, 1), B, P) is a probability space and Q is another probability measure such that Q ≪ P. Let F_0 = {∅, [0, 1)} and F_n = σ({[r 2^{−n}, (r + 1) 2^{−n}) : 0 ≤ r < 2^n}), called the dyadic filtration. Clearly σ(∪_n F_n) = B. Define X_n(ω) = Q(A)/P(A), where A is the F_n-atom containing ω. Clearly X_n is F_n-measurable since X_n is constant on F_n-atoms. It is easy to check that (X_n, F_n) is a martingale. (Discussed in class.)
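Example 2 can be made concrete as follows (a minimal sketch; the particular choice of Q is an assumption for illustration, not part of the notes). Take P to be Lebesgue measure on [0, 1) and Q the measure with density 2x, so Q([a, b)) = b^2 − a^2. The function below evaluates X_n(ω) = Q(A)/P(A) on the dyadic atom A containing ω; as n grows it approaches the density dQ/dP(ω) = 2ω.

    import math

    def X_n(omega: float, n: int) -> float:
        """X_n(omega) = Q(A)/P(A) on the dyadic atom A = [r/2^n, (r+1)/2^n) containing omega.

        P is Lebesgue measure on [0, 1); Q has density 2x (an illustrative choice),
        so Q([a, b)) = b**2 - a**2 and P([a, b)) = b - a.
        """
        r = math.floor(omega * 2 ** n)          # index of the dyadic atom containing omega
        a, b = r / 2 ** n, (r + 1) / 2 ** n
        return (b ** 2 - a ** 2) / (b - a)

    omega = 0.3
    for n in (1, 2, 5, 10, 20):
        print(n, X_n(omega, n))                 # tends to dQ/dP(omega) = 2 * omega = 0.6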

1.1 Doob's maximal inequalities

Theorem 2. Let (X_j, F_j)_{0≤j≤n} be a sub-martingale and λ ∈ R. Then
1. λ P[max_{0≤j≤n} X_j > λ] ≤ ∫_{[max_j X_j > λ]} X_n dP ≤ E|X_n|.
2. λ P[min_{0≤j≤n} X_j ≤ −λ] ≤ −∫_{[min_j X_j ≤ −λ]} X_n dP + E(X_n − X_0) ≤ E|X_n| − E(X_0).

Proof. Proof of 1.) Let
A_0 = [X_0 > λ] ∈ F_0, A_1 = [X_0 ≤ λ, X_1 > λ] ∈ F_1, ..., A_n = [X_0 ≤ λ, ..., X_{n−1} ≤ λ, X_n > λ] ∈ F_n.
Then [max_j X_j > λ] = ∪_{j=0}^n A_j (a disjoint union) and
λ P[max_j X_j > λ] = λ ∑_{j=0}^n P(A_j) = ∑_{j=0}^n ∫_{A_j} λ dP ≤ ∑_{j=0}^n ∫_{A_j} X_j dP ≤ ∑_{j=0}^n ∫_{A_j} X_n dP = ∫_{[max_j X_j > λ]} X_n dP.
The first inequality holds since X_j > λ on A_j; the second follows from the fact that E(X_n | F_j) ≥ X_j, so ∫_A X_n dP = ∫_A E(X_n | F_j) dP ≥ ∫_A X_j dP for all A ∈ F_j.

Proof of 2.) Let
A_0 = [X_0 ≤ −λ] ∈ F_0, A_1 = [X_0 > −λ, X_1 ≤ −λ] ∈ F_1, ..., A_n = [X_0 > −λ, ..., X_{n−1} > −λ, X_n ≤ −λ] ∈ F_n, A_{n+1} = [X_0 > −λ, ..., X_n > −λ] ∈ F_n.
Then
E X_0 = ∑_{j=0}^{n+1} ∫_{A_j} X_0 dP = ∫_{A_0} X_0 dP + ∫_{A_0^c} X_0 dP
      ≤ −λ P(A_0) + ∫_{A_0^c} X_1 dP   (using X_0 ≤ −λ on A_0 and ∫_{A_0^c} X_0 ≤ ∫_{A_0^c} X_1, since A_0^c ∈ F_0)
      = −λ P(A_0) + ∫_{A_1} X_1 dP + ∫_{(A_0 ∪ A_1)^c} X_1 dP
      ≤ −λ P(A_0) − λ P(A_1) + ∫_{(A_0 ∪ A_1)^c} X_2 dP
      ≤ ...
      ≤ −λ P(min_{0≤j≤n} X_j ≤ −λ) + ∫_{A_{n+1}} X_n dP
      = −λ P(min_{0≤j≤n} X_j ≤ −λ) + ∫ X_n dP − ∫_{A_{n+1}^c} X_n dP.
Since A_{n+1}^c = [min_{0≤j≤n} X_j ≤ −λ], rearranging gives
λ P[min_{0≤j≤n} X_j ≤ −λ] ≤ −∫_{[min_j X_j ≤ −λ]} X_n dP + E(X_n − X_0) ≤ E|X_n| − E(X_0).
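As a quick numerical sanity check of part 1 of Theorem 2 (a minimal simulation sketch, not part of the notes): take the sub-martingale X_j = |S_j| built from a simple random walk and compare λ P[max_{0≤j≤n} X_j > λ], the integral of X_n over that event, and E|X_n| by Monte Carlo.

    import numpy as np

    rng = np.random.default_rng(1)
    n_paths, n, lam = 200_000, 30, 6.0

    xi = rng.choice([-1, 1], size=(n_paths, n))
    X = np.abs(np.cumsum(xi, axis=1))              # X_j = |S_j| is a sub-martingale

    exceeds = X.max(axis=1) > lam
    lhs = lam * exceeds.mean()                     # lambda * P[max_j X_j > lambda]
    mid = np.where(exceeds, X[:, -1], 0.0).mean()  # integral of X_n over [max_j X_j > lambda]
    rhs = np.abs(X[:, -1]).mean()                  # E|X_n|
    print(lhs, "<=", mid, "<=", rhs)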

2 Convergence of Martingales

Goal: What conditions do we need so that an L_1-bounded martingale converges in L_1?

Definition 4. A subset S ⊆ L_1 is called uniformly integrable if given ɛ > 0, there exists c > 0 such that sup_{f∈S} ∫_{[|f|>c]} |f| dP < ɛ.

Definition 5. A subset S ⊆ L_1 is called L_1-bounded if sup_{f∈S} E|f| < ∞.

Examples:
1. Any finite subset of L_1 is uniformly integrable.
2. If S is dominated by an L_1 function, then S is uniformly integrable.
3. Any uniformly integrable subset is L_1-bounded. (The converse is not true in general: construct an L_1-bounded subset which is not uniformly integrable. Discussed in class.)

Theorem 3. S is a uniformly integrable subset of L_1 iff:
1. S is L_1-bounded.
2. For all ɛ > 0, there exists δ > 0 such that P(A) < δ implies sup_{f∈S} ∫_A |f| dP < ɛ.

Proof. "if" part: Suppose sup_{f∈S} E|f| = M < ∞. Fix ɛ > 0 and choose δ from 2. Observe that sup_{f∈S} P(|f| > c) ≤ sup_{f∈S} E|f| / c ≤ M/c. Choose c = M/δ and take A = {|f| > c} to get the result.
"only if" part: Fix ɛ > 0. Choose c such that sup_{f∈S} ∫_{[|f|>c]} |f| dP < ɛ/2. Then for any A,
∫_A |f| dP ≤ ∫_{A ∩ [|f|≤c]} |f| dP + ∫_{[|f|>c]} |f| dP < c P(A) + ɛ/2.
Choose δ = ɛ/(2c) to get the result.

Theorem 4. Let f ∈ L_1 and let S = {E^C f := E(f | C) : C a sub-σ-field of F}. Then S is uniformly integrable.

Proof. By Jensen's inequality |E^C f| ≤ E^C |f|, so E|E^C f| ≤ E|f| < ∞ and 1. is satisfied. To verify 2., note that since [|E^C f| > c] ∈ C,
∫_{[|E^C f| > c]} |E^C f| dP ≤ ∫_{[|E^C f| > c]} E^C |f| dP = ∫_{[|E^C f| > c]} |f| dP,
and
sup_C P(|E^C f| > c) ≤ sup_C E|E^C f| / c ≤ E|f| / c.
Choose c large enough that sup_C P(|E^C f| > c) is sufficiently small to have ∫_{[|E^C f| > c]} |f| dP < ɛ/2 (possible since the integral of |f| is absolutely continuous). Then for any set A with P(A) < ɛ/(2c),
∫_A |E^C f| dP ≤ ∫_{A ∩ [|E^C f| ≤ c]} |E^C f| dP + ∫_{[|E^C f| > c]} |E^C f| dP < c P(A) + ɛ/2 < ɛ.
Hence S is uniformly integrable by Theorem 3.
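For the example left to class (an L_1-bounded family that is not uniformly integrable), a standard construction — an illustrative assumption here, not necessarily the one given in class — is f_n = n · 1_{(0,1/n)} on ([0, 1], Lebesgue): E|f_n| = 1 for all n, but ∫_{[|f_n|>c]} |f_n| dP = 1 whenever n > c. The tiny sketch below just evaluates these two quantities.

    # f_n = n * 1_{(0, 1/n)} on ([0,1], Lebesgue): a standard L1-bounded, non-UI family
    # (an illustrative choice, not taken from the notes)

    def l1_norm(n: int) -> float:
        return n * (1.0 / n)                        # E|f_n| = n * Leb((0, 1/n)) = 1

    def tail_mass(n: int, c: float) -> float:
        return n * (1.0 / n) if n > c else 0.0      # integral of |f_n| over [|f_n| > c]

    print([l1_norm(n) for n in (1, 10, 100, 1000)])            # all 1: L1-bounded
    print([tail_mass(n, c=50.0) for n in (1, 10, 100, 1000)])  # stays 1 for n > 50: not UI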

Theorem 5. Let f_n ∈ L_1 and f_n → f a.s. Then {f_n} is uniformly integrable iff f ∈ L_1 and f_n → f in L_1.

Proof. "only if" part: Fix ɛ > 0. By Fatou's lemma and the L_1-boundedness of {f_n}, f ∈ L_1. Next we show that f_n → f in L_1. To that end, observe that
E|f_n − f| ≤ E|f_n 1_{[|f_n|≤c]} − f 1_{[|f|≤c]}| + E|f_n| 1_{[|f_n|>c]} + E|f| 1_{[|f|>c]}.
Choose c such that P(|f| = c) = 0, E|f_n| 1_{[|f_n|>c]} < ɛ/3 for all n, and E|f| 1_{[|f|>c]} < ɛ/3. Since the set where f_n 1_{[|f_n|≤c]} fails to converge to f 1_{[|f|≤c]} is contained in {|f| = c}, we have
f_n 1_{[|f_n|≤c]} − f 1_{[|f|≤c]} → 0 almost surely.
By DCT, choose N large enough such that for all n ≥ N, E|f_n 1_{[|f_n|≤c]} − f 1_{[|f|≤c]}| < ɛ/3. Hence for large enough n, E|f_n − f| ≤ ɛ.
"if" part: Assume f_n → f in L_1. Then {f_n} is L_1-bounded and f ∈ L_1. Fix ɛ > 0. Then ∫_A |f_n| dP ≤ E|f_n − f| + ∫_A |f| dP. Choose N such that for all n ≥ N, E|f_n − f| < ɛ/2, and choose δ_0 such that P(A) < δ_0 implies ∫_A |f| dP < ɛ/2. Then for n ≥ N, P(A) < δ_0 implies ∫_A |f_n| dP < ɛ; together with the finite (hence uniformly integrable) family {f_1, ..., f_{N−1}}, this implies {f_n} is uniformly integrable.

Theorem 6. Let f_n ≥ 0, f_n → f a.s., with f_n, f ∈ L_1. Then {f_n} is uniformly integrable if and only if E f_n → E f.

Proof. The "only if" part follows from Theorem 5. "if" part: It is enough to show that f_n → f in L_1 (this is Scheffé's lemma; uniform integrability then follows from Theorem 5). Note that E|f_n − f| = E f_n + E f − 2 E min{f_n, f}. Since E f_n → E f by assumption and E min{f_n, f} → E f by DCT, it follows that f_n → f in L_1.

Theorem 7. (X_n, F_n)_{n≥0} is a martingale. Let F_∞ = σ(∪_{n≥0} F_n) and sup_n E|X_n| = K. (E|X_n| ↑ K since |X_n| is a sub-martingale.) Then the following are equivalent.
i. K < ∞ and X_n → X_∞ in L_1.
ii. (X_n, F_n)_{0≤n≤∞} is a martingale.
iii. K < ∞ and E|X_∞| = K.
iv. {X_n}_{n≥0} is uniformly integrable.

Proof. (Complete verification is left as an exercise.)
(i) ⟹ (iv): Since K < ∞, X_n → X_∞ a.e. (an L_1-bounded martingale converges almost surely, by Doob's upcrossing inequality). Since X_n → X_∞ in L_1, Theorem 5 gives that {X_n}_{n≥0} is uniformly integrable.
(iv) ⟹ (i): Since {X_n}_{n≥0} is uniformly integrable, {X_n}_{n≥0} is L_1-bounded, implying K < ∞. Since X_n → X_∞ a.s., X_n → X_∞ in L_1 by Theorem 5. Hence (i) ⟺ (iv).
(ii) ⟹ (iv): This follows directly from Theorem 4.
(iv) ⟹ (ii): We will in fact show (iv), (i) ⟹ (ii). X_∞ is the almost sure limit of the X_n, each of which is F_∞-measurable, and hence X_∞ is F_∞-measurable. Also (i) implies X_∞ ∈ L_1. It is enough to check that E(X_∞ | F_n) = X_n a.e. We already know that for m > n, X_n = E(X_m | F_n) a.s., so it is enough to show that E(X_m − X_∞ | F_n) → 0 in L_1 as m → ∞. We know that X_m → X_∞ in L_1. Then
E|E[X_m − X_∞ | F_n]| ≤ E E[|X_m − X_∞| | F_n] = E|X_m − X_∞| → 0 as m → ∞,
implying E|X_n − E[X_∞ | F_n]| = 0, i.e. X_n = E[X_∞ | F_n] a.e. Hence we have shown that (i), (ii) and (iv) are equivalent.
(iii) ⟹ (iv): Since K < ∞, X_n → X_∞ a.s. Since |X_n| is a sub-martingale, E|X_n| ↑ K = E|X_∞|, so E|X_n| → E|X_∞|. By Theorem 6 (Scheffé), {|X_n|}, and hence {X_n}, is uniformly integrable.
Finally, (i) and (iv) imply (iii): since X_n → X_∞ in L_1, E|X_n| → E|X_∞| < ∞. Also E|X_n| ↑ K. Hence K < ∞ and E|X_∞| = K.

Theorem 8. (X_n, F_n)_{0≤n≤N} is a non-negative sub-martingale and p > 1. Then
‖max_{0≤n≤N} X_n‖_p ≤ q ‖X_N‖_p,  where q = p/(p − 1) is the conjugate exponent.

Proof. The proof follows from the following lemma and Doob's maximal inequality.

Lemma 1. Let U, V be non-negative random variables such that for every λ > 0, P(U > λ) ≤ (1/λ) ∫_{[U>λ]} V dP. Then, for p > 1, ‖U‖_p ≤ q ‖V‖_p.

Proof.
E U^p = ∫_0^∞ p λ^{p−1} P(U > λ) dλ
      ≤ ∫_0^∞ p λ^{p−2} ∫_{[U>λ]} V dP dλ
      = ∫_0^∞ p λ^{p−2} ∫_Ω V(ω) 1_{[U>λ]}(ω) dP(ω) dλ
      = ∫_Ω V(ω) ∫_0^{U(ω)} p λ^{p−2} dλ dP(ω)   (Fubini)
      = q ∫_Ω V(ω) U(ω)^{p−1} dP(ω) = q E(V U^{p−1})
      ≤ q {E(V^p)}^{1/p} {E U^{(p−1)q}}^{1/q} = q {E(V^p)}^{1/p} {E U^p}^{1/q}   (Hölder),
implying ‖U‖_p ≤ q ‖V‖_p if U ∈ L_p. Otherwise, work with min{U, n} and let n → ∞.

Corollary 1. Let (X_n, F_n)_{n≥0} be a non-negative L_p-bounded sub-martingale, p > 1. Then X^* = sup_n X_n ∈ L_p. (The same is true for an L_p-bounded martingale, with X^* = sup_n |X_n|.)

Proof. Let X^*_N = max_{0≤n≤N} X_n, so X^*_N ↑ X^*. By MCT, E((X^*_N)^p) → E((X^*)^p). Since {X_n} is L_p-bounded, Theorem 8 gives
E((X^*_N)^p) ≤ q^p E(X_N^p) ≤ q^p sup_n E(X_n^p) =: M < ∞,
implying X^* ∈ L_p.

Theorem 9. Let p > 1 and let {X_n} be an L_p-bounded martingale or a non-negative L_p-bounded sub-martingale. Then
1. {X_n} is uniformly integrable.
2. X_n → X_∞ in L_p.

Proof. The proof is left as an exercise.

Remark 2. A counterexample shows that one cannot get rid of the non-negativity in the case of a sub-martingale. Let ([0, 1), B, λ) be the measure space and F_n the dyadic filtration. Let X_n = −2^{n/2} 1_{[0, 2^{−n})} → 0 a.s. Check that {X_n, F_n} is a sub-martingale. Clearly X_n does not converge to 0 in L_2, as ‖X_n‖_2 = 1 for every n. Complete verification is left as an exercise.
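A tiny numerical check of Remark 2 (an illustration, not part of the notes): for X_n = −2^{n/2} 1_{[0,2^{−n})} on ([0, 1), Lebesgue), the L_2 norm stays equal to 1 for every n even though the L_1 norm, and X_n(ω) for every fixed ω > 0, tend to 0.

    # X_n = -2^(n/2) * 1_[0, 2^-n) on ([0,1), Lebesgue): X_n -> 0 a.s. but not in L2
    for n in range(1, 11):
        l2_norm_sq = (2 ** (n / 2)) ** 2 * 2 ** (-n)   # integral of X_n^2 = 2^n * 2^-n = 1
        l1_norm = 2 ** (n / 2) * 2 ** (-n)             # integral of |X_n| = 2^(-n/2) -> 0
        print(n, l2_norm_sq, round(l1_norm, 4))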

3 Example: Two color urn model

Suppose (X_n, F_n) is an adapted L_1-sequence with E[X_{n+1} | F_n] = a_n X_n + b_n, a_n ≠ 0. Find the associated martingale.

Start with (W_0, B_0) balls, where W_0, B_0 ≥ 0 and W_0 + B_0 = 1. At the n-th stage, W_n white and B_n black balls are available, with W_n + B_n = n + 1. Draw a ball at random; R is a 2 × 2 stochastic matrix giving the replacement rule: if you see a white ball, add R_{11} white and R_{12} black balls; if you see a black ball, add according to the second row. Let X_{n+1} be the row vector which equals (1, 0) if white is drawn at the n-th stage and (0, 1) if black. Let Z_n = (W_n, B_n). Since Z_n is bounded for each fixed n, Z_n ∈ L_1. Observe that Z_{n+1} = Z_n + X_{n+1} R. Find the associated martingale. (Discussed in class.)
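A minimal simulation sketch of a special case (an illustrative assumption, not the general replacement matrix treated in class): with R equal to the identity matrix, one ball of the drawn colour is added at each stage, which is the classical Pólya urn, and the fraction of white balls W_n/(W_n + B_n) is a martingale. Starting from one white and one black ball (also an illustrative choice), the code checks that the mean of this fraction stays essentially constant.

    import numpy as np

    rng = np.random.default_rng(2)
    n_paths, n_steps = 50_000, 50

    W = np.ones(n_paths)          # start from (W_0, B_0) = (1, 1), an illustrative choice
    B = np.ones(n_paths)
    mean_fraction = []
    for _ in range(n_steps):
        draw_white = rng.random(n_paths) < W / (W + B)  # draw a ball uniformly at random
        W += draw_white                                 # R = identity: add one ball of the drawn colour
        B += ~draw_white
        mean_fraction.append((W / (W + B)).mean())

    print(np.round(mean_fraction[::10], 3))             # stays ~0.5: the white fraction is a martingale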

4 Stopping Time

F = (F_n)_{n≥0} is a filtration. τ : Ω → {0, 1, 2, ..., ∞} is a measurable random time; the σ-field on {0, 1, ..., ∞} is the power set P({0, 1, ..., ∞}). A random time is called a stopping time if [τ = n] ∈ F_n for all n ∈ {0, 1, 2, ..., ∞}.

Example: The time at which one starts smoking is a stopping time. However, the time at which one stops smoking is not a stopping time.

[τ = n] ∈ F_n for all n is equivalent to [τ ≤ n] ∈ F_n for all n ∈ {0, 1, 2, ..., ∞}. This is easy to see for discrete time. For continuous time one requires [τ ≤ t] ∈ F_t for all t ≥ 0, and then [τ = t] = [τ ≤ t] \ ∪_{n∈N} [τ ≤ t − 1/n]. In the discrete time case [τ < n] = [τ ≤ n − 1] ∈ F_{n−1} ⊆ F_n.

Theorem 10. For discrete parameter spaces, [τ = n] ∈ F_n for all n ⟺ [τ ≤ n] ∈ F_n for all n ⟺ [τ < n] ∈ F_{n−1} for all n.

Definition 6. (X_n, F_n) is an adapted sequence, τ is a random time with [τ < ∞] = Ω. The stopped random variable is X_τ(ω) = X_{τ(ω)}(ω).

Theorem 11. X_τ is B-measurable.

Proof. For B ∈ B(R), X_τ^{−1}(B) = ∪_{n=0}^∞ ([X_τ ∈ B] ∩ [τ = n]) = ∪_{n=0}^∞ ([X_n ∈ B] ∩ [τ = n]).

Definition 7. (Stopping σ-field) F_τ = {A ∈ B : A ∩ [τ = n] ∈ F_n for all n}. Clearly X_τ is F_τ-measurable.

4.1 Properties

1. If τ = σ, then F_τ = F_σ.
2. If τ, σ are stopping times, then [τ < σ], [τ > σ], [τ = σ] are all F_τ ∩ F_σ-measurable. (For instance, [τ < σ] ∩ [τ = n] = [σ > n] ∩ [τ = n] ∈ F_n.)
3. τ is F_τ-measurable. (Indeed, [τ = k] ∩ [τ = n] is either ∅ or [τ = n], hence lies in F_n for every n.)
4. If τ ≤ σ, then F_τ ⊆ F_σ.
5. If (X_n, F_n)_{0≤n≤N} is a (sub-)martingale and τ ≤ σ are stopping times, then (X_τ, X_σ) is a (sub-)martingale corresponding to (F_τ, F_σ).

4.2 Doob's Upcrossing Inequality

Let {X_n}_{0≤n≤N} and a < b. Define τ_0 = 0, τ_1 = inf{n : X_n ≤ a},
τ_{2k+1} = inf{n ≥ τ_{2k} : X_n ≤ a},  τ_{2k+2} = inf{n ≥ τ_{2k+1} : X_n ≥ b},
with the convention that inf ∅ = N. Define
U({X_n}_{0≤n≤N}; [a, b]) = sup{l : X_{τ_{2l−1}} ≤ a < b ≤ X_{τ_{2l}}},
U({X_n}_{n≥0}; [a, b]) = lim_{N→∞} U({X_n}_{0≤n≤N}; [a, b]).

Lemma 2. The τ_i's are stopping times.

Proof. τ_0, τ_1 are stopping times. Assume τ_{2k} is a stopping time. Then
{τ_{2k+1} = i} = ∪_{j=0}^i {τ_{2k+1} = i, τ_{2k} = j},
and
{τ_{2k+1} = i, τ_{2k} = j} = {τ_{2k} = j} ∩ {X_{j+1} > a} ∩ ··· ∩ {X_{i−1} > a} ∩ {X_i ≤ a} ∈ F_i,
so τ_{2k+1} is a stopping time; the argument for τ_{2k+2} is analogous. For an adapted sequence, each τ_k is therefore a stopping time, which implies that sup{l : X_{τ_{2l−1}} ≤ a < b ≤ X_{τ_{2l}}} is measurable, and hence U({X_n}_{n≥0}; [a, b]) is measurable.
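The upcrossing count is easy to compute for a concrete path. The function below (a minimal sketch, not from the notes) scans a finite sequence and counts completed upcrossings of [a, b], i.e. excursions from a value ≤ a to a later value ≥ b.

    def upcrossings(xs, a, b):
        """Count completed upcrossings of [a, b] by the finite sequence xs (a < b)."""
        count, below = 0, False
        for x in xs:
            if not below and x <= a:
                below = True        # an upcrossing attempt starts once the path dips to <= a
            elif below and x >= b:
                count += 1          # the attempt completes when the path reaches >= b
                below = False
        return count

    print(upcrossings([0, -1, 2, -1, 3, 0, -2, 5], a=0, b=1))   # prints 3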

Theorem 12. (Doob's Upcrossing Inequality) Let {X_n, F_n} be a sub-martingale and a < b. Then
E U({X_n}_{0≤n≤N}; [a, b]) ≤ (E[(X_N − a)^+] − E[(X_0 − a)^+]) / (b − a) ≤ (E|X_N| + |a|) / (b − a).

Corollary 2. An L_1-bounded martingale converges almost surely.

5 Problem

A branching process is defined as follows. We start with one member, namely the population size is Z_0 = 1. Let ξ_i^n, i = 1, 2, ..., denote the number of children of the i-th individual in the n-th generation. Assume the ξ_i^n are independent with common mean µ > 0. Let Z_n denote the population size in the n-th generation, let p_k = P(ξ_i^n = k), k ≥ 0, and µ = E(ξ_i^n). Then Z_{n+1} = ∑_{i=1}^{Z_n} ξ_i^n. It is easy to see that X_n = Z_n / µ^n is a martingale.

1. Does {X_n} have a limit? Since X_n is a non-negative martingale, {X_n} converges almost everywhere to an integrable random variable X_∞.

2. If µ < 1, then Z_n = 0 almost surely for all large n. Indeed, Z_n = X_n µ^n a.s. Since X_n → X_∞ and µ^n → 0, Z_n → 0 a.s. There exists a P-null set N such that for all ω ∉ N, Z_n(ω) → 0. Given ɛ = 1/2, there exists N_0(ω) such that Z_n(ω) < ɛ for all n ≥ N_0(ω), and since Z_n is integer-valued this means Z_n(ω) = 0 for all n ≥ N_0(ω). This implies that with probability 1, Z_n equals 0 for all large n.

3. If µ < 1, then X_n = 0 almost surely for all large n. Let A_n = [X_n > 0]. Then ∑_{n=1}^∞ P(X_n > 0) = ∑_{n=1}^∞ P(Z_n > 0) = ∑_{n=1}^∞ P(Z_n ≥ 1) ≤ ∑_{n=1}^∞ E(Z_n) = ∑_{n=1}^∞ µ^n < ∞. Hence by the Borel–Cantelli lemma P(lim sup_n A_n) = 0, which means P(∩_{n=1}^∞ ∪_{k≥n} A_k) = 0, implying P(A_n occurs infinitely often) = 0, implying P(X_n = 0 for all large n) = 1. Hence X_n = 0 eventually with probability 1.

4. If µ = 1 and P(ξ > 1) > 0, then Z_n → 0 a.s. Z_n is a non-negative martingale, hence Z_n → Z_∞ a.e. and Z_∞ ∈ L_1. It is enough to show that P(Z_∞ = k) = 0 for all k ≥ 1, or in other words that P(Z_n = k eventually) = 0. Observe that P(Z_n = k eventually) ≤ P(at least one of ξ_1^n, ..., ξ_k^n is ≤ 1, eventually), so it is enough to show that P(ξ_1^n, ..., ξ_k^n are all > 1 infinitely often) = 1. To that end, note that ∑_{n=1}^∞ P(ξ_i^n > 1)^k = ∞. The result follows from the second Borel–Cantelli lemma.
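A minimal simulation sketch of the branching process (an illustration, not part of the problem; the Poisson(µ) offspring law is an assumption made only so that a generation's total offspring has a simple form). It tracks Z_n and the martingale X_n = Z_n/µ^n, showing the mean of X_n staying near 1 while the extinction probability P(Z_n = 0) grows when µ < 1.

    import numpy as np

    rng = np.random.default_rng(3)

    def simulate_Z(mu: float, n_gen: int, n_paths: int) -> np.ndarray:
        """Population sizes Z_0..Z_n for a branching process with Poisson(mu) offspring."""
        Z = np.ones(n_paths, dtype=np.int64)        # Z_0 = 1
        history = [Z.copy()]
        for _ in range(n_gen):
            # the Z_i individuals reproduce independently; a sum of Z_i Poisson(mu)
            # variables is Poisson(mu * Z_i), so one draw per path suffices
            Z = rng.poisson(mu * Z)
            history.append(Z.copy())
        return np.array(history)                    # shape (n_gen + 1, n_paths)

    mu, n_gen = 0.9, 30
    Z = simulate_Z(mu, n_gen, n_paths=100_000)
    X = Z / mu ** np.arange(n_gen + 1)[:, None]     # martingale X_n = Z_n / mu^n
    print(np.round(X.mean(axis=1)[::10], 2))        # E[X_n] stays near 1
    print(np.round((Z == 0).mean(axis=1)[::10], 3)) # P(Z_n = 0) increases towards 1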
