Exercises Measure Theoretic Probability

Exercises Measure Theoretic Probability 2002-2003

Week 1

1. Prove the following statements.
(a) The intersection of an arbitrary family of d-systems is again a d-system.
(b) The intersection of an arbitrary family of σ-algebras is again a σ-algebra. Characterize σ(C) for a given collection C ⊆ 2^Ω.
(c) If C_1 and C_2 are collections of subsets of Ω with C_1 ⊆ C_2, then d(C_1) ⊆ d(C_2).

2. Let G and H be two σ-algebras on Ω. Let C = {G ∩ H : G ∈ G, H ∈ H}. Show that C is a π-system and that σ(C) = σ(G, H).

3. Show that D_2 (Williams, page 194) is a π-system.

4. If h_1 and h_2 are measurable functions, then h_1 h_2 is measurable too.

5. Let Ω be a countable set. Let F = 2^Ω and let p : Ω → [0, 1] satisfy Σ_{ω∈Ω} p(ω) = 1. Put P(A) = Σ_{ω∈A} p(ω) for A ∈ F. Show that P is a probability measure. (See the sketch after this list.)

6. Let Ω be a countable set. Let A be the collection of A ⊆ Ω such that A or its complement has finite cardinality. Show that A is an algebra. What is d(A)?
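
For exercise 5, the only delicate point is countable additivity; a minimal sketch, using that a nonnegative series may be summed in any order:

\[
P\Bigl(\bigcup_{i=1}^{\infty} A_i\Bigr)
  \;=\; \sum_{\omega \in \bigcup_i A_i} p(\omega)
  \;=\; \sum_{i=1}^{\infty} \; \sum_{\omega \in A_i} p(\omega)
  \;=\; \sum_{i=1}^{\infty} P(A_i),
\]

for pairwise disjoint A_1, A_2, ... ∈ F: disjointness means every ω ∈ ∪_i A_i contributes to exactly one inner sum.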

Week 2

1. Let X be a random variable. Show that Π(X) := {X^{-1}((−∞, x]) : x ∈ R} is a π-system and that it generates σ(X).

2. Let {Y_γ : γ ∈ C} be an arbitrary collection of random variables and {X_n : n ∈ N} a countable collection of random variables, all defined on the same probability space.
(a) Show that σ{Y_γ : γ ∈ C} = σ{Y_γ^{-1}(B) : γ ∈ C, B ∈ B}.
(b) Let H_n = σ{X_1, ..., X_n} (n ∈ N) and A = ∪_{n=1}^∞ H_n. Show that A is an algebra and that σ(A) = σ{X_n : n ∈ N}.

3. Show that X⁺ and X⁻ are measurable functions and that X⁺ is right-continuous and X⁻ is left-continuous (notation as in section 3.12).

4. Williams, exercise E4.1.

5. Williams, exercise E4.6.

6. Let F be a σ-algebra on Ω with the property that for all F ∈ F it holds that P(F) ∈ {0, 1}. Let X : Ω → R be F-measurable. Show that for some c ∈ R one has P(X = c) = 1. (Hint: P(X ≤ x) ∈ {0, 1} for all x.)

Week 3

1. Let X and Y be simple random variables. Show that E X does not depend on the chosen representation of X. Show also that E(X + Y) = E X + E Y.

2. Show that the expectation is a linear operator on L¹(Ω, F, P).

3. Let (Ω, F, P) = ([0, 1], B, Leb) and let f, f_1, f_2, ... be densities (nonnegative measurable functions that integrate to 1) on [0, 1]. Assume that f_n → f a.s. Show that for all x ∈ [0, 1] it holds that F_n(x) → F(x), where F_n(x) = ∫_{[0,x]} f_n(t) dt and F(x) = ∫_{[0,x]} f(t) dt.

4. (a) Show that for X ∈ L¹(Ω, F, P) it holds that |E X| ≤ E |X|.
(b) Prove the second part of Scheffé's lemma for random variables X and X_n (see page 55).

5. Let X ∈ L¹(Ω, F, P). Show that lim_{n→∞} n P(|X| > n) = 0.

Week 4

1. Prove the assertions (a)-(d) of section 5.14.

2. Prove the assertions (a)-(c) of section 6.5.

3. Prove lemma 6.12 (you find the standard machine in section 5.12).

4. Prove part (b) of Fubini's theorem in section 8.2.

5. Complete the proof of theorem 6.11: show the a.s. uniqueness of Y and show that if X − Y ⊥ Z for all Z ∈ K, then ‖X − Y‖_2 = inf{‖X − Y'‖_2 : Y' ∈ K}. (A sketch of the key identity follows below.)
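
For exercise 5 of Week 4, a minimal sketch of the identity behind the infimum, assuming K is a linear subspace of L²(Ω, F, P) containing Y (so that Y − Y' ∈ K for Y' ∈ K): if X − Y ⊥ Z for all Z ∈ K, then for any Y' ∈ K

\[
\|X - Y'\|_2^2
  \;=\; \|X - Y\|_2^2 + 2\,E\bigl[(X - Y)(Y - Y')\bigr] + \|Y - Y'\|_2^2
  \;=\; \|X - Y\|_2^2 + \|Y - Y'\|_2^2
  \;\ge\; \|X - Y\|_2^2,
\]

since the cross term vanishes by orthogonality. So the infimum is attained at Y' = Y.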

Week 5

1. Finish the proof of theorem 9.2: take an arbitrary X ∈ L¹(Ω, F, P) and show that the existence of the conditional expectation of X follows from the existence of the conditional expectations of X⁺ and X⁻.

2. Prove the conditional version of Fatou's lemma, property (f) on page 88.

3. Prove the conditional Dominated Convergence theorem, property (g) on page 88.

4. Exercise E9.1.

5. Exercise E9.2.

6. Let (X, Y) have a bivariate normal distribution with E X = μ_X, E Y = μ_Y, Var X = σ_X², Var Y = σ_Y² and Cov(X, Y) = c. Let X̂ = μ_X + (c/σ_Y²)(Y − μ_Y). Show that E[(X − X̂)Y] = 0. Show also (use a special property of the bivariate normal distribution) that E[(X − X̂)g(Y)] = 0 if g is a Borel-measurable function such that E g(Y)² < ∞. Conclude that X̂ is a version of E[X | Y].

Week 6

In all exercises below we consider a probability space (Ω, F, P) with a filtration (F_n).

1. Let X be an adapted process and T a stopping time that is finite. Show that X_T is F-measurable. Show also that for arbitrary stopping times T (so the value infinity is also allowed) the stopped process X^T is adapted.

2. Show that an adapted process X is a martingale iff E[X_{n+m} | F_n] = X_n for all n, m ≥ 0.

3. Read first the definition of a submartingale. Let f be a convex function on R. Suppose that X is a martingale and that E|f(X_n)| < ∞ for all n. Show that the process (f(X_n))_{n≥0} is a submartingale.

4. For every n we have a measurable function f_n on R^n. Let Z_1, Z_2, ... be independent random variables and F_n = σ(Z_1, ..., Z_n). Show (you may assume sufficient integrability) that X_n = f_n(Z_1, ..., Z_n) defines a martingale under the condition that E f_n(z_1, ..., z_{n−1}, Z_n) = f_{n−1}(z_1, ..., z_{n−1}) for every n. (See the sketch after this list.)

5. If S and T are stopping times, then S + T, S ∧ T and S ∨ T are stopping times as well. Show this.
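
For exercise 4 of Week 6, a minimal sketch of the conditioning step, using independence of Z_{n+1} from F_n (the "freezing" lemma; sufficient integrability assumed):

\[
E[X_{n+1} \mid F_n]
 \;=\; E\bigl[f_{n+1}(Z_1,\dots,Z_n,Z_{n+1}) \mid F_n\bigr]
 \;=\; h(Z_1,\dots,Z_n),
\qquad
h(z_1,\dots,z_n) := E\, f_{n+1}(z_1,\dots,z_n,Z_{n+1}),
\]

and the stated condition (applied with n + 1 in place of n) gives h = f_n, hence E[X_{n+1} | F_n] = f_n(Z_1, ..., Z_n) = X_n.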

Week 7

In all exercises below we consider a probability space (Ω, F, P) with a filtration (F_n).

1. (a) If X is a martingale and f a convex function such that E|f(X_n)| < ∞ for all n, then Y defined by Y_n = f(X_n) is a submartingale. Show this (keyword: Jensen).
(b) Show that Y is also a submartingale if X is a submartingale and f is in addition increasing.

2. Prove Corollaries (c) and (d) on page 101.

3. Show that the process C of section 11.1 is previsible.

4. Let X be a submartingale with sup_{n≥0} E|X_n| < ∞. Show that there exists a random variable X_∞ such that X_n → X_∞ a.s.

5. Show that for a supermartingale X the condition sup{E|X_n| : n ∈ N} < ∞ is equivalent to the condition sup{E X_n⁻ : n ∈ N} < ∞.

6. Consider the probability space (Ω, F, P) with Ω = [0, 1), F the Borel sets of [0, 1) and P the Lebesgue measure. Let I_k^n = [k 2^{−n}, (k + 1) 2^{−n}) for k = 0, ..., 2^n − 1 and let F_n be the σ-algebra generated by the I_k^n for k = 0, ..., 2^n − 1. Define X_n = 2^n 1_{I_0^n}. Show that (X_n) is a martingale and that the conditions of theorem 11.5 are satisfied. What is X_∞ in this case? Do we have X_n → X_∞ in L¹? (This has something to do with 11.6.)

Week 8

1. Let Y ∈ L¹, let (F_n) be a filtration and define for all n ∈ N the random variable X_n = E[Y | F_n]. We know that there is X_∞ such that X_n → X_∞ a.s. Show that for Y ∈ L² we have X_n → X_∞ in L². Find a condition such that X_∞ = Y. Give also an example in which P(X_∞ = Y) = 0.

2. Show that every finite collection in L¹ is uniformly integrable.

3. Williams, exercise E13.1.

4. Let C be a uniformly integrable collection of random variables.
(a) Consider C̄, the closure of C in L¹. Use E13.1 to show that C̄ is uniformly integrable as well.
(b) Let D be the convex hull of C. Show that both D and its closure in L¹ are uniformly integrable. (See the sketch below.)
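
For 4(b), a minimal sketch of one route, assuming the standard equivalent criterion (to be verified separately) that C is uniformly integrable iff sup_{X∈C} E(|X| − c)⁺ → 0 as c → ∞: for a convex combination Y = Σ_i λ_i X_i with λ_i ≥ 0, Σ_i λ_i = 1 and X_i ∈ C, since x ↦ (x − c)⁺ is increasing and convex,

\[
E(|Y| - c)^+ \;\le\; E\Bigl(\sum_i \lambda_i |X_i| - c\Bigr)^{\!+}
 \;\le\; \sum_i \lambda_i\, E(|X_i| - c)^+
 \;\le\; \sup_{X \in C} E(|X| - c)^+,
\]

so D satisfies the same criterion; its closure is then handled by part (a).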

5. In this exercise you prove (fill in the details) the following characterization: a collection C is uniformly integrable iff there exists a function G : R⁺ → R⁺ with lim_{t→∞} G(t)/t = ∞ such that M := sup{E G(|X|) : X ∈ C} < ∞. The necessity you prove as follows. Let ε > 0, choose a = M/ε and c such that G(t)/t ≥ a for all t > c. To prove uniform integrability of C you use that |X| ≤ G(|X|)/a on the set {|X| > c}. It is less easy to prove sufficiency. Proceed as follows. Suppose that we have a sequence (g_n) with g_0 = 0 and lim_{n→∞} g_n = ∞. Define g(t) = Σ_{n=0}^∞ 1_{[n,n+1)}(t) g_n and G(t) = ∫_0^t g(s) ds. Check that lim_{t→∞} G(t)/t = ∞. With a_n(X) = P(|X| > n), it holds that E G(|X|) ≤ Σ_{n=1}^∞ g_n a_n(X). Furthermore, for every k ∈ N we have ∫_{{|X| ≥ k}} |X| dP ≥ Σ_{m=k}^∞ a_m(X). Pick for every n a constant c_n ∈ N such that ∫_{{|X| ≥ c_n}} |X| dP ≤ 2^{−n} for all X ∈ C. Then Σ_{m=c_n}^∞ a_m(X) ≤ 2^{−n} and hence Σ_{n=1}^∞ Σ_{m=c_n}^∞ a_m(X) ≤ 1. Choose then the sequence (g_n) as the "inverse" of (c_n): g_n = #{k : c_k ≤ n}.

6. Prove that a collection C is uniformly integrable iff there exists an increasing and convex function G : R⁺ → R⁺ with lim_{t→∞} G(t)/t = ∞ such that M := sup{E G(|X|) : X ∈ C} < ∞. Let D be the closure of the convex hull of a uniformly integrable collection C in L¹. With the function G as above we have sup{E G(|X|) : X ∈ D} = M, hence D is uniformly integrable as well.

7. Let p ≥ 1 and let X, X_1, X_2, ... be random variables. Then X_n converges to X in L^p iff the following two conditions are satisfied:
(a) X_n → X in probability,
(b) the collection {|X_n|^p : n ∈ N} is uniformly integrable.

Week 9

1. Show that R, defined on page 142 of Williams, is equal to q E[X^{p−1} Y]. Show also that the hypothesis of lemma 14.10 is true for X ∧ n if it is true for X, and complete the proof of this lemma.

2. Exercise E14.2 of Williams.

3. Let X = (X_n)_{n≤0} be a (backward) supermartingale.
(a) Show the equivalence of the following two properties: (i) sup_n E|X_n| < ∞ and (ii) lim_{n→−∞} E X_n < ∞. (Use that x ↦ x⁺ is convex and increasing; see also the sketch below.)
(b) Under the condition sup_n E|X_n| =: A < ∞ the supermartingale X is uniformly integrable. To show this, you may proceed as follows (but other solutions are equally welcome). Let ε > 0 and choose K ∈ Z such that for all n < K one has 0 ≤ E X_n − E X_K < ε. It is then sufficient to show that (X_n)_{n≤K} is uniformly integrable. Let c > 0 be arbitrary and F_n = {|X_n| > c}. Using the supermartingale inequality you show that ∫_{F_n} |X_n| dP ≤ ∫_{F_n} |X_K| dP + ε. Because P(F_n) ≤ A/c you can conclude the proof.
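
For 3(a), a minimal sketch of the direction (ii) ⟹ (i), assuming the backward indexing (X_n)_{n≤0}: since −X is a backward submartingale and x ↦ x⁺ is convex and increasing, X_n⁻ = (−X_n)⁺ is again a backward submartingale, so E X_n⁻ ≤ E X_0⁻ for all n ≤ 0, and therefore

\[
E|X_n| \;=\; E X_n + 2\,E X_n^-
 \;\le\; \lim_{m \to -\infty} E X_m + 2\,E X_0^-
 \;<\; \infty,
\]

using that n ↦ E X_n is nondecreasing as n decreases. The converse direction follows from E X_n ≤ E|X_n| together with the same monotonicity.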

Week 10

1. Let μ, μ_1, μ_2, ... be probability measures on R. Show that μ_n ⇒ μ iff for all bounded Lipschitz continuous functions f one has ∫ f dμ_n → ∫ f dμ. (Hint: for one implication the proof of lemma 17.2 is instructive.)

2. Show the "if" part of lemma 17.2.

3. If the random variables X, X_1, X_2, ... are defined on the same probability space and if X_n → X in probability, then X_n ⇒ X. Prove this.

4. Suppose that X_n ⇒ X and that the collection {X_n : n ≥ 1} is uniformly integrable (you make a minor change in the definition of this notion if the X_n are defined on different probability spaces). Use the Skorohod representation to show that X_n ⇒ X then implies E X_n → E X.

5. Show the following variation on Fatou's lemma: if X_n ⇒ X, then E|X| ≤ lim inf_{n→∞} E|X_n|.

6. Show that the weak limit of a sequence of probability measures is unique.

Week 11

1. Consider the N(μ_n, σ_n²) distributions, where the μ_n are real numbers and the σ_n² nonnegative. Show that this family is tight iff the sequences (μ_n) and (σ_n²) are bounded. Under what condition do we have that the N(μ_n, σ_n²) distributions converge to a (weak) limit? What is this limit?

2. For each n we have a sequence ξ_{n1}, ..., ξ_{nk_n} of independent random variables with E ξ_{nj} = 0 and Σ_{j=1}^{k_n} Var ξ_{nj} = 1. If Σ_{j=1}^{k_n} E|ξ_{nj}|^{2+δ} → 0 as n → ∞ for some δ > 0, then Σ_{j=1}^{k_n} ξ_{nj} ⇒ N(0, 1). Show that this follows from the Lindeberg Central Limit Theorem.

3. The classical central limit theorem says that (1/(σ√n)) Σ_{j=1}^n (X_j − μ) ⇒ N(0, 1) if the X_j are iid with E X_j = μ and 0 < Var X_j = σ² < ∞. Show that this follows from the Lindeberg Central Limit Theorem. (See the sketch below.)

4. Show that X_n ⇒ X iff E f(X_n) → E f(X) for all bounded uniformly continuous functions f.
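
For exercise 3, a minimal sketch of the Lindeberg condition for the triangular array ξ_{nj} := (X_j − μ)/(σ√n): for every ε > 0,

\[
\sum_{j=1}^{n} E\bigl[\xi_{nj}^2\, 1_{\{|\xi_{nj}| > \varepsilon\}}\bigr]
 \;=\; \frac{1}{\sigma^2}\, E\bigl[(X_1 - \mu)^2\, 1_{\{|X_1 - \mu| > \varepsilon\sigma\sqrt{n}\}}\bigr]
 \;\longrightarrow\; 0 \qquad (n \to \infty),
\]

by dominated convergence, since (X_1 − μ)² is integrable; the identical distribution collapses the sum into a single term.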

5. Let X and Y be independent and assume that Y has a N(0, 1) distribution. Let σ > 0. Let φ be the characteristic function of X: φ(u) = E exp(iuX).
(a) Show that Z = X + σY has density p(z) = (1/(σ√(2π))) E exp(−(z − X)²/(2σ²)).
(b) Show that p(z) = (1/(2πσ)) ∫ φ(−y/σ) exp(iyz/σ − y²/2) dy.

6. Let X, X_1, X_2, ... be a sequence of random variables and Y a N(0, 1)-distributed random variable independent of that sequence. Let φ_n be the characteristic function of X_n and φ that of X. Let p_n be the density of X_n + σY and p the density of X + σY.
(a) If φ_n → φ pointwise, then p_n → p pointwise. Invoke the previous exercise and the dominated convergence theorem to show this.
(b) Let f ∈ C_b(R) be bounded by B. Show that |E f(X_n + σY) − E f(X + σY)| ≤ 2B ∫ (p(z) − p_n(z))⁺ dz.
(c) Show that E f(X_n + σY) − E f(X + σY) → 0 if φ_n → φ pointwise.
(d) Prove the following theorem: X_n ⇒ X iff φ_n → φ pointwise.

Week 12

1. Consider the sequence of tents (X^n), where X_t^n = nt for t ∈ [0, 1/(2n)], X_t^n = 1 − nt for t ∈ [1/(2n), 1/n], and zero elsewhere (there is no randomness here). Show that all finite dimensional distributions of the X^n converge, but X^n does not converge in distribution.

2. Show that ρ as in (1.1) defines a metric.

3. Suppose that the ξ_i of section 4 are iid normally distributed random variables. Use Doob's inequality to obtain P(max_{j≤n} |S_j| > γ) ≤ 3γ^{−4} n². (See the sketch below.)

4. Show that a finite dimensional projection on C[0, ∞) (with the metric ρ) is continuous.

5. Consider C[0, ∞) with the Borel σ-algebra B induced by ρ and some probability space (Ω, F, P). If X : (Ω, F) → (C[0, ∞), B) is measurable, then all maps ω ↦ X_t(ω) are random variables. Show this, as well as its converse. For the latter you need separability, which allows you to say that the Borel σ-algebra B is a product σ-algebra (see also Williams, page 82).

6. Prove proposition 2.2 of the lecture notes.
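
For exercise 3, a minimal sketch, assuming the ξ_i are standard normal (mean 0, variance 1), so that S_n = ξ_1 + ... + ξ_n is a martingale and (S_j⁴)_{j≤n} a nonnegative submartingale; Doob's maximal inequality then gives

\[
P\Bigl(\max_{j \le n} |S_j| > \gamma\Bigr)
 \;=\; P\Bigl(\max_{j \le n} S_j^4 > \gamma^4\Bigr)
 \;\le\; \frac{E\, S_n^4}{\gamma^4}
 \;=\; \frac{3 n^2}{\gamma^4},
\]

where E S_n⁴ = 3n² because S_n ~ N(0, n) and the fourth moment of N(0, n) equals 3n².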