Independent random variables


CHAPTER 2

Independent random variables

2.1. Product measures

Definition 2.1. Let µ_i be measures on (Ω_i, F_i), 1 ≤ i ≤ n. Let F = F_1 ⊗ ... ⊗ F_n be the sigma algebra of subsets of Ω := Ω_1 × ... × Ω_n generated by all rectangles A_1 × ... × A_n with A_i ∈ F_i. Then, the measure µ on (Ω, F) such that µ(A_1 × ... × A_n) = ∏_{i=1}^n µ_i(A_i) whenever A_i ∈ F_i is called a product measure and denoted µ = µ_1 ⊗ ... ⊗ µ_n.

The existence of product measures follows along the lines of the Carathéodory construction, starting with the π-system of rectangles. We skip the details; in the cases that we ever use, we shall show existence by a much neater method in Proposition 2.8. Uniqueness of the product measure follows from the π-λ theorem, because rectangles form a π-system that generates the σ-algebra F_1 ⊗ ... ⊗ F_n.

Example 2.2. Let B_d, m_d denote the Borel sigma algebra and Lebesgue measure on R^d. Then, B_d = B_1 ⊗ ... ⊗ B_1 and m_d = m_1 ⊗ ... ⊗ m_1. The first statement is clear (in fact B_{d+d'} = B_d ⊗ B_{d'}). Regarding m_d: by definition, it is the unique measure for which m_d(A_1 × ... × A_d) = ∏_{i=1}^d m_1(A_i) for all intervals A_i. To show that it is the d-fold product of m_1, we must show that the same holds for arbitrary Borel sets A_i. Fix intervals A_2, ..., A_d and let S := {A_1 ∈ B_1 : m_d(A_1 × ... × A_d) = ∏_{i=1}^d m_1(A_i)}. Then, S contains all intervals (in particular the π-system of semi-closed intervals), and by properties of measures it is easy to check that S is a λ-system. By the π-λ theorem, we get S ⊇ B_1, and thus m_d(A_1 × ... × A_d) = ∏_{i=1}^d m_1(A_i) for all A_1 ∈ B_1 and any intervals A_2, ..., A_d. Continuing the same argument coordinate by coordinate, we get m_d(A_1 × ... × A_d) = ∏_{i=1}^d m_1(A_i) for all A_i ∈ B_1.

The product measure property is defined in terms of sets. As always, it may be written for measurable functions, and we then get the following theorem.

Theorem 2.3 (Fubini's theorem). Let µ = µ_1 ⊗ µ_2 be a product measure on Ω = Ω_1 × Ω_2 with the product σ-algebra. If f : Ω → R is either a non-negative r.v. or integrable w.r.t. µ, then:
(1) For every x ∈ Ω_1, the function y ↦ f(x, y) is F_2-measurable, and the function x ↦ ∫_{Ω_2} f(x, y) dµ_2(y) is F_1-measurable. The same holds with x and y interchanged.
(2) ∫_Ω f(z) dµ(z) = ∫_{Ω_1} ( ∫_{Ω_2} f(x, y) dµ_2(y) ) dµ_1(x) = ∫_{Ω_2} ( ∫_{Ω_1} f(x, y) dµ_1(x) ) dµ_2(y).

PROOF. Skipped. Attend measure theory class.

Needless to say (self: then why am I saying this?), all this goes through for finite products of σ-finite measures.
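In the simplest setting, where both spaces are finite, the content of Fubini's theorem is elementary: the integral against the product measure is a double sum, and summing in either order gives the same answer. The following is a toy sketch; the spaces, weights, and the function f are invented purely for illustration.

```python
# Fubini's theorem on two finite measure spaces: the integral of f against
# the product measure mu1 x mu2 equals either iterated integral.
# All weights and the function f below are made up for illustration.

mu1 = {0: 0.5, 1: 1.5}                  # a (non-probability) measure on Omega_1
mu2 = {'a': 2.0, 'b': 0.25, 'c': 1.0}   # a measure on Omega_2

def f(x, y):
    return x + len(y)                    # any non-negative function on the product

# Integral against the product measure: sum over all rectangles {(x, y)}.
product_integral = sum(mu1[x] * mu2[y] * f(x, y) for x in mu1 for y in mu2)

# Iterated integrals, in both orders.
iter_12 = sum(mu1[x] * sum(mu2[y] * f(x, y) for y in mu2) for x in mu1)
iter_21 = sum(mu2[y] * sum(mu1[x] * f(x, y) for x in mu1) for y in mu2)

assert abs(product_integral - iter_12) < 1e-12
assert abs(product_integral - iter_21) < 1e-12
```

For non-negative or integrable f on general σ-finite spaces, the theorem asserts exactly this interchange, with sums replaced by integrals.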

Infinite product measures: Given (Ω_i, F_i, µ_i), i = 1, 2, ..., let Ω := Ω_1 × Ω_2 × ... and let F be the sigma algebra generated by all finite-dimensional cylinders A_1 × ... × A_n × Ω_{n+1} × Ω_{n+2} × ... with A_i ∈ F_i. Does there exist a product measure µ on F? For concreteness, take all (Ω_i, F_i, µ_i) = (R, B, ν). What measure should the product measure µ give to the set A × R × R × ...? If ν(R) > 1, it is only reasonable to set µ(A × R × R × ...) to infinity, and if ν(R) < 1, it is reasonable to set it to 0. But then all cylinders would have zero measure or infinite measure! If ν(R) = 1, at least this problem does not arise. We shall show that it is indeed possible to make sense of infinite products of probability measures. Thus, the only case in which we can talk reasonably about infinite products of measures is that of probability measures.

2.2. Independence

Definition 2.4. Let (Ω, F, P) be a probability space. Let G_1, ..., G_k be sub-sigma algebras of F. We say that the G_i are independent if for every A_1 ∈ G_1, ..., A_k ∈ G_k, we have P(A_1 ∩ A_2 ∩ ... ∩ A_k) = P(A_1) ... P(A_k). Random variables X_1, ..., X_n on F are said to be independent if σ(X_1), ..., σ(X_n) are independent. This is equivalent to saying that P(X_i ∈ A_i for all i ≤ n) = ∏_{i=1}^n P(X_i ∈ A_i) for any A_i ∈ B(R). Events A_1, ..., A_k are said to be independent if 1_{A_1}, ..., 1_{A_k} are independent. This is equivalent to saying that P(A_{j_1} ∩ ... ∩ A_{j_l}) = P(A_{j_1}) ... P(A_{j_l}) for any 1 ≤ j_1 < j_2 < ... < j_l ≤ k. In all these cases, an infinite number of objects (sigma algebras or random variables or events) are said to be independent if every finite number of them are independent.

Some remarks are in order.

(1) As usual, to check independence, it would be convenient if we needed to check the condition in the definition only for a sufficiently large class of sets. However, if G_i = σ(S_i) and for every A_1 ∈ S_1, ..., A_k ∈ S_k we have P(A_1 ∩ A_2 ∩ ... ∩ A_k) = P(A_1) ... P(A_k), we cannot conclude that the G_i are independent! If the S_i are π-systems, the conclusion does hold (see below).
(2) Checking pairwise independence is insufficient to guarantee independence. For example, suppose X_1, X_2, X_3 are independent with P(X_i = +1) = P(X_i = −1) = 1/2. Let Y_1 = X_2 X_3, Y_2 = X_1 X_3 and Y_3 = X_1 X_2. Then, the Y_i are pairwise independent but not independent.

Lemma 2.5. If the S_i are π-systems, G_i = σ(S_i), and for every A_1 ∈ S_1, ..., A_k ∈ S_k we have P(A_1 ∩ A_2 ∩ ... ∩ A_k) = P(A_1) ... P(A_k), then the G_i are independent.

PROOF. Fix A_2 ∈ S_2, ..., A_k ∈ S_k and set F_1 := {B ∈ G_1 : P(B ∩ A_2 ∩ ... ∩ A_k) = P(B) P(A_2) ... P(A_k)}. Then F_1 ⊇ S_1 by assumption, and it is easy to check that F_1 is a λ-system. By the π-λ theorem, it follows that F_1 ⊇ G_1, and we get the assumptions of the lemma for G_1, S_2, ..., S_k. Repeating the argument for S_2, S_3, etc., we get independence of G_1, ..., G_k.

Corollary 2.6.
(1) Random variables X_1, ..., X_k are independent if and only if P(X_1 ≤ t_1, ..., X_k ≤ t_k) = ∏_{j=1}^k P(X_j ≤ t_j) for all t_1, ..., t_k ∈ R.
(2) Suppose G_α, α ∈ I, are independent. Let I_1, ..., I_k be pairwise disjoint subsets of I. Then, the σ-algebras F_j = σ( ∪_{α ∈ I_j} G_α ) are independent.

(3) If X_{i,j}, i ≤ n, j ≤ n_i, are independent, then for any Borel measurable f_i : R^{n_i} → R, the r.v.s f_i(X_{i,1}, ..., X_{i,n_i}) are also independent.

PROOF. (1) The sets (−∞, t] form a π-system that generates B(R). (2) For j ≤ k, let S_j be the collection of finite intersections of sets A_i, i ∈ I_j. Then the S_j are π-systems and σ(S_j) = F_j. (3) Follows from (2) by considering G_{i,j} := σ(X_{i,j}) and observing that f_i(X_{i,1}, ..., X_{i,n_i}) is measurable w.r.t. σ(G_{i,1} ∪ ... ∪ G_{i,n_i}).

So far, we stated conditions for independence in terms of probabilities of events. As usual, they generalize to conditions in terms of expectations of random variables.

Lemma 2.7.
(1) Sigma algebras G_1, ..., G_k are independent if and only if for all bounded G_i-measurable functions X_i, 1 ≤ i ≤ k, we have E[X_1 ... X_k] = ∏_{i=1}^k E[X_i].
(2) In particular, random variables Z_1, ..., Z_k (Z_i being an n_i-dimensional random vector) are independent if and only if E[ ∏_{i=1}^k f_i(Z_i) ] = ∏_{i=1}^k E[f_i(Z_i)] for all bounded Borel measurable functions f_i : R^{n_i} → R.

We say bounded measurable just to ensure that the expectations exist. The proof goes inductively, by fixing X_2, ..., X_k and then letting X_1 be a simple r.v., then a non-negative r.v., and then a general bounded measurable r.v.

PROOF. (1) Suppose the G_i are independent. If the X_i are G_i-measurable, then it is clear that the X_i are independent, and hence P ∘ (X_1, ..., X_k)^{−1} = PX_1^{−1} ⊗ ... ⊗ PX_k^{−1}. Denote µ_i := PX_i^{−1} and apply the change of variables formula and Fubini's theorem to get

E[X_1 ... X_k] = ∫_{R^k} x_1 ... x_k d(µ_1 ⊗ ... ⊗ µ_k)(x_1, ..., x_k)   (change of variables)
= ∫_R ... ∫_R x_1 ... x_k dµ_1(x_1) ... dµ_k(x_k)   (Fubini)
= ∏_{i=1}^k ∫_R u dµ_i(u) = ∏_{i=1}^k E[X_i]   (change of variables).

Conversely, if E[X_1 ... X_k] = ∏_{i=1}^k E[X_i] for all G_i-measurable functions X_i, then applying this to indicators of events A_i ∈ G_i, we see the independence of the σ-algebras G_i.
(2) The second claim follows from the first by setting G_i := σ(Z_i) and observing that a random variable X_i is σ(Z_i)-measurable if and only if X_i = f ∘ Z_i for some Borel measurable f : R^{n_i} → R.

2.3. Independent sequences of random variables

First we make the observation that product measures and independence are closely related concepts:

An observation: The independence of random variables X_1, ..., X_k is precisely the same as saying that P ∘ X^{−1} is the product measure PX_1^{−1} ⊗ ... ⊗ PX_k^{−1}, where X = (X_1, ..., X_k).
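The pairwise-but-not-jointly-independent example from Remark (2) above can be checked by brute-force enumeration over the eight equally likely sign patterns of (X_1, X_2, X_3). This is a small sketch; the helper `prob` is just counting on the finite sample space.

```python
from itertools import product

# X1, X2, X3: independent signs, each +1 or -1 with probability 1/2.
# Y1 = X2*X3, Y2 = X1*X3, Y3 = X1*X2 are pairwise independent but
# not independent (indeed Y1*Y2*Y3 = 1 always).
outcomes = list(product([1, -1], repeat=3))   # 8 equally likely points

def prob(event):
    """Probability of an event on the uniform 8-point sample space."""
    return sum(1 for w in outcomes if event(w)) / len(outcomes)

Y = [lambda w: w[1] * w[2],   # Y1
     lambda w: w[0] * w[2],   # Y2
     lambda w: w[0] * w[1]]   # Y3

# Each Yi is a fair sign, and each PAIR satisfies the product rule:
for i in range(3):
    assert prob(lambda w: Y[i](w) == 1) == 0.5
for i in range(3):
    for j in range(i + 1, 3):
        assert prob(lambda w: Y[i](w) == 1 and Y[j](w) == 1) == 0.25

# But the TRIPLE does not: P(Y1=Y2=Y3=1) = 1/4, not (1/2)^3 = 1/8.
p_all = prob(lambda w: all(Yk(w) == 1 for Yk in Y))
assert p_all == 0.25
```

The failure is exactly the constraint Y_1 Y_2 Y_3 = X_1² X_2² X_3² = 1: any two of the Y_i determine the third.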

Consider the following questions. Henceforth, we write R^∞ for the countable product space R × R × ... and B(R^∞) for the cylinder σ-algebra, generated by all finite-dimensional cylinders A_1 × ... × A_n × R × R × ... with A_i ∈ B(R). This notation is justified, because the cylinder σ-algebra is also the Borel σ-algebra on R^∞ with the product topology.

Question 1: Given µ_i ∈ P(R), i ≥ 1, does there exist a probability space with independent random variables X_i having distributions µ_i?

Question 2: Given µ_i ∈ P(R), i ≥ 1, does there exist a p.m. µ on (R^∞, B(R^∞)) such that µ(A_1 × ... × A_n × R × R × ...) = ∏_{i=1}^n µ_i(A_i)?

Observation: The above two questions are equivalent. For, suppose we answer the first question by finding an (Ω, F, P) with independent random variables X_i : Ω → R such that X_i ∼ µ_i for all i. Then, X : Ω → R^∞ defined by X(ω) = (X_1(ω), X_2(ω), ...) is measurable w.r.t. the relevant σ-algebras (why?). Then, let µ := P ∘ X^{−1} be the pushforward p.m. on R^∞. Clearly,

µ(A_1 × ... × A_n × R × R × ...) = P(X_1 ∈ A_1, ..., X_n ∈ A_n) = ∏_{i=1}^n P(X_i ∈ A_i) = ∏_{i=1}^n µ_i(A_i).

Thus µ is the product measure required by the second question. Conversely, if we could construct the product measure on (R^∞, B(R^∞)), then we could take Ω = R^∞, F = B(R^∞) and X_i to be the i-th co-ordinate random variable. Then you may check that they satisfy the requirements of the first question.

The two questions are thus equivalent, but what is the answer?! It is yes, of course, or we would not make heavy weather about it.

Proposition 2.8 (Daniell). Let µ_i ∈ P(R), i ≥ 1, be Borel p.m.s on R. Then, there exists a probability space with independent random variables X_1, X_2, ... such that X_i ∼ µ_i.

PROOF. We arrive at the construction in three stages.

(1) Independent Bernoullis: Consider ([0,1], B, m) and the random variables X_k : [0,1] → R, where X_k(ω) is defined to be the k-th digit in the binary expansion of ω. For definiteness, we may always take the infinite binary expansion. Then, by an earlier homework exercise, X_1, X_2, ... are independent Bernoulli(1/2) random variables.

(2) Independent uniforms: Note that as a consequence, on any probability space, if Y_i are i.i.d. Ber(1/2) variables, then U := ∑_{n=1}^∞ 2^{−n} Y_n has uniform distribution on [0,1]. Consider again the canonical probability space and the r.v.s X_i, and set U_1 := X_1/2 + X_3/2² + X_5/2³ + ..., U_2 := X_2/2 + X_6/2² + X_{10}/2³ + ..., etc., using pairwise disjoint subsequences of the X_i. Clearly, the U_i are i.i.d. U[0,1].

(3) Arbitrary distributions: For a p.m. µ, recall the left-continuous inverse G_µ of the distribution function, which has the property that G_µ(U) ∼ µ if U ∼ U[0,1]. Suppose we are given p.m.s µ_1, µ_2, .... On the canonical probability space, let U_i be i.i.d. uniforms constructed as before. Define X_i := G_{µ_i}(U_i). Then, the X_i are independent and X_i ∼ µ_i.

Thus we have constructed an independent sequence of random variables having the specified distributions.

Sometimes in books one finds the construction of uncountable product measures too. It has no use. But a very natural question at this point is to go beyond independence. We just state the following theorem, which generalizes the previous proposition.
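The three stages of the proof translate directly into a simulation recipe. Below is a minimal sketch under simplifying assumptions: digits are truncated at a finite depth, only two uniforms are extracted (via the odd/even digit split, a simpler partition than the one in the text), and Exp(1) stands in for the target distributions µ_i, whose inverse G_µ(u) = −log(1 − u) is explicit. All function names are hypothetical.

```python
import math
import random

def bits(omega, n):
    """First n binary digits of omega in [0, 1) -- stage (1): i.i.d. Ber(1/2)."""
    out = []
    for _ in range(n):
        omega *= 2
        d = int(omega)
        out.append(d)
        omega -= d
    return out

def uniform_from_bits(bs):
    """Reassemble digits b_1, b_2, ... into sum b_k / 2^k in [0, 1)."""
    return sum(b / 2 ** (k + 1) for k, b in enumerate(bs))

random.seed(0)
omega = random.random()            # one sample point of ([0,1], B, m)
digits = bits(omega, 40)           # X_1, ..., X_40: i.i.d. Bernoulli(1/2)

# Stage (2): disjoint subsequences of the digits give independent uniforms.
U1 = uniform_from_bits(digits[0::2])   # built from X_1, X_3, X_5, ...
U2 = uniform_from_bits(digits[1::2])   # built from X_2, X_4, X_6, ...

# Stage (3): inverse CDF. For Exp(1), G_mu(u) = -log(1 - u).
X1 = -math.log(1 - U1)             # ~ Exp(1)
X2 = -math.log(1 - U2)             # ~ Exp(1), independent of X1

assert 0 <= U1 < 1 and 0 <= U2 < 1
assert X1 >= 0 and X2 >= 0
```

With infinitely many digits and a partition of the digit positions into countably many infinite subsequences, the same recipe yields the full independent sequence X_1, X_2, ... of the proposition.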

Theorem 2.9 (Kolmogorov's existence theorem). For each n ≥ 1 and each 1 ≤ i_1 < i_2 < ... < i_n, let µ_{i_1,...,i_n} be a Borel p.m. on R^n. Then there exists a unique probability measure µ on (R^∞, B(R^∞)) such that µ({x ∈ R^∞ : x_{i_1} ∈ A_1, ..., x_{i_n} ∈ A_n}) = µ_{i_1,...,i_n}(A_1 × ... × A_n) for all n ≥ 1 and all A_i ∈ B(R), if and only if the given family of probability measures satisfies the consistency condition

µ_{i_1,...,i_n}(A_1 × ... × A_{n−1} × R) = µ_{i_1,...,i_{n−1}}(A_1 × ... × A_{n−1})

for any A_k ∈ B(R), any 1 ≤ i_1 < i_2 < ... < i_n and any n ≥ 1.
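For discrete finite-dimensional distributions, the consistency condition is simply "summing out the last coordinate recovers the lower-dimensional pmf". The following toy check illustrates this on the two-point space {0, 1}; the numerical weights are invented purely for illustration.

```python
# Kolmogorov consistency for a family of pmfs on {0, 1}: mu2 restricted
# to its first coordinate (i.e. mu2(A x R)) must agree with mu1(A).
# All weights below are invented for illustration.

mu1 = {(0,): 0.3, (1,): 0.7}

# Build a consistent mu2 from conditional weights given the first coordinate.
cond = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.2, 1: 0.8}}
mu2 = {(a, b): mu1[(a,)] * cond[a][b] for a in (0, 1) for b in (0, 1)}

# mu2 is a probability measure...
assert abs(sum(mu2.values()) - 1.0) < 1e-12

# ...and satisfies the consistency condition: summing out the second
# coordinate of mu2 recovers mu1 exactly.
for a in (0, 1):
    marginal = sum(mu2[(a, b)] for b in (0, 1))
    assert abs(marginal - mu1[(a,)]) < 1e-12
```

Any family built this way (fixing marginals and conditionals level by level) is automatically consistent, which is why Markov chains and, more generally, stochastic processes specified by finite-dimensional distributions fall under the theorem.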