Convergence of Random Variables

Saravanan Vijayakumaran
sarva@ee.iitb.ac.in
Department of Electrical Engineering
Indian Institute of Technology Bombay

March 19, 2014
Motivation

Theorem (Weak Law of Large Numbers)
Let X_1, X_2, ... be a sequence of independent identically distributed random variables with finite mean µ. Their partial sums S_n = X_1 + X_2 + ··· + X_n satisfy

    S_n / n →P µ as n → ∞.

Theorem (Central Limit Theorem)
Let X_1, X_2, ... be a sequence of independent identically distributed random variables with finite mean µ and finite non-zero variance σ². Their partial sums S_n = X_1 + X_2 + ··· + X_n satisfy

    √n ( S_n / n − µ ) →D N(0, σ²) as n → ∞.
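As an illustrative sketch (not from the slides), the weak law can be checked numerically. Here iid Bernoulli(0.5) variables are an assumed choice, as are the sample sizes; the sample means S_n/n should cluster near µ = 0.5.

```python
import random
import statistics

# Illustrative sketch (not from the slides): Monte Carlo check of the WLLN
# for iid Bernoulli(0.5) variables, whose mean is mu = 0.5. The Bernoulli
# choice and the sample sizes are assumptions made for this example.
random.seed(0)

def sample_mean(n):
    """Return S_n / n for n iid Bernoulli(0.5) draws."""
    return sum(random.random() < 0.5 for _ in range(n)) / n

# The sample means cluster tightly around mu as n grows.
means = [sample_mean(10_000) for _ in range(50)]
print(abs(statistics.mean(means) - 0.5) < 0.01)
```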
Modes of Convergence

A sequence of real numbers {x_n : n = 1, 2, ...} is said to converge to a limit x if for all ε > 0 there exists an m_ε ∈ N such that |x_n − x| < ε for all n ≥ m_ε.

We want to define convergence of random variables, but they are functions from Ω to R.

The solution
- Derive real number sequences from sequences of random variables
- Define convergence of the latter in terms of the former

Four ways of defining convergence for random variables
- Convergence almost surely
- Convergence in rth mean
- Convergence in probability
- Convergence in distribution
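The ε-definition above can be checked mechanically for a concrete sequence. This sketch (not from the slides) uses x_n = 1/n with limit 0; the witness m_ε = ⌊1/ε⌋ + 1 is one valid choice, assumed for the illustration.

```python
# Sketch (not from the slides): checking the epsilon definition of convergence
# for the real sequence x_n = 1/n with limit x = 0. The witness m_eps =
# floor(1/eps) + 1 is one valid choice, an assumption for this illustration.
def m_eps(eps):
    """An index m such that |1/n - 0| < eps for all n >= m."""
    return int(1 / eps) + 1

for eps in (0.5, 0.1, 0.003):
    m = m_eps(eps)
    # Check the defining property on a window of indices past m.
    assert all(abs(1 / n - 0) < eps for n in range(m, m + 1000))
print("epsilon-definition verified for sample eps values")
```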
Convergence Almost Surely

Let X, X_1, X_2, ... be random variables on a probability space (Ω, F, P). For each ω ∈ Ω, X(ω) and X_n(ω) are reals.

X_n → X almost surely if {ω ∈ Ω : X_n(ω) → X(ω) as n → ∞} is an event whose probability is 1. X_n → X almost surely is abbreviated as X_n →a.s. X.

Example
Let Ω = [0, 1] and P be the uniform distribution on Ω: P(ω ∈ [a, b]) = b − a for 0 ≤ a ≤ b ≤ 1. Let X_n be defined as

    X_n(ω) = n for ω ∈ [0, 1/n),  X_n(ω) = 0 for ω ∈ [1/n, 1].

Let X(ω) = 0 for all ω ∈ [0, 1]. Then X_n →a.s. X.
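A quick sketch (not from the slides) of why the example converges almost surely: for any fixed ω > 0, X_n(ω) = 0 as soon as n > 1/ω, so the sequence X_n(ω) is eventually 0 for every ω except ω = 0, a set of probability zero. The sample point below is an assumed value for illustration.

```python
# Sketch (not from the slides): the slide's example. For any fixed omega > 0,
# X_n(omega) = 0 once n > 1/omega, so X_n(omega) -> 0 for every omega except
# omega = 0, which has probability 0 under the uniform distribution.
def X(n, omega):
    """X_n evaluated at a sample point omega in [0, 1]."""
    return n if omega < 1 / n else 0

omega = 0.01                                   # a fixed sample point
tail = [X(n, omega) for n in range(101, 200)]  # indices n > 1/omega = 100
print(all(v == 0 for v in tail))               # the sequence is eventually 0
```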
Convergence in rth Mean

Let X, X_1, X_2, ... be random variables on a probability space (Ω, F, P). Suppose E[|X|^r] < ∞ and E[|X_n|^r] < ∞ for all n.

X_n → X in rth mean if E(|X_n − X|^r) → 0 as n → ∞, where r ≥ 1. X_n → X in rth mean is abbreviated as X_n →r X.
- For r = 1, X_n →1 X is written as X_n → X in mean.
- For r = 2, X_n →2 X is written as X_n → X in mean square or X_n →m.s. X.

Example
Let Ω = [0, 1] and P be the uniform distribution on Ω. Let X_n be defined as

    X_n(ω) = n for ω ∈ [0, 1/n),  X_n(ω) = 0 for ω ∈ [1/n, 1].

Let X(ω) = 0 for all ω ∈ [0, 1]. Then E[|X_n − X|] = n · P(ω ∈ [0, 1/n)) = n · (1/n) = 1 for every n, so X_n does not converge in mean to X.
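The failure of convergence in mean can be seen numerically. This sketch (not from the slides) estimates E|X_n − X| by Monte Carlo; the trial count and choice of n are assumptions for the illustration.

```python
import random

# Sketch (not from the slides): Monte Carlo estimate of E|X_n - X| for the
# slide's example. Exactly, E|X_n| = n * P([0, 1/n)) = 1 for every n, so the
# expectation never shrinks even though X_n -> 0 almost surely. The trial
# count and choice of n are assumptions for this illustration.
random.seed(0)

def mean_abs(n, trials=200_000):
    """Monte Carlo estimate of E|X_n| under uniform omega on [0, 1]."""
    total = 0
    for _ in range(trials):
        omega = random.random()
        total += n if omega < 1 / n else 0
    return total / trials

est = mean_abs(50)
print(abs(est - 1.0) < 0.1)  # the estimate stays near 1, not near 0
```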
Convergence in Probability

Let X, X_1, X_2, ... be random variables on a probability space (Ω, F, P).

X_n → X in probability if P(|X_n − X| > ε) → 0 as n → ∞ for all ε > 0. X_n → X in probability is abbreviated as X_n →P X.

Example
Let Ω = [0, 1] and P be the uniform distribution on Ω. Let X_n be defined as

    X_n(ω) = n for ω ∈ [0, 1/n),  X_n(ω) = 0 for ω ∈ [1/n, 1].

Let X(ω) = 0 for all ω ∈ [0, 1]. For ε > 0,

    P[|X_n − X| > ε] = P[|X_n| > ε] ≤ P[X_n = n] = 1/n → 0,

so X_n →P X.
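For this example the exceedance probability can be computed exactly rather than simulated. A sketch (not from the slides), with the value of ε an assumption:

```python
# Sketch (not from the slides): exact computation of P(|X_n - X| > eps) for
# the slide's example. For 0 < eps < n the event {|X_n| > eps} is exactly
# {X_n = n} = [0, 1/n), which has probability 1/n under the uniform measure.
def prob_exceeds(n, eps):
    """P(|X_n| > eps) for the example, computed exactly."""
    return 1 / n if eps < n else 0.0

probs = [prob_exceeds(n, eps=0.5) for n in (10, 100, 1000)]
print(probs)  # [0.1, 0.01, 0.001]: tends to 0, so X_n -> 0 in probability
```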
Convergence in Distribution

Let X, X_1, X_2, ... be random variables on a probability space (Ω, F, P).

X_n → X in distribution if P(X_n ≤ x) → P(X ≤ x) as n → ∞ for all points x where F_X(x) = P(X ≤ x) is continuous. X_n → X in distribution is abbreviated as X_n →D X. Convergence in distribution is also termed weak convergence.

Example
Let X be a Bernoulli RV taking values 0 and 1 with equal probability 1/2. Let X_1, X_2, X_3, ... be identical random variables given by X_n = X for all n. The X_n's are not independent, but X_n →D X.

Let Y = 1 − X. Then X_n →D Y. But |X_n − Y| = 1, so the X_n's do not converge to Y in any other mode.
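The Bernoulli example can be checked empirically: Y = 1 − X has the same distribution as X, so the CDFs agree, yet every sample point sits at distance 1. A sketch (not from the slides); the sample size is an assumption.

```python
import random

# Sketch (not from the slides): the slide's Bernoulli example, checked
# empirically. Y = 1 - X has the same distribution as X, so X_n = X converges
# to Y in distribution, yet |X_n - Y| = 1 on every sample point. The sample
# size is an assumption for this illustration.
random.seed(0)

xs = [random.randint(0, 1) for _ in range(100_000)]  # draws of X (and X_n)
ys = [1 - x for x in xs]                             # corresponding Y = 1 - X

p_x = sum(xs) / len(xs)  # empirical P(X = 1)
p_y = sum(ys) / len(ys)  # empirical P(Y = 1)

# Same distribution (both Bernoulli(1/2)), but always at distance 1.
print(abs(p_x - p_y) < 0.02, all(abs(x - y) == 1 for x, y in zip(xs, ys)))
```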
Relations between Modes of Convergence

Theorem
For any r ≥ 1,

    (X_n →a.s. X) ⟹ (X_n →P X)
    (X_n →r X) ⟹ (X_n →P X)
    (X_n →P X) ⟹ (X_n →D X)
Convergence in Probability Implies Convergence in Distribution

Suppose X_n →P X. Let F_n(x) = P(X_n ≤ x) and F(x) = P(X ≤ x). If ε > 0,

    F_n(x) = P(X_n ≤ x)
           = P(X_n ≤ x, X ≤ x + ε) + P(X_n ≤ x, X > x + ε)
           ≤ F(x + ε) + P(|X_n − X| > ε)

    F(x − ε) = P(X ≤ x − ε)
             = P(X ≤ x − ε, X_n ≤ x) + P(X ≤ x − ε, X_n > x)
             ≤ F_n(x) + P(|X_n − X| > ε)

Combining the above inequalities, we have

    F(x − ε) − P(|X_n − X| > ε) ≤ F_n(x) ≤ F(x + ε) + P(|X_n − X| > ε).

If F is continuous at x, F(x − ε) → F(x) and F(x + ε) → F(x) as ε → 0. Since X_n →P X, P(|X_n − X| > ε) → 0 as n → ∞. Hence F_n(x) → F(x) at every continuity point x of F.
Convergence in rth Mean Implies Convergence in Probability

If r > s ≥ 1 and X_n →r X, then X_n →s X.
- Lyapunov's inequality: if r > s > 0, then (E[|Y|^s])^(1/s) ≤ (E[|Y|^r])^(1/r).
- If X_n →r X, then E[|X_n − X|^r] → 0 and (E[|X_n − X|^s])^(1/s) ≤ (E[|X_n − X|^r])^(1/r) → 0.

If X_n →1 X, then X_n →P X.
- By Markov's inequality, we have for all ε > 0

      P(|X_n − X| > ε) ≤ E(|X_n − X|) / ε → 0.
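The Markov bound in the last step can be checked numerically. This sketch (not from the slides) takes Z = X_n − X = X_n from the running example; the values of n, ε, and the trial count are assumptions.

```python
import random

# Sketch (not from the slides): a numeric check of the Markov bound used
# above, P(|Z| > eps) <= E|Z| / eps, with Z = X_n - X = X_n from the running
# example. The values of n, eps, and the trial count are assumptions.
random.seed(0)

n, eps, trials = 20, 2.0, 100_000
exceed = 0    # count of trials with |Z| > eps
total = 0.0   # running sum for estimating E|Z|
for _ in range(trials):
    z = n if random.random() < 1 / n else 0  # Z takes value n w.p. 1/n
    exceed += z > eps
    total += z

p_hat = exceed / trials         # empirical P(|Z| > eps), about 1/20
bound = (total / trials) / eps  # empirical E|Z| / eps, about 1/2
print(p_hat <= bound)           # the Markov bound holds
```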
Convergence Almost Surely Implies Convergence in Probability

Let A_n(ε) = {|X_n − X| > ε} and B_m(ε) = ∪_{n≥m} A_n(ε).

Claim: X_n →a.s. X if and only if P(B_m(ε)) → 0 as m → ∞, for all ε > 0.
- Let C = {ω ∈ Ω : X_n(ω) → X(ω) as n → ∞} and

      A(ε) = {ω ∈ Ω : ω ∈ A_n(ε) for infinitely many values of n} = ∩_m ∪_{n≥m} A_n(ε).

- X_n(ω) → X(ω) if and only if ω ∉ A(ε) for all ε > 0.
- P(C) = 1 if and only if P(A(ε)) = 0 for all ε > 0.
- B_m(ε) is a decreasing sequence of events with limit A(ε), so P(A(ε)) = 0 if and only if P(B_m(ε)) → 0 as m → ∞.

Since A_n(ε) ⊆ B_n(ε), we have P(|X_n − X| > ε) = P(A_n(ε)) → 0 whenever P(B_n(ε)) → 0. Thus X_n →a.s. X ⟹ X_n →P X.
Some Converses

If X_n →D c, where c is a constant, then X_n →P c.
- P(|X_n − c| > ε) = P(X_n < c − ε) + P(X_n > c + ε) → 0 if X_n →D c.

If P_n(ε) = P(|X_n − X| > ε) satisfies Σ_n P_n(ε) < ∞ for all ε > 0, then X_n →a.s. X.
- Let A_n(ε) = {|X_n − X| > ε} and B_m(ε) = ∪_{n≥m} A_n(ε). Then

      P(B_m(ε)) ≤ Σ_{n≥m} P(A_n(ε)) = Σ_{n≥m} P_n(ε) → 0 as m → ∞.

- X_n →a.s. X if and only if P(B_m(ε)) → 0 as m → ∞, for all ε > 0.
Borel-Cantelli Lemmas

Let A_1, A_2, ... be an infinite sequence of events from (Ω, F, P). Consider the event that infinitely many of the A_n occur:

    A = {A_n i.o.} = ∩_n ∪_{m≥n} A_m.

Theorem
Let A be the event that infinitely many of the A_n occur. Then
- P(A) = 0 if Σ_n P(A_n) < ∞,
- P(A) = 1 if Σ_n P(A_n) = ∞ and A_1, A_2, A_3, ... are independent events.

Proof of first lemma.
We have A ⊆ ∪_{m≥n} A_m for all n, so

    P(A) ≤ P(∪_{m≥n} A_m) ≤ Σ_{m≥n} P(A_m) → 0 as n → ∞.
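The first lemma can be illustrated by simulation. This sketch (not from the slides) uses independent events A_n with P(A_n) = 1/n², whose probabilities are summable; on every simulated path the occurrences die out early. The horizon and path count are assumptions.

```python
import random

# Sketch (not from the slides): the first Borel-Cantelli lemma. Independent
# events A_n with P(A_n) = 1/n^2 have a summable probability sum, so with
# probability 1 only finitely many occur. The horizon and path count are
# assumptions for this illustration.
random.seed(0)

def last_occurrence(horizon=10_000):
    """Index of the last A_n that occurs along one simulated path."""
    last = 0
    for n in range(1, horizon + 1):
        if random.random() < 1 / n**2:
            last = n
    return last

lasts = [last_occurrence() for _ in range(20)]
# Occurrences die out: the last one sits far below the horizon on each path.
print(all(L < 10_000 for L in lasts))
```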
Proof of Second Borel-Cantelli Lemma

Using the independence of the A_m and the inequality 1 − x ≤ exp(−x),

    P(∩_{m≥n} A_m^c) = lim_{r→∞} P(∩_{m=n}^{r} A_m^c)
                     = lim_{r→∞} ∏_{m=n}^{r} [1 − P(A_m)]
                     ≤ ∏_{m≥n} exp[−P(A_m)]
                     = exp(−Σ_{m≥n} P(A_m)) = 0.

Thus

    P(A^c) = P(∪_n ∩_{m≥n} A_m^c) ≤ Σ_n P(∩_{m≥n} A_m^c) = 0.
Reference

Chapter 7, Probability and Random Processes, Grimmett and Stirzaker, Third Edition, 2001.