Lecture 8: Characteristic Functions
Course: Theory of Probability I. Term: Fall 2013. Instructor: Gordan Zitkovic

First properties

A characteristic function is simply the Fourier transform, in probabilistic language. Since we will be integrating complex-valued functions, we define (both integrals on the right need to exist)

    ∫ f dµ = ∫ ℜf dµ + i ∫ ℑf dµ,

where ℜf and ℑf denote the real and the imaginary part of a function f : ℝ → ℂ. The reader will easily figure out which properties of the integral transfer from the real case.

Definition 8.1. The characteristic function of a probability measure µ on B(ℝ) is the function ϕ_µ : ℝ → ℂ given by

    ϕ_µ(t) = ∫ e^{itx} µ(dx).

When we speak of the characteristic function ϕ_X of a random variable X, we have the characteristic function ϕ_{µ_X} of its distribution µ_X in mind. Note, moreover, that ϕ_X(t) = E[e^{itX}].

While difficult to visualize, characteristic functions can be used to learn a lot about the random variables they correspond to. We start with some properties which follow directly from the definition:

Proposition 8.2. Let X, Y and {X_n}_{n ∈ ℕ} be random variables.

1. ϕ_X(0) = 1 and |ϕ_X(t)| ≤ 1, for all t.
2. ϕ_X(-t) = \overline{ϕ_X(t)}, where the bar denotes complex conjugation.
3. ϕ_X is uniformly continuous.
4. If X and Y are independent, then ϕ_{X+Y} = ϕ_X ϕ_Y.

Last Updated: November 10, 2013
5. For all t_1 < t_2 < ... < t_n, the matrix A = (a_{jk})_{1 ≤ j,k ≤ n} given by

    a_{jk} = ϕ_X(t_j - t_k),

is Hermitian and positive semi-definite, i.e., A* = A and \overline{ξ}^T A ξ ≥ 0, for any ξ ∈ ℂ^n.

Note: We do not prove (or use) it in these notes, but it can be shown that a function ϕ : ℝ → ℂ, continuous at the origin with ϕ(0) = 1, is the characteristic function of some probability measure µ on B(ℝ) if and only if it is positive semi-definite, i.e., if it satisfies part 5. of Proposition 8.2. This is known as Bochner's theorem.

6. If X_n →D X, then ϕ_{X_n}(t) → ϕ_X(t), for each t ∈ ℝ.

Proof.
1. Immediate.
2. \overline{e^{itx}} = e^{-itx}.
3. We have

    |ϕ_X(t) - ϕ_X(s)| = |∫ (e^{itx} - e^{isx}) µ(dx)| ≤ h(t - s), where h(u) = ∫ |e^{iux} - 1| µ(dx).

Since |e^{iux} - 1| ≤ 2, the dominated convergence theorem implies that lim_{u→0} h(u) = 0, and, so, ϕ_X is uniformly continuous.

4. Independence of X and Y implies the independence of exp(itX) and exp(itY). Therefore,

    ϕ_{X+Y}(t) = E[e^{it(X+Y)}] = E[e^{itX} e^{itY}] = E[e^{itX}] E[e^{itY}] = ϕ_X(t) ϕ_Y(t).

5. The matrix A is Hermitian by (2). To see that it is positive semi-definite, note that a_{jk} = E[e^{it_j X} e^{-it_k X}], and so

    ∑_{j=1}^n ∑_{k=1}^n ξ_j \overline{ξ_k} a_{jk} = E[ (∑_{j=1}^n ξ_j e^{it_j X}) (\overline{∑_{k=1}^n ξ_k e^{it_k X}}) ] = E[ |∑_{j=1}^n ξ_j e^{it_j X}|^2 ] ≥ 0.

6. For f ∈ C_b(ℝ), we have f(X_n) → f(X), a.s. (after passing to a representation in which X_n → X almost surely), and so, by the dominated convergence theorem applied to the cases f(x) = cos(tx) and f(x) = sin(tx), we have

    ϕ_X(t) = E[exp(itX)] = E[lim_n exp(itX_n)] = lim_n E[exp(itX_n)] = lim_n ϕ_{X_n}(t).

Here is a simple problem you can use to test your understanding of the definitions:

Problem 8.1. Let µ and ν be two probability measures on B(ℝ), and let ϕ_µ and ϕ_ν be their characteristic functions. Show that Parseval's identity holds:

    ∫ e^{-its} ϕ_µ(t) ν(dt) = ∫ ϕ_ν(t - s) µ(dt), for all s ∈ ℝ.
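Both the defining formula and Proposition 8.2 lend themselves to a quick numerical sanity check. The sketch below is our illustration, not part of the notes: it estimates ϕ_X(t) = E[e^{itX}] by Monte Carlo for X ∼ N(0,1), compares the estimate with the closed form exp(-t²/2) (listed later in Example 8.6), and checks properties 1, 2 and 4 up to sampling error.

```python
import cmath
import math
import random

def cf(samples, t):
    # Monte Carlo estimate of the characteristic function E[exp(itX)]
    return sum(cmath.exp(1j * t * x) for x in samples) / len(samples)

random.seed(0)
xs = [random.gauss(0.0, 1.0) for _ in range(100_000)]  # X ~ N(0,1)
ys = [random.gauss(0.0, 1.0) for _ in range(100_000)]  # Y ~ N(0,1), independent of X

for t in (0.0, 0.5, 1.0, 2.0):
    est = cf(xs, t)
    # phi(t) = exp(-t^2/2) for the standard normal
    assert abs(est - math.exp(-t * t / 2)) < 0.02
    # property 1: |phi(t)| <= 1 (and phi(0) = 1)
    assert abs(est) <= 1.0 + 1e-12
    # property 2: phi(-t) is the complex conjugate of phi(t)
    assert abs(cf(xs, -t) - est.conjugate()) < 1e-9
    # property 4: phi_{X+Y} = phi_X * phi_Y for independent X and Y
    assert abs(cf([x + y for x, y in zip(xs, ys)], t) - est * cf(ys, t)) < 0.05
```

The same check works for any distribution whose characteristic function is known in closed form, e.g., any row of the table in Example 8.6.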
Our next result shows that µ can be recovered from its characteristic function ϕ_µ:

Theorem 8.3 (Inversion theorem). Let µ be a probability measure on B(ℝ), and let ϕ = ϕ_µ be its characteristic function. Then, for a < b, we have

    µ((a,b)) + (1/2) µ({a,b}) = lim_{T→∞} (1/2π) ∫_{-T}^{T} (e^{-ita} - e^{-itb})/(it) ϕ(t) dt.    (8.1)

Proof. We start by picking a < b and noting that

    (e^{-ita} - e^{-itb})/(it) = ∫_a^b e^{-ity} dy,

so that, by Fubini's theorem, the integral in (8.1) is well-defined:

    F(a,b,T) = (1/2π) ∫_{[-T,T]} ∫_{[a,b]} exp(-ity) ϕ(t) dy dt, where F(a,b,T) = (1/2π) ∫_{-T}^{T} (e^{-ita} - e^{-itb})/(it) ϕ(t) dt.

Another use of Fubini's theorem yields:

    F(a,b,T) = (1/2π) ∫ ∫_{[-T,T]} ∫_{[a,b]} exp(-ity) exp(itx) dy dt µ(dx)
             = (1/2π) ∫ ( ∫_{[-T,T]} ∫_{[a,b]} exp(-it(y - x)) dy dt ) µ(dx)
             = (1/2π) ∫ ( ∫_{[-T,T]} (1/(it)) (e^{-it(a-x)} - e^{-it(b-x)}) dt ) µ(dx).

Set

    f(a,b,T;x) = ∫_{-T}^{T} (1/(it)) (e^{-it(a-x)} - e^{-it(b-x)}) dt and K(T;c) = ∫_0^T (sin(ct)/t) dt,

and note that, since cos is an even and sin an odd function, we have

    f(a,b,T;x) = 2 ∫_0^T ( sin((b-x)t)/t - sin((a-x)t)/t ) dt = 2 K(T; b-x) - 2 K(T; a-x).

Note: The integral ∫_{-T}^{T} (1/(it)) exp(-it(a-x)) dt is not defined; we really need to work with the full f(a,b,T;x) to get the right cancellation.

Since ∫_0^T (sin(ct)/(ct)) d(ct) = ∫_0^{cT} (sin(s)/s) ds for c > 0, we have

    K(T;c) = { K(cT; 1), c > 0;  0, c = 0;  -K(-cT; 1), c < 0 },    (8.2)

and Problem 4 implies that

    lim_{T→∞} K(T;c) = { π/2, c > 0;  0, c = 0;  -π/2, c < 0 },
and so

    lim_{T→∞} f(a,b,T;x) = { 0, x ∈ [a,b]^c;  π, x = a or x = b;  2π, a < x < b }.

Observe first that the function T ↦ K(T;1) is continuous on [0,∞) and has a finite limit as T → ∞, so that sup_{T ≥ 0} |K(T;1)| < ∞. Furthermore, (8.2) implies that |K(T;c)| ≤ sup_{T ≥ 0} |K(T;1)| for any c and T ≥ 0, so that

    sup{ |f(a,b,T;x)| : x ∈ ℝ, T ≥ 0 } < ∞.

Therefore, we can use the dominated convergence theorem to get that

    lim_T F(a,b,T) = lim_T (1/2π) ∫ f(a,b,T;x) µ(dx) = (1/2π) ∫ lim_T f(a,b,T;x) µ(dx) = (1/2) µ({a}) + µ((a,b)) + (1/2) µ({b}).

Corollary 8.4. For probability measures µ_1 and µ_2 on B(ℝ), the equality ϕ_{µ_1} = ϕ_{µ_2} implies that µ_1 = µ_2.

Proof. By Theorem 8.3, we have µ_1((a,b)) = µ_2((a,b)) for all a, b ∈ C, where C is the set of all x ∈ ℝ such that µ_1({x}) = µ_2({x}) = 0. Since C^c is at most countable, it is straightforward to see that the family {(a,b) : a, b ∈ C} of intervals is a π-system which generates B(ℝ).

Corollary 8.5. Suppose that ∫ |ϕ_µ(t)| dt < ∞. Then µ ≪ λ and dµ/dλ is a bounded and continuous function given by dµ/dλ = f, where

    f(x) = (1/2π) ∫ e^{-itx} ϕ_µ(t) dt, for x ∈ ℝ.

Proof. Since ϕ_µ is integrable and |e^{-itx}| = 1, f is well defined. For a < b we have

    ∫_a^b f(x) dx = (1/2π) ∫_a^b ( ∫ e^{-itx} ϕ_µ(t) dt ) dx
                  = (1/2π) ∫ ϕ_µ(t) ( ∫_a^b e^{-itx} dx ) dt
                  = (1/2π) ∫ (e^{-ita} - e^{-itb})/(it) ϕ(t) dt
                  = lim_{T→∞} (1/2π) ∫_{-T}^{T} (e^{-ita} - e^{-itb})/(it) ϕ(t) dt
                  = µ((a,b)) + (1/2) µ({a,b}),    (8.2)

by Theorem 8.3, where the use of Fubini's theorem above is justified by the fact that the function (t,x) ↦ e^{-itx} ϕ_µ(t) is integrable on ℝ × [a,b],
for all a < b. For a, b such that µ({a}) = µ({b}) = 0, the equation (8.2) implies that µ((a,b)) = ∫_a^b f(x) dx. The claim now follows by the π-λ theorem.

Example 8.6. Here is a list of some common distributions and the corresponding characteristic functions:

1. Continuous distributions.

       Name                 | Parameters     | Density f_X(x)                          | Ch. function ϕ_X(t)
    1  Uniform              | a < b          | (1/(b-a)) 1_{[a,b]}(x)                  | (e^{itb} - e^{ita}) / (it(b-a))
    2  Normal               | µ ∈ ℝ, σ > 0   | (1/(σ√(2π))) exp(-(x-µ)²/(2σ²))         | exp(iµt - σ²t²/2)
    3  Exponential          | λ > 0          | λ exp(-λx) 1_{[0,∞)}(x)                 | λ/(λ - it)
    4  Double exponential   | λ > 0          | (λ/2) exp(-λ|x|)                        | λ²/(λ² + t²)
    5  Cauchy               | µ ∈ ℝ, γ > 0   | γ / (π(γ² + (x-µ)²))                    | exp(iµt - γ|t|)

2. Discrete distributions.

       Name       | Parameters  | p_n = P[X = n], n ∈ ℤ            | Ch. function ϕ_X(t)
    6  Dirac      | m ∈ ℕ_0     | p_n = 1_{{n=m}}                  | exp(itm)
    7  Coin-toss  | p ∈ (0,1)   | p_1 = p, p_{-1} = 1 - p          | cos(t) + i(2p - 1) sin(t)
    8  Geometric  | p ∈ (0,1)   | p_n = p^n (1 - p), n ∈ ℕ_0       | (1 - p)/(1 - p e^{it})
    9  Poisson    | λ > 0       | p_n = e^{-λ} λ^n / n!, n ∈ ℕ_0   | exp(λ(e^{it} - 1))

3. A singular distribution.

        Name    | Ch. function ϕ_X(t)
    10  Cantor  | e^{it/2} ∏_{k=1}^∞ cos(t/3^k)

Tail behavior

We continue by describing several methods one can use to extract useful information about the tails of the underlying probability distribution from a characteristic function.

Proposition 8.7. Let X be a random variable. If E[|X|^n] < ∞, then (d^n/dt^n) ϕ_X(t) exists for all t and

    (d^n/dt^n) ϕ_X(t) = E[e^{itX} (iX)^n].

In particular, E[X^n] = (-i)^n (d^n/dt^n) ϕ_X(0).
Proof. We give the proof in the case n = 1 and leave the general case to the reader:

    lim_{h→0} (ϕ(h) - ϕ(0))/h = lim_{h→0} ∫ (e^{ihx} - 1)/h µ(dx) = ∫ lim_{h→0} (e^{ihx} - 1)/h µ(dx) = ∫ ix µ(dx),

where the passage of the limit under the integral sign is justified by the dominated convergence theorem which, in turn, can be used since |(e^{ihx} - 1)/h| ≤ |x|, and ∫ |x| µ(dx) = E[|X|] < ∞.

Remark 8.8.

1. It can be shown that for n even, the existence of (d^n/dt^n) ϕ_X(0) (in the appropriate sense) implies the finiteness of the n-th moment E[|X|^n].

2. When n is odd, it can happen that (d^n/dt^n) ϕ_X(0) exists, but E[|X|^n] = ∞; see Problem 8.6.

Finer estimates of the tails of a probability distribution can be obtained by a finer analysis of the behavior of ϕ around 0:

Proposition 8.9. Let µ be a probability measure on B(ℝ) and let ϕ = ϕ_µ be its characteristic function. Then, for ε > 0 we have

    µ([-2/ε, 2/ε]^c) ≤ (1/ε) ∫_{-ε}^{ε} (1 - ϕ(t)) dt.

Proof. Let X be a random variable with distribution µ. We start by using Fubini's theorem to get

    (1/(2ε)) ∫_{-ε}^{ε} (1 - ϕ(t)) dt = (1/(2ε)) E[ ∫_{-ε}^{ε} (1 - e^{itX}) dt ] = (1/ε) E[ ∫_0^ε (1 - cos(tX)) dt ] = E[ 1 - sin(εX)/(εX) ].

It remains to observe that 1 - sin(x)/x ≥ 0 and 1 - sin(x)/x ≥ 1 - 1/|x|, for all x. Therefore, if we use the first inequality on [-2,2] and the second one on [-2,2]^c, we get

    1 - sin(x)/x ≥ (1/2) 1_{{|x| > 2}},

so that

    (1/(2ε)) ∫_{-ε}^{ε} (1 - ϕ(t)) dt ≥ (1/2) P[|εX| > 2] = (1/2) µ([-2/ε, 2/ε]^c).

Problem 8.2. Use the inequality of Proposition 8.9 to show that if ϕ(t) = 1 + O(|t|^α) for some α > 0, then ∫ |x|^β µ(dx) < ∞, for all β < α. Give an example where ∫ |x|^α µ(dx) = ∞.

Note: f(t) = g(t) + O(h(t)) means that, for some δ > 0, we have sup_{|t| ≤ δ} |f(t) - g(t)| / |h(t)| < ∞.

Problem 8.3 (Riemann-Lebesgue theorem). Suppose that µ ≪ λ. Show that

    lim_{t→∞} ϕ_µ(t) = lim_{t→-∞} ϕ_µ(t) = 0.

Hint: Use (and prove) the fact that f ∈ L¹₊(ℝ) can be approximated in L¹(ℝ) by a function of the form ∑_{k=1}^n α_k 1_{[a_k, b_k]}.
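As a sanity check of Proposition 8.9, one can take the standard Cauchy distribution, for which both sides of the inequality are available in closed form: ϕ(t) = e^{-|t|} (row 5 of Example 8.6 with µ = 0, γ = 1), and µ([-2/ε, 2/ε]^c) = 1 - (2/π) arctan(2/ε). The sketch below is our addition, not part of the notes; the helper names lhs and rhs are ours.

```python
import math

def lhs(eps):
    # mu([-2/eps, 2/eps]^c) for the standard Cauchy: P[|X| > 2/eps]
    return 1.0 - (2.0 / math.pi) * math.atan(2.0 / eps)

def rhs(eps):
    # (1/eps) * integral over [-eps, eps] of (1 - e^{-|t|}) dt, in closed form
    return (2.0 / eps) * (eps - (1.0 - math.exp(-eps)))

for eps in (0.1, 0.5, 1.0, 2.0):
    assert lhs(eps) <= rhs(eps)  # the tail bound of Proposition 8.9
```

Note how the bound is far from tight here: for ε = 1 the left-hand side is roughly 0.3 while the right-hand side is roughly 0.74; the value of the inequality lies in how both sides scale as ε → 0.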
The continuity theorem

Theorem 8.10 (Continuity theorem). Let {µ_n}_{n ∈ ℕ} be a sequence of probability distributions on B(ℝ), and let {ϕ_n}_{n ∈ ℕ} be the sequence of their characteristic functions. Suppose that there exists a function ϕ : ℝ → ℂ such that

1. ϕ_n(t) → ϕ(t), for all t ∈ ℝ, and
2. ϕ is continuous at t = 0.

Then, ϕ is the characteristic function of a probability measure µ on B(ℝ) and µ_n →w µ.

Proof. We start by showing that the continuity of the limit ϕ implies tightness of {µ_n}_{n ∈ ℕ}. Given ε > 0 there exists δ > 0 such that |1 - ϕ(t)| ≤ ε/2 for |t| ≤ δ. By the dominated convergence theorem we have

    lim sup_n µ_n([-2/δ, 2/δ]^c) ≤ lim sup_n (1/δ) ∫_{-δ}^{δ} (1 - ϕ_n(t)) dt = (1/δ) ∫_{-δ}^{δ} (1 - ϕ(t)) dt ≤ ε.

By taking an even smaller δ > 0, we can guarantee that

    sup_{n ∈ ℕ} µ_n([-2/δ, 2/δ]^c) ≤ ε,

which, together with the arbitrariness of ε > 0, implies that {µ_n}_{n ∈ ℕ} is tight.

Let {µ_{n_k}}_{k ∈ ℕ} be a convergent subsequence of {µ_n}_{n ∈ ℕ}, and let µ be its limit. Since ϕ_{n_k} → ϕ, we conclude that ϕ is the characteristic function of µ. It remains to show that the whole sequence converges to µ weakly. This follows, however, directly from Problem 7.4, since any convergent subsequence {µ_{n_k}}_{k ∈ ℕ} has the same limit µ.

Problem 8.4. Let ϕ be a characteristic function of some probability measure µ on B(ℝ). Show that ϕ̂(t) = e^{ϕ(t) - 1} is also a characteristic function of some probability measure µ̂ on B(ℝ).

Additional Problems

Problem 8.5 (Atoms from the characteristic function). Let µ be a probability measure on B(ℝ), and let ϕ = ϕ_µ be its characteristic function.

1. Show that

    µ({a}) = lim_{T→∞} (1/(2T)) ∫_{-T}^{T} e^{-ita} ϕ(t) dt.

2. Show that if lim_{t→∞} ϕ(t) = lim_{t→-∞} ϕ(t) = 0, then µ has no atoms.
3. Show that the converse of (2) is false. Hint: Prove that ϕ(t_n) = 1 along a suitably chosen sequence t_n → ∞, where ϕ is the characteristic function of the Cantor distribution.

Problem 8.6 (Existence of ϕ′_X(0) does not imply that X ∈ L¹). Let X be a random variable which takes values in ℤ \ {-2, -1, 0, 1, 2} with

    P[X = k] = P[X = -k] = C / (k² log(k)), for k = 3, 4, ...,

where C = (1/2) ( ∑_{k ≥ 3} 1/(k² log(k)) )^{-1} ∈ (0, ∞). Show that ϕ′_X(0) = 0, but X ∉ L¹.

Hint: Argue that, in order to establish that ϕ′_X(0) = 0, it is enough to show that

    lim_{h→0} (1/h) ∑_{k ≥ 3} (1 - cos(hk)) / (k² log(k)) = 0.

Then split the sum at k close to 2/h and use (and prove) the inequality 1 - cos(x) ≤ min(x²/2, |x|). Bounding sums by integrals may help, too.

Problem 8.7 (Multivariate characteristic functions). Let X = (X_1, ..., X_n) be a random vector. The characteristic function ϕ = ϕ_X : ℝ^n → ℂ is given by

    ϕ(t_1, t_2, ..., t_n) = E[exp(i ∑_{k=1}^n t_k X_k)].

We will also use the shortcut t for (t_1, ..., t_n) and t · X for the random variable ∑_{k=1}^n t_k X_k.

Note: Take for granted the following statement (the proof of which is similar to the proof of the 1-dimensional case): Suppose that X_1 and X_2 are random vectors with ϕ_{X_1}(t) = ϕ_{X_2}(t) for all t ∈ ℝ^n. Then X_1 and X_2 have the same distribution, i.e., µ_{X_1} = µ_{X_2}.

Prove the following statements:

1. Random variables X and Y are independent if and only if ϕ_{(X,Y)}(t_1, t_2) = ϕ_X(t_1) ϕ_Y(t_2) for all t_1, t_2 ∈ ℝ.

2. Random vectors X_1 and X_2 have the same distribution if and only if the random variables t · X_1 and t · X_2 have the same distribution for all t ∈ ℝ^n. (This fact is known as Wald's device.)

An n-dimensional random vector X is said to be Gaussian (or, to have the multivariate normal distribution) if there exists a vector µ ∈ ℝ^n and a symmetric positive semi-definite matrix Σ ∈ ℝ^{n×n} such that

    ϕ_X(t) = exp(i t · µ - (1/2) t^τ Σ t),

where t is interpreted as a column vector, and (·)^τ is transposition. This is denoted as X ∼ N(µ, Σ). X is said to be non-degenerate if Σ is positive definite.
3. Show that a random vector X is Gaussian if and only if the random variable t · X is normally distributed (with some mean and variance) for each t ∈ ℝ^n. Note: Be careful, nothing in the second statement tells you what the mean and variance of t · X are.

4. Let X = (X_1, X_2, ..., X_n) be a Gaussian random vector. Show that X_k and X_l, k ≠ l, are independent if and only if they are uncorrelated.
5. Construct a random vector (X, Y) such that both X and Y are normally distributed, but that X = (X, Y) is not Gaussian.

6. Let X = (X_1, X_2, ..., X_n) be a random vector consisting of n independent random variables with X_i ∼ N(0, 1). Let Σ ∈ ℝ^{n×n} be a given positive semi-definite symmetric matrix, and µ ∈ ℝ^n a given vector. Show that there exists an affine transformation T : ℝ^n → ℝ^n such that the random vector T(X) is Gaussian with T(X) ∼ N(µ, Σ).

7. Find a necessary and sufficient condition on µ and Σ such that the converse of the previous problem holds true: For a Gaussian random vector X ∼ N(µ, Σ), there exists an affine transformation T : ℝ^n → ℝ^n such that T(X) has independent components with the N(0,1)-distribution (i.e., T(X) ∼ N(0, I), where I is the identity matrix).

Problem 8.8 (Slutsky's Theorem). Let X, Y, {X_n}_{n ∈ ℕ} and {Y_n}_{n ∈ ℕ} be random variables defined on the same probability space, such that

    X_n →D X and Y_n →D Y.    (8.3)

Show that:

1. It is not necessarily true that X_n + Y_n →D X + Y. For that matter, we do not necessarily have (X_n, Y_n) →D (X, Y) (where the pairs are considered as random elements in the metric space ℝ²).

2. If, in addition to (8.3), there exists a constant c ∈ ℝ such that P[Y = c] = 1, show that g(X_n, Y_n) →D g(X, c), for any continuous function g : ℝ² → ℝ. Hint: It is enough to show that (X_n, Y_n) →D (X, c). Use Problem 8.7.

Problem 8.9 (Convergence of normal sequences).

1. Let {X_n}_{n ∈ ℕ} be a sequence of normally-distributed random variables converging weakly towards a random variable X. Show that X must be a normal random variable itself. Hint: Use this fact: for a sequence {µ_n}_{n ∈ ℕ} of real numbers, the following two statements are equivalent: (a) µ_n → µ, and (b) exp(itµ_n) → exp(itµ), for all t ∈ ℝ. You don't need to prove it, but feel free to try.

2. Let X_n be a sequence of normal random variables such that X_n → X, a.s. Show that X_n → X in L^p, for all p ≥ 1.
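As a closing numerical aside, here is a sketch of the construction asked for in Problem 8.7(6) in the 2-dimensional case: factor Σ = AA^τ (a Cholesky factorization) and set T(x) = µ + Ax. The particular Σ, µ and the names A, T below are our illustrative choices, not part of the problem; the sample mean and covariance of T(X) are then compared with µ and Σ up to Monte Carlo error.

```python
import math
import random

# Illustrative choices (ours): a positive-definite Sigma and a mean vector mu
Sigma = [[2.0, 0.6], [0.6, 1.0]]
mu = [1.0, -2.0]

# Cholesky factorization Sigma = A A^T, written out for the 2x2 case
a = math.sqrt(Sigma[0][0])
b = Sigma[1][0] / a
c = math.sqrt(Sigma[1][1] - b * b)
A = [[a, 0.0], [b, c]]

# T(X) = mu + A X, applied to samples of X with independent N(0,1) components
random.seed(1)
N = 200_000
ys = []
for _ in range(N):
    z1, z2 = random.gauss(0.0, 1.0), random.gauss(0.0, 1.0)
    ys.append((mu[0] + A[0][0] * z1, mu[1] + A[1][0] * z1 + A[1][1] * z2))

m0 = sum(y[0] for y in ys) / N
m1 = sum(y[1] for y in ys) / N
cov01 = sum((y[0] - m0) * (y[1] - m1) for y in ys) / N

assert abs(m0 - mu[0]) < 0.02 and abs(m1 - mu[1]) < 0.02  # sample mean close to mu
assert abs(cov01 - Sigma[0][1]) < 0.03                    # sample covariance close to Sigma_01
```

The same recipe works in any dimension, with the Cholesky factor computed by the standard recursion; Problem 8.7(7) then asks when this map can be inverted.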