STAT/MATH 395 A - PROBABILITY II
UW Spring Quarter
Néhémy Lim

HW3 : Moment functions - Solutions

Problem 1. Let X be a real-valued random variable on a probability space (Ω, A, P) with moment generating function M_X.

(a) Show that for any constants a, b ∈ R,

    M_{aX+b}(t) = e^{bt} M_X(at).

For a given t in the domain of M_X, we have:

    M_{aX+b}(t) = E[e^{t(aX+b)}] = e^{tb} E[e^{atX}] = e^{tb} M_X(at).

(b) Application. Let X follow a normal distribution N(µ, σ²). Compute the moment generating function of Y = aX + b and conclude that Y follows a normal distribution N(aµ + b, a²σ²).

For a given t ∈ R, we have:

    M_Y(t) = M_{aX+b}(t) = e^{tb} M_X(at) = e^{tb} exp{µ(at) + σ²(at)²/2} = exp{(aµ + b)t + a²σ²t²/2}.

We recognize the moment generating function of a normal distribution N(aµ + b, a²σ²). By virtue of the uniqueness property of moment generating functions, this proves that Y = aX + b follows a normal distribution N(aµ + b, a²σ²).

Problem 2. Let X be a continuous random variable with probability density function:

    f_X(x) = x e^{-x} 1_{[0,∞)}(x).
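Before solving Problem 2, the algebra of Problem 1(b) can be sanity-checked numerically. The sketch below (Python; the values of µ, σ, a, b are arbitrary illustrative choices, not part of the problem) verifies that e^{bt} M_X(at) coincides with the MGF of N(aµ + b, a²σ²):

```python
import math

def normal_mgf(t, mu, sigma):
    """MGF of N(mu, sigma^2): exp(mu*t + sigma^2 * t^2 / 2)."""
    return math.exp(mu * t + sigma ** 2 * t ** 2 / 2)

mu, sigma, a, b = 1.5, 2.0, 3.0, -0.7   # arbitrary illustrative values
for t in (-1.0, -0.3, 0.0, 0.4, 1.2):
    lhs = math.exp(b * t) * normal_mgf(a * t, mu, sigma)   # e^{bt} M_X(at)
    rhs = normal_mgf(t, a * mu + b, abs(a) * sigma)        # MGF of N(a*mu + b, a^2*sigma^2)
    assert abs(lhs - rhs) <= 1e-9 * max(1.0, rhs)
```

The two expressions agree identically in t, which is exactly the content of the uniqueness argument above.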
(a) Show that M_X(t), the moment generating function of X, is given by:

    M_X(t) = 1/(1-t)², for t < 1.

We have:

    M_X(t) = E[e^{tX}] = ∫_0^∞ e^{tx} · x e^{-x} dx = ∫_0^∞ x e^{-(1-t)x} dx.

For t < 1, the change of variable u = (1-t)x, du = (1-t) dx gives:

    M_X(t) = (1/(1-t)²) ∫_0^∞ u e^{-u} du = 1/(1-t)²,

since ∫_0^∞ u e^{-u} du = 1 (f_X is a pdf).

(b) Compute E[X] using M_X(t).

Let us differentiate M_X(t): for t < 1,

    M_X'(t) = 2/(1-t)³.

Thus,

    E[X] = M_X'(0) = 2/(1-0)³ = 2.

(c) Compute Var(X) using M_X(t).

The second derivative M_X''(t) is given by: for t < 1,

    M_X''(t) = 6/(1-t)⁴.

Therefore, the second moment of X is:

    E[X²] = M_X''(0) = 6/(1-0)⁴ = 6.

Finally,

    Var(X) = M_X''(0) - (M_X'(0))² = 6 - 4 = 2.
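The closed form and both moments above can be double-checked by numerical integration. A minimal sketch (Python; the cutoff 60 and the grid size are ad-hoc choices that suffice here because the integrands decay exponentially):

```python
import math

def f(x):
    """pdf of Problem 2: f_X(x) = x * exp(-x) for x >= 0."""
    return x * math.exp(-x)

def integrate(g, lo, hi, n=100_000):
    """Midpoint rule; crude but sufficient for these fast-decaying integrands."""
    h = (hi - lo) / n
    return sum(g(lo + (i + 0.5) * h) for i in range(n)) * h

t = 0.5
mgf_numeric = integrate(lambda x: math.exp(t * x) * f(x), 0.0, 60.0)
assert abs(mgf_numeric - 1.0 / (1.0 - t) ** 2) < 1e-4   # M_X(0.5) = 4

mean = integrate(lambda x: x * f(x), 0.0, 60.0)           # E[X] = 2
second = integrate(lambda x: x * x * f(x), 0.0, 60.0)     # E[X^2] = 6
assert abs(mean - 2.0) < 1e-4
assert abs(second - mean ** 2 - 2.0) < 1e-3               # Var(X) = 2
```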
Problem 3. Let X be a continuous random variable with probability density function f_X and moment generating function M_X defined on a neighborhood (-h, h) of zero, for some h > 0. Show that

    P(X ≥ a) ≤ e^{-ta} M_X(t), for 0 < t < h.

Hint. Start from the right-hand side of the inequality and split the integral defining M_X(t) into the intervals (-∞, a) and (a, ∞).

For 0 < t < h, we have:

    M_X(t) = E[e^{tX}] = ∫_{-∞}^{a} e^{tx} f_X(x) dx + ∫_{a}^{∞} e^{tx} f_X(x) dx.

Noting that ∫_{-∞}^{a} e^{tx} f_X(x) dx ≥ 0 since e^{tx} ≥ 0 and f_X(x) ≥ 0, we have

    M_X(t) ≥ ∫_{a}^{∞} e^{tx} f_X(x) dx ≥ ∫_{a}^{∞} e^{ta} f_X(x) dx = e^{ta} P(X ≥ a),

since e^{tx} ≥ e^{ta} for all x ≥ a. Therefore,

    P(X ≥ a) ≤ e^{-ta} M_X(t),

which completes the proof.

Problem 4. Let X be a real-valued random variable on a probability space (Ω, A, P) with moment generating function M_X. The cumulant generating function of X, denoted K_X, is defined by

    K_X(t) = ln M_X(t).

(a) Show that K_X'(0) = E[X].

We have

    K_X'(t) = M_X'(t) / M_X(t).

Thus,

    K_X'(0) = M_X'(0) / M_X(0) = E[X],

since M_X'(0) = E[X] and M_X(0) = 1. Actually, this property also holds for discrete random variables.
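The bound of Problem 3 can be illustrated with the density of Problem 2, whose tail has the closed form P(X ≥ a) = (1 + a) e^{-a} (integration by parts of x e^{-x}). A sketch, with the arbitrary choice a = 5:

```python
import math

def tail(a):
    """P(X >= a) for f_X(x) = x e^{-x}: integration by parts gives (1 + a) e^{-a}."""
    return (1.0 + a) * math.exp(-a)

def chernoff(a, t):
    """e^{-ta} M_X(t) with M_X(t) = 1/(1-t)^2 from Problem 2, valid for 0 < t < 1."""
    return math.exp(-t * a) / (1.0 - t) ** 2

a = 5.0
for t in (0.1, 0.3, 0.5, 0.7, 0.9):
    assert tail(a) <= chernoff(a, t)   # the bound holds for every valid t

# The bound may also be optimized over t: minimizing ta + 2 ln(1/(1-t))
# gives t = 1 - 2/a (for a > 2), and the bound still dominates the true tail.
assert tail(a) <= chernoff(a, 1.0 - 2.0 / a)
```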
(b) and that K_X''(0) = Var(X).

We have

    K_X''(t) = [M_X''(t) M_X(t) - (M_X'(t))²] / (M_X(t))².

Hence,

    K_X''(0) = [M_X''(0) M_X(0) - (M_X'(0))²] / (M_X(0))² = E[X²] - (E[X])² = Var(X).

(c) Application. We say that a random variable X follows a geometric distribution G(p), with parameter p ∈ (0, 1), if its probability mass function is given by:

    p_X(x) = P(X = x) = (1-p)^{x-1} p, for x ∈ N*.

Compute M_X(t), the moment generating function of X.

    M_X(t) = Σ_{x≥1} e^{tx} p_X(x) = Σ_{x≥1} e^{tx} (1-p)^{x-1} p = (p/(1-p)) Σ_{x≥1} [(1-p)e^t]^x.

We recognize a geometric series with first term (1-p)e^t and common ratio (1-p)e^t. This series converges if and only if 0 < (1-p)e^t < 1, that is, if and only if t < -ln(1-p). In this case, the moment generating function is defined and is given by:

    M_X(t) = (p/(1-p)) · (1-p)e^t / (1 - (1-p)e^t) = p e^t / (1 - (1-p)e^t).

(d) Compute K_X(t), the cumulant generating function of X.

    K_X(t) = ln [ p e^t / (1 - (1-p)e^t) ] = ln p + t - ln(1 - (1-p)e^t).
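The closed forms from (c) and (d) can be checked against a truncated version of the defining series. A sketch (Python; p = 0.3 and t = 0.2 are arbitrary illustrative values):

```python
import math

p = 0.3   # illustrative parameter in (0, 1)

def pmf(x):
    """Geometric G(p) on {1, 2, ...}: P(X = x) = (1-p)^{x-1} p."""
    return (1.0 - p) ** (x - 1) * p

def mgf(t):
    """Closed form from (c): p e^t / (1 - (1-p) e^t), for t < -ln(1-p)."""
    return p * math.exp(t) / (1.0 - (1.0 - p) * math.exp(t))

def cgf(t):
    """Closed form from (d): ln p + t - ln(1 - (1-p) e^t)."""
    return math.log(p) + t - math.log(1.0 - (1.0 - p) * math.exp(t))

t = 0.2   # inside the domain, since -ln(1 - p) = -ln(0.7) ≈ 0.357
series = sum(math.exp(t * x) * pmf(x) for x in range(1, 2000))
assert abs(series - mgf(t)) < 1e-9
assert abs(cgf(t) - math.log(mgf(t))) < 1e-12
```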
(e) Compute E[X] using K_X(t).

The first derivative K_X'(t) is given by:

    K_X'(t) = 1 + (1-p)e^t / (1 - (1-p)e^t).

Therefore,

    E[X] = K_X'(0) = 1 + (1-p)/(1 - (1-p)) = 1 + (1-p)/p = 1/p.

(f) Compute Var(X) using K_X(t).

The second derivative K_X''(t) is given by:

    K_X''(t) = [(1-p)e^t (1 - (1-p)e^t) + ((1-p)e^t)²] / (1 - (1-p)e^t)² = (1-p)e^t / (1 - (1-p)e^t)².

Therefore,

    Var(X) = K_X''(0) = (1-p) / (1 - (1-p))² = (1-p)/p².
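A final sanity check on (e) and (f): approximating K_X'(0) and K_X''(0) by central finite differences should recover 1/p and (1-p)/p². A sketch with the arbitrary choice p = 0.25:

```python
import math

p = 0.25

def cgf(t):
    """K_X(t) = ln p + t - ln(1 - (1-p) e^t), from Problem 4(d)."""
    return math.log(p) + t - math.log(1.0 - (1.0 - p) * math.exp(t))

# Central finite differences approximate the first two derivatives at 0
h = 1e-4
K1 = (cgf(h) - cgf(-h)) / (2 * h)                  # ≈ K_X'(0)
K2 = (cgf(h) - 2 * cgf(0.0) + cgf(-h)) / h ** 2    # ≈ K_X''(0)

assert abs(K1 - 1.0 / p) < 1e-6                    # E[X] = 1/p = 4
assert abs(K2 - (1.0 - p) / p ** 2) < 1e-3         # Var(X) = (1-p)/p^2 = 12

# Cross-check against direct sums over the pmf
def pmf(x):
    return (1.0 - p) ** (x - 1) * p

mean = sum(x * pmf(x) for x in range(1, 3000))
second = sum(x * x * pmf(x) for x in range(1, 3000))
assert abs(mean - 1.0 / p) < 1e-9
assert abs(second - mean ** 2 - (1.0 - p) / p ** 2) < 1e-8
```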