Will Murray's Probability, XXXII. Moment-Generating Functions

Premise

We have several random variables, Y₁, Y₂, etc. We want to study functions of them: U(Y₁, ..., Y_n). Before, we calculated the mean and the variance of U, but that's not enough to determine the whole distribution of U.

Goal

We want to find the full distribution function F_U(u) := P(U ≤ u). Then we can find the density function f_U(u) = F'_U(u). We can calculate probabilities:

P(a ≤ U ≤ b) = ∫_a^b f_U(u) du = F_U(b) − F_U(a)

Three methods:

1. Distribution functions. (Two lectures ago, using geometric methods from Calculus III.)
2. Transformations. (Previous lecture, using methods from Calculus I.)
3. Moment-generating functions. (This lecture.)

Review of Moment-Generating Functions

Recall: The moment-generating function for a random variable Y is m_Y(t) := E(e^{tY}). The MGF is a function of t (not y). See the previous lecture on MGFs.

MGFs for the Discrete Distributions

Distribution        MGF
Binomial            [pe^t + (1 − p)]^n
Geometric           pe^t / [1 − (1 − p)e^t]
Negative binomial   {pe^t / [1 − (1 − p)e^t]}^r
Hypergeometric      No closed-form MGF.
Poisson             e^{λ(e^t − 1)}
All are functions of t. In the first three, we could substitute q := 1 − p.

MGFs for the Continuous Distributions

Distribution   MGF
Uniform        (e^{tθ₂} − e^{tθ₁}) / [t(θ₂ − θ₁)]
Normal         e^{μt + σ²t²/2}
Gamma          (1 − βt)^{−α}
Exponential    (1 − βt)^{−1}
Chi-square     (1 − 2t)^{−ν/2}
Beta           No closed-form MGF.

Note that exponential is just gamma with α := 1, and chi-square is gamma with α := ν/2 and β := 2.

Useful Formulas with MGFs

Let Z := aY + b. Then m_Z(t) = e^{bt} m_Y(at).
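Since a discrete MGF is a finite sum, the shift/scale formula m_Z(t) = e^{bt} m_Y(at) can be verified exactly for a small example. A minimal sketch in Python (the distribution for Y and the values of a, b, t are arbitrary illustrative choices, not from the lecture):

```python
import math

# A small discrete random variable Y, given as {value: probability}.
# (Illustrative numbers, not from the lecture.)
Y = {0: 0.2, 1: 0.5, 2: 0.3}

def mgf(dist, t):
    """m(t) = E[e^{tY}] for a discrete distribution {value: prob}."""
    return sum(p * math.exp(t * y) for y, p in dist.items())

a, b, t = 3.0, -1.0, 0.4

# Left side: MGF of Z = aY + b, computed from Z's own distribution.
Z = {a * y + b: p for y, p in Y.items()}
lhs = mgf(Z, t)

# Right side: the shift/scale formula m_Z(t) = e^{bt} m_Y(at).
rhs = math.exp(b * t) * mgf(Y, a * t)

print(abs(lhs - rhs) < 1e-12)  # True
```

Both sides expand to the same finite sum, so they agree to floating-point precision.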
Suppose Y₁ and Y₂ are independent variables and Z := Y₁ + Y₂. Then

m_Z(t) = m_Y₁(t) m_Y₂(t)

How to Use MGFs

Given a function U(Y₁, ..., Y_n), find its MGF m_U(t), using the useful formulas on the previous slide. Then compare it against your charts to see if you recognize it as a known distribution.

Example I

Let Y be a standard normal variable. Find the density function of U := Y².
m_U(t) := E[e^{tY²}] = ∫_{−∞}^{∞} (1/√(2π)) e^{ty²} e^{−y²/2} dy

Combine exponents:

= ∫_{−∞}^{∞} (1/√(2π)) e^{−y²(1 − 2t)/2} dy

To simplify this, recall that

∫_{−∞}^{∞} (1/(σ√(2π))) e^{−y²/(2σ²)} dy = 1 (normal density)

for any temporary σ. Take temporary σ := (1 − 2t)^{−1/2}:

m_U(t) = σ ∫_{−∞}^{∞} (1/(σ√(2π))) e^{−y²/(2σ²)} dy = σ = (1 − 2t)^{−1/2}

From your chart of MGFs, this is chi-square with ν = 1, so we have the density:

Gamma: f(y) := y^{α−1} e^{−y/β} / (β^α Γ(α)), 0 ≤ y < ∞
χ² is gamma with α := ν/2, β := 2.

f_U(u) = u^{−1/2} e^{−u/2} / (2^{1/2} Γ(1/2))     (Γ(1/2) = √π)
       = u^{−1/2} e^{−u/2} / √(2π), u > 0

That's one reason why chi-square is important.
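Example I can be sanity-checked by simulation: for Y ~ N(0, 1), averaging e^{tY²} over many samples should approximate (1 − 2t)^{−1/2} when t < 1/2. A minimal Monte Carlo sketch, with t and the sample size chosen arbitrarily for illustration:

```python
import math
import random

random.seed(0)

# Monte Carlo check of Example I: if Y ~ N(0,1), the MGF of U = Y^2
# should be m_U(t) = (1 - 2t)^{-1/2}, the chi-square MGF with nu = 1.
t = 0.2
n = 200_000
estimate = sum(math.exp(t * random.gauss(0, 1) ** 2) for _ in range(n)) / n
exact = (1 - 2 * t) ** -0.5

print(round(exact, 4))  # 1.291
# estimate should agree with exact to about two decimal places
```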
Example II

Let Y₁, Y₂ be independent standard normal variables. Find the density function of U := Y₁² + Y₂².

m_U(t) = m_{Y₁²+Y₂²}(t) = m_{Y₁²}(t) m_{Y₂²}(t) = (1 − 2t)^{−1/2} (1 − 2t)^{−1/2} (from above) = (1 − 2t)^{−1}

Looking at the chart, this is chi-square with ν = 2.

Gamma: f(y) := y^{α−1} e^{−y/β} / (β^α Γ(α)), 0 ≤ y < ∞
χ² is gamma with α := ν/2, β := 2.

So f_U(u) = (1/2) e^{−u/2}, u > 0. (It's also exponential with β = 2.)

In general, if Z₁, ..., Z_n ~ N(0, 1), then Σ_{i=1}^n Z_i² ~ χ²(n). That's one reason why chi-square is important.

Example III

Let Y₁, ..., Y_r be independent binomial variables representing n₁, ..., n_r flips of a coin that comes up heads with probability p. Find the probability
function of U := Y₁ + ··· + Y_r. Note that 0 ≤ u ≤ n₁ + ··· + n_r.

m_Yᵢ(t) = [pe^t + (1 − p)]^{nᵢ}

m_U(t) := m_{Y₁+···+Y_r}(t) = m_Y₁(t) ··· m_Y_r(t)
        = [pe^t + (1 − p)]^{n₁} ··· [pe^t + (1 − p)]^{n_r}
        = [pe^t + (1 − p)]^{n₁+···+n_r}

This is binomial with probability p and n := n₁ + ··· + n_r, so

p(u) = C(n₁ + ··· + n_r, u) p^u q^{n₁+···+n_r−u}, 0 ≤ u ≤ n₁ + ··· + n_r.

Example IV

Let Y₁ and Y₂ be independent Poisson variables with means λ₁ and λ₂. Find the probability function of U := Y₁ + Y₂.
m_Yᵢ(t) = e^{λᵢ(e^t − 1)}

m_U(t) := m_{Y₁+Y₂}(t) = m_Y₁(t) m_Y₂(t) = e^{λ₁(e^t − 1)} e^{λ₂(e^t − 1)} = e^{(λ₁+λ₂)(e^t − 1)}

This is Poisson with mean λ₁ + λ₂. (If you expect to see λ₁ cars at an intersection and λ₂ trucks, you expect to see λ₁ + λ₂ vehicles total.)

p(y) = λ^y e^{−λ} / y!

p(u) = (λ₁ + λ₂)^u e^{−(λ₁+λ₂)} / u!, 0 ≤ u < ∞

Example V

Let Y₁, ..., Y_n be independent normal variables, each with mean μ and variance σ². Find the distribution of Ȳ := (1/n)(Y₁ + ··· + Y_n).
Let W := Y₁ + ··· + Y_n.

m_W(t) = m_Y₁(t) ··· m_Y_n(t) = (e^{μt + σ²t²/2})^n = e^{μnt + σ²t²n/2}

Since Ȳ = (1/n)W, the shift/scale formula m_{aY+b}(t) = e^{bt} m_Y(at) gives

m_Ȳ(t) = m_W(t/n) = e^{μt + σ²t²/(2n)}

This is a normal distribution with mean μ and variance σ²/n.

Example VI

Let Y₁ and Y₂ be independent exponential variables, each with mean 3. Find the density function of U := Y₁ + Y₂.
m_Yᵢ(t) = (1 − 3t)^{−1}

m_U(t) := m_{Y₁+Y₂}(t) = m_Y₁(t) m_Y₂(t) = (1 − 3t)^{−1} (1 − 3t)^{−1} = (1 − 3t)^{−2}

This is gamma with α = 2, β = 3:

f_U(u) = u^{α−1} e^{−u/β} / (β^α Γ(α)) = u e^{−u/3} / (3² Γ(2)) = (1/9) u e^{−u/3}, 0 ≤ u < ∞
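The gamma(α = 2, β = 3) answer can be sanity-checked by simulation: integrating the density (1/9) u e^{−u/3} gives the closed-form CDF P(U ≤ x) = 1 − e^{−x/3}(1 + x/3), which should match the empirical frequency of Y₁ + Y₂ ≤ x. A minimal sketch (x and the sample size are arbitrary illustrative choices):

```python
import math
import random

random.seed(0)

# Monte Carlo check of Example VI: Y1, Y2 independent exponential with
# mean 3, U = Y1 + Y2.  The derived gamma(alpha=2, beta=3) density
# integrates to the CDF P(U <= x) = 1 - e^{-x/3}(1 + x/3).
x = 6.0
n = 200_000
rate = 1 / 3  # expovariate takes the rate, i.e. 1/mean
hits = sum(
    random.expovariate(rate) + random.expovariate(rate) <= x for _ in range(n)
)
simulated = hits / n
exact = 1 - math.exp(-x / 3) * (1 + x / 3)

print(round(exact, 3))  # 0.594
# simulated should agree with exact to about two decimal places
```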