Will Murray's Probability, XXXII. Moment-Generating Functions
XXXII. Moment-Generating Functions

Premise

We have several random variables, Y_1, Y_2, etc. We want to study functions of them: U(Y_1, ..., Y_n). Before, we calculated the mean and variance of U, but that's not enough to determine the whole distribution of U.

Goal

We want to find the full distribution function F_U(u) := P(U ≤ u). Then we can find the density function f_U(u) = F'_U(u), and we can calculate probabilities:

P(a ≤ U ≤ b) = ∫_a^b f_U(u) du = F_U(b) − F_U(a)

Three methods:

1. Distribution functions. (Two lectures ago, using geometric methods from Calculus III.)
2. Transformations. (Previous lecture, using methods from Calculus I.)

3. Moment-generating functions. (This lecture.)

Review of Moment-Generating Functions

Recall: the moment-generating function of a random variable Y is

m_Y(t) := E(e^{tY}).

The MGF is a function of t (not y). See the previous lecture on MGFs.

MGFs for the Discrete Distributions

Distribution        MGF
Binomial            [p e^t + (1 − p)]^n
Geometric           p e^t / [1 − (1 − p) e^t]
Negative binomial   {p e^t / [1 − (1 − p) e^t]}^r
Hypergeometric      No closed-form MGF.
Poisson             e^{λ(e^t − 1)}
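
As a quick sanity check (a sketch of my own, not part of the lecture; the values of n, p, and t are arbitrary choices), the binomial entry of the chart can be verified numerically against the definition m_Y(t) = E(e^{tY}):

```python
from math import comb, exp

# Check the binomial MGF [p e^t + (1 - p)]^n against the definition
# E(e^{tY}) = sum over y of C(n, y) p^y (1 - p)^(n - y) e^{ty}.
# n, p, t below are arbitrary illustrative values.
n, p, t = 10, 0.3, 0.5

closed_form = (p * exp(t) + (1 - p)) ** n
direct = sum(comb(n, y) * p**y * (1 - p)**(n - y) * exp(t * y)
             for y in range(n + 1))
print(closed_form, direct)  # the two agree to floating-point precision
```

The agreement is guaranteed by the binomial theorem, which is exactly how the chart entry is derived.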

All are functions of t. In the first three, we could substitute q := 1 − p.

MGFs for the Continuous Distributions

Distribution   MGF
Uniform        (e^{tθ_2} − e^{tθ_1}) / [t(θ_2 − θ_1)]
Normal         e^{µt + t²σ²/2}
Gamma          (1 − βt)^{−α}
Exponential    (1 − βt)^{−1}
Chi-square     (1 − 2t)^{−ν/2}
Beta           No closed-form MGF.

Note that exponential is just gamma with α := 1, and chi-square is gamma with α := ν/2 and β := 2.

Useful Formulas with MGFs

Let Z := aY + b. Then m_Z(t) = e^{bt} m_Y(at).
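
The shift/scale rule can be checked numerically with the normal MGF from the chart (a sketch, not from the lecture; µ, σ, a, b, t are arbitrary values): if Y ~ N(µ, σ²) and Z = aY + b, then Z ~ N(aµ + b, a²σ²), and both sides of m_Z(t) = e^{bt} m_Y(at) should agree.

```python
from math import exp

# Check m_Z(t) = e^{bt} m_Y(at) using the normal MGF e^{mu*t + sigma^2 t^2 / 2}.
# If Y ~ N(mu, sigma^2) and Z = a*Y + b, then Z ~ N(a*mu + b, a^2 sigma^2).
# mu, sigma, a, b, t are arbitrary illustrative values.
mu, sigma, a, b, t = 1.5, 2.0, 3.0, -4.0, 0.25

def normal_mgf(mean, sd, t):
    return exp(mean * t + sd**2 * t**2 / 2)

lhs = normal_mgf(a * mu + b, abs(a) * sigma, t)   # m_Z(t) directly
rhs = exp(b * t) * normal_mgf(mu, sigma, a * t)   # e^{bt} m_Y(at)
print(lhs, rhs)
```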

Suppose Y_1 and Y_2 are independent variables and Z := Y_1 + Y_2. Then

m_Z(t) = m_{Y_1}(t) m_{Y_2}(t).

How to Use MGFs

Given a function U(Y_1, ..., Y_n), find its MGF m_U(t), using the useful formulas above. Then compare it against your charts to see if you recognize it as a known distribution.

Example I

Let Y be a standard normal variable. Find the density function of U := Y².

m_U(t) := E[e^{tY²}]
        = (1/√(2π)) ∫ e^{ty²} e^{−y²/2} dy

Combine exponents:

        = (1/√(2π)) ∫ e^{−y²(1 − 2t)/2} dy

To simplify this, recall that for any temporary σ,

(1/(σ√(2π))) ∫ e^{−y²/(2σ²)} dy = 1   (normal density).

Take temporary σ := 1/√(1 − 2t):

m_U(t) = (1/√(2π)) · σ√(2π) = σ = (1 − 2t)^{−1/2}

From your chart of MGFs, this is chi-square with ν = 1, so we have the density:

Gamma: f(y) := y^{α−1} e^{−y/β} / (β^α Γ(α)), 0 ≤ y < ∞
χ² is gamma with α := ν/2, β := 2.

f_U(u) = u^{−1/2} e^{−u/2} / (2^{1/2} Γ(1/2)),  u > 0
       = u^{−1/2} e^{−u/2} / √(2π),  u > 0   (since Γ(1/2) = √π)

That's one reason why chi-square is important.
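
A numerical cross-check (my addition, not in the lecture): for Y standard normal, P(Y² ≤ u) = P(−√u ≤ Y ≤ √u) = erf(√(u/2)), so integrating the density just derived should reproduce that probability.

```python
from math import exp, pi, sqrt, erf

# If Y ~ N(0,1) and U = Y^2, then P(U <= u) = erf(sqrt(u/2)).
# Integrating the derived density u^{-1/2} e^{-u/2} / sqrt(2*pi)
# by the midpoint rule (which avoids the singularity at 0) should
# reproduce that probability.
def f_U(u):
    return exp(-u / 2) / sqrt(2 * pi * u)

u_max, steps = 3.0, 200000
h = u_max / steps
cdf = sum(f_U((k + 0.5) * h) for k in range(steps)) * h
print(cdf, erf(sqrt(u_max / 2)))
```

The midpoint rule converges slowly near the u^{−1/2} singularity, so only a few digits of agreement are expected at this step count.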

Example II

Let Y_1, Y_2 be independent standard normal variables. Find the density function of U := Y_1² + Y_2².

m_U(t) = m_{Y_1² + Y_2²}(t) = m_{Y_1²}(t) m_{Y_2²}(t)
       = (1 − 2t)^{−1/2} (1 − 2t)^{−1/2}   (from above)
       = (1 − 2t)^{−1}

Looking at the chart, this is chi-square with ν = 2.

Gamma: f(y) := y^{α−1} e^{−y/β} / (β^α Γ(α)), 0 ≤ y < ∞
χ² is gamma with α := ν/2, β := 2.

So f_U(u) = (1/2) e^{−u/2}, u > 0. (It's also exponential with β = 2.)

In general, if Z_1, ..., Z_n ~ N(0, 1), then Σ_{i=1}^n Z_i² ~ χ²(n). That's one reason why chi-square is important.

Example III

Let Y_1, ..., Y_r be independent binomial variables representing n_1, ..., n_r flips of a coin that comes up heads with probability p. Find the probability
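
A hedged numerical check of Example II (not part of the lecture): the density f_U(u) = (1/2)e^{−u/2} should integrate to 1 and have mean νβ/2 · β = 2, the mean of a χ²(2) variable.

```python
from math import exp

# The chi-square(nu = 2) density (1/2) e^{-u/2} (i.e. exponential with
# beta = 2) should integrate to 1 and have mean 2. Midpoint rule on
# [0, 60]; the tail beyond 60 is on the order of e^{-30}, negligible.
def f_U(u):
    return 0.5 * exp(-u / 2)

u_max, steps = 60.0, 60000
h = u_max / steps
mids = [(k + 0.5) * h for k in range(steps)]
total = sum(f_U(u) for u in mids) * h
mean = sum(u * f_U(u) for u in mids) * h
print(total, mean)
```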

function of U := Y_1 + ⋯ + Y_r. Note that 0 ≤ u ≤ n_1 + ⋯ + n_r.

m_{Y_i}(t) = [p e^t + (1 − p)]^{n_i}

m_U(t) := m_{Y_1 + ⋯ + Y_r}(t) = m_{Y_1}(t) ⋯ m_{Y_r}(t)
        = [p e^t + (1 − p)]^{n_1} ⋯ [p e^t + (1 − p)]^{n_r}
        = [p e^t + (1 − p)]^{n_1 + ⋯ + n_r}

This is binomial with probability p and n := n_1 + ⋯ + n_r, so

p(u) = C(n_1 + ⋯ + n_r, u) p^u q^{n_1 + ⋯ + n_r − u},  0 ≤ u ≤ n_1 + ⋯ + n_r.

Example IV

Let Y_1 and Y_2 be independent Poisson variables with means λ_1 and λ_2. Find the probability function of U := Y_1 + Y_2.
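
Example III can be verified directly by convolution (a sketch for r = 2, with my own choice of n_1, n_2, p): adding independent binomials with the same p should land exactly on the binomial the MGF argument predicts.

```python
from math import comb

# Convolving Binomial(n1, p) with Binomial(n2, p) should give
# Binomial(n1 + n2, p), as the MGF product argument predicts.
# n1, n2, p are arbitrary illustrative values.
n1, n2, p = 4, 6, 0.35

def binom_pmf(n, k):
    return comb(n, k) * p**k * (1 - p)**(n - k)

max_err = 0.0
for u in range(n1 + n2 + 1):
    conv = sum(binom_pmf(n1, y) * binom_pmf(n2, u - y)
               for y in range(max(0, u - n2), min(n1, u) + 1))
    max_err = max(max_err, abs(conv - binom_pmf(n1 + n2, u)))
print(max_err)
```

The exact agreement is Vandermonde's identity in disguise, which is what the MGF manipulation packages for free.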

m_{Y_i}(t) = e^{λ_i(e^t − 1)}

m_U(t) := m_{Y_1 + Y_2}(t) = m_{Y_1}(t) m_{Y_2}(t)
        = e^{λ_1(e^t − 1)} e^{λ_2(e^t − 1)}
        = e^{(λ_1 + λ_2)(e^t − 1)}

This is Poisson with mean λ_1 + λ_2. (If you expect to see λ_1 cars at an intersection and λ_2 trucks, you expect to see λ_1 + λ_2 vehicles total.)

Poisson: p(y) = λ^y e^{−λ} / y!

p(u) = (λ_1 + λ_2)^u e^{−(λ_1 + λ_2)} / u!,  0 ≤ u < ∞

Example V

Let Y_1, ..., Y_n be independent normal variables, each with mean µ and variance σ². Find the distribution of Ȳ := (1/n)(Y_1 + ⋯ + Y_n).

Let W := Y_1 + ⋯ + Y_n.

m_W(t) = m_{Y_1}(t) ⋯ m_{Y_n}(t) = (e^{µt + σ²t²/2})^n = e^{µnt + σ²t²n/2}

Since Ȳ = (1/n)W, the formula m_{aY+b}(t) = e^{bt} m_Y(at) gives

m_{Ȳ}(t) = m_W(t/n) = e^{µt + σ²t²/(2n)}

This is a normal distribution with mean µ and variance σ²/n.

Example VI

Let Y_1 and Y_2 be independent exponential variables, each with mean 3. Find the density function of U := Y_1 + Y_2.
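
Example V's MGF bookkeeping can be double-checked in a few lines (a sketch; µ, σ, n, t are arbitrary illustrative values): raising the single-variable normal MGF evaluated at t/n to the n-th power should reproduce the N(µ, σ²/n) MGF exactly.

```python
from math import exp

# m_Ybar(t) = [m_Y(t/n)]^n should equal the N(mu, sigma^2/n) MGF
# e^{mu t + sigma^2 t^2 / (2n)}. mu, sigma, n, t are arbitrary values.
mu, sigma, n, t = 2.0, 3.0, 5, 0.7

def normal_mgf(mean, var, t):
    return exp(mean * t + var * t**2 / 2)

m_ybar = normal_mgf(mu, sigma**2, t / n) ** n   # product of n MGFs at t/n
target = normal_mgf(mu, sigma**2 / n, t)        # N(mu, sigma^2/n) MGF
print(m_ybar, target)
```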

m_{Y_i}(t) = (1 − 3t)^{−1}

m_U(t) := m_{Y_1 + Y_2}(t) = m_{Y_1}(t) m_{Y_2}(t) = (1 − 3t)^{−1} (1 − 3t)^{−1} = (1 − 3t)^{−2}

This is gamma with α = 2, β = 3:

f_U(u) = u^{α−1} e^{−u/β} / (β^α Γ(α)) = u e^{−u/3} / (3² Γ(2)) = (1/9) u e^{−u/3},  0 ≤ u < ∞
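
As a final hedged check (my addition): the gamma(α = 2, β = 3) density just derived should integrate to 1 and have mean αβ = 6, which also matches the fact that U is the sum of two exponentials of mean 3.

```python
from math import exp

# The gamma(alpha = 2, beta = 3) density (1/9) u e^{-u/3} should
# integrate to 1 and have mean alpha*beta = 6. Midpoint rule on
# [0, 120]; the tail beyond is on the order of e^{-40}, negligible.
def f_U(u):
    return u * exp(-u / 3) / 9

u_max, steps = 120.0, 120000
h = u_max / steps
mids = [(k + 0.5) * h for k in range(steps)]
total = sum(f_U(u) for u in mids) * h
mean = sum(u * f_U(u) for u in mids) * h
print(total, mean)
```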