STAT/MATH 395 A - PROBABILITY II        UW Winter Quarter 07        Néhémy Lim

Moment functions

1 Moments of a random variable

Definition 1.1. Let X be a rrv on probability space (Ω, A, P). For a given r ∈ N, E[X^r], if it exists, is called the r-th moment of X. In particular,

- If X is a discrete rrv with pmf p_X and X(Ω) is the set of all values that X can take on, then

    E[X^r] = \sum_{x \in X(Ω)} x^r p_X(x)    (1)

- If X is a continuous rrv with pdf f_X, then

    E[X^r] = \int_{-∞}^{+∞} x^r f_X(x) \, dx    (2)

Remarks:
- The zero-th moment is 1.
- The first moment is the expected value of X.

Definition 1.2. Let X be a rrv on probability space (Ω, A, P). For a given r ∈ N, E[(X − E[X])^r], if it exists, is called the r-th moment about the mean or r-th central moment of X. In particular,

- If X is a discrete rrv with pmf p_X and X(Ω) is the set of all values that X can take on, then

    E[(X − E[X])^r] = \sum_{x \in X(Ω)} (x − E[X])^r p_X(x)    (3)

- If X is a continuous rrv with pdf f_X, then

    E[(X − E[X])^r] = \int_{-∞}^{+∞} (x − E[X])^r f_X(x) \, dx    (4)

Remarks:
- The zero-th central moment is 1.
- The first central moment is 0.
- The second central moment is the variance of X.
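The discrete formulas (1) and (3) can be checked directly on a small example. The following Python sketch (not part of the original notes; the function names are illustrative) computes moments and central moments of a fair six-sided die from its pmf, using exact rational arithmetic:

```python
from fractions import Fraction

def moment(pmf, r):
    """r-th moment E[X^r] of a discrete r.v. given as {value: probability}."""
    return sum(x**r * p for x, p in pmf.items())

def central_moment(pmf, r):
    """r-th central moment E[(X - E[X])^r]."""
    mu = moment(pmf, 1)
    return sum((x - mu)**r * p for x, p in pmf.items())

# Fair six-sided die: p_X(x) = 1/6 for x = 1, ..., 6
die = {x: Fraction(1, 6) for x in range(1, 7)}

print(moment(die, 0))          # zero-th moment: 1
print(moment(die, 1))          # first moment E[X] = 7/2
print(central_moment(die, 1))  # first central moment: 0
print(central_moment(die, 2))  # second central moment (variance) = 35/12
```

The output illustrates all four remarks above: the zero-th (central) moment is 1, the first central moment is 0, and the second central moment is Var(X).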
2 Moment generating functions

Definition 2.1. Let X be a rrv on probability space (Ω, A, P). For a given t ∈ R, the moment generating function (m.g.f.) of X, denoted M_X(t), is defined as follows:

    M_X(t) = E[e^{tX}]    (5)

where there is a positive number h such that the above expectation exists for −h < t < h. In particular,

- If X is a discrete rrv with pmf p_X and X(Ω) is the set of all values that X can take on, then

    M_X(t) = \sum_{x \in X(Ω)} e^{tx} p_X(x)    (6)

- If X is a continuous rrv with pdf f_X, then

    M_X(t) = \int_{-∞}^{+∞} e^{tx} f_X(x) \, dx    (7)

Remarks:
- The m.g.f. at t = 0 always exists and is equal to 1: M_X(0) = 1.

Theorem 2.1. A moment generating function completely determines the distribution of a real-valued random variable.

Proposition 2.1. Let X be a rrv on probability space (Ω, A, P). The r-th moment of X can be found by evaluating the r-th derivative of the m.g.f. of X at t = 0:

    M_X^{(r)}(0) = E[X^r]    (8)

Proof. Using the expansion of the exponential function as a series, we have:

    e^{tX} = \sum_{n=0}^{∞} \frac{(tX)^n}{n!}

Hence,

    M_X(t) = E[e^{tX}] = E\left[\sum_{n=0}^{∞} \frac{(tX)^n}{n!}\right] = \sum_{n=0}^{∞} \frac{t^n}{n!} E[X^n] = 1 + t\,E[X] + \frac{t^2}{2} E[X^2] + \frac{t^3}{6} E[X^3] + \ldots
Differentiating M_X r times, the first r terms vanish. We now have:

    \frac{d^r M_X(t)}{dt^r} = \frac{r(r-1)\cdots 1}{r!} E[X^r] + \frac{(r+1)r\cdots 2}{(r+1)!}\, t\, E[X^{r+1}] + \frac{(r+2)(r+1)\cdots 3}{(r+2)!}\, t^2\, E[X^{r+2}] + \ldots = E[X^r] + \sum_{k=1}^{∞} \frac{t^k}{k!} E[X^{r+k}]

It now becomes clear that evaluating \frac{d^r M_X(t)}{dt^r} at t = 0 gives the result.

Lemma 2.1. Let X be a rrv on probability space (Ω, A, P).

1. The expected value of X, if it exists, can be found by evaluating the first derivative of the moment generating function at t = 0:

    E[X] = M'_X(0)    (9)

2. The variance of X, if it exists, can be found by evaluating the first and second derivatives of the moment generating function at t = 0:

    Var(X) = M''_X(0) − (M'_X(0))^2    (10)

3 Moment generating functions of Common Discrete Distributions

3.1 Discrete Uniform Distribution

Definition 3.1 (Discrete Uniform Distribution). Let (Ω, A, P) be a probability space and let X be a random variable that can take n ∈ N* values on X(Ω) = {1, ..., n}. X is said to have a discrete uniform distribution U_n if its probability mass function is given by:

    p_X(i) = P(X = i) = \frac{1}{n}, for i = 1, ..., n    (11)
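Lemma 2.1 can be illustrated numerically: approximate M'_X(0) and M''_X(0) by finite differences of the m.g.f. and compare with the known mean and variance. The sketch below (illustrative, not from the notes) uses a Bernoulli(p) variable, whose m.g.f. is 1 − p + pe^t, so E[X] = p and Var(X) = p(1 − p):

```python
import math

def mgf(pmf, t):
    """M_X(t) = E[e^{tX}] for a discrete r.v. given as {value: probability}."""
    return sum(math.exp(t * x) * p for x, p in pmf.items())

# Bernoulli(p): M_X(t) = 1 - p + p e^t, hence E[X] = p, Var(X) = p(1 - p)
p = 0.3
bern = {0: 1 - p, 1: p}

h = 1e-5  # step for central finite differences around t = 0
m1 = (mgf(bern, h) - mgf(bern, -h)) / (2 * h)                   # ~ M'_X(0) = E[X]
m2 = (mgf(bern, h) - 2 * mgf(bern, 0) + mgf(bern, -h)) / h**2   # ~ M''_X(0) = E[X^2]
var = m2 - m1**2                                                # Lemma 2.1, eq. (10)

print(m1, var)   # approximately 0.3 and 0.21
```

The finite-difference approximations agree with E[X] = 0.3 and Var(X) = 0.21 up to numerical error.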
Computation of the mgf. Let X be a random variable that follows a discrete uniform distribution U_n. The mgf of X is given by:

    M_X(t) = E[e^{tX}] = \sum_{x=1}^{n} e^{tx} p_X(x) = \frac{1}{n} \underbrace{\sum_{x=1}^{n} (e^t)^x}_{\text{sum of a geometric sequence}} = \frac{1}{n} \cdot \frac{e^t (1 − e^{nt})}{1 − e^t} = \frac{e^t − e^{(n+1)t}}{n(1 − e^t)}

The above equality is true for all t ≠ 0. If t = 0, we have:

    M_X(0) = \frac{1}{n} \sum_{x=1}^{n} e^{0 \cdot x} = 1

In a nutshell, the mgf of X is given by:

    M_X(t) = \begin{cases} \dfrac{e^t − e^{(n+1)t}}{n(1 − e^t)} & \text{if } t ≠ 0 \\ 1 & \text{if } t = 0 \end{cases}

3.2 Binomial Distribution

Definition 3.2 (Bernoulli process). A Bernoulli or binomial process has the following features:

1. We repeat n ∈ N* identical trials.
2. A trial can result in only two possible outcomes, that is, a certain event E, called success, occurs with probability p, thus event E^c, called failure, occurs with probability 1 − p.
3. The probability of success p remains constant trial after trial. In this case, the process is said to be stationary.
4. The trials are mutually independent.
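The closed form just derived can be checked against the defining sum (6). A short Python sketch (illustrative; not part of the notes), here with n = 6:

```python
import math

def mgf_uniform_closed(n, t):
    """Closed form (e^t - e^{(n+1)t}) / (n (1 - e^t)) for t != 0, and 1 at t = 0."""
    if t == 0:
        return 1.0
    return (math.exp(t) - math.exp((n + 1) * t)) / (n * (1 - math.exp(t)))

def mgf_uniform_direct(n, t):
    """Direct evaluation of (1/n) * sum_{x=1}^{n} e^{tx}."""
    return sum(math.exp(t * x) for x in range(1, n + 1)) / n

# the two expressions agree for negative, zero, and positive t
for t in (-1.0, -0.2, 0.0, 0.5, 1.3):
    assert abs(mgf_uniform_closed(6, t) - mgf_uniform_direct(6, t)) < 1e-9
```

Note that the t = 0 branch is needed only because the geometric-sum formula has a removable singularity there; the direct sum is perfectly smooth at t = 0.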
Definition 3.3 (Binomial Distribution). Let (Ω, A, P) be a probability space. Let E ∈ A be an event labeled as success, that occurs with probability p. If n ∈ N* trials are performed according to a Bernoulli process, then the random variable X defined as the number of successes among the n trials is said to have a binomial distribution Bin(n, p) and its probability mass function is given by:

    p_X(x) = P(X = x) = \binom{n}{x} p^x (1 − p)^{n−x}, for x = 0, ..., n    (12)

Computation of the mgf. Let X be a random variable that follows a binomial distribution Bin(n, p). The mgf of X is given by:

    M_X(t) = \sum_{x=0}^{n} e^{tx} p_X(x) = \sum_{x=0}^{n} e^{tx} \binom{n}{x} p^x (1 − p)^{n−x} = \sum_{x=0}^{n} \binom{n}{x} (pe^t)^x (1 − p)^{n−x} = \left( pe^t + 1 − p \right)^n

where the last equality comes from the binomial theorem.

3.3 Poisson Distribution

Definition 3.4 (Poisson Distribution). Let (Ω, A, P) be a probability space. A random variable X is said to have a Poisson distribution P(λ), with λ > 0, if its probability mass function is given by:

    p_X(x) = P(X = x) = e^{−λ} \frac{λ^x}{x!}, for x ∈ N    (13)

Computation of the mgf. Let X be a random variable that follows a Poisson distribution P(λ). The mgf of X is given by:

    M_X(t) = \sum_{x=0}^{∞} e^{tx} p_X(x) = \sum_{x=0}^{∞} e^{tx} e^{−λ} \frac{λ^x}{x!} = e^{−λ} \sum_{x=0}^{∞} \frac{(λe^t)^x}{x!} = e^{−λ} \exp(λe^t) = \exp\left( λ(e^t − 1) \right)
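Both closed forms can be verified against the defining sums, the binomial sum exactly and the Poisson sum by truncation (its terms decay factorially). A Python sketch with illustrative parameter values, not taken from the notes:

```python
import math

def binom_mgf_direct(n, p, t):
    """Sum of e^{tx} C(n, x) p^x (1-p)^{n-x} over x = 0, ..., n."""
    return sum(math.exp(t * x) * math.comb(n, x) * p**x * (1 - p)**(n - x)
               for x in range(n + 1))

def poisson_mgf_direct(lam, t, terms=60):
    """Truncated sum of e^{tx} e^{-lam} lam^x / x! (terms decay factorially)."""
    return sum(math.exp(t * x) * math.exp(-lam) * lam**x / math.factorial(x)
               for x in range(terms))

n, p, lam, t = 10, 0.3, 2.5, 0.7
assert abs(binom_mgf_direct(n, p, t) - (p * math.exp(t) + 1 - p)**n) < 1e-9
assert abs(poisson_mgf_direct(lam, t) - math.exp(lam * (math.exp(t) - 1))) < 1e-9
```

Unlike the geometric series for U_n, both of these mgfs exist for every t ∈ R, since the binomial sum is finite and the Poisson series converges for all t.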
3.4 Geometric Distribution

Definition 3.5 (Geometric Distribution). Let (Ω, A, P) be a probability space. Let E ∈ A be an event labeled as success, that occurs with probability p. If all the assumptions of a Bernoulli process are satisfied, except that the number of trials is not preset, then the random variable X defined as the number of trials until the first success is said to have a geometric distribution G(p) and its probability mass function is given by:

    p_X(x) = P(X = x) = (1 − p)^{x−1} p, for x ∈ N*    (14)

Computation of the mgf. Left as an exercise (Homework 3)!

Distribution | p_X(x) = P(X = x) | M_X(t) | E[X] | Var(X)
Uniform U_n, n ∈ N* | 1/n, for x = 1, ..., n | (e^t − e^{(n+1)t}) / (n(1 − e^t)) if t ≠ 0; 1 if t = 0 | (n+1)/2 | (n^2 − 1)/12
Binomial Bin(n, p), n ∈ N*, p ∈ (0, 1) | \binom{n}{x} p^x (1 − p)^{n−x}, for x = 0, ..., n | (pe^t + 1 − p)^n | np | np(1 − p)
Poisson P(λ), λ > 0 | e^{−λ} λ^x / x!, for x ∈ N | exp(λ(e^t − 1)) | λ | λ
Geometric G(p), p ∈ (0, 1) | (1 − p)^{x−1} p, for x ∈ N* | pe^t / (1 − (1 − p)e^t), for t < −ln(1 − p) | 1/p | (1 − p)/p^2
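Although the derivation is left as an exercise, the geometric mgf stated in the table can at least be sanity-checked numerically by truncating the series \sum_{x≥1} e^{tx}(1−p)^{x−1}p, which converges whenever (1 − p)e^t < 1, i.e. t < −ln(1 − p). A Python sketch with illustrative values (not a substitute for the homework derivation):

```python
import math

def geom_mgf_direct(p, t, terms=500):
    """Truncated sum of e^{tx} (1-p)^{x-1} p over x = 1, 2, ... (needs t < -ln(1-p))."""
    return sum(math.exp(t * x) * (1 - p)**(x - 1) * p for x in range(1, terms + 1))

p, t = 0.4, 0.2                       # here -ln(1 - p) = -ln(0.6) ~ 0.511 > t
closed = p * math.exp(t) / (1 - (1 - p) * math.exp(t))
assert abs(geom_mgf_direct(p, t) - closed) < 1e-9
```

The truncation is harmless here because the ratio (1 − p)e^t ≈ 0.73 makes the neglected tail geometrically small.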
4 Moment generating functions of Common Continuous Distributions

4.1 Continuous Uniform Distribution

Definition 4.1. A continuous rrv is said to follow a uniform distribution U(a, b) on a segment [a, b], with a < b, if its pdf is

    f_X(x) = \frac{1}{b − a} \mathbb{1}_{[a,b]}(x)    (15)

Computation of the mgf. Let X be a continuous random variable that follows a uniform distribution U(a, b). The mgf of X is given by:

    M_X(t) = \int_{−∞}^{+∞} e^{tx} f_X(x) \, dx = \frac{1}{b − a} \int_a^b e^{tx} \, dx = \frac{1}{b − a} \left[ \frac{e^{tx}}{t} \right]_{x=a}^{x=b} = \frac{e^{tb} − e^{ta}}{t(b − a)}

The above equality holds for t ≠ 0. We notice that M_X(0) = 1.

4.2 Normal Distribution

Definition 4.2. A continuous random variable is said to follow a normal (or Gaussian) distribution N(µ, σ^2) with parameters, mean µ and variance σ^2, if its pdf f_X is given by:

    f_X(x) = \frac{1}{σ\sqrt{2π}} \exp\left\{ −\frac{1}{2} \left( \frac{x − µ}{σ} \right)^2 \right\}, for x ∈ R    (16)
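The continuous uniform mgf can be checked against the defining integral (7) by simple numerical quadrature. The following Python sketch (illustrative names and parameter values, not from the notes) uses a midpoint Riemann sum over [a, b]:

```python
import math

def mgf_uniform_ab(a, b, t, steps=100_000):
    """Midpoint-rule approximation of the integral of e^{tx} / (b - a) over [a, b]."""
    h = (b - a) / steps
    return sum(math.exp(t * (a + (i + 0.5) * h)) for i in range(steps)) * h / (b - a)

a, b, t = 1.0, 4.0, 0.6
closed = (math.exp(t * b) - math.exp(t * a)) / (t * (b - a))
assert abs(mgf_uniform_ab(a, b, t) - closed) < 1e-6
```

As with the discrete uniform case, the closed form has a removable singularity at t = 0, where the integral itself simply evaluates to 1.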
Computation of the mgf. We start by computing the mgf of a standard normal random variable Z ~ N(0, 1). The mgf of Z is given by:

    M_Z(t) = \int_{−∞}^{+∞} e^{tx} f_Z(x) \, dx = \frac{1}{\sqrt{2π}} \int_{−∞}^{+∞} e^{tx} e^{−x^2/2} \, dx = \frac{1}{\sqrt{2π}} \int_{−∞}^{+∞} e^{−(x^2 − 2tx)/2} \, dx = e^{t^2/2} \underbrace{\frac{1}{\sqrt{2π}} \int_{−∞}^{+∞} e^{−(x − t)^2/2} \, dx}_{\text{integral of the pdf of } N(t,\,1)} = e^{t^2/2}

We know (Property 4.5 in Chapter 5) that X = µ + σZ follows a normal distribution N(µ, σ^2). Therefore the mgf of a normal distribution N(µ, σ^2) is given by:

    M_{µ+σZ}(t) = E\left[ e^{t(µ+σZ)} \right] = e^{tµ} E\left[ e^{tσZ} \right] = e^{tµ} M_Z(tσ) = e^{tµ} e^{t^2σ^2/2} = \exp\left\{ µt + \frac{σ^2 t^2}{2} \right\}

4.3 Exponential Distribution

Definition 4.3. A continuous random variable is said to follow an exponential distribution E(λ) with λ > 0 if its pdf f_X is given by:

    f_X(x) = λe^{−λx} \mathbb{1}_{\mathbb{R}_+}(x)    (17)
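The normal mgf exp{µt + σ²t²/2} can likewise be checked numerically against the integral (7); since the integrand e^{tx} f_X(x) is itself a Gaussian bump (centered at µ + σ²t), a midpoint sum over a wide finite interval captures it essentially exactly. A Python sketch with illustrative parameters:

```python
import math

def normal_mgf_numeric(mu, sigma, t, lo=-50.0, hi=50.0, steps=200_000):
    """Midpoint-rule approximation of the integral of e^{tx} f_X(x) for X ~ N(mu, sigma^2)."""
    h = (hi - lo) / steps
    total = 0.0
    for i in range(steps):
        x = lo + (i + 0.5) * h
        total += math.exp(t * x) * math.exp(-0.5 * ((x - mu) / sigma)**2)
    return total * h / (sigma * math.sqrt(2 * math.pi))

mu, sigma, t = 1.0, 2.0, 0.5
closed = math.exp(mu * t + sigma**2 * t**2 / 2)
assert abs(normal_mgf_numeric(mu, sigma, t) - closed) < 1e-6
```

The truncation to [−50, 50] is safe here because the shifted Gaussian integrand is negligible far from its center at µ + σ²t = 2.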
Computation of the mgf. Let X be a continuous random variable that follows an exponential distribution E(λ). The mgf of X is given by:

    M_X(t) = \int_{−∞}^{+∞} e^{tx} f_X(x) \, dx = \int_0^{+∞} e^{tx} λe^{−λx} \, dx = λ \int_0^{+∞} e^{−x(λ−t)} \, dx = λ \left[ −\frac{e^{−x(λ−t)}}{λ − t} \right]_{x=0}^{+∞} = λ \left( 0 + \frac{1}{λ − t} \right) = \frac{λ}{λ − t}

When integrating the exponential, we must be aware that lim_{x→∞} e^{−x(λ−t)} = 0 if and only if λ − t > 0. Therefore the derived formula holds if and only if t < λ.

Distribution | Probability Density Function | M_X(t) | E[X] | Var(X)
Uniform U(a, b), a, b ∈ R with a < b | f_X(x) = \frac{1}{b−a} \mathbb{1}_{[a,b]}(x) | (e^{tb} − e^{ta}) / (t(b − a)) if t ≠ 0; 1 if t = 0 | (a + b)/2 | (b − a)^2/12
Normal N(µ, σ^2), µ ∈ R, σ > 0 | f_X(x) = \frac{1}{σ\sqrt{2π}} \exp\{−\frac{1}{2}(\frac{x−µ}{σ})^2\}, for x ∈ R | exp{µt + σ^2 t^2/2} | µ | σ^2
Exponential E(λ), λ > 0 | f_X(x) = λe^{−λx} \mathbb{1}_{\mathbb{R}_+}(x) | λ/(λ − t), for t < λ | 1/λ | 1/λ^2
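The condition t < λ from the derivation above shows up concretely when checking the exponential mgf numerically: the integrand e^{−x(λ−t)} only decays (and the truncated integral only converges to λ/(λ − t)) when λ − t > 0. A Python sketch with illustrative values:

```python
import math

def exp_mgf_numeric(lam, t, hi=200.0, steps=200_000):
    """Midpoint-rule approximation of the integral of e^{tx} lam e^{-lam x} over [0, hi].

    Valid only for t < lam, where the integrand decays and the truncation is harmless.
    """
    h = hi / steps
    return sum(math.exp((t - lam) * (i + 0.5) * h) for i in range(steps)) * h * lam

lam, t = 1.5, 0.5                      # t < lam, so the mgf exists
assert abs(exp_mgf_numeric(lam, t) - lam / (lam - t)) < 1e-6
```

For t ≥ λ the same sum would grow without bound as `hi` increases, mirroring the divergence of the true integral.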