[Chapter 6. Functions of Random Variables]
6.1 Introduction
6.2 Finding the probability distribution of a function of random variables
6.3 The method of distribution functions
6.5 The method of moment-generating functions
6.1 Introduction
The objective of statistics is to make inferences about a population based on the information contained in a sample taken from that population. All quantities used to make inferences about a population are functions of the n random observations that appear in the sample. Consider the problem of estimating a population mean µ. One draws a random sample of n observations, y_1, y_2, ..., y_n, from the population and employs the sample mean

ȳ = (y_1 + y_2 + ... + y_n)/n = (1/n) Σ_{i=1}^n y_i

as an estimate of µ. How good is this sample mean as an estimate of µ? The answer depends on the behavior of the random variables Y_1, Y_2, ..., Y_n and their effect on the distribution of the random variable Ȳ = (1/n) Σ_{i=1}^n Y_i.
To determine the probability distribution of a function of n random variables Y_1, Y_2, ..., Y_n (say Ȳ), one must know the joint probability function of the random variables themselves, p(y_1, ..., y_n) or f(y_1, ..., y_n). The assumption we will make is that Y_1, Y_2, ..., Y_n is a random sample from a population with probability function p(y) or probability density function f(y): that is, the random variables Y_1, Y_2, ..., Y_n are independent with common probability function p(y) or common density function f(y). In short, Y_1, ..., Y_n iid ~ p(y) or f(y).
6.2 Finding the probability distribution of a function of random variables
We will study three methods for finding the probability distribution of a function of r.v.'s. Consider r.v.'s Y_1, Y_2, ..., Y_n and a function U(Y_1, Y_2, ..., Y_n), denoted simply as U, e.g. U = (Y_1 + Y_2 + ... + Y_n)/n. The three methods for finding the probability distribution of U are as follows:
- The method of distribution functions
- The method of transformations
- The method of moment-generating functions
6.3 Method of distribution functions
Suppose that we have r.v.'s Y_1, ..., Y_n with joint pdf f(y_1, ..., y_n). Let U = U(Y_1, ..., Y_n) be a function of the r.v.'s Y_1, Y_2, ..., Y_n.
1. Draw the region over which f(y_1, ..., y_n) is positive in (y_1, y_2, ..., y_n)-space, and find the region in that space for which U = u.
2. Find F_U(u) = P(U ≤ u) by integrating f(y_1, y_2, ..., y_n) over the region for which U ≤ u.
3. Find the density function f_U(u) by differentiating F_U(u). Thus, f_U(u) = dF_U(u)/du.

(Example 6.1) Suppose Y has density function given by
f(y) = 2y, 0 ≤ y ≤ 1; 0, elsewhere.
Let U be a new random variable defined by U = 3Y − 1. Find the probability density function of U.
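The three steps above can be sketched numerically. This is an illustrative check of Example 6.1, not part of the text: the function names and step sizes are assumptions, and the closed-form answer 2(u + 1)/9 is what working the integral by hand yields.

```python
# Numeric sketch of the distribution-function method for Example 6.1:
# Y has density f(y) = 2y on [0, 1] and U = 3Y - 1.

def f_Y(y):
    # Density of Y from Example 6.1.
    return 2.0 * y if 0.0 <= y <= 1.0 else 0.0

def F_U(u, steps=10000):
    # Step 2: F_U(u) = P(3Y - 1 <= u) = P(Y <= (u + 1)/3),
    # computed by midpoint integration of f_Y.
    upper = (u + 1.0) / 3.0
    if upper <= 0.0:
        return 0.0
    h = upper / steps
    return sum(f_Y((k + 0.5) * h) for k in range(steps)) * h

def f_U(u, eps=1e-5):
    # Step 3: differentiate F_U numerically.
    return (F_U(u + eps) - F_U(u - eps)) / (2 * eps)

# Working the integral by hand gives F_U(u) = ((u + 1)/3)^2, hence
# f_U(u) = 2(u + 1)/9 for -1 <= u <= 2; the numeric steps agree.
for u in (0.0, 0.5, 1.5):
    assert abs(f_U(u) - 2 * (u + 1) / 9) < 1e-3
```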
(Example 6.2) Suppose Y_1 and Y_2 have the joint density function given by
f(y_1, y_2) = 3y_1, 0 ≤ y_2 ≤ y_1 ≤ 1; 0, elsewhere.
Let U be a new random variable defined by U = Y_1 − Y_2. Find the probability density function of U.

(Example 6.3) Let (Y_1, Y_2) denote a random sample of size n = 2 from the uniform distribution on the interval (0, 1). Find the probability density function of U = Y_1 + Y_2.

(Example 6.4) Suppose Y has density function given by
f(y) = (y + 1)/2, −1 ≤ y ≤ 1; 0, elsewhere.
Let U be a new random variable defined by U = Y². Find the probability density function of U.
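Example 6.3 can be checked numerically with the same three steps. This sketch (an illustration, not from the text) integrates the joint density of two independent Uniform(0, 1) variables over the region Y_1 + Y_2 ≤ u and differentiates; the result should match the triangular density f_U(u) = u on [0, 1] and 2 − u on (1, 2] that the method produces by hand.

```python
# Numeric sketch of Example 6.3: U = Y1 + Y2 for a random sample of
# size 2 from Uniform(0, 1). The joint density is 1 on the unit square.
N = 2000
h = 1.0 / N

def F_U(u):
    # Step 2: P(Y1 + Y2 <= u) is the area of the unit square below
    # the line y1 + y2 = u, accumulated one y1-strip at a time.
    total = 0.0
    for i in range(N):
        y1 = (i + 0.5) * h
        # For fixed y1, y2 ranges over [0, min(1, u - y1)] when u - y1 > 0.
        width = min(1.0, max(0.0, u - y1))
        total += width * h
    return total

def f_U(u, eps=1e-4):
    # Step 3: numerical derivative of the distribution function.
    return (F_U(u + eps) - F_U(u - eps)) / (2 * eps)

# The hand-derived answer: f_U(u) = u on [0, 1], 2 - u on (1, 2].
for u in (0.25, 0.75, 1.5):
    expected = u if u <= 1.0 else 2.0 - u
    assert abs(f_U(u) - expected) < 1e-2
```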
(Example 6.5) Let U be a uniform random variable on the interval (0, 1). Find a transformation G(U) such that G(U) possesses an exponential distribution with mean β.
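Example 6.5 is the basis of inverse-transform sampling. A short sketch (names and the fixed seed are illustrative assumptions): the distribution-function method gives G(u) = −β ln(1 − u), since then P(G(U) ≤ g) = P(U ≤ 1 − e^{−g/β}) = 1 − e^{−g/β}, the exponential distribution function with mean β.

```python
# Example 6.5: turning a Uniform(0, 1) variable into an exponential
# with mean beta via G(u) = -beta * ln(1 - u).
import math
import random

def G(u, beta):
    # Inverse of the exponential distribution function F(g) = 1 - e^{-g/beta}.
    return -beta * math.log(1.0 - u)

random.seed(0)
beta = 2.0
sample = [G(random.random(), beta) for _ in range(200000)]
mean = sum(sample) / len(sample)
# By the law of large numbers, the sample mean should be close to beta.
assert abs(mean - beta) < 0.05
```

This is exactly how many libraries simulate exponential random variables from a uniform generator.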
[Method of Transformations]
From the [Method of distribution functions], we can arrive at a simple method of writing down the density function of U = h(Y), provided h(y) is either increasing or decreasing.

Steps to implement the transformation method: Suppose Y has probability density function f_Y(y). Let U = h(Y), where h(y) is either increasing or decreasing for all y such that f_Y(y) > 0.
1. Find the inverse function y = h^{-1}(u).
2. Evaluate dh^{-1}/du = d[h^{-1}(u)]/du.
3. Find f_U(u) by
f_U(u) = f_Y(h^{-1}(u)) |dh^{-1}/du|.
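The three steps can be written as a small generic routine. This is a sketch under stated assumptions: `transform_density` is a hypothetical helper (not from the text), h is increasing on the support of f_Y, and the derivative of h^{-1} is taken numerically rather than by hand. The worked check uses Y with f_Y(y) = 2y on [0, 1] and h(y) = y², for which the method gives f_U(u) = 2√u · 1/(2√u) = 1, i.e. U is Uniform(0, 1).

```python
# Generic sketch of the transformation method:
# f_U(u) = f_Y(h^{-1}(u)) * |d h^{-1}(u)/du|.
import math

def transform_density(f_Y, h_inv, u, eps=1e-6):
    # Step 1: y = h^{-1}(u); Step 2: numeric |d h^{-1}/du|; Step 3: combine.
    y = h_inv(u)
    slope = (h_inv(u + eps) - h_inv(u - eps)) / (2 * eps)
    return f_Y(y) * abs(slope)

# Worked check: f_Y(y) = 2y on [0, 1], h(y) = y^2, so h^{-1}(u) = sqrt(u)
# and f_U(u) = 2*sqrt(u) * 1/(2*sqrt(u)) = 1 on (0, 1).
f_Y = lambda y: 2.0 * y if 0.0 <= y <= 1.0 else 0.0
h_inv = lambda u: math.sqrt(u)
for u in (0.1, 0.5, 0.9):
    assert abs(transform_density(f_Y, h_inv, u) - 1.0) < 1e-4
```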
Why?
1. Let Y have probability density function f_Y(y). If h(y) is either increasing or decreasing for all y such that f_Y(y) > 0, then U = h(Y) has density function
f_U(u) = f_Y(h^{-1}(u)) |dh^{-1}/du|, where dh^{-1}/du = d[h^{-1}(u)]/du.
(Derivation (pp. 294-295))
i) Suppose h(y) is an increasing function of y.
ii) Suppose h(y) is a decreasing function of y.
2. Note that
1) It is not necessary that h(y) be increasing or decreasing for all values of y. h(·) need only be increasing or decreasing for the values of y such that f_Y(y) > 0. The set of points {y : f_Y(y) > 0} is called the support of the density f_Y(y). If y = h^{-1}(u) is not in the support of the density, then f_Y(h^{-1}(u)) = 0.
2) Direct application of this method requires us to check that the function h(y) is either increasing or decreasing for all y such that f_Y(y) > 0. If it is not, this method cannot be used.
(Example 6.7 (p.297)) Suppose Y has density function given by
f(y) = 2y, 0 ≤ y ≤ 1; 0, elsewhere.
Let U be a new random variable defined by U = 4Y + 3. Find the probability density function of U.

(Example 6.8 (p.298)) Suppose Y_1 and Y_2 have a joint density function given by
f(y_1, y_2) = e^{−(y_1+y_2)}, 0 ≤ y_1, 0 ≤ y_2; 0, elsewhere.
Find the density function of U = Y_1 + Y_2.

(Example 6.9) Suppose Y_1 and Y_2 have a joint density function given by
f(y_1, y_2) = 2(1 − y_1), 0 ≤ y_1 ≤ 1, 0 ≤ y_2 ≤ 1; 0, elsewhere.
Find the density function of U = Y_1 Y_2.
6.5 The method of moment-generating functions
This method is based on a uniqueness theorem for m.g.f.'s, which states that if two r.v.'s have identical moment-generating functions, then the two r.v.'s possess the same probability distribution.

Let U be a function of the r.v.'s Y_1, Y_2, ..., Y_n.
1. Find the moment-generating function of U, m_U(t).
2. Compare m_U(t) with other well-known moment-generating functions. If m_U(t) = m_V(t) for all values of t, then U and V have identical distributions (by the uniqueness theorem).
(Theorem 6.1) [Uniqueness Theorem] Let m_X(t) and m_Y(t) denote the moment-generating functions of r.v.'s X and Y, respectively. If both moment-generating functions exist and m_X(t) = m_Y(t) for all values of t, then X and Y have the same probability distribution.

(Example 6.11) Let Z be a normally distributed random variable with mean 0 and variance 1. Use the method of moment-generating functions to find the probability distribution of Z².
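Example 6.11 can be checked numerically. A sketch, not from the text: working the m.g.f. integral by hand gives E[e^{tZ²}] = (1 − 2t)^{−1/2} for t < 1/2, which is the chi-square m.g.f. with 1 degree of freedom; here the expectation is approximated by midpoint integration against the standard normal density.

```python
# Example 6.11: numerically approximate the m.g.f. of Z^2 for Z ~ N(0, 1)
# and compare it with (1 - 2t)^{-1/2}, the chi-square(1) m.g.f.
import math

def mgf_Z2(t, lo=-10.0, hi=10.0, steps=20000):
    # Midpoint approximation of E[e^{t Z^2}] = ∫ e^{t z^2} φ(z) dz;
    # the tails beyond |z| = 10 are negligible for t <= 1/4.
    h = (hi - lo) / steps
    total = 0.0
    for k in range(steps):
        z = lo + (k + 0.5) * h
        total += math.exp(t * z * z) * math.exp(-z * z / 2) / math.sqrt(2 * math.pi) * h
    return total

for t in (0.0, 0.1, 0.25):
    assert abs(mgf_Z2(t) - (1 - 2 * t) ** -0.5) < 1e-5
```

By the uniqueness theorem, matching m.g.f.'s on an interval around 0 identifies the distribution of Z² as chi-square with 1 degree of freedom.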
The moment-generating function method is often very useful for finding the distributions of sums of independent r.v.'s.

(Theorem 6.2 (p.304)) Let Y_1, Y_2, ..., Y_n be independent r.v.'s with moment-generating functions m_{Y_1}(t), m_{Y_2}(t), ..., m_{Y_n}(t), respectively. If U = Y_1 + Y_2 + ... + Y_n, then
m_U(t) = m_{Y_1}(t) × m_{Y_2}(t) × ... × m_{Y_n}(t).
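Theorem 6.2 can be verified exactly for a simple discrete case. The two fair dice below are an illustrative toy example (an assumption, not from the text): the m.g.f. of the sum, computed by enumerating all 36 equally likely outcomes, equals the product of the individual m.g.f.'s.

```python
# Theorem 6.2 for two independent fair dice: m_{Y1+Y2}(t) = m_{Y1}(t) * m_{Y2}(t).
import math

def mgf_die(t):
    # m.g.f. of a single fair die: E[e^{tY}] over y = 1, ..., 6.
    return sum(math.exp(t * y) for y in range(1, 7)) / 6.0

def mgf_sum_direct(t):
    # m.g.f. of U = Y1 + Y2, enumerating the 36 equally likely outcomes.
    return sum(math.exp(t * (y1 + y2))
               for y1 in range(1, 7) for y2 in range(1, 7)) / 36.0

for t in (-0.5, 0.0, 0.3):
    assert abs(mgf_sum_direct(t) - mgf_die(t) ** 2) < 1e-9
```

The identity holds because independence lets E[e^{t(Y_1+Y_2)}] factor as E[e^{tY_1}] E[e^{tY_2}].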
(Example 6.12) The number of customer arrivals at a checkout counter in a given interval of time possesses approximately a Poisson probability distribution. If Y_1 denotes the time until the first arrival, Y_2 denotes the time between the first and second arrivals, ..., and Y_n denotes the time between the (n − 1)st and nth arrivals, then it can be shown that Y_1, Y_2, ..., Y_n are independent random variables, with the density function of Y_i given by
f_{Y_i}(y_i) = (1/θ) e^{−y_i/θ}, y_i > 0; 0, elsewhere.
Here θ is the average time between arrivals. Find the probability density function of the waiting time from the opening of the counter until the nth customer arrives.
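Working Example 6.12 by Theorem 6.2 gives m_U(t) = [(1 − θt)^{−1}]^n = (1 − θt)^{−n}, the gamma m.g.f. with α = n and β = θ. The simulation below is an illustrative check (the seed, n = 5, and θ = 2 are assumptions): the simulated waiting time should show the Gamma(n, θ) mean nθ and variance nθ².

```python
# Example 6.12: the waiting time until the nth arrival is the sum of n
# independent exponential(theta) interarrival times, i.e. Gamma(n, theta).
import random

random.seed(1)
n, theta = 5, 2.0
trials = 100000
# random.expovariate takes a rate lambda = 1/theta (mean = theta).
sums = [sum(random.expovariate(1.0 / theta) for _ in range(n))
        for _ in range(trials)]
mean = sum(sums) / trials
var = sum((s - mean) ** 2 for s in sums) / trials
# Gamma(n, theta) has mean n*theta = 10 and variance n*theta^2 = 20.
assert abs(mean - n * theta) < 0.1
assert abs(var - n * theta ** 2) < 1.0
```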
The m.g.f. method can be used to establish some interesting and useful results about the distributions of some functions of normally distributed r.v.'s.

(Theorem 6.3) Let Y_1, Y_2, ..., Y_n be independent normally distributed r.v.'s with E(Y_i) = µ_i and V(Y_i) = σ_i², for i = 1, 2, ..., n, and let a_1, a_2, ..., a_n be constants. If U = Σ_{i=1}^n a_i Y_i, then U is a normally distributed random variable with
E(U) = Σ_{i=1}^n a_i µ_i and V(U) = Σ_{i=1}^n a_i² σ_i².
(Proof) (Exercise 6.35)
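The proof idea behind Theorem 6.3 can be checked numerically. A sketch with made-up coefficients (the values of a_i, µ_i, σ_i² below are assumptions for illustration): since m_{a_iY_i}(t) = exp(a_iµ_it + a_i²σ_i²t²/2), Theorem 6.2 makes m_U(t) the product of these factors, which collapses to the normal m.g.f. with mean Σ a_iµ_i and variance Σ a_i²σ_i².

```python
# Theorem 6.3 via m.g.f.'s: product of normal m.g.f.'s for a_i * Y_i
# equals the normal m.g.f. with mean sum(a_i mu_i), variance sum(a_i^2 s_i^2).
import math

a = [1.0, -2.0, 0.5]        # illustrative constants a_i
mu = [0.0, 3.0, -1.0]       # illustrative means mu_i
sigma2 = [1.0, 4.0, 9.0]    # illustrative variances sigma_i^2

def mgf_U(t):
    # Product of the m.g.f.'s of the independent terms a_i * Y_i.
    return math.prod(math.exp(ai * mi * t + ai ** 2 * s2 * t ** 2 / 2)
                     for ai, mi, s2 in zip(a, mu, sigma2))

mean_U = sum(ai * mi for ai, mi in zip(a, mu))
var_U = sum(ai ** 2 * s2 for ai, s2 in zip(a, sigma2))
for t in (-0.5, 0.2, 1.0):
    # Compare with the single-normal m.g.f. exp(mean*t + var*t^2/2).
    assert abs(mgf_U(t) - math.exp(mean_U * t + var_U * t ** 2 / 2)) < 1e-9
```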
(Theorem 6.4) Let Y_1, Y_2, ..., Y_n be independent normally distributed r.v.'s with E(Y_i) = µ_i and V(Y_i) = σ_i², for i = 1, 2, ..., n, and define Z_i by
Z_i = (Y_i − µ_i)/σ_i, i = 1, 2, ..., n.
Then Σ_{i=1}^n Z_i² has a χ²-distribution with n degrees of freedom.
(Proof) (Exercise 6.34)
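Theorem 6.4 can be illustrated by simulation (the seed, n = 4, and trial count are assumptions for this sketch): the sum of n squared standard normals should show the χ²(n) mean n and variance 2n.

```python
# Theorem 6.4 by simulation: sum of n squared standard normals
# should behave like chi-square with n degrees of freedom.
import random

random.seed(2)
n, trials = 4, 100000
vals = [sum(random.gauss(0.0, 1.0) ** 2 for _ in range(n))
        for _ in range(trials)]
mean = sum(vals) / trials
var = sum((v - mean) ** 2 for v in vals) / trials
# chi-square(n) has mean n and variance 2n.
assert abs(mean - n) < 0.1
assert abs(var - 2 * n) < 0.5
```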
(Exercise 6.43)
(Exercise 6.44)
Homework: Reading Chapter 6
Homework (Due Dec 4th): 6.1, 6.8, 6.16, 6.19, 6.24, 6.31, 6.37, 6.40, 6.46, 6.49.