[Chapter 6. Functions of Random Variables]


Contents
6.1 Introduction
6.2 Finding the probability distribution of a function of random variables
6.3 The method of distribution functions
6.4 The method of transformations
6.5 The method of moment-generating functions

6.1 Introduction

The objective of statistics is to make inferences about a population based on information contained in a sample taken from that population. All quantities used to make inferences about a population are functions of the $n$ random observations that appear in a sample.

Consider the problem of estimating a population mean $\mu$. One draws a random sample of $n$ observations, $y_1, y_2, \ldots, y_n$, from the population and employs the sample mean
$$\bar{y} = \frac{y_1 + y_2 + \cdots + y_n}{n} = \frac{1}{n}\sum_{i=1}^{n} y_i$$
as an estimate of $\mu$. How good is this sample mean as an estimate of $\mu$? The answer depends on the behavior of the random variables $Y_1, Y_2, \ldots, Y_n$ and their effect on the distribution of the random variable $\bar{Y} = (1/n)\sum_{i=1}^{n} Y_i$.
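To see concretely how the population and the sample size drive the behavior of $\bar{Y}$, here is a minimal simulation sketch (not part of the original notes; the exponential population and all constants are arbitrary choices for illustration):

```python
# Simulation sketch (illustration only, not from the notes): draw many
# random samples of size n from an exponential population with mean mu,
# and look at the resulting distribution of the sample mean Ybar.
import numpy as np

rng = np.random.default_rng(0)
mu, n, reps = 2.0, 30, 100_000

# Each row is one random sample y1, ..., yn; each row mean is one
# realization of the statistic Ybar.
samples = rng.exponential(scale=mu, size=(reps, n))
ybar = samples.mean(axis=1)

print(f"E(Ybar) ~ {ybar.mean():.4f}   (population mean mu = {mu})")
print(f"V(Ybar) ~ {ybar.var():.4f}   (sigma^2/n = {mu**2 / n:.4f})")
```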

To determine the probability distribution for a function of $n$ random variables $Y_1, Y_2, \ldots, Y_n$ (say $\bar{Y}$), one must know the joint probability function of the random variables themselves, $p(y_1, \ldots, y_n)$ or $f(y_1, \ldots, y_n)$.

The assumption that we will make is that $Y_1, Y_2, \ldots, Y_n$ is a random sample from a population with probability function $p(y)$ or probability density function $f(y)$: that is, the random variables $Y_1, Y_2, \ldots, Y_n$ are independent with common probability function $p(y)$ or common density function $f(y)$:
$$Y_1, \ldots, Y_n \overset{\text{iid}}{\sim} p(y) \text{ or } f(y).$$

6.2 Finding the probability distribution of a function of random variables

We will study three methods for finding the probability distribution for a function of random variables. Consider random variables $Y_1, Y_2, \ldots, Y_n$ and a function $U(Y_1, Y_2, \ldots, Y_n)$, denoted simply as $U$, e.g. $U = (Y_1 + Y_2 + \cdots + Y_n)/n$. The three methods for finding the probability distribution of $U$ are as follows:

1. The method of distribution functions.
2. The method of transformations.
3. The method of moment-generating functions.

6.3 Method of distribution functions

Suppose that we have random variables $Y_1, \ldots, Y_n$ with joint pdf $f(y_1, \ldots, y_n)$. Let $U = U(Y_1, \ldots, Y_n)$ be a function of the random variables $Y_1, Y_2, \ldots, Y_n$.

1. Draw the region over which $f(y_1, \ldots, y_n)$ is positive, and find the region in the $(y_1, y_2, \ldots, y_n)$ space for which $U = u$.
2. Find $F_U(u) = P(U \le u)$ by integrating $f(y_1, y_2, \ldots, y_n)$ over the region for which $U \le u$.
3. Find the density function $f_U(u)$ by differentiating $F_U(u)$. Thus, $f_U(u) = dF_U(u)/du$.

(Example 6.1) Suppose $Y$ has density function given by
$$f(y) = \begin{cases} 2y, & 0 \le y \le 1, \\ 0, & \text{elsewhere.} \end{cases}$$
Let $U$ be a new random variable defined by $U = 3Y - 1$. Find the probability density function for $U$.
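A worked outline of Example 6.1 (the algebra is a reconstruction; it is not in the transcription):
$$F_U(u) = P(3Y - 1 \le u) = P\!\left(Y \le \frac{u+1}{3}\right) = \int_0^{(u+1)/3} 2y\,dy = \left(\frac{u+1}{3}\right)^{2}, \quad -1 \le u \le 2,$$
so that, differentiating,
$$f_U(u) = \begin{cases} \dfrac{2(u+1)}{9}, & -1 \le u \le 2, \\ 0, & \text{elsewhere.} \end{cases}$$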

(Example 6.2) Suppose $Y_1$ and $Y_2$ have the joint density function given by
$$f(y_1, y_2) = \begin{cases} 3y_1, & 0 \le y_2 \le y_1 \le 1, \\ 0, & \text{elsewhere.} \end{cases}$$
Let $U$ be a new random variable defined by $U = Y_1 - Y_2$. Find the probability density function for $U$.

(Example 6.3) Let $(Y_1, Y_2)$ denote a random sample of size $n = 2$ from the uniform distribution on the interval $(0, 1)$. Find the probability density function for $U = Y_1 + Y_2$.

(Example 6.4) Suppose $Y$ has density function given by
$$f(y) = \begin{cases} \dfrac{y+1}{2}, & -1 \le y \le 1, \\ 0, & \text{elsewhere.} \end{cases}$$
Let $U$ be a new random variable defined by $U = Y^2$. Find the probability density function for $U$.
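A worked outline of Example 6.3 (a reconstruction; the familiar triangular density): for $0 \le u \le 1$ the region $\{y_1 + y_2 \le u\}$ inside the unit square is a triangle of area $u^2/2$, and for $1 < u \le 2$ its complement is a triangle of area $(2-u)^2/2$, so
$$F_U(u) = \begin{cases} u^2/2, & 0 \le u \le 1, \\ 1 - (2-u)^2/2, & 1 < u \le 2, \end{cases} \qquad f_U(u) = \begin{cases} u, & 0 \le u \le 1, \\ 2 - u, & 1 < u \le 2, \\ 0, & \text{elsewhere.} \end{cases}$$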

(Example 6.5) Let U be a uniform random variable on the interval (0, 1). Find a transformation G(U) such that G(U) possesses an exponential distribution with mean β.
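The standard solution to Example 6.5 is $G(u) = -\beta \ln(1 - u)$, the inverse of the exponential CDF $F(x) = 1 - e^{-x/\beta}$. A minimal simulation sketch of this inverse-transform idea (not part of the notes; $\beta = 3$ and the sample size are arbitrary choices):

```python
# Inverse-transform sampling sketch: if U ~ Uniform(0, 1), then
# G(U) = -beta * log(1 - U) should be exponential with mean beta,
# since P(G(U) <= x) = P(U <= 1 - exp(-x/beta)) = 1 - exp(-x/beta).
import numpy as np

rng = np.random.default_rng(1)
beta = 3.0
u = rng.uniform(size=200_000)
x = -beta * np.log(1.0 - u)        # G(U)

print(f"sample mean {x.mean():.3f}  (target beta = {beta})")
print(f"sample var  {x.var():.3f}  (target beta^2 = {beta**2:.1f})")
```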

[Method of Transformations] From the method of distribution functions, we can arrive at a simple method of writing down the density function of $U = h(Y)$, provided $h(y)$ is either decreasing or increasing.

Steps to implement the transformation method: suppose $Y$ has probability density function $f_Y(y)$, and let $U = h(Y)$, where $h(y)$ is either increasing or decreasing for all $y$ such that $f_Y(y) > 0$.

1. Find the inverse function $y = h^{-1}(u)$.
2. Evaluate $\dfrac{dh^{-1}}{du} = \dfrac{dh^{-1}(u)}{du}$.
3. Find $f_U(u)$ by
$$f_U(u) = f_Y\big(h^{-1}(u)\big)\left|\frac{dh^{-1}}{du}\right|.$$
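A minimal numeric check of these steps (not from the notes; the density $f_Y(y) = 2y$ and the map $h(y) = y^2$ are arbitrary choices): here $h^{-1}(u) = \sqrt{u}$, $|dh^{-1}/du| = 1/(2\sqrt{u})$, and the formula gives $f_U(u) = 2\sqrt{u} \cdot 1/(2\sqrt{u}) = 1$ on $(0, 1)$, so $U$ should be Uniform(0, 1).

```python
# Numeric check of the transformation formula: Y with density 2y on
# (0, 1) can be simulated as the square root of a uniform, and
# U = h(Y) = Y^2 should then be uniform on (0, 1).
import numpy as np

rng = np.random.default_rng(2)
y = np.sqrt(rng.uniform(size=200_000))   # Y has density 2y on (0, 1)
u = y ** 2                               # U = h(Y)

# With density=True, every bin height should be close to 1.
hist, _ = np.histogram(u, bins=10, range=(0.0, 1.0), density=True)
print(np.round(hist, 2))
```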

Why?

1. Let $Y$ have probability density function $f_Y(y)$. If $h(y)$ is either increasing or decreasing for all $y$ such that $f_Y(y) > 0$, then $U = h(Y)$ has density function
$$f_U(u) = f_Y\big(h^{-1}(u)\big)\left|\frac{dh^{-1}}{du}\right|, \quad \text{where } \frac{dh^{-1}}{du} = \frac{dh^{-1}(u)}{du}.$$
(Derivation (pp. 294-295)) i) Suppose $h(y)$ is an increasing function of $y$. ii) Suppose $h(y)$ is a decreasing function of $y$.
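A sketch of case i), filled in here (the notes defer the derivation to the textbook): if $h$ is increasing, then $h(Y) \le u$ exactly when $Y \le h^{-1}(u)$, so
$$F_U(u) = P\big(h(Y) \le u\big) = P\big(Y \le h^{-1}(u)\big) = F_Y\big(h^{-1}(u)\big), \qquad f_U(u) = f_Y\big(h^{-1}(u)\big)\frac{dh^{-1}(u)}{du}.$$
Case ii) is similar, except $F_U(u) = 1 - F_Y\big(h^{-1}(u)\big)$; differentiating produces a minus sign that the absolute value in the formula absorbs.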

2. Note that

1) It is not necessary that $h(y)$ be increasing or decreasing for all values of $y$; $h(\cdot)$ need only be increasing or decreasing for the values of $y$ such that $f_Y(y) > 0$. The set of points $\{y : f_Y(y) > 0\}$ is called the support of the density $f_Y(y)$. If $y = h^{-1}(u)$ is not in the support of the density, then $f_Y(h^{-1}(u)) = 0$.

2) Direct application of this method requires us to check that the function $h(y)$ is either increasing or decreasing for all $y$ such that $f_Y(y) > 0$. If it is not, this method cannot be used.

(Example 6.7 (p. 297)) Suppose $Y$ has density function given by
$$f(y) = \begin{cases} 2y, & 0 \le y \le 1, \\ 0, & \text{elsewhere.} \end{cases}$$
Let $U$ be a new random variable defined by $U = 4Y + 3$. Find the probability density function for $U$.

(Example 6.8 (p. 298)) Suppose $Y_1$ and $Y_2$ have a joint density function given by
$$f(y_1, y_2) = \begin{cases} e^{-(y_1 + y_2)}, & 0 \le y_1,\; 0 \le y_2, \\ 0, & \text{elsewhere.} \end{cases}$$
Find the density function for $U = Y_1 + Y_2$.

(Example 6.9 (p. 298)) Suppose $Y_1$ and $Y_2$ have a joint density function given by
$$f(y_1, y_2) = \begin{cases} 2(1 - y_1), & 0 \le y_1 \le 1,\; 0 \le y_2 \le 1, \\ 0, & \text{elsewhere.} \end{cases}$$
Find the density function for $U = Y_1 Y_2$.
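A worked outline of Example 6.7 (a reconstruction): $h(y) = 4y + 3$ is increasing, with $h^{-1}(u) = (u - 3)/4$ and $|dh^{-1}/du| = 1/4$, so
$$f_U(u) = f_Y\!\left(\frac{u-3}{4}\right)\cdot\frac{1}{4} = \begin{cases} \dfrac{u-3}{8}, & 3 \le u \le 7, \\ 0, & \text{elsewhere.} \end{cases}$$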

6.5 The method of moment-generating functions

This method is based on a uniqueness theorem for moment-generating functions, which states that if two random variables have identical moment-generating functions, they possess the same probability distributions. Let $U$ be a function of the random variables $Y_1, Y_2, \ldots, Y_n$.

1. Find the moment-generating function for $U$, $m_U(t)$.
2. Compare $m_U(t)$ with other well-known moment-generating functions. If $m_U(t) = m_V(t)$ for all values of $t$, then $U$ and $V$ have identical distributions (by the uniqueness theorem).

(Theorem 6.1) [Uniqueness Theorem] Let $m_X(t)$ and $m_Y(t)$ denote the moment-generating functions of random variables $X$ and $Y$, respectively. If both moment-generating functions exist and $m_X(t) = m_Y(t)$ for all values of $t$, then $X$ and $Y$ have the same probability distribution.

(Example 6.11) Let $Z$ be a normally distributed random variable with mean 0 and variance 1. Use the method of moment-generating functions to find the probability distribution of $Z^2$.
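A worked outline of Example 6.11 (a reconstruction): recognizing a normal kernel with variance $(1 - 2t)^{-1}$ inside the integral gives
$$m_{Z^2}(t) = E\big(e^{tZ^2}\big) = \int_{-\infty}^{\infty} e^{tz^2}\,\frac{1}{\sqrt{2\pi}}\,e^{-z^2/2}\,dz = \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty} e^{-z^2(1-2t)/2}\,dz = (1 - 2t)^{-1/2}, \quad t < 1/2.$$
This is the moment-generating function of a gamma distribution with $\alpha = 1/2$ and $\beta = 2$, so by the uniqueness theorem $Z^2$ has a $\chi^2$ distribution with 1 degree of freedom.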

The moment-generating function method is often very useful for finding the distributions of sums of independent random variables.

(Theorem 6.2 (p. 304)) Let $Y_1, Y_2, \ldots, Y_n$ be independent random variables with moment-generating functions $m_{Y_1}(t), m_{Y_2}(t), \ldots, m_{Y_n}(t)$, respectively. If $U = Y_1 + Y_2 + \cdots + Y_n$, then
$$m_U(t) = m_{Y_1}(t)\, m_{Y_2}(t) \cdots m_{Y_n}(t).$$
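The proof is one line once independence is used (a sketch, filled in here):
$$m_U(t) = E\big(e^{t(Y_1 + \cdots + Y_n)}\big) = E\big(e^{tY_1} \cdots e^{tY_n}\big) = E\big(e^{tY_1}\big) \cdots E\big(e^{tY_n}\big) = m_{Y_1}(t) \cdots m_{Y_n}(t),$$
where the third equality holds because the $Y_i$ (and hence the $e^{tY_i}$) are independent.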

(Example 6.12) The number of customer arrivals at a checkout counter in a given interval of time possesses approximately a Poisson probability distribution. If $Y_1$ denotes the time until the first arrival, $Y_2$ denotes the time between the first and second arrivals, ..., and $Y_n$ denotes the time between the $(n-1)$st and $n$th arrivals, then it can be shown that $Y_1, Y_2, \ldots, Y_n$ are independent random variables, with the density function for $Y_i$ given by
$$f_{Y_i}(y_i) = \begin{cases} \dfrac{1}{\theta}\, e^{-y_i/\theta}, & y_i > 0, \\ 0, & \text{elsewhere.} \end{cases}$$
Here $\theta$ is the average time between arrivals. Find the probability density function for the waiting time from the opening of the counter until the $n$th customer arrives.
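Each $m_{Y_i}(t) = (1 - \theta t)^{-1}$, so by Theorem 6.2 the waiting time has moment-generating function $(1 - \theta t)^{-n}$, which identifies a gamma distribution with $\alpha = n$ and $\beta = \theta$. A minimal simulation sketch of this conclusion (not in the notes; $n = 5$, $\theta = 2$, and the replication count are arbitrary):

```python
# Simulation sketch for Example 6.12: the waiting time to the n-th
# arrival is Y1 + ... + Yn, which should follow gamma(alpha=n, beta=theta).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n, theta = 5, 2.0
waits = rng.exponential(scale=theta, size=(100_000, n)).sum(axis=1)

# Compare the simulated waiting times with the gamma(n, theta) model.
print(f"mean {waits.mean():.2f}  (n * theta = {n * theta})")
print(stats.kstest(waits, stats.gamma(a=n, scale=theta).cdf))
```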

The m.g.f. method can be used to establish some interesting and useful results about the distributions of some functions of normally distributed random variables.

(Theorem 6.3) Let $Y_1, Y_2, \ldots, Y_n$ be independent normally distributed random variables with $E(Y_i) = \mu_i$ and $V(Y_i) = \sigma_i^2$, for $i = 1, 2, \ldots, n$, and let $a_1, a_2, \ldots, a_n$ be constants. If $U = \sum_{i=1}^{n} a_i Y_i$, then $U$ is a normally distributed random variable with
$$E(U) = \sum_{i=1}^{n} a_i \mu_i \quad \text{and} \quad V(U) = \sum_{i=1}^{n} a_i^2 \sigma_i^2.$$
(Proof) (Exercise 6.35)
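A numeric sketch of Theorem 6.3 (not from the notes; the constants $a_i$, $\mu_i$, $\sigma_i$ are arbitrary choices):

```python
# Check that a linear combination of independent normals has the
# mean and variance stated in Theorem 6.3.
import numpy as np

rng = np.random.default_rng(4)
a = np.array([2.0, -1.0, 0.5])
mu = np.array([1.0, 3.0, -2.0])
sigma = np.array([1.0, 2.0, 0.5])

y = rng.normal(mu, sigma, size=(100_000, 3))   # column i is Y_i
u = y @ a                                      # U = sum_i a_i * Y_i

print(f"E(U) ~ {u.mean():.3f}  (theory {a @ mu:.3f})")
print(f"V(U) ~ {u.var():.3f}  (theory {a**2 @ sigma**2:.3f})")
```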

(Theorem 6.4) Let $Y_1, Y_2, \ldots, Y_n$ be independent normally distributed random variables with $E(Y_i) = \mu_i$ and $V(Y_i) = \sigma_i^2$, for $i = 1, 2, \ldots, n$, and define $Z_i$ by
$$Z_i = \frac{Y_i - \mu_i}{\sigma_i}, \quad i = 1, 2, \ldots, n.$$
Then $\sum_{i=1}^{n} Z_i^2$ has a $\chi^2$-distribution with $n$ degrees of freedom.
(Proof) (Exercise 6.34)
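A simulation sketch of Theorem 6.4 (not from the notes; $n = 4$ and the replication count are arbitrary choices):

```python
# The sum of n squared standardized normals should behave like a
# chi-square random variable with n degrees of freedom
# (mean n, variance 2n).
import numpy as np

rng = np.random.default_rng(5)
n = 4
z = rng.standard_normal(size=(100_000, n))
q = (z ** 2).sum(axis=1)                 # sum of Z_i^2

print(f"mean {q.mean():.2f}  (chi-square mean = n = {n})")
print(f"var  {q.var():.2f}  (chi-square variance = 2n = {2 * n})")
```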

(Exercise 6.43) (Exercise 6.44)

Homework: Reading Chapter 6. Homework (due Dec 4th): 6.1, 6.8, 6.16, 6.19, 6.24, 6.31, 6.37, 6.40, 6.46, 6.49.