Chapter 6: Functions of Random Variables


Chapter 6: Functions of Random Variables. Professor Ron Fricker, Naval Postgraduate School, Monterey, California. 3/15/15. Reading Assignment: Sections 6.1-6.4 & 6.8.

Goals for this Chapter: Introduce methods useful for finding the probability distributions of functions of random variables: the method of distribution functions and the method of transformations.

Sections 6.1 & 6.2: Functions of Random Variables. In OA3102 you're going to learn to do inference: using a sample of data, you're going to make quantitative statements about the population. For example, you might want to estimate the mean µ of the population from a sample of data Y_1, ..., Y_n. To estimate the mean of the population (distribution) you will use the sample average Ȳ = (1/n) Σ_{i=1}^n Y_i.

Functions of Random Variables. But in doing statistics, you will quantify the goodness of the estimate. This error of estimation will have to be quantified probabilistically, since the estimates themselves are random. That is, the observations (the Y_i's) are random observations from the population, so they are random variables, and so are any statistics that are functions of the random variables. So, Ȳ is a function of other random variables, and the question then is how to find its probability distribution.

But Functions of Random Variables Arise in Other Situations As Well. Suppose you're analyzing a subsurface search and detection operation. Data shows that the time to detect one target is uniformly distributed on the interval (0, 2 hrs), but you're interested in analyzing the total time to sequentially detect n targets, so you need to know the distribution of the sum of n detection times. Or suppose you're analyzing a depot-level repair activity. It's reasonable to assume that the time to receipt of one repair part follows an exponential distribution with mean 5 days, but the end-item is not repaired until the last part is received, so you need to know the distribution of the maximum receipt time of n parts.

Functions of Random Variables. To determine the probability distribution for a function of n random variables, Y_1, Y_2, ..., Y_n, we must find the joint probability distribution of the r.v.'s. To do so, we will assume the observations are obtained via (simple) random sampling, so that they are independent and identically distributed. Thus: p(y_1, y_2, ..., y_n) = p(y_1)p(y_2)···p(y_n) or f(y_1, y_2, ..., y_n) = f(y_1)f(y_2)···f(y_n).

Methods for Finding Joint Distributions. Three methods: the method of distribution functions, the method of transformations, and the method of moment-generating functions. We'll do the first two. The method of moment-generating functions follows from Chapter 3.9 (which we skipped): if two random variables have the same moment-generating functions, then they also have the same probability distributions. It's beyond the scope of this class (unless you're an OR Ph.D. student).

Section 6.3: Method of Distribution Functions. The idea: you want to determine the distribution function of a random variable U, where U is a function of another random variable Y, and you know the pdf of Y, f(y). The methodology, "by the numbers": 1. For Y with pdf f(y), specify the function U of Y. 2. Then find F_U(u) = P(U ≤ u) by directly integrating f(y) over the region for which U ≤ u. 3. Then the pdf for U is found by differentiating F_U(u). Let's look at an example.

Textbook Example 6.1. For clarity, let's dispense with the story and get right to the problem: Y has pdf f(y) = 2y for 0 ≤ y ≤ 1, and 0 elsewhere. Find the pdf of U, f_U(u), where U = 3Y − 1. Solution:

Textbook Example 6.1 Solution (cont'd)
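The worked solution did not survive the transcription, but the method gives F_U(u) = P(3Y − 1 ≤ u) = P(Y ≤ (u+1)/3) = ((u+1)/3)² and hence f_U(u) = 2(u+1)/9 on −1 ≤ u ≤ 2, so E(U) = 1. A quick Monte Carlo sketch in Python (the slides use R; variable names here are my own) checks that result:

```python
import random

# Y has cdf F_Y(y) = y^2 on [0, 1], so Y = sqrt(V) for V ~ U(0, 1)
# (inverse-transform sampling).  Then U = 3Y - 1 should have
# cdf F_U(u) = ((u + 1)/3)^2 on [-1, 2], so F_U(0.5) = 0.25,
# and E(U) = 3*E(Y) - 1 = 3*(2/3) - 1 = 1.
rng = random.Random(42)
n = 200_000
u = [3 * rng.random() ** 0.5 - 1 for _ in range(n)]

mean_u = sum(u) / n                              # should be near 1
frac_below_half = sum(x <= 0.5 for x in u) / n   # should be near 0.25
print(mean_u, frac_below_half)
```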

Can Also Apply the Method to Bivariate (and Multivariate) Distributions. The idea: in the bivariate case, there are random variables Y_1 and Y_2 with joint density f(y_1, y_2), and some function U = h(Y_1, Y_2) of Y_1 and Y_2. I.e., for every point (y_1, y_2) there is one and only one value of U. If there exists a region of values (y_1, y_2) such that U ≤ u, then integrating f(y_1, y_2) over this region gives P(U ≤ u) = F_U(u). Then get f_U(u) by differentiation. Again, let's do some examples.

Textbook Example 6.2. Returning to Example 5.4, we had the following joint density of Y_1 and Y_2: f(y_1, y_2) = 3y_1 for 0 ≤ y_2 ≤ y_1 ≤ 1, and 0 elsewhere. Find the pdf of U, f_U(u), where U = Y_1 − Y_2. Also find E(U). Solution: (Source: Wackerly, D.D., W.M. Mendenhall III, and R.L. Scheaffer (2008). Mathematical Statistics with Applications, 7th edition, Thomson Brooks/Cole.)

Textbook Example 6.2 Solution (cont'd)

Textbook Example 6.2 Solution (cont'd)

Textbook Example 6.2 Solution (cont'd). Graphs of the cdf and pdf of U:
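The derivation itself is lost to the transcription; the textbook's answer is f_U(u) = (3/2)(1 − u²) on 0 ≤ u ≤ 1, which gives E(U) = 3/8. A Monte Carlo sketch (mine, not the textbook's) confirms the expectation:

```python
import random

# Marginally Y1 has density 3*y1^2 (cdf y1^3), so Y1 = V**(1/3) for
# V ~ U(0,1); given Y1, Y2 is uniform on (0, Y1) since the joint
# density 3*y1 is constant in y2.  With f_U(u) = (3/2)(1 - u^2) on
# [0, 1], E(U) = 3/8.
rng = random.Random(7)
n = 200_000
u = []
for _ in range(n):
    y1 = rng.random() ** (1 / 3)
    y2 = rng.uniform(0, y1)
    u.append(y1 - y2)

mean_u = sum(u) / n   # should be near E(U) = 3/8 = 0.375
print(mean_u)
```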

Textbook Example 6.3. Let (Y_1, Y_2) denote a random sample of size n = 2 from the uniform distribution on (0, 1). Find the pdf for U = Y_1 + Y_2. Solution:

Textbook Example 6.3 Solution (cont'd)

Textbook Example 6.3 Solution (cont'd). Using a geometry-based approach:

Textbook Example 6.3 Solution (cont'd)

Textbook Example 6.3 Solution (cont'd)

Empirically Demonstrating the Result. Let's use R to verify the Example 6.3 result. [Histogram of simulated u values with the triangular density overlaid.] Very close. Of course, there's noise related to the stochastic nature of the demonstration.
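The R code behind that histogram is not in the transcript; here is a minimal Python stand-in for the same demonstration. U = Y_1 + Y_2 for two independent U(0,1) draws has the triangular density f_U(u) = u on [0, 1] and 2 − u on [1, 2], so F_U(1) = 1/2 and E(U) = 1:

```python
import random

# Simulate the sum of two independent U(0,1) random variables and
# check two facts implied by the triangular pdf from Example 6.3.
rng = random.Random(0)
n = 200_000
u = [rng.random() + rng.random() for _ in range(n)]

mean_u = sum(u) / n                           # should be near 1
frac_below_one = sum(x <= 1 for x in u) / n   # should be near 0.5
print(mean_u, frac_below_one)
```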

Distribution Function Method Summary. Let U be a function of the random variables Y_1, Y_2, ..., Y_n. Then: 1. Find the region U = u in the (y_1, y_2, ..., y_n) space. 2. Find the region U ≤ u. 3. Find F_U(u) = P(U ≤ u) by integrating f(y_1, y_2, ..., y_n) over the region U ≤ u. 4. Find the density function f_U(u) by differentiating F_U(u): f_U(u) = dF_U(u)/du.

Illustrating the Method. Consider the case U = h(Y) = Y², where Y is continuous with cdf F_Y(y) and pdf f_Y(y). 1. Find the region U = u in the (y_1, y_2, ..., y_n) space. Solution: As Figure 6.7 shows, Y² = u, i.e., Y = ±√u.

Illustrating the Method Continued. 2. Find the region where U ≤ u. Solution: By inspection of Figure 6.7, it should be clear that U ≤ u whenever −√u ≤ Y ≤ √u. 3. Find F_U(u) = P(U ≤ u) by integrating f(y) over the region U ≤ u. Solution: There are two cases to consider. First, if u ≤ 0 then it must be (see Fig. 6.7) that F_U(u) = P(U ≤ u) = P(Y² ≤ u) = 0. Second, if u > 0:

Illustrating the Method Continued F U (u) = Z p u p u f(y)dy = F Y ( p u) F Y ( p u) So, we can write the cdf of U as FY ( p p u) F F U (u) = Y ( u), u > 0 0, otherwise 4. Finally, find the pdf by differentiating F U (u): f U (u) = 1 2 p u [f Y ( p u)+f Y ( p u)], u > 0 0, otherwise 3/15/15 25

Textbook Example 6.4. Let Y have pdf f_Y(y) = (y + 1)/2 for −1 ≤ y ≤ 1, and 0 elsewhere. Find the pdf for U = Y². Solution:

Textbook Example 6.5. Let U be a uniform r.v. on (0, 1). Find a transformation G(U) such that G(U) has an exponential distribution with mean β. Solution:

Textbook Example 6.5 Solution (cont'd)

Empirically Demonstrating the Result. Let's use R to verify the Example 6.5 result. [Side-by-side histograms of y and exp_rvs, both showing the same exponential shape over 0 to 12.]

Simulating Exponential R.V.s Using U(0,1)s. What the last example shows is that we can use computer subroutines designed to generate U(0,1) random variables to generate other types of random variables. It works as long as the cdf F(y) has a unique inverse F⁻¹(·). That's not always the case; for those situations there are other methods. And the beauty of R and other modern software packages is that most or all of this work is already done for you.
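The inverse-transform idea from Example 6.5 can be sketched in a few lines of Python (the slides demonstrate it in R): if U ~ U(0,1), then G(U) = −β ln(1 − U) has an exponential distribution with mean β. The choice β = 6 below is an arbitrary value for the check:

```python
import math
import random

# Inverse-transform sampling: F(y) = 1 - exp(-y/beta) has inverse
# F^{-1}(u) = -beta * log(1 - u), so applying it to U(0,1) draws
# yields exponential random variables with mean beta.
rng = random.Random(2024)
beta = 6.0
n = 200_000
y = [-beta * math.log(1 - rng.random()) for _ in range(n)]

mean_y = sum(y) / n   # should be near beta = 6
print(mean_y)
```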

Section 6.3 Homework. Do problems 6.1, 6.4, 6.11.

Section 6.4: Method of Transformations. A (sometimes simpler) method to use if U = h(Y) is an increasing or decreasing function. Remember, an increasing function is one where, if y_1 < y_2, then h(y_1) < h(y_2) for all y_1 and y_2. Similarly, a decreasing function is one where, if y_1 < y_2, then h(y_1) > h(y_2) for all y_1 and y_2.

Method of Transformations. Note that if h(y) is an increasing function of y, then h⁻¹(u) is an increasing function of u: if u_1 < u_2, then h⁻¹(u_1) = y_1 < y_2 = h⁻¹(u_2). So, the set of points y such that h(y) ≤ u_1 is the same as the set of points y such that y ≤ h⁻¹(u_1). This allows us to derive the following result.

Method of Transformations.
F_U(u) = P(U ≤ u) = P[h(Y) ≤ u] = P{h⁻¹[h(Y)] ≤ h⁻¹(u)} = P[Y ≤ h⁻¹(u)] = F_Y[h⁻¹(u)]
And, from this result, it follows that
f_U(u) = dF_U(u)/du = dF_Y[h⁻¹(u)]/du = f_Y[h⁻¹(u)] · d[h⁻¹(u)]/du

Textbook Example 6.6. Returning to Example 6.1, solve via the Method of Transformations, where Y has pdf f(y) = 2y for 0 ≤ y ≤ 1, and 0 elsewhere. I.e., find the pdf of U, f_U(u), where U = 3Y − 1. Solution:

Textbook Example 6.6 Solution (cont'd)

Now, for Decreasing Functions. Now, if h is a decreasing function (e.g., Figure 6.9), then if u_1 < u_2, we have h⁻¹(u_1) = y_1 > y_2 = h⁻¹(u_2). From this, it follows that F_U(u) = P(U ≤ u) = P[Y ≥ h⁻¹(u)] = 1 − F_Y[h⁻¹(u)]. Differentiating gives

Putting It All Together.
f_U(u) = −f_Y[h⁻¹(u)] · d[h⁻¹(u)]/du = f_Y[h⁻¹(u)] · |d[h⁻¹(u)]/du|
Theorem: Let Y have pdf f_Y(y). If h(y) is either increasing or decreasing for all y such that f_Y(y) > 0,* then U = h(Y) has pdf
f_U(u) = f_Y[h⁻¹(u)] · |d[h⁻¹(u)]/du|
* Note that the function only has to be increasing or decreasing over the support of the pdf of Y: {y : f_Y(y) > 0}.

Textbook Example 6.7. Let Y have pdf f(y) = 2y for 0 ≤ y ≤ 1, and 0 elsewhere. Find the pdf of U = −4Y + 3. Solution:

Textbook Example 6.7 Solution (cont'd)
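The written solution did not transcribe. Applying the theorem: h⁻¹(u) = (3 − u)/4 with |d h⁻¹(u)/du| = 1/4, so f_U(u) = 2 · (3 − u)/4 · 1/4 = (3 − u)/8 on −1 ≤ u ≤ 3, giving E(U) = 3 − 4E(Y) = 3 − 8/3 = 1/3. A Monte Carlo sketch (my own variable names) checks this:

```python
import random

# With f(y) = 2y on [0,1], Y = sqrt(V) for V ~ U(0,1) by inverse-
# transform sampling.  The decreasing transformation U = -4Y + 3
# should have pdf (3 - u)/8 on [-1, 3], hence E(U) = 1/3.
rng = random.Random(11)
n = 200_000
u = [-4 * rng.random() ** 0.5 + 3 for _ in range(n)]

mean_u = sum(u) / n   # should be near 1/3
print(mean_u)
```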

Textbook Example 6.8. Let Y_1 and Y_2 have the joint pdf f(y_1, y_2) = e^{−(y_1 + y_2)} for y_1 ≥ 0, y_2 ≥ 0, and 0 otherwise. Find the pdf for U = Y_1 + Y_2. Solution:

Textbook Example 6.8 Solution (cont'd)
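The solution is missing from the transcript; the standard result for the sum of two independent Exponential(1) variables is f_U(u) = u e^{−u} for u ≥ 0 (a gamma density with shape 2), so E(U) = 2 and V(U) = 2. A Monte Carlo sketch (mine, not the slides') confirms the moments:

```python
import random

# Sum of two independent Exponential(1) draws; compare the sample
# mean and variance to the gamma(2, 1) values E(U) = V(U) = 2.
rng = random.Random(5)
n = 200_000
u = [rng.expovariate(1.0) + rng.expovariate(1.0) for _ in range(n)]

mean_u = sum(u) / n                              # should be near 2
var_u = sum((x - mean_u) ** 2 for x in u) / n    # should be near 2
print(mean_u, var_u)
```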

Textbook Example 6.9. Example 5.19 had the joint pdf f(y_1, y_2) = 2(1 − y_1) for 0 ≤ y_1 ≤ 1, 0 ≤ y_2 ≤ 1, and 0 otherwise. Find the pdf for U = Y_1 Y_2 as well as E(U). Solution:

Textbook Example 6.9 Solution (cont'd)

Textbook Example 6.9 Solution (cont'd)
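Again the worked solution is lost, but one piece is easy to verify: the joint pdf factors as 2(1 − y_1) · 1, so Y_1 and Y_2 are independent and E(U) = E(Y_1)E(Y_2) = (1/3)(1/2) = 1/6. A Monte Carlo sketch of my own:

```python
import random

# Y1 has density 2(1 - y1) with cdf 2y - y^2, whose inverse on [0,1]
# is y = 1 - sqrt(1 - v) (inverse-transform sampling); Y2 is an
# independent U(0,1).  Then U = Y1 * Y2 should average 1/6.
rng = random.Random(9)
n = 200_000
u = [(1 - (1 - rng.random()) ** 0.5) * rng.random() for _ in range(n)]

mean_u = sum(u) / n   # should be near 1/6
print(mean_u)
```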

Distribution Function Method Summary Let U = h(y ), where h(y ) is either an increasing or decreasing function of y for all y such that f Y (y) > 0. 1. Find the inverse function 2. Evaluate 3. Find f U (u) by d[h 1 (u)] du f U (u) =f Y [h 1 (u)] y = h 1 (u) d[h 1 (u)] du 3/15/15 46

Section 6.4 Homework. Do problems 6.23, 6.24, 6.25.

Section 6.8: Summary. We often need to know the probability distribution for functions of random variables. This is a common requirement for statistics and will arise throughout OA3102 and OA3103; it also comes up in lots of other ORSA probability problems. Today we can often simulate and get an empirical estimate of a distribution, as we did here for a couple of examples, but a closed-form analytical result is better.

Summary Continued. We covered two methods: the distribution function method (Section 6.3) and the transformation method (Section 6.4), and we skipped over two methods: the moment-generating function method (Section 6.5) and multivariable transformations with Jacobians (Section 6.6). Which is better depends on the problem: sometimes one or more are inapplicable, sometimes one is easier than the other, and sometimes one or more work where others don't.

Summary Continued. The sections we skipped prove some results you need to know for OA3102 and beyond:
1. If Y ~ N(µ, σ²), then Z = (Y − µ)/σ ~ N(0, 1).
2. If Z ~ N(0, 1), then Z² ~ χ²(ν = 1).
3. If Z_i ~ N(0, 1), i = 1, ..., n, then Σ_{i=1}^n Z_i² ~ χ²(n).

Summary Continued.
4. Let Y_1, Y_2, ..., Y_n be independent normally distributed random variables with E(Y_i) = µ_i and V(Y_i) = σ_i² for i = 1, 2, ..., n, and let a_1, a_2, ..., a_n be constants. Then
U = Σ_{i=1}^n a_i Y_i = a_1 Y_1 + a_2 Y_2 + ··· + a_n Y_n
is a normally distributed random variable with
E(U) = Σ_{i=1}^n a_i µ_i and V(U) = Σ_{i=1}^n a_i² σ_i².
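Result 4 is easy to spot-check numerically. In the sketch below the constants and distributions are my own arbitrary choices: with independent Y_1 ~ N(1, 2²) and Y_2 ~ N(−2, 1²) and a = (3, −1), U = 3Y_1 − Y_2 should be normal with E(U) = 3·1 + (−1)(−2) = 5 and V(U) = 9·4 + 1·1 = 37:

```python
import random

# Monte Carlo check of the linear-combination-of-normals result.
rng = random.Random(3)
n = 200_000
u = [3 * rng.gauss(1, 2) - rng.gauss(-2, 1) for _ in range(n)]

mean_u = sum(u) / n                              # should be near 5
var_u = sum((x - mean_u) ** 2 for x in u) / n    # should be near 37
print(mean_u, var_u)
```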

What We Have Just Learned. Introduced methods useful for finding the probability distributions of functions of random variables: the method of distribution functions and the method of transformations.