Distributions of Functions of Random Variables. 5.1 Functions of One Random Variable

Distributions of Functions of Random Variables

5.1 Functions of One Random Variable
5.2 Transformations of Two Random Variables
5.3 Several Random Variables
5.4 The Moment-Generating Function Technique
5.5 Random Functions Associated With Normal Distributions
5.6 The Central Limit Theorem
5.7 Approximations for Discrete Distributions (skipped)
5.8 Chebyshev's Inequality and Convergence in Probability
5.9 Limiting Moment-Generating Functions

5.1 Functions of One Random Variable

Let X be a random variable. Then Y = u(X) is also a random variable, with its own distribution. The fundamental question: given the distribution of X and the function u(x), how do we find the distribution of Y?

Distribution function technique. No tricks; just use the definition of the CDF:
F(y) = P(Y ≤ y) = P(u(X) ≤ y), a function of y, and then f(y) = F'(y).

Example 5.1-1: X ~ Γ(α, θ), where θ = 1/λ, and Y = e^X. Find the pdf of Y (a loggamma distribution).

Example 5.1-2: W ~ U(−π/2, π/2) and X = tan(W). Find the pdf of X (a Cauchy distribution).
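As a quick numerical sanity check of Example 5.1-2 (an added illustration, not part of the text), one can simulate X = tan(W) with only the standard library and compare the empirical CDF against the standard Cauchy CDF F(x) = 1/2 + arctan(x)/π:

```python
import math
import random

rng = random.Random(1)
n = 200_000
# W ~ U(-pi/2, pi/2), so X = tan(W) should be standard Cauchy.
xs = [math.tan(rng.uniform(-math.pi / 2, math.pi / 2)) for _ in range(n)]

def cauchy_cdf(x: float) -> float:
    """Standard Cauchy CDF: F(x) = 1/2 + arctan(x)/pi."""
    return 0.5 + math.atan(x) / math.pi

for x0 in (-1.0, 0.0, 1.0):
    empirical = sum(x <= x0 for x in xs) / n
    print(x0, round(empirical, 3), round(cauchy_cdf(x0), 3))
```

The empirical fractions should land near F(−1) = 0.25, F(0) = 0.5, and F(1) = 0.75.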

Functions of One Random Variable (continued)

Change-of-variable technique. If Y = u(X) and u has the inverse function X = v(Y), then
g(y) = f(v(y)) |v'(y)|.
Redo Examples 5.1-1 and 5.1-2 this way.

Example 5.1-3: Let X have the pdf f(x) = 3(1 − x)², 0 < x < 1, and let Y = (1 − X)³. Find the pdf of Y.

For one variable, working directly from the definition is often best (Examples 5.1-5 to 5.1-7).

Example 5.1-5: Let X have the pdf f(x) = x²/3, −1 < x < 2, and let Y = X². Find the pdf of Y.

Example 5.1-6: Let X have a Poisson distribution with λ = 4, and let Y = √X. Find the pmf of Y.

Example 5.1-7: X ~ B(n, p), and Y = X².

Two Theorems Related to Simulation

Use U(0, 1) and a CDF F(x) to generate a random variable X having CDF F(x):

Theorem 5.1-1: Let Y have a U(0, 1) distribution. Let F(x) have the properties of a distribution function of the continuous type with F(a) = 0 and F(b) = 1, and suppose that F(x) is strictly increasing on the support a < x < b, where a and b could be −∞ and ∞, respectively. Then the random variable X defined by X = F⁻¹(Y) is a continuous-type random variable with distribution function F(x).

Example 5.1-4: Use U(0, 1) to generate observations from a Cauchy distribution.

Conversely, a random variable X having CDF F(x) yields Y = F(X) ~ U(0, 1):

Theorem 5.1-2: Let X have a distribution function F(x) of the continuous type that is strictly increasing on the support a < x < b. Then the random variable Y defined by Y = F(X) has a U(0, 1) distribution.
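Theorem 5.1-1 is the basis of inverse-transform sampling. A minimal sketch (added, not from the text), using the exponential CDF F(x) = 1 − e^(−x/θ) because its inverse is available in closed form:

```python
import math
import random

def sample_exponential(theta: float, rng: random.Random) -> float:
    """Inverse-transform sampling (Theorem 5.1-1):
    draw Y ~ U(0,1) and return X = F^{-1}(Y) for F(x) = 1 - exp(-x/theta)."""
    y = rng.random()
    return -theta * math.log(1.0 - y)

rng = random.Random(0)
theta = 2.0
xs = [sample_exponential(theta, rng) for _ in range(100_000)]

# The sample mean should be close to E(X) = theta.
mean = sum(xs) / len(xs)
print(round(mean, 2))

# Theorem 5.1-2 in the other direction: Y = F(X) should be U(0, 1),
# so its sample mean should be near 1/2.
ys = [1.0 - math.exp(-x / theta) for x in xs]
print(round(sum(ys) / len(ys), 2))
```

The same recipe works for any strictly increasing CDF whose inverse can be computed, which is exactly what Example 5.1-4 does for the Cauchy distribution with F⁻¹(y) = tan(π(y − 1/2)).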

5.2 Transformations of Two Random Variables

X1 and X2 are two continuous-type random variables with joint pdf f(x1, x2), and Y1 = u1(X1, X2), Y2 = u2(X1, X2). The fundamental question is how to find the joint pdf of (Y1, Y2).

Change-of-variable technique. If the transformation Y1 = u1(X1, X2), Y2 = u2(X1, X2) has the single-valued inverse X1 = v1(Y1, Y2), X2 = v2(Y1, Y2), then the joint pdf of Y1 and Y2 is

g(y1, y2) = |J| f(v1(y1, y2), v2(y1, y2)), (y1, y2) ∈ S',

where J is the Jacobian determinant

J = det | ∂x1/∂y1  ∂x1/∂y2 |
        | ∂x2/∂y1  ∂x2/∂y2 |.

It is often the mapping of the support S of (X1, X2) onto the support S' of (Y1, Y2) that causes the biggest challenge.

Examples 5.2-1 to 5.2-3 and 5.2-6: the double exponential and beta distributions.

Examples of Transformations

Example 5.2-1: Let X1, X2 have the joint pdf f(x1, x2) = 2, 0 < x1 < x2 < 1. Consider the transformation Y1 = X1/X2, Y2 = X2. Find the joint pdf of Y1 and Y2.

Example 5.2-2: Let X1 and X2 be independent unit exponential random variables, and let Y1 = X1 − X2, Y2 = X1 + X2. Find the joint pdf of Y1 and Y2.

Example 5.2-3: Let X1 and X2 have independent gamma distributions with parameters (α, θ) and (β, θ), respectively. Consider Y1 = X1/(X1 + X2), Y2 = X1 + X2. Find the joint pdf of Y1 and Y2.
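A Monte Carlo sketch (added, not from the text) of Example 5.2-1: with the inverse x1 = y1·y2, x2 = y2 the Jacobian is |J| = y2, so g(y1, y2) = 2y2 on the unit square. That factors, making Y1 ~ U(0, 1) and Y2 a variable with density 2y2, so E(Y1) = 1/2 and E(Y2) = 2/3, which the simulation should reproduce:

```python
import random

rng = random.Random(2)
n = 200_000
y1s, y2s = [], []
for _ in range(n):
    # f(x1, x2) = 2 on 0 < x1 < x2 < 1 is the joint density of the
    # min and max of two independent U(0, 1) draws.
    a, b = rng.random(), rng.random()
    x1, x2 = min(a, b), max(a, b)
    y1s.append(x1 / x2)   # Y1 = X1/X2
    y2s.append(x2)        # Y2 = X2

# g(y1, y2) = 2*y2 on the unit square  =>  E(Y1) = 1/2, E(Y2) = 2/3
print(round(sum(y1s) / n, 3), round(sum(y2s) / n, 3))
```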

F Distributions

Example 5.2-4: The F distribution as a ratio of χ² random variables. F(r1, r2) is defined as the normalized ratio of two independent χ² random variables:
W = (U/r1) / (V/r2), where U ~ χ²(r1) and V ~ χ²(r2) are independent.
- The random variable takes positive values.
- It has a long right tail (skewed to the right; not a symmetric distribution).
- The reciprocal of an F(r1, r2) random variable has an F(r2, r1) distribution, with the degrees of freedom switched. This property can be used to find probabilities and quantiles.

Problems related to F distributions: finding a probability; finding a percentile or quantile.

Example 5.2-5: W ~ F(4, 6). Find P(1/15.21 ≤ W ≤ 9.15) and P(1/6.16 ≤ W ≤ 4.53).
F_0.95(4, 6) = 1/F_0.05(6, 4) = 1/6.16 = 0.1623.

5.3 Several Random Variables

We have discussed how to find the pdf of a function of one or two random variables, and it can already be complicated for two. It is more complicated still for several random variables. However, if we add an independence condition, the problem becomes doable.

A collection of n independent and identically distributed random variables X1, X2, ..., Xn is said to be a random sample of size n from that common distribution (the population).

Joint distribution of independent variables:
- Discrete case (pmf): f(x1, x2, ..., xn) = f1(x1) f2(x2) ... fn(xn)
- Continuous case (pdf): f(x1, x2, ..., xn) = f1(x1) f2(x2) ... fn(xn)

Examples 5.3-1 to 5.3-3.
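A simulation sketch of Example 5.2-5 (added; stdlib only, building χ²(r) as Gamma(r/2, scale 2)). Assuming the slide's bounds are the tabled points 9.15 = F_0.01(4, 6), 15.21 = F_0.01(6, 4), 4.53 = F_0.05(4, 6), and 6.16 = F_0.05(6, 4), the two probabilities should come out near 0.98 and 0.90; the last two lines check the reciprocal property numerically:

```python
import random

rng = random.Random(3)

def chi2(r: int) -> float:
    # chi-square(r) is Gamma(shape = r/2, scale = 2)
    return rng.gammavariate(r / 2, 2.0)

def f_rv(r1: int, r2: int) -> float:
    # F(r1, r2) = (U/r1) / (V/r2) for independent chi-squares U and V
    return (chi2(r1) / r1) / (chi2(r2) / r2)

n = 100_000
w = [f_rv(4, 6) for _ in range(n)]
v = [f_rv(6, 4) for _ in range(n)]

# Example 5.2-5: expected near 0.98 and 0.90 under the table-value assumption
p1 = sum(1 / 15.21 <= x <= 9.15 for x in w) / n
p2 = sum(1 / 6.16 <= x <= 4.53 for x in w) / n
print(round(p1, 2), round(p2, 2))

# Reciprocal property: 1/W ~ F(6, 4), so P(W <= c) should match P(1/V <= c)
c = 2.0
pw = sum(x <= c for x in w) / n
pv = sum(1 / x <= c for x in v) / n
print(round(pw, 3), round(pv, 3))
```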

Examples of Several Random Variables

Example 5.3-1: Let X1 and X2 be two independent random variables resulting from two rolls of a fair die, and let Y = X1 + X2. Find the pmf of Y and the mean and variance of Y.

Example 5.3-2: Let X1, X2, X3 be a random sample from a distribution with pdf f(x) = e⁻ˣ, 0 < x < ∞. Find P(0 < X1 < 1, 2 < X2 < 4, 3 < X3 < 7).

Example 5.3-3: An electronic device runs until one of its three components fails. The lifetimes (in weeks) X1, X2, X3 of these components are independent, and each has the Weibull pdf f(x) = (2x/25) exp{−(x/5)²}, 0 < x < ∞. Find the probability that the device stops running in the first three weeks.

Several Theorems Regarding Mean and Variance

Theorem 5.3-0: Let X1, X2, ..., Xn be n independent random variables with joint pmf (or pdf) f1(x1) f2(x2) ... fn(xn), and let the random variable Y = u(X1, X2, ..., Xn) have pmf (or pdf) g(y). Then
E(Y) = Σ_y y g(y) = Σ_{x1} Σ_{x2} ... Σ_{xn} u(x1, x2, ..., xn) f1(x1) f2(x2) ... fn(xn)
in the discrete case, or
E(Y) = ∫ y g(y) dy = ∫ ... ∫ u(x1, x2, ..., xn) f1(x1) f2(x2) ... fn(xn) dx1 dx2 ... dxn
in the continuous case.

Theorem 5.3-1: Let X1, X2, ..., Xn be n independent random variables, and let Y = u1(X1) u2(X2) ... un(Xn). If E[ui(Xi)], i = 1, 2, ..., n, exist, then
E(Y) = E[u1(X1) u2(X2) ... un(Xn)] = E[u1(X1)] E[u2(X2)] ... E[un(Xn)].
Are E(X²) and [E(X)][E(X)] = [E(X)]² equal because of Theorem 5.3-1? (No: X is not independent of itself.)

Theorem 5.3-2: Let X1, ..., Xn be n independent random variables with respective means μi and variances σi², i = 1, 2, ..., n. Then the mean and variance of Y = a1X1 + a2X2 + ... + anXn, where the ai are real constants, are, respectively,
μ_Y = Σ_{i=1}^n ai μi  and  σ_Y² = Σ_{i=1}^n ai² σi².
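Theorem 5.3-2 in a few lines of Python (an added illustration, using the numbers of Example 5.3-4: μ1 = 4, μ2 = 3, σ1² = 4, σ2² = 9, and Y = 3X1 − 2X2):

```python
def linear_combo_mean_var(coeffs, means, variances):
    """Theorem 5.3-2: mean and variance of Y = sum(a_i * X_i)
    for independent X_i with the given means and variances."""
    mu_y = sum(a * m for a, m in zip(coeffs, means))
    var_y = sum(a * a * v for a, v in zip(coeffs, variances))
    return mu_y, var_y

# Example 5.3-4: Y = 3*X1 - 2*X2 with mu = (4, 3) and var = (4, 9)
mu_y, var_y = linear_combo_mean_var([3, -2], [4, 3], [4, 9])
print(mu_y, var_y)  # 6 72  (mean 3*4 - 2*3, variance 9*4 + 4*9)
```

Note that the variance adds coefficients squared: the minus sign in 3X1 − 2X2 does not reduce the variance.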

Examples 5.3-4 and 5.3-5

Example 5.3-4: Let the independent random variables X1 and X2 have respective means μ1 = 4 and μ2 = 3 and variances σ1² = 4 and σ2² = 9. Find the mean and the variance of Y = 3X1 − 2X2.

Example 5.3-5: Let X1, X2 be a random sample from a distribution with mean μ and variance σ². Let Y = X1 − X2. Find the mean and the variance of Y.

Mean and variance of a sample statistic: let X1, X2, ..., Xn be a random sample from a distribution with mean μ and variance σ². Find the mean and variance of the sample mean
X̄ = (X1 + X2 + ... + Xn)/n.
What about the sample variance?

5.4 The Moment-Generating Function Technique

Key question: find the distribution of Y = u(X1, ..., Xn). Another way to do it: the moment-generating function technique.

Theorem 5.4-1: Let X1, ..., Xn be n independent random variables with respective moment-generating functions M_{Xi}(t), i = 1, 2, ..., n. Then the moment-generating function of Y = a1X1 + a2X2 + ... + anXn is
M_Y(t) = Π_{i=1}^n M_{Xi}(ai t).

Corollary 5.4-1: If X1, X2, ..., Xn are observations of a random sample from a distribution with moment-generating function M(t), then
- the moment-generating function of Y = X1 + X2 + ... + Xn is M_Y(t) = M(t) M(t) ... M(t) = [M(t)]ⁿ;
- the moment-generating function of X̄ = (X1 + X2 + ... + Xn)/n is M_X̄(t) = M(t/n) M(t/n) ... M(t/n) = [M(t/n)]ⁿ.
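A numeric spot-check of Corollary 5.4-1 (added, not from the text): for Bernoulli(p) trials, M(t) = 1 − p + p·e^t, so the mgf of Y = X1 + ... + Xn should be [M(t)]ⁿ, and that is exactly the mgf of B(n, p) computed directly from its pmf, Σ_k C(n, k) p^k (1 − p)^(n−k) e^(tk):

```python
import math

def bernoulli_mgf(t: float, p: float) -> float:
    """M(t) = E[e^{tX}] = (1 - p) + p * e^t for X ~ Bernoulli(p)."""
    return 1 - p + p * math.exp(t)

def binomial_mgf(t: float, n: int, p: float) -> float:
    """mgf computed directly from the B(n, p) pmf."""
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k) * math.exp(t * k)
               for k in range(n + 1))

n, p = 10, 0.3
for t in (-1.0, 0.0, 0.5, 1.0):
    lhs = bernoulli_mgf(t, p) ** n      # [M(t)]^n from Corollary 5.4-1
    rhs = binomial_mgf(t, n, p)         # mgf of B(n, p)
    print(t, round(lhs, 6), round(rhs, 6))
```

The two columns agree (this is the binomial theorem in disguise), which is the mgf technique's argument that a sum of n Bernoulli trials is B(n, p), as in Example 5.4-2.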

Let's See Examples

Example 5.4-1: Let X1 and X2 be independent random variables with uniform distributions on {1, 2, 3, 4}, and let Y = X1 + X2. Find the mgf of Y. Can we find the distribution of Y?

Example 5.4-2: Let X1, X2, ..., Xn denote the outcomes of n Bernoulli trials, each with probability of success p, and let Y = X1 + X2 + ... + Xn. Find the mgf of Y. What is the mgf of Xi? What is the distribution of Y?

Example 5.4-3: Let X1, X2, X3 be the observations of a random sample of size n = 3 from the exponential distribution having mean θ. Find the mgf of Y = X1 + X2 + X3. What is the mgf of Xi? What is the distribution of Y?

Moment-Generating Functions to Distributions

Theorem 5.4-2: Let X1, X2, ..., Xn be independent chi-square random variables with r1, r2, ..., rn degrees of freedom, respectively. Then Y = X1 + X2 + ... + Xn has a chi-square distribution with r1 + r2 + ... + rn degrees of freedom: χ²(r1 + r2 + ... + rn).

Corollary 5.4-2: Let Z1, Z2, ..., Zn have standard normal distributions N(0, 1). If these random variables are independent, then the sum of squares W = Z1² + Z2² + ... + Zn² has a chi-square distribution with n degrees of freedom: χ²(n).

Corollary 5.4-3: If X1, X2, ..., Xn are independent and have normal distributions with means μi and standard deviations σi, i = 1, 2, ..., n, respectively, then the standardized sum of squares W = Σ_{i=1}^n ((Xi − μi)/σi)² has a chi-square distribution with n degrees of freedom.
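A quick simulation (added) of Corollary 5.4-2: the sum of squares of n independent N(0, 1) variables should behave like χ²(n), whose mean is n and whose variance is 2n:

```python
import random

rng = random.Random(4)
n_df = 5          # degrees of freedom
n_sim = 100_000

# W = Z1^2 + ... + Zn^2 for independent standard normals (Corollary 5.4-2)
ws = [sum(rng.gauss(0, 1) ** 2 for _ in range(n_df)) for _ in range(n_sim)]

mean_w = sum(ws) / n_sim
var_w = sum((w - mean_w) ** 2 for w in ws) / (n_sim - 1)
print(round(mean_w, 2), round(var_w, 2))  # should be near n = 5 and 2n = 10
```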

5.5 Random Functions Associated With Normal Distributions

Theorem 5.5-1: If X1, X2, ..., Xn are n mutually independent normal variables with means μi and standard deviations σi, i = 1, 2, ..., n, respectively, then the linear function Y = c1X1 + c2X2 + ... + cnXn has a normal distribution with mean μ_Y = Σ ci μi and variance σ_Y² = Σ ci² σi². (Example 5.5-1)

Corollary 5.5-1: If X1, X2, ..., Xn are a random sample from N(μ, σ²), then the sample mean X̄ also has a normal distribution, N(μ, σ²/n). (Example 5.5-2)

Theorem 5.5-2: Let X1, X2, ..., Xn be a random sample from N(μ, σ²). Then the sample mean X̄ and the sample variance S² are independent, and (n − 1)S²/σ² is χ²(n − 1). (Example 5.5-3)

Let's See Examples

Example 5.5-1: Assume X1 ~ N(693.2, 22820) and X2 ~ N(631.7, 19205), with X1 and X2 independent. Find P(X1 > X2). Hint: find the distribution of Y = X1 − X2.

Example 5.5-2: Let X1, X2, ..., Xn be a random sample from the N(50, 16) distribution. We know that the distribution of X̄ is N(50, 16/n). To illustrate the effect of n, the graph of the pdf of X̄ is given in Figure 5.5-1 (page 194) for n = 1, 4, 16, and 64. When n = 64, compare the areas that represent P(49 < X̄ < 51) and P(49 < X1 < 51).

Example 5.5-3: Let X1, X2, X3, X4 be a random sample of size 4 from N(76.4, 383). Let
U = Σ_{i=1}^4 (Xi − 76.4)²/383 and W = Σ_{i=1}^4 (Xi − X̄)²/383.
Find the probabilities P(0.711 ≤ U ≤ 7.779) and P(0.352 ≤ W ≤ 6.251).
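Example 5.5-1 can be finished numerically (an added sketch): by Theorem 5.5-1, Y = X1 − X2 is normal with mean 693.2 − 631.7 = 61.5 and variance 22820 + 19205 = 42025 (sd 205), so P(X1 > X2) = P(Y > 0) = Φ(61.5/205) = Φ(0.3):

```python
import math

def normal_cdf(x: float, mu: float = 0.0, sigma: float = 1.0) -> float:
    """Normal CDF via the error function: Phi(z) = (1 + erf(z/sqrt(2)))/2."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Y = X1 - X2 ~ N(693.2 - 631.7, 22820 + 19205) = N(61.5, 42025), sd = 205
mu_y = 693.2 - 631.7
sd_y = math.sqrt(22820 + 19205)
p = 1.0 - normal_cdf(0.0, mu_y, sd_y)   # P(Y > 0) = Phi(0.3)
print(round(p, 4))  # 0.6179
```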

T Distribution

Theorem 5.5-3 (Student's t distribution): Suppose that Z is a random variable with distribution N(0, 1), U is a random variable with distribution χ²(r), and Z and U are independent. Then
T = Z / √(U/r)
has a t distribution with pdf
f(t) = Γ((r + 1)/2) / [ √(πr) Γ(r/2) (1 + t²/r)^((r+1)/2) ],  −∞ < t < ∞.
We say T has a t distribution with r degrees of freedom.

An important result used often in statistics: let X1, X2, ..., Xn be a random sample from N(μ, σ²). Then
T = (X̄ − μ) / (S/√n)
has a t distribution with r = n − 1 degrees of freedom.

Plots of t-distribution pdfs for several degrees of freedom (figure).

Example 5.5-4: Let the distribution of T be t(11).
- Find P(−1.796 ≤ T ≤ 1.796).
- Find a such that P(−a ≤ T ≤ a) = 0.95.
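A simulation sketch (added) of Theorem 5.5-3 with Example 5.5-4's numbers: build T = Z/√(U/r) from a standard normal and an independent χ²(11) draw, then estimate P(−1.796 ≤ T ≤ 1.796), which the t table gives as 0.90 (1.796 being the upper 0.05 point of t(11)):

```python
import random

rng = random.Random(5)
r = 11
n_sim = 200_000

def t_draw() -> float:
    z = rng.gauss(0, 1)                  # Z ~ N(0, 1)
    u = rng.gammavariate(r / 2, 2.0)     # U ~ chi-square(r) = Gamma(r/2, 2)
    return z / (u / r) ** 0.5            # T = Z / sqrt(U/r)  (Theorem 5.5-3)

ts = [t_draw() for _ in range(n_sim)]
p = sum(-1.796 <= t <= 1.796 for t in ts) / n_sim
print(round(p, 3))  # near 0.90
```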

5.6 The Central Limit Theorem

Theorem 5.6-1 (Central Limit Theorem): If X̄ is the mean of a random sample X1, X2, ..., Xn from a distribution with a finite mean μ and a finite positive variance σ², then the distribution of
W = (X̄ − μ) / (σ/√n)
is N(0, 1) in the limit as n → ∞.

When n is large, W ≈ Z, so we can approximate various kinds of probabilities. Special case of Theorem 5.6-1: if X1, X2, ..., Xn come from N(μ, σ²), then W is exactly N(0, 1) by Corollary 5.5-1.

Let's See Some Examples

Example 5.6-1: Let X̄ be the mean of a random sample of n = 25 currents (in milliamperes) in a strip of wire in which each measurement has mean 15 and variance 4. Find P(14.4 < X̄ < 15.6).

Example 5.6-2: Let X1, X2, ..., X20 denote a random sample of size 20 from the uniform distribution U(0, 1), and let Y = X1 + X2 + ... + X20. Find P(Y ≤ 9.1) and P(8.5 ≤ Y ≤ 11.7).

Example 5.6-3: Let X̄ denote the mean of a random sample of size 25 from the distribution whose pdf is f(x) = x³/4, 0 < x < 2. Find P(1.5 ≤ X̄ ≤ 1.65).

Example 5.6-5: Let X1, X2, ..., Xn denote a random sample of size n from a χ²(1) distribution, and let Y = X1 + X2 + ... + Xn. Then W = (Y − n)/√(2n) ≈ Z.
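Example 5.6-1 worked out (an added sketch): by the CLT, X̄ is approximately N(15, 4/25), so the standard error is 0.4 and P(14.4 < X̄ < 15.6) = Φ(1.5) − Φ(−1.5):

```python
import math

def phi(z: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

mu, var, n = 15.0, 4.0, 25
se = math.sqrt(var / n)          # sd of the sample mean = 0.4
z_lo = (14.4 - mu) / se          # -1.5
z_hi = (15.6 - mu) / se          #  1.5
p = phi(z_hi) - phi(z_lo)
print(round(p, 4))  # 0.8664
```

The same standardize-then-look-up pattern handles Examples 5.6-2 and 5.6-3: compute the mean and variance of Y or X̄, form the z-scores, and evaluate Φ.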