Chapter 5 Joint Probability Distributions


Applied Statistics and Probability for Engineers, Sixth Edition
Douglas C. Montgomery and George C. Runger
Chapter 5: Joint Probability Distributions

Chapter 5 Outline

5-1 Two or More Random Variables
  5-1.1 Joint Probability Distributions
  5-1.2 Marginal Probability Distributions
  5-1.3 Conditional Probability Distributions
  5-1.4 Independence
  5-1.5 More Than Two Random Variables
5-2 Covariance and Correlation
5-3 Common Joint Distributions
  5-3.1 Multinomial Probability Distribution
  5-3.2 Bivariate Normal Distribution
5-4 Linear Functions of Random Variables
5-5 General Functions of Random Variables
5-6 Moment Generating Functions

Learning Objectives for Chapter 5

After careful study of this chapter, you should be able to do the following:
1. Use joint probability mass functions and joint probability density functions to calculate probabilities.
2. Calculate marginal and conditional probability distributions from joint probability distributions.
3. Interpret and calculate covariances and correlations between random variables.
4. Use the multinomial distribution to determine probabilities.
5. Understand the properties of a bivariate normal distribution and draw contour plots for its probability density function.
6. Calculate means and variances for linear combinations of random variables, and calculate probabilities for linear combinations of normally distributed random variables.
7. Determine the distribution of a general function of a random variable.
8. Calculate moment generating functions and use them to determine moments and distributions.

Joint Probability Mass Function

The joint probability mass function of the discrete random variables X and Y, denoted f_XY(x, y), satisfies:

(1) f_XY(x, y) ≥ 0 for all x, y
(2) Σ_x Σ_y f_XY(x, y) = 1
(3) f_XY(x, y) = P(X = x, Y = y)

Joint Probability Density Function

The joint probability density function for the continuous random variables X and Y, denoted f_XY(x, y), satisfies the following properties:

(1) f_XY(x, y) ≥ 0 for all x, y
(2) ∫_{-∞}^{∞} ∫_{-∞}^{∞} f_XY(x, y) dx dy = 1
(3) P((X, Y) ∈ R) = ∫∫_R f_XY(x, y) dx dy for any region R of two-dimensional space

Figure 5-2 shows a joint probability density function for the random variables X and Y. The probability that (X, Y) is in the region R is determined by the volume of f_XY(x, y) over the region R.

Example 5-2: Server Access Time-1

Let the random variable X denote the time until a computer server connects to your machine (in milliseconds), and let Y denote the time until the server authorizes you as a valid user (in milliseconds). X and Y measure the wait from a common starting point, so x < y. The joint probability density function for X and Y is

f_XY(x, y) = k e^{-0.001x - 0.002y} for 0 < x < y, where k = 6 × 10^{-6}

Figure 5-4 shows that the joint probability density function of X and Y is nonzero over the shaded region where x < y.

Example 5-2: Server Access Time-2

The region with nonzero probability is shaded in Fig. 5-4. We verify that the density integrates to 1 as follows:

∫_0^∞ ∫_x^∞ k e^{-0.001x - 0.002y} dy dx = k ∫_0^∞ e^{-0.001x} (e^{-0.002x} / 0.002) dx
  = (k / 0.002) ∫_0^∞ e^{-0.003x} dx = (k / 0.002)(1 / 0.003) = (6 × 10^{-6}) / (0.002 × 0.003) = 1

Example 5-2: Server Access Time-3

Now calculate a probability. The region of integration for P(X ≤ 1000, Y ≤ 2000), darkly shaded in Figure 5-5, runs over 0 < x < 1000 and x < y < 2000:

P(X ≤ 1000, Y ≤ 2000) = ∫_0^1000 ∫_x^2000 f_XY(x, y) dy dx
  = k ∫_0^1000 e^{-0.001x} (e^{-0.002x} − e^{-4}) / 0.002 dx
  = 0.003 [ (1 − e^{-3}) / 0.003 − e^{-4} (1 − e^{-1}) / 0.001 ]
  = 0.003 (316.738 − 11.578) = 0.915
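
As a quick numerical sanity check (an addition, not part of the original slides), the same probability can be evaluated with SciPy's `dblquad`; the constants below are taken directly from the example:

```python
import numpy as np
from scipy import integrate

k = 6e-6  # normalizing constant from Example 5-2

# dblquad integrates f(y, x) with x over (0, 1000) and y over (x, 2000)
f = lambda y, x: k * np.exp(-0.001 * x - 0.002 * y)
p, _ = integrate.dblquad(f, 0, 1000, lambda x: x, lambda x: 2000)
print(round(p, 3))  # 0.915, matching the hand calculation
```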

Marginal Probability Distributions (discrete)

The marginal probability distribution of X is found by summing the probabilities in each column, and the marginal probability distribution of Y by summing the probabilities in each row:

f_X(x) = Σ_y f_XY(x, y)    f_Y(y) = Σ_x f_XY(x, y)

Marginal probability distributions of X and Y, where y = response time (nearest second) and x = number of bars of signal strength:

 y     x=1    x=2    x=3    f(y)
 1     0.01   0.02   0.25   0.28
 2     0.02   0.03   0.20   0.25
 3     0.02   0.10   0.05   0.17
 4     0.15   0.10   0.05   0.30
 f(x)  0.20   0.25   0.55   1.00

Marginal Probability Density Function (continuous)

If the joint probability density function of random variables X and Y is f_XY(x, y), the marginal probability density functions of X and Y are

f_X(x) = ∫ f_XY(x, y) dy    f_Y(y) = ∫ f_XY(x, y) dx

where the first integral is over all points in the range of (X, Y) for which X = x, and the second over all points for which Y = y.

Example 5-4: Server Access Time-1

For the random variables that denote times in Example 5-2, find the probability that Y exceeds 2000 milliseconds. One approach is to integrate the joint PDF directly, using the picture to determine the limits; the dark region splits into a left piece and a right piece:

P(Y > 2000) = ∫_0^2000 [ ∫_2000^∞ f_XY(x, y) dy ] dx + ∫_2000^∞ [ ∫_x^∞ f_XY(x, y) dy ] dx

Example 5-4: Server Access Time-2

Alternatively, find the marginal PDF of Y and then integrate it to find the desired probability:

f_Y(y) = ∫_0^y k e^{-0.001x - 0.002y} dx = k e^{-0.002y} (1 − e^{-0.001y}) / 0.001
       = 6 × 10^{-3} e^{-0.002y} (1 − e^{-0.001y}) for y > 0

P(Y > 2000) = ∫_2000^∞ f_Y(y) dy = 6 × 10^{-3} ∫_2000^∞ (e^{-0.002y} − e^{-0.003y}) dy
  = 6 × 10^{-3} [ e^{-4} / 0.002 − e^{-6} / 0.003 ] = 0.05
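
A minimal SciPy sketch (an addition, not from the slides) confirms the marginal-PDF route numerically:

```python
import numpy as np
from scipy import integrate

# marginal PDF of Y derived above
f_Y = lambda y: 6e-3 * np.exp(-0.002 * y) * (1 - np.exp(-0.001 * y))

p, _ = integrate.quad(f_Y, 2000, np.inf)
print(round(p, 4))  # ~0.05
```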

Mean & Variance of a Marginal Distribution

E(X) and V(X) can be obtained by first calculating the marginal probability distribution of X and then determining E(X) and V(X) by the usual method. With R_x and R_y denoting the ranges of X and Y:

E(X) = Σ_{R_x} x f_X(x)    V(X) = Σ_{R_x} (x − μ_X)² f_X(x)
E(Y) = Σ_{R_y} y f_Y(y)    V(Y) = Σ_{R_y} (y − μ_Y)² f_Y(y)

Mean & Variance for Example 5-1

With y = response time (nearest second) and x = number of bars of signal strength:

 y        x=1    x=2    x=3    f(y)   y·f(y)  y²·f(y)
 1        0.01   0.02   0.25   0.28   0.28    0.28
 2        0.02   0.03   0.20   0.25   0.50    1.00
 3        0.02   0.10   0.05   0.17   0.51    1.53
 4        0.15   0.10   0.05   0.30   1.20    4.80
 f(x)     0.20   0.25   0.55   1.00   2.49    7.61
 x·f(x)   0.20   0.50   1.65   2.35
 x²·f(x)  0.20   1.00   4.95   6.15

E(X) = 2.35    V(X) = 6.15 − 2.35² = 6.15 − 5.5225 = 0.6275
E(Y) = 2.49    V(Y) = 7.61 − 2.49² = 7.61 − 6.2001 = 1.4099
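
These marginal sums, means, and variances are easy to reproduce with NumPy; the sketch below (not in the original slides) encodes the joint PMF table with rows indexed by y and columns by x:

```python
import numpy as np

# joint PMF of Example 5-1: rows y = 1..4, columns x = 1..3
f = np.array([[0.01, 0.02, 0.25],
              [0.02, 0.03, 0.20],
              [0.02, 0.10, 0.05],
              [0.15, 0.10, 0.05]])
x = np.array([1, 2, 3])
y = np.array([1, 2, 3, 4])

fx = f.sum(axis=0)  # marginal of X: [0.20, 0.25, 0.55]
fy = f.sum(axis=1)  # marginal of Y: [0.28, 0.25, 0.17, 0.30]

EX, EY = (x * fx).sum(), (y * fy).sum()  # 2.35, 2.49
VX = (x**2 * fx).sum() - EX**2           # 0.6275
VY = (y**2 * fy).sum() - EY**2           # 1.4099
print(EX, VX, EY, VY)
```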

Conditional Probability Density Function

Given continuous random variables X and Y with joint probability density function f_XY(x, y), the conditional probability density function of Y given X = x is

f_{Y|x}(y) = f_XY(x, y) / f_X(x) for f_X(x) > 0

Example 5-6: Conditional Probability-1

From Example 5-2, determine the conditional PDF for Y given X = x. First find the marginal PDF of X:

f_X(x) = ∫_x^∞ k e^{-0.001x - 0.002y} dy = k e^{-0.001x} (e^{-0.002x} / 0.002) = 0.003 e^{-0.003x} for x > 0

Then

f_{Y|x}(y) = f_XY(x, y) / f_X(x) = k e^{-0.001x - 0.002y} / (0.003 e^{-0.003x})
           = 0.002 e^{0.002x - 0.002y} for 0 < x and x < y

Example 5-6: Conditional Probability-2

Now find the probability that Y exceeds 2000 given that X = 1500:

P(Y > 2000 | X = 1500) = ∫_2000^∞ f_{Y|1500}(y) dy = ∫_2000^∞ 0.002 e^{0.002(1500) − 0.002y} dy
  = 0.002 e^{3} (e^{-4} / 0.002) = e^{-1} = 0.368

Mean & Variance of Conditional Random Variables

The conditional mean of Y given X = x, denoted E(Y|x) or μ_{Y|x}, is

E(Y|x) = Σ_y y f_{Y|x}(y)

The conditional variance of Y given X = x, denoted V(Y|x) or σ²_{Y|x}, is

V(Y|x) = Σ_y (y − μ_{Y|x})² f_{Y|x}(y) = Σ_y y² f_{Y|x}(y) − μ²_{Y|x}

(For continuous random variables, replace the sums with integrals.)

Example 5-8: Conditional Mean and Variance

From Examples 5-2 and 5-6, what is the conditional mean of Y given that x = 1500?

E(Y | X = 1500) = ∫_1500^∞ y (0.002 e^{3 − 0.002y}) dy = 0.002 e^{3} ∫_1500^∞ y e^{-0.002y} dy

Integrating by parts,

∫_1500^∞ y e^{-0.002y} dy = e^{-3} (1500 / 0.002 + 1 / 0.002²) = e^{-3} × 10^{6}

so E(Y | X = 1500) = 0.002 e^{3} · e^{-3} · 10^{6} = 2000.

If the connect time is 1500 ms, then the expected time to be authorized is 2000 ms.

Example 5-9

For the discrete random variables in Example 5-1, what is the conditional mean of Y given X = 1?

The conditional PMFs f(y | x) are obtained by dividing each column of the joint PMF by its column total f(x), where f(x) = 0.20, 0.25, 0.55 for x = 1, 2, 3:

 y    f(y|x=1)  f(y|x=2)  f(y|x=3)   y·f(y|x=1)  y²·f(y|x=1)
 1    0.050     0.080     0.455      0.05        0.05
 2    0.100     0.120     0.364      0.20        0.40
 3    0.100     0.400     0.091      0.30        0.90
 4    0.750     0.400     0.091      3.00        12.00
 Sum  1.000     1.000     1.000      3.55        13.35

E(Y | X = 1) = 3.55 and V(Y | X = 1) = 13.35 − 3.55² = 13.35 − 12.6025 = 0.7475.

Given one bar of signal strength, the mean response time is 3.55 seconds with a variance of 0.7475.
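
The conditional column for x = 1 is just the joint column rescaled by f(1) = 0.20; this short NumPy check (an addition, reusing the table above) reproduces the conditional mean and variance:

```python
import numpy as np

f = np.array([[0.01, 0.02, 0.25],
              [0.02, 0.03, 0.20],
              [0.02, 0.10, 0.05],
              [0.15, 0.10, 0.05]])  # joint PMF: rows y = 1..4, columns x = 1..3
y = np.array([1, 2, 3, 4])

f_cond = f[:, 0] / f[:, 0].sum()    # f(y | x=1) = [0.05, 0.10, 0.10, 0.75]
E = (y * f_cond).sum()              # 3.55
V = (y**2 * f_cond).sum() - E**2    # 0.7475
print(E, V)
```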

Independent Random Variables

For random variables X and Y, if any one of the following properties is true, the others are also true, and X and Y are independent:

(1) f_XY(x, y) = f_X(x) f_Y(y) for all x and y
(2) f_{Y|x}(y) = f_Y(y) for all x and y with f_X(x) > 0
(3) f_{X|y}(x) = f_X(x) for all x and y with f_Y(y) > 0
(4) P(X ∈ A, Y ∈ B) = P(X ∈ A) P(Y ∈ B) for any sets A and B in the ranges of X and Y, respectively

Example 5-11: Independent Random Variables

Suppose Example 5-2 is modified so that the joint PDF is

f_XY(x, y) = 2 × 10^{-6} e^{-0.001x - 0.002y} for x ≥ 0 and y ≥ 0

Are X and Y independent? The marginal PDFs are

f_X(x) = ∫_0^∞ 2 × 10^{-6} e^{-0.001x - 0.002y} dy = 0.001 e^{-0.001x} for x > 0
f_Y(y) = ∫_0^∞ 2 × 10^{-6} e^{-0.001x - 0.002y} dx = 0.002 e^{-0.002y} for y > 0

Since f_XY(x, y) = f_X(x) f_Y(y), X and Y are independent. Therefore,

P(X > 1000, Y < 1000) = P(X > 1000) P(Y < 1000) = e^{-1} (1 − e^{-2}) = 0.318
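
Independence is what justifies multiplying the two marginal probabilities; the sketch below (an addition, not from the slides) checks the factored answer against direct numerical integration of the joint PDF:

```python
import numpy as np
from scipy import integrate

f = lambda y, x: 2e-6 * np.exp(-0.001 * x - 0.002 * y)  # joint PDF

# direct integration: x over (1000, inf), y over (0, 1000)
p_direct, _ = integrate.dblquad(f, 1000, np.inf, lambda x: 0, lambda x: 1000)

# product of marginal probabilities, valid because X and Y are independent
p_factored = np.exp(-1) * (1 - np.exp(-2))
print(round(p_direct, 4), round(p_factored, 4))  # both ~0.3181
```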

Joint Probability Density Function

The joint probability density function for the continuous random variables X_1, X_2, …, X_p, denoted f_{X_1 X_2 … X_p}(x_1, x_2, …, x_p), satisfies the following properties:

(1) f_{X_1 X_2 … X_p}(x_1, x_2, …, x_p) ≥ 0
(2) ∫ … ∫ f_{X_1 X_2 … X_p}(x_1, x_2, …, x_p) dx_1 dx_2 … dx_p = 1

Example 5-14: Component Lifetimes

In an electronic assembly, let X_1, X_2, X_3, X_4 denote the lifetimes (in hours) of four components. The joint PDF is

f(x_1, x_2, x_3, x_4) = 9 × 10^{-12} e^{-0.001x_1 - 0.002x_2 - 0.0015x_3 - 0.003x_4} for x_i ≥ 0

What is the probability that the device operates more than 1000 hours? The joint PDF is a product of exponential PDFs, so

P(X_1 > 1000, X_2 > 1000, X_3 > 1000, X_4 > 1000) = e^{-1-2-1.5-3} = e^{-7.5} = 0.00055

Marginal Probability Density Function

If the joint probability density function of X_1, X_2, …, X_p is f_{X_1 X_2 … X_p}(x_1, x_2, …, x_p), the marginal probability density function of X_i is obtained by integrating the joint PDF over all the other variables:

f_{X_i}(x_i) = ∫ … ∫ f_{X_1 X_2 … X_p}(x_1, x_2, …, x_p) dx_1 … dx_{i-1} dx_{i+1} … dx_p

Mean & Variance of a Joint Distribution

The mean and variance of X_i can be determined from either the marginal PDF or the joint PDF as follows:

E(X_i) = ∫ x_i f_{X_i}(x_i) dx_i = ∫ … ∫ x_i f_{X_1 X_2 … X_p}(x_1, …, x_p) dx_1 … dx_p

V(X_i) = ∫ (x_i − μ_{X_i})² f_{X_i}(x_i) dx_i = ∫ … ∫ (x_i − μ_{X_i})² f_{X_1 X_2 … X_p}(x_1, …, x_p) dx_1 … dx_p

Example 5-16

Points that have positive probability in the joint probability distribution of three random variables X_1, X_2, X_3 are shown in the figure. Suppose the 10 points are equally likely with probability 0.1 each. The range is the set of non-negative integers with x_1 + x_2 + x_3 = 3. List the marginal PMF of X_2:

P(X_2 = 0) = f(3,0,0) + f(0,0,3) + f(1,0,2) + f(2,0,1) = 0.4
P(X_2 = 1) = f(2,1,0) + f(0,1,2) + f(1,1,1) = 0.3
P(X_2 = 2) = f(1,2,0) + f(0,2,1) = 0.2
P(X_2 = 3) = f(0,3,0) = 0.1

Also, E(X_2) = 0(0.4) + 1(0.3) + 2(0.2) + 3(0.1) = 1.

Distribution of a Subset of Random Variables

The joint probability distribution of any subset of the random variables X_1, X_2, …, X_p is obtained by integrating (or, in the discrete case, summing) the joint probability distribution over the variables that are not in the subset.

Conditional Probability Distributions

Conditional probability distributions can be developed for multiple random variables by extending the ideas used for two random variables. For example, suppose p = 5 and we wish to find the distribution of X_1, X_2, X_3 conditional on X_4 = x_4 and X_5 = x_5:

f_{X_1 X_2 X_3 | x_4 x_5}(x_1, x_2, x_3) = f_{X_1 X_2 X_3 X_4 X_5}(x_1, x_2, x_3, x_4, x_5) / f_{X_4 X_5}(x_4, x_5)

for f_{X_4 X_5}(x_4, x_5) > 0.

Independence with Multiple Variables

The concept of independence extends to multiple variables: X_1, X_2, …, X_p are independent if and only if

f_{X_1 X_2 … X_p}(x_1, x_2, …, x_p) = f_{X_1}(x_1) f_{X_2}(x_2) … f_{X_p}(x_p) for all x_1, x_2, …, x_p

Example 5-18: Layer Thickness

Suppose X_1, X_2, and X_3 represent the thickness in μm of a substrate, an active layer, and a coating layer of a chemical product. Assume these variables are independent and normally distributed with the parameters and specification limits tabled below.

 Normal random variable  X_1      X_2      X_3
 Mean (μ)                10,000   1,000    80
 Std dev (σ)             250      20       4
 Lower limit             9,200    950      75
 Upper limit             10,800   1,050    85
 P(within limits)        0.99863  0.98758  0.78870

What proportion of the product meets all specifications? By independence, P(all within limits) = 0.99863 × 0.98758 × 0.78870 = 0.77783.

Which of the three thicknesses has the least probability of meeting specifications? Layer 3, the coating layer, with P = 0.78870.
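
A short scipy.stats sketch (an addition, using the tabled parameters) reproduces the per-layer probabilities and their product:

```python
from scipy.stats import norm

# (mean, std dev, lower limit, upper limit) for each layer, in micrometers
layers = [(10_000, 250, 9_200, 10_800),
          (1_000, 20, 950, 1_050),
          (80, 4, 75, 85)]

p_all = 1.0
for mu, sd, lo, hi in layers:
    p = norm.cdf(hi, mu, sd) - norm.cdf(lo, mu, sd)
    p_all *= p          # multiply because the layers are independent
    print(round(p, 5))  # 0.99863, 0.98758, 0.78870
print(round(p_all, 5))  # 0.77783
```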

Covariance

Covariance is a measure of the relationship between two random variables. First, we need the expected value of a function of two random variables. Let h(X, Y) denote the function of interest:

E[h(X, Y)] = Σ_x Σ_y h(x, y) f_XY(x, y)    (X, Y discrete)
E[h(X, Y)] = ∫∫ h(x, y) f_XY(x, y) dx dy    (X, Y continuous)

Example 5-19: Expected Value of a Function of Two Random Variables

For the joint probability distribution of the two random variables in Example 5-1, calculate E[(X − μ_X)(Y − μ_Y)]. The result is obtained by multiplying (x − μ_X) times (y − μ_Y) times f_XY(x, y) for each point in the range of (X, Y) and summing. From the marginal distributions determined previously, μ_X = 2.35 and μ_Y = 2.49, so

E[(X − μ_X)(Y − μ_Y)] = E(XY) − μ_X μ_Y = 5.27 − (2.35)(2.49) = −0.5815

Covariance Defined

The covariance between the random variables X and Y, denoted cov(X, Y) or σ_XY, is

σ_XY = E[(X − μ_X)(Y − μ_Y)] = E(XY) − μ_X μ_Y

Correlation (ρ = rho)

The correlation between the random variables X and Y, denoted ρ_XY, is

ρ_XY = cov(X, Y) / [√V(X) √V(Y)] = σ_XY / (σ_X σ_Y), with −1 ≤ ρ_XY ≤ +1

Example 5-21: Covariance & Correlation

Determine the covariance and correlation for the joint distribution in Figure 5-13 (discrete joint distribution, f(x, y)):

 x  y  f(x, y)  x−μ_X  y−μ_Y  (x−μ_X)(y−μ_Y)f(x, y)
 0  0  0.2      −1.8   −1.8     0.648
 1  1  0.1      −0.8   −0.8     0.064
 1  2  0.1      −0.8    0.2    −0.016
 2  1  0.1       0.2   −0.8    −0.016
 2  2  0.1       0.2    0.2     0.004
 3  3  0.4       1.2    1.2     0.576

The marginal distributions of X and Y are identical, with f(0) = f(1) = f(2) = 0.2 and f(3) = 0.4, so μ_X = μ_Y = 1.8 and σ_X = σ_Y = 1.1662.

covariance σ_XY = 1.260    correlation ρ_XY = 1.260 / (1.1662 × 1.1662) = 0.926

Note the strong positive correlation.
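
The covariance and correlation follow directly from the six points; this NumPy sketch (not in the original slides) carries out the sum:

```python
import numpy as np

# (x, y, probability) for the six points of Figure 5-13
pts = np.array([(0, 0, 0.2), (1, 1, 0.1), (1, 2, 0.1),
                (2, 1, 0.1), (2, 2, 0.1), (3, 3, 0.4)])
x, y, p = pts[:, 0], pts[:, 1], pts[:, 2]

mx, my = (x * p).sum(), (y * p).sum()   # both 1.8
cov = ((x - mx) * (y - my) * p).sum()   # 1.26
sx = np.sqrt(((x - mx)**2 * p).sum())   # 1.1662
sy = np.sqrt(((y - my)**2 * p).sum())   # 1.1662
print(cov, cov / (sx * sy))             # 1.26, ~0.926
```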

Independence Implies ρ = 0

If X and Y are independent random variables, then σ_XY = ρ_XY = 0. However, ρ_XY = 0 is a necessary but not sufficient condition for independence.

Example 5-23: Independence Implies Zero Covariance

Let f_XY(x, y) = xy/16 for 0 ≤ x ≤ 2 and 0 ≤ y ≤ 4 (a planar joint distribution, Figure 5-15). Show that E(XY) − E(X) E(Y) = 0:

E(X) = (1/16) ∫_0^4 ∫_0^2 x² y dx dy = (1/16)(8/3) ∫_0^4 y dy = (1/16)(8/3)(8) = 4/3

E(Y) = (1/16) ∫_0^4 ∫_0^2 x y² dx dy = (1/16)(2) ∫_0^4 y² dy = (1/16)(2)(64/3) = 8/3

E(XY) = (1/16) ∫_0^4 ∫_0^2 x² y² dx dy = (1/16)(8/3)(64/3) = 32/9

Therefore, E(XY) − E(X) E(Y) = 32/9 − (4/3)(8/3) = 0.

Multinomial Probability Distribution

Suppose a random experiment consists of a series of n trials. Assume that:
1) The outcome of each trial can be classified into one of k classes.
2) The probability that a trial results in class i is constant and equal to p_i, with p_1 + p_2 + … + p_k = 1.
3) The trials are independent.

The random variables X_1, X_2, …, X_k, which denote the number of trials that result in each class, have a multinomial distribution with joint probability mass function

P(X_1 = x_1, X_2 = x_2, …, X_k = x_k) = [n! / (x_1! x_2! … x_k!)] p_1^{x_1} p_2^{x_2} … p_k^{x_k}

for x_1 + x_2 + … + x_k = n.

Example 5-25: Digital Channel

Of the 20 bits received over a digital channel, 14 are of excellent (E) quality, 3 are good (G), 2 are fair (F), and 1 is poor (P); the sequence received was EEEEEEEEEEEEEEGGGFFP. Let the random variables X_1, X_2, X_3, and X_4 denote the number of bits that are E, G, F, and P, respectively, in a transmission of 20 bits. With class probabilities 0.6, 0.3, 0.08, and 0.02, what is the probability that 12 bits are E, 6 bits are G, 2 are F, and 0 are P?

P(X_1 = 12, X_2 = 6, X_3 = 2, X_4 = 0) = [20! / (12! 6! 2! 0!)] (0.6)^{12} (0.3)^{6} (0.08)^{2} (0.02)^{0} = 0.0358

Using Excel: 0.03582 = (FACT(20)/(FACT(12)*FACT(6)*FACT(2))) * 0.6^12 * 0.3^6 * 0.08^2
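
For readers working in Python rather than Excel, scipy.stats offers the same calculation (a sketch added here, not part of the slides):

```python
from scipy.stats import multinomial

# n = 20 trials, class probabilities for E, G, F, P
p = multinomial.pmf([12, 6, 2, 0], n=20, p=[0.6, 0.3, 0.08, 0.02])
print(round(p, 4))  # ~0.0358
```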

Multinomial Mean and Variance

The marginal distributions of the multinomial are binomial: if X_1, X_2, …, X_k have a multinomial distribution, the marginal probability distribution of X_i is binomial with

E(X_i) = n p_i and V(X_i) = n p_i (1 − p_i)

Bivariate Normal Probability Density Function

The probability density function of a bivariate normal distribution is

f_XY(x, y; σ_X, σ_Y, μ_X, μ_Y, ρ) = [1 / (2π σ_X σ_Y √(1 − ρ²))] e^{−u / (2(1 − ρ²))}

where

u = (x − μ_X)² / σ_X² − 2ρ(x − μ_X)(y − μ_Y) / (σ_X σ_Y) + (y − μ_Y)² / σ_Y²

for −∞ < x < ∞ and −∞ < y < ∞, with parameter limits σ_X > 0, σ_Y > 0, −∞ < μ_X < ∞, −∞ < μ_Y < ∞, and −1 < ρ < 1.

Marginal Distributions of the Bivariate Normal Random Variables

If X and Y have a bivariate normal distribution with joint probability density function f_XY(x, y; σ_X, σ_Y, μ_X, μ_Y, ρ), the marginal probability distributions of X and Y are normal with means μ_X and μ_Y and standard deviations σ_X and σ_Y, respectively.

Conditional Distribution of Bivariate Normal Random Variables

If X and Y have a bivariate normal distribution with joint probability density f_XY(x, y; σ_X, σ_Y, μ_X, μ_Y, ρ), the conditional probability distribution of Y given X = x is normal with mean and variance

μ_{Y|x} = μ_Y + ρ (σ_Y / σ_X)(x − μ_X)
σ²_{Y|x} = σ_Y² (1 − ρ²)

Correlation of Bivariate Normal Random Variables

If X and Y have a bivariate normal distribution with joint probability density function f_XY(x, y; σ_X, σ_Y, μ_X, μ_Y, ρ), the correlation between X and Y is ρ.

Bivariate Normal Correlation and Independence

In general, zero correlation does not imply independence. But in the special case where X and Y have a bivariate normal distribution, ρ = 0 does imply that X and Y are independent.

Linear Functions of Random Variables

A function of random variables is itself a random variable, and it can be formed by either linear or nonlinear relationships. We limit our discussion here to linear functions. Given random variables X_1, X_2, …, X_p and constants c_1, c_2, …, c_p,

Y = c_1 X_1 + c_2 X_2 + … + c_p X_p

is a linear combination of X_1, X_2, …, X_p.

Mean and Variance of a Linear Function

If X_1, X_2, …, X_p are random variables and Y = c_1 X_1 + c_2 X_2 + … + c_p X_p, then

E(Y) = c_1 E(X_1) + c_2 E(X_2) + … + c_p E(X_p)

V(Y) = c_1² V(X_1) + c_2² V(X_2) + … + c_p² V(X_p) + 2 Σ Σ_{i<j} c_i c_j cov(X_i, X_j)

If X_1, X_2, …, X_p are independent, the covariance terms vanish and

V(Y) = c_1² V(X_1) + c_2² V(X_2) + … + c_p² V(X_p)

Example 5-31: Error Propagation

A semiconductor product consists of three layers. The variances of the thicknesses of the layers are 25, 40, and 30 nm². What is the variance of the thickness of the finished product? Assuming the layer thicknesses are independent, the variances add:

V(X) = 25 + 40 + 30 = 95 nm², so σ_X = √95 ≈ 9.75 nm

Mean and Variance of an Average

If X̄ = (X_1 + X_2 + … + X_p) / p and E(X_i) = μ for every i, then E(X̄) = μ. If, in addition, the X_i are independent with V(X_i) = σ² for every i, then V(X̄) = σ² / p.

Reproductive Property of the Normal Distribution

If X_1, X_2, …, X_p are independent normal random variables with E(X_i) = μ_i and V(X_i) = σ_i², then Y = c_1 X_1 + c_2 X_2 + … + c_p X_p is a normal random variable with

E(Y) = c_1 μ_1 + c_2 μ_2 + … + c_p μ_p and V(Y) = c_1² σ_1² + c_2² σ_2² + … + c_p² σ_p²

Example 5-32: Linear Function of Independent Normal Random Variables

Let the random variables X_1 and X_2 denote the length and width (in cm) of a manufactured part, with means 2 and 5 and standard deviations 0.1 and 0.2, respectively. What is the probability that the perimeter exceeds 14.5 cm?

Let Y = 2X_1 + 2X_2 (the perimeter). Then

E(Y) = 2E(X_1) + 2E(X_2) = 2(2) + 2(5) = 14 cm
V(Y) = 2² V(X_1) + 2² V(X_2) = 4(0.1²) + 4(0.2²) = 0.04 + 0.16 = 0.20 cm²
SD(Y) = √0.20 = 0.4472 cm

P(Y > 14.5) = P(Z > (14.5 − 14) / 0.4472) = P(Z > 1.1180) = 0.1318

Using Excel: 0.1318 = 1 − NORMDIST(14.5, 14, SQRT(0.2), TRUE)
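
The same tail probability in Python (a sketch added alongside the Excel formula, not part of the slides):

```python
import numpy as np
from scipy.stats import norm

mu = 2 * 2 + 2 * 5                      # E(Y) = 14 cm
sd = np.sqrt(4 * 0.1**2 + 4 * 0.2**2)   # sqrt(0.20) ~ 0.4472 cm
print(round(norm.sf(14.5, mu, sd), 4))  # P(Y > 14.5) ~ 0.1318
```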

General Function of a Discrete Random Variable

Suppose that X is a discrete random variable with probability distribution f_X(x). Let Y = h(X) define a one-to-one transformation between the values of X and Y so that the equation y = h(x) can be solved uniquely for x in terms of y. Let this solution be x = u(y), the inverse transform function. Then the probability mass function of the random variable Y is

f_Y(y) = f_X[u(y)]

Example 5-34: Function of a Discrete Random Variable

Let X be a geometric random variable with probability distribution f_X(x) = p(1 − p)^{x−1}, x = 1, 2, …. Find the probability distribution of Y = X².

Solution: Since X ≥ 1, the transformation is one-to-one. The inverse transform function is x = √y, so

f_Y(y) = p(1 − p)^{√y − 1}, y = 1, 4, 9, 16, …

General Function of a Continuous Random Variable

Suppose that X is a continuous random variable with probability distribution f_X(x). Let Y = h(X) define a one-to-one transformation between the values of X and Y so that the equation y = h(x) can be solved uniquely for x in terms of y. Let this solution be x = u(y), the inverse transform function. Then the probability distribution of Y is

f_Y(y) = f_X[u(y)] |J|

where J = u′(y) is called the Jacobian of the transformation and the absolute value of J is used.

Example 5-35: Function of a Continuous Random Variable

Let X be a continuous random variable with probability distribution f_X(x) = x/8 for 0 ≤ x ≤ 4. Find the probability distribution of Y = h(X) = 2X + 4. Note that Y has a one-to-one relationship to X:

x = u(y) = (y − 4)/2, and the Jacobian is J = u′(y) = 1/2

f_Y(y) = f_X[(y − 4)/2] |J| = [(y − 4)/2] / 8 × 1/2 = (y − 4)/32 for 4 ≤ y ≤ 12
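
The transformation steps can be mirrored symbolically; this SymPy sketch (an addition, not from the slides) applies the inverse transform and Jacobian, then confirms that the resulting density integrates to 1:

```python
import sympy as sp

x, y = sp.symbols('x y')

fX = x / 8                      # PDF of X on [0, 4]
u = (y - 4) / 2                 # inverse transform x = u(y)
J = sp.diff(u, y)               # Jacobian u'(y) = 1/2

fY = fX.subs(x, u) * sp.Abs(J)  # (y - 4)/32 on [4, 12]
print(sp.simplify(fY))                 # y/32 - 1/8, i.e. (y - 4)/32
print(sp.integrate(fY, (y, 4, 12)))    # 1
```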

Definition of Moments about the Origin

The rth moment about the origin of the random variable X is

μ′_r = E(X^r) = Σ_x x^r f(x)    (X discrete)
μ′_r = E(X^r) = ∫ x^r f(x) dx    (X continuous)

Definition of a Moment-Generating Function

The moment-generating function of the random variable X is the expected value of e^{tX} and is denoted by M_X(t). That is,

M_X(t) = E(e^{tX}) = Σ_x e^{tx} f(x)    (X discrete)
M_X(t) = E(e^{tX}) = ∫ e^{tx} f(x) dx    (X continuous)

Let X be a random variable with moment-generating function M_X(t). Then

μ′_r = E(X^r) = d^r M_X(t) / dt^r evaluated at t = 0

Example 5-36: Moment-Generating Function for a Binomial Random Variable-1

Let X follow a binomial distribution, that is,

f(x) = C(n, x) p^x (1 − p)^{n−x}, x = 0, 1, …, n

Determine the moment-generating function and use it to verify that the mean and variance of the binomial random variable are μ = np and σ² = np(1 − p).

The moment-generating function is

M_X(t) = Σ_{x=0}^{n} e^{tx} C(n, x) p^x (1 − p)^{n−x} = Σ_{x=0}^{n} C(n, x) (pe^t)^x (1 − p)^{n−x}

which is the binomial expansion of [pe^t + (1 − p)]^n. The first and second derivatives are

M′_X(t) = npe^t [1 + p(e^t − 1)]^{n−1}
M″_X(t) = npe^t (1 − p + npe^t) [1 + p(e^t − 1)]^{n−2}

Example 5-36: Moment-Generating Function for a Binomial Random Variable-2

Setting t = 0 in the two derivatives gives

μ′_1 = M′_X(0) = np and μ′_2 = M″_X(0) = np(1 − p + np)

The variance is then

σ² = μ′_2 − (μ′_1)² = np(1 − p + np) − (np)² = np − np² = np(1 − p)

Hence the mean is np and the variance is np(1 − p).
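
Differentiating the MGF is mechanical, so it is a natural job for a computer algebra system; this SymPy sketch (an addition to the slides) recovers the binomial mean and variance:

```python
import sympy as sp

t, p = sp.symbols('t p', positive=True)
n = sp.symbols('n', positive=True, integer=True)

M = (p * sp.exp(t) + 1 - p)**n     # binomial MGF [p*e^t + (1 - p)]^n

mu1 = sp.diff(M, t).subs(t, 0)     # first moment about the origin
mu2 = sp.diff(M, t, 2).subs(t, 0)  # second moment about the origin
print(sp.simplify(mu1))            # n*p
print(sp.simplify(mu2 - mu1**2))   # n*p*(1 - p)
```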

Properties of Moment-Generating Functions

If X is a random variable and a is a constant, then

1. M_{X+a}(t) = e^{at} M_X(t)
2. M_{aX}(t) = M_X(at)

If X_1, X_2, …, X_n are independent random variables with moment-generating functions M_{X_1}(t), M_{X_2}(t), …, M_{X_n}(t), respectively, and if Y = X_1 + X_2 + … + X_n, then the moment-generating function of Y is

3. M_Y(t) = M_{X_1}(t) · M_{X_2}(t) · … · M_{X_n}(t)

Example 5-38: Distribution of a Sum of Poisson Random Variables

Suppose that X_1 and X_2 are two independent Poisson random variables with parameters λ_1 and λ_2, respectively. Determine the probability distribution of Y = X_1 + X_2.

The moment-generating function of a Poisson random variable with parameter λ is M_X(t) = e^{λ(e^t − 1)}. Hence, for X_1 and X_2,

M_{X_1}(t) = e^{λ_1(e^t − 1)} and M_{X_2}(t) = e^{λ_2(e^t − 1)}

Using property 3 above, the moment-generating function of Y = X_1 + X_2 is

M_Y(t) = M_{X_1}(t) · M_{X_2}(t) = e^{λ_1(e^t − 1)} e^{λ_2(e^t − 1)} = e^{(λ_1 + λ_2)(e^t − 1)}

which is the moment-generating function of a Poisson random variable with parameter λ_1 + λ_2. By the uniqueness of moment-generating functions, Y has a Poisson distribution with parameter λ_1 + λ_2.
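
The MGF product in this example can also be checked symbolically; a small SymPy sketch (added here, not from the slides):

```python
import sympy as sp

t = sp.symbols('t')
lam1, lam2 = sp.symbols('lambda1 lambda2', positive=True)

M1 = sp.exp(lam1 * (sp.exp(t) - 1))  # MGF of Poisson(lambda1)
M2 = sp.exp(lam2 * (sp.exp(t) - 1))  # MGF of Poisson(lambda2)

# MGF of a Poisson random variable with parameter lambda1 + lambda2
target = sp.exp((lam1 + lam2) * (sp.exp(t) - 1))

print((M1 * M2).equals(target))  # True, so Y = X1 + X2 is Poisson(lambda1 + lambda2)
```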

Important Terms & Concepts for Chapter 5

Bivariate distribution; bivariate normal distribution; conditional mean; conditional probability density function; conditional probability mass function; conditional variance; contour plots; correlation; covariance; error propagation; general functions of random variables; independence; joint probability density function; joint probability mass function; linear functions of random variables; marginal probability distribution; multinomial distribution; reproductive property of the normal distribution