Chapter 4 Multiple Random Variables

Chapter 4.1 Joint and Marginal Distributions

Definition 4.1.1: An n-dimensional random vector is a function from a sample space S into n-dimensional Euclidean space R^n.

Example 4.1.2 (Sample space for dice): Roll two fair dice. Recall that there are 36 equally likely outcomes in this experiment. Define X = sum of the dice and Y = |difference of the dice|; then (X, Y) is a two-dimensional random vector. To calculate probabilities in terms of (X, Y), we need to go back to the original sample space. For any A ⊂ R^2, we have:

    P((X, Y) ∈ A) = P({s ∈ S : (X(s), Y(s)) ∈ A}).

If A = I1 × I2 = {(x, y) : x ∈ I1, y ∈ I2} (which is called a cross-product), we have:

    P((X, Y) ∈ A) = P(X ∈ I1, Y ∈ I2) = Σ_{s ∈ S : X(s) ∈ I1, Y(s) ∈ I2} P(s).

Specifically, if A = {x} × {y}, we have:

    P((X, Y) ∈ A) = P(X = x, Y = y) = Σ_{s ∈ S : X(s) = x, Y(s) = y} P(s).

For example, we can calculate:

    P(X = 5, Y = 3) = P((4, 1)) + P((1, 4)) = 2/36,

and, for A = {(x, y) : x + y ≤ 4},

    P((X, Y) ∈ A) = P((1, 1)) + P((1, 2)) + P((2, 1)) + P((2, 2)) = 4/36.
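The sample-space calculation above is easy to verify by brute force. The following sketch (an illustration of mine, not part of the original notes) enumerates the 36 equally likely outcomes in Python and sums P(s) over the qualifying outcomes:

```python
from fractions import Fraction

# Enumerate the 36 equally likely outcomes s = (d1, d2) of two fair dice.
outcomes = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]

def prob(event):
    """P((X, Y) in A) as a sum of P(s) = 1/36 over qualifying outcomes s,
    where X = sum of the dice and Y = |difference of the dice|."""
    return Fraction(sum(1 for (d1, d2) in outcomes
                        if event(d1 + d2, abs(d1 - d2))), 36)

p1 = prob(lambda x, y: x == 5 and y == 3)  # P(X = 5, Y = 3)
p2 = prob(lambda x, y: x + y <= 4)         # P(X + Y <= 4)
```

Using `Fraction` keeps the arithmetic exact, so the results can be compared directly with 2/36 and 4/36.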

Definition 4.1.3: Let (X, Y) be a discrete bivariate random vector. Then the function f(x, y) from R^2 into R defined by f(x, y) = f_{X,Y}(x, y) = P(X = x, Y = y) is called the joint probability mass function, or joint pmf, of (X, Y). In this case f(x, y) must satisfy (1) 0 ≤ f(x, y) ≤ 1 and (2) Σ_{(x,y) ∈ R^2} f(x, y) = 1.

Example: Find f_{X,Y}(x, y) for (X, Y) defined in Example 4.1.2 and calculate P((X, Y) ∈ A).

    y \ x |  2     3     4     5     6     7     8     9    10    11    12
    ------+----------------------------------------------------------------
      0   | 1/36        1/36        1/36        1/36        1/36        1/36
      1   |       1/18        1/18        1/18        1/18        1/18
      2   |             1/18        1/18        1/18        1/18
      3   |                   1/18        1/18        1/18
      4   |                         1/18        1/18
      5   |                               1/18

As with a single random variable X, where P(X ∈ A) = Σ_{x ∈ A} f_X(x), we have:

    P((X, Y) ∈ A) = Σ_{s : (X(s), Y(s)) ∈ A} P(s) = Σ_{(x,y) ∈ A} f(x, y).

For example, we can calculate P((X, Y) ∈ A) for A = {(x, y) : x = 7 and 0 ≤ y ≤ 4} = {(7, 0), (7, 1), (7, 2), (7, 3), (7, 4)}. We have

    P((X, Y) ∈ A) = f_{X,Y}(7, 0) + f_{X,Y}(7, 1) + f_{X,Y}(7, 2) + f_{X,Y}(7, 3) + f_{X,Y}(7, 4)
                  = f_{X,Y}(7, 1) + f_{X,Y}(7, 3) = 1/9.
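The joint pmf table above can also be built by counting outcomes and then summed over an event. A short sketch of mine (not from the notes):

```python
from collections import Counter
from fractions import Fraction

# Build the joint pmf of (X, Y) = (sum, |difference|) by counting the
# 36 equally likely dice outcomes.
counts = Counter((d1 + d2, abs(d1 - d2))
                 for d1 in range(1, 7) for d2 in range(1, 7))
f = {xy: Fraction(c, 36) for xy, c in counts.items()}

# P(X = 7 and 0 <= Y <= 4): only f(7, 1) and f(7, 3) are nonzero.
p = sum(f.get((7, y), Fraction(0)) for y in range(5))
```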

Similarly, we can calculate P((X, Y) ∈ A) for A = {(x, y) : x + y ≤ 4} = {(2, 0), (2, 1), (3, 0), (3, 1), (4, 0)}. We have

    P((X, Y) ∈ A) = f_{X,Y}(2, 0) + f_{X,Y}(2, 1) + f_{X,Y}(3, 0) + f_{X,Y}(3, 1) + f_{X,Y}(4, 0)
                  = f_{X,Y}(2, 0) + f_{X,Y}(3, 1) + f_{X,Y}(4, 0) = 4/36.

Example (Selecting a committee): An ad hoc committee of three is selected randomly from a pool of 10 students consisting of three seniors, three juniors, two sophomores, and two freshmen. Let X be the number of seniors and Y the number of juniors selected. What is the joint pmf of (X, Y)?

The joint pmf of (X, Y) is:

    f_{X,Y}(x, y) = n(x, y)/C(10, 3) = C(3, x) C(3, y) C(4, 3 - x - y) / C(10, 3),
                    x, y = 0, 1, 2, 3 and x + y ≤ 3,

where C(n, k) denotes the binomial coefficient "n choose k".

How do we compute the expected value of a function g(X, Y)?

    E g(X, Y) = Σ_{(x,y) ∈ R^2} g(x, y) f(x, y).

Example 4.1.4: Find E(XY) for (X, Y) defined in Example 4.1.2. Let g(x, y) = xy; then

    E(XY) = Σ_{(x,y) ∈ R^2} g(x, y) f(x, y) = Σ_{(x,y) ∈ R^2} x y f(x, y) = 245/18 = 13 11/18.

Theorem: A function f(x, y) is a pmf of a random vector (X, Y) if and only if

    (a) f(x, y) ≥ 0 for all x, y;
    (b) Σ_{(x,y)} f(x, y) = 1.

Example 4.1.5 (Joint pmf for dice): Define f(x, y) by

    f(0, 0) = f(0, 1) = 1/6,  f(1, 0) = f(1, 1) = 1/3,  f(x, y) = 0 for any other (x, y).

For example, we can calculate (without referring to the original sample space):

    P(X = Y) = P(X = 0, Y = 0) + P(X = 1, Y = 1) = 1/6 + 1/3 = 1/2.

If we toss two fair dice and define X = 0 if the first die shows at most 2 and X = 1 otherwise, and define Y = 0 if the second die shows an odd number and Y = 1 otherwise, then (X, Y) has the above pmf.

Theorem 4.1.6: Let (X, Y) be a discrete bivariate random vector with joint pmf f_{X,Y}(x, y). Then the marginal pmfs of X and Y, f_X(x) = P(X = x) and f_Y(y) = P(Y = y), are given by

    f_X(x) = Σ_{y ∈ R} f_{X,Y}(x, y)   and   f_Y(y) = Σ_{x ∈ R} f_{X,Y}(x, y).

Example 4.1.7 (Marginal pmf for the dice example):

    y \ x  |  2     3     4     5     6     7     8     9    10    11    12   | f_Y(y)
    -------+----------------------------------------------------------------+-------
      0    | 1/36        1/36        1/36        1/36        1/36        1/36 |  6/36
      1    |       1/18        1/18        1/18        1/18        1/18       | 10/36
      2    |             1/18        1/18        1/18        1/18             |  8/36
      3    |                   1/18        1/18        1/18                   |  6/36
      4    |                         1/18        1/18                         |  4/36
      5    |                               1/18                               |  2/36
    f_X(x) | 1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36 |   1

Example 4.1.8 (Dice probabilities): Quantities involving only Y can be calculated using either the marginal or the joint pmf:

    P(Y < 3) = P(Y = 0) + P(Y = 1) + P(Y = 2) = (6 + 10 + 8)/36 = 2/3,

    P(Y < 3) = Σ_{x=2}^{12} Σ_{y=0}^{2} P(X = x, Y = y) = 2/3.

Note: It is possible to have the same marginals but not the same joint pmf. Hence, the joint pmf cannot be determined from just the marginals.

Example 4.1.9 (Same marginals, different joint pmf): Define two joint pmfs by

    (1) f(0, 0) = f(0, 1) = 1/6,  f(1, 0) = f(1, 1) = 1/3;
    (2) f(0, 0) = 1/12,  f(1, 0) = 5/12,  f(0, 1) = f(1, 1) = 3/12.

Then they have the same marginal distributions for X and Y.
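Theorem 4.1.6 and the dice quantities above are straightforward to verify mechanically. A sketch of mine (not part of the notes) that recomputes the marginals, P(Y < 3), and E(XY):

```python
from collections import Counter
from fractions import Fraction

# Joint pmf of (X, Y) = (sum, |difference|) for two fair dice.
counts = Counter((d1 + d2, abs(d1 - d2))
                 for d1 in range(1, 7) for d2 in range(1, 7))
f = {xy: Fraction(c, 36) for xy, c in counts.items()}

# Marginals by summing the joint pmf over the other variable (Theorem 4.1.6).
fX = {x: sum(p for (a, _), p in f.items() if a == x) for x in range(2, 13)}
fY = {y: sum(p for (_, b), p in f.items() if b == y) for y in range(6)}

p_Y_lt_3 = fY[0] + fY[1] + fY[2]                  # Example 4.1.8: 2/3
E_XY = sum(x * y * p for (x, y), p in f.items())  # Example 4.1.4: 245/18
```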

(1) f_X(0) = f(0, 0) + f(0, 1) = 1/6 + 1/6 = 1/3,   f_Y(0) = f(0, 0) + f(1, 0) = 1/6 + 1/3 = 1/2;
(2) f_X(0) = f(0, 0) + f(0, 1) = 1/12 + 3/12 = 1/3,  f_Y(0) = f(0, 0) + f(1, 0) = 1/12 + 5/12 = 1/2.

Definition 4.1.10: A function f(x, y) from R^2 into R is called a joint density function, or joint pdf, of the continuous bivariate random vector (X, Y) if for every A ⊂ R^2,

    P((X, Y) ∈ A) = ∫∫_A f(x, y) dx dy.

Notes:

    A valid joint pdf f(x, y) must satisfy (1) f(x, y) ≥ 0 and (2) ∫_{-∞}^{∞} ∫_{-∞}^{∞} f(x, y) dx dy = 1.

    The expected value of a real-valued function g(X, Y) is E g(X, Y) = ∫∫_{R^2} g(x, y) f(x, y) dx dy.

    The marginal pdfs of X and Y are given by f_X(x) = ∫_{-∞}^{∞} f(x, y) dy and f_Y(y) = ∫_{-∞}^{∞} f(x, y) dx.

Definition: The joint cdf of (X, Y) is defined by F(x, y) = P(X ≤ x, Y ≤ y) for all (x, y) ∈ R^2. For the continuous case,

    F(x, y) = ∫_{-∞}^{x} ∫_{-∞}^{y} f(s, t) dt ds,

which by the Fundamental Theorem of Calculus implies

    f(x, y) = ∂^2 F(x, y) / (∂x ∂y).

Example 4.1.11 (Calculating joint probabilities): Define the joint pdf by

    f(x, y) = 6 x y^2 for 0 < x < 1 and 0 < y < 1;  f(x, y) = 0 otherwise.

(1) Show that this is a valid joint pdf.
(2) Find f_X(x) and f_Y(y).
(3) Find P(X + Y ≥ 1).

(1) f(x, y) ≥ 0, and ∫_0^1 ∫_0^1 6 x y^2 dx dy = 1.
(2) f_X(x) = 2x for 0 < x < 1,  f_Y(y) = 3y^2 for 0 < y < 1.
(3) P(X + Y ≥ 1) = ∫_0^1 ∫_{1-y}^1 6 x y^2 dx dy = ∫_0^1 ∫_{1-x}^1 6 x y^2 dy dx = 9/10.

Example 4.1.12: Let the continuous random vector (X, Y) have joint pdf

    f(x, y) = e^{-y} for 0 < x < y < ∞;  f(x, y) = 0 otherwise.

Equivalently, f(x, y) = e^{-y} I_{{(u,v) : 0 < u < v < ∞}}(x, y). Find P(X + Y ≥ 1) and the marginal pdfs of X and Y.
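The answers to Example 4.1.11 can be sanity-checked numerically. A rough sketch of mine (not from the notes) approximating the double integrals with a midpoint Riemann sum on a grid:

```python
# Midpoint Riemann sum for f(x, y) = 6*x*y**2 on the unit square.
n = 400                       # grid resolution per axis
h = 1.0 / n
mids = [(i + 0.5) * h for i in range(n)]

# Total mass should be ~1; P(X + Y >= 1) should be ~9/10.
total = sum(6 * x * y * y * h * h for x in mids for y in mids)
p_sum_ge_1 = sum(6 * x * y * y * h * h
                 for x in mids for y in mids if x + y >= 1)
```

The event probability carries an O(h) error from cells straddling the line x + y = 1, so the agreement is only to a couple of decimal places.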

P(X + Y ≥ 1) = 1 - P(X + Y < 1) = 1 - ∫_0^{1/2} ∫_x^{1-x} e^{-y} dy dx
             = 1 - ∫_0^{1/2} (e^{-x} - e^{-(1-x)}) dx = 2 e^{-1/2} - e^{-1}.

    f_X(x) = ∫_x^∞ e^{-y} dy = e^{-x},  for x > 0;
    f_Y(y) = ∫_0^y e^{-y} dx = y e^{-y},  for y > 0.

Example (Tensile strength): The tensile strengths X and Y of two kinds of nylon fibers have joint density function proportional to exp(-(x + y)/λ) for x > 0, y > 0. Then the joint pdf of (X, Y) is given by f_{X,Y}(x, y) = c exp(-(x + y)/λ) for x > 0, y > 0.

(a) Find c.
(b) Find the cdf of (X, Y).
(c) Find P(X + Y > λ).

(a) We have:

    1 = ∫_0^∞ ∫_0^∞ f(x, y) dx dy = ∫_0^∞ ∫_0^∞ c exp(-(x + y)/λ) dx dy
      = c ∫_0^∞ exp(-x/λ) dx ∫_0^∞ exp(-y/λ) dy = c λ^2,

so c = 1/λ^2.
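The closed form 2e^{-1/2} - e^{-1} for Example 4.1.12 can be checked by simulation. Since f(x, y) = e^{-x} · e^{-(y-x)} on 0 < x < y, the pair (X, Y - X) consists of two independent Exponential(1) variables, which gives an easy sampler. A Monte Carlo sketch of mine (not part of the notes):

```python
import math
import random

closed_form = 2 * math.exp(-0.5) - math.exp(-1)  # P(X + Y >= 1) from above

random.seed(0)
N = 200_000
hits = 0
for _ in range(N):
    x = random.expovariate(1.0)       # X ~ Exponential(1)
    y = x + random.expovariate(1.0)   # Y = X + independent Exponential(1)
    hits += (x + y >= 1)
mc_estimate = hits / N
```

With 200,000 samples the standard error is well under 0.01, so the estimate should sit close to the closed form.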

(b) We have:

    F_{X,Y}(x, y) = ∫_0^x ∫_0^y f(s, t) dt ds = ∫_0^x ∫_0^y c exp(-(s + t)/λ) dt ds
                  = (1/λ^2) ∫_0^x exp(-s/λ) ds ∫_0^y exp(-t/λ) dt
                  = [1 - exp(-x/λ)][1 - exp(-y/λ)],  for x > 0, y > 0.

(c) We have:

    P(X + Y > λ) = ∫∫_{x+y>λ} f(x, y) dx dy = 1 - ∫∫_{x+y≤λ} f(x, y) dx dy
                 = 1 - ∫_0^λ ( ∫_0^{λ-y} c exp(-(x + y)/λ) dx ) dy
                 = 1 - ∫_0^λ (1/λ) exp(-y/λ) [1 - exp(-(λ - y)/λ)] dy
                 = 1 - (1/λ) ∫_0^λ (exp(-y/λ) - exp(-1)) dy
                 = 1 - (1 - exp(-1) - exp(-1)) = 2 exp(-1).
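Note that the answer 2e^{-1} does not depend on λ. A Monte Carlo sketch of mine (not part of the notes) checking this for an arbitrary scale, using the fact that the joint pdf factorizes into two exponential(λ) densities:

```python
import math
import random

random.seed(1)
lam = 3.7          # arbitrary scale parameter; the answer should not depend on it
N = 200_000

# X, Y i.i.d. exponential with mean lam; expovariate takes the rate 1/lam.
hits = sum(1 for _ in range(N)
           if random.expovariate(1 / lam) + random.expovariate(1 / lam) > lam)
mc_estimate = hits / N
claim = 2 * math.exp(-1)   # ~0.7358, from the derivation above
```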

Alternatively, integrating over the region {x + y > λ} directly:

    P(X + Y > λ) = ∫∫_{x+y>λ} f(x, y) dx dy
                 = ∫_0^λ ( ∫_{λ-x}^∞ c exp(-(x + y)/λ) dy ) dx + ∫_λ^∞ ( ∫_0^∞ c exp(-(x + y)/λ) dy ) dx
                 = (1/λ) ∫_0^λ exp(-x/λ) exp(-(λ - x)/λ) dx + (1/λ) ∫_λ^∞ exp(-x/λ) dx
                 = (1/λ) ∫_0^λ exp(-1) dx + exp(-1) = 2 exp(-1).

Example (Obtaining the pdf from the cdf): Consider the cdf

    F_{X,Y}(x, y) = [1 - exp(-x)][1 - exp(-y)] for x > 0 and y > 0.

Then the pdf is:

    f_{X,Y}(x, y) = ∂^2 F(x, y)/(∂x ∂y) = exp(-x) exp(-y) for x > 0 and y > 0.

Example (Obtaining the cdf from the pdf): Consider the following pdf:

    f(x, y) = 1 for 0 < x < 1, 0 < y < 1.

Then the corresponding cdf is:

    F(x, y) = 0,   if x < 0 or y < 0;
            = xy,  if 0 ≤ x < 1, 0 ≤ y < 1;
            = y,   if 1 ≤ x, 0 ≤ y < 1;
            = x,   if 0 ≤ x < 1, 1 ≤ y;
            = 1,   if 1 ≤ x, 1 ≤ y.

Chapter 4.2 Conditional Distributions and Independence

Definition 4.2.1: Let (X, Y) be a discrete bivariate random vector with joint pmf f(x, y) and marginal pmfs f_X(x) and f_Y(y). For any x such that P(X = x) = f_X(x) > 0, the conditional pmf of Y given that X = x is the function of y denoted by f(y|x):

    f(y|x) = P(Y = y | X = x) = f(x, y)/f_X(x).

For any y such that P(Y = y) = f_Y(y) > 0, the conditional pmf of X given that Y = y is the function of x denoted by f(x|y):

    f(x|y) = P(X = x | Y = y) = f(x, y)/f_Y(y).

Note: f(y|x) is a valid pmf, since f(y|x) ≥ 0 (because f(x, y) and f_X(x) are joint and marginal pmfs) and

    Σ_y f(y|x) = Σ_y f(x, y)/f_X(x) = f_X(x)/f_X(x) = 1.

Example 4.2.2: Define the joint pmf of (X, Y) by

    f(0, 1) = f(0, 2) = 2/18,  f(1, 1) = f(1, 3) = 3/18,  f(1, 2) = 4/18,  f(2, 3) = 4/18,

that is:

    y \ x  |  0     1     2   | f_Y(y)
    -------+------------------+-------
      1    | 2/18  3/18       |  5/18
      2    | 2/18  4/18       |  6/18
      3    |       3/18  4/18 |  7/18
    f_X(x) | 4/18 10/18  4/18 |   1

(a) Obtain the conditional distribution of Y given X = 1.
(b) Find P(Y > 1 | X = 1).

(a) f_X(1) = 10/18, so

    f(Y = 1 | X = 1) = 3/10,  f(Y = 2 | X = 1) = 4/10,  f(Y = 3 | X = 1) = 3/10.

(b) P(Y > 1 | X = 1) = f(Y = 2 | X = 1) + f(Y = 3 | X = 1) = 7/10. Equivalently,

    P(Y > 1 | X = 1) = P(Y > 1, X = 1)/P(X = 1) = 7/10.

Example (Coin tossing): Consider tossing a fair coin three times and let X be the number of heads and Y the number of tails before the first head (Y = 3 if no head appears). The joint pmf of (X, Y) is given in the following table:

    y \ x    |  0    1    2    3  | P(Y = y)
    ---------+--------------------+---------
      0      |      1/8  2/8  1/8 |   4/8
      1      |      1/8  1/8      |   2/8
      2      |      1/8           |   1/8
      3      | 1/8                |   1/8
    P(X = x) | 1/8  3/8  3/8  1/8 |    1

The conditional pmf f(x|y):

    f(x | y = 0) = P(X = x | Y = 0) = P(X = x, Y = 0)/P(Y = 0)
                 = 1/4, x = 1 or 3;  1/2, x = 2;  0, otherwise.

    f(x | y = 1) = P(X = x | Y = 1) = P(X = x, Y = 1)/P(Y = 1)
                 = 1/2, x = 1 or 2;  0, otherwise.

Conditional expected values and moments:

    E(X | Y = 0) = 1·(1/4) + 2·(1/2) + 3·(1/4) = 2,
    E(X^2 | Y = 0) = 1·(1/4) + 4·(1/2) + 9·(1/4) = 4.5.

A conditional probability:

    P(X ≤ 2 | Y = 0) = P(X = 0 | Y = 0) + P(X = 1 | Y = 0) + P(X = 2 | Y = 0) = 0 + 1/4 + 1/2 = 3/4.

Definition 4.2.3: Let (X, Y) be a continuous bivariate random vector with joint pdf f(x, y) and marginal pdfs f_X(x) and f_Y(y). For any x such that f_X(x) > 0, the conditional pdf of Y given that X = x is the function of y denoted by f(y|x):

    f(y|x) = f(x, y)/f_X(x).

For any y such that f_Y(y) > 0, the conditional pdf of X given that Y = y is the function of x denoted by f(x|y):

    f(x|y) = f(x, y)/f_Y(y).

Example 4.2.4: Let the continuous random vector (X, Y) have joint pdf

    f(x, y) = e^{-y} I_{{(u,v) : 0 < u < v < ∞}}(x, y).

(a) Find the marginal pdf of X.
(b) For any x such that f_X(x) > 0, find f(y|x).
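The coin-tossing calculation above is small enough to verify by enumerating all eight outcomes. An illustrative sketch of mine (not part of the notes):

```python
from fractions import Fraction
from itertools import product

def to_xy(seq):
    """X = number of heads, Y = number of tails before the first head."""
    x = seq.count('H')
    y = seq.index('H') if 'H' in seq else len(seq)  # Y = 3 for TTT
    return x, y

pairs = [to_xy(s) for s in product('HT', repeat=3)]  # 8 equally likely outcomes

# Condition on Y = 0 (first toss is a head) and compute conditional moments.
xs_given_y0 = [x for x, y in pairs if y == 0]
E_X = Fraction(sum(xs_given_y0), len(xs_given_y0))
E_X2 = Fraction(sum(x * x for x in xs_given_y0), len(xs_given_y0))
P_le2 = Fraction(sum(1 for x in xs_given_y0 if x <= 2), len(xs_given_y0))
```

Conditioning here is just restricting the equally likely outcomes to those with Y = 0 and renormalizing, which the list comprehension does implicitly.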

(a) f_X(x) = ∫_x^∞ e^{-y} dy = e^{-x} I_{(0,∞)}(x).

(b) For x > 0,

    f(y|x) = f(x, y)/f_X(x) = e^{-y}/e^{-x} = e^{-(y-x)},  for y > x.

Calculating expected values using conditional pmfs or pdfs: Let (X, Y) be a discrete (continuous) bivariate random vector with joint pmf (pdf) f(x, y) and marginal pmfs (pdfs) f_X(x) and f_Y(y), and let g(Y) be a function of Y. Then the conditional expected value of g(Y) given that X = x is denoted by E(g(Y)|x) and is given by

    E(g(Y)|x) = Σ_y g(y) f(y|x)   [ E(g(Y)|x) = ∫_{-∞}^{∞} g(y) f(y|x) dy ].

Example 4.2.4 (continued):

(c) Find E(Y | X = x).
(d) Find Var(Y | X = x).

(c) E(Y | X = x) = ∫_x^∞ y e^{-(y-x)} dy = 1 + x.

(d) E(Y^2 | X = x) = ∫_x^∞ y^2 e^{-(y-x)} dy = ∫_0^∞ (z + x)^2 e^{-z} dz = x^2 + 2x + 2, therefore

    Var(Y | X = x) = E(Y^2 | X = x) - (E(Y | X = x))^2 = (x^2 + 2x + 2) - (1 + x)^2 = 1.

Example: A soft drink machine has a random amount Y2 in supply at the beginning of a given day and dispenses a random amount Y1 during the day. It has been observed that they have the following joint pdf:

    f(y1, y2) = 1/2 for 0 ≤ y1 ≤ y2 ≤ 2.

Then the conditional pdf of Y1 given Y2 = y2 is:

    f(y1 | y2) = f(y1, y2)/f_{Y2}(y2) = 1/y2 for 0 ≤ y1 ≤ y2.

The probability P(Y1 ≤ 1/2 | Y2 = 1.5) = (1/2)/1.5 = 1/3.

Definition 4.2.5: Let (X, Y) be a bivariate random vector with joint pdf or pmf f(x, y) and marginal pdfs or pmfs f_X(x) and f_Y(y). Then X and Y are called independent random variables if, for every x ∈ R and y ∈ R,

    f(x, y) = f_X(x) f_Y(y).

Consequently, if X and Y are independent, f(y|x) = f_Y(y) and f(x|y) = f_X(x).

Technical note: If f(x, y) is the joint pdf for the continuous random vector (X, Y) and f(x, y) ≠ f_X(x) f_Y(y) only on a set A such that ∫∫_A f(x, y) dx dy = 0, then X and Y are still called independent random variables.

Example 4.2.6: Consider the discrete bivariate random vector (X, Y) with joint pmf given by

    f(1, 1) = f(2, 1) = f(2, 2) = 1/10,  f(1, 2) = f(1, 3) = 1/5,  f(2, 3) = 3/10.

Find the marginals of X and Y. Are X and Y independent? No, because

    f(1, 3) = 1/5 ≠ f_X(1) f_Y(3) = (1/2)(1/2) = 1/4.

Question: Can we check for independence without knowing the marginals?
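The conditional moments derived in Example 4.2.4(c)-(d) can be checked numerically by integrating the conditional pdf directly. A sketch of mine (not part of the notes), using a midpoint rule with a truncated upper limit:

```python
import math

def cond_moment(x, k, upper=60.0, n=200_000):
    """Midpoint-rule approximation of the integral of y^k * e^{-(y-x)}
    over (x, upper); the tail beyond `upper` is negligible."""
    h = (upper - x) / n
    return sum((x + (i + 0.5) * h) ** k * math.exp(-(i + 0.5) * h) * h
               for i in range(n))

x0 = 1.3                     # arbitrary conditioning point with f_X(x0) > 0
EY = cond_moment(x0, 1)      # should be ~ 1 + x0
VarY = cond_moment(x0, 2) - EY ** 2   # should be ~ 1, independent of x0
```

That the conditional variance comes out as 1 regardless of x0 reflects the memoryless property of the exponential distribution: given X = x, Y - x is again Exponential(1).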

Lemma 4.2.7: Let (X, Y) be a bivariate random vector with joint pdf or pmf f(x, y). Then X and Y are independent random variables if and only if there exist functions g(x) and h(y) such that, for every x ∈ R and y ∈ R,

    f(x, y) = g(x) h(y).

In this case f_X(x) = c g(x) and f_Y(y) = d h(y), where c and d are constants that make f_X(x) and f_Y(y) valid pdfs or pmfs.

Example 4.2.8: Consider the joint pdf

    f(x, y) = (1/384) x^2 y^4 e^{-y - x/2},  x > 0 and y > 0.

By Lemma 4.2.7, X and Y are independent random variables.

Notes:

1. Consider the set {(x, y) : x ∈ A and y ∈ B}, where A = {x : f_X(x) > 0} and B = {y : f_Y(y) > 0}. This set is called a cross-product and is denoted by A × B.

2. If f(x, y) is a joint pdf or pmf such that the set {(x, y) : f(x, y) > 0} is not a cross-product, then the random variables X and Y with joint pdf or pmf f(x, y) are not independent.

3. If it is known that X and Y are independent random variables with marginal pdfs (or pmfs) f_X(x) and f_Y(y), then the joint pdf (or pmf) of X and Y is given by f(x, y) = f_X(x) f_Y(y). (See Example 4.2.9 for the discrete case.)

Theorem 4.2.10: Let X and Y be independent random variables.

(a) For any A ⊂ R and B ⊂ R, P(X ∈ A, Y ∈ B) = P(X ∈ A) P(Y ∈ B).

(b) Let g(x) be a function only of x and h(y) be a function only of y. Then

    E(g(X) h(Y)) = E(g(X)) E(h(Y)).
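For a discrete vector, Definition 4.2.5 can be checked mechanically by comparing the joint pmf against the product of its marginals at every support point. A sketch of mine (not part of the notes) doing this for Example 4.2.6:

```python
from fractions import Fraction

F = Fraction
# Joint pmf of Example 4.2.6.
f = {(1, 1): F(1, 10), (2, 1): F(1, 10), (2, 2): F(1, 10),
     (1, 2): F(1, 5), (1, 3): F(1, 5), (2, 3): F(3, 10)}

# Marginals by summing over the other coordinate.
fX = {x: sum(p for (a, _), p in f.items() if a == x) for x in (1, 2)}
fY = {y: sum(p for (_, b), p in f.items() if b == y) for y in (1, 2, 3)}

# Independence requires f(x, y) == fX[x] * fY[y] for every (x, y).
independent = all(f.get((x, y), F(0)) == fX[x] * fY[y]
                  for x in (1, 2) for y in (1, 2, 3))
```

The check fails exactly at points such as (1, 3), where f(1, 3) = 1/5 but f_X(1) f_Y(3) = 1/4.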

Proof:

Example 4.2.11: Let X and Y be independent exponential(1) random variables.

(a) Find the joint pdf of X and Y.
(b) Find P(X ≥ 4, Y < 3).
(c) Find E(XY).

Note (checking independence): Let (X, Y) be a random vector. Then X and Y are called independent random variables if for every C = {(x, y) : x ∈ A, y ∈ B} we have

    P((X, Y) ∈ C) = P(X ∈ A, Y ∈ B) = P(X ∈ A) P(Y ∈ B).

This definition can also be stated in terms of cdfs: X and Y are independent random variables if and only if F_{X,Y}(x, y) = F_X(x) F_Y(y) for all x and y.

Example (Checking independence): Suppose (X, Y) has the joint pdf:

    f_{X,Y}(x, y) = (1 + xy)/4 for |x| < 1 and |y| < 1.

Then X and Y are not independent, but X^2 and Y^2 are independent.

The marginal pdf of X is:

    f_X(x) = ∫_{-1}^{1} f_{X,Y}(x, y) dy = ∫_{-1}^{1} [(1 + xy)/4] dy = 1/2,  for |x| < 1.

By symmetry, the marginal pdf of Y is f_Y(y) = 1/2 for |y| < 1. So f_{X,Y}(x, y) ≠ f_X(x) f_Y(y), and X and Y are not independent.

To get the joint pdf of (X^2, Y^2), we first get the joint cdf of (X^2, Y^2). For 0 < x < 1 and 0 < y < 1:

    F_{X^2,Y^2}(x, y) = P(X^2 ≤ x, Y^2 ≤ y) = P(-√x ≤ X ≤ √x, -√y ≤ Y ≤ √y)
                      = ∫_{-√x}^{√x} { ∫_{-√y}^{√y} [(1 + st)/4] dt } ds
                      = √x √y = P(X^2 ≤ x) P(Y^2 ≤ y).

So the joint pdf of (X^2, Y^2) is:

    f(x, y) = ∂^2 F(x, y)/(∂x ∂y) = 1/(4 √x √y),  for 0 < x < 1, 0 < y < 1,

which factorizes, so X^2 and Y^2 are independent.

Theorem 4.2.12: Let X and Y be independent random variables with moment generating functions M_X(t) and M_Y(t). Then the moment generating function of the random variable Z = X + Y is given by

    M_Z(t) = M_X(t) M_Y(t).

Proof:

Example 4.2.13: Let X ~ n(μ, σ^2) and Y ~ n(γ, τ^2) be independent. Find the pdf of Z = X + Y. (This is a very important result!)
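The mgf argument of Theorem 4.2.12 implies that Z = X + Y in Example 4.2.13 is normal with mean μ + γ and variance σ^2 + τ^2. A Monte Carlo sketch of mine (not part of the notes) checking the first two moments for one arbitrary parameter choice:

```python
import random

random.seed(2)
mu, sigma, gamma, tau = 1.0, 2.0, -0.5, 1.5   # arbitrary illustrative parameters
N = 200_000

# Sample Z = X + Y with X ~ n(mu, sigma^2), Y ~ n(gamma, tau^2) independent.
zs = [random.gauss(mu, sigma) + random.gauss(gamma, tau) for _ in range(N)]
z_mean = sum(zs) / N                              # should be ~ mu + gamma = 0.5
z_var = sum((z - z_mean) ** 2 for z in zs) / N    # should be ~ sigma^2 + tau^2 = 6.25
```

Matching moments alone does not prove normality, of course; the full result follows from recognizing M_X(t) M_Y(t) as the mgf of n(μ + γ, σ^2 + τ^2).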

So f, ( y, ) f( f ) ( y) and and is not independent To get the joint pdf of (, ), we first get the joint cdf of (, ): F (, y) = P(, y) (< < 1,< y< 1) So the joint pdf of (, ) is:, = P(, y y) y = { [(1 + st)/4] dt} ds y 1 = = = ( yds ) y P ( P ) ( y) 1 f (, y) = F (, y) =,, y 4 y for < < 1, < y < 1 Theorem 41: Let and be independent random variables with moment generating functions M () t and M () t Then the moment generating function of the random variable Z = + is given by M Z() t = M() t M() t Proof: Eample 413: Let ~ N( μ, σ ) and ~ n( γ, τ ) Find the pdf of Z = + (This is a very important result!!) 18